
Nvidia Ampere: release date, specs and rumors


PCHF Tech News
Jan 10, 2015
The Nvidia Ampere – or more specifically, the Nvidia GeForce RTX 3080 – may be the most anticipated product in the computing world, and while the rumor mill has been spinning, speculating that the next consumer-facing graphics card would be based on Ampere, that hasn't quite happened yet.

Instead, Nvidia announced Ampere much in the same way it did Volta a few years ago – built primarily for the data center, with no mention of GeForce. Now that AI has become so important, thanks in large part to the unprecedented rise of cloud computing, Nvidia has been hard at work developing the A100 GPU, which should deliver a whopping 20x improvement in raw compute power.

These days IoT devices are piling up in everyone's homes, while companies like Amazon are creating, for example, grocery stores that let you automatically purchase products as you toss them into your shopping cart, and cars are getting ready to drive by themselves. All of this requires a ton of compute power, which is why the new 7nm Ampere architecture, along with its 3rd-generation Tensor Cores, is such a big deal.

The GPU should be available soon for any business that could benefit from such sheer computing power, along with systems like the DGX A100, which packs eight A100 GPUs into a single rack that will cost an eye-watering $1 million. Plenty of buyers for both the DGX A100 and the A100 on its own have been lined up, with the likes of Microsoft, Google, Dell and Amazon getting in on the Ampere action.

We don't know yet whether Nvidia Ampere is going to be behind the next GeForce cards – and we wouldn't rule it out, since Nvidia CEO Jensen Huang has said Ampere would be used in "all next-generation cards". However, that statement is muddied when you consider that he also said "there's great overlap in the architecture, but not in the configuration".

Still, that doesn't mean that the GPU architecture won't impact the lives of us normal folks who can't afford to drop hundreds of thousands of dollars. What the consumer-facing GeForce Ampere cards will look like remains to be seen. However, there are a ton of rumors out there, so we gathered them all up here. So, be sure to keep this page bookmarked, and we'll keep it updated with all the latest news, rumors and straight-up gossip.

Cut to the chase

  • What is it? Nvidia's 7nm next-generation GPU architecture
  • When will it be available? Available right now for enterprise, TBD for consumers
  • What will it cost? TBD

Nvidia Ampere release date

While the first DGX A100 systems were delivered to Argonne National Laboratory near Chicago in early May to help them research the novel coronavirus, the consumer-facing Nvidia Ampere GPUs still haven't been announced.

If they do make it into the next round of Nvidia's GeForce GPUs, like the highly-anticipated 3000-series graphics cards, then the current pattern of release dates suggests that we should see them arrive sometime toward the middle to end of Q3 2020, and possibly even in Q4. And, with the next-generation consoles like the PS5 and Xbox Series X likely launching around November, we could see Nvidia launch its next-generation GeForce cards around the same time to capitalize on the next-gen gaming hype.

Either way, we won't know until Nvidia unveils the next graphics cards.

Nvidia Ampere price

There hasn't been much concrete news about the latest line of Nvidia GeForce cards to speak of, so trying to anticipate what we can expect as far as pricing goes is a bit of a crapshoot.

With the Ampere-based server-side GPUs representing a 20x increase in raw computing power over the Volta GPUs, it's safe to say that an Ampere-based GeForce GPU will deliver some eye-popping performance gains over the current Turing-based RTX cards. While it's very doubtful we'll see a GeForce 3080 Ti with 20 times the power of the RTX 2080 Ti, the new cards should still see a significant boost – possibly as high as 50%, if we're to believe some of the specs that popped up on Twitter recently.

These cards will be a hot commodity when they are released, so there's definitely reason to believe that Ampere GeForce cards might be priced even higher than Turing cards were when they launched.

There's still some room to hope that prices will feel downward pressure before long. Intel isn't the only one getting a heavy case of agita from AMD lately: the industry's underdog chipmaker has been putting up some very competitive Radeon GPUs against Nvidia's GeForce cards in recent years as well.

With consumers becoming much more price-sensitive these days, it simply isn't enough for Nvidia to assume its current technological lead will keep it on top. Ampere's incredible processing power will be cold comfort if Nvidia ends up surrendering a sizable part of the consumer graphics card market because AMD is offering lower-priced but still powerful Radeon GPUs.

So Nvidia might just decide it's better to price its newest Ampere-based GeForce cards aggressively to counter AMD's competing cards, like the upcoming AMD RDNA 2, than to try to squeeze as many dollars as it can out of very high-end users. Until Nvidia tells us something – anything – about its upcoming GeForce releases, though, it's all still speculation.

Nvidia Ampere specs

The Nvidia A100, which also powers the DGX A100 supercomputer, is a 400W GPU with 6,912 CUDA cores and 40GB of VRAM delivering 1.6TB/s of memory bandwidth. Needless to say, it's kind of a behemoth – but it kind of has to be.

But the raw specs don't tell the whole story. Nvidia is claiming that this GPU delivers a 20x jump in performance over the last generation, which makes it easy to understand why companies from Amazon Web Services to Microsoft are already jumping in on the action.

But what does that mean for GeForce? Well, it's unlikely that we're going to see an Nvidia GeForce RTX 3080 Ti that's going to be 20x more powerful than the RTX 2080 Ti, but the leaks we've seen so far have been promising.

The latest leak points to an Nvidia GeForce RTX 3080 Ti with 5,376 CUDA cores and a boost clock of 2.2GHz, which would work out to about 23.65 teraflops of compute – a figure that may itself cast doubt on the leak, since the leak lists only 21 TFLOPs. Plus, the 320W TDP listed might be a red flag too.
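As a sanity check on figures like these, peak FP32 throughput is usually estimated with a back-of-the-envelope formula: CUDA cores × boost clock × 2 ops per clock (one fused multiply-add). Here's a quick sketch in Python; the RTX 2080 Ti numbers (4,352 cores, 1,545MHz boost) come from Nvidia's published specs:

```python
def fp32_tflops(cuda_cores: int, boost_clock_ghz: float) -> float:
    """Peak FP32 throughput in teraflops: each CUDA core can retire
    one fused multiply-add (2 floating-point ops) per clock cycle."""
    return cuda_cores * boost_clock_ghz * 2 / 1000

print(round(fp32_tflops(5376, 2.2), 2))    # rumored RTX 3080 Ti: 23.65
print(round(fp32_tflops(4352, 1.545), 2))  # RTX 2080 Ti: 13.45
```

Run against the leaked numbers, the formula lands on the 23.65 TFLOPs mentioned above – which is exactly why the leak's own 21 TFLOPs figure looks inconsistent.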

Another rumor points to the Nvidia GeForce RTX 3080 Ti – an Asus model of which may have leaked recently – having a massive 627mm² die. This further fuels speculation that Nvidia Ampere GeForce GPUs will be built on Samsung's 8nm node, rather than the TSMC 7nm node that the pro-level Ampere chips use.

Another leak, this one seeming more accurate, points to an Nvidia GeForce RTX 3080 with 4,352 CUDA cores and 10GB of VRAM, based on the Ampere architecture. This leak also suggests that we're getting an Nvidia GeForce RTX 3090.

Specifically, it points to three GPUs based on the GA102 GPU: an RTX 3080, an RTX 3090 and the new Titan. We've gone into some pretty deep detail about why we don't think an Nvidia GeForce RTX 3090 is going to be a thing, but replace that with an RTX 3080 Ti, and you have yourself a tidy little rumor.

Other GeForce RTX 3000 cards have made an appearance as well. Rumor has it that the RTX 3070 Ti will be built on GA104-400 silicon, while the RTX 3070 will use a cut-down GA104-300. The same source claims that the RTX 3070 will pack 2,944 CUDA cores (the same as the existing RTX 2080) and 8GB of GDDR6 video memory, while the RTX 3070 Ti will up the ante to 3,072 CUDA cores and GDDR6X memory.

Either way, with the next generation of games pushing higher resolutions, and with more games with ray tracing hitting the market thanks to the new consoles, you can expect these next graphics cards to be incredibly powerful, especially if they have to go up against AMD RDNA 2 or "Big Navi" cards.

But, again, we'll just have to wait and see.

Nvidia Ampere performance

One of the most exciting parts about watching the world of PC components is keeping up with all the (probably fake) leaked benchmarks. And, because the thirst for new Nvidia graphics cards is super high right now, it seems like everyone is coming out of the woodwork to show off what the next graphics cards can apparently do.

The latest of these leaked benchmarks comes courtesy of hardware leaker _rogame and shows an "unknown Nvidia Ampere GPU" managing a score of 18,257 in 3DMark Time Spy. To put that in perspective, in our initial RTX 2080 Ti review, Nvidia's current-gen flagship managed 12,123 points. That's a jump of just over 50% – nothing to shake a stick at.
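For reference, the gain quoted here is simply the relative difference between the two Time Spy scores:

```python
def pct_gain(new_score: float, old_score: float) -> float:
    """Relative performance gain of a new score over an old one, in percent."""
    return (new_score - old_score) / old_score * 100

print(round(pct_gain(18257, 12123), 1))  # leaked Ampere vs RTX 2080 Ti: 50.6
```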

Earlier rumors even pointed to the Nvidia GeForce RTX 3080 being up to 40% faster than the 2080 Ti – and things are starting to line up.

No matter how you slice it, it seems like we're going to see a much greater jump in performance than we saw when Nvidia Turing followed Pascal in 2018 – and we're excited to get our hands on it.

What we want to see

Since we know some of the broad strokes of what Nvidia's Ampere architecture can do, here's what we'd really like to see once the new lineup of graphics cards is released.

Keep launch prices in line with those of previous-generation GPUs
While nobody is out here begging a tech company to charge them more money, the rate of price inflation in the GPU market has been ridiculous for some time now, with the Nvidia GeForce RTX 2080 Ti being the prime example. Yes, demand is the driving force behind these price increases, but that kind of pricing is only tenable as long as the competition is putting out a noticeably inferior product. That is increasingly no longer the case.

AMD Radeon graphics cards have offered more than enough power for most users for a while now and typically do so at a much lower price than Nvidia does. Sure, they may not come with the latest ray-tracing technology like Nvidia's latest cards do, but if we can't afford a card with ray tracing anyway, going with AMD looks increasingly like the sensible choice for many consumers.

Continue to improve on ray tracing's potential
With the advance from DLSS 1.0 to 2.0, ray-traced games made a huge leap in framerate performance and other important benchmarks for the technology, but ray tracing still isn't the kind of tech you can run consistently at great framerates, even with high-performance hardware behind it.

It would be great to see another jump of a similar scale from DLSS 2.0 to 3.0 so that ray-tracing might become a standard feature players will actually use regularly in games. It'd be a major missed opportunity for Nvidia not to go all-in on this tech since ray-tracing is one of the biggest appeals of a flagship GeForce card right now, especially since it's something that AMD is only set to offer on some of its high-end GPUs later this year.

More than anything, we would love to see Nvidia Ampere continue moving the technology forward, if for no other reason than to pressure AMD to move more quickly toward democratizing ray-tracing-capable graphics cards. Nothing like a bit of cutthroat competition to help move the state of the art along.

Offer ray tracing across the product stack
With Nvidia Turing, ray tracing was a revolutionary technology, and it was genuinely worth the high price to be on the cutting edge. However, if you wanted in, you had to opt for at least the Nvidia GeForce RTX 2060. Some of the most popular graphics cards in the Turing lineup, like the Nvidia GeForce GTX 1660 Ti, didn't include RT cores at all, which meant hardware-accelerated ray tracing was out of reach for a large portion of the audience.

With Nvidia Ampere, if Ampere is even the architecture behind the next GeForce cards, we would love to see RT and Tensor cores enabled all the way down the product stack, so even budget users can get in on the ray tracing goodness – even if they have to set ray tracing to low at 1080p.
