Hardware support: An analysis of generational performance gains for Nvidia's GPUs
- An analysis of generational performance gains for Nvidia's GPUs
- DOOM Eternal | Official GeForce RTX 3080 4K Gameplay - World Premiere
- An oldie from 2008: Western Digital Researching 20,000RPM Hard Disk to Fight Solid State Drives
- Micro Center confirms no pre-orders for RTX 3080 via Twitter.
- NVIDIA GeForce RTX 3090, RTX 3080 and RTX 3070 custom model pricing in UK
- Facebook-Oculus login rift grows as sales stop in Germany
- [Techspot] Gone But Not Forgotten: Palm
- Liquid Freezer II Second Revision – What’s Changed?
- Can someone help me understand memory and IO Speeds?
- T-Mobile's LG Velvet is powered by the brand new MediaTek Dimensity 1000C chipset
- Review: Scythe Mugen 5 cooler - PC Battlestations on YT
- Big Navi easter egg in AMD-collaboration Fortnite map
- [AnandTech] Arm Announces Cortex-R82: First 64-bit Real Time Processor
- Air Cooling In The Cooler Master NR200
- Nvidia’s Founders Edition RTX 3090, 3080, 3070 GPUs Won’t Be Sold In Australia
- Qualcomm’s 8cx Gen 2 5G processor promises a new wave of better ARM-based laptops
- Review: Razer Blade Pro 17 (2020) stuns in all the right ways
An analysis of generational performance gains for Nvidia's GPUs Posted: 03 Sep 2020 09:05 PM PDT

Purpose

Since the unveiling of Ampere I've seen a lot of contradictory and incorrect statements regarding the generation-over-generation performance changes. The key statement I'm seeing is a variation of "Turing's performance gain was smaller than normal, and Ampere is a return to normal." This actually hasn't been the case, and I want to highlight that with actual data. So the purpose is to better educate users on the generation-over-generation performance gains from Nvidia's product stack.

Methodology

There is no perfect way to do this, but I am going to try to do this as fairly as possible, and to go as far back as possible. Here are the key points of my methodology.
Flaws in the Methodology
Generational Gains - x60, x70, and x80

Note: The 400 series reflects performance gains over the 200 series. There was only a GTX 260 and 280, no GTX 270. (Yes, I'm aware of the 275/285 refreshes.)
Generational Gains - Anomalies in the above data.
Generational Gains - Analysis

Looking at this reminds me of the targeted pay raises you'd see in milpay charts. It's clear that rather than going for a uniform generational increase, Nvidia made targeted increases for specific reasons. For example, the GTX 970 was insanely popular. Why? It was very close to 980 performance, but $220 cheaper. It was FAR ahead of the 960 in performance, but only $100 more expensive than the 4GB variant. The 970 was the most logical choice for prospective 900-series owners. Nvidia corrected this by narrowing the 1060/1070 performance gap (the 960 --> 1060 upgrade path was the greatest generation-over-generation improvement I've seen since the Voodoo 2). The RTX 2060 saw further gains over its cohorts in the Turing-based 20-series. The key takeaways should be:
How it stacked up - x60

Here I'll examine how each x60 product stacked up against the prior generation's lineup. I'll do the same for the x70 and x80 below. Why do this? Nvidia claimed that the 3070 has > 2080 Ti performance. I want to see how often this happens. In each segment, I'll attempt to project the Ampere variant's real-world performance uplift. This is solely for fun and should not be taken as gospel. Because this is time-consuming, I'm not going back to Fermi. Instead, I'm going to start with Maxwell. This allows me to compare to prior x80 Ti products since the 780 Ti was the return of this suffix. Also, math is funny. To be consistent, I wanted to use the card in the left column as the base. So while the GTX 960 is 29.08% slower than the GTX 780, using the 780 as the base, the 780 is 41% faster than the 960. If this is confusing, here's an easier-to-follow example. Let's say that GTX A is half the speed of GTX B. That means that GTX A is 50% slower, or, GTX B is 100% faster.
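The base-swap arithmetic above can be sketched in a few lines (the function name is mine, for illustration):

```python
# Converting "A is X% slower than B" into "B is Y% faster than A".
# The two numbers differ because the base of the percentage changes.

def slower_to_faster(slower_pct: float) -> float:
    """If A is `slower_pct`% slower than B, return how much faster B is than A."""
    frac = slower_pct / 100.0
    return (1.0 / (1.0 - frac) - 1.0) * 100.0

# The GTX 960 is 29.08% slower than the GTX 780...
print(round(slower_to_faster(29.08)))  # → 41 (the 780 is ~41% faster)

# The half-speed example: A is 50% slower, so B is 100% faster.
print(round(slower_to_faster(50.0)))   # → 100
```

This is why a "29% slower" card and a "41% faster" card can be the same comparison stated from two different bases.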
The GTX 960 was not a good card, as I mentioned earlier. It was telling that prior x60s were advertised as mainstream AAA gaming cards, but the GTX 960's marketing campaign revolved around esports titles. The GTX 960 was a marginal upgrade over the GTX 760. It really wasn't worth buying for an x60 owner unless you were still on Fermi. But the GTX 1060 changed that, roughly doubling the 960's performance. It beat out the prior-generation GTX 970, while roughly matching the GTX 980. We saw more of the same from the RTX 2060. It was a worthy upgrade for GTX 1060 owners, beat out the GTX 1070, and roughly matched the GTX 1080. With DLSS, it actually comes out ahead in some modern titles. Where should the RTX 3060 line up? The optimistic side of me expects RTX 2080 performance out of it. But the pessimistic side of me thinks that Nvidia might want to revert some of the massive gains the x60 has had in recent generations. I could see it roughly matching the RTX 2070, keeping the 6GB VRAM of prior cards, and being offered at $349. This would allow room for a Ti or Super to come in with 8GB of VRAM and RTX 2080-like performance once AMD plays their hand, while still retaining a spot for the 3070. Like the 900-series, I suspect this is going to be a generation where Nvidia tries to coax traditional x60 buyers into an x70.

How it stacked up - x70
The GTX 970 was a great card and the 1070 was even better. As we can see here, the x70 trading blows with the prior x80 Ti has occurred every generation since the re-introduction of the x80 Ti. Therefore, we should not be shocked that Nvidia is claiming that the RTX 3070 is faster than the RTX 2080 Ti. My projection isn't quite as rosy. From Nvidia's claims, it appears that their comparison includes a lot of RTX titles. Most games still don't support DLSS and/or ray tracing. I think that the RTX improvements bring the average up, and rasterization brings it down. As such, I expect the 3070 to be slightly faster than the 2080 Ti in RTX-enabled titles, but slightly slower in non-RTX-enabled titles. Also, in that same chart Nvidia claimed that the 3080 was twice as fast as the 2080, but the preview we saw from Digital Foundry showed roughly an 80% improvement. So, be pessimistic until we get real reviews.

How it stacked up - x80
Seems a bit like a tick-tock cycle. Compared to the prior x80 Ti, we see slightly faster, much faster, slightly faster...so the next one would be "much faster." And that is what Nvidia has claimed: that the 3080 would be noticeably faster than the 2080 Ti, and double the performance of the 2080. As noted above, DF's preview showed the 3080 as being roughly 80% faster than the 2080 in games pre-selected by Nvidia. I think the worst case we can expect is a card that improves over its predecessor much like the 1080 did. That's a pretty damn good worst case.

Conclusion

This was a data-driven post, so the conclusion depends on the questions you had and the perspective you have coming into this. We can conclude that Turing was not a subpar performance uplift, just an "average" uplift paired with a price hike. Pascal was not a typical uplift, but instead the best uplift Nvidia has had in at least a decade. And we can see that Ampere seeks to match or even beat that amazing uplift. If you're an x80 buyer, the RTX 3080 looks to give the best generation-over-generation uplift we've seen in the segment, at least over the last decade. Worst case is that it "only" duplicates Pascal's uplift. This is going to be a great card. If you're an x70 buyer, you should expect more of what you're used to - performance within spitting distance of last generation's x80 Ti. Best case is that it's a little faster, and worst case is that it's a little slower. There is no bad news here, unless you recently bought a 2080 Ti. But if you're an x60 buyer, there's some uncertainty. Nvidia has good reasons to release a 2080 competitor (if AMD has something competitive in the next few months), and good reasons to launch a cheaper dud with 6GB of VRAM (to leave plenty of space for a Ti/Super, and to encourage x60 owners to move to a 3070). There's a lot of latitude here and I'm curious to see which direction Nvidia ultimately goes with this card.
Footnote: I will be manually posting this in r/Nvidia and r/Hardware. If you would like to see this discussion in your favorite subreddit, feel free to post it there. Just please PM me a link to the discussion, or ping me in a reply. Username mentions in the OP don't provide a Reddit notification, and I would like to show up to answer any questions about my methodology.
DOOM Eternal | Official GeForce RTX 3080 4K Gameplay - World Premiere Posted: 03 Sep 2020 06:30 AM PDT
An oldie from 2008: Western Digital Researching 20,000RPM Hard Disk to Fight Solid State Drives Posted: 03 Sep 2020 11:50 AM PDT
Micro Center confirms no pre-orders for RTX 3080 via Twitter. Posted: 03 Sep 2020 12:39 PM PDT Micro Center got back to me on Twitter confirming two things
NVIDIA GeForce RTX 3090, RTX 3080 and RTX 3070 custom model pricing in UK Posted: 03 Sep 2020 11:21 AM PDT
Facebook-Oculus login rift grows as sales stop in Germany Posted: 04 Sep 2020 02:01 AM PDT
[Techspot] Gone But Not Forgotten: Palm Posted: 03 Sep 2020 10:30 PM PDT
Liquid Freezer II Second Revision – What’s Changed? Posted: 04 Sep 2020 01:27 AM PDT
Can someone help me understand memory and IO speeds? Posted: 03 Sep 2020 03:09 PM PDT Hello everyone! I apologize if this is obvious, but even after searching on Google and Reddit I can't seem to understand. I have heard about the I/O of the XB1, PS5, and PC (RTX IO) that is supposed to make speeds really fast (upwards of 5 GB/s compressed). When I look up SSDs, this one was one of the fastest at 5 GB/s read speed. However, I believe PCIe 3.0 supports speeds up to 16 GB/s, which almost makes PCIe 4.0 seem irrelevant for the foreseeable future of gaming. I believe the new 3080's stated speed is 19 GB/s. How does this work, if even at optimal speeds the SSD can only output (assume mostly just textures) at 5 GB/s? What is all that speed used for? Is there any need for PCIe 4.0 as far as GPUs go until SSD speeds get closer to 16+ GB/s? I'm sure a lot of things I mention in this post are completely wrong, but I appreciate any help or explanations! Thanks a lot :)
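For context on the numbers in this question, the bandwidth arithmetic can be sketched as below. The per-lane figures are approximate effective rates after encoding overhead, and the helper function is mine for illustration. (Worth noting: the "19" figure quoted for the 3080 is its GDDR6X memory data rate in Gbps per pin - a separate link from PCIe entirely.)

```python
# Rough effective PCIe bandwidth per lane, in GB/s, after encoding overhead
# (8 GT/s with 128b/130b for gen 3, 16 GT/s for gen 4). Approximate values.
PCIE_LANE_GBPS = {3: 0.985, 4: 1.969}

def pcie_bandwidth(gen: int, lanes: int = 16) -> float:
    """Approximate one-direction bandwidth in GB/s for a PCIe link."""
    return PCIE_LANE_GBPS[gen] * lanes

print(round(pcie_bandwidth(3), 1))  # → 15.8 (a ~5 GB/s SSD doesn't saturate this)
print(round(pcie_bandwidth(4), 1))  # → 31.5
```

So a gen-3 x16 slot does top out near the ~16 GB/s the question mentions, and today's fastest NVMe drives use only a fraction of it.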
T-Mobile's LG Velvet is powered by the brand new MediaTek Dimensity 1000C chipset Posted: 03 Sep 2020 11:20 AM PDT
Review: Scythe Mugen 5 cooler - PC Battlestations on YT Posted: 03 Sep 2020 06:53 PM PDT
Big Navi easter egg in AMD-collaboration Fortnite map Posted: 04 Sep 2020 01:57 AM PDT
[AnandTech] Arm Announces Cortex-R82: First 64-bit Real Time Processor Posted: 03 Sep 2020 10:54 AM PDT
Air Cooling In The Cooler Master NR200 Posted: 03 Sep 2020 06:07 PM PDT
Nvidia’s Founders Edition RTX 3090, 3080, 3070 GPUs Won’t Be Sold In Australia Posted: 04 Sep 2020 02:20 AM PDT
Qualcomm’s 8cx Gen 2 5G processor promises a new wave of better ARM-based laptops Posted: 03 Sep 2020 07:10 AM PDT
Review: Razer Blade Pro 17 (2020) stuns in all the right ways Posted: 03 Sep 2020 06:31 PM PDT
You are subscribed to email updates from /r/hardware: a technology subreddit for computer hardware news, reviews and discussion.