Nvidia GeForce RTX 4080 Review: More Efficient, Still Expensive
The RTX 4080 has all the technological advancements of the Ada Lovelace architecture, with a price that's difficult to justify. It's priced too close to the 4090 to entice extreme performance enthusiasts, and soon it will have to contend with AMD's RX 7900 series. But plenty of people prefer Nvidia, want DLSS, and are willing to pay the piper his dues.
Pros:
Second-fastest GPU (for now)
Much improved efficiency
Excellent ray tracing performance
Packs all the Ada Lovelace enhancements
Cons:
High price without the halo performance of the 4090
Needs DLSS 3 to truly shine in gaming performance
AMD's RDNA 3 could provide strong competition
Lingering concerns surrounding the 16-pin connector
The Nvidia GeForce RTX 4080 is the follow-up to last month's RTX 4090 launch, a card that now ranks among the best graphics cards and sits at the top of our GPU benchmarks hierarchy. Of course, a bit of the shine has come off thanks to the melting 16-pin connectors. The good news: the RTX 4080 uses less power, which should mean it's also less likely to funnel enough power to melt the plastic connector… maybe. The bad news: at $1,199, it's still priced out of reach for most gamers and represents a big jump in generational pricing, inheriting the RTX 3080 Ti launch price that we also felt was too high.

We already know most of what to expect from Nvidia's Ada Lovelace architecture, so the only real question now is how performance scales down to fewer GPU shaders, less memory, less cache, a narrower memory interface, etc. Let's quickly look at the specifications for a few of the top Nvidia and AMD GPUs.
There's a relatively large gap between the RTX 4080 and the RTX 4090. You get most of an AD103 GPU — 76 of the potential 80 Streaming Multiprocessors (SMs) — but that's still 40% fewer GPU shaders and other functional units than the RTX 4090. Clock speeds are similar, but you get 33% fewer memory channels, 33% less VRAM and bandwidth, and the rated TBP drops by 29%. On paper, the RTX 4090 could be up to 70% faster based on theoretical compute performance (the quick calculation below shows where that figure comes from), and that's a concern.

$1,199 is hardly affordable, so it feels like anyone even looking at the RTX 4080 should probably just save up the additional $400 for the RTX 4090 and go for broke — or melted. But the RTX 4090 has been sold out at anything below $2,100 since launch, which means it could actually be a $900 upsell, and that's far more significant.

The pricing becomes even more of a concern when we factor in AMD's Radeon RX 7900 XTX/XT cards coming next month. We now have all the pertinent details for the first cards using AMD's RDNA 3 GPU architecture, and they certainly look promising. Prices are still high, but the spec comparisons suggest AMD might be able to beat the RTX 4080 while costing $200–$300 less. Unless you absolutely refuse to consider an AMD graphics card, you should at least wait until next month to see what the red team has to offer.
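For reference, here's a quick back-of-the-envelope check of that roughly 70% figure. It's only a sketch based on paper specs: it assumes the reference boost clocks (about 2,520 MHz for the RTX 4090 and 2,505 MHz for the RTX 4080), counts each shader as executing one fused multiply-add (two FP32 operations) per clock, and ignores memory bandwidth, cache, and power limits entirely.

```python
# Theoretical FP32 throughput from paper specs:
# shaders x 2 FLOPs per clock (one FMA) x boost clock.
# Boost clock values are the reference specs and are assumptions for this estimate.

def fp32_tflops(shaders: int, boost_mhz: float) -> float:
    """Peak FP32 TFLOPS assuming one FMA (2 ops) per shader per clock."""
    return shaders * 2 * boost_mhz * 1e6 / 1e12

rtx_4090 = fp32_tflops(16_384, 2_520)  # 128 SMs x 128 FP32 shaders
rtx_4080 = fp32_tflops(9_728, 2_505)   # 76 SMs x 128 FP32 shaders

print(f"RTX 4090: {rtx_4090:.1f} TFLOPS")  # ~82.6 TFLOPS
print(f"RTX 4080: {rtx_4080:.1f} TFLOPS")  # ~48.7 TFLOPS
print(f"RTX 4090 advantage: {(rtx_4090 / rtx_4080 - 1):.0%}")  # ~69%
```

Of course, real games rarely scale with raw TFLOPS, which is exactly why the benchmark results matter more than this math.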
However, Nvidia does have some extras that AMD is unlikely to match in the near term. For example, the deep learning and AI horsepower of the RTX 4080 far surpasses what AMD intends to offer. If we've got the figures right, AMD's FP16 and INT8 throughput will be less than a third of the RTX 4080's.

Nvidia also offers DLSS 3, courtesy of the enhanced Optical Flow Accelerator (OFA). Ten games already support the technology: Bright Memory: Infinite, Destroy All Humans! 2 - Reprobed, F.I.S.T.: Forged in Shadow Torch, F1 22, Justice, Loopmancer, Marvel's Spider-Man Remastered, Microsoft Flight Simulator, A Plague Tale: Requiem, and Super People. In less than a month, that's about half as many games as support AMD's FSR2 technology. Of course, you need an RTX 40-series GPU for DLSS 3, while FSR2 works with pretty much everything.

Nvidia GPUs also tend to be heavily favored by professional users, or at least their employers. So while true workstations will likely opt for the RTX 6000 48GB card rather than a GeForce RTX 40-series, there's certainly potential in picking up one or more RTX 4080 cards for AI and deep learning use. Content creators may also find something to like, though again, if you're willing to pay for a 4080, it may not be a huge step up in pricing to nab a 4090 instead.

Another piece of good news (depending on which side of the aisle you fall on, we suppose) is that GPU mining remains unprofitable. Gamers won't be able to offset the price of a new graphics card through cryptocurrency mining, but at least there should be more GPUs available for gamers. Now let's see exactly what Nvidia has to offer with its new RTX 4080.
Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.
MORE: Best Graphics Cards
MORE: GPU Benchmarks and Hierarchy
MORE: All Graphics Content