Nvidia GeForce GTX 1080 Pascal Review

How We Tested Nvidia's GeForce GTX 1080

While we try to maintain a standardized test bed across Tom’s Hardware editors and offices, allowing writers across the globe to compare results, today’s review called for a different approach.

Instead of our Haswell-E-based Core i7-5930K at 3.5GHz, we’re using a Skylake-based Core i7-6700K at 4GHz, giving us two generations’ worth of IPC improvements and an extra 500MHz of base clock rate to alleviate host processing bottlenecks wherever they may surface. Of course, the CPU’s LGA 1151 interface also calls for a different motherboard; we tapped MSI’s Z170A Gaming M7 for all of our game benchmarks and dropped in G.Skill’s F4-3000C15Q-16GRR memory kit, composed of four 4GB modules at DDR4-3000. Crucial’s MX200 SSD remains, as does the Noctua NH-U12S cooler and be quiet! Dark Power Pro 10 850W power supply.

Gone is Windows 8.1, though. Prior to benchmarking, we installed a clean version of Windows 10 Professional and a new suite of games representing popular AAA titles, some DirectX 12-specific selections and a mix of genres.

Because the GeForce GTX 1080 is a flagship, its competition is limited to the top end from AMD and Nvidia. We chose a GeForce GTX Titan X, 980 Ti, and 980 to go against it, along with AMD’s Radeon R9 Fury X, Fury, and 390X. All of the cards are reference except for Sapphire’s Nitro Radeon R9 Fury and MSI’s R9 390X Gaming 8G.

Drivers And Benchmarks

The Maxwell-based cards employ Nvidia’s newest driver at the time of testing, GeForce Game Ready Driver 365.10. For the GeForce GTX 1080, we had to use the company’s press driver, 368.13. All three cards based on AMD GPUs use Radeon Software Crimson Edition 16.5.2 Hotfix, released on May 10.

Our benchmark suite also reflects some changes. A few old favorites remain—notably Battlefield 4, Grand Theft Auto V and The Witcher 3. But we’re also adding Hitman, Project CARS, Rise of the Tomb Raider, The Division and Ashes of the Singularity.

The Ashes charts represent DirectX 12 performance using the game’s built-in benchmark/logging tool. Hitman and Tomb Raider are presented using DirectX 11. However, we have DirectX 12 results from those games as well, which we’ll mention in the analysis (spoiler: in most cases, performance drops under DirectX 12). Everything else runs under DirectX 11, recorded with Fraps; a short sketch after the settings table below shows how those frame time logs can be summarized. Nvidia is making a DirectX 12-compatible FCAT overlay available, but there simply wasn’t enough time ahead of the launch to experiment with it.

Ashes of the Singularity: DirectX 12, Extreme quality preset, built-in benchmark
Battlefield 4: DirectX 11, Ultra quality preset, custom Tom’s Hardware benchmark (Tashgar jeep ride), 100-second Fraps recording
Grand Theft Auto V: DirectX 11, Very High quality settings, 4x MSAA, built-in benchmark (test five), 110-second Fraps recording
Hitman: DirectX 11, Ultra level of detail, FXAA, High texture quality, built-in benchmark, 100-second Fraps recording
Project CARS: DirectX 11, Ultra quality settings, High anti-aliasing, High texture resolution, Nürburgring Sprint, 100-second Fraps recording
Rise of the Tomb Raider: DirectX 11, Custom quality preset, Very High quality settings, built-in benchmark, 80-second Fraps recording
The Division: DirectX 11, Custom quality preset, Ultra quality settings, Supersampling temporal AA, built-in benchmark, 90-second Fraps recording
The Witcher 3: DirectX 11, Highest quality settings, HairWorks disabled, custom Tom’s Hardware benchmark, 100-second Fraps recording
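
For readers who want to reduce a Fraps log to comparable numbers themselves, here’s a minimal sketch. It assumes Fraps’ standard “frametimes” CSV layout (a header row, then a frame index and a cumulative timestamp in milliseconds per line); the filename is hypothetical, and this is an illustration rather than the exact tooling used for this review.

```python
import csv

def summarize_frametimes(path):
    """Average FPS and 99th-percentile frame time from a Fraps frametimes CSV."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Frame, Time (ms)" header row
        stamps = [float(row[1]) for row in reader if len(row) >= 2]

    # Cumulative timestamps -> per-frame times (ms)
    deltas = [b - a for a, b in zip(stamps, stamps[1:])]
    duration_s = (stamps[-1] - stamps[0]) / 1000.0

    avg_fps = len(deltas) / duration_s
    p99 = sorted(deltas)[min(len(deltas) - 1, int(len(deltas) * 0.99))]
    return avg_fps, p99

if __name__ == "__main__":
    # Hypothetical log name; Fraps writes one frametimes CSV per recording.
    fps, p99 = summarize_frametimes("gtx1080_witcher3_frametimes.csv")
    print(f"Average: {fps:.1f} FPS / 99th-percentile frame time: {p99:.2f} ms")
```

The 99th-percentile frame time is worth watching alongside average FPS: two cards can post similar averages while one delivers noticeably more frame time spikes.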
  • toddybody
These power consumption charts are making me cross-eyed :/
    Reply
  • JeanLuc
    Chris, were you invited to the Nvidia press event in Texas?

About time we saw some cards based on a new process; it seemed like we were going to be stuck on 28nm for the rest of time.

As usual Nvidia is creaming it up in DX11, but DX12 performance does look ominous IMO; there's not enough gain over the previous generation, which makes me think AMD's new Polaris cards might dominate when it comes to DX12.
    Reply
  • slimreaper
Could you run an OTOY OctaneBench? This really could change the motion graphics industry!
    Reply
  • F-minus
Seriously I have to ask, did Nvidia instruct every single reviewer to bench the 1080 against stock Maxwell cards? Cause I'd like to see real-world scenarios with an OC'ed 980 Ti, because nobody runs stock or even buys stock, if you can even buy stock 980 Tis.
    Reply
  • cknobman
Nice results, but honestly they don't blow me away.

In fact, I think Nvidia left the door open for AMD to take control of the high-end market later this year.

And fix the friggin power consumption charts; you went with about the worst possible way to show them.
    Reply
  • FormatC
    Stock 1080 vs. stock 980 Ti :)

Both cards can be OC'ed, and once you have a real custom 1080 in hand, an OC'ed 980 Ti looks even worse next to an OC'ed 1080 than the stock 980 Ti looks next to the stock card in this review. :)
    Reply
  • Gungar
@F-minus, I saw the same thing. The GTX 980 Ti overclocks way better than the 1080; I am pretty sure OC vs. OC there is nearly no performance difference. (disappointing)
    Reply
  • toddybody
Quoting Gungar: "@F-minus, I saw the same thing. The GTX 980 Ti overclocks way better than the 1080; I am pretty sure OC vs. OC there is nearly no performance difference. (disappointing)"

LOL. My 980 Ti doesn't hit 2.2GHz on air. We need to wait for more benchmarks... I'd like to see the G1 980 Ti against a similar 1080.
    Reply
  • F-minus
Exactly, but it seems like Nvidia instructed every single outlet to bench the reference 1080 only against stock Maxwell cards, which is honestly <Mod Edit> - pardon. I bet an OC'ed 980 Ti would come super close to the stock 1080, at which point I wonder why even upgrade now. Sure, you can push the 1080 too, but I'd wait for a price drop or at least the supposedly cheaper AIB cards.
    Reply
  • FormatC
I have a handpicked Gigabyte GTX 980 Ti Xtreme Gaming Waterforce at 1.65GHz in one of my rigs; it's slower.
    Reply