
Nvidia GeForce GTX 1070 Ti 8GB Review: Vega In The Crosshairs

Overclocking

Here’s the one thing Nvidia doesn’t let its board partners pursue, but still leaves open to gamers through utilities like MSI Afterburner: manual overclocking.

There is one slider you don't get access to, though. Voltage manipulation is locked out entirely. Other Pascal-based graphics cards can be tuned through small voltage increases. However, to protect GeForce GTX 1080, the GeForce GTX 1070 Ti is deliberately handicapped in this one area.

MSI's GeForce GTX 1070 Ti Titanium has a theoretical power limit of 239W. Our goal is to see how far the default fan curve takes us without making the noise level intolerable. Conversely, we also want to figure out how high the GPU Boost clock goes if we let the fan spin as fast as it can.
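That 239W ceiling follows directly from the power-target slider. As a quick sketch (assuming the GeForce GTX 1070 Ti's reference 180W board power, which the article doesn't restate here), multiplying the reference value by the slider percentage reproduces the limits quoted in the table below:

```python
# Sketch: deriving an absolute power limit from a power-target percentage.
# The 180W reference board power is an assumption based on Nvidia's
# published GTX 1070 Ti specification, not a value from this article.

REFERENCE_TDP_W = 180  # assumed reference board power

def power_limit_watts(power_target_pct: int) -> int:
    """Convert a power-target slider value (e.g. 133) into watts."""
    return round(REFERENCE_TDP_W * power_target_pct / 100)

for pct in (100, 120, 133, 134):
    print(f"{pct}% power target -> {power_limit_watts(pct)} W")
# 133% works out to 239 W, matching MSI's quoted theoretical limit.
```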

All of the board partner cards our German lab received overclock similarly. However, the cards with a maximum power limit of 120% are at a significant disadvantage because 2050 MHz requires their fans to spin as fast as they'll go.

| Graphics Card | Power Target | Fan | GPU Boost Frequency (After 30 Minutes) |
|---|---|---|---|
| Nvidia GeForce GTX 1070 Ti FE | 120% | 100% | 2063 MHz |
| MSI GTX 1070 Ti Titanium 8G | 133% | Auto | 2050 MHz |
| MSI GTX 1070 Ti Titanium 8G | 133% | 100% | 2101 MHz |
| Gigabyte GTX 1070 Ti G1 Gaming | 120% | 100% | 2063 MHz |
| Gainward GTX 1070 Ti Phoenix GS | 120% | 100% | 2050 MHz |
| iGame GTX 1070 Ti Vulcan X Top | 134% | Auto | 2063 MHz |
| iGame GTX 1070 Ti Vulcan X Top | 134% | 100% | 2114 MHz |

The resulting curves show how GPU Boost frequencies drop as temperatures rise.

The corresponding voltages show us that GeForce GTX 1070 Ti isn't held back by its power limit. In fact, an additional 6 to 8W wouldn’t have been a problem. Instead, voltage is the limiting factor.

Ultimately, an almost constant 1.05V pushes GP104 to an impressive 2126 MHz for a short period of time with the fan speed set to 100% (and while the GPU’s temperature is below 42°C). This is exactly one step below Nvidia’s maximum voltage of 1.062V.
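GPU Boost moves in discrete steps rather than continuously. Taking the voltage ceiling as 1.0625V (the article rounds this to 1.062V) and assuming Pascal's commonly observed ~12.5mV voltage steps and ~13 MHz clock bins (both inferred from the measurements above, not an official Nvidia specification), the numbers line up:

```python
# Sketch of GPU Boost's discrete stepping. The 12.5 mV voltage step and
# 13 MHz clock bin are assumptions inferred from the measured values,
# not published Nvidia figures.

VOLTAGE_STEP_V = 0.0125
CLOCK_BIN_MHZ = 13

def one_step_below(v_max: float) -> float:
    """Voltage one GPU Boost step below the ceiling."""
    return round(v_max - VOLTAGE_STEP_V, 4)

def bins_between(f_low: int, f_high: int) -> float:
    """How many boost bins separate two observed frequencies."""
    return (f_high - f_low) / CLOCK_BIN_MHZ

print(one_step_below(1.0625))    # 1.05 V, the plateau we measured
print(bins_between(2101, 2114))  # exactly one bin apart
```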

Infrared Picture Analysis

We also want to know how MSI’s cooler deals with an additional 50W of waste heat. To this end, we compare the card’s stock fan curve to its operation at 100% duty cycle.

Even when those 10cm fans turn as fast as possible, our equipment only registers ~43 dB(A). That's less noise than many graphics cards under load in their stock configurations.

At 2050 MHz, we measure 68°C, which won’t be a problem for the board’s longevity. An 85°C reading from the memory is 10°C below Micron's specified maximum, while a 95°C peak above the VRMs is merely acceptable. In the long run, that number shouldn't go up any further.


Pushing the fans to their limit has a significant impact on thermal readings. Long-term operation at 2.1 GHz shouldn’t be a problem, since temperatures won’t be an obstacle.

The Pitfalls of Overclocking

So, what about some performance data? In short, it's not consistent enough for us to graph.

Our German and U.S. labs deliberately compared the same models and found that chip quality was dramatically different between them. Dialing in a 133% Power Target and +220 MHz GPU offset caused crashes on one board and not the other. In the end, our less impressive sample only allowed a +150 MHz offset, resulting in a ~2 GHz GPU Boost frequency with the fans running at 100%. This delta is large enough to cast doubt on any attempt at universal benchmark results.

Memory can bottleneck overclocking results too, even with a +150 MHz increase. Depending on the game and resolution, you might see 8%-higher frame rates at 2050 MHz, while in other situations gains fall between 2 and 5%.
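One way to read those numbers: assuming a stock GPU Boost frequency of roughly 1900 MHz (our illustrative baseline; the article only states the post-offset result of ~2 GHz), a 2050 MHz overclock is about an 8% core-clock uplift. When a game realizes only 2-5% more performance from that, the GPU is spending part of its time waiting on memory rather than shaders:

```python
# Sketch: comparing core-clock uplift to observed frame-rate gains.
# The ~1900 MHz stock boost baseline is an assumption for illustration.

def pct_gain(new: float, old: float) -> float:
    """Relative gain of `new` over `old`, in percent."""
    return (new / old - 1) * 100

clock_uplift = pct_gain(2050, 1900)  # roughly 7.9% more core clock
print(f"core clock uplift: {clock_uplift:.1f}%")

# Fraction of the clock uplift that shows up as frame rate:
for fps_gain in (8.0, 5.0, 2.0):
    scaling = fps_gain / clock_uplift
    print(f"{fps_gain:.0f}% fps gain -> {scaling:.0%} of the uplift realized")
```

Near-100% scaling points to a core-bound workload; the lower the fraction, the more the memory subsystem is holding the card back.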

We simply need more samples before judging the overclocking qualities of GeForce GTX 1070 Ti with blanket statements. Some of you will get lucky, while others strike out. We're trying to manage expectations after running into our own inconsistencies.


MORE: Best Graphics Cards


MORE: Desktop GPU Performance Hierarchy Table


MORE: All Graphics Content

  • 10tacle
    Yaaayyy! The NDA prison has freed everyone to release their reviews! Outstanding review, Chris. This card landed exactly where it was expected to, between the 1070 and 1080. In some games it gets real close to the 1080, where in other games, the 1080 is significantly ahead. Same with comparison to the RX 56 - close in some, not so close in others. Ashes and Destiny 2 clearly favor AMD's Vega GPUs. Can we get Project Cars 2 in the mix soon?

    It's a shame the overclocking results were too inconsistent to report, but I guess that will have to wait for vendor versions to test. Also, a hat tip for using 1440p where this GPU is targeted. Now the question is what will the real world selling prices be vs. the 1080. There are $520 1080s available out there (https://www.newegg.com/Product/Product.aspx?Item=N82E16814127945), so if AIB partners get closer to the $500 pricing threshold, that will be way too close to the 1080 in pricing.
    Reply
  • samer.forums
    Vega Still wins , If you take in consideration $200 Cheaper Freesync 1440p wide/nonwide monitors , AMD is still a winner.
    Reply
  • SinxarKnights
    So why did MSI call it the GTX 1070 Ti Titanium? Do they not know what Ti means?

    ed: Lol at least one other person doesn't know what Ti means either : If you don't know Ti stands for "titanium" effectively they named the card GTX 1070 Titanium Titanium.
    Reply
  • 10tacle
    20334482 said:
    Vega Still wins , If you take in consideration $200 Cheaper Freesync 1440p wide/nonwide monitors , AMD is still a winner.

    Well that is true and always goes without saying. You pay more for G-sync than Freesync which needs to be taken into consideration when deciding on GPUs. However, if you already own a 1440p 60Hz monitor, the choice becomes not so easy to make, especially considering how hard it is to find Vegas.
    Reply
  • 10tacle
    For those interested, Guru3D overclocked their Founder's Edition sample successfully. As expected, it gains 9-10% which puts it square into reference 1080 territory. Excellent for the lame blower cooler. The AIB vendor dual-triple fan cards will exceed that overclocking capability.

    http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_1070_ti_review,42.html
    Reply
  • mapesdhs
    Chris, what is it that pummels the minimums for the 1080 Ti and Vega 64 in BF1 at 1440p? And why, when moving up to UHD, does this effect persist for the 1080 Ti but not for Vega 64?

    Also, wrt the testing of Division, and comparing to your 1080 Ti review back in March, I notice the results for the 1070 are identical at 1440p (58.7), but completely different at UHD (42.7 in March, 32.7 now); what has changed? This new test states it's using Medium detail at UHD, so was the March testing using Ultra or something? The other cards are affected in the same way.

    Not sure if it's significant, but I also see 1080 and 1080 Ti performance at 1440p being a bit better back in March.

    Re pricing, Scan here in the UK has the Vega 56 a bit cheaper than a reference 1070 Ti, but not by much. One thing which is kinda nuts though, the AIB versions of the 1070 Ti are using the same branding names as they do for what are normally overclocked models, eg. SC for EVGA, AMP for Zotac, etc., but of course they're all 1607MHz base. Maybe they'll vary in steady state for boost clocks, but it kinda wrecks the purpose of their marketing names. :D

    Ian.

    PS. When I follow the Forums link, the UK site looks different, then reverts to its more usual layout when one logs in (weird). Also, the UK site is failing to retain the login credentials from the US transfer as it used to.

    Reply
  • mapesdhs
    20334510 said:
    Well that is true and always goes without saying. You pay more for G-sync than Freesync which needs to be taken into consideration when deciding on GPUs. ...

    It's a bit odd that people are citing the monitor cost advantage of Freesync, while article reviews are not showing games actually running at frame rates which would be relevant to that technology. Or are all these Freesync buyers just using 1080p? Or much lower detail levels? I'd rather stick to 60Hz and higher quality visuals.

    Ian.

    Reply
  • FormatC
    @Ian:
    The typical Freesync-Buddy is playing in Wireframe-Mode at 720p ;)

    All this sync options can help to smoothen the output, if you are too sensitive. This is a fact, but not for everybody with the same prio.
    Reply
  • TJ Hooker
    20334648 said:
    Chris, what is it that pummels the minimums for the 1080 Ti and Vega 64 in BF1 at 1440p? And why, when moving up to UHD, does this effect persist for the 1080 Ti but not for Vega 64?
    From other benchmarks I've seen, DX12 performance in BF1 is poor. Average FPS is a bit lower than in DX11, and minimum FPS far worse in some cases. If you're looking for BF1 performance info, I'd recommend looking for benchmarks on other sites that test in DX11.
    Reply
  • 10tacle
    20334667 said:
    It's a bit odd that people are citing the monitor cost advantage of Freesync, while article reviews are not showing games actually running at frame rates which would be relevant to that technology. Or are all these Freesync buyers just using 1080p? Or much lower detail levels? I'd rather stick to 60Hz and higher quality visuals.

    Well I'm not sure I understand your point. The benchmarks show FPS exceeding 60FPS, meaning maximum GPU performance. It's about matching monitor refresh rate (Hz) to FPS for smooth gameplay, not just raw FPS. But regarding the Freesync argument, that's usually what is brought up in price comparisons between AMD and Nvidia. If someone is looking to upgrade from both a 60Hz monitor and a GPU, then it's a valid point.

    However, as I stated, if someone already has a 60Hz 2560x1440 or one of those ultrawide monitors, then the argument for Vega gets much weaker. Especially considering their limited availability. As I posted in a link above, you can buy a nice dual fan MSI GTX 1080 for $520 on NewEgg right now. I have not seen a dual fan MSI Vega for sale anywhere (every Vega for sale I've seen is the reference blower design).
    Reply