Nvidia GeForce GTX 1070 Ti 8GB Review: Vega In The Crosshairs

Temperature & Clock Rates

Comparing the GeForce GTX 1070 Ti Titanium 8G’s gaming frequencies to those of Nvidia’s Founders Edition card yields an interesting finding: the latter achieves a slightly higher GPU Boost clock rate in spite of significantly higher temperatures. Did we get a bad sample from MSI or a great one from Nvidia? A comparison using boards from Zotac, Gigabyte, Colorful, and Gainward suggests that we really hit the jackpot with our Founders Edition card, similar to previous launches.

Otherwise, the two boards act as you'd expect them to. Subjected to rising temperatures, Nvidia's GeForce GTX 1070 Ti FE drops from its initial 1911 MHz to fluctuating frequencies just above the 1800 MHz mark, whereas MSI's GeForce GTX 1070 Ti Titanium settles just below the Founders Edition's range.

The stress test results paint a similar picture. Running the MSI card in an open or closed case doesn't make much of a difference.

Our voltage measurements shed some light on those clock rate results. The Founders Edition card hosts a gem of a GPU. Its slightly higher voltages allow it to hit more aggressive frequencies. What are the chances we'd see two stellar samples? In fact, our U.S. and German labs both scored real winners, so maybe someone who pre-ordered a card from geforce.com could chime in with their experience using the comments section.
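
If you'd like to compare your own card against our samples, sustained GPU Boost clocks and temperatures are easy to log while a game or stress test runs in the background. The snippet below is a minimal sketch using the pynvml Python bindings for Nvidia's NVML library; it is not part of our test methodology, and the one-second sampling interval and 60-second duration are arbitrary choices for illustration:

    # Minimal GPU clock/temperature logger using NVML (pip install nvidia-ml-py).
    # Start a game or stress test first, then watch where the boost clock settles.
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    print("time_s, gpu_clock_mhz, temp_c")
    for second in range(60):  # one sample per second for one minute
        clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"{second}, {clock}, {temp}")
        time.sleep(1)

    pynvml.nvmlShutdown()

If the clock column steps down and then stabilizes as the temperature column plateaus, you're seeing the same GPU Boost behavior charted above, and you can compare your steady-state frequency directly against our two samples.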

More Data: Nvidia GeForce GTX 1070 Ti FE Infrared Pictures

The GeForce GTX 1070 Ti FE exhausts all of its waste heat out the I/O bracket, so there's no point in taking measurements using an open test bench. Bottom line: Nvidia’s design is great for cooling, regardless of your case.

More Data: MSI GeForce GTX 1070 Ti Titanium Infrared Pictures

MSI’s Twin Frozr VI thermal solution does its job well, and is almost inaudible to boot. Small changes to the GTX 1080 Gaming X’s cooler clearly have a noticeable effect. Most important, the memory hot-spot we complained about previously is gone. Some credit for this goes to Nvidia's use of GDDR5, rather than hotter-running GDDR5X.

The memory’s rated maximum temperature of 95°C is never reached during our stress test, either. These are great temperatures to report.

More Data: Cool Down Process Infrared Pictures

These pictures illustrate the cold spots we get from MSI's and Nvidia's coolers. The GeForce GTX 1070 Ti FE cools down uniformly across its surface, whereas MSI's Titanium 8G board has a much cooler spot above its GPU package.

Thermal performance is lost due to the lack of contact between MSI's VRM sink and the main cooler.

MORE: Best Graphics Cards

MORE: Desktop GPU Performance Hierarchy Table

MORE: All Graphics Content

Comments
  • 10tacle
    Yaaayyy! The NDA prison has freed everyone to release their reviews! Outstanding review, Chris. This card landed exactly where it was expected to, between the 1070 and 1080. In some games it gets real close to the 1080, while in other games the 1080 is significantly ahead. Same with the comparison to the RX Vega 56 - close in some, not so close in others. Ashes and Destiny 2 clearly favor AMD's Vega GPUs. Can we get Project Cars 2 in the mix soon?

    It's a shame the overclocking results were too inconsistent to report, but I guess that will have to wait for vendor versions to test. Also, a hat tip for using 1440p, where this GPU is targeted. Now the question is what the real-world selling prices will be vs. the 1080. There are $520 1080s available out there (https://www.newegg.com/Product/Product.aspx?Item=N82E16814127945), so if AIB partners get closer to the $500 pricing threshold, that will be way too close to the 1080 in pricing.
  • samer.forums
    Vega still wins. If you take into consideration $200 cheaper FreeSync 1440p wide/non-wide monitors, AMD is still the winner.
  • SinxarKnights
    So why did MSI call it the GTX 1070 Ti Titanium? Do they not know what Ti means?

    ed: Lol, at least one other person doesn't know what Ti means either:
    If you don't know, Ti stands for "titanium"; effectively, they named the card the GTX 1070 Titanium Titanium.
  • 10tacle
    2562892 said:
    Vega still wins. If you take into consideration $200 cheaper FreeSync 1440p wide/non-wide monitors, AMD is still the winner.


    Well, that is true and goes without saying. You pay more for G-Sync than FreeSync, which needs to be taken into consideration when deciding on GPUs. However, if you already own a 1440p 60Hz monitor, the choice becomes not so easy to make, especially considering how hard it is to find Vegas.
  • 10tacle
    For those interested, Guru3D overclocked their Founders Edition sample successfully. As expected, it gains 9-10%, which puts it squarely into reference 1080 territory. Excellent for the lame blower cooler. The AIB vendors' dual- and triple-fan cards will exceed that overclocking capability.

    http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_1070_ti_review,42.html
  • mapesdhs
    Chris, what is it that pummels the minimums for the 1080 Ti and Vega 64 in BF1 at 1440p? And why, when moving up to UHD, does this effect persist for the 1080 Ti but not for Vega 64?

    Also, wrt the testing of The Division, and comparing to your 1080 Ti review back in March, I notice the results for the 1070 are identical at 1440p (58.7), but completely different at UHD (32.7 in March, 42.7 now); what has changed? This new test states it's using Medium detail at UHD, so was the March testing using Ultra or something? The other cards are affected in the same way.

    Not sure if it's significant, but I also see 1080 and 1080 Ti performance at 1440p being a bit better back in March.

    Re pricing, Scan here in the UK has the Vega 56 a bit cheaper than a reference 1070 Ti, but not by much. One thing which is kinda nuts, though: the AIB versions of the 1070 Ti are using the same branding names as they do for what are normally overclocked models, e.g. SC for EVGA, AMP for Zotac, etc., but of course they're all 1607MHz base. Maybe they'll vary in steady-state boost clocks, but it kinda wrecks the purpose of their marketing names. :D

    Ian.

    PS. When I follow the Forums link, the UK site looks different, then reverts to its more usual layout when one logs in (weird). Also, the UK site is failing to retain the login credentials from the US transfer as it used to.
  • mapesdhs
    202972 said:
    Well that is true and always goes without saying. You pay more for G-sync than Freesync which needs to be taken into consideration when deciding on GPUs. ...


    It's a bit odd that people are citing the monitor cost advantage of FreeSync while reviews are not showing games actually running at frame rates where that technology would be relevant. Or are all these FreeSync buyers just using 1080p? Or much lower detail levels? I'd rather stick to 60Hz and higher-quality visuals.

    Ian.
  • FormatC
    @Ian:
    The typical FreeSync buddy is playing in wireframe mode at 720p ;)

    All of these sync options can help smooth the output if you're sensitive to it. That's a fact, but it doesn't have the same priority for everybody.
  • TJ Hooker
    117741 said:
    Chris, what is it that pummels the minimums for the 1080 Ti and Vega 64 in BF1 at 1440p? And why, when moving up to UHD, does this effect persist for the 1080 Ti but not for Vega 64?

    From other benchmarks I've seen, DX12 performance in BF1 is poor. Average FPS is a bit lower than in DX11, and minimum FPS far worse in some cases. If you're looking for BF1 performance info, I'd recommend looking for benchmarks on other sites that test in DX11.
  • 10tacle
    117741 said:
    It's a bit odd that people are citing the monitor cost advantage of FreeSync while reviews are not showing games actually running at frame rates where that technology would be relevant. Or are all these FreeSync buyers just using 1080p? Or much lower detail levels? I'd rather stick to 60Hz and higher-quality visuals.


    Well, I'm not sure I understand your point. The benchmarks show FPS exceeding 60 FPS, meaning maximum GPU performance. It's about matching monitor refresh rate (Hz) to FPS for smooth gameplay, not just raw FPS. But regarding the FreeSync argument, that's usually what is brought up in price comparisons between AMD and Nvidia. If someone is looking to upgrade both a 60Hz monitor and a GPU, then it's a valid point.

    However, as I stated, if someone already has a 60Hz 2560x1440 or one of those ultrawide monitors, then the argument for Vega gets much weaker, especially considering Vega's limited availability. As I posted in a link above, you can buy a nice dual-fan MSI GTX 1080 for $520 on Newegg right now. I have not seen a dual-fan MSI Vega for sale anywhere (every Vega for sale I've seen is the reference blower design).
  • TJ Hooker
    117741 said:
    It's a bit odd that people are citing the monitor cost advantage of FreeSync while reviews are not showing games actually running at frame rates where that technology would be relevant. Or are all these FreeSync buyers just using 1080p? Or much lower detail levels? I'd rather stick to 60Hz and higher-quality visuals.

    Most of the 1440p results have the 1070/Ti/Vega 56 in the 60-90 FPS range with high settings. What sort of frame rates do you consider appropriate for adaptive sync use?
  • Embra
    Are there any true DX12 games out yet? Or do they all still have some elements of DX11 within?

    Curious if the Vega 64 is the air-cooled or water-cooled version.

    I think the 1070 Ti performs exactly as expected.
  • 10tacle
    ^^If you mean exclusive to DX12, Hitman was the first natively developed DX12 game. Others are Gears of War 4, Halo Wars 2, and Forza 7. There are probably others, but those are the only ones I know of off the top of my head. The question nobody can answer is how much leftover DX11 programming is involved in those so-called DX12-exclusive games.
  • vancliff
    You forgot to mention the memory size for these cards.
  • 10tacle
    1857640 said:
    You forgot to mention the memory size for these cards.


    Is that supposed to be a joke or are you serious? Look at the charts and GPU descriptions, specifically that "8GB" reference next to the GPU type.
  • elbert
    Great review, Chris. The only thing I didn't like was Nvidia's price. Nothing anyone but Nvidia can do about that tho. The 1070 Ti I feel needs to be $429, while the 1070 a more reasonable $329. While the RX 580 is matched in performance by the 1060 6GB, Nvidia needs a two-card SLI option to offset the RX 580's one big advantage.
  • awatz
    I live in the Philippines. The price of the 1070 Ti is around $647, which is about the same price as most 1080s and the Vega 56.

    The Galax 1080 EXOC Sniper is way cheaper at $540.
  • jfunk
    The whole point of FreeSync/G-Sync is so you can run somewhere in between 60 and 144+ FPS. If you had hardware that could hit 144+ 100% of the time, you wouldn't need the technology at all.

    That makes these cards literally the perfect case for 1440p adaptive sync monitors. I had to build in July 2016 due to outside circumstances, so I wound up with a 1070 + 1440p G-Sync monitor. But today a Vega 56 + 1440p FreeSync monitor is unquestionably the better value; there's no comparison. It's $200 cheaper for basically the same results.

    If you're running 1080p, then you don't really need a 1070 or better in the first place.
  • vancliff
    10tacle, the chart mentions the data rate, not the memory size. It's not a joke. (personal attack removed)
  • cangelini
    202972 said:
    Yaaayyy! The NDA prison has freed everyone to release their reviews! Outstanding review, Chris. This card landed exactly where it was expected to, between the 1070 and 1080. In some games it gets real close to the 1080, while in other games the 1080 is significantly ahead. Same with the comparison to the RX Vega 56 - close in some, not so close in others. Ashes and Destiny 2 clearly favor AMD's Vega GPUs. Can we get Project Cars 2 in the mix soon? It's a shame the overclocking results were too inconsistent to report, but I guess that will have to wait for vendor versions to test. Also, a hat tip for using 1440p, where this GPU is targeted. Now the question is what the real-world selling prices will be vs. the 1080. There are $520 1080s available out there (https://www.newegg.com/Product/Product.aspx?Item=N82E16814127945), so if AIB partners get closer to the $500 pricing threshold, that will be way too close to the 1080 in pricing.


    Yeah, Igor and I were both surprised when his MSI card exceeded 2.1 GHz and mine was limited to ~2 GHz. In contrast, my 1070 Ti FE card flew right past 2.1 GHz (obviously with no extra voltage). Pretty impressive, but not necessarily consistent. I still have the overclocking charts on my workstation--we'll probably save those for individual card reviews, though, until we can figure out where retail samples are landing.
  • cangelini
    117741 said:
    Chris, what is it that pummels the minimums for the 1080 Ti and Vega 64 in BF1 at 1440p? And why, when moving up to UHD, does this effect persist for the 1080 Ti but not for Vega 64? Also, wrt the testing of The Division, and comparing to your 1080 Ti review back in March, I notice the results for the 1070 are identical at 1440p (58.7), but completely different at UHD (32.7 in March, 42.7 now); what has changed? This new test states it's using Medium detail at UHD, so was the March testing using Ultra or something? The other cards are affected in the same way. Not sure if it's significant, but I also see 1080 and 1080 Ti performance at 1440p being a bit better back in March. Re pricing, Scan here in the UK has the Vega 56 a bit cheaper than a reference 1070 Ti, but not by much. One thing which is kinda nuts, though: the AIB versions of the 1070 Ti are using the same branding names as they do for what are normally overclocked models, e.g. SC for EVGA, AMP for Zotac, etc., but of course they're all 1607MHz base. Maybe they'll vary in steady-state boost clocks, but it kinda wrecks the purpose of their marketing names. :D Ian. PS. When I follow the Forums link, the UK site looks different, then reverts to its more usual layout when one logs in (weird). Also, the UK site is failing to retain the login credentials from the US transfer as it used to.


    Looking at the charts, there are two events that hammer the minimums (and frankly, I personally don't care for minimums as indicators of anything truly meaningful). One is the start of the test, where we give ~5 seconds for the game to load but still often see choppy performance. The second is about :50 into the video on this page: http://www.tomshardware.com/reviews/battlefield-1-directx-12-benchmark,5017.html, where the camera turns. There's more chop in that sequence than anywhere else.

    As for The Division, I averaged 32.7 FPS in the 1080 Ti review...I believe using the Ultra preset. In our 1070 Ti story, I dialed down to Medium quality and saw 42.7 FPS. Since I knew perf would be marginal on 1070-class hardware at 4K, I sacrificed quality in the name of (hopefully) playable performance.

    A lot of companies using "overclocked" names actually had overclocked 1070 Tis before Nvidia told them they'd have to standardize on a reference frequency. Things were changing pretty quickly leading up to launch ;) On that note, keep an eye on Vega 56 prices after this weekend--I think we're all eager to see if those lower prices last.
  • cangelini
    45049 said:
    Great review, Chris. The only thing I didn't like was Nvidia's price. Nothing anyone but Nvidia can do about that tho. The 1070 Ti I feel needs to be $429, while the 1070 a more reasonable $329. While the RX 580 is matched in performance by the 1060 6GB, Nvidia needs a two-card SLI option to offset the RX 580's one big advantage.


    Agreed 100%. Would have loved to see 1070 Ti closer to $400 and 1070 pushed down to $350 (after supposedly starting at $380 more than a year ago). I'm told this could have been possible if 1070 Ti didn't employ the pricier thermal solution and beefier power supply. But again, NV was gunning for Vega 56's performance, while guys like me are looking for better value stories.
  • tazmo8448
    If it were offered at its opening price ($279), it would, by far, be the best-selling GPU, but nooo...they held out for the $$
  • tazmo8448
    What with 1080p being sufficient.