Nvidia GeForce GTX 1070 Ti 8GB Review: Vega In The Crosshairs

Early Verdict

GeForce GTX 1070 Ti lands just where you’d expect it. Nvidia knew its target and hit it. We only wish the company was more aggressive with pricing. Instead, it chooses to leave nothing on the table. Expect great frame rates at 2560x1440, similar to GeForce GTX 1070 and Radeon RX Vega 56, but nothing you haven’t seen before.

Pros

  • + Excellent 1440p performance
  • + Faster than Radeon RX Vega 56
  • + Highly overclockable
  • + GTX 1080-class cooling/power
  • + Availability at launch price

Cons

  • - Uninspiring price



Did you think Nvidia was done launching GPUs based on its Pascal architecture? Despite a fairly comprehensive line-up of GeForce GTX 10-series cards, the company isn’t ready to give AMD the last word in high-end graphics. And so, it’s wedging a new model between the GeForce GTX 1070 and 1080. This one is designed to shine against Radeon RX Vega 56 where GTX 1070 faltered.

To prepare the GeForce GTX 1070 Ti for its mission, Nvidia channels a lot of GTX 1080’s DNA, including vapor chamber cooling and a five-phase power supply. Although the 1070 Ti’s GP104 processor has one of its SMs disabled, performance from the remaining 19 is so good that Nvidia forces board partners to standardize their operating frequencies. Otherwise, overclocked models would beat entry-level GeForce GTX 1080s out of the box.

A $450 price tag really doesn’t leave much room between existing 1070s and 1080s, so there will still be overlap above and below new 1070 Tis. Neat segmentation doesn’t seem to be the point, though. This card appears purpose-built to take Radeon RX Vega 56 out at the kneecaps.

Meet GeForce GTX 1070 Ti

GeForce GTX 1070 Ti is based on the same GP104 processor we introduced you to back in May 2016 in our Nvidia GeForce GTX 1080 Pascal Review. The 7.2-billion transistor chip is a product of TSMC’s 16nm FinFET Plus manufacturing.

As you know, GeForce GTX 1080 utilizes GP104 in its entirety, exposing 2560 CUDA cores through 20 Streaming Multiprocessors. GTX 1070 was realized by lopping off five of those SMs, leaving 1920 active CUDA cores. Meanwhile, GeForce GTX 1070 Ti sports 19 SMs. Given 128 single-precision CUDA cores and eight texture units per SM, that adds up to 2432 CUDA cores and 152 texture units. Already, the 1070 Ti looks more like 1080 than its namesake.
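
As a quick illustration of that arithmetic, here is a minimal Python sketch, assuming the GP104 layout described above (128 FP32 CUDA cores and eight texture units per SM):

```python
# GP104 (Pascal) resources per Streaming Multiprocessor
CORES_PER_SM = 128   # FP32 CUDA cores
TMUS_PER_SM = 8      # texture units

def gp104_resources(active_sms):
    """Return (CUDA cores, texture units) for a GP104 part with the given SM count."""
    return active_sms * CORES_PER_SM, active_sms * TMUS_PER_SM

for name, sms in [("GTX 1080", 20), ("GTX 1070 Ti", 19), ("GTX 1070", 15)]:
    cores, tmus = gp104_resources(sms)
    print(f"{name}: {sms} SMs -> {cores} CUDA cores, {tmus} texture units")
# GTX 1080: 20 SMs -> 2560 CUDA cores, 160 texture units
# GTX 1070 Ti: 19 SMs -> 2432 CUDA cores, 152 texture units
# GTX 1070: 15 SMs -> 1920 CUDA cores, 120 texture units
```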

GPU | GeForce GTX 1080 (GP104) | GeForce GTX 1070 Ti (GP104) | GeForce GTX 1070 (GP104)
SMs | 20 | 19 | 15
CUDA Cores | 2560 | 2432 | 1920
Base Clock | 1607 MHz | 1607 MHz | 1506 MHz
GPU Boost Clock | 1733 MHz | 1683 MHz | 1683 MHz
GFLOPs (Base Clock) | 8228 | 7816 | 5783
Texture Units | 160 | 152 | 120
Texel Fill Rate | 277.3 GT/s | 244.3 GT/s | 201.9 GT/s
Memory Data Rate | 10 Gb/s | 8 Gb/s | 8 Gb/s
Memory Bandwidth | 320 GB/s | 256 GB/s | 256 GB/s
ROPs | 64 | 64 | 64
L2 Cache | 2MB | 2MB | 2MB
TDP | 180W | 180W | 150W
Transistors | 7.2 billion | 7.2 billion | 7.2 billion
Die Size | 314 mm² | 314 mm² | 314 mm²
Process Node | 16nm | 16nm | 16nm
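
For reference, the GFLOPs column falls straight out of core count and clock rate; here is a quick sanity check in Python, counting two FP32 operations per core per clock (one fused multiply-add):

```python
# Peak FP32 throughput = CUDA cores x clock (MHz) x 2 ops per clock (one FMA), in GFLOPs
def peak_gflops(cuda_cores, clock_mhz):
    return cuda_cores * clock_mhz * 2 / 1000

print(peak_gflops(2560, 1607))  # ~8228 GFLOPs - GTX 1080 at base clock
print(peak_gflops(2432, 1607))  # ~7816 GFLOPs - GTX 1070 Ti at base clock
print(peak_gflops(1920, 1506))  # ~5783 GFLOPs - GTX 1070 at base clock
```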

Going a step further, Nvidia gives 1070 Ti a 1607 MHz base clock, exactly matching the 1080’s floor under taxing workloads. A 1683 MHz GPU Boost rating isn’t as aggressive, but again, GeForce GTX 1070 Ti’s plumbing is very 1080-like, so we might anticipate more overclocking headroom than, say, a vanilla 1070 and its heat pipe-based cooler. This is reinforced by a 180W thermal design power specification: GTX 1080 territory, compared to the 1070’s 150W target.

GP104’s back-end remains intact, including an aggregate 256-bit memory bus, 64 ROPs, and 2MB of shared L2 cache. But whereas GeForce GTX 1080 employs 8GB of 10 Gb/s GDDR5X memory, driving up to 320 GB/s of bandwidth, the 1070 Ti uses 8 Gb/s GDDR5, just like GeForce GTX 1070. If you were hoping this card would serve up superior Ethereum mining performance, that memory spec may be disappointing. Good news for gamers though, right?
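
The bandwidth figures work out the same way from bus width and per-pin data rate; a small sketch:

```python
# Peak memory bandwidth = bus width (bits) x data rate (Gb/s per pin) / 8 bits per byte
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

print(peak_bandwidth_gbs(256, 10))  # 320.0 GB/s - GTX 1080 with 10 Gb/s GDDR5X
print(peak_bandwidth_gbs(256, 8))   # 256.0 GB/s - GTX 1070 Ti / 1070 with 8 Gb/s GDDR5
```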

Swinging At A Fastball

Architecturally, there’s not much more to say. You can see how GeForce GTX 1070 Ti leans more heavily toward the 1080 than the 1070, deliberately going as far as necessary to counter Radeon RX Vega 56. Nvidia really has no excuse if it misses its target here.

Had Nvidia gone much further, it would have eclipsed GTX 1080’s performance. In fact, the company required board partners to cap their operating frequencies to keep overclocked models from beating certain GeForce GTX 1080 SKUs. That won’t stop enthusiasts from overclocking 1070 Ti with popular tools like MSI Afterburner.

It will, however, limit what board partners can charge for GeForce GTX 1070 Ti, since they’re only able to offer bigger coolers and flashier features, rather than guaranteed higher clock rates. Prior to launch, models available for pre-sale all fell between $450 and $500, a mere $50 spread. In comparison, GTX 1080s ranged from $490 to $720, a $230 difference.

The samples we have in our U.S. and German labs top 2 GHz without much trouble. No doubt, it seems like Nvidia is giving gamers a wink and a knowing nod by beefing up the 1070 Ti and then shipping it at clock rates that protect its pricier model. Let’s get into some testing to see if Nvidia’s newest addition can earn a place between two well-established stalwarts in the company’s portfolio.
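
To put those 2 GHz results in context, here is a rough back-of-the-envelope sketch of the theoretical FP32 uplift over the rated 1683 MHz GPU Boost clock; actual gains depend on GPU Boost behavior, power limits, and memory bandwidth:

```python
# Theoretical FP32 uplift from a ~2 GHz overclock versus the rated 1683 MHz GPU Boost clock
cuda_cores = 2432
rated_boost_mhz = 1683
overclock_mhz = 2000

rated_gflops = cuda_cores * rated_boost_mhz * 2 / 1000      # ~8186 GFLOPs
overclocked_gflops = cuda_cores * overclock_mhz * 2 / 1000  # ~9728 GFLOPs

uplift = overclocked_gflops / rated_gflops - 1
print(f"~{uplift * 100:.0f}% more theoretical FP32 throughput")  # ~19%
```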


MORE: Best Graphics Cards

MORE: Desktop GPU Performance Hierarchy Table

MORE: All Graphics Content

Chris Angelini
Chris Angelini is an Editor Emeritus at Tom's Hardware US. He edits hardware reviews and covers high-profile CPU and GPU launches.
  • 10tacle
    Yaaayyy! The NDA prison has freed everyone to release their reviews! Outstanding review, Chris. This card landed exactly where it was expected to, between the 1070 and 1080. In some games it gets real close to the 1080, while in other games the 1080 is significantly ahead. Same with the comparison to the RX Vega 56: close in some, not so close in others. Ashes and Destiny 2 clearly favor AMD's Vega GPUs. Can we get Project Cars 2 in the mix soon?

    It's a shame the overclocking results were too inconsistent to report, but I guess that will have to wait for vendor versions to test. Also, a hat tip for using 1440p where this GPU is targeted. Now the question is what will the real world selling prices be vs. the 1080. There are $520 1080s available out there (https://www.newegg.com/Product/Product.aspx?Item=N82E16814127945), so if AIB partners get closer to the $500 pricing threshold, that will be way too close to the 1080 in pricing.
  • samer.forums
    Vega still wins. If you take into consideration $200 cheaper FreeSync 1440p wide/non-wide monitors, AMD is still a winner.
  • SinxarKnights
    So why did MSI call it the GTX 1070 Ti Titanium? Do they not know what Ti means?

    ed: Lol, at least one other person doesn't know what Ti means either. If you don't know, Ti stands for "titanium", so effectively they named the card the GTX 1070 Titanium Titanium.
  • 10tacle
    20334482 said:
    Vega still wins. If you take into consideration $200 cheaper FreeSync 1440p wide/non-wide monitors, AMD is still a winner.

    Well that is true and always goes without saying. You pay more for G-sync than Freesync which needs to be taken into consideration when deciding on GPUs. However, if you already own a 1440p 60Hz monitor, the choice becomes not so easy to make, especially considering how hard it is to find Vegas.
  • 10tacle
    For those interested, Guru3D overclocked their Founders Edition sample successfully. As expected, it gains 9-10%, which puts it squarely into reference 1080 territory. Excellent for the lame blower cooler. The AIB vendor dual- and triple-fan cards will exceed that overclocking capability.

    http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_1070_ti_review,42.html
  • mapesdhs
    Chris, what is it that pummels the minimums for the 1080 Ti and Vega 64 in BF1 at 1440p? And why, when moving up to UHD, does this effect persist for the 1080 Ti but not for Vega 64?

    Also, wrt the testing of Division, and comparing to your 1080 Ti review back in March, I notice the results for the 1070 are identical at 1440p (58.7), but completely different at UHD (42.7 in March, 32.7 now); what has changed? This new test states it's using Medium detail at UHD, so was the March testing using Ultra or something? The other cards are affected in the same way.

    Not sure if it's significant, but I also see 1080 and 1080 Ti performance at 1440p being a bit better back in March.

    Re pricing, Scan here in the UK has the Vega 56 a bit cheaper than a reference 1070 Ti, but not by much. One thing which is kinda nuts though: the AIB versions of the 1070 Ti are using the same branding names as they do for what are normally overclocked models, e.g. SC for EVGA, AMP for Zotac, etc., but of course they're all 1607 MHz base. Maybe they'll vary in steady state for boost clocks, but it kinda wrecks the purpose of their marketing names. :D

    Ian.

    PS. When I follow the Forums link, the UK site looks different, then reverts to its more usual layout when one logs in (weird). Also, the UK site is failing to retain the login credentials from the US transfer as it used to.

  • mapesdhs
    20334510 said:
    Well that is true and always goes without saying. You pay more for G-sync than Freesync which needs to be taken into consideration when deciding on GPUs. ...

    It's a bit odd that people are citing the monitor cost advantage of Freesync, while article reviews are not showing games actually running at frame rates which would be relevant to that technology. Or are all these Freesync buyers just using 1080p? Or much lower detail levels? I'd rather stick to 60Hz and higher quality visuals.

    Ian.

  • FormatC
    @Ian:
    The typical Freesync-Buddy is playing in Wireframe-Mode at 720p ;)

    All these sync options can help smooth the output if you are too sensitive. That's a fact, but it's not the same priority for everybody.
  • TJ Hooker
    20334648 said:
    Chris, what is it that pummels the minimums for the 1080 Ti and Vega 64 in BF1 at 1440p? And why, when moving up to UHD, does this effect persist for the 1080 Ti but not for Vega 64?
    From other benchmarks I've seen, DX12 performance in BF1 is poor. Average FPS is a bit lower than in DX11, and minimum FPS far worse in some cases. If you're looking for BF1 performance info, I'd recommend looking for benchmarks on other sites that test in DX11.
  • 10tacle
    20334667 said:
    It's a bit odd that people are citing the monitor cost advantage of Freesync, while article reviews are not showing games actually running at frame rates which would be relevant to that technology. Or are all these Freesync buyers just using 1080p? Or much lower detail levels? I'd rather stick to 60Hz and higher quality visuals.

    Well I'm not sure I understand your point. The benchmarks show FPS exceeding 60FPS, meaning maximum GPU performance. It's about matching monitor refresh rate (Hz) to FPS for smooth gameplay, not just raw FPS. But regarding the Freesync argument, that's usually what is brought up in price comparisons between AMD and Nvidia. If someone is looking to upgrade from both a 60Hz monitor and a GPU, then it's a valid point.

    However, as I stated, if someone already has a 60Hz 2560x1440 or one of those ultrawide monitors, then the argument for Vega gets much weaker. Especially considering their limited availability. As I posted in a link above, you can buy a nice dual fan MSI GTX 1080 for $520 on NewEgg right now. I have not seen a dual fan MSI Vega for sale anywhere (every Vega for sale I've seen is the reference blower design).