Nvidia GeForce GTX 1070 Ti 8GB Review: Vega In The Crosshairs

Did you think Nvidia was done launching GPUs based on its Pascal architecture? Despite a fairly comprehensive line-up of GeForce GTX 10-series cards, the company isn’t ready to give AMD the last word in high-end graphics. And so, it’s wedging a new model between the GeForce GTX 1070 and 1080. This one is designed to shine against Radeon RX Vega 56 where GTX 1070 faltered.

To prepare the GeForce GTX 1070 Ti for its mission, Nvidia channels a lot of GTX 1080’s DNA, including vapor chamber cooling and a five-phase power supply. Although the 1070 Ti’s GP104 processor has one of its SMs disabled, performance from the remaining 19 is so good that Nvidia forces board partners to standardize their operating frequencies. Otherwise, overclocked models would beat entry-level GeForce GTX 1080s out of the box.

A $450 price tag really doesn’t leave much room between existing 1070s and 1080s, so there will still be overlap above and below new 1070 Tis. Neat segmentation doesn’t seem to be the point, though. This card appears purpose-built to take Radeon RX Vega 56 out at the kneecaps.

Meet GeForce GTX 1070 Ti

GeForce GTX 1070 Ti is based on the same GP104 processor we introduced you to back in May 2016 in our Nvidia GeForce GTX 1080 Pascal Review. The 7.2-billion-transistor chip is a product of TSMC’s 16nm FinFET Plus manufacturing.

As you know, GeForce GTX 1080 utilizes GP104 in its entirety, exposing 2560 CUDA cores through 20 Streaming Multiprocessors. GTX 1070 was realized by lopping off five of those SMs, leaving 1920 active CUDA cores. Meanwhile, GeForce GTX 1070 Ti sports 19 SMs. Given 128 single-precision CUDA cores and eight texture units per SM, that adds up to 2432 CUDA cores and 152 texture units. Already, the 1070 Ti looks more like 1080 than its namesake.
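The per-SM arithmetic is easy to sanity-check. A minimal Python sketch (the `gp104_config` helper is our own name, used purely for illustration) reproduces the core and texture-unit counts quoted above:

```python
def gp104_config(sms, cores_per_sm=128, tex_per_sm=8):
    """CUDA core and texture unit counts for a GP104 part with `sms` active SMs.

    Pascal GP104 packs 128 single-precision CUDA cores and 8 texture
    units into each Streaming Multiprocessor.
    """
    return sms * cores_per_sm, sms * tex_per_sm

# 20 SMs -> full GTX 1080; 19 -> GTX 1070 Ti; 15 -> GTX 1070
for name, sms in [("GTX 1080", 20), ("GTX 1070 Ti", 19), ("GTX 1070", 15)]:
    cores, tex = gp104_config(sms)
    print(f"{name}: {cores} CUDA cores, {tex} texture units")
```

Running it confirms the 1070 Ti's 2432 cores and 152 texture units sit much closer to the full 1080 configuration than to the 1070's.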

| GPU | GeForce GTX 1080 (GP104) | GeForce GTX 1070 Ti (GP104) | GeForce GTX 1070 (GP104) |
| --- | --- | --- | --- |
| SMs | 20 | 19 | 15 |
| CUDA Cores | 2560 | 2432 | 1920 |
| Base Clock | 1607 MHz | 1607 MHz | 1506 MHz |
| GPU Boost Clock | 1733 MHz | 1683 MHz | 1683 MHz |
| GFLOPS (Base Clock) | 8228 | 7816 | 5783 |
| Texture Units | 160 | 152 | 120 |
| Texel Fill Rate | 277.3 GT/s | 244.3 GT/s | 201.9 GT/s |
| Memory Data Rate | 10 Gb/s | 8 Gb/s | 8 Gb/s |
| Memory Bandwidth | 320 GB/s | 256 GB/s | 256 GB/s |
| ROPs | 64 | 64 | 64 |
| L2 Cache | 2MB | 2MB | 2MB |
| TDP | 180W | 180W | 150W |
| Transistors | 7.2 billion | 7.2 billion | 7.2 billion |
| Die Size | 314 mm² | 314 mm² | 314 mm² |
| Process Node | 16nm | 16nm | 16nm |

Going a step further, Nvidia gives 1070 Ti a 1607 MHz base clock, exactly matching the 1080’s floor under taxing workloads. A 1683 MHz GPU Boost rating isn’t as aggressive, but again, GeForce GTX 1070 Ti’s plumbing is very 1080-like, so we might anticipate more overclocking headroom than, say, a vanilla 1070 and its heat pipe-based cooler. This is reinforced by a 180W thermal design power specification. Again, that’s GTX 1080 territory compared to 1070’s 150W target.

GP104’s back-end remains intact, including an aggregate 256-bit memory bus, 64 ROPs, and 2MB of shared L2 cache. But whereas GeForce GTX 1080 employs 8GB of 10 Gb/s GDDR5X memory, driving up to 320 GB/s of bandwidth, the 1070 Ti uses 8 Gb/s GDDR5, just like GeForce GTX 1070. If you were hoping this card would serve up superior Ethereum mining performance, that memory spec may be disappointing. Good news for gamers though, right?
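The compute and bandwidth figures in the spec table fall out of two standard formulas: peak FP32 throughput is CUDA cores × clock × 2 (each core can retire one fused multiply-add, i.e. two operations, per cycle), and peak memory bandwidth is per-pin data rate × bus width ÷ 8 bits per byte. A quick Python sketch (helper names are our own):

```python
def gflops_base(cuda_cores, base_mhz):
    """Peak FP32 GFLOPS at base clock: cores x MHz x 2 ops (FMA) / 1000."""
    return cuda_cores * base_mhz * 2 / 1000

def bandwidth_gbs(data_rate_gbps, bus_width_bits=256):
    """Peak memory bandwidth in GB/s: per-pin rate x bus width / 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

print(round(gflops_base(2432, 1607)))  # GTX 1070 Ti -> 7816
print(bandwidth_gbs(8))                # 8 Gb/s GDDR5 on 256 bits -> 256.0 GB/s
print(bandwidth_gbs(10))               # 10 Gb/s GDDR5X on 256 bits -> 320.0 GB/s
```

The same two functions reproduce the GTX 1080 and 1070 rows as well, which is why the 1070 Ti's compute deficit versus the 1080 is only about 5% while its bandwidth deficit is a full 20%.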

Swinging At A Fastball

Architecturally, there’s not much more to say. You can see how GeForce GTX 1070 Ti leans more heavily on the 1080 than the 1070, deliberately going as far as necessary to counter Radeon RX Vega 56. Nvidia really has no excuse if it misses its target here.

Had Nvidia gone much further, it would have eclipsed GTX 1080’s performance. In fact, the company required board partners to cap their operating frequencies to keep overclocked models from beating certain GeForce GTX 1080 SKUs. That won’t stop enthusiasts from overclocking 1070 Ti with popular tools like MSI Afterburner.

It will, however, limit what board partners can charge for GeForce GTX 1070 Ti, since they’re only able to offer bigger coolers and flashier features, rather than guaranteed clock rates. Prior to launch, models available for pre-sale all fell between $450 and $500—a mere $50 spread. In comparison, GTX 1080s ranged from $490 to $720, a $230 difference.

The samples we have in our U.S. and German labs top 2 GHz without much trouble. No doubt, it seems like Nvidia is giving gamers the wink and knowing nod by beefing up 1070 Ti and then shipping clock rates that protect its pricier model. Let’s get into some testing to see if Nvidia’s newest addition can earn a place between two well-established stalwarts in the company’s portfolio.

MORE: Best Graphics Cards

MORE: Desktop GPU Performance Hierarchy Table

MORE: All Graphics Content

38 comments
Comment from the forums
  • 10tacle
    Yaaayyy! The NDA prison has freed everyone to release their reviews! Outstanding review, Chris. This card landed exactly where it was expected to, between the 1070 and 1080. In some games it gets real close to the 1080, where in other games, the 1080 is significantly ahead. Same with comparison to the RX 56 - close in some, not so close in others. Ashes and Destiny 2 clearly favor AMD's Vega GPUs. Can we get Project Cars 2 in the mix soon?

    It's a shame the overclocking results were too inconsistent to report, but I guess that will have to wait for vendor versions to test. Also, a hat tip for using 1440p where this GPU is targeted. Now the question is what will the real world selling prices be vs. the 1080. There are $520 1080s available out there (https://www.newegg.com/Product/Product.aspx?Item=N82E16814127945), so if AIB partners get closer to the $500 pricing threshold, that will be way too close to the 1080 in pricing.
    2
  • samer.forums
    Vega Still wins , If you take in consideration $200 Cheaper Freesync 1440p wide/nonwide monitors , AMD is still a winner.
    1
  • SinxarKnights
    So why did MSI call it the GTX 1070 Ti Titanium? Do they not know what Ti means?

    ed: Lol at least one other person doesn't know what Ti means either :
    If you don't know Ti stands for "titanium" effectively they named the card GTX 1070 Titanium Titanium.
    0
  • 10tacle
    Anonymous said:
    Vega Still wins , If you take in consideration $200 Cheaper Freesync 1440p wide/nonwide monitors , AMD is still a winner.


    Well that is true and always goes without saying. You pay more for G-sync than Freesync which needs to be taken into consideration when deciding on GPUs. However, if you already own a 1440p 60Hz monitor, the choice becomes not so easy to make, especially considering how hard it is to find Vegas.
    3
  • 10tacle
    For those interested, Guru3D overclocked their Founder's Edition sample successfully. As expected, it gains 9-10% which puts it square into reference 1080 territory. Excellent for the lame blower cooler. The AIB vendor dual-triple fan cards will exceed that overclocking capability.

    http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_1070_ti_review,42.html
    1
  • mapesdhs
    Chris, what is it that pummels the minimums for the 1080 Ti and Vega 64 in BF1 at 1440p? And why, when moving up to UHD, does this effect persist for the 1080 Ti but not for Vega 64?

    Also, wrt the testing of Division, and comparing to your 1080 Ti review back in March, I notice the results for the 1070 are identical at 1440p (58.7), but completely different at UHD (42.7 in March, 32.7 now); what has changed? This new test states it's using Medium detail at UHD, so was the March testing using Ultra or something? The other cards are affected in the same way.

    Not sure if it's significant, but I also see 1080 and 1080 Ti performance at 1440p being a bit better back in March.

    Re pricing, Scan here in the UK has the Vega 56 a bit cheaper than a reference 1070 Ti, but not by much. One thing which is kinda nuts though, the AIB versions of the 1070 Ti are using the same branding names as they do for what are normally overclocked models, eg. SC for EVGA, AMP for Zotac, etc., but of course they're all 1607MHz base. Maybe they'll vary in steady state for boost clocks, but it kinda wrecks the purpose of their marketing names. :D

    Ian.

    PS. When I follow the Forums link, the UK site looks different, then reverts to its more usual layout when one logs in (weird). Also, the UK site is failing to retain the login credentials from the US transfer as it used to.
    0
  • mapesdhs
    Anonymous said:
    Well that is true and always goes without saying. You pay more for G-sync than Freesync which needs to be taken into consideration when deciding on GPUs. ...


    It's a bit odd that people are citing the monitor cost advantage of Freesync, while article reviews are not showing games actually running at frame rates which would be relevant to that technology. Or are all these Freesync buyers just using 1080p? Or much lower detail levels? I'd rather stick to 60Hz and higher quality visuals.

    Ian.
    0
  • FormatC
    @Ian:
    The typical Freesync-Buddy is playing in Wireframe-Mode at 720p ;)

    All this sync options can help to smoothen the output, if you are too sensitive. This is a fact, but not for everybody with the same prio.
    2
  • TJ Hooker
    Anonymous said:
    Chris, what is it that pummels the minimums for the 1080 Ti and Vega 64 in BF1 at 1440p? And why, when moving up to UHD, does this effect persist for the 1080 Ti but not for Vega 64?

    From other benchmarks I've seen, DX12 performance in BF1 is poor. Average FPS is a bit lower than in DX11, and minimum FPS far worse in some cases. If you're looking for BF1 performance info, I'd recommend looking for benchmarks on other sites that test in DX11.
    0
  • 10tacle
    Anonymous said:
    It's a bit odd that people are citing the monitor cost advantage of Freesync, while article reviews are not showing games actually running at frame rates which would be relevant to that technology. Or are all these Freesync buyers just using 1080p? Or much lower detail levels? I'd rather stick to 60Hz and higher quality visuals.


    Well I'm not sure I understand your point. The benchmarks show FPS exceeding 60FPS, meaning maximum GPU performance. It's about matching monitor refresh rate (Hz) to FPS for smooth gameplay, not just raw FPS. But regarding the Freesync argument, that's usually what is brought up in price comparisons between AMD and Nvidia. If someone is looking to upgrade from both a 60Hz monitor and a GPU, then it's a valid point.

    However, as I stated, if someone already has a 60Hz 2560x1440 or one of those ultrawide monitors, then the argument for Vega gets much weaker. Especially considering their limited availability. As I posted in a link above, you can buy a nice dual fan MSI GTX 1080 for $520 on NewEgg right now. I have not seen a dual fan MSI Vega for sale anywhere (every Vega for sale I've seen is the reference blower design).
    2
  • TJ Hooker
    Anonymous said:
    It's a bit odd that people are citing the monitor cost advantage of Freesync, while article reviews are not showing games actually running at frame rates which would be relevant to that technology. Or are all these Freesync buyers just using 1080p? Or much lower detail levels? I'd rather stick to 60Hz and higher quality visuals.

    Most of the 1440p results have the 1070/Ti/Vega56 in the 60-90 fps range with high settings. What sort of framerates do you consider appropriate for adaptive sync use?
    2
  • Embra
    Are there any true DX12 games out yet? Do they not have some elements of DX11 still within?

    Curious if the Vega 64 is the air cooled or water cooled version.

    I think the 1070 Ti performs exactly as expected.
    0
  • 10tacle
    ^^If you mean exclusive to DX12, Hitman was the first native DX12 developed game. Others are Gears Of War 4, Halo 2, and Forza 7. There are probably others, but those are the only ones I know of off the top of my head. The question nobody can answer is how much DX11 remnant programming is involved in those so-called DX12 exclusive games.
    0
  • vancliff
    You forgot to mention the memory size for these cards.
    -1
  • 10tacle
    Anonymous said:
    You forgot to mention the memory size for these cards.


    Is that supposed to be a joke or are you serious? Look at the charts and GPU descriptions, specifically that "8GB" reference next to the GPU type.
    0
  • elbert
    Great review Chris. The only thing I didn't like was Nvidia's price. Nothing anyone but Nvidia can do about that tho. The 1070 Ti I feel needs to be $429, while the 1070 a more reasonable $329. While the RX 580 is matched in performance by the 1060 6GB, Nvidia needs a 2-card SLI option to offset the RX 580's one big advantage.
    1
  • awatz
    I live in the Philippines. The price of the 1070Ti is around $647 which is about the same price as most 1080s and Vega 56.

    The Galax 1080 EXOC Sniper is way cheaper at $540
    0
  • jfunk
    The whole point of Freesync / G-sync is so you can run at somewhere in between 60-144+. If you had hardware that could hit 144+ 100% of the time you wouldn't need the technology at all.

    That makes these cards literally the perfect case for 1440p adaptive sync monitors. I had to build in July 2016 due to outside circumstances so wound up with a 1070 + 1440p G-Sync monitor. But today a Vega 56 + 1440p Freesync monitor is unquestionably the better value and there's no comparison. It's $200 cheaper for basically the same results.

    If you're running 1080p, then you don't really need a 1070 or better in the first place.
    2
  • vancliff
    10Tacle in the chart it mention the data rate not the memory. Is not a joke. personal attack removed
    1
  • cangelini
    Anonymous said:
    Yaaayyy! The NDA prison has freed everyone to release their reviews! Outstanding review, Chris. This card landed exactly where it was expected to, between the 1070 and 1080. In some games it gets real close to the 1080, where in other games, the 1080 is significantly ahead. Same with comparison to the RX 56 - close in some, not so close in others. Ashes and Destiny 2 clearly favor AMD's Vega GPUs. Can we get Project Cars 2 in the mix soon?

    It's a shame the overclocking results were too inconsistent to report, but I guess that will have to wait for vendor versions to test. Also, a hat tip for using 1440p where this GPU is targeted. Now the question is what will the real world selling prices be vs. the 1080. There are $520 1080s available out there (https://www.newegg.com/Product/Product.aspx?Item=N82E16814127945), so if AIB partners get closer to the $500 pricing threshold, that will be way too close to the 1080 in pricing.


    Yeah, Igor and I were both surprised when his MSI card exceeded 2.1 GHz and mine was limited to ~2 GHz. In contrast, my 1070 Ti FE card flew right past 2.1 GHz (obviously with no extra voltage). Pretty impressive, but not necessarily consistent. I still have the overclocking charts on my workstation--we'll probably save those for individual card reviews, though, until we can figure out where retail samples are landing.
    1