Nvidia GeForce GTX 1080 Ti 11GB Review

Power Consumption

Our graphics card test setup and methodology are detailed in How We Test Graphics Cards.

At idle, the GeForce GTX 1080 Ti Founders Edition’s power consumption lands exactly where we'd expect: right around 13W.

One result that really stands out is the >30W difference in our gaming workload between the cold and fully warmed-up card. Power consumption drops considerably once the 1080 Ti reaches its full operating temperature, which means it's hitting a thermal limit and being slowed down by GPU Boost.

To back up this claim, we recorded the temperature and clock rate during warm-up and plotted them on the same graph:

Our stress test yields the same drop in frequency, though it's less pronounced due to the much higher load.
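This temperature-driven behavior can be sketched with a toy control loop. To be clear, this is only a rough illustration, not Nvidia's actual GPU Boost implementation: the heating and cooling coefficients are our stand-in numbers, though GPU Boost 3.0 does adjust clocks in roughly 13 MHz bins.

```python
# Toy model of temperature-driven clock throttling in the spirit of
# GPU Boost: the card holds its highest boost bin while cold, then
# sheds 13 MHz bins as it bumps into its temperature target.
# All parameters are illustrative, not Nvidia's real algorithm.

def simulate_warmup(boost_clock=1800, base_clock=1480,
                    temp_target=84.0, ambient=22.0, steps=400):
    """Return (temps, clocks) sampled over a simulated warm-up."""
    clock, temp = boost_clock, ambient
    temps, clocks = [], []
    for _ in range(steps):
        # Heating scales with clock; cooling with the delta to ambient.
        temp += 0.045 * clock / 100.0 - 0.012 * (temp - ambient)
        if temp >= temp_target:
            clock = max(base_clock, clock - 13)   # throttle one bin
        elif temp < temp_target - 1.0:
            clock = min(boost_clock, clock + 13)  # recover one bin
        temps.append(temp)
        clocks.append(clock)
    return temps, clocks

temps, clocks = simulate_warmup()
print(f"cold clock: {clocks[0]} MHz, warm clock: {clocks[-1]} MHz")
print(f"warm temperature: {temps[-1]:.1f} °C")
```

Even this crude model reproduces the shape of our measurements: a high clock while cold, then a lower, noisier clock once the card settles at its temperature target.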

How do these results compare to Nvidia's Titan X (Pascal)? A look at our library of data yields some interesting findings.

The GeForce GTX 1080 Ti's average clock rate is a tad higher than the Titan X's once both cards reach their peak operating temperature. However, the 1080 Ti gets there faster, giving the Titan X an advantage early on.

Also interesting is that the Titan X's curve is a lot smoother than the 1080 Ti's, which bounces around more erratically.

The power consumption curves are graphed below, and we start by comparing the cold and warmed-up card:

Notice the same large difference in power consumption between the card's two temperature states.

We documented the GeForce GTX 1080 Ti's power consumption and current exclusively at its operating temperature because the two maximum values ended up being almost identical. This is due to the power target of 250W imposing a hard limit, which Nvidia manages to hit very well.
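In other words, a hard power target acts as a clamp: however much the GPU would like to draw, the reading tops out at the limit, so the cold and warm maxima converge. A minimal sketch (the instantaneous demand figures below are hypothetical):

```python
POWER_TARGET_W = 250  # GTX 1080 Ti Founders Edition default power target

def capped_power(demand_w, target_w=POWER_TARGET_W):
    """Power actually drawn when a hard power target is enforced."""
    return min(demand_w, target_w)

# Hypothetical instantaneous demand, cold vs. warm card: both clip
# to the same 250W ceiling, so the two maxima end up nearly identical.
for label, demand in (("cold", 268.0), ("warm", 262.0)):
    print(f"{label}: demand {demand:.0f} W -> reading {capped_power(demand):.0f} W")
```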

Adherence To Specifications

Ever since the launch of AMD's Radeon RX 480, we've looked closely at every new card's adherence to the PCI-SIG's specifications. Nvidia's GeForce GTX 1080 Ti Founders Edition is no exception, and it passes our test with flying colors. It doesn't use the 3.3V rail at all; only the 12V rails.

Our readings put the motherboard slot's 12V rail at approximately 4.4A. Given a ceiling of 5.5A, the card has plenty of room to spare.
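That headroom is simple back-of-the-envelope arithmetic. Assuming the PCI-SIG's 5.5A ceiling for the x16 slot's 12V rail, the numbers work out as follows:

```python
SLOT_12V_LIMIT_A = 5.5  # PCIe CEM spec: max current on the slot's 12V rail
RAIL_VOLTAGE_V = 12.0

measured_a = 4.4  # our reading for the GTX 1080 Ti Founders Edition

slot_power_w = measured_a * RAIL_VOLTAGE_V    # watts drawn through the slot
limit_w = SLOT_12V_LIMIT_A * RAIL_VOLTAGE_V   # watts the spec allows
headroom_a = SLOT_12V_LIMIT_A - measured_a

print(f"slot draw: {slot_power_w:.1f} W of {limit_w:.0f} W allowed")
print(f"current headroom: {headroom_a:.1f} A ({headroom_a / SLOT_12V_LIMIT_A:.0%})")
```

At roughly 53W of the 66W allowed, the card leaves about 20% of the slot's 12V budget untouched.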

    Top Comments
  • dstarr3
    Oh my. I've got a 980 Ti now, and I thought I could hold out until Christmas 2018 or so to upgrade, but seeing that this card has nearly double the FPS... That's a pretty big deal...
    12
  • Other Comments
  • HaB1971
    Would love one, but it's pointless for 1080p gaming, which is what I'm restricted to thanks to 2 x 27-inch 1080p monitors. I don't need to replace those either; they work, they're good enough, and I'm not interested in VR etc. 4K, for me, is still too pricey.
    0
  • sillynilly
    So sexy and will be replacing my 1080. Amazing card
    1
  • salgado18
    Anonymous said:
    Oh my. I've got a 980 Ti now, and I thought I could hold out until Christmas 2018 or so to upgrade, but seeing that this card has nearly double the FPS... That's a pretty big deal...


    Why not wait? Your card is still great, and you can pick up a 2080 or Vega 2 by then. Unless you can't live without 4K at Ultra, keep your card.
    3
  • Ray_58
    HaB1971, 4K really isn't that bad price-wise. Part of the problem is the LCD panel industry still milking 1080p resolutions up to the $300 price point, when in actuality we should have had base-standard 2K TN/IPS panels in the $160-$300 range. Still, the greatest costs today are the fact that 60Hz remains standard, any increase means a massive price jump, and obviously G-Sync for Nvidia. Spending $500-$800 on a monitor and then dropping $700 on this is a bit too much for the mainstream. I'd rather buy a 2K IPS screen with G-Sync at $700 than a 4K 60Hz monitor at $400.
    3
  • t1gran
    Why is Titan X (Pascal) performance worse here than it was in its review?
    0
  • envy14tpe
    Anonymous said:
    Oh my. I've got a 980 Ti now, and I thought I could hold out until Christmas 2018 or so to upgrade, but seeing that this card has nearly double the FPS... That's a pretty big deal...


    I feel the same. I see the jump in BF1 to be massive and enough to warrant a 1080 Ti for 1440p gaming. I'm holding out until June when the next Nvidia price drops.
    0
  • FormatC
    Teaser:
    Just started to bring this baby under water. Let's check what OC is doing :)

    4
  • dstarr3
    Anonymous said:
    Anonymous said:
    Oh my. I've got a 980 Ti now, and I thought I could hold out until Christmas 2018 or so to upgrade, but seeing that this card has nearly double the FPS... That's a pretty big deal...


    Why not wait? Your card is still great, and you can pick up a 2080 or Vega 2 by then. Unless you can't live without 4K at Ultra, keep your card.


    Well, two reasons: 1) I'd like to upgrade to 1440p/144 this year. 2) I also have an HTPC with a 770 in it that needs an upgrade. I was considering buying a 1060 for that computer, but instead, I might just buy this 1080 Ti for my main rig and put its 980 Ti in the HTPC.

    Either way, no purchasing until Christmas, because I hate paying full price for just about anything. So I've got time to think about this.
    1
  • Clamyboy74
    competition is brewing, good
    2
  • TheoCF
    "Ghost Recon Wildlands uses the same AnvilNext 2.0 engine as The Division, which was patched to support DX12. Ghost Recon is launching as a DirectX 11 title, though."

    Tom Clancy's The Division is actually built on Snowdrop engine. I'm almost positive that no games released on AnvilNext support DirectX12.
    0
  • cangelini
    Anonymous said:
    "Ghost Recon Wildlands uses the same AnvilNext 2.0 engine as The Division, which was patched to support DX12. Ghost Recon is launching as a DirectX 11 title, though."

    Tom Clancy's The Division is actually built on Snowdrop engine. I'm almost positive that no games released on AnvilNext support DirectX12.


    Whoops--you're right. I knew that, too. Someone else mentioned to me in the last couple of days that Division was AN 2.0, and that must have stuck for some reason. Fixed now!
    0
  • Gurg
    While I realize it isn't an actual game, it would be nice to see results for Fire Strike and Time Spy so we can compare results to SLI setups. From other sites' tests, its results obviously beat my 980 SLI setup.

    Finally, a single-GPU card that can pretty much max out games at 4K. This looks to be a terminal card solution for 4K 60Hz refresh, though with an aftermarket cooling or closed-loop solution.
    1
  • Gurg
    Anonymous said:
    So sexy and will be replacing my 1080. Amazing card


    Why not just buy a reduced price 1080 for sli?
    0
  • JackNaylorPE
    Anonymous said:
    HaB1971, 4K really isn't that bad price-wise. Part of the problem is the LCD panel industry still milking 1080p resolutions up to the $300 price point, when in actuality we should have had base-standard 2K TN/IPS panels in the $160-$300 range. Still, the greatest costs today are the fact that 60Hz remains standard, any increase means a massive price jump, and obviously G-Sync for Nvidia. Spending $500-$800 on a monitor and then dropping $700 on this is a bit too much for the mainstream. I'd rather buy a 2K IPS screen with G-Sync at $700 than a 4K 60Hz monitor at $400.


    I agree on the 2k 144 hz over 4k 60 Hz. But now with the partnership between NVidia and AU Optronics on the new 144Hz 4K gaming displays, we are looking at an advance in panel technology far above anything in recent years, at least on paper. The Acer Predator XB272-HDR and Asus ROG PG27UQ bring many changes:

    4k 144 Hz IPS
    DP 1.4
    HDR display capabilities ( HDR10)
    G-Sync HDR technology (reduces input lag with HDR gaming)
    Increased brightness, superior contrast
    Quantum Dot color filter (wide color gamut)
    90%+ DCI-P3 color space coverage (increased HDR gaming realism and color saturation).
    384 individual lighting and local dimming zones ("increased contrast precision and deep, inky blacks")
    The HDR panels in these monitors will provide 1000 nits of brightness (current 4K PC monitors can barely hit 350 nits).

    The price of 2K 144Hz IPS monitors has dropped in the past two weeks, my guess partly in response to the announcement of these new panels, just as the 1080 Ti has pushed down 1080 pricing. So far, cost projections have ranged from $950 to $1,200, which doesn't seem like a big premium over the $700-$799 2K models given the increased resolution and feature set.


    Quote:
    Why not just buy a reduced price 1080 for sli?


    Unlike previous generations, where scaling averaged 70%, and 96% to >100% on the more demanding games, SLI scaling averages only 18% at 1080p and 30% at 2K.

    Can't wait for the reviews of the **real** 1080 Tis... the ones that don't throttle, with better VRMs, chokes, etc., to see what these extras can deliver.
    0
  • JackNaylorPE
    As to the card, I wouldn't be anxious to run out and buy an FE card, as throttling is again severe, resulting in clock reduction from 1850 MHz down to bouncing between 1600 MHz and 1700 MHz.



    Overclocking is a bit disappointing at just 15.7% (with 120% power limit, 90°C max temp, 100% fan speed)... another reason to wait for the AIB cards.
    1
  • rrahel
    http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-ti,4164-4.html

    In this 980 Ti review, GTA V was running at 40 FPS @ 4K; now, with the release of new cards, the 980 Ti is running GTA V at 30 FPS @ 4K and the 1080 is running it at 40 FPS.
    -1
  • FormatC
    Other settings ;)
    0
  • FormatC
    Anonymous said:
    Overclocking is a bit disappointing at just 15.7% (with 120% power limit, 90°C max temp, 100% fan speed)... another reason to wait for the AIB cards.
    This is exactly the reason why we haven't published any OC results. It is simply pointless to try this, and the reason for my follow-up with a water cooler (seen in my teaser before).
    0
  • cangelini
    Anonymous said:
    Other settings ;)


    Yup--check out the benchmark description. We changed some of the settings to make the workload more demanding.
    0