Update: Nvidia Titan X Pascal 12GB Review

Power Consumption Results

Measurement Methodology & Graphical Illustration

The measurement and analysis software we're gradually transitioning to, PresentMon, integrates a whole host of sensor data with the frame time measurements. This allows us to chart, in real time, how individual characteristics of a graphics card's performance (like temperature) influence others (such as power consumption). We're including the reader-friendly versions of our oscilloscope measurement graphs as well, of course.

The measurement intervals are twice as long as before. There's also a hardware-based low-pass filter and a software-based variable filter in place (the latter is a feature of the analysis software; it's designed to evaluate the plausibility of very short load peaks and valleys). The resulting curves are a lot smoother than the old ones; we hope you derive more value from them as a result.
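The filter's exact implementation isn't published here, but the idea is easy to sketch. The Python snippet below is a minimal illustration of such a plausibility filter, not our actual analysis code; the sample rate, deviation threshold, and 1 ms minimum peak duration are assumptions chosen for the example.

```python
# Illustrative plausibility filter: suppress implausibly short load peaks
# and valleys, then smooth the trace with a moving average. All constants
# here are assumptions for the example, not the lab's real parameters.
import numpy as np

def plausibility_filter(samples: np.ndarray, sample_rate_hz: float,
                        min_duration_s: float = 1e-3,
                        window_s: float = 5e-3) -> np.ndarray:
    filtered = samples.astype(float).copy()
    baseline = np.median(samples)
    threshold = 1.5 * np.std(samples)
    min_len = max(1, int(min_duration_s * sample_rate_hz))

    # Find runs of samples that deviate strongly from the baseline.
    deviant = np.abs(samples - baseline) > threshold
    run_start = None
    for i, flag in enumerate(np.append(deviant, False)):
        if flag and run_start is None:
            run_start = i
        elif not flag and run_start is not None:
            if i - run_start < min_len:           # too short to be plausible
                filtered[run_start:i] = baseline  # replace with baseline level
            run_start = None

    # Moving-average low-pass for the final, smoother curve.
    win = max(1, int(window_s * sample_rate_hz))
    return np.convolve(filtered, np.ones(win) / win, mode="same")
```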

Power consumption is measured according to the processes outlined in The Math Behind GPU Power Consumption And PSUs.

You'll find a larger number of bar graphs, along with higher-resolution versions of our power consumption charts that you can expand by clicking on them. We restructured our topic sections, added more comparison bar graphs, and added different scenarios to our measurements. In addition to power consumption, we also examine current to determine whether the graphics card stays within all of its relevant specifications. Our test equipment doesn't change, though:

Power Consumption Measurement

Test Method:
- Contact-free DC Measurement at PCIe Slot (Using a Riser Card)
- Contact-free DC Measurement at External Auxiliary Power Supply Cable
- Direct Voltage Measurement at Power Supply

Test Equipment:
- 2 x Rohde & Schwarz HMO 3054, 500 MHz Digital Multi-Channel Oscilloscope with Storage Function
- 4 x Rohde & Schwarz HZO50 Current Probes (1 mA - 30 A, 100 kHz, DC)
- 4 x Rohde & Schwarz HZ355 Probes (10:1, 500 MHz)
- 1 x Rohde & Schwarz HMC 8012 Digital Multimeter with Storage Function
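For readers curious how the probe traces turn into the wattage figures below: each rail's power is simply the product of its synchronized voltage and current samples, averaged over the measurement window, and the board total is the sum across rails. Here's a minimal sketch of that arithmetic; the synthetic traces and numbers are illustrative stand-ins, not exported scope data.

```python
# Average power on one rail from synchronized V(t) and I(t) traces,
# demonstrated with synthetic data standing in for scope exports.
import numpy as np

def rail_power(voltage_v: np.ndarray, current_a: np.ndarray) -> float:
    """Mean of the instantaneous products v(t) * i(t)."""
    return float(np.mean(voltage_v * current_a))

t = np.linspace(0.0, 1.0, 100_000)                 # one second of samples
v_12 = 12.0 + 0.05 * np.sin(2 * np.pi * 120 * t)   # 12 V rail with ripple
i_12 = 4.0 + 0.3 * np.random.randn(t.size)         # noisy ~4 A load current

print(f"12 V rail: {rail_power(v_12, i_12):.1f} W")  # roughly 48 W
# Board total = slot rails (12 V and 3.3 V) + each auxiliary cable.
```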

Power Consumption and Temperature Problems

We already mentioned the fascinating new capabilities of our measurement and analysis software. This is why we’re jumping ahead a bit and presenting the card’s power consumption during our Doom benchmark in relation to its temperature. Without thermal throttling, power consumption would land at almost 249 W, regardless of whether we tested at QHD or UHD. We chose to conduct our temperature-related measurements at the lower resolution, since it allows the card to warm up a bit more slowly, making the curves a bit easier to interpret.

The new Titan X hits its thermal limit after only two minutes of full load. Consequently, we set the fan speed to 75% to keep thermal throttling from skewing our measurement results. Otherwise, power consumption would have dropped by approximately 30 W to under 220 W during the gaming loop!
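You can reproduce a rough version of this power-versus-temperature relationship at home using the card's own telemetry; polling nvidia-smi once per second while a game runs is enough. The script below is an illustrative sketch that reads the GPU's internal sensors rather than our external probe setup, and the duration and file name are arbitrary choices.

```python
# Log GPU power draw and temperature once per second for ten minutes.
import csv
import subprocess
import time

with open("power_temp_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "power_w", "temp_c"])
    start = time.time()
    for _ in range(600):
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=power.draw,temperature.gpu",
             "--format=csv,noheader,nounits"],
            text=True,
        ).strip()
        power_w, temp_c = (field.strip() for field in out.split(","))
        writer.writerow([f"{time.time() - start:.1f}", power_w, temp_c])
        time.sleep(1.0)
```

Plot the two columns against elapsed time and you should see power sag as temperature approaches the throttle point, just as our curves show.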

Power Consumption at Different Loads

In addition to the usual benchmarks, we include a few different games that rely on a variety of rendering paths and graphics settings. Doom at 4K with TSSAA (8TX) proves to be the most challenging benchmark, pushing the card to 249 W, just shy of its 250 W power limit. The torture loop tops that number with 252 W. However, there aren't many enthusiasts who enjoy playing FurMark.

The lowest power consumption during the gaming loop also comes from Doom, though for that result the game is set to OpenGL with TXAA (1TX). All of the other games fall somewhere between the two extremes. As you can see, Metro: Last Light loses its position as the highest-power title in our suite.

The gray bar represents power consumption based on those load peaks that made it through our filters to the smoother curve. That bar doesn't have much practical significance, since the peaks we measured are too brief to matter (and the shortest ones were already filtered out by this point).

Power Connector Load Distribution

The following chart looks at how the load is distributed across the power rails during a taxing, but realistic, gaming load and during our stress test. What's important is that the overall load is balanced well between the motherboard slot and the auxiliary power connectors. That's certainly the case here; Nvidia's Titan X draws less than 55 W via the motherboard's slot.

Here are the corresponding graphs for gaming and our stress test. Click on them for a larger version.

The PCI-SIG's specifications apply to current rather than power, so power consumption results on their own don't tell the whole story. Our readings put the motherboard slot significantly below 4.5 A. Given a ceiling of 5.5 A, this is most certainly on the safe side, with lots of room to spare. The result is hardly surprising in light of our low power consumption measurements for this connector.
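A quick back-of-the-envelope conversion ties the power and current readings together, assuming the slot's high-power draw sits almost entirely on the 12 V rail (the 3.3 V contribution is small):

```python
# Convert the measured slot power ceiling into current on the 12 V rail
# and compare it to the PCI-SIG limit. The 12 V assumption is ours.
slot_power_w = 55.0      # upper bound reported in the charts above
rail_voltage_v = 12.0
pci_sig_limit_a = 5.5    # current ceiling for the slot's 12 V supply

current_a = slot_power_w / rail_voltage_v
headroom_pct = (1.0 - current_a / pci_sig_limit_a) * 100.0
print(f"{current_a:.2f} A drawn, {headroom_pct:.0f}% below the limit")
# -> 4.58 A drawn, 17% below the limit
```

At the 55 W ceiling that works out to roughly 4.6 A, and since part of the slot draw rides on the 3.3 V rail, the 12 V current lands even lower, which lines up with the sub-4.5 A readings above.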

Of course, there are larger graphs for the current measurements as well.

Power Consumption Comparison with Other Graphics Cards

Finally, we'd like to know how the Titan X with its GP102 GPU stacks up against other graphics cards. We're using peak power consumption numbers for this comparison, since that's how our previous results were presented.

Nvidia stays true to form and sets a hard power target of 250 W. The card’s performance could be increased tremendously by getting rid of that cap. Unfortunately, our German lab doesn't have a second sample of the card, so we can’t run the usual overclocking tests or reconfigure the card for our customary water cooling setup.

MORE: Best Graphics Cards

MORE: Desktop GPU Performance Hierarchy Table

MORE: All Graphics Content

Comments
  • chuckydb
    Well, the thermal throttling was to be expected with such a useless cooler, but that should not be an issue. If you are spending this much on a GPU, you should water-cool it!!! Problem solved.
  • Jeff Fx
    I might spend $1,200 on a Titan X, because between 4K gaming and VR I'll get a lot of use out of it, but they don't seem to be available at anything close to that price at this time.

    Any word when we can get these at $1,200 or less?

    I wish I was confident that we'd get good SLI support in VR, so I could just get a pair of 1080s, but I've had so many problems in the past with SLI in 3D that getting the fastest single-card solution available seems like the best choice to me.
  • ingtar33
    $1200 for a GPU which temp throttles under load? THG, you guys raked AMD over the coals for this type of nonsense, and that was on a $500 card at the time.
  • Sakkura
    Interesting to see how the Titan X turned into an R9 Nano in your anechoic chamber. :D

    As for the Titan X, that cooler just isn't good enough. Not sure I agree that memory modules running at 90 degrees C is "well below" the manufacturer's limit of 95 degrees C. What if your ambient temperature is 5 or 10 degrees higher?
  • hannibal
    No problem, the card will throttle down even more in those cases...
  • hotroderx
    Basically the card's just one giant cash grab... I am shocked Tom's isn't denouncing this card! I could just see it if Intel rated a CPU at 6GHz for the first 10 seconds it was running, then throttled it back to something more manageable! But for those 10 seconds you had the world's fastest retail CPU.
  • tamalero
    Does this mean there will be a GP101 with all cores enabled later on? As in a Ti version?
  • hannibal
    Titan X Ti... No, the 1080 Ti is a cut-down version. Most full chips will go to professional cards, and maybe we will see a Titan Z later...
  • blazorthon
    An extra $200 for a gimped cooler makes for a disappointing addition to the Titan cards.
  • Sakkura
    tamalero said:
    Does this mean there will be a GP101 with all cores enabled later on? As in a Ti version?


    No. The same chip with all the cores enabled would still be the same chip. However, it does mean there could eventually be a new Titan card with a fully enabled GP102. The same way the original Titan was succeeded by the Titan Black.
  • filippi
    This was just the final step before the GPU we actually want to see...
  • xapoc
    If this were only $800...
  • Sammy10
    And $1200 later, Nvidia did not find it in their heart to toss a hybrid cooler on top of that baby! CheapoVidia.
  • DeerSpotter
    I repeat, will it play Crysis?
  • Sammy10
    DeerSpotter said:
    I repeat, will it play Crysis?


    For that you need the glorious GTX 480.
  • cub_fanatic
    Lol @ reviewing this card like it actually is a gaming GPU, complete with several game results comparing it to $400-$450 cards like the Fury and GTX 1070. I wonder how many people actually buy Titan series cards with the sole purpose of playing video games, and how that figure compares to the number of people who buy Titan cards for non-gaming applications. When the first Titan came out, it seemed like it was more of a budget Quadro instead of an ultra high end gaming part. Now, it feels like the Titan's sole purpose is to get a few hundred extra bucks out of the wallets of impatient high-end gamers in between the release of the GTX x80 and the GTX x80 Ti cards. Once they sell enough of these Titans to those people, they'll release a GTX 1080 Ti that might have a few GB less VRAM and maybe less FP64 performance, but the same CUDA cores and everything else, and which games just as well as the Titan X for hundreds of dollars less. Once the 1080 Ti is out, nobody would see a reason to buy a Titan X if all you are doing is gaming. It is a pretty smart business move by Nvidia.
  • Sakkura
    cub_fanatic said:
    Once they sell enough of these Titans to those people, they'll release a GTX 1080 Ti that might have a few GB less VRAM and maybe less FP64 performance, but the same CUDA cores and everything else, and which games just as well as the Titan X for hundreds of dollars less. Once the 1080 Ti is out, nobody would see a reason to buy a Titan X if all you are doing is gaming. It is a pretty smart business move by Nvidia.


    The Titan X already has the FP64 performance nerfed into the ground. A lowly R9 280X would crush a Titan X in FP64 performance. On paper, even the old Radeon HD 5870 would beat the Titan X.
  • _MOJO_
    This new architecture, which definitely delivers, is so ludicrously expensive. I paid close to $600 for a GTX 580 a few years ago. That was a good investment in hindsight, since I made half of it back several years later, but these prices are insane, especially considering the price-to-performance.

    $1200 to play 4K at 60 fps? My 4GB 980 still plays everything I want beautifully at 1440p. I just cannot wrap my mind around this yet, not at that premium price. I'll wait for more games to come out, the evolution of VR, and the prices on these cards to drop.
  • tps3443
    Save up your money... Buy a graphics card when it is first released, and enjoy it! The GTX 980 had a life cycle of nearly two years before the GTX 1080 was released. And the GTX 980 is still fast for gaming, especially once overclocked!

    I love my Nvidia GTX 1080 Founders Edition! I've adjusted the default fan profile a little and send some airflow its way from my case fans. I overclocked it to 2128/11,400 memory, and it is a screaming demon! I plan to use it for at least another 18 months.

    I can play 4K, and enjoy it, with a very smooth experience!

    The Titan X Pascal is great! It is 15-30% faster than a GTX 1080. But you've gotta pay to play! I could hardly afford my GTX 1080. If I could afford a Titan P, I would buy it in a second!

    Bear in mind, though, that you can overclock a GTX 1080 Founders Edition to roughly 20% more performance out of the box.

    Happy life, happy gaming, happy overclocking, this is what it's all about people!