Power Consumption
Slowly but surely, we’re spinning up multiple Tom’s Hardware labs with Cybenetics’ Powenetics hardware/software solution for accurately measuring power consumption.
Powenetics, In Depth
For a closer look at our U.S. lab’s power consumption measurement platform, check out Powenetics: A Better Way To Measure Power Draw for CPUs, GPUs & Storage.
In brief, Powenetics uses Tinkerforge Master Bricks with Voltage/Current bricklets attached. The bricklets sit between the power supply and the load, monitoring consumption through each of the modified PSU’s auxiliary power connectors and through the PCIe slot by way of a riser card. Custom software logs the readings, letting us dial in a sampling rate, pull the data into Excel, and accurately chart everything from average power across a benchmark run to instantaneous spikes.
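As a rough sketch of what that post-processing step involves, the snippet below averages a logged voltage/current trace for one rail and picks out the peak. The data layout and numbers are hypothetical; the real Powenetics software logs per-rail samples at a configurable rate.

```python
# Sketch of summarizing logged per-rail samples (hypothetical data layout,
# not the actual Powenetics log format).

def summarize(samples):
    """samples: list of (volts, amps) tuples for one rail."""
    powers = [v * a for v, a in samples]
    avg_w = sum(powers) / len(powers)
    peak_w = max(powers)  # catches instantaneous spikes the average hides
    return avg_w, peak_w

# Example: three samples on one 12V rail (made-up values)
rail = [(12.1, 6.0), (12.0, 7.5), (11.9, 6.5)]
avg_w, peak_w = summarize(rail)
```

Logging both the average and the peak is what lets a reviewer distinguish a card's sustained power draw from the short transient spikes that can trip a marginal power supply.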
The software is set up to log the power consumption of graphics cards, storage devices, and CPUs. However, we’re only using the bricklets relevant to graphics card testing. Gigabyte's Aorus GeForce RTX 2080 Ti Xtreme 11G gets all of its power from the PCIe slot and a pair of eight-pin PCIe connectors. Should higher-end 2080 Ti boards need three auxiliary power connectors, we can support them, too.
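Since the card draws from the slot plus two eight-pin connectors, total board power is just the sum across those rails. A minimal illustration, using assumed per-rail averages (the real split comes from the riser and connector bricklets):

```python
# Hypothetical per-rail average readings in watts; total board power is the
# sum over the PCIe slot and both eight-pin auxiliary connectors.
rails = {
    "pcie_slot": 48.0,   # assumed value, measured via the PCIe riser
    "aux_8pin_1": 127.0,
    "aux_8pin_2": 127.0,
}
total_w = sum(rails.values())
```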
Idle
An average power reading of just under 18W is a little higher than what we measured from the GeForce RTX 2080 Ti Founders Edition. Then again, the Aorus GeForce RTX 2080 Ti Xtreme 11G does have an extra fan, plus a bunch of lighting that the FE board lacks.
It’s also worth noting that Gigabyte offers semi-passive functionality, which cuts power consumption slightly. But we prefer a bit of active cooling, even at idle load levels, so we test with the fans spinning.
Gaming
Running the Metro: Last Light benchmark at 2560 x 1440 with SSAA enabled pushes the GeForce RTX 2080 Ti to its limit, yielding an average power measurement of 302W. Most of that power is delivered evenly across the two eight-pin auxiliary connectors.
We pulled most of the lower-end cards out of our comparison chart since they use quite a bit less power than the GeForce RTX 2080 Ti Xtreme 11G. However, AMD’s reference Radeon RX Vega 64 remains. Although it’s much slower through our benchmark suite, we can see that AMD’s flagship tries to maintain a similar power target.
Recording current through three runs of the Metro: Last Light benchmark naturally gives us a line chart that resembles the power consumption results. Still, breaking the data down this way shows that the PCIe slot hovers around 4A, well under its 5.5A ceiling.
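The slot-current check above amounts to comparing the logged trace against the PCI-SIG's 5.5A limit for the 12V pins of an x16 slot. A minimal sketch of that check, with a made-up current trace:

```python
# The PCI-SIG caps 12V current through the x16 slot at 5.5 A.
# Sketch of validating a logged current trace against that ceiling
# (hypothetical sample data).
PCIE_SLOT_12V_LIMIT_A = 5.5

def slot_within_spec(amps_trace, limit=PCIE_SLOT_12V_LIMIT_A):
    """True if the slot's 12V current never exceeds the spec limit."""
    return max(amps_trace) <= limit

trace = [3.9, 4.1, 4.0, 4.2]  # amps, hovering around 4 A as in our results
ok = slot_within_spec(trace)
```

Cards that exceed this limit (as some past designs have) shift stress onto the motherboard's slot traces, which is why we chart slot current separately from total power.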
FurMark
Power consumption under FurMark isn’t much higher than our gaming workload. There is a slightly higher peak reading, and one of the eight-pin connectors shoulders more of the task than before. Still, an average of 303W shows that Gigabyte has its Aorus GeForce RTX 2080 Ti Xtreme 11G capped. Once that power ceiling is hit, voltage and frequency are scaled back.
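The capping behavior described above, backing off voltage and frequency once the power ceiling is hit, can be illustrated with a toy governor loop. This is a deliberately simplified sketch with invented clock steps and thresholds; real GPU power management is far more sophisticated:

```python
# Toy illustration of a board power limiter: when measured power exceeds the
# cap, step the clock (and with it, voltage) down; with enough headroom,
# step back up. All values here are hypothetical.
POWER_CAP_W = 303.0
CLOCK_STEPS_MHZ = [1350, 1500, 1650, 1800, 1950]

def next_clock(current_idx, measured_w):
    """Return the index of the next clock step given a power reading."""
    if measured_w > POWER_CAP_W and current_idx > 0:
        return current_idx - 1   # over the cap: throttle down
    if measured_w < POWER_CAP_W * 0.95 and current_idx < len(CLOCK_STEPS_MHZ) - 1:
        return current_idx + 1   # comfortable headroom: boost
    return current_idx           # near the cap: hold steady

idx = next_clock(4, 310.0)  # over the cap at the top step -> back off
```

This is why a power-virus workload like FurMark doesn't draw dramatically more than a demanding game: the limiter clamps both to the same ceiling, and only clocks differ.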
Maximum utilization yields a nice, even line chart as we track ~10 minutes under FurMark.
Tracking power consumption over time in FurMark doesn’t look much different from what we saw under Metro: Last Light. The Aorus GeForce RTX 2080 Ti Xtreme 11G holds steady around 300W. AMD’s Radeon RX Vega 64 tries to maintain the same power level, but has to switch between throttle states after a few minutes. Even a GeForce GTX 1080 Founders Edition starts acting up after a while.
Current draw over the PCIe x16 slot is slightly higher than 4A, and well under the 5.5A ceiling defined by the PCI-SIG.
Reader Comments
I am not sure where they get their pricing, but they did the same with the RX 590 review, using an overpriced 1060. I think it's a site algorithm that pulls prices and not a person, as any normal person can easily find better deals.
It is insane that you guys are now accepting Nvidia's price-gouging behavior as something normal... IT IS NOT!
This card offers the worst value ever.
About ready to retire this site for good at this point; first the "just buy it because there is a price to NOT be an early adopter" tripe, then a 4.5/5.0 for any NVIDIA RTX product is just a slap in the face of the site's readers.
Clearly this is about the ad bucks, not sure how your reviewers can look themselves in the mirror anymore.
I'll say pass to shitty RTX. I cannot believe how people could pay $1,300 for a gimmick like rays, a gimmick that you will find in 0.0001% of the time played in the two existing games. It's like getting the gold frame. I would have preferred they invested in better, more advanced physics and marketed that.
Not to mention, the current gen of RTX is only optimized for 1080p!
RTX Off: too powerful for 1080p. Ideal for 1440p and 4K.
RTX On: works best at 1080p. But for 1440p and 4K? NOPE.
The core feature of these (flagship) cards is useless to folks with higher-res monitors, while being too powerful without said feature for those on lower-res ones. This is why I'm passing on this gen's flagship.
It's just a bad investment all around.
And the 2060 won't fix this either:
RTX Off: great for 1080p.
RTX On: Not going to be able to run max settings like the 2070(?) and up, but hey, the overall build will be better balanced at least?
If they can keep the RTX train running, I'll hop aboard when they can do RTX On at 1440p, 100+ Hz.
It's not that the hardware isn't powerful enough; it's that the APIs are a year behind being able to render it without massive performance issues. So yes, it makes these premium cards seem gimmicky and inflated in price.
In a way they are listening, by releasing Turing cards without the RTX badge at closer-to-normal prices. The issue here is that the market is flooded, and Nvidia is going to rush out Turing with 2X the badge numbers they normally release. Glad I moved all my Nvidia stock to AMD when Ryzen launched.
This made me lol, what a time we are in.