
Benchmark Results: Power, Analyzed

Can Lucidlogix Right Sandy Bridge’s Wrongs? Virtu, Previewed

Lucidlogix is touting power savings as one of the reasons to use Virtu, and that's one of the biggest reasons it uses Intel's integrated graphics engine as the native adapter rather than the discrete card. Purportedly, it's possible to power down the add-in card, saving energy when it's not needed.

Our power test shows that isn’t necessarily the case in this version of the software, though. Charting a run of the same Death Race trailer transcode, Intel’s HD Graphics 3000 on its own uses the least amount of power. Drop in a GeForce GTX 580, which just sits there idling while Quick Sync does its work, and power consumption jumps up by about 40 W.

Disable Virtu and use the GeForce card on its own, employing CUDA-accelerated transcoding, and power peaks even higher. Not only that, but the workload takes more than twice as long to complete, too. And, as a final nail in the coffin, transcode quality is substandard (for more on transcoding quality, check out Video Transcoding Examined: AMD, Intel, And Nvidia In-Depth).

To reaffirm our results, I let all three configurations rest at idle to see if Virtu would turn off the GTX 580.

Power Consumption At Idle (Watts)

HD Graphics 3000 + GeForce GTX 580: 112 W
HD Graphics 3000: 49 W
GeForce GTX 580: 108 W

The Sandy Bridge-based configuration, sans GeForce GTX 580, delicately sips 49 W. With the HD Graphics 3000 core disabled and a GeForce GTX 580 driving our display, I recorded 108 W of power use. Then, with Virtu enabled and the two display adapters idling along, power consumption jumped to 112 W.

Hopefully we’ll see a future revision of Virtu that follows through on the promises of power reduction. For now, I don’t see this as an issue on the desktop. It’d be more of a concern in the mobile space.
