Top to bottom: GeForce 8800 GTS by Zotac, GeForce 7800 GT by Asus, GeForce 6800 GT by Nvidia (reference board).
We decided to go back in time and use three different graphics card generations to compare their impact on total system power requirements. The first card we used is an upper mainstream GeForce 8800 GTS by Zotac. The 320 MB versions start at less than $300 and provide excellent performance and graphics features thanks to DirectX 10 support. But the GeForce 8 series is a power hog as well: the test system consumed at least 143 W when idle with this graphics card. Running 3DMark06 pushed system power requirements to almost 250 W, an increase of more than 100 W to display sophisticated 3D graphics.
If we bring the GeForce 7 into the game, we can't support DirectX 10 graphics, but certainly DirectX 9.0c with Shader Model 3, which remains adequate at least for the next few months. Performance of the GeForce 7800 GT clearly cannot keep pace with that of the GeForce 8, but power requirements decrease considerably: 103 W instead of 143 W is 72% of the initial system power requirement, a 28% decrease in idle power draw. The maximum power draw under load decreases from 248 W to 184 W, which is 74% of the requirement of our GeForce 8 test system, or a 26% decrease. From an energy efficiency standpoint, the GeForce 8 is a horrible choice if you already have a GeForce 7 class graphics card, as long as you don't need support for DirectX 10 and don't insist on maximum performance.
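The percentage comparisons above follow directly from the measured wattages. A minimal sketch of the math, using the figures from our measurements (variable names are ours, purely illustrative):

```python
# System power measurements from the text, in watts
gf8_idle, gf8_load = 143, 248   # GeForce 8800 GTS system
gf7_idle, gf7_load = 103, 184   # GeForce 7800 GT system

def relative(new, old):
    """Return (new as a percentage of old, percent decrease from old to new)."""
    return new / old * 100, (old - new) / old * 100

idle_share, idle_drop = relative(gf7_idle, gf8_idle)
load_share, load_drop = relative(gf7_load, gf8_load)

print(f"Idle: {idle_share:.0f}% of the GeForce 8 system's draw, {idle_drop:.0f}% lower")
print(f"Load: {load_share:.0f}% of the GeForce 8 system's draw, {load_drop:.0f}% lower")
# → Idle: 72% of the GeForce 8 system's draw, 28% lower
# → Load: 74% of the GeForce 8 system's draw, 26% lower
```

Note that the decrease is measured against the larger (GeForce 8) figure; measured the other way, the GeForce 8 system draws roughly 39% more at idle and 35% more under load than the GeForce 7 system.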
Finally, let's look at the GeForce 6800 GT. We were amazed to see that the GeForce 7 is both faster and more energy efficient: the GeForce 6800 GT requires 117 W at idle versus the 7800 GT's 103 W, and 189 W under load versus 184 W. We haven't looked at ATI graphics cards due to time constraints, but judging by the Radeon HD 2900, ATI's DirectX 10 hardware also requires considerably more energy than its predecessors, the Radeon X1800/X1900 and Radeon X850.