AMD and Nvidia Platforms Do Battle

GeForce 8200 Graphics with PureVideo HD

GeForce 8400GS Graphics Recycled

From a feature standpoint there is little difference between AMD's 780G and the GeForce 8200/8300: both are DirectX 10.0 parts with Shader Model 4.0 support. Nvidia used the core of the GeForce 8400GS (G86) to create the new nForce, excuse me: mGPU. Like AMD's part, the GeForce 8200 and 8300 run their graphics processor at 500 MHz, but with stream processor speeds of 1.2 and 1.5 GHz, respectively.

PureVideo HD: Less CPU Load, More Power

I already mentioned that AMD's 780G required slightly less power to decode and decrypt Full HD video, adding 22 W to the system's idle power. Although our GeForce 8200 test system, based on an ASRock K10N78FullHD hSLI, was slightly more efficient at idle, playing Full HD video added 25 W to its 58 W idle power. However, Nvidia's CPU utilization was slightly lower, averaging 34% during Full HD playback compared to AMD's 35%. Clearly, the difference is small.
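To put the figures side by side, here is a minimal back-of-the-envelope sketch using only the numbers quoted above; the 780G system's absolute idle draw is not repeated in this section, so only the playback deltas are compared directly.

    # Power figures as reported in the text (watts)
    nvidia_idle_w = 58            # GeForce 8200 system idle
    nvidia_playback_delta_w = 25  # added by Full HD playback on the 8200
    amd_playback_delta_w = 22     # added by Full HD playback on the 780G

    # Total draw of the GeForce 8200 system during playback: 58 + 25 = 83 W
    nvidia_playback_w = nvidia_idle_w + nvidia_playback_delta_w
    print(f"GeForce 8200 system under Full HD playback: {nvidia_playback_w} W")

    # The 780G adds 3 W less than the 8200 when decoding the same material
    print(f"780G playback delta advantage: {nvidia_playback_delta_w - amd_playback_delta_w} W")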

Display Options

While AMD's 780G supports HDMI 1.3a and even DisplayPort, Nvidia hasn't implemented Dolby TrueHD or DTS-HD audio. Still, the platform supports HDMI 1.3a and HDCP over DVI-D or HDMI. However, as with AMD, you cannot drive two digital display outputs at the same time, since the two display controllers serve the D-Sub port and one of the digital outputs. The display options get more complex once you add a discrete GeForce graphics card.