
The Rebirth of Multi-GPU Graphics

AMD and Nvidia Platforms Do Battle

AMD/ATI's Crossfire multi-GPU graphics mode and Nvidia's equivalent, SLI, have been around for many years, but neither has had a real breakthrough in the mainstream. Although AMD and Nvidia don't like it, multi-GPU setups remain a high-end feature. Just look at how many users have either bought a high-end Crossfire or SLI system right away, or have actually purchased a second graphics card to upgrade graphics performance. Typically, it has made more sense to invest in a next-generation stand-alone graphics card instead of creating a two-way graphics solution.

However, multi-GPU graphics is in the middle of a rebirth, going all the way from the high end down to the mainstream and budget segments. The chipset solutions from AMD and Nvidia, together with the two ASRock motherboards we looked at, are perfect examples of how graphics will be handled in the future.

One Graphics Solution to Rule Them All

It makes a lot of sense to integrate graphics features into virtually all PC platforms, since displaying Windows or other types of 2D and simple 3D content is a key requirement for PCs, with very few exceptions. Current graphics units are easy to integrate, and they offer sufficient performance and a modern enough feature set to meet the requirements of 2D desktops, basic 3D animation, and video playback up to high-definition levels.

Looking at applications that aren't graphics-intensive, the integrated solutions we found in AMD's 780G and Nvidia's GeForce 8200/8300 mGPUs are suitable for almost everyone. AMD's and Nvidia's approach of turning graphics into an integral part of the chipset isn't just a temporary fad.

One Graphics Approach to Find Them

Many of you will probably object here, as gamers do want a powerful graphics solution. But that is easily accomplished: just plug a discrete graphics card into your system, and the integrated graphics unit will be disabled.

While this has been the case for many years, the approach makes much more sense now: 55 nm and 65 nm fabrication technology allows a graphics unit to be integrated into a chipset using a justifiable amount of silicon real estate, making it cheap. Power consumption has been decreasing as well. In the end, the most economical approach for chipset makers is to include graphics in all of their products and let users decide whether or not they want to use it.

One Shot to Bring Them All

AMD and Nvidia have realized that integrated graphics can actually be turned into a business advantage, as Hybrid Crossfire X and Hybrid SLI allow users to combine the graphics capabilities of an integrated solution with those of a discrete graphics card. Once customers buy a platform based on an AMD Hybrid Crossfire-enabled motherboard or one using Nvidia's Hybrid SLI approach, many will at least consider purchasing a matching upgrade card instead of getting a different card that simply shuts down the integrated graphics unit.

Combining the integrated graphics unit with a discrete graphics card allows the rendering power of both units to be used, although the combined performance does not quite reach the level of powerful discrete graphics cards. AMD leaves those high-end cards out of its hybrid option, while Nvidia instead lets its platforms shut down the discrete graphics card when it is not needed. And although the current product generation is limited to hybrid modes running the integrated unit plus one discrete graphics unit, future products could allow multi-GPU setups for 3D scenarios while using only the integrated unit for the Windows desktop and video playback.

… and in Proprietary Bind Them

Finally, there is a potential downside to the approach, as it can be designed in a way that excludes other suppliers. Crossfire and SLI are already proprietary solutions, and this will certainly not change in the brave new hybrid world. In the end, AMD and Nvidia will get the opportunity to sell more products thanks to brand dependencies. But let’s look first at what both firms have created.

Comments
  • -4 Hide
    hellwig , July 18, 2008 9:45 PM
    Does nVidia Hybrid Power work with an 8400GS/8500GT? Does Hybrid SLI/GeForce Boost work with a 9800/200 series card? You mention the 8000 series when you talk about SLI, but the 9800/200 when you talk about power. You also say Hybrid SLI can be enabled with low-end cards, but I believe the problem with ATI's solution is that you couldn't use high-end discrete cards. This is a little confusing; luckily I'm not planning an upgrade to this technology.
  • 12 Hide
    Anonymous , July 18, 2008 11:09 PM
    quote:
    "The GeForce 8200 graphics unit provided much more performance in 3DMark06 .."
    But your graph clearly shows AMD ahead in 3DMark06!

    quote:
    "Graphics: Nvidia Wins"
    After losing all the 3D benchmarks and games, you still consider the Geforce the winner? If you consider Hybrid SLI that important, then at least put 'no winner' for 'Graphics' like you did for 'Efficiency'.
  • 1 Hide
    JPForums , July 18, 2008 11:36 PM
    Quote:
    On average, AMD wins – but keep in mind that this is only the case because of the dominance in the productivity benchmark, which is based on office applications.


    I think your priorities are backwards. Businesses buy a lot of systems like this. Productivity is usually their main concern with a system of this grade. Most people I know make heavy use of Outlook, Word, Excel, and other office products (or equivalents) on a daily basis. People I know that make heavy use of products like After Effects and 3DS Max usually do so on more capable systems. I can see Photoshop being used on systems like this, but I suspect Office-type product usage would be much higher even on those systems. In conclusion, winning the overall score due to dominance in applications that will be used most heavily on systems like these makes sense. I'd be more apt to complain if the system won due to a few relatively small advantages in products that are less likely to be used on a system like this while losing badly in the ones that are.

    The lower power usage of the nVidia system multiplied by the number of units a business needs might be a good selling point. Though, I'd want to see the average power usage over a benchmark that runs a simple mix of Office and internet tasks before I'd be sure the nVidia system was the best choice (for standard office computers).

    I do think that nVidia's hybrid solution offers a lot of value for people that use Adobe products with support for GPU acceleration, as they work better with nVidia cards than ATI cards. The hybrid solution would allow the user to run with relatively low power consumption and draw more power when running the Adobe product in question. Of course, this still requires you to buy a separate discrete card, but then you can also use the system to game.
  • 0 Hide
    Anonymous , July 19, 2008 2:04 AM
    Thanks for the article. I've been hoping that HybridPower and/or PowerXpress (AMD's implementation, though only in mobile parts) would get some limelight. One question about Hybrid Power: would it let you run dual display (analog+digital) like the regular IGP would, or does it disable dual-display as it does (did?) in regular SLI?
    At any rate, it seems like an amazing feature that may just sway me towards nV in my upcoming build despite all the things I don't like about the company...
  • 0 Hide
    Shuriken , July 19, 2008 8:52 AM
    The HDMI specs of the ATI 780G chipset do not fully comply with HDMI 1.3; only the video part does. For audio, only 2-channel LPCM, Dolby Digital 5.1, or DTS 5.1 is supported. This is mentioned on several manufacturers' websites and it has been confirmed by several users.

    The Nvidia 8200/8300, on the other hand, supports 8-channel LPCM over HDMI.

  • 6 Hide
    Anonymous , July 19, 2008 3:59 PM
    You said that the GeForce 8200 provides much more performance in 3DMark06 in the conclusion; however, on page 16 of the article you wrote that the 780G has better performance in 3DMark06... Some kind of mistake? In the charts, the 780G has the lead.
  • 7 Hide
    chesterman , July 19, 2008 5:35 PM
    Synthetics are good tests, but they are SYNTHETICS. The 780G won the real game tests, and that's what matters. When I buy a GPU, I want real application performance, not a high 3DMark score.
    I don't want to be an AMD/ATI fanboy, but it's clear that the 780G is the superior chipset, except on power consumption, where it loses by a minimal difference.
    Sorry for my poor English, I'm not from the US/UK.
    In my opinion: GRAPHICS: AMD/ATI WINS!
  • 0 Hide
    ZootyGray , July 19, 2008 5:58 PM
    (quote)
    The 4-phase voltage regulator does not use solid capacitors, but that is usually not the case for low-budget motherboards. However, it is powerful enough to stay cool even running an overclocked Phenom X4 processor. (quote)

    HOW DARE YOU SAY THAT???? WITHOUT QUALIFICATION.

    LOOK HERE FOLKS - the mobo manufacturers are being really quiet about this - I THINK (?) a few MIGHT have dealt with it - but nowhere do I see that the manufacturers' use of cheap MOSFET parts (3 or 4 phase) has been improved to handle hipower hiwatts cpu's - READ IT BEFORE YOU WEEP!
    NOWHERE do I see a comforting statement that this economical weakness has been resolved. Nowhere do I see specific (!) "CPU SUPPORT LISTS" that really spell out a safe solution or a specific warning or anything that makes me feel safe. This is an issue with mobo manufacturers. Buyers need to simply understand the limitations. Here's a link to some brutal testing for you - or you can buy into this little lord of duh rings fairy tale. After 4 pages of nothing, I need a break.
    read this:

    http://www.anandtech.com/mb/showdoc.aspx?i=3299&p=2

    I love the 780G boards - just know the limits!! And adding a specific fan might be prudent - the only reason I am reading this is for some info on the resolution of this issue - I don't see it. Most of this is old news from last April.

    I will read more later (yawn)

  • 0 Hide
    ZootyGray , July 19, 2008 6:06 PM
    Biostar has an interesting board with a little heat pipe on it - but no comment about why?? I think it's called space pipe or something like that. And they say they support the Phenom 9859 CPU.

    Gigobyt seems to be shuffling "versions" - really unclear.

    Azoosss seems to offer a 3year warranty (nice!) but the 780G board seems to have disappeared from the site - at least, I could not find it.

    Staying below a 95 watt CPU is supposed to be safe
    - an article on this issue????

    PRICELESS
    !
  • -1 Hide
    ZootyGray , July 20, 2008 1:59 AM
    (quote)
    we did not even try more demanding games such as Crysis, for performance reasons
    (quote)

    Of course - no one in their right mind would read a review seeking performance information.
    Back in April and May, when various 780G mobos were being raved about, it was revealed that Crysis would run and that HybridXfire (with a 3450 card) actually improved it - the numbers weren't much, though. 3dguru recommended using a 3870 - last May. What about the 4850? It wasn't out last April.

    I have now read 11 pages and I wonder why. Your nvidiot bias is gross. And I feel like I am reading a promo for assrox. how about a ppcchippps review? that would be stimulating.
  • -2 Hide
    ZootyGray , July 20, 2008 2:26 AM
    Nice of you to use an ancient Phenom 9600 - was that the B2 stepping with the errors? Did you find that under the bench?

    What happens when you drop a 9850/9950 in it? Does it go boom??

    Is this your first rev?
  • -3 Hide
    ZootyGray , July 20, 2008 2:46 AM
    —all Socket AM2+ boards can run a quad core AMD Phenom X4 processor.

    Does this mean only one? like the crippled 9600.

    Or can they run any phenom? like the 125 watt models. Or the 6400 x2?

    And your benchmarks clearly show that in the real games nvidiot loses. Again.

    This is one sad review. I apologize for reading it. This is a new low.
  • -2 Hide
    dragonsprayer , July 20, 2008 5:15 AM
    Intel chipsets rule, and the 4870 X2 will bury Nvidia for the next 6 months. Nehalem makes AMD's buy of ATI really smart.
  • 1 Hide
    Anonymous , July 20, 2008 6:25 AM
    ZootyGray - cribber

    - MSI lets you buy a motherboard with solid capacitors as an option - you do get it if you want
    - The TLB error has only been reproduced in AMD's lab, nowhere else.
    - Since you can't find motherboards, let me make it easy for you:
    http://asus.com/products.aspx?l1=3&l2=149&l3=639
    Those are the 780G mobos from ASUS that support up to 140W CPUs; any problem, get back to ASUS.

    You obviously read a lot that goes over your head. Slow down and read less; that way you can actually understand the small things and will not need to crib so much.
  • 5 Hide
    Anonymous , July 20, 2008 9:24 AM
    The AMD 780 is much FASTER than nVidia in 3DMark06; correct your conclusion, please.
  • 1 Hide
    PS , July 20, 2008 3:25 PM
    I don't think they give a ****.

    The sound properties and energy consumption are affected by the manufacturer's choice of components and driver support, and are not entirely a chipset issue.

    Some of the conclusions seem to be made by retards. They contradict the facts of their own testing. If you're not paid to lie, fix the text - Boneheads
  • 0 Hide
    Ryun , July 20, 2008 3:27 PM
    —all Socket AM2+ boards can run a quad core AMD Phenom X4 processor.

    Does this mean only one? like the crippled 9600.

    Or can they run any phenom? like the 125 watt models. Or the 6400 x2?

    =======================================================================

    I helped a friend's friend build an AMD Phenom machine with a Gigabyte 780G board and a 9750. It works fine, unlike some other 780G boards out there, but Gigabyte, along with some others, has updated its CPU support lists. The 9850 was not recommended, though.
  • 3 Hide
    njalterio , July 20, 2008 3:54 PM
    If you are going to make blatantly wrong statements in your conclusion regarding which is the better graphics solution, at least fudge your benchmark results so they look consistent! I don't know if I am missing something, but why is Nvidia the winner if AMD won all of the graphics benchmarks?

    Whenever there is a really funky article written, I check up top to see who has written it and it is always Schmid and Roos.
  • 0 Hide
    miribus , July 20, 2008 3:57 PM
    The chipset has nothing to do with processor support in terms of power consumption.
    The power supply circuitry on the motherboard does.
    Either the power circuitry on the board supports 125W, 140W (or higher if you're overclocking) CPUs or it doesn't.
    One of the reasons that cheaper boards don't allow overclocking features is that they know they don't have a supply that can take it.
    There's a reason that they have heatsinks on those FETs.
  • 0 Hide
    njalterio , July 20, 2008 4:07 PM
    @ Canute24, are you absolutely certain that the Phenom TLB error has only been found in AMD test labs?

    Maybe you should take a look at this:
    http://forums.ageofconan.com/showthread.php?t=121752

    *hands dunce cap to canute24*