
GPGPU Benchmark

Updated: AMD 785G: The Venerable 780G, Evolved
It's ATI Stream versus Nvidia's CUDA in this GPGPU-on-IGP shootout. We benchmarked the Cyberlink Espresso video converter, which is conveniently optimized to take advantage of both CUDA and ATI Stream and seems to be an ideal test.

Maybe not so ideal after all, as these results look somewhat strange.

We had a few odd occurrences of note during testing. First of all, when Espresso was run on the Nvidia 9300 chipset, the program gave us the option to turn hardware acceleration on or off, presumably to enable/disable the CUDA optimizations. The strange thing is that when optimizations were enabled, the video encoding took longer, which is the opposite of what we expected. We asked Nvidia about this phenomenon, and they claim that their testing shows even a 3 GHz Core 2 Duo E6850 should demonstrate a performance increase with CUDA enhancements enabled.

Nvidia also pointed out that these results are misleading unless the resulting encoded video is bit-for-bit identical, and they claim that their encoding is of a higher quality than the video resulting from the ATI Stream optimizations. They were also concerned that the version of Cyberlink Espresso AMD had provided was not a publicly available build. We were actually working with both companies (and Cyberlink) on an image quality comparison, even before the 785G showed up, so you can expect more detailed coverage there as soon as that project comes together.

  • -3 Hide
    anamaniac , August 4, 2009 6:37 AM
    Very interesting.
    An integrated GPU that can game. =D

    Makes my lil Pentium D with a 4670 seem puny...
    3.3GB/s memory bandwidth (single channel DDR2 533... though 2 sticks, it runs in single channel... damn prebuilts) also seems sad on my rig...

    Quote: macer1: "the real question is how would this perform if mated to an Atom processor in a nettop."


    Good question. A dual core Atom with a 4200 integrated would be nice.
    We all know Intel makes shitty motherboards and AMD makes kickass motherboards anyways.
  • 9 Hide
    SpadeM , August 4, 2009 7:21 AM
    Quote: mcnuggetofdeath: "^^^ and support for DDR3. Although that's a change to the board, not the CPU."


    Not correct; the Phenom II has a built-in memory controller, so the switch to DDR3 affected that controller.
  • 1 Hide
    apache_lives , August 4, 2009 8:27 AM
    Quote: anamaniac: "Very interesting. An integrated GPU that can game. =D ... A dual core Atom with a 4200 integrated would be nice. We all know Intel makes shitty motherboards and AMD makes kickass motherboards anyways."



    Native RAM for a Pentium D is PC2-4200, which has a max of 4.2 GB/s per channel, while the FSB maxes out at 6.4 GB/s.

    The Intel Atom would most likely bottleneck any video card out there, and Intel does actually make a good, reliable business platform where video performance is not required.
  • -3 Hide
    Anonymous , August 4, 2009 10:13 AM
    I'm sorry, is this an Intel benchmark site? All other reviews put SYSTEM power consumption for Athlon II 250 well below Intel E7200.
  • 1 Hide
    aproldcuk , August 4, 2009 11:27 AM
    This article raised a lot of questions for me. What about Hybrid Crossfire, for example? What kind of cards can be used together with this new IGP? Is the discrete graphics card on standby if no performance is required? If not, how much extra wattage at the outlet should be expected? And how much extra when it's actively in use? I'm interested in using the 785G solution in a 24/7 HTPC setup with the possibility of occasional gaming as well. My current setup with a 690G chipset and Athlon 64 X2 BE-2350 CPU draws around 50 watts most of the time and up to 90 watts under heavy load. Is it too much to expect similar levels from a 785G and Phenom II X3 705e combo, for example?
  • 1 Hide
    wh3resmycar , August 4, 2009 12:10 PM
    When can we see the mobile version of this? This is most certainly a welcome update compared to the 780G/HD 3200 chipset, and it beats any Nvidia IGP hands down. I'd love to see this on a $700-$800 laptop. Good thing I'm still holding back on buying a new notebook.
  • 4 Hide
    Pei-chen , August 4, 2009 12:30 PM
    Good timing, I was wondering yesterday whether the 785G is better than the 790GX. Thanks.
  • 0 Hide
    neiroatopelcc , August 4, 2009 12:33 PM
    Quote: Article: "There are two lessons to be learned here: first, if you really care about the environment, turn your PC off (or at least configure it to enter sleep mode) when you're not using it, and second, don't be afraid of purchasing a better processor for fear that it will cost you big money in power consumption."


    Perhaps the next task could be a power comparison telling us how long a computer needs to sit in an active state to consume more power than turning it off and back on again (including starting MSN, AV software, and a bunch of other stuff running in the background).

    Anyway good article :) 
  • 3 Hide
    Anonymous , August 4, 2009 12:47 PM
    McNuggetOfDeath: There were changes to the Phenom II architecture, 45nm is not what enabled higher clocks, it was architectural changes(mostly regarding internal latencies). There were also other changes as well that enabled higher IPC and smoother overall performance.

    PS: Phenom II does support DDR3, there are only 2 models out of 12 that don't...
  • 2 Hide
    Onus , August 4, 2009 2:04 PM
    Quote: Pei-chen: "Good timing, I was wondering yesterday whether the 785G is better than the 790GX. Thanks."

    My take on it is except for some specific HTPC features, the 790GX is still the better of the two, especially if any gaming is involved. They compared an OC'ed 785G to a stock 790GX; what if they'd OC'ed the 790GX also?
    And, lest anyone develop any false hope, the Intel IGP has once again been shown to be a toad.
  • 8 Hide
    DarkMantle , August 4, 2009 3:19 PM
    One of the best things this chipset brings is lower-cost AM3 motherboards: if you want to use Phenom II processors paired with DDR3 RAM and a single video card, you can pay $89-$99 for the motherboard. I think this is important.
  • 7 Hide
    judeh101 , August 4, 2009 4:11 PM
    I would totally use this with my home theatre PC.
    Let's see... Decent performance, able to play HD videos, low cost. That covers everything I need for an HTPC!
  • 2 Hide
    cleeve , August 4, 2009 4:12 PM
    Quote: aproldcuk: "This article raised a lot of questions for me. What about Hybrid Crossfire for example? What kind of cards can be used together with this new IGP?"


    We concentrated on the new aspects of the 785G in this article; Hybrid CrossFire is exactly the same as it was with the 780G, that is to say, it maxes out with a 3450 card.
  • 4 Hide
    Ryun , August 4, 2009 5:16 PM
    "At idle, the Phenom II X2 is drawing the highest load: 92 W on the 790GX motherboard. In contrast, the E7200 is drawing 68 W on the most efficient platform, Intel's G45. It looks big on the chart, but it's a difference of 14 W."

    Nope, it's a 24 W difference. I think that's why your numbers are different too. I get:

    24 W × 24 hours = 576 Wh per day = 0.576 kWh per day; 0.576 kWh/day × $0.15/kWh × 365 days ≈ $31.54 per year

    Good article otherwise, thanks.
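
    Ryun's arithmetic above can be sketched as a quick back-of-the-envelope calculation; the $0.15/kWh rate and 24/7 uptime are the assumptions from the comment, not measured values:

    ```python
    # Annual cost of an extra-power draw, given a per-kWh rate and daily uptime.
    # Assumes the comment's figures: 24 W difference, $0.15/kWh, running 24/7.
    def annual_cost(extra_watts, rate_per_kwh=0.15, hours_per_day=24, days=365):
        kwh = extra_watts * hours_per_day * days / 1000  # total extra energy, kWh
        return kwh * rate_per_kwh

    print(round(annual_cost(24), 2))  # prints 31.54
    ```

    Swapping in a local electricity rate or a shorter daily uptime shows how quickly the figure shrinks for a machine that sleeps overnight.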
  • -1 Hide
    Anonymous , August 4, 2009 5:17 PM
    Quote:
    "Refined architecture"? To my knowledge, and please correct me if I'm wrong, all that was changed between the original Phenom and the Phenom II was the addition of more L3 cache, allowing it to do more simultaneously, and a die shrink allowing for higher clocks. That does not a refined architecture make. When AMD added an on-die memory controller to their processors years ago, they made a huge advancement in architecture. I'm sad to see them fall away from the performance crown. Here's hoping their new Bulldozer architecture brings something genuinely intriguing to the table.


    That is incorrect; if that were the case, the Phenom II wouldn't benchmark so much better and it wouldn't overclock so much better. Just because it carries the Phenom name doesn't mean all they did was give it a bit more L3 cache and call it a day. You could have given the original Phenom more L3 cache all day long and it would still have run like poop. Well, not necessarily poop, but just not as well as the Phenom II.
  • 4 Hide
    KT_WASP , August 4, 2009 5:34 PM
    Quote: cleeve: "We concentrated on the new aspects of the 785G in this article; hybrid crossfire is exactly the same as it was with the 780G, that is to say it maxes out with a 3450 card."


    If this is true, then why does the Hybrid CrossFire graphic on the first page show the HD 4350, HD 4550, and HD 4650 as compatible Hybrid CrossFire GPUs?

    It makes sense.. the 780G used an integrated 3200-series GPU, so it was compatible with lower-end dedicated 3000-series GPUs. The 785G uses an integrated 4200-series GPU, so it should be compatible with the lower-end dedicated 4000-series GPUs.

    Can you clear this up? I was also wondering what GPUs can be used in Hybrid CrossFire with the 785G. I thought I knew from that graphic on page 1, but your response confused me.

    Thanks
  • 1 Hide
    aproldcuk , August 4, 2009 5:53 PM
    Quote: cleeve: "hybrid crossfire is exactly the same as it was with the 780G, that is to say it maxes out with a 3450 card."

    Thanks for clearing that up, Cleeve! There isn't much sense in using Hybrid CF then. However, my original question still remains: how much extra wattage should one expect with a mid-range 4600- or 4700-series card added, for example? Does disabling the device when not in use help a bit more here? Hope this isn't too off-topic already...