
ATI Radeon HD 4770: 40nm Goes Mainstream

Core i7 965 Extreme Versus Athlon X2 7850

A Platform That Makes Sense

You’ll notice that we just ran that full suite of gaming tests on an Intel Core i7 965 Extreme-based machine, complete with X58 motherboard, 6 GB of pricey DDR3 RAM, and a 1,100W power supply—all to test a $109 graphics card.

We do this for a reason. When you test a platform that’s clearly overkill, you help minimize the number of potential bottlenecks that could inadvertently affect results. In a graphics card comparison, you want to reflect only the performance differences attributable to the card in question. But without a doubt, the configuration itself is unrealistic. So, we took what we hoped would be a sweet-spot setting, 1680x1050 without AA or AF enabled, and pitted the powerful Core i7 965 Extreme against a significantly less-muscular processor being launched today, wondering if we’d see any performance variation.

The Athlon X2 7850 is a 65nm, 2.8 GHz chip based on the Kuma design. It sports two cores, twin 512 KB L2 caches, and a shared 2 MB L3 cache. With an expected price tag around $70, it’s a reasonable complement to the $109 graphics card. Ironically, you’d likely spend the most money on a motherboard in this little setup.

It’s generally thought that a majority of games are not optimized to take advantage of threading, but there is more than just a core-count discrepancy in play here. Clock speed, cache, and micro-architecture are all different. But if these titles were purely graphics-limited, none of that would matter and you’d still see similar performance numbers across the board.

A couple of apps are in fact decidedly limited by the muscle of our little Radeon HD 4770 here. Stalker and Crysis—two obvious contenders for such an honor—achieve low 30-ish frames per second on the $70 and $1,000 CPUs. The rest of the field does demonstrate bias toward the Core i7 965, though, subtly suggesting that we should probably recommend a quicker CPU for gamers. The Phenom II X3 720 does cost twice as much, but would likely make a prudent upgrade. Far Cry 2, Left 4 Dead, World in Conflict, and Grand Theft Auto 4 all stand to benefit from a quicker CPU.
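For readers who want to apply the same logic to their own numbers, here is a minimal sketch in Python of the comparison we are making: run the same card at the same settings on a fast and a slow CPU, then see how much frame rate the slower chip gives up. The frame rates and the 10% cutoff below are hypothetical placeholders for illustration, not our measured results.

    # Hypothetical data: (fps on Core i7 965 Extreme, fps on Athlon X2 7850)
    fps_results = {
        "Crysis": (31.0, 30.0),
        "Far Cry 2": (62.0, 48.0),
        "Left 4 Dead": (95.0, 70.0),
        "World in Conflict": (55.0, 44.0),
    }

    CPU_SENSITIVE_CUTOFF = 0.10  # assumed: losing more than 10% flags CPU sensitivity

    for title, (fast_cpu_fps, slow_cpu_fps) in fps_results.items():
        loss = (fast_cpu_fps - slow_cpu_fps) / fast_cpu_fps
        verdict = "CPU-sensitive" if loss > CPU_SENSITIVE_CUTOFF else "graphics-limited"
        print(f"{title}: {loss:.0%} slower on the Athlon X2 -> {verdict}")

A game that stays graphics-limited by this measure sees little benefit from the faster processor, which is exactly what Stalker and Crysis show; the titles with a sizable gap are the ones where a quicker CPU is worth recommending.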

Comments
  • Dekasav, April 28, 2009 4:16 AM (+5)
    "Well-played ATI, well played."

    Couldn't say it better, myself.

    Looks to be a pretty good card, but nothing spectacular. 40nm is nice, a little cheaper HD 4850 (fewer FPS, too), but all in all, nicely done.

    I wonder who'll sell more, now, the 4850 or the 4770?
  • kelfen, April 28, 2009 4:38 AM (+4)
    Solid card for the average gamer ;)
  • bardia, April 28, 2009 5:19 AM (+31)
    I'm pretty blown away at the kind of performance that can be had for ~$100 these days thanks to ATI. It wasn't long ago when Nvidia forced us to choose between the incredibly crappy 8600GT for $150 and the ~$250-300 8800GTS 320.

    ATI is leading us into graphics nirvana.
  • pharge, April 28, 2009 5:19 AM (+10)
    Wondering, will the 4770 be a good one for CrossFire? Can we have a review on it....? With its low power usage when fully loaded, cheaper price (~$40 cheaper than the 4850 when in CF), not much slower than the 4850 (512MB), and a nice overclocking range... It would be nice to see whether a 4770 CF setup would be useful (playable) in games (1920x1200) with some visual goodies turned on.
  • Anonymous, April 28, 2009 5:32 AM (+2)
    Wondering about 4770x2, should be wishful item
  • Summer Leigh Castle, April 28, 2009 5:33 AM (+14)
    Quote (bardia): I'm pretty blown away at the kind of performance that can be had for ~$100 these days thanks to ATI. It wasn't long ago when Nvidia forced us to choose between the incredibly crappy 8600GT for $150 and the ~$250-300 8800GTS 320. ATI is leading us into graphics nirvana.

    I spent almost $300 on my 8800GTS 320 OC when they came out and I thought I got a great deal. Things have changed! Competition = good for the consumers!
  • eklipz330, April 28, 2009 5:38 AM (+5)
    This card is amazing for 1680x1050. If they can manage to slap some aftermarket coolers on there, buying two for the price of a 1GB 4870 and overclocking them, I'm pretty sure we'd pass GTX 285 numbers.... simply amazing.

    Great card for 16x10 resolution. Good job ATI, you've done more damage to Nvidia (and their sickly pricing schemes) in the past year than they've done to you in the past 3-4.
  • eklipz330, April 28, 2009 5:44 AM (0)
    *edit*

    just checked newegg and they all have aftermarket coolers on them... wow *_*

    http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=4770&x=0&y=0
  • Ryun, April 28, 2009 5:50 AM (+1)
    Quote (eklipz330): *edit* just checked newegg and they all have aftermarket coolers on them... wow *_* http://www.newegg.com/Product/Prod [...] 70&x=0&y=0

    Nah, they're reference coolers from AMD. From what I heard, AMD gave the AIB partners a choice between the dual slot and the, for lack of a better term, uglier cooler. Apparently the "uglier" one is cheaper so that's what you're probably going to see for now.
  • aznguy0028, April 28, 2009 6:18 AM (+2)
    Quote (Ryun): Nah, they're reference coolers from AMD. From what I heard, AMD gave the AIB partners a choice between the dual slot and the, for lack of a better term, uglier cooler. Apparently the "uglier" one is cheaper so that's what you're probably going to see for now.

    I actually like the "uglier" coolers. They look like a spaceship on the card xD. haha
  • JAYDEEJOHN, April 28, 2009 6:33 AM (+2)
    I'm just hoping they spend as much space and as many lines on ATI's DX10.1 whenever nVidia releases something, or in an upcoming nVidia review.
  • anamaniac, April 28, 2009 7:00 AM (-8)
    It will play Crysis!
    Now to see, will it crossfire with a 4670? That'd be orgasmic.
    I luv my 4670, but I also want the 4770... :'( 

    I love seeing low power cards also. I'm too cheap to buy a good PSU.
    All the cards on newegg look exactly the same...
  • thepinkpanther, April 28, 2009 7:14 AM (+4)
    Dang, I thought the 4770 would suck compared to any 256-bit interface card. Boy, was I wrong!
  • Ryun, April 28, 2009 7:17 AM (-4)
    Quote (aznguy0028): I actually like the "uglier" coolers. They look like a spaceship on the card xD. haha


    Maybe bulkier would've been a better term? =)

    Sorry it's late and I'm working on a web computing project so my vernacular is a little narrow.
  • crisisavatar, April 28, 2009 7:23 AM (+6)
    Excellent card, but I think the extra 10 bucks made it lose some of its charm.
  • cangelini, April 28, 2009 7:46 AM (+6)
    Quote (pharge): Wondering, will the 4770 be a good one for CrossFire? Can we have a review on it....? With its low power usage when fully loaded, cheaper price (~$40 cheaper than the 4850 when in CF), not much slower than the 4850 (512MB), and a nice overclocking range... It would be nice to see whether a 4770 CF setup would be useful (playable) in games (1920x1200) with some visual goodies turned on.


    This is upcoming. I know they were asking for CrossFire in other countries as well, but we didn't receive two of these boards. There is a Radeon HD 4770 roundup in the works, however!
  • NuclearShadow, April 28, 2009 8:15 AM (+1)
    The price to performance ratio just keeps getting better and better. I'm simply amazed by this.
  • RazberyBandit, April 28, 2009 8:21 AM (+12)
    Good write-up, Chris. Two points of criticism, one of high praise.

    First, I would have preferred to see a whole line of 512MB cards - tossing a 1GB GTS into the mix makes the higher-rez comparisons rather unfair. Given that the typical cost of a 1GB version of the GTS 250 is $150-$160 (~$140 w/ MIR), not the $120-$130 price you purport (those around $120 or so are the 512MB cards), there is more to that story than just the amount of VRAM.

    Second, the part about DX10 vs DX10.1 where you said the following:
    Quote:
    At 1920x1200, the Radeon HD 4850 achieves 12.7 frames per second with “Use DX 10.1” checked (compared to 11.3 frames without it). Looking for a more playable frame rate, we dropped to 1280x1024 and recorded 21.35 frames—down from 21.5. The moral of the story? Don’t expect DX 10.1 to make this title any more playable than it was without the feature enabled.

    Why didn't you perform that specific switch on the 4770? I mean, that's the card the article is focused upon, right? Just seems more prudent to apply that to the focus card.

    Lastly, I particularly liked the comparison where you went from the "king" i7 to the budget-oriented X2 Kuma. It clearly showed the benefit of a much faster CPU and its associated architecture in games that are clearly CPU-dependent.