AMD's Radeon HD 4870 X2: R700 First Look

Although AMD took the hardware community by surprise with the performance of RV770 and the derivative Radeon HD 4850/4870, things didn’t go entirely according to plan. The company launched its boards immediately after Nvidia, which turned around and slashed the prices on its own models, determined to win this round of the ongoing graphics card war.

And yet, a month and a half after the launch of AMD’s newest round of products, the verdict hasn’t changed. Neither the GeForce GTX 260 nor the GeForce 9800 GTX+ (only recently made available) can take on the Radeon HD 4870 with regard to price or performance, even in light of heavy cuts from Nvidia.

But AMD’s not out just to make waves with gamers looking for value. It also wants to reclaim a crown it lost long ago to Nvidia’s last two generations of large, monolithic programmable graphics architectures. As a means to that end, the company is putting a pair of its most impressive GPUs on a single PCB and calling it the Radeon HD 4870 X2. The question remains: does the new board have the muscle to take on Nvidia’s GeForce GTX 280, the fastest single-GPU card?

The Radeon HD 4870 X2

Contrary to what its code name might suggest, the R700 actually centers on a design sporting two RV770 GPUs. Thus, the Radeon HD 4870 X2 finds itself in a very high-end segment of the discrete graphics market. However, in the next few weeks you will also be able to find a Radeon HD 4850 X2, based on the same two chips but with lower frequencies and likely less memory as well.

In appearance, the Radeon HD 4870 X2 remains similar to the Radeon HD 3870 X2. However, looks can be deceiving, as we will soon see. Not surprisingly, the card itself is quite long (26.7 cm). It sports a large blower that exhausts through the back of the board, neighbored by two dual-link DVI outputs (neither HDMI nor DisplayPort connectivity is native to the back panel). The board requires two auxiliary power connectors: one six-pin and one eight-pin (PCI Express 2.0-compliant). The two GPUs sit on the same PCB, though you won’t see them, since a heatsink/fan combination covers the entire board.
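For context on those connector requirements, a quick bit of arithmetic using the PCI Express specification's power limits shows the ceiling the board must stay under. This is a spec-level bound, not a measured figure for this card; a minimal sketch in Python:

PCIE_SLOT_W = 75    # a PCIe x16 slot delivers up to 75 W
SIX_PIN_W = 75      # a 6-pin auxiliary connector adds up to 75 W
EIGHT_PIN_W = 150   # an 8-pin (PCI Express 2.0) connector adds up to 150 W

# The sum bounds what a spec-compliant board may draw.
max_board_power_w = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"Maximum spec-compliant draw: {max_board_power_w} W")  # 300 W

That 300 W ceiling is why the eight-pin connector is needed: a pair of six-pin plugs would cap the board at 225 W.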

This card, like its bi-GPU flagship predecessor, must deal with sharing its frame buffer. But whereas CPUs have developed ever more sophisticated memory-management techniques, things evolve far more slowly in the graphics card world. All of the bi-GPU cards up until now have resembled the Pentium D 900 (Presler): an assembly of two cores functioning independently, each integrating its own local memory (L2 cache for the CPU, frame buffer for the GPU). All of the graphics data is thus duplicated between the two GPUs, and communication between them passes through an external bus (the FSB for Intel's Pentium D, PCI Express for these graphics cards).
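To make the duplication concrete, here is a minimal sketch (illustrative Python, not anything from AMD's drivers) of why a bi-GPU card's advertised memory overstates its usable capacity:

def effective_memory_mb(per_gpu_mb: int, gpu_count: int = 2) -> dict:
    # Physical memory scales with GPU count; usable capacity does not,
    # because every texture and buffer is duplicated into each GPU's local pool.
    return {
        "physical_total_mb": per_gpu_mb * gpu_count,  # what the spec sheet advertises
        "effective_mb": per_gpu_mb,                   # one copy's worth of unique data
    }

print(effective_memory_mb(512))   # Radeon HD 3870 X2: 1024 MB on board, 512 MB usable
print(effective_memory_mb(1024))  # Radeon HD 4870 X2: 2048 MB on board, 1024 MB usable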

As with the Radeon HD 3870 X2, a PCI Express bridge manages communication between the two GPUs and the chipset. Once the final display output is calculated, each GPU sends it to another chip that assembles the result according to whichever multi-card rendering technology is in use (AFR, most often) and then sends it all to the monitor. Back when we tested the Radeon HD 3870 X2, we found that the biggest impact on performance was attributable to the board’s memory capacity, since memory on a bi-GPU card has to be divided in two. And in order to assure adequate performance, you often have to multiply the quantity of memory by four, which is what AMD does with the 4870 X2.
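A rough sketch of the alternation AFR performs may help; the function name here is hypothetical and only illustrates the round-robin described above, with the compositing chip standing in for the final assembly step:

from itertools import cycle

def afr_schedule(frame_count, gpu_count=2):
    # Assign frame n to GPU n % gpu_count: the round-robin at the heart of AFR.
    gpus = cycle(range(gpu_count))
    return [(frame, next(gpus)) for frame in range(frame_count)]

for frame, gpu in afr_schedule(6):
    print(f"frame {frame} -> GPU {gpu}")
# frame 0 -> GPU 0, frame 1 -> GPU 1, frame 2 -> GPU 0, and so on;
# the compositing chip then forwards each finished frame to the display in order.

Because each GPU renders whole frames from its own copy of the data, AFR scales frame rate without ever pooling the two memory banks, which is why the duplication described above is unavoidable.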

Comments (118)
    Top Comments
  • Dear Sir,
    I admire your efforts to show as much as possible, both in descriptions and in tests, and I would like to ask a couple of questions:
    1. Is it possible to run the same tests on an AMD configuration?
    2. Is it possible to run the same tests with at least 8xAA/16xAF plus soft shadows and dynamic lights, along with a description of the other visual options?
    I await your answer at a time convenient for you.
    The reason to spend so much time, research, and money is, in my humble opinion, to offer clients and gamers the best and most realistic image possible. Please forgive my words, but to have such video cards and test them only at 4xAA/4xAF is... a shame. I save as much as possible, and when I upgrade something I carefully choose the best I can afford.
    Thank you in advance for your co-operation.
    Yours faithfully,
    Vorador_21
    15
  • Other Comments
  • How's the microstutter?
    1
  • Good job, ATI/AMD! Really good card you have here. Let's hope it's cheaper than the GTX 280. I'm glad I waited it out; now I can replace my old setup. I'll get a new X48 board and two of these puppies, and I'm sure I'll need a better power supply; I don't think a 750 W unit will do it.
    2
  • Looks like I need a new mobo and a bigger case to fit this beast.
    0
  • Whoa, what happened to the heat and power consumption charts? Were they that far off the scale that you dare not show them? The card performs nicely, but this is ATI/AMD we are talking about; they love heat and power. So what are the numbers?
    -10
  • It says at the end of the article that their power meter went kaput, and also not to expect much in the power-consumption department anyway.
    2
  • Someone explain to me what this means in English?
    "Back when we tested the Radeon HD 3870 X2, we found that the biggest impact on performance was attributable to the board’s memory capacity, since memory on a bi-GPU card has to be divided in two. And in order to assure adequate performance, you often have to multiply the quantity of memory by four, which is what AMD does with the 4870 X2." from page 1
    0
  • I can smell incoming price drops...
    4
  • It's nice to see AMD/ATI challenge Nvidia again; it's been Nvidia's market for too long. My take is that this situation can only benefit consumers: a push for faster GPUs at more affordable prices. It will be interesting to see Nvidia's run to retake the performance crown as well as the midrange price/performance crown.
    6
  • neiroatopelcc: "Someone explain to me what this means in english? 'Back when we tested the Radeon HD 3870 X2, we found that the biggest impact on performance was attributable to the board’s memory capacity, since memory on a bi-GPU card has to be divided in two. And in order to assure adequate performance, you often have to multiply the quantity of memory by four, which is what AMD does with the 4870 X2.' from page 1"
    The HD 3870 X2 had only 2 x 512 MB of memory, and each GPU could access only 512 MB, which held its performance back. Although I think it should say "doubled", not "quadrupled", since the HD 4870 X2 has 2 x 1 GB.
    3
  • .............................
    Point 1:
    It's great that Ati is putting up a fight, and it's obvious, all fan-boy trash-talking and flaming aside, that this rapid progression, both in pure processing power and on the overall technological level, is only made possible through competition between the titans of the graphics-processing world.
    .............................

    BUT!

    According to other valued/trustworthy sources, the 4870 X2 manages only a 1.09-to-1.00 increase in performance.

    see link:
    http://tweakers.net/reviews/957/4/his-radeon-hd-4870-x2-versus-asus-geforce-gtx-280-pagina-4.html (copy and paste the link if needed)
    .............................

    As for the significant increase stated in this article: it's great, but only applicable to those games with integrated support for CrossFire technology. (Further and more extensive testing would be desirable, and posting actual facts to support claims would also be appreciated. I know some testing has been done, but a full review of exactly what value the new card adds remains to be seen.)

    .............................
    Point 2:
    NVidia's reply to the 4870 X2 dual-GPU card is the introduction of a 65 nm -> 55 nm GPU die shrink.

    Link:
    http://tweakers.net/nieuws/55050/introductie-55nm-versie-geforce-gtx-280-eind-augustus.html

    According to the news, this is to take place at the end of August, meaning the reply will be ready and launched by the end of THIS month.
    .............................

    I would like to state at this point that I'm not a fanboy for either side; I myself have switched from Nvidia to Ati and back to Nvidia, and will keep doing so as either side dominates the market with high-quality merchandise, depending on which one is the market leader at the points when I update my machines.
    .............................

    My apologies to those of you who cannot read or understand Dutch.

    Here is a substitute link for the interested English parties:
    http://www.stokedgamer.com/2008/07/nvidia-geforce-gtx-290-launching-at.html

    The reason I responded with such diligence and ferocity is not because I hate Tomshardware, think that Ati is doing a bad job, or that NVidia rocks beyond compare... The reason for my reply is that I wish for the consumer to be fully informed and not end up buying a more expensive product with less bang for the buck than originally expected.

    Stay informed, folks! (For those of you with a no-limit budget, the 4870 X2 might be the way to go; otherwise, if you are looking for a more affordable card, wait for the 55 nm NVidia GTX 280 revision (the name is yet to be determined; release is expected at the end of the month).)

    That concludes my rant for the day. I hope you are now fully informed, or that I have at least sparked your interest in gathering information before drawing a conclusion based on a single review ;)

    Look around and stay informed! Ati and Nvidia both rock, and the best of luck to both companies... may the struggle be eternal and the progress fiery and extreme!

    .............................

    -= Michael Out =-
    -10
  • Thurin: "The reason I responded with such diligence and ferocity is NOT because I hate Tomshardware."

    I'd like to stress that fact, as I read Tomshardware every day and have always enjoyed reading the news articles. However, I would like the news to be as unbiased as possible, and lately Toms seems to have been slipping a little.

    .............................
    -= Michael Out =-
    2
  • $500+ for two GPUs with 1 GB of memory each? I guess it's worth it. I usually never spend more than $300 on a card, so I guess I'll still wait till the 1 GB 4870 comes out. I think the GTX 260 would have been worth it if it had DX10.1... even though I don't know what the .1 does.
    2
  • The 10.1 is just an extended graphics-capability suite, including all the features and extras that were planned for DX10 in the first place but couldn't be included in the primary release.

    (Hence the relatively small difference between DX9 and DX10, and DX10 games being cracked to work on DX9 machines with all the same effects, save for the few bits of eye candy that make up the difference between DX9 and DX10.)
    4
  • So... nvidiot will continue to offer second-rate, overpriced, underperforming stuff that was obviously price-inflated in the first place, making it easy to cut the rip-off price in the face of the genuine article? Is that about it, Thurin, "Michael out"? And you are protecting me how? By baffling BS? How stupid are you? Or do you think we are?

    Blow your doors off in really high resolution. And the drivers evolve.

    Best card. You really are out, Michael. Do they pay you for that?

    BTW, how's the massive recall? Thanks for looking after our interests. Now go to the corner and soil your nvidiot undeez.

    ATI wins. There will be lots of reviews.
    -7
  • Reading is an art too, Zooty *winks*

    It's not a flame post towards Ati, and Ati does win, for now... at high cost.

    Before the month is over, Nvidia will win again, at high cost... This is the price of progress.

    I'm not on either side; whichever is best, I'll go for. I'm promoting and supporting the battle between the two industrial giants.

    Take one out of the race and progress will become lethargic, dead slow, and the prices will be insane... even more insane than they already are...

    But if you choose to see me or my posts in a narrow-minded way resembling what you just exhibited, then by all means go right ahead. You are entitled to your opinion.

    However, you may want to try to keep it a bit more decent.

    (Side note: using some mutated version of hackz0r 1337-speak, or whatever that bit of slang is, will not improve your standing with me or anyone else, I wager... Keep it clean and I have no problem discussing any of the matters at hand with you.)
    0
  • Yes yes yes YES!!! I'm an Nvidia user at heart, but this is........ (tear) YES!!!
    0
  • Yeah, it's all nice, these graphics cards and stuff, but am I to play Crysis and Call of Duty 4 for the rest of my life? There are no games that need this!
    -11
  • There aren't any new PC games at all.
    -10
  • MEEEEEEEGAAAAAAAAAAAAAAA CRAAAAAAAAAAPPPPPPP "REVIEWWWWWWWW"
    This crap site should be renamed NOOBIDIA'S HARDWARE. You all suck, noobs.
    -23