
AMD's Radeon HD 4870 X2: R700 First-Look

Although AMD took the hardware community by surprise with the performance of RV770 and the derivative Radeon HD 4850/4870, things didn’t go entirely according to plan. The company launched its boards immediately after Nvidia, which turned around and slashed the prices on its own models, determined to win this round of the ongoing graphics card war.

And yet, a month and a half after the launch of AMD’s newest round of products, the verdict hasn’t changed. Neither the GeForce GTX 260 nor the GeForce 9800 GTX+ (only recently made available) can take on the Radeon HD 4870 with regards to price or performance, even in light of heavy cuts from Nvidia.

But AMD’s not out just to make waves with the gamers looking for value. It also wants to reclaim a crown it lost a long time ago to Nvidia’s last two generations of large, monolithic programmable graphics architectures. As a means to that end, the company is putting a pair of its most impressive GPUs on a single PCB and calling it the Radeon HD 4870 X2. Now the question remains: does the new board have the muscle to take on Nvidia’s GeForce GTX 280, the single fastest card?

The Radeon HD 4870 X2

Contrary to what its code name might otherwise suggest, the R700 actually centers on a design sporting two RV770 GPUs. Thus, the Radeon HD 4870 X2 finds itself in a very high-end segment of the discrete graphics market. However, in the next few weeks you will also be able to find a Radeon HD 4850 X2, based on the same two chips but with lower frequencies and likely less memory as well.

In appearance, the Radeon HD 4870 X2 remains similar to the Radeon HD 3870 X2. However, looks can be deceiving, as we will soon see. Not surprisingly, the card itself is quite long (26.7 cm). It sports a large blower that exhausts through the back of the board, next to two dual-link DVI outputs (neither HDMI nor DisplayPort connectivity is native to the back panel). The board requires two auxiliary power connectors: one six-pin and one eight-pin (PCI Express 2.0-compliant). The two GPUs are positioned on the same PCB, though you won’t see them since a heatsink/fan combination covers the entire board.

This card, like its bi-GPU flagship predecessor, must deal with sharing its frame buffer. But in contrast to CPUs and their more sophisticated memory management, these techniques evolve much more slowly in the graphics card world. All of the bi-GPU cards up until now have resembled the Pentium D 900 (Presler): an assembly of two cores functioning independently, each integrating its own local memory (L2 cache for the CPU, frame buffer for the GPU). All of the graphics data is thus duplicated between the two GPUs. Communication between them passes through an external bus (the FSB for Intel’s Pentium D, PCI Express for these graphics cards).
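As a rough illustration of that duplication (a sketch, not AMD's actual driver logic), the memory a renderer can actually use on such a mirrored design is the per-GPU amount, not the advertised total:

```python
# Illustrative sketch: on a dual-GPU board with mirrored frame buffers,
# every texture and buffer is duplicated on both GPUs, so the memory
# usable by the renderer is the total divided by the GPU count.

def effective_memory_mb(total_onboard_mb: int, gpu_count: int = 2) -> int:
    """Memory actually available to the renderer when each GPU
    keeps its own full copy of all graphics data."""
    return total_onboard_mb // gpu_count

# Radeon HD 3870 X2: 1 GB on the box, but only 512 MB usable per GPU
print(effective_memory_mb(1024))   # -> 512
# Radeon HD 4870 X2: 2 GB on the box, 1 GB usable per GPU
print(effective_memory_mb(2048))   # -> 1024
```

This is why doubling the physical memory on a bi-GPU card only doubles the *usable* frame buffer, never more.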

As with the Radeon HD 3870 X2, a PCI Express bridge manages the communications between the two GPUs and the chipset. Once each GPU has calculated its final display output, it sends the result to another chip that assembles the frames according to whichever multi-card rendering technology is used (AFR, most often) and then sends it all to the monitor. Back when we tested the Radeon HD 3870 X2, we found that the biggest impact on performance was attributable to the board’s memory capacity, since memory on a bi-GPU card has to be divided in two. And in order to assure adequate performance, you often have to multiply the quantity of memory by four, which is what AMD does with the 4870 X2.
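The AFR scheme mentioned above can be sketched in miniature. This is a toy model, not a real driver interface; the function name and round-robin policy are assumptions used purely for illustration:

```python
# Minimal sketch of alternate frame rendering (AFR): successive frames
# are assigned to the GPUs in round-robin fashion, and a compositor
# stage then forwards the finished frames to the display in order.
from itertools import cycle

def schedule_afr(frame_ids, gpu_count=2):
    """Assign each frame to a GPU, round-robin, as AFR does."""
    gpus = cycle(range(gpu_count))
    return [(frame, next(gpus)) for frame in frame_ids]

for frame, gpu in schedule_afr(range(6)):
    print(f"frame {frame} -> GPU {gpu}")
# Even-numbered frames land on GPU 0, odd-numbered frames on GPU 1.
```

Because each GPU only ever renders every other frame, uneven frame delivery between the two GPUs is what produces the "microstutter" readers ask about below.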

Comments
  • 1 Hide
    randomizer , August 12, 2008 6:37 AM
    How's the microstutter?
  • 2 Hide
    tipmen , August 12, 2008 6:40 AM
Good job ATI/AMD! Really good card you have here; let's hope it's cheaper than the GTX 280. I'm glad I waited it out, now I can replace my old setup. I will get a new X48 board and two of these puppies, and I'm sure I'll need a better power supply. I don't think a 750 watt will do it.
  • 0 Hide
    kite212 , August 12, 2008 6:49 AM
Looks like I need a new mobo and a bigger case to fit this beast.
  • 2 Hide
    Cmhone , August 12, 2008 7:14 AM
    It says at the end of the article their power meter went kaput, and also not to expect much in the power consumption department anyways.
  • 0 Hide
    neiroatopelcc , August 12, 2008 7:29 AM
    Someone explain to me what this means in english?
    "Back when we tested the Radeon HD 3870 X2, we found that the biggest impact on performance was attributable to the board’s memory capacity, since memory on a bi-GPU card has to be divided in two. And in order to assure adequate performance, you often have to multiply the quantity of memory by four, which is what AMD does with the 4870 X2." from page 1
  • 4 Hide
    romulus47plus1 , August 12, 2008 7:32 AM
    I can smell incoming price drops...
  • 6 Hide
    Anonymous , August 12, 2008 7:33 AM
It's nice to see AMD/ATI challenge Nvidia again; it's been Nvidia's market for too long. My take is that this situation can only benefit the consumers: a push for faster GPUs at more affordable prices. It will be interesting to see Nvidia's run to retake the performance crown as well as the midrange price/performance crown.
  • 3 Hide
    randomizer , August 12, 2008 7:52 AM
    neiroatopelccSomeone explain to me what this means in english? "Back when we tested the Radeon HD 3870 X2, we found that the biggest impact on performance was attributable to the board’s memory capacity, since memory on a bi-GPU card has to be divided in two. And in order to assure adequate performance, you often have to multiply the quantity of memory by four, which is what AMD does with the 4870 X2." from page 1
    The HD3870X2 had only 2x512MB memory, and each GPU could only access 512MB, which held its performance back. Although I think it should say "doubled", not "quadrupled", since the HD4870X2 has 2x1GB.
  • 2 Hide
    Thurin , August 12, 2008 8:05 AM
The reason I responded with such diligence and ferocity is NOT because I hate Tomshardware.


    I'd like to stress that fact, as I read Tomshardware every day and have always enjoyed reading the news articles, however I would like the news to be as unbiased as possible and lately it seems Toms has been slipping a little.

    .............................
    -= Michael Out =-
  • 2 Hide
    thepinkpanther , August 12, 2008 8:17 AM
$500+ for 2 GPUs with 1 GB memory each? I guess it's worth it. I usually never spend more than $300 for a card, so I guess I'll still wait till the 1 GB 4870 comes out. I think the GTX 260 would have been worth it if it had DX10.1... even though I don't know what the .1 does.
  • 4 Hide
    Thurin , August 12, 2008 8:20 AM
    The 10.1 is just an extended graphics capability suite including all features and extras that were planned for dx10 in the first place but couldn't be included in the primary release.

    ( hence the relatively small difference between dx9 and dx10 and dx10 games being cracked to work on dx9 machines with all the same effects save for the few bits of eyecandy that make up the difference between dx9 and dx10 )
  • -7 Hide
    ZootyGray , August 12, 2008 8:58 AM
    so... nvidiot will continue to offer 2nd rate overpriced underperforming stuff that's obviously been price inflated in the first place and so... so it's easy to reduce the ripoff price in the face of genuine article? Is that about it Thurin Michael out? And you are protecting me how? By baffling bs? How stupid are you? Or do you think we are?

    Blow your doors off in really high resolution. And the drivers evolve.

    Best card. You really are out michael - they pay you for that?

    btw how's the massive recall? thx for looking after our interests - now go to the corner and soil your nvidiot undeez.

    ATI wins. There will be lots of reviews.
  • 0 Hide
    Thurin , August 12, 2008 9:16 AM
    Reading is an art too Zooty *winks*

    It's not a flame post towards Ati, and Ati does win, for now... at high costs.

    before the month is over, Nvidia will win again at high costs... This is the cost of progress.

I'm not on either side; whichever is best I'll go for. I'm promoting and supporting the battle between the two industrial giants.

    Take one out of the race and progress will become lethargic, dead slow and the prices will be insane... even more insane than they are already...

    But if you choose to see me or my posts in a narrow minded way resembling the likes of what you just exhibited, then by all means go right ahead. You are entitled to your opinion.

    However you may want to try and keep it a bit more decent.

    (side note: using some mutated version of hackz0r 1337-speak or whatever this bit of slang is will not improve your standing with me or anyone else I wager... Try keeping it clean and I have no problems discussing any of the matters at hand with you.)
  • 0 Hide
    Brashen , August 12, 2008 9:27 AM
    Yes yes yes YES!!! I'm an Nvidia user by heart but this is........(tear) YES!!!
  • 15 Hide
    Vorador_21 , August 12, 2008 9:49 AM
    Dear Sir,
    I admire your efforts to show the maximum possible both as description and tests and I would like to ask a couple of questions:
    1. Is it possible to show the same tests but with AMD configuration?
    2. Is it possible to show the same tests but with at least 8AA/16AF and soft shadows, dynamic lights, plus description of other visual options?
    In appropriate for you time I shall expect your answer.
    The reason to spend so much in time, research and money by my humble opinion is to offer the best possible and realistic images on the clients and gamers. Please, forgive my words but to have such video cards and test them only with 4AA/4AF is ... a shame. I save as much as possible and when I upgrade something I chose carefully the best I could afford.
    Thank you in advance for your co-operation.
    Yours faithfully
    Vorador_21