PowerColor Radeon HD 5870 LCS: The GHz Limit, Broken

Installation

Installation requires attaching the card to a separate liquid-cooling system. Typically, we'd expect buyers of this card to already own a liquid-cooling system for their CPU and to simply add the Radeon HD 5870 LCS to the loop.

We're testing this video card's limits, so we dedicated a Koolance Exos-2 system with 750W of cooling capacity exclusively to the task.
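For a rough sense of how much headroom that leaves, here's a minimal back-of-the-envelope sketch comparing the Exos-2's rated capacity against the card's board power. The 188W figure is AMD's published board power for a stock Radeon HD 5870; the 280W figure is only an assumed ballpark for a heavily overclocked and overvolted card, not a measurement from this review.

```python
# Rough thermal-headroom estimate for dedicating the Koolance Exos-2 to one card.
# Board-power figures are approximations: 188 W is AMD's published board power
# for a stock HD 5870; 280 W is an assumed overclocked/overvolted ballpark.
EXOS2_CAPACITY_W = 750  # rated cooling capacity of the Koolance Exos-2

def headroom(card_power_w: float, capacity_w: float = EXOS2_CAPACITY_W) -> float:
    """Return the fraction of the cooler's rated capacity left unused."""
    return 1.0 - card_power_w / capacity_w

for label, watts in (("stock HD 5870 (~188 W)", 188.0),
                     ("overclocked ballpark (~280 W)", 280.0)):
    print(f"{label}: {headroom(watts):.0%} of the Exos-2's capacity to spare")
```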

As we mentioned previously, we'd have preferred some documentation on the proper way to attach the coolant hoses. Since we'd already taken the block apart and examined it, we chose the ports that send coolant across the GPU first and then across the memory before it returns to the radiators for cooling.

Once the hoses are attached and the fittings are tightened, the procedure is the same as with any liquid-cooling system. We first ran coolant through the card without plugging the board into our test motherboard, both to ensure there were no leaks and to force any air pockets out of the block. Once we were satisfied that the air bubbles were gone, we simply plugged the card into the motherboard and secured it.

We'll be pitting PowerColor's Radeon HD 5870 against two Radeon HD 4890s in CrossFire. This should be an interesting comparison because the prices are similar (if you're comparing reference card to reference card, that is; PowerColor's water block naturally adds cost just as it would if you were to purchase water-cooled 4890s) and we expect performance to be in the same ballpark, too. However, we're counting on the Radeon HD 5870 to use far less real estate and power than two Radeon HD 4890 cards.
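On paper, that expectation is easy to frame. The minimal sketch below lines up AMD's published board-power (TDP) figures and the PCIe slot requirements for each setup; these are spec-sheet numbers, not power draw measured at the wall.

```python
# Spec-sheet framing of the two setups being compared. The TDP values are AMD's
# published board-power figures for the reference cards, not draw measured at
# the wall in this review.
SETUPS = {
    "Radeon HD 5870 (single card)":  {"board_power_w": 188,     "x16_slots_needed": 1},
    "2x Radeon HD 4890 (CrossFire)": {"board_power_w": 2 * 190, "x16_slots_needed": 2},
}

for name, spec in SETUPS.items():
    print(f"{name}: ~{spec['board_power_w']} W board power, "
          f"{spec['x16_slots_needed']} PCIe x16 slot(s)")
```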

91 comments
  • tacoslave
    AMD = EPIC WIN

    oh and you too powercolor. mmmmmmmmm waterblock...
    7
  • CoryInJapan
    Wow... is ATI leading again? Oh wait, they are.
    Nice to see ATI back on top. Would be nice to have that card.
    3
  • tacoslave
    so much potential lost for shame...

    sorry for double posting but i got cut off.
    -18
  • jimmysmitty
    Enjoy it while it lasts ATI.

    Unless nV screws G300 up (rebranding G200), it may be a nice time, but it won't last forever.

    If only they could get a damn separate shader clock. With 1600 SPs running at 1.7 GHz they could blow nV out of the water....
    -25
  • anamaniac
    It's disappointing to hear the memory had to be put so low...
    Though I'd love to liquid cool my i7, then add a 5870 and liquid cool that also. I've got a radiator at work that can likely handle a 10-kilowatt system; add to that, it's constantly cooled by sub-zero temperatures (during winter at least), with a 5-barrel reservoir and dual 24" fans (used to cool 3,000 psi hydraulics).

    Even more disappointing to see it can't keep up with its dual 4xxx-series cousin.
    Also, GPU waterblocks just look so inefficient...
    4
  • micky_lund
    hmmm...
    can i has one to sell? then buy dual 5850s
    -12
  • 7amood
    WOOOOOOOW extremely low temperatures... but not worth the 500W draw.
    I think now I am interested in water-cooled video cards.

    But honestly, I thought the gain would be at least 20%... disappointing.
    4
  • liquidsnake718
    You made a mistake on the test and benchmark page in the Crysis bench config. It states you tested it at low quality. That would mean at least 100-150 FPS....

    Nice work, but I also wish you guys had tested Crysis at Very High settings.... It's always great to see Crysis tested at its maximum threshold.
    1
  • shubham1401
    For a single card, the 5870 performs very impressively...
    And the power usage is impressive too (only at stock).

    This is the best card for now, and with a lil price dip it'll be the fav of all high-end gamers.
    0
  • IzzyCraft
    I love that the draw on their OC'd single GPU is greater than the draw of two older-gen GPUs; that's just a hilariously bad result. Still, with the waterblock setup, maybe you can afford the $1,000 electric bill along with your fastest possible machine. Sorry, but as far as ATI vendors go, PowerColor is low on the list of ones I trust. Poor 5870 GPU wasted on excess, and you can't waste those things; they're hard to come by with TSMC doing jack in the yields department.
    -9
  • Netherwind
    Great article, but I'm still debating between a second 4890 and selling my 4890 for a 5850/70.
    0
  • shubham1401
    ^See the power consumption for two 4890s. Adding a second one may not be a good option.
    For now, a single 4890 is plenty for most of us. Buy when prices fall.
    1
  • annihilator-x-
    I have convinced myself watercooling is a must for GPUs.

    I am running a watercooling setup on my CPU alone at the moment. Despite the majority of the heat being removed from inside the case, my HD 4850 would barely escape overheating at a room temperature of 23°C in a UK winter. I once tried moving my sound card closer, leaving 2 inches of clearance from the HD 4850, and it crashed every single time I ran Mass Effect. Now it crashes once in a while, and I am 100% sure it's due to overheating of the GPU. Load temperature reaches 106°C; the card crashes at 110°C.
    1
  • idoln95
    If there is a limit, how did they manage to hit 1380 MHz here:
    http://www.tweaktown.com/articles/3038/kingpin_cooling_single_gpu_competition_with_ln2_by_deanzo/index.html

    That deanzo guy is crazy....
    -2
  • uh_no
    ATI FTW

    but can it run pacman?
    2
  • BartG
    So I'm one of the few people who was not surprised by the outcome!? Two 4890s pack a decent punch; the fact that one 5870 can keep up with that is not bad. I did not expect anything different...

    Keen to find out if DX 11 will make a difference once it's mature...
    0
  • abhilash
    Quote:
    The Radeon HD 5870 also sports a memory bandwidth advantage

    What rubbish?
    -2
  • JohnnyLucky
    Interesting article with some surprising results and conclusions.
    0
  • hmolleta
    Remove the WC from this PowerColor, put it on an Asus 5870, and overclock it until it burns!!!!!!!!!!
    0
  • idisarmu
    I don't understand how the 5870 can be slower than a 4870 X2/2x4890. Aren't the specs EXACTLY the same, except for the 5870's HIGHER clocks, and that the 5870 has all those specs in ONE GPU die rather than two?
    1
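As an aside on the spec question raised in that last comment, a quick pass over AMD's published numbers shows where the two setups match and where they don't. The sketch below uses spec-sheet values (stream processor counts, engine clocks, memory data rates, bus widths), not benchmark results from this review.

```python
# Spec-sheet arithmetic for the HD 5870 vs. 2x HD 4890 question above. All
# inputs are AMD's published specifications, not measurements from this review.
def peak_gflops(stream_processors: int, clock_mhz: float) -> float:
    """Theoretical peak single-precision GFLOPS (2 FLOPs per SP per clock)."""
    return stream_processors * 2 * clock_mhz / 1000.0

def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Theoretical memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8.0

hd5870 = (peak_gflops(1600, 850), bandwidth_gbs(4.8, 256))  # 2720 GFLOPS, 153.6 GB/s
hd4890 = (peak_gflops(800, 850),  bandwidth_gbs(3.9, 256))  # 1360 GFLOPS, 124.8 GB/s

print(f"HD 5870:    {hd5870[0]:.0f} GFLOPS, {hd5870[1]:.1f} GB/s")
print(f"2x HD 4890: {2 * hd4890[0]:.0f} GFLOPS aggregate, {hd4890[1]:.1f} GB/s per GPU")
# Shader throughput is identical on paper; memory bandwidth per frame and
# CrossFire scaling are where the two setups actually diverge.
```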