Radeon HD 5850: Knocking Down GTX 295 In CrossFire
The Makings Of Radeon HD 5850
The Radeon HD 5850 centers on a different PCB than its big brother. That’s good news, since we were already a little concerned about the 5870’s 11” length. The 5850 measures a much more accommodating 9.5”, making it shorter than the Radeon HD 4870 X2.
However, the Radeon HD 5850 doesn’t sport the two top-mounted six-pin auxiliary power connectors we’ve come to appreciate on Nvidia’s cards. Rather, it reverts to a pair of rear-mounted six-pin plugs, like the Radeon HD 4870. This is less favorable, we think, because it extends the clearance needed behind your graphics card by at least another inch or two, depending on the rigidity of those cables.
Aside from the shorter board and power connectors, the only other aesthetic difference between the Radeon HD 5870 and 5850 is that the Radeon HD 5850’s backside isn’t covered. Fortunately, video cards have no shame.
This means the same Eyefinity implementation enabled on the Radeon HD 5870 carries over here as well. Two dual-link DVI outputs, an HDMI connector, and a DisplayPort output can be mixed and matched in any combination of three digital outputs, as long as one of the three is DisplayPort (thanks to reader Bounty for pointing this out). Otherwise, you're limited to two DVI outputs or a DVI/HDMI pairing. And for those who read the original review and thought something wonky was up with our bitstreaming testing: it was, and here’s photographic proof.
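For readers planning a multi-monitor setup, here is a minimal sketch of that rule. The helper function and port labels are our own illustration of the behavior described above, not anything exposed by AMD's drivers.

```python
# A minimal sketch of the output rule described above: the 5800-series can
# drive up to three displays at once, but any three-display combination must
# include the DisplayPort output. The helper and port labels are illustrative
# only; they don't correspond to anything in AMD's software.
def eyefinity_combo_is_valid(outputs):
    """outputs: list of port labels drawn from 'DVI1', 'DVI2', 'HDMI', 'DP'."""
    if len(outputs) > 3:
        return False            # no more than three simultaneous displays
    if len(outputs) == 3:
        return 'DP' in outputs  # three-way setups need DisplayPort as one leg
    return True                 # one or two displays: any mix works

print(eyefinity_combo_is_valid(['DVI1', 'DVI2', 'DP']))    # True
print(eyefinity_combo_is_valid(['DVI1', 'DVI2', 'HDMI']))  # False
```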
Fortunately, CyberLink is going to pick up the Onkyo receiver we use for testing and work on compatibility before PowerDVD 9 is formally patched to support this functionality. No word yet on TotalMedia Theatre or WinDVD adding support for AMD’s 5800-series.
Under The Hood
| Specification | Radeon HD 5870 | Radeon HD 5850 | Radeon HD 4870 |
|---|---|---|---|
| Die Size | 334 mm² | 334 mm² | 263 mm² |
| Transistors | 2.15 billion | 2.15 billion | 0.956 billion |
| Memory Bandwidth | 153.6 GB/s | 128 GB/s | 115.2 GB/s |
| AA Resolve | 128 | 128 | 64 |
| Z/Stencil | 128 | 128 | 64 |
| Texture Units | 80 | 72 | 40 |
| Shaders (ALUs) | 1,600 | 1,440 | 800 |
| Idle Board Power | 27W | 27W | 90W |
| Maximum Board Power | 188W | 151W | 160W |
Once you get past the card’s looks, there are a few other tweaks to the board’s vital specs that affect its performance. To begin, while the Cypress GPU driving AMD’s Radeon HD 5850 is the same chip powering the 5870, two of its SIMD arrays are disabled, turning off 160 of its shader processors and eight of its texture units. The chip's totals consequently drop to 1,440 ALUs and 72 texture units. Its core clock is also slowed from 850 MHz to 725 MHz.
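If you want to see where those totals come from, here is a quick arithmetic sketch. The per-array figures (80 ALUs and four texture units per SIMD) follow from the 160 shaders and eight texture units that the two disabled arrays remove.

```python
# Back-of-the-envelope check of the unit counts above: Cypress has 20 SIMD
# arrays, each contributing 80 ALUs and 4 texture units, so fusing off two of
# them removes 160 shader processors and 8 texture units.
SIMDS_TOTAL, ALUS_PER_SIMD, TMUS_PER_SIMD = 20, 80, 4
simds_active = SIMDS_TOTAL - 2

print(simds_active * ALUS_PER_SIMD)   # 1440 ALUs (vs. 1,600 on the 5870)
print(simds_active * TMUS_PER_SIMD)   # 72 texture units (vs. 80 on the 5870)
```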
Cypress’ back-end remains unchanged, with 32 ROPs and Z/stencil rates that are affected only by the GPU’s clock. As with the Radeon HD 5870, the 5850 boasts a 256-bit memory bus populated by 1GB of GDDR5. Of course, its memory runs slower than the 5870’s, at 1 GHz rather than 1.2 GHz, yielding 128 GB/s of bandwidth instead of 153.6 GB/s. Still, that’s more than the Radeon HD 4870’s 115.2 GB/s.
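Those bandwidth figures fall straight out of the bus width and memory clock. Here is a rough sketch of the math, assuming GDDR5's four transfers per clock; the 900 MHz value for the Radeon HD 4870 is the familiar reference memory clock, included for comparison.

```python
# Rough bandwidth math: GDDR5 moves four bits per pin per memory-clock cycle,
# so bytes/second = bus_width_bits * 4 * clock / 8.
def gddr5_bandwidth_gbs(bus_width_bits, clock_ghz):
    return bus_width_bits * 4 * clock_ghz / 8

print(gddr5_bandwidth_gbs(256, 1.0))  # 128.0 GB/s -- Radeon HD 5850
print(gddr5_bandwidth_gbs(256, 1.2))  # 153.6 GB/s -- Radeon HD 5870
print(gddr5_bandwidth_gbs(256, 0.9))  # 115.2 GB/s -- Radeon HD 4870
```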
ATI carries over the same power-saving tweaks that helped the Radeon HD 5870 dip to 27W at idle: same GPU, same idle clocks, same 27W floor. But because two of the processor’s 20 SIMD arrays are disabled, and because 3D clocks are lower, maximum board power drops to 151W: 9W lower than a Radeon HD 4870, 37W lower than the 5870, and 19W lower than the spec ATI originally provided when the Radeon HD 5870 launched last week.
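To keep those deltas straight, here is a quick check of the numbers above. The 170W starting point is simply inferred from the 19W gap the text cites against ATI's original spec, not a figure we have independently.

```python
# Sanity check of the board-power deltas quoted above (all figures in watts).
# The 170W "original spec" value is inferred from the 19W gap cited in the text.
hd5850_max, hd5870_max, hd4870_max = 151, 188, 160
hd5850_original_spec = 170

print(hd4870_max - hd5850_max)            # 9 W lower than a Radeon HD 4870
print(hd5870_max - hd5850_max)            # 37 W lower than a Radeon HD 5870
print(hd5850_original_spec - hd5850_max)  # 19 W lower than ATI's original spec
```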
Reader Comments
duckmanx88: Another great article. Can you guys add these to your 2009 charts, please? And the new i5 and i7 CPUs too, please! =)
jj463rd: Quote: "ATI has two cards that are faster than its competitor’s quickest single-GPU board. My, how times have changed." LOL
Yep, I was looking at the Radeon 5850, especially CrossFire'd, for a build.
The Radeon 5870s seem a bit pricey to me, so I'd prefer two 5850s.
I can wait until they become available.
Thanks for the great review; very impressive scores from the 5850.
coonday: Ball's in your court now, Nvidia. Time to stop whining and bring some competition to the table.
Annisman: Hi, very, very good article. It's nice to see my two 5870s at the top of every chart, destroying every game out there!
I hope you guys will go into more detail about how you run your benchmarks for games. When I compare my own results, sometimes I wonder whether you are using in-game FRAPS results or a benchmark tool such as the one in Crysis; this is very important for me to know. Please dedicate a small portion of reviews to letting us know exactly what part of the game you benched, and in what fashion; it would be very helpful. Also, it would be great to see exactly what settings were used in games. For example, you state that you set GTA4 to the 'highest' settings, but without 2GB of VRAM the texture setting can only be set to Medium unless you are compromising on view distance or somewhere else. So maybe a screenshot of the settings you used should be included; I would like to see this become regular in Tom's video card reviews. Great article, and please consider my requests.
Kl2amer: Solid review. Now we just have to wait for aftermarket coolers/designs to make them quieter and even cooler.
megamanx00: Glad the 5850 is shorter, but I'll probably wait till Sapphire or Asus puts out cards with a better cooler than the reference design. Damn, I want one now though :D.
JohnnyLucky: Another interesting article. I'm almost tempted to get a 5850. I'm just wondering how power consumption during FurMark, which is a rigorous stress test, compares to power consumption during gaming. Am I correct in assuming power consumption during a typical gaming session would be less? If I'm not mistaken, ATI is recommending a 600W power supply with 40A on the 12V rail(s) for a system with two 5850s in CrossFire mode.
cangelini (in reply to Annisman): We usually try to include them on a page in the review. Anything more detailed you'd like, feel free to let me know and I'm happy to oblige!
SchizoFrog: It does seem that the 5850 is a great £200 card and definitely the option to go for if you are buying today. I pride myself on getting good performance at great value, and the test of this is to try to get my GPU to last two years and still be playing high-end games. My current O/C 9600GT 512MB, which cost me a huge £95 18 months ago, is doing just that right now. So, for a £200 DX11 GPU, the 5850 is on its own and a great buy by default. However, and this is a big however: while Windows 7 will support DX11 and a few upcoming games will use some visual effects based on DX11, nothing else does, and there are certainly no true DX11 games, nor will there be for some time, as nearly all games released these days are developed with the console market in mind. So I for one will wait. I will wait for nVidia to decide it is time to launch their DX11 GPUs. Either their GPUs will push them firmly back to the top, or at least they will drive ATi's prices down.