Radeon HD4870 Benchmark

  1. bluesguitar007 said:

    Looks like ATI finally widened the bottleneck that held back 3870. If that review is real, of course. Lol@ "Legit Review forum" :na:
  2. Yeah, that site provides two sources for the benchmarks.

    I'm not sure what language they're in or how credible they are.
  3. Wowie!! False hope or not, it sure made me happy! :)
  4. If this review is real, then I know what my next upgrade will be. I might even take two cards for good measure.
  5. I'm liking this, let's hope it's true. Might be the first time in my life to go red.
  6. Even if true, it's a little disappointing they don't have any AA in the tests; it's conspicuous by its absence.
  7. TheGreatGrapeApe said:
    Even if true, it's a little disappointing they don't have any AA in the tests; it's conspicuous by its absence.

    If those benchmarks are real, it's such a huge gap that AA won't matter. Run it at 3 times the resolution, and you won't see the jagged edges anyway. :p
  8. Actually, the question is: does it nose dive with AA? That's the big problem with AA on the HD series; you can see it go from competitive with the high end without AA (like the HD2900XT vs GF8800Ultra) to struggling to keep up with the low end of the enthusiast parts (like the HD2900XT vs GF8800GTS-320).

    This is good, but you know that for many people running at 1920x1200 with AA will matter more than 2500x1600 without AA.

    Anywhoo, instead of just High and Very High, I wish they had run VH with no AA and either VH or H with AA; that would've at least given us an idea about what this segment tends to look for. C'mon, 1280x1024 with no AA?
  9. I agree that is an important test. It seems ATI just keeps wishing the old AA method would just go away. This won't happen, however, until NVidia does DX10.1 too (or gets destroyed by Intel :lol: )
  10. EXT64 said:
    I agree that is an important test. It seems ATI just keeps wishing the old AA method would just go away. This won't happen, however, until NVidia does DX10.1 too (or gets destroyed by Intel :lol: )

    With that kind of performance, I don't care about AA anymore. The gap is just way too large. It's one GPU core against two, and look at that bar. What happens when the 4870X2 comes out? :na:
  11. OK, I confess. I didn't look at the benchmarks before posting. WOW! I wish for once the hype could be true! I wasn't planning on getting a new card yet, but if this is true, I might grab one of the lower ones (4850 or equivalent).
  12. Assuming these are legit, the question now becomes:

    How will nVidia respond?

    I wonder if the GeForce 99xx series will post even bigger numbers.
  13. bluesguitar007 said:
    Assuming these are legit, the question now becomes:

    How will nVidia respond?

    I wonder if the GeForce 99xx series will post even bigger numbers.

    It'll be difficult, considering how high the bar is set.
  14. I really really hope this lives up to these results :P And I agree with ape, where's the AA? If they had a sample, why wouldn't they test the AA capabilities, since that was a major problem with the last generation...

    *ponders*
  15. monst0r said:
    I really really hope this lives up to these results :p And I agree with ape, where's the AA? If they had a sample, why wouldn't they test the AA capabilities since that was a major problem with the last generation..

    *ponders*

    It may not be intentional. That's hardly a complete benchmark. If they had tested 20 games and there was no AA, then you'd have reason to suspect. But they've only tested 3DMark and Crysis. The usual order of benchmarking is to test without AA first, then with AA, not the other way around. This fits the convention.
  16. dagger said:
    It'll be difficult, considering how high the bar is set.


    I don't think it will. The 8800GTX has been out since when? November 2006, was it? Nvidia has had well over a year to work on their next-gen card.

    I think the 9900GTX could be faster. But from what I hear it is not power efficient at all and creates a lot of heat.

    Even if the 9900GTX out performs the HD4870 I think the HD4870 has a shot of being the market leader due to value, power consumption, and price.

    I read the other day on Tom's Hardware that the GDDR5 version of the HD4870 uses something like 34% less power than the GDDR3 version and has a TDP rating of an amazing 102 Watts!!

    That is just incredible performance per watt!
  17. Well the details we know on the GT200 suggest it will be about double the number of stream processors (rumor is 240), or about what a 9800GX2 has. Once you factor in efficiencies from having a single GPU and the other enhancements they will add in, you would expect the GT200 GPU would outperform the 4870.

    That said, I am buying one of these when they come out. :D Way to restore the faith ATI!

    PS Please let these benchmarks be real!
  18. NarwhaleAu said:
    Well the details we know on the GT200 suggest it will be about double the number of stream processors (rumor is 240), or about what a 9800GX2 has. Once you factor in efficiencies from having a single GPU and the other enhancements they will add in, you would expect the GT200 GPU would outperform the 4870.

    That said, I am buying one of these when they come out. :D Way to restore the faith ATI!

    PS Please let these benchmarks be real!



    On some of those, the 4870's framerate is nearly 50% more than the 9800GX2's. It would be hard to make up that much of a gap just through "efficiencies from having a single GPU." :p
  19. Why would you even pay a price premium for these cards if you're not going to use AA?

    EDIT: Man that forum sucks ass. People complain about contrast here? That looks more like blog comments than a forum.
  20. rwayne said:

    I read the other day on Tom's Hardware that the GDDR5 version of the HD4870 uses something like 34% less power than the GDDR3 version and has a TDP rating of an amazing 102 Watts!!

    That is just incredible performance per watt!


    Of course, if all the power reduction happens and these charts turn out to be close to true, some of us will be wondering why we bought such large PSUs, even if we run two cards. I know, I bought the big PSU last year for power-hungry graphics cards, but now it may sit there just idling.

    @Dagger - I can guess that there are a lot of things that contribute to high efficiency besides a single GPU. But until the cards come out and real benches are posted, I'll reserve final judgment. Still, I do hope these preliminary charts are at least close to real.
  21. rwayne, you can't even get GDDR3 at 3400 MHz. Hell, not even GDDR4 will reach that (in fact, GDDR4 doesn't reach that much higher. That's why NVidia stuck with GDDR3, because they would have lower latencies.)

    Regardless, I really doubt this is completely legit. I'm not trying to say that it won't be a powerful card, but this seriously looks like a hoax to me. Regardless, though, I personally think that the 9900 series will be able to challenge it, at least in performance. They are supposedly adding more stream processors, as well as adding that ROP upgrade that was shown in the 9600 GT, and much higher bandwidth. I think I'm looking more towards that than the 4800 HD series.
  22. Those benchmarks do not jibe with the changes and the pixel fill rates, but they could be right, since there are many changes.
  23. sailer said:
    I know, I bought the big PSU last year for power-hungry graphics cards, but now it may sit there just idling.

    Which is bad because a PSU is most efficient at higher loads.
  24. randomizer said:
    Which is bad because a PSU is most efficient at higher loads.


    My present PSU is supposed to have a very wide efficiency range. It's just that I could have used a 750W instead of an 850W.

    I agree with the complaints about the new forum format. For some reason I'm not getting e-mail notifications of replies on threads for the past few days. I checked all my settings and they say that I should be getting them, but I don't. Wish Tom's had left well enough alone.
  25. Addressing people's concerns about AA...

    Tom's review of the GeForce 9800 GX2 gave us the following information about Crysis and AA:

    At 1920 X 1200,
    GeForce 9800 GX2: No AA = 32.1 fps, 4X AA = 12.2 fps, Percent reduction = 62.0%
    Radeon HD 3870 X2: No AA = 21.1 fps, 4X AA = 11.8 fps, Percent reduction = 44.1%

    The 9800 GX2 actually suffers more than the Radeon when AA is enabled at this high resolution.

    The bottom line is that as long as the HD 4870 is as effective as the 3870 X2 when it comes to AA, it will continue to show a strong lead (in Crysis, at least).

    Of course, all of this assumes the benchmarks are real.
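
    The percent-reduction arithmetic above can be sanity-checked with a quick sketch; the fps figures are the ones quoted from Tom's review, and the helper function is just illustrative:

```python
def aa_hit(fps_no_aa, fps_4x_aa):
    """Percent framerate reduction when 4X AA is enabled."""
    return (fps_no_aa - fps_4x_aa) / fps_no_aa * 100

# Tom's 1920x1200 Crysis numbers quoted above
print(round(aa_hit(32.1, 12.2), 1))  # GeForce 9800 GX2 -> 62.0
print(round(aa_hit(21.1, 11.8), 1))  # Radeon HD 3870 X2 -> 44.1
```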
  26. Darkness Flame said:
    rwayne, you can't even get GDDR3 at 3400 MHz. Hell, not even GDDR4 will reach that (in fact, GDDR4 doesn't reach that much higher. That's why NVidia stuck with GDDR3, because they would have lower latencies.)

    Regardless, I really doubt this is completely legit. I'm not trying to say that it won't be a powerful card, but this seriously looks like a hoax to me. Regardless, though, I personally think that the 9900 series will be able to challenge it, at least in performance. They are supposedly adding more stream processors, as well as adding that ROP upgrade that was shown in the 9600 GT, and much higher bandwidth. I think I'm looking more towards that than the 4800 HD series.


    The jump from GDDR3 to GDDR4 is very minor, and GDDR4 runs hot.

    GDDR5, for some reason, is a dramatic improvement; it runs much faster and cooler too.

    I have read that on more than one site.
  27. Well, I think that NVidia will be able to compete in or even win this round, however, I don't care. I like the current cycle:

    NVidia has best => ATI releases better and prices competitively => NVidia retakes crown soon after but prices their card obnoxiously.

    In the end, all the consumers win. Those who want Price vs. Performance buy ATI; those who want best performance go NVidia.
  28. EXT64 said:

    In the end, all the consumers win. Those who want Price vs. Performance buy ATI; those who want best performance go NVidia.

    Seems like it's going to be the opposite. :p
  29. Yes, that sounds good. ATI hasn't been on top since the 1900/1950 XTX. It would be nice to use an ATI card that was solidly on top again. Then I might even replace the old Nvidia card in my other computer. One thing I do wonder, though: if this new card is as good as it seems, there might be fewer jokes told about how bad AMD/ATI is. How would the forums look with a headline like, "AMD does something right, for a change"?
  30. This is very good. The initial numbers are promising!
  31. bluesguitar007 said:

    At 1920 X 1200,
    GeForce 9800 GX2: No AA = 32.1 fps, 4X AA = 12.2 fps, Percent reduction = 62.0%
    Radeon HD 3870 X2: No AA = 21.1 fps, 4X AA = 11.8 fps, Percent reduction = 44.1%

    The 9800 GX2 actually suffers more than the Radeon when AA is enabled at this high resolution.

  31. And both times the 9800GX2 has higher framerates. Percentages aren't all that important; it's absolute framerates that I'm interested in. Hell, according to Tom's, my 9600GT beats an HD3870X2 by 900% in 3DMark Vantage FT3, yet I still only got 9 FPS. The 9800GX2 had more to lose in the first place.
  32. randomizer said:
    And both times the 9800GX2 has higher framerates. Percentages aren't all that important; it's absolute framerates that I'm interested in. Hell, according to Tom's, my 9600GT beats an HD3870X2 by 900% in 3DMark Vantage FT3, yet I still only got 9 FPS. The 9800GX2 had more to lose in the first place.


    My point was that the HD 4870's performance with AA on will still beat out the 9800 GX2, even if it takes as much of a hit (44%) as the 3870 X2 does with the filters applied.

    If we take the percent reduction values I calculated for the Tom's test, and apply them to the proposed HD 4870 benchmarks, we'd get something like this:

    Crysis (Very High, 1600 X 1200, 4X AA)
    GeForce 9800 GX2 - 35 fps with a 62.0% reduction for AA = 13.3 fps
    Radeon HD 4870 - 44 fps with a 44.1% reduction for AA = 24.6 fps

    I'm simply speculating that the HD 4870 will have its frame rates reduced by the same amount as the 3870 X2 when 4X AA is enabled.
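
    The speculation above is just the quoted reduction percentages applied to the rumored no-AA numbers; a minimal sketch (the 62.0%/44.1% hits and the 35/44 fps figures come from the posts above, everything else is illustrative):

```python
def fps_with_aa(fps_no_aa, hit_pct):
    """Apply a percent framerate reduction for enabling 4X AA."""
    return fps_no_aa * (1 - hit_pct / 100)

print(round(fps_with_aa(35, 62.0), 1))  # GeForce 9800 GX2 -> 13.3
print(round(fps_with_aa(44, 44.1), 1))  # Radeon HD 4870 -> 24.6
```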
  33. But then you're comparing a released card to a pre-released card. We have yet to see how GT200 handles AA ;)

    I reckon Hector did these benchmarks, which explains the significant increases.
  34. randomizer said:
    But then you're comparing a released card to a pre-released card. We have yet to see how GT200 handles AA


    Therein lies the fun of speculation.

    ;)
  35. I speculate that I won't get a review sample. Anyone care to wager against my claims?
  36. The G94 handles AA very well for some reason. Which reminds me, whatever happened to the "free 4xAA" that I kept hearing about before? And when is the damn physics card being implemented into the GPU, or in software? Anyway, that is an awesome 3DMark score. I doubt that is the 4870; it is probably 4870X2 scores, trust me. It just makes more sense. The 4800s were only supposed to be 50% better than the 3800s anyway. Finally, a single card that comes out and totally dominates 3DMark06. 10000 SM3 score, impressive.
  37. And it will be an epic fail in 3dmark vantage. I expect GT200 to put up decent numbers in Vantage though.
  38. of course, vantage just came out.
  39. And Vantage is heavily nvidia biased.
  40. hmmm....is that so...it could just be driver support but who knows.
  41. When does the card actually come out in stores? Like early May? I can't wait to find out if these are true!
  42. I think Ape's right about this. If those marks are true, they don't show the AA hit. Add up all the numbers that have been posted and you'll see 200% gains in some areas, not from the physical increase alone, but from clock * physical increase. Going back to FUD's original estimate of 50% better, later reduced to 40+% better than the 3870, that was made without knowing the core clock would be lower and the shaders higher. It could also be like what I've heard, that these new cards (4870 and 9900GTX) will do very well in the newer games, with not as great an increase in the older ones, which is still OK, since almost all older games have already been conquered by current solutions. In other words, too good to be true, unless it's the X2 being shown here. But then again, GDDR5 has some amazing potential. I'm wondering if it'll help with the 4+1 super?
  43. LOL if those benchmarks are real... Plays Crysis real good! Can you imagine what the 4870X2 will be like?
  44. Could be that I will finally be able to retire my 8800GTXs; depends on AA performance, though. I'm just the sort of person who likes to run high res and AA...

    My motherboard is also CrossFire compatible... although my favourite game isn't :(

    First card in a long while to be interesting, but AA performance will be the deal maker/breaker for me. If this manages to flop back to 8800GTX/Ultra levels when using AA, then it still won't be worth the upgrade...
  45. randomizer said:
    And Vantage is heavily nvidia biased.


    randomizer randomly read my mind! :lol:
  46. sailer said:
    If this review is real, then I know what my next upgrade will be. I might even take two cards for good measure.

    Yeah, me too! :D
    randomizer said:
    And both times the 9800GX2 has higher framerates. Percentages aren't all that important; it's absolute framerates that I'm interested in. Hell, according to Tom's, my 9600GT beats an HD3870X2 by 900% in 3DMark Vantage FT3, yet I still only got 9 FPS. The 9800GX2 had more to lose in the first place.

    Very true, but you can't ignore percentages. You have to look at the relative figures and percentages.
    I didn't realise the 9800GX2 took such a hit on FPS with AA enabled, that was a very interesting stat.
    I'm hoping to get a 24" soon, so hopefully I'll be able to run far less (if any) AA. :D
  47. LukeBird said:
    Yeah, me too! :D

    Very true, but you can't ignore percentages. You have to look at the relative figures and percentages.
    I didn't realise the 9800GX2 took such a hit on FPS with AA enabled, that was a very interesting stat.
    I'm hoping to get a 24" soon, so hopefully I'll be able to run far less (if any) AA. :D


    If you are talking LCD monitors: if you are currently running a 22", then a 24" will need less AA; if you are currently running a 20", then you will need more AA at 24".

    LCD resolution is very different to CRT resolution. With a CRT, as you bump up the resolution, the physical size of the pixels shrinks, lessening the need for AA; with an LCD, a higher "resolution" just means a larger screen and not necessarily smaller pixels.
  48. dtq said:
    If you are talking LCD monitors: if you are currently running a 22", then a 24" will need less AA; if you are currently running a 20", then you will need more AA at 24".

    LCD resolution is very different to CRT resolution. With a CRT, as you bump up the resolution, the physical size of the pixels shrinks, lessening the need for AA; with an LCD, a higher "resolution" just means a larger screen and not necessarily smaller pixels.

    No, I'm currently running a 17"!!!! :(/ :lol:
    I'm expecting to use less AA, because I'm expecting the size of the screen to be that much bigger, that lower AA will be far less noticeable :)
    But yeah, I understand (a bit! :lol:) the differences between LCD & CRT & Native res.
  49. dtq said:
    If you are talking LCD monitors: if you are currently running a 22", then a 24" will need less AA; if you are currently running a 20", then you will need more AA at 24".

    LCD resolution is very different to CRT resolution. With a CRT, as you bump up the resolution, the physical size of the pixels shrinks, lessening the need for AA; with an LCD, a higher "resolution" just means a larger screen and not necessarily smaller pixels.

    True to a point. I believe pixel density is highest at 20" and lower as you go up, but then again, as you go up, you tend to have your monitor a bit further back as well, which somewhat negates the need for all the AA. Maybe a little less on a 24".