2900XTs Set 3DMark05 World Record!

2900XTs in Crossfire achieve a world-record 3DMark05 score of a whopping 34,126 points! Check it out!

This amazing score can also be attributed to a heavily overclocked X6800 and some other top-quality parts. The article doesn't mention whether the system was cooled by air or water; I'd guess water. But it does say the 2900XTs were air-cooled. This lends some support to the idea that the 2900XT's "poor" performance is down to driver issues.

I am curious to see what the score would be if the same effort were put into SLI'd 8800GTXs or Ultras.
  1. And at the end of the day, synthetics are still worthless.
  2. Isn't setting the resolution to 1024x768 without AA kinda like cheating?
  3. Quote:
    Isn't setting the resolution to 1024x768 without AA kinda like cheating?


    That was one of the first things I noticed about the screen shot...while the BIG numbers seem impressive, 1024x768 with no AA is somewhat of an anti-climax.
  4. At the end of the day so is your opinion and your nvidia cards.
  5. Doesn't make me wonder.
    The 05 version is the one that favors ATi.
  6. Quote:
    Doesn't make me wonder.
    The 05 version is the one that favors ATi.


    Ahh yes, the benchmark bias argument...

    IMO, 34,126 is impressive, but it's hard to relate what that number means without a comparative measure. I'd like to see a similar test with an nVidia-based SLI set-up.
  7. Quote:
    Doesn't make me wonder.
    The 05 version is the one that favors ATi.


    Ahh yes, the benchmark bias argument...

    IMO, 34,126 is impressive, but it's hard to relate what that number means without a comparative measure. I'd like to see a similar test with an nVidia-based SLI set-up.

    It's hard to relate because it's a worthless number. Who gives a crap if it posts 648 million? If it can't beat a different card in an actual game, then the numbers are meaningless.

    Even if you test it against an nVidia SLI set-up, it still tells you nothing, regardless of whether it performs better or worse than the nVidia set-up.

    As for the bias argument, if you have a synthetic program that regularly tests one brand of cards higher than the other, but in actual performance that brand does worse, it's pretty obvious that there is a bias. I don't think there's much of an argument there, it's just kind of common sense.
  8. Yeah, in 3DMark06 the 2900XT does not do as well as the 3DMark05 scores would indicate.


    3DMark05 @ 2560x1600

    8800Ultra 13233
    2900XT 11789
    8800GTX 11138
    8800GTS 9496

    3DMark06 @ 2560x1600

    8800Ultra 7835
    8800GTX 6802
    2900XT 6448
    8800GTS 5817
    Link


    But all it takes is to see how an 8600GT/S performs in 3DMark06 and then compare it to an X1950Pro or XT in the real world to realize how worthless synthetics are.
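    The flip in relative standing is easier to see as percentages of the GTX score. Here's a quick throwaway Python sketch, using only the numbers quoted above (illustrative only, not a benchmark):

```python
# Scores quoted above (2560x1600), normalized to the 8800GTX
# to show how the 2900XT's relative standing flips between 05 and 06.
scores = {
    "3DMark05": {"8800Ultra": 13233, "2900XT": 11789, "8800GTX": 11138, "8800GTS": 9496},
    "3DMark06": {"8800Ultra": 7835, "8800GTX": 6802, "2900XT": 6448, "8800GTS": 5817},
}

for bench, results in scores.items():
    gtx = results["8800GTX"]
    # Print each card highest-score-first, with its score as a percentage of the GTX.
    for card, score in sorted(results.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{bench}  {card:10s} {score:6d}  ({100 * score / gtx:5.1f}% of GTX)")
```

    Run it and the 2900XT goes from roughly 106% of the GTX in 3DMark05 to roughly 95% in 3DMark06, which is exactly the ranking swap being argued about.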
  9. Quote:
    Isn't setting the resolution to 1024x768 without AA kinda like cheating?


    That was one of the first things I noticed about the screen shot...while the BIG numbers seem impressive, 1024x768 with no AA is somewhat of an anti-climax.

    That's the default for 3DMark, and it always has been.
  10. Quote:
    2900XTs in Crossfire achieve a world-record 3DMark05 score of a whopping 34,126 points! Check it out!

    This amazing score can also be attributed to a heavily overclocked X6800 and some other top-quality parts. The article doesn't mention whether the system was cooled by air or water; I'd guess water. But it does say the 2900XTs were air-cooled.


    Pretty sure that X6800 was on LN2
  11. Quote:
    2900XTs in Crossfire achieve a world-record 3DMark05 score of a whopping 34,126 points! Check it out!

    This amazing score can also be attributed to a heavily overclocked X6800 and some other top-quality parts. The article doesn't mention whether the system was cooled by air or water; I'd guess water. But it does say the 2900XTs were air-cooled. This lends some support to the idea that the 2900XT's "poor" performance is down to driver issues.

    I am curious to see what the score would be if the same effort were put into SLI'd 8800GTXs or Ultras.
    That has to have been overclocked on LN2 as it was at 5.3GHz. I guess it's an impressive 3DMark05 score, but it doesn't mean the cards are the best in the world.
  12. Quote:

    This amazing score can also be attributed to a heavily overclocked X6800 and some other top-quality parts.

    ....

    I am curious to see what the score would be if the same effort were put into SLI'd 8800GTXs or Ultras.


    I don't know what you mean, because the CPU is at about the same speed, but actually with a lower FSB (so it's a multiplier benefit, not the whole board), as the one KingPin used on his OC'd GF8800 Ultras to hold the previous record before the HD2900s came along:

    http://service.futuremark.com/compare?3dm05=2898493

    I would say that's at least an equal effort to Kinc's and with mature hardware and software.
  13. Right... now if only it could do that in games :roll:

    nice find anyway
  14. Look, if it can do it in a benchmark, it can do it in a game.

    You just refuse to think, don't you?

    They are both 3D applications. Thus they can both be processed as fast.

    You said the card sucked, but since the only area in which you can cheat with a benchmark is drivers, it means the card is capable of more than you insisted it was. You just don't want to eat your words. It is a 3D app, the card is made for 3D apps, and it is winning in a 3D app. So now 3D apps don't matter? You will likely just say that they rigged the drivers, but really, genius, the whole problem with the card WAS the drivers, and now that they've fixed them you say it does not matter? I can't believe I am hearing anything so stupid. Your words are being forcefully pressed down your throat and you don't like it one bit, so you will pretend it is not happening.
  15. Quote:
    Look. if it can do it in a benchmark it can do it in a real world app.

    You just refuse to think don't you.
    So the 7900GTX was faster than the X1900XTX when it performed better in 3DMark06? If you're relying on synthetic benchmarks to justify your purchase of a card, please keep it to yourself. I bet if you run 3DMark05 at 2560x1600 with 4xAA 16xAF enabled the 8800GTXs in SLI would outperform it.
  16. Look, the whole driver base for this card is fucked right now, so if they were able to show this much improvement in 1 application they can elsewhere too.

    The 8800s would obviously outperform it, but i was talking about the driver improvement SO FAR.

    First prove to me that they are cheating in the benchmark like nvidia always does. Next thing we know is that 3dMark will start up saying 'the way it's meant to be played'

    When the hardware is fundamentally better all you need do is wait for it to be implemented correctly. I see a correct implementation.
  17. Quote:
    Look, the whole driver base for this card is ****** right now, so if they were able to show this much improvement in 1 application they can elsewhere too.

    The 8800s would obviously outperform it, but i was talking about the driver improvement SO FAR.

    First prove to me that they are cheating in the benchmark like nvidia always does. Next thing we know is that 3dMark will start up saying 'the way it's meant to be played'
    I never said ATI was cheating, but you quickly get on the defensive and say Nvidia always cheats on synthetic benchmarks? Running 3dMark05 at 1024x768 is not a good indicator of real world performance; I don't give a damn who does better in synthetic benchmarks. Look, it's great you love your HD 2900XT, but playing games at 1024x768 without antialiasing and anisotropic filtering just doesn't interest me.
  18. I agree this is meaningless for gamers or for graphics-card comparison.
    With such low settings it's more of a CPU benchmark; it shifts the load to the CPU.

    Also, good 3DMark scores matter to nV and ATI, as this thing is present in most reviews.
    That means it gets priority for driver support, just like other very popular games.
    For the not-so-popular games you have to wait a lot longer;
    there ATI runs behind and takes several months.

    It's well known that the HD2900 rocks in real games at low resolutions with no AA and no AF.

    So there is nothing biased about that. The R600 flies with this low load.

    It's known that the R600 has a lower fill rate.
  19. And once again it all comes down to drivers. And they will fix it eventually. So you can naysay all you want, but this card is going to beat your 8800 eventually.
  20. Quote:
    Look. if it can do it in a benchmark it can do it in a game.


    Not to jack the thread, but I've often wondered about this.

    What is the difference between benching several cards in 3DMark 05/06 on as similar systems as possible and testing the same setups on real-world games?

    Is 3DMark 05/06 still considered useless because it is synthetic? If so, why? Assuming they are NOT BIASED, what makes one set of information more valuable than the other?
  21. Quote:
    It's hard to relate because it's a worthless number. Who gives a crap if it posts 648 million? If it can't beat a different card in an actual game, then the numbers are meaningless.

    Even if you test it against an nVidia SLI set-up, it still tells you nothing, regardless of whether it performs better or worse than the nVidia set-up.
    Benchmarks, synthetic or not, are a legitimate and widely accepted measure of relative performance. Are you implying that benchmarks are altogether worthless? Or just worthless in this instance? Even in-game benchmarks are hard to take seriously when a game's opening sequence begins with "nVidia, the way it's meant to be played!" So, if not benchmarks, real or synthetic, how are we as consumers and enthusiasts to evaluate and compare competing products?

    Quote:
    As for the bias argument, if you have a synthetic program that regularly tests one brand of cards higher than the other, but in actual performance that brand does worse, it's pretty obvious that there is a bias. I don't think there's much of an argument there, it's just kind of common sense.
    I only made the "bias argument" remark because it has been a recurring theme any time ATI/nVidia or Intel/AMD best each other with benchmarks.
  22. Quote:
    And at the end of the day, synthetics are still worthless.


    So why does everyone use them :roll:
  23. Phrozt just hates ati for some reason. I think one of the CEO's molested him.

    Synthetics are not useless, they are just not the whole story.
  24. Quote:
    Phrozt just hates ati for some reason. I think one of the CEO's molested him.

    Synthetics are not useless, they are just not the whole story.
    A case of the pot calling the kettle black; this whole time you're constantly trying to prove the HD 2900XT is better than the 8800GTX, when all evidence says otherwise. What do you have against Nvidia? Is it the fact that they released a better product 6 months earlier than ATI? For the price, the HD 2900XT isn't a bad card, but it isn't the 8800GTX killer you're still dreaming of.
  25. Quote:

    This amazing score can also be attributed to a heavily overclocked X6800 and some other top-quality parts.

    ....

    I am curious to see what the score would be if the same effort were put into SLI'd 8800GTXs or Ultras.


    I don't know what you mean, because the CPU is at about the same speed, but actually with a lower FSB (so it's a multiplier benefit, not the whole board), as the one KingPin used on his OC'd GF8800 Ultras to hold the previous record before the HD2900s came along:

    http://service.futuremark.com/compare?3dm05=2898493

    I would say that's at least an equal effort to Kinc's and with mature hardware and software.

    Correct me if I'm wrong, but if I read that link right, Kingpin had the same proc and memory but ran the test with 8800GTXs and not Ultras?

    For nVidia's sake, I hope they were GTXs, given the 2900XT is supposed to compete with the GTS. It would stink for nVidia to think that "mid-range" 2900XTs in Crossfire squeaked past "high-end" 8800 Ultras in SLI.
  26. Quote:
    And at the end of the day, synthetics are still worthless.


    So why does everyone use them :roll:

    I don't know.. why do people say that the 2900xt can outperform the 8800gtx? I guess the real answer is that people are morons and choose certain things to believe if it pleases them.

    Rabid... I've been running a 9800proSE for the last 3 years. I guess that's because I hate ATI so much... right?

    Chunkymonster; benchies are not worthless, but IMO, synthetic ones are. Specifically because their numbers are almost ALWAYS different from real world applications (I.E. actually *game* benchmarks). It's the same reason that people frown upon timedemos of games. Everything is calculated differently when it's a playback demo vs actual gameplay where things are rendered on the fly. Why would you care if one card can play a demo better than the other?

    You don't buy a card to play demos, you buy a card to play games!!!
  27. Quote:

    Rabid... I've been running a 9800proSE for the last 3 years. I guess that's because I hate ATI so much... right?


    That's a pretty weak argument... If I'd gone 3 years using the 9800pro Suckey Edition, I might hate ATI too.
  28. Really? I loved it...

    well.. obviously it's been outgrown, but it did fine for me...
  29. I was going to compare what the difference is going from a 9800SE to an 8800GTX in terms of cars, but the idea seemed really stupid. 8O :roll:
  30. Quote:
    Yeah, in 3DMark06 the 2900XT does not do as well as the 3DMark05 scores would indicate.


    3DMark05 @ 2560x1600

    8800Ultra 13233
    2900XT 11789
    8800GTX 11138
    8800GTS 9496

    3DMark06 @ 2560x1600

    8800Ultra 7835
    8800GTX 6802
    2900XT 6448
    8800GTS 5817
    Link


    But all it takes is to see how an 8600GT/S performs in 3DMark06 and then compare it to an X1950Pro or XT in the real world to realize how worthless synthetics are.


    Rabidpeanut, your argument that if it can do it in a synthetic benchmark it can do it in the real world is not entirely accurate. While synthetic benchmarks give some idea of performance, as can be seen from the above numbers, they can differ. 05 has the XT beating the GTX, but 06 has the GTX beating the XT, and in both cases not by a tiny margin of a couple of points, but a few hundred. So which is it: is 06 biased in favor of NVIDIA, or 05 biased in favor of ATI? There is only one way to tell. How do the above 4 cards compare in game benchmarks?

    And as far as I've seen, the XT is on the heels of the GTS, and it doesn't even touch the GTX. You say it's because of driver issues. But the driver issues wouldn't explain the difference in 3dmark05 and 3dmark06 right now. So clearly your argument that one 3d application is the same as another is already false. And if the driver issues are fixed, it will at best catch up to the 8800GTS and *maybe* even go slightly ahead. Which would make it only attain the 3dmark06 levels. But that again does not say anything about how it ended up in front of the GTX in 3dmark05 - unless you're saying that driver issues will make it out-pace the GTX as well? That seems highly unlikely, but you are welcome to make that assumption - if you have no data to back it up, of what value would such an assumption be?
  31. Quote:
    I was going to compare what the difference is going from a 9800SE to an 8800GTX in terms of cars, but the idea seemed really stupid. 8O :roll:



    Hey... I don't care what people say... I made a good comparison.

    On a side note though, my upgrade from 9800pro to 8800GTX was probably a lot like my upgrade in cars. The first car I bought was my grandma's '86 olds cutlass ciera S that had been sitting in a garage for about 7 years. Then, I bought an 01 Dodge Intrepid ES w/leather seats/moonroof/autostick/bunches of extras.

    big difference.. heh
  32. The only real problem with synthetics is that they can create a condition where a GPU company will waste driver-development effort on the synthetic rather than on actual games. I agree with Rabidwankelnut that the card has to have the muscle to achieve the score, but that doesn't let them off the hook for showing that strength in actual games.
    The other area where ATI shines in 3DMark is of course that default settings don't involve AA. Default settings are the standard for comparison, but in actual games high-end cards need to run with AA.
  33. Even that's probably an understatement.
  34. Well, comparing the 2900XT to the GTX is pointless because it wasn't meant to fight it.

    I'd compare the 2900XT to the 8800GTS 640MB, which it DOES actually beat in some benchmarks, not all but some. And they are around the same price.

    Personally I'd go for the 2900XT of those two, because yes, they were delayed 6 months because of problems. These may be the problems that the 8 series is having with DX10. When we actually get a full-on DX10 game we can compare them properly, but until then there's no need to fight over it.
  35. Quote:
    Well, comparing the 2900XT to the GTX is pointless because it wasn't meant to fight it.

    I'd compare the 2900XT to the 8800GTS 640MB, which it DOES actually beat in some benchmarks, not all but some. And they are around the same price.

    Personally I'd go for the 2900XT of those two, because yes, they were delayed 6 months because of problems. These may be the problems that the 8 series is having with DX10. When we actually get a full-on DX10 game we can compare them properly, but until then there's no need to fight over it.
    In the DirectX 10 versions of Call of Juarez, Company of Heroes, and Lost Planet the 8800s performed better than the HD 2900XT, but it is too early to really know.
  36. Then explain to me why I just told 3 of my friends to get the 8800GTS 320.
    And I told a few others to get the 7600 when it was the best deal. I never said it was an 8800GTX killer, but for the price it is doing very, very well.
  37. Quote:
    Well, comparing the 2900XT to the GTX is pointless because it wasn't meant to fight it.

    I'd compare the 2900XT to the 8800GTS 640MB, which it DOES actually beat in some benchmarks, not all but some. And they are around the same price.

    Personally I'd go for the 2900XT of those two, because yes, they were delayed 6 months because of problems. These may be the problems that the 8 series is having with DX10. When we actually get a full-on DX10 game we can compare them properly, but until then there's no need to fight over it.
    In the DirectX 10 versions of Call of Juarez, Company of Heroes, and Lost Planet the 8800s performed better than the HD 2900XT, but it is too early to really know.

    I think the 2900XT cards are a lot more overclockable from the reviews I've read, and when they are overclocked they are damned close to the GTX in almost everything.

    If you have the cooling, of course... which I will soon :D
  38. Quote:

    I think the 2900XT cards are a lot more overclockable from the reviews I've read, and when they are overclocked they are damned close to the GTX in almost everything.

    If you have the cooling, of course... which I will soon :D


    I don't recall seeing anything whatsoever that would support this claim.
  39. Quote:
    I'd compare the 2900XT to the 8800GTS 640MB, which it DOES actually beat in some benchmarks, not all but some. And they are around the same price.

    Prices vary depending on where you are in the world. In the U.S., they are not around the same price; the 8800GTS is about $66 cheaper.

    Cheapest hd2900xt @ newegg $409.99
    http://www.newegg.com/Product/Product.aspx?Item=N82E16814241056

    Cheapest 8800gts 640 @ newegg $343.99
    http://www.newegg.com/Product/Product.aspx?Item=N82E16814130071
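    Combining those Newegg prices with the 3DMark06 numbers posted earlier in the thread gives a rough points-per-dollar comparison. A throwaway Python sketch (synthetic points per dollar, so take it for what it's worth):

```python
# Newegg prices above + 3DMark06 @ 2560x1600 scores from earlier in the thread.
cards = {
    "HD2900XT": {"price": 409.99, "score_06": 6448},
    "8800GTS 640": {"price": 343.99, "score_06": 5817},
}

# Print each card's 3DMark06 points per dollar spent.
for name, c in cards.items():
    print(f"{name}: {c['score_06'] / c['price']:.1f} 3DMark06 points per dollar")
```

    By that crude measure the cheaper GTS actually comes out ahead, at about 16.9 points per dollar versus about 15.7 for the XT.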
    Quote:

    These may be the problems that the 8 series is having with DX10. When we actually get a full-on DX10 game we can compare them properly, but until then there's no need to fight over it.

    What problems are the 8800s having in DX10? I hate to sound like an nVidiot, but you have got to link some kind of proof to show what you are talking about here. All of the DX10 benchies I have seen (I know they are not "fully" DX10, but still) show the 8800s doing fine. :?
  40. Quote:

    Correct me if I'm wrong, but if I read that link right, Kingpin had the same proc and memory but ran the test with 8800GTX's and not Ultras?


    True, sorry, I was thinking of Kinc's previous effort with 3DMark06, which used Ultras; Kingpin was using GTXs, although they were clocked beyond Ultra spec, but they were still GTXs.

    Quote:
    For nVidia's sake, I hope they were GTXs, given the 2900XT is supposed to compete with the GTS. It would stink for nVidia to think that "mid-range" 2900XTs in Crossfire squeaked past "high-end" 8800 Ultras in SLI.


    Now considering his previous efforts;
    http://www.nordichardware.com/news,6530.html

    I'd say GTX or Ultra. If the above is the HD2900 on AIR like your original post says (wish they had a pic of the video card frequencies in those other pics), versus the GTXs and Ultras on LN2, then there's a lot more headroom to go; Kinc already did the LN2 on a single card for that record.

    Doesn't mean much other than to show the maximum of the cards and their architecture. There are diminishing returns above 1GHz though, so even LN2 will likely not add too much; I suspect maybe 1GHz cores (which would be 100+ MHz below the single-card record).
  41. When you are one of the top overclockers in the world then you can say whether or not it is good at ocing. For now, shut your pimply face and go and tell your mom how nasty the people on the forums are. I am sure she will make you some cookies and ask you when you are moving out of the house...

    If kinc is using it without water cooling somehow i think it might just be a good overclocker.

    I don't see any information supporting that you are intelligent.

    Kinc's overclocking knowledge is likely a LOT better than yours, i don't see you with any records(apart from leading phukh34d on this forum), he holds just about all of them.
  42. Quote:
    When you are one of the top overclockers in the world then you can say whether or not it is good at ocing. For now, shut your pimply face and go and tell your mom how nasty the people on the forums are. I am sure she will make you some cookies and ask you when you are moving out of the house...

    If kinc is using it without water cooling somehow i think it might just be a good overclocker.

    I don't see any information supporting that you are intelligent.

    Kinc's overclocking knowledge is likely a LOT better than yours, i don't see you with any records(apart from leading phukh34d on this forum), he holds just about all of them.


    HATMAN's claim is that it overclocks close to the GTX... I haven't seen anything that would support this claim. If you didn't have such a hardon for me and applied a bit of reading comprehension, you'd understand that.

    You truly excel at being wrong on just about everything.
  43. Quote:
    the DirectX 10 versions of Call of Juarez, Company of Heroes, and Lost Planet the 8800s performed better than the HD 2900XT, but it is too early to really know.


    Actually, COJ doesn't seem to be doing as well on the GF8800 as it once did in the pre-release demos we had last month:

    http://www.extremetech.com/article2/0,1697,2147119,00.asp

    Right now it's still a crap-shoot of crap examples, but COH goes both ways (GTS vs XT [not GTX/Ultra]), COJ prefers ATi, and LP prefers nV. Seems pretty even so far from what I've seen.

    But like 3DMark, those benchies are kinda crap predictors of the major titles we're all waiting for, like UT3 and Crysis. Heck, even they aren't consistent amongst each other.
  44. I didn't say it overclocks close to the GTX; the performance benchmarks come close to the GTX when it IS greatly overclocked and the GTX is left at stock.

    I read a review; I'll see if I can find it in my history... can't argue with benchmarks, I'm afraid, lol. Well, ya can try, but since they can't argue back it's kinda pointless.

    Edit: I can't find it, but a review did show that normally it can't match the GTX, but overclocked to the max it performed better than the 8800GTS and closish to the GTX. When I say closish I mean closer to GTX performance than GTS.

    And they were benchmarks/FPS in 3DMark and games.

    One thing it also noted was that the 2900XT performed crap with AA enabled.
  45. Your use of insults makes me think you're unable to provide a compelling argument. Kinc may know more about overclocking, but you do not; just because the HD 2900XT does well in one synthetic benchmark at 1024x768 without antialiasing and anisotropic filtering does not mean it comes anywhere close to competing with the 8800GTX.
  46. Quote:
    Your use of insults makes me think you're unable to provide a compelling argument. Kinc may know more about overclocking, but you do not; just because the HD 2900XT does well in one synthetic benchmark at 1024x768 without antialiasing and anisotropic filtering does not mean it comes anywhere close to competing with the 8800GTX.


    You're right, it shouldn't, which is why beating the GTX is a bloody good achievement lol.
  47. Quote:
    the DirectX 10 versions of Call of Juarez, Company of Heroes, and Lost Planet the 8800s performed better than the HD 2900XT, but it is too early to really know.


    Actually, COJ doesn't seem to be doing as well on the GF8800 as it once did in the pre-release demos we had last month:

    http://www.extremetech.com/article2/0,1697,2147119,00.asp

    Right now it's still a crap-shoot of crap examples, but COH goes both ways (GTS vs XT [not GTX/Ultra]), COJ prefers ATi, and LP prefers nV. Seems pretty even so far from what I've seen.

    But like 3DMark, those benchies are kinda crap predictors of the major titles we're all waiting for, like UT3 and Crysis. Heck, even they aren't consistent amongst each other.

    Good find Ape, I hadn't seen that one. Looks like the 2900 gave the GTX a run for the money there.

    Hehe, at the end of the article they mention how nVidia is crying about that benchmark now, saying that they "changed the code" ROFL.

    Here read for yourself
    Quote:
    Since this benchmark was publicly released, Nvidia has sent mail to journalists in the PC enthusiast space calling into question some of its code. Their claim is that changes were made to some shaders and default "under the hood" settings that impact performance on Nvidia cards without making a substantial difference to the performance of ATI cards. We are unsure at this point if these code revisions will appear in the DX10 patch for the full game or not. Before everyone cries foul, we should note that this sort of thing happens a lot, and often in Nvidia's favor, thanks to their strong developer relations program.


    Sounds like AMD :lol:
  48. Quote:
    Now considering his previous efforts;
    http://www.nordichardware.com/news,6530.html

    I'd say GTX or Ultra. If the above is the HD2900 on AIR like your original post says (wish they had a pic of the video card frequencies in those other pics), versus the GTXs and Ultras on LN2, then there's a lot more headroom to go; Kinc already did the LN2 on a single card for that record.

    Doesn't mean much other than to show the maximum of the cards and their architecture. There are diminishing returns above 1GHz though, so even LN2 will likely not add too much; I suspect maybe 1GHz cores (which would be 100+ MHz below the single-card record).


    Nice link. I really like that pic and seeing the condensation cloud! Effing-A, that's cool! I hear what you're saying about having headroom.

    I too would have liked a pic other than the desktop screenshot showing scores. A shot or two of his hardware or the test bench with the 2900s would have been gravy on the biscuits, but...

    Thanks for the input Ape!