2900XT sets 3DMark05 World Record!

2900XTs in CrossFire achieve a world-record 3DMark05 score of a whopping 34,126 points! Check it out!

This amazing score can also be attributed to a heavily overclocked X6800 and some other top-quality parts. The article doesn't mention whether the system was air- or water-cooled (my guess is water), but it does say the 2900XTs were air-cooled. This lends some support to the idea that the 2900XT's "poor" performance comes down to driver issues.

I am curious to see what the score would be if the same effort were put into SLI'd 8800GTXs or Ultras.
 
Doesn't make me wonder.
The 05 version is the one that favors ATi.

Ahh yes, the benchmark bias argument...

IMO, 34,126 is impressive, but it's hard to relate what that number means without a comparative measure. I'd like to see a similar test with an nVidia-based SLI setup.
 

Phrozt

Distinguished
Jun 19, 2002
Doesn't make me wonder.
The 05 version is the one that favors ATi.

Ahh yes, the benchmark bias argument...

IMO, 34,126 is impressive, but it's hard to relate what that number means without a comparative measure. I'd like to see a similar test with an nVidia-based SLI setup.

It's hard to relate because it's a worthless number. Who gives a crap if it posts 648 million? If it can't beat a different card in an actual game, then the numbers are meaningless.

Even if you test it against an nVidia SLI setup, it still tells you nothing, regardless of whether it performs better or worse than the nVidia setup.

As for the bias argument: if you have a synthetic program that regularly scores one brand of cards higher than the other, but in actual performance that brand does worse, it's pretty obvious that there is a bias. I don't think there's much of an argument there; it's just common sense.
 

IcY18

Distinguished
May 1, 2006
Yeah, in 3DMark06 the 2900XT does not do as well as the 3DMark05 scores would indicate.


3DMark05 @ 2560x1600

8800Ultra 13233
2900XT 11789
8800GTX 11138
8800GTS 9496

3DMark06 @ 2560x1600

8800Ultra 7835
8800GTX 6802
2900XT 6448
8800GTS 5817
Link
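Putting those quoted numbers side by side, a quick back-of-the-envelope check (a minimal Python sketch, using only the scores posted above) shows the flip: the 2900XT's lead over the 8800GTX in 05 turns into a deficit in 06.

```python
# 2560x1600 scores as posted above (assumed accurate as quoted).
scores_05 = {"8800Ultra": 13233, "2900XT": 11789, "8800GTX": 11138, "8800GTS": 9496}
scores_06 = {"8800Ultra": 7835, "8800GTX": 6802, "2900XT": 6448, "8800GTS": 5817}

def gap(scores, a, b):
    # Percent lead of card `a` over card `b` in the given score table.
    return 100.0 * (scores[a] - scores[b]) / scores[b]

print(f"3DMark05, 2900XT vs 8800GTX: {gap(scores_05, '2900XT', '8800GTX'):+.1f}%")  # +5.8%
print(f"3DMark06, 2900XT vs 8800GTX: {gap(scores_06, '2900XT', '8800GTX'):+.1f}%")  # -5.2%
```

So the same two cards swap places between the two versions of the same synthetic benchmark, which is the point being made here.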


But all it takes is seeing how an 8600GT/GTS performs in 3DMark06 and then comparing it to an X1950Pro or XT in the real world to realize how worthless synthetics are.
 

bkiserx7

Distinguished
Dec 12, 2006
Isn't setting the resolution to 1024x768 without AA kind of like cheating?

That was one of the first things I noticed about the screenshot... while the BIG numbers seem impressive, 1024x768 with no AA is somewhat of an anticlimax.

That's the default for 3DMark, and it always has been.
 

bkiserx7

Distinguished
Dec 12, 2006
2900XTs in CrossFire achieve a world-record 3DMark05 score of a whopping 34,126 points! Check it out!

This amazing score can also be attributed to a heavily overclocked X6800 and some other top-quality parts. The article doesn't mention whether the system was air- or water-cooled (my guess is water), but it does say the 2900XTs were air-cooled.

Pretty sure that X6800 was on LN2
 

Heyyou27

Splendid
Jan 4, 2006
2900XTs in CrossFire achieve a world-record 3DMark05 score of a whopping 34,126 points! Check it out!

This amazing score can also be attributed to a heavily overclocked X6800 and some other top-quality parts. The article doesn't mention whether the system was air- or water-cooled (my guess is water), but it does say the 2900XTs were air-cooled. This lends some support to the idea that the 2900XT's "poor" performance comes down to driver issues.

I am curious to see what the score would be if the same effort were put into SLI'd 8800GTXs or Ultras.
That has to have been overclocked on LN2, as it was running at 5.3GHz. I guess it's an impressive 3DMark05 score, but it doesn't mean the cards are the best in the world.
 
This amazing score can also be attributed to a heavily overclocked X6800 and some other top-quality parts.

....

I am curious to see what the score would be if the same effort were put into SLI'd 8800GTXs or Ultras.

I don't know what you mean, because the CPU is at about the same speed (actually with a lower FSB, so the benefit comes from the multiplier, not the whole board) as the one KingPin used on his OC'ed GF8800Ultras to hold the previous record before the HD2900s came along:

http://service.futuremark.com/compare?3dm05=2898493

I would say that's at least an equal effort to Kinc's and with mature hardware and software.
 

Rabidpeanut

Distinguished
Dec 14, 2005
Look, if it can do it in a benchmark, it can do it in a game.

You just refuse to think, don't you?

They are both 3D applications, thus they can both be processed as fast.

You said the card sucked, but since the only area in which you can cheat with a benchmark is drivers, it means the card is capable of more than you insisted it was. You just don't want to eat your words. It is a 3D app, the card is made for 3D apps, and it is winning in a 3D app. So now 3D apps don't matter? You will likely just say that they rigged the drivers. Well, really, genius, the whole problem with the card WAS the drivers, and now that they've fixed them you say it doesn't matter? I can't believe I am hearing anything so stupid. Your words are being forcefully pressed down your throat, and you don't like it one bit, so you will pretend it is not happening.
 

Heyyou27

Splendid
Jan 4, 2006
Look, if it can do it in a benchmark, it can do it in a real-world app.

You just refuse to think, don't you?
So the 7900GTX was faster than the X1900XTX when it performed better in 3DMark06? If you're relying on synthetic benchmarks to justify your purchase of a card, please keep it to yourself. I bet if you ran 3DMark05 at 2560x1600 with 4xAA and 16xAF enabled, the 8800GTXs in SLI would outperform it.
 

Rabidpeanut

Distinguished
Dec 14, 2005
Look, the whole driver base for this card is fucked right now, so if they were able to show this much improvement in one application, they can elsewhere too.

The 8800s would obviously outperform it, but I was talking about the driver improvement SO FAR.

First prove to me that they are cheating in the benchmark like nVidia always does. Next thing we know, 3DMark will start up saying 'the way it's meant to be played'.

When the hardware is fundamentally better, all you need to do is wait for it to be implemented correctly. I see a correct implementation.
 

Heyyou27

Splendid
Jan 4, 2006
Look, the whole driver base for this card is ****** right now, so if they were able to show this much improvement in one application, they can elsewhere too.

The 8800s would obviously outperform it, but I was talking about the driver improvement SO FAR.

First prove to me that they are cheating in the benchmark like nVidia always does. Next thing we know, 3DMark will start up saying 'the way it's meant to be played'.
I never said ATI was cheating, but you quickly get on the defensive and say nVidia always cheats on synthetic benchmarks? Running 3DMark05 at 1024x768 is not a good indicator of real-world performance; I don't give a damn who does better in synthetic benchmarks. Look, it's great that you love your HD 2900XT, but playing games at 1024x768 without antialiasing and anisotropic filtering just doesn't interest me.
 

SuperG

Distinguished
Jul 21, 2006
I agree this is meaningless for gamers or for graphics-card comparison. With such low settings it's more of a CPU benchmark; it shifts the load to the CPU.

Also, good 3DMark scores do matter to nVidia and ATI, as the benchmark shows up in most reviews, which means it gets driver-support priority, just like other very popular games. For the not-so-popular games you have to wait a lot longer; there ATI runs behind and takes several months.

It's well known that the HD2900 rocks in real games at low resolutions with no AA and no AF.

So there's nothing biased about that: the R600 flies under this low load.

It's known that the R600 has a lower fill rate.
 

Rabidpeanut

Distinguished
Dec 14, 2005
And once again it all comes down to drivers, and they will fix it eventually. So you can naysay all you want, but this card is going to beat your 8800 eventually.
 

sweetpants

Distinguished
Jul 5, 2006
Look, if it can do it in a benchmark, it can do it in a game.

Not to jack the thread, but I've often wondered about this.

What is the difference between benching several cards in 3DMark 05/06 on systems that are as similar as possible and testing those same setups in real-world games?

Is 3DMark 05/06 still considered useless because it is synthetic? If so, why? Assuming they are NOT BIASED, what makes one set of information more valuable than the other?
 
It's hard to relate because it's a worthless number. Who gives a crap if it posts 648 million? If it can't beat a different card in an actual game, then the numbers are meaningless.

Even if you test it against an nVidia SLI setup, it still tells you nothing, regardless of whether it performs better or worse than the nVidia setup.
Benchmarks, synthetic or not, are a legitimate and widely accepted measure of relative performance. Are you implying that benchmarks are altogether worthless, or just worthless in this instance? Even in-game benchmarks are hard to take seriously when a game's opening sequence begins with "nVidia, the way it's meant to be played!" So, if not benchmarks, real or synthetic, how are we as consumers and enthusiasts to evaluate and compare competing products?

As for the bias argument, if you have a synthetic program that regularly tests one brand of cards higher than the other, but in actual performance that brand does worse, it's pretty obvious that there is a bias. I don't think there's much of an argument there, it's just kind of common sense.
I only made the "bias argument" remark because it has been a recurring theme any time ATI/nVidia or Intel/AMD best each other in benchmarks.