Confusing 3dmark06 score on x800 pro

labortius

Distinguished
May 17, 2004
182
0
18,680
Hi!

I'm getting a 3DMark06 default result of 1808 with a stock processor (Athlon64 3500 Venice), 1GB RAM and no overclocking on my x800 pro.

According to the VGA Charts, run on an FX-60, an x800xt should get 1295, and x1800XL gets 1715.

Both these cards are supposed to be faster than the x800 pro, unless I'm confused. Are there any reasons that might explain these high results?

Overclocking my x800pro gets me 2002 3dmark06 points!

Thanks for any help.
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
Where are you getting these figures from? Even if you were able to run 3DMark06 on your X800 pro, there are key Shader Model 3.0 tests you would be unable to run.
 

labortius

Distinguished
May 17, 2004
182
0
18,680
I forgot about SM 3.0 (believe it or not!)

Perhaps 3DMark skips these tests and gives an average "credit" for the tests I didn't run, pushing up my score.

That said, surely it would be doing the same with the x800xt, which is supposed to be faster, I think, yet my x800 pro seems to beat it, and with a slower processor as well!

I'm going to reinstall 3DMark06 and rerun the tests. Maybe it was just a fluky one-off result. If not, I guess I can stick with my x800 pro for a couple more years!!!
 

kaotao

Distinguished
Apr 26, 2006
1,740
0
19,780
Maybe you are using the free version, and Tom's isn't. The free version doesn't allow you to enable AA, AF, or use higher resolutions. I'm still not sure where you're getting that. Link?
 

labortius

Distinguished
May 17, 2004
182
0
18,680
I got it from the madonion site.

http://www.futuremark.com/

I'm not sure about the AA / AF tests, but the resolution is the same.

I thought the free and registered versions gave the same results?!

I have just rerun it, 1024 x 768 with trilinear, 1896 3dmarks.
Default clock speeds, 475, 450.

992 SM2, no sm3, 852 cpu tests.

Is there anyone else with this same card with a similar CPU who can tell me their results?
 

raven_87

Distinguished
Dec 29, 2005
1,756
0
19,780
You're using a non-SM 3.0 GPU coupled with a silly benchmark....

I'm sure real world results would show you the difference.

I've said it before: benchmarks are good for minor comparisons, but they're mainly only useful as a guide for system tuning, tweaking, and performance.
 

labortius

Distinguished
May 17, 2004
182
0
18,680
So you are saying 3DMark is silly because the results change for no good reason? Also, that it overcompensates for the fact I can't run the SM3 tests?

Two good points, and they explain the strange results, but why didn't the THG tests have the same problem? An x1800XL is SM3, but it scores less than the x800 pro? An x800xt gets much less (-600), but is a faster card?

That must mean THG applied some other smoothing to the scores to compensate for the fact that 3DMark ain't compensating for SM3, but forgot to for the x1800xl?
 

raven_87

Distinguished
Dec 29, 2005
1,756
0
19,780
Possible, however an X1800XL should score about double what they gave it.
My X1800XT got just over 5k (haven't done the 1900 yet). Something is amiss, yet I'd take it with a grain of salt. Just go on the ORB and compare results for some similar systems; then I think you'll see where you stand.
 

icbluscrn

Distinguished
Jan 28, 2006
444
0
18,780
What was your breakdown of SM2 and CPU scores?

I like to rely on 3DMark scores because, if you don't use SLI, they seem to be a good reference for gameplay FPS. I mean, I have been doing a lot of benching and FPS comparing, so I feel comfortable with the results.

3DMark06 just omits the SM3 score if the card does not support it.

Minimum System Recommendations
"DirectX® 9 compatible graphics adapter with Pixel Shader 2.0 support or later, and graphics memory of 256 MB minimum"

It says what happens without enough video memory, but not what happens without SM3 support:

"* To run the HDR/SM3.0 graphics tests, a DirectX 9 compatible graphics adapter with support for Pixel Shader 3.0, 16 bit floating point textures and 16 bit floating point blending is required.

It is possible that 3DMark06 will run on PCs that do not meet the recommendations above, but the benchmark performance may be seriously affected. For example, insufficient video memory will result in texture swapping - this will cause fluctuations during the tests, reducing the reliability of the generated scores."


Well, with my D805 @ 3.8GHz and an X800GTO2 OC'd, I got 2350 with the default setup.
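If the composite score works like a weighted mean of the sub-test results, then omitting the SM3 tests and renormalizing the weights over the remaining tests would quietly inflate a non-SM3 card's total. A minimal sketch of that effect, with made-up sub-scores and weights (this is not Futuremark's published formula, just an illustration of the renormalization idea):

```python
import math

def weighted_geometric_mean(scores, weights):
    # Combine sub-scores into one composite number; the weights are
    # renormalized over whichever tests were actually run.
    total = sum(weights)
    return math.exp(sum(w * math.log(s)
                        for s, w in zip(scores, weights)) / total)

# Hypothetical sub-scores: SM2 = 992, SM3/HDR = 700, CPU = 852,
# with invented weights 2:2:1.
full = weighted_geometric_mean([992, 700, 852], [2, 2, 1])

# Same card with the (weak) SM3 test omitted and weights renormalized
# over the remaining two tests:
partial = weighted_geometric_mean([992, 852], [2, 1])

# The partial composite comes out higher than the full one, because the
# card is no longer penalized for its weakest sub-test.
print(full < partial)
```

Under this toy model, a non-SM3 card that simply skips the HDR tests can end up with a higher composite than a slightly faster SM3 card that runs them and scores poorly, which would fit the x800 pro vs x1800XL oddity in the charts.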
 

labortius

Distinguished
May 17, 2004
182
0
18,680
I have just rerun it, 1024 x 768 with trilinear, 1896 3dmarks.
Default clock speeds, 475, 450.

992 SM2, no sm3, 852 cpu tests.

ORB won't work for me (confession time) because 3dmark06 is stupidly set to run at a higher resolution than my LCD TV so I had to (hangs head in shame) use a crack just to get it to run. It knows I did something funny, so it won't let me use ORB...

All the other 3Dmarks could change resolution in demo mode, but not this. It's pretty annoying that I have to join the ranks of the great unwashed pirates just to get the "demo" to run with my setup.