GeforceFX 5600 and 5200 benchmarks here...

cleeve

Illustrious
**EXTRAPOLATED**

These are EXTRAPOLATED based on the charts Nvidia supplied at the GDC. You can see them here:
http://www.tomshardware.com/business/20030307/gdc_2003-04.html

Their graph suggests that a GeforceFX 5200 runs UT2003 1024*768 at 4xFSAA about 2.4x as fast as a Geforce4 MX-8x, and that a GeforceFX 5600 runs it about 1.7x as fast as a Geforce4 4200.

I concentrated on the UT2003 engine because it's the easiest to find 4xFSAA benchmarks for, and that's what the Nvidia charts called for (read the small print on 'em: all the UT2003 benchmarks were done with 4xFSAA).
So don't take it too seriously! It's just for $hits and giggles, baby!


I based the Radeon 9500 PRO, Geforce4 4200 and 4600 numbers on the figures supplied by Mr. Pabst himself, here:
http://www.tomshardware.com/graphic/20021202/radeon_9500-07.html

Unfortunately, no one has benchmarked the Geforce4 MX-8x running UT2003 at 4xFSAA. I therefore guesstimated the numbers based on two things (worked through in the quick sketch below):
1 - The following Digit-Life article, which shows a 33 FPS framerate for UT2003 on a Geforce4 MX-8x with Quincunx antialiasing: http://www.digit-life.com/articles2/gf4/nv28-nv18.html
2 - This VR-Zone article, which shows that true 4xFSAA on a Geforce4 MX-8x runs about 53% slower than Quincunx: http://www.vr-zone.com/reviews/Prolink/GF4MX4408x/page4.htm
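
If you want to sanity-check that guesstimate, here's a rough back-of-the-envelope sketch (Python, just for illustration; the variable names are mine, and "about 53% slower" can be read two ways, so both are shown - the 18 FPS I use below sits roughly between them):

```python
# Back-of-the-envelope for the Geforce4 MX-8x baseline (guesswork, not a measurement)
quincunx_fps = 33.0                          # Digit-Life: UT2003, GF4 MX-8x, Quincunx AA

# "About 53% slower" is ambiguous, so both readings are shown:
drop_reading = quincunx_fps * (1.0 - 0.53)   # frame rate drops by 53%  -> ~15.5 FPS
time_reading = quincunx_fps / 1.53           # frame time grows by 53%  -> ~21.6 FPS

print(round(drop_reading, 1), round(time_reading, 1))   # 15.5 21.6
```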

Please don't flame me! I know these numbers are highly suspect. It's all just fun speculation!


And now, the (possibly meaningless) numbers:
--------------------------------------------

UT 2003 (1024*768, 32 bit, 4x AA)

18 FPS Geforce4MX-8x (Guesstimated)
43 FPS Geforce FX 5200 ULTRA (2.4x GF4 MX)
44 FPS Geforce4 4200 Ti
54 FPS Geforce4 4600 Ti
75 FPS Geforce FX 5600 ULTRA (1.7x GF4 4200)
75 FPS Radeon 9500 PRO

--------------------------------------------
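
To make it explicit, the two FX figures above are nothing more than Nvidia's chart ratios applied to those baselines; a quick check (same hypothetical sketch as before, nothing measured):

```python
# Nvidia's GDC chart ratios applied to the baseline numbers (pure extrapolation)
gf4mx_fps = 18.0      # my guesstimated GF4 MX-8x baseline
gf4_4200_fps = 44.0   # measured figure from the Tom's Hardware Radeon 9500 PRO review

fx5200_ultra = gf4mx_fps * 2.4      # -> 43.2, listed as 43 FPS
fx5600_ultra = gf4_4200_fps * 1.7   # -> 74.8, listed as 75 FPS

print(round(fx5200_ultra), round(fx5600_ultra))   # 43 75
```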

Even though these are wild guesses, they put the GeforceFX 5600 ULTRA neck-and-neck with the Radeon 9500 PRO, about where it should be.

And the GeforceFX 5200 ULTRA looks to be on par with the Geforce4 4200, not too shabby at all for a Geforce4 MX replacement...

Be well,

- Cleeve
 

eden

Champion
Well, I'd really like to believe these benches, but they don't really say anything about real life. For all we know, those ratios were taken from scores below 15 FPS. Assume 10 FPS: 2.4 times faster is still only 24 FPS. Is that really THIS much better if it's still frame-skipping?

Anyways, good extrapolation; it is indeed fun to speculate. However, system configs will always get in the way, and worse, the FX5600 Ultra, even if it's on par with the R9500PRO, may not be as successful as the R9600PRO if the latter turns out better than the R9500PRO and comes out at the same $199 US.
Though all this is EXTREMELY theoretical and EXTREMELY hypothetical, I do find myself confused about which of the upcoming x6xx cards from ATi or nVidia will be the worthy successor. Both come out at about the same price, but who knows what the performance will be, and whether the AA and aniso issues with nVidia have been fixed, as well as the image quality rendering problems with aniso on the FX.

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
 

davepermen

Distinguished
nvidia cards don't change currently. they all have the same features and bugs (okay, they strip out some features for $$$). they will not have better aa and af. no way.

for the rest, dunno.. but i think testing a gf4mx against any gfFX _should_ show a huge difference.. :D

i currently prefer the ati cards. why?
- stable drivers
- fast in dx9 featuresets (remember 3dmark03, ps2.0 tests, etc.. nvidia has huge problems there, no matter which chip)
- dx9 compliant (nvidia's cheap cards don't have all dx9 features, according to nvidia)
- known hw (hey, it's the r300.. or even better.. we know this chip is good. we don't know that about the fx :D)
- no stupid marketing (i really hate nvidias marketing part)
- best image quality in games
- and the image quality without much speed drop

"take a look around" - limp bizkit

www.google.com
 

eden

Champion
One thing I just don't get is why nVidia was trashing test 4's PS2.0 usage, when there is a test in 3DMark03 SPECIFICALLY called PS2.0. It seems like no one ever commented on that test...

Additionally, aside from PS2, what about VS 2.0 in DX9? How is it?

--
This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
 
I don't think that's true. I think they all have relatively the same ratio; the problem is that we currently have immature PS2.0 and DX9 cards. The DX8.1-ish capabilities of cards will likely remain fairly constant, whereas the breadth of GPU processing should mean that future cards increase their scores in test 4 while only marginally increasing tests 1-3. That's just my read on things, but I don't think the scores will reflect similar weighting in the future once the cards seriously work through the tests. But we shall see....

- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN :tongue: GA to SK