I bought an XFX FX5200 a while ago to tide my computer over until I get a new one. From what I've heard, a standard 5200 runs at 250MHz core / 400MHz memory (i.e., 200MHz x2 DDR), but everything that shows the card's speeds, like RivaTuner or Coolbits, says 250MHz/333MHz. So is the memory running at 333MHz or 666MHz?
It's well known that video card manufacturers often use slower, cheaper RAM on cards with more memory, so I gather you have a 128MB FX5200 and got stuck with some slow RAM. Manufacturers often stray from Nvidia's reference specs, either to save money or, occasionally, to create a niche product with specs above the defaults.
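For what it's worth, the 333MHz-vs-666MHz confusion usually comes down to DDR's double data rate: the chips transfer data on both clock edges, so marketing (and some monitoring tools) quote twice the physical clock. A rough sketch of the arithmetic, with the physical clock figures assumed purely for illustration:

```python
# DDR SDRAM transfers data on both clock edges, so the "effective"
# rate is double the physical clock. Whether a given tool reports
# the physical or the doubled figure varies, hence the confusion.
def ddr_effective_mhz(physical_mhz):
    """Effective DDR transfer rate for a given physical clock."""
    return physical_mhz * 2

# Assumed physical clocks, just to show the doubling:
print(ddr_effective_mhz(166.5))  # 333.0 -> a "333MHz" reading may already be the doubled figure
print(ddr_effective_mhz(200.0))  # 400.0 -> the reference "400MHz" memory spec
```

So if the tool is already reporting the doubled DDR figure, a 333MHz reading means the physical clock is 166.5MHz, not that the card is secretly running at 666MHz.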
FYI, you could also have a card with a 64-bit memory bus.
Is it a low-profile card? From what I've gathered, most (if not all) low-profile MX440/FX5200 cards have only a 64-bit memory bus, which is a <b>huge</b> performance hit on an already mediocre card.
To check, download RivaTuner; the memory bus width shows up under detailed info.
You have a 64-bit memory bus there too, unfortunately. If you plan on doing any gaming, you should probably exchange that card for something better... a 64-bit 5200 is basically on par with a GeForce2 MX.
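To put rough numbers on the bus-width penalty: peak memory bandwidth is just the bus width (in bytes) times the effective transfer rate. A quick sketch, where the clock figures are assumed reference values rather than measurements from this particular card:

```python
def peak_bandwidth_gbs(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

# Assumed effective DDR rates, for illustration only:
print(peak_bandwidth_gbs(128, 400))  # 6.4 GB/s  -- a full-spec 128-bit FX5200
print(peak_bandwidth_gbs(64, 333))   # ~2.7 GB/s -- the cut-down 64-bit variant
```

Halving the bus width (and pairing it with slower RAM) cuts the card's peak bandwidth to well under half of the reference spec, which is why the 64-bit versions fall so far behind in anything fill-rate or texture heavy.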
Well, as long as you're happy with it. But you should be aware that a cheap GeForce4 Ti4200 for $75 or so would probably triple your framerates. Even a cheapo Radeon 9600SE would double them.
Then again, on a CPU that old it might not make much of a difference. But you say you plan to upgrade your CPU, so this is definitely not the card for you...
Your definition of "smoothly" is certainly not the same as mine, methinks.
I sold my ATI All-in-Wonder 9800 Pro for $200 and was given a GeForce FX 5200 as well.
I hated the ATI card because it performed so poorly on my nForce motherboard. I had been using a GeForce3 Ti500 (the best video card I've ever owned) for about a year and a half without any trouble, and I figured an upgrade to the GeForce FX 5200 would be cool. So I put the GF3 in my girlfriend's computer and the new GeForce FX 5200 in mine. Boy, was I wrong.
I play CounterStrike at 1600x1200 on my Dell 2000FP monitor (DVI out), no AA, and graphics always set to performance over quality. The ATI card would average 100fps but would drop to 30fps during combat. The GeForce3 Ti500 averaged 130fps and rarely dropped below 100fps. The GeForce FX 5200 averages 50fps and often drops to 30 or LESS. If a smoke grenade appears on screen (even with low-quality smoke), my FPS drops to 10 with this horrible GeForce FX 5200.
Seriously... I think my TNT2 card performed better than this FX card.
There is *ONE* game that played well on it, and that was the Call of Duty demo. I haven't tried that game on the other two cards.
I'm ... disappointed with the FX series so far. I'd like to get the 5950, but that's a little pricey. On the flip side, I'll never buy another ATI card. I've had nothing but bad luck with them ever since my first Rage Fury.
I'd argue the AIW Radeon 9800 Pro is the best card you've ever owned. I would have traded you three GF3s or two GF4 Ti4200s for that thing. I bet you were a victim of driver conflicts, since you switched over from an Nvidia card.
Have you tried updating your drivers yet?
Play with the OpenGL settings in the advanced display properties, turn everything toward performance over quality, and see if that helps.
Although with a 64-bit bus on that card, I wouldn't expect much from any game made after 2001-02.
Is the FX5200 fully DX9? I've heard that with Halo, if you turn off all the fancy DX9 effects, the game runs much more smoothly. Could it be that your friends have all the effects on while you have all of yours off?
A 64-bit bus won't just kill OpenGL; it'll kill everything.
What you have there, sir, is a GeForce2 MX200 that can run DirectX 9 shaders.
And very slowly at that.
I'm afraid the idea of your computer getting better framerates than theirs is quite impossible... unless they're running 6x AA on 1.2GHz Celerons. Or maybe if you're running 640x480 while they're running 1600x1200...
I get 2492 3DMarks in '01, and no, I'm running with everything on at 800x600, forcing PS2.0 because the other shader versions performed worse, believe it or not. One friend has an Asus A7N8X-E Deluxe nForce2 board and an Athlon 1700+ with the 9800 Pro, getting around 107fps average at 1024x768 with no FSAA/AF. And I never said I was getting "better" framerates, just enough.
Having an FX5200 with a 64-bit memory bus isn't going to make it worse than a TNT2 with a 64-bit memory bus. All I asked was whether anyone knew what could be causing OpenGL to perform that badly, even in old games.
I thought you still had the opportunity to return it, which is why I brought it up. Pretty much ANY budget card with a 128-bit memory bus will be a huge improvement. Even an old GF4 MX440 (128-bit) 64MB 8x AGP should score around 6000-7000 in 3DMark 2001.
Even though the stickied guide says not to go with Nvidia cards, any 128-bit version of the above cards will significantly outperform the one you have now.
Myself, I would try to track down a GF4 Ti4200. If not, the best of the 9xxx ones is the 9100 (64MB), then the 9200, then the 9000.
The good thing about picking any of the Radeon 9xxx cards is you KNOW they will be 128-bit versions (except the SE, which is 64-bit, but at least ATI has the decency to change the name so it's easy to tell).
Sounds like a good choice. You can then be sure you'll get what you want (128-bit memory). Just make sure it's not a 9200<b>SE</b>, or you won't be much better off than what you have now.
It's a shame that Nvidia and/or its board partners can't do everyone a favor and, like ATI, name the cards so we know whether they're 64-bit or 128-bit.
I recently went through the same confusion myself. Very frustrating.