Hey, so mainly I play World of Warcraft and Team Fortress 2. I was getting between 8 and 17 fps, so I decided to swap out some parts.
First I swapped out my VisionTek X1550 (512MB) for an Nvidia GTS 250 (1GB). Nothing changed. I tried uninstalling the drivers, then reinstalling them, but nothing changed, so I decided to try something else.
Next, I swapped out my processor: I traded my AMD 5000+ (2.6GHz dual core) for an AMD 5600+ (2.8GHz dual core), which is the best my motherboard supports. But still, nothing changed.
I have 3GB of RAM (my motherboard supports up to 4GB), and my power supply has a max output of 250W. My next move is to go to Fry's and get a new power adapter, because my GTS 250 has two power jacks but it only came with one 6-pin adapter, so I'm going to go get another one.
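For context, a rough back-of-the-envelope power budget suggests that 250W PSU is the weak link. The wattage figures below are assumptions, not measurements: the GTS 250's board power is commonly quoted around 150W and the Athlon 64 X2 5600+'s TDP around 89W, and the 60W for everything else is just a placeholder guess.

```python
# Rough power-budget sketch. All figures are assumed/approximate --
# check the spec sheets for your exact part numbers.
gpu_w = 150    # assumed GTS 250 board power
cpu_w = 89     # assumed Athlon 64 X2 5600+ TDP
other_w = 60   # rough guess: motherboard, RAM, drives, fans
psu_w = 250    # the power supply's rated max output

total_w = gpu_w + cpu_w + other_w
print(f"Estimated draw: {total_w}W vs PSU rating: {psu_w}W")
print("Over budget!" if total_w > psu_w else "Within budget")
# → Estimated draw: 299W vs PSU rating: 250W
# → Over budget!
```

Even with generous rounding, the estimate lands well above the PSU's rated output, which is why a second power adapter alone may not solve anything.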
This is really annoying me because a lot of my friends have worse setups than I do, but are running these games at much better settings than I am.
All that matters is the resolution (of your monitor). Your friends could be maxing out their settings at 1024x768, and you might be using HD resolutions, something around 1280x1024. The higher the resolution, the more pixels; the more pixels, the more 'power' you need.
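The resolution point above is just arithmetic, and a quick sketch makes it concrete (1440x900, mentioned later in the thread, is included for comparison):

```python
# Pixels the GPU must render per frame at a few common resolutions.
# More pixels per frame means more work for the same frame rate.
resolutions = {
    "1024x768": 1024 * 768,
    "1280x1024": 1280 * 1024,
    "1440x900": 1440 * 900,
}
base = resolutions["1024x768"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the 1024x768 load)")
# → 1024x768: 786,432 pixels (1.00x the 1024x768 load)
# → 1280x1024: 1,310,720 pixels (1.67x the 1024x768 load)
# → 1440x900: 1,296,000 pixels (1.65x the 1024x768 load)
```

So a friend gaming at 1024x768 is pushing roughly 60% of the pixels of someone at 1440x900, all else equal.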
And that new video card should WIPE the floor with your old one. A new power supply was definitely in order. But that CPU upgrade was pretty much a waste.
But, when you got your new card, did you
1. Uninstall all video card drivers
2. Shut down the computer, swap out the cards.
3. Reboot and install new drivers, probably reboot after install again...?
Yeah, I'm running both my monitor and the game at 1440x900, which is what my monitor's rated for... and I went through all those steps after getting my new card... do you think it would make a difference if I plugged a power cable into both jacks? ...I'm just thinking that if it needed more power than it was getting, it just wouldn't run...