GTX 570 Worse Performance Than GTX 275?

sporkfire

Distinguished
May 8, 2011
Hey guys,

So I recently upgraded from a GTX 275 to a GTX 570 and was expecting performance increases in games. However, when I tested games on the new card I found I was getting lower FPS than with my GTX 275, which obviously isn't right. In fact, I was just playing Battlefield Bad Company 2 in DirectX 9 on high settings (the same settings I always used on the GTX 275) and my FPS was dropping into the 20s and even into the teens! It's frustrating because my GTX 275 ran these settings very smoothly, and when I upgraded I was expecting an increase in performance, not a decrease.

It's also weird that the card's performance seems to get even worse over time (although it starts off poorly in the first place). For instance, when I first start playing I get around 25-40 FPS, which is still worse than my GTX 275, but the longer I play the further it drops, down to around 15-28 FPS.

I have an older computer that I built in 2007 (which still uses Windows XP Professional) but have made a few upgrades to it. Here are my system specs:

Motherboard: EVGA 680i SE
CPU: Intel Core 2 Duo @ 3.00 GHz
Memory: Corsair Dominator DDR2 1066
Graphics Card: EVGA GTX 570 SC
Power Supply: PC Power & Cooling 750W

Now, I understand that games like BFBC2 prefer a quad-core processor, but I should still be getting higher FPS than that, and like I said, I was getting much better FPS with the same setup on my GTX 275. I've also tried a couple of other games and noticed I can barely hold 30 FPS, often dropping into the 20s even in an old game like The Witcher.

Here is a list of things I've tried so far (list is not in chronological order):

-Cleaned out the old drivers and reinstalled the latest ones
-Updated the BIOS to the latest version
-Installed the latest motherboard drivers
-Tested in other user profiles
-Checked running processes; nothing is eating up resources
-All hardware is at default settings (no overclocking, unless it came factory overclocked)
-Checked the temperatures; everything runs cool, including the graphics card
-GPU-Z shows GPU usage is low while playing games, even though I get terrible FPS (one way to cross-check the CPU side at the same time is sketched below)
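
Since GPU-Z shows the card mostly idle, it's worth watching per-core CPU load at the same time: on a dual core, a game that pegs one or both cores while the GPU sits low is almost certainly CPU-limited. A minimal sketch of that kind of polling, assuming the third-party psutil package is installed (it may not run on an XP-era Python, so treat it as illustrative):

    # Hypothetical sketch: sample per-core CPU load once a second while a game runs.
    # A core stuck near 100% alongside low GPU usage points at a CPU bottleneck.
    import psutil  # third-party package; an assumption, not something from this thread

    for _ in range(30):  # sample for roughly 30 seconds
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        print(" ".join(f"{p:5.1f}%" for p in per_core))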

So I'm stumped. I don't know what else to do, which is why I came here to ask others who might know what's going on.
 
It seems all too common that people upgrading to what should be a faster, newer card sometimes get poor results. The cause is often a driver issue. Your CPU will also be bottlenecking that GPU; maybe look on eBay for a Q6600 or similar and overclock it to 3 GHz+ with a decent cooler. Also, how much RAM do you have? You should be running 4 GB+.
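
A rough way to see why the CPU caps things: each frame takes roughly max(CPU time, GPU time), so once the CPU side dominates, swapping in a faster GPU changes nothing. A minimal sketch of that model, with made-up numbers purely for illustration:

    # Hedged sketch: frame_time ~ max(cpu_ms, gpu_ms); all numbers are hypothetical.
    def fps(cpu_ms, gpu_ms):
        return 1000.0 / max(cpu_ms, gpu_ms)

    print(fps(cpu_ms=40.0, gpu_ms=25.0))  # slower card, CPU-bound: 25.0 FPS
    print(fps(cpu_ms=40.0, gpu_ms=12.0))  # faster card, still CPU-bound: 25.0 FPS
    print(fps(cpu_ms=15.0, gpu_ms=12.0))  # faster CPU as well: ~66.7 FPS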
 

sporkfire

Distinguished
May 8, 2011
Thanks for all the replies. I ran GPU-Z and the Bus Interface reads PCI-E x16 @ x16. I should also mention that even if I lower the graphics settings in a game such as BFBC2, my FPS either stays the same or sometimes drops even further, which is weird.
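
For what it's worth, on systems with a reasonably recent NVIDIA driver, the nvidia-smi tool can log the same counters GPU-Z shows from a script. I'm not sure the XP-era driver even ships it, so this is a sketch under that assumption rather than something verified on this machine:

    # Hedged sketch: poll GPU utilization and current PCIe link width each second.
    # Assumes nvidia-smi with --query-gpu support is on the PATH (may not exist on XP).
    import subprocess, time

    for _ in range(60):  # poll for about a minute
        out = subprocess.check_output([
            "nvidia-smi",
            "--query-gpu=utilization.gpu,pcie.link.width.current",
            "--format=csv,noheader",
        ])
        print(out.decode().strip())
        time.sleep(1)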

I ran Afterburner hardware monitor while playing Battlefield Bad Company 2 and The Witcher. Here are the results:


Battlefield Bad Company 2 (DirectX 9, high settings, 4x AF, no AA, resolution 1680x1050)

[screenshot: BFBC2-1.jpg]


The Witcher (High settings, everything turned on, everything maxed, Resolution 1680x1050)

[screenshot: TheWitcher.jpg]


As you can see, GPU usage is extremely low in both games, hovering around 25-30%. It looks slightly higher in The Witcher.
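
A back-of-the-envelope read of those numbers: if the card is only ~30% busy while delivering ~25 FPS, then fully fed it could in principle push roughly 25 / 0.30 ≈ 83 FPS, so the limiter is somewhere other than the GPU. A tiny illustration with stand-in values (not exact readings from the screenshots):

    # Hypothetical estimate: what could the GPU deliver if it were fully utilized?
    observed_fps = 25.0   # stand-in for the in-game FPS
    gpu_util = 0.30       # stand-in for the ~30% GPU usage in the graphs

    print(f"~{observed_fps / gpu_util:.0f} FPS of GPU headroom if not CPU-limited")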


I ran some benchmark software (Unigine Heaven and Tropics) to test the GPU. I ran Heaven in OpenGL because in DirectX I was getting some weird wavy yellow lines on textures; Tropics I ran in DirectX 9.

Here are the hardware monitor results:

Heaven OpenGL (high settings, no AA, 8x AF, medium tessellation @ max distance, everything else on, resolution 1680x1050)

[screenshot: Heaven1.jpg]


Tropics DirectX 9 (everything turned on and maxed out, resolution 1680x1050)

[screenshot: Heaven2.jpg]


I also took a screenshot of task manager while running the Tropics benchmark:

[screenshot: Heaven2TaskManager.jpg]


So it seems GPU usage sits around 99% most of the time in these benchmarks. I did try 3DMark06, but since it was the free basic version I couldn't change the image settings, and I was getting somewhat low FPS even at the defaults, which are pretty low.

I'm just curious: if it's my CPU's fault, then why does it perform so poorly in an old game like The Witcher, which my CPU should be more than enough for? Also, it still doesn't make sense that my performance would go down in these games; at the very least it should match my old GTX 275.