Hello some assistance please.
So here's my situation. My gaming computer is an i5-2500K Sandy Bridge with 8 GB of RAM. Initially I had a GeForce GTX 560 Ti in it, but I recently upgraded it to a GTX 670 and installed Windows 8 at the same time. I noticed that some of my games, even old ones, suffered from severe frame rate drops. That wouldn't bother me much on its own, since that just happens in some parts of some games, but the drops did not happen in the same situations on my secondary computer, which is an Intel Quad Core Q9300 with 8 GB of RAM and the GTX 560 Ti I removed from my gaming computer.
I did some research and found that Windows 8 really isn't well optimized in a lot of ways yet, so I figured that might be the problem. I reinstalled Windows 7, but the problem persisted. Then I read that the GeForce 600 series doesn't play nice with a lot of games either. So I uninstalled all the drivers and reinstalled the GTX 560 Ti... in other words, I put the system back the way it was prior to my "upgrades"... and the problem still persisted. It seems I did something to my system at its core that's causing this drop in FPS.
Possibilities I have read about in my research:
1. Drivers - they are all updated
2. DirectX - I have DirectX 11 installed, up to date.
3. CPU temperature - I am not overclocking, and the temperature seems stable at 58-63 °C.
4. Spyware/extra programs - I am running things exactly the same way I did prior to my attempted "upgrades", back when this frame rate issue was not happening.
The only difference I can see between now and before the upgrades is that some programs still show the blue-and-gold administrator shield on their icons, even though I have UAC turned all the way down to the lowest level.
There's just no reason this machine should be outperformed by my far weaker computer, and I have no idea what might be causing it. Any thoughts?
Thank you