While playing Assassin's Creed, I've been getting some odd performance results from my GPU. Here's a list of my specs:
CPU: i7-2600 at stock
GPU: GTX 560 Ti (I've been playing with the clock speeds to find something stable)
RAM: 8 GB 1333 MHz
Motherboard: I don't know the exact model off the top of my head, but it's an MSI board with a P67 chipset
I'm also playing at 1280x720
While playing, my CPU and GPU will both be running at about 50%, but I'll only be getting somewhere between 60-70 FPS, dropping to around 30 FPS if I'm running around in areas with many buildings, people, and scenery. The GPU usage stays constant most of the time and only occasionally dips under heavy, sudden load, but I'm not worried about that. My CPU, however, usually stays near 50% but sometimes goes up to 80-90%, yet I'll still have the same framerate and the same GPU usage. I'm confused as to why neither of these components is working to the best of its capabilities.

This also happens in other games such as Batman: Arkham City, but it's less of a problem there since I get higher framerates. I've also tried setting everything in-game to the lowest it can go, but there was no performance increase. I might have gained a few extra frames, but the GPU would then be working at somewhere near 25-30%, which leads me to believe this has something to do with the CPU. Any help or advice is greatly appreciated.
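In case it's useful: since the i7-2600 has 4 cores / 8 threads, an overall reading of ~50% could still hide one core being maxed out, so I could log per-core usage while playing. This is just a rough sketch of how I'd do that with Python's third-party psutil package (assuming it's installed); Task Manager's per-core graphs would show the same thing.

```python
# Rough sketch: log per-core CPU usage once a second while the game runs.
# Requires psutil (pip install psutil). If one logical core sits near 100%
# while the average reads ~50%, that would point to a single-thread bottleneck.
import psutil

try:
    while True:
        # Percent usage for each logical core, sampled over a 1-second window.
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        total = sum(per_core) / len(per_core)
        print(f"avg {total:5.1f}% | " + " ".join(f"{p:5.1f}" for p in per_core))
except KeyboardInterrupt:
    pass  # stop logging with Ctrl+C
```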