Hello everyone
I have a query regarding my gaming PC and the average FPS I get in most games. First off, my setup is as follows:
Intel Core i5-2500K (running around 3.6 GHz)
8GB Corsair DDR3-1600 CL9
Sapphire HD6950 2GB
ASUS P8Z68-V GEN3 Motherboard
Samsung 26" LCD
1TB SATA 6G HDD
Coolermaster 650W Modular PSU
Windows 7 with SP1
Games I'm playing or have played recently include Batman: Arkham Asylum, Arkham City and currently Arkham Origins, as well as BioShock Infinite, Mass Effect 3, and Splinter Cell Blacklist.
My problem is that the FPS I get in my games is nowhere near what I feel I should get, based on various benchmarks and reviews I've found on this site and others.
Allow me to give an example:
In Batman: Arkham Origins, I loaded the game, changed the resolution to 1920x1200, set AA to FXAA High, and had to reduce all the DX11 settings to Normal to reach a playable frame rate averaging 50-70 FPS. It's the same with most other games.
Yet in reviews and benchmarks, I see similar graphics cards running at 2560-wide resolutions with all settings on High, still maintaining a good 40-60 FPS.
Is there perhaps something wrong with my computer setup?
Oh, and I'm running the latest ATI drivers, with the card overclocked from 800/1250 to 840/1325, with +10% power and 50% fan speed.
Any help or tips would be greatly appreciated.
Thanks