Hello All,
I built an entry-level gaming PC a few months back, and while I'm not surprised that I can't max out current (next-gen) games, I was a bit disappointed that my PC can't easily max out games that are now a few years old. As I've primarily been a console gamer for the past few years, I wanted to ask the community a couple of questions so I can get my head on straight.
In a nutshell, anything with an "Auto-detect" setting makes me frown. A great example is "The Witcher 2", which I have yet to play through but recently picked up during the Steam Summer Sale. Since the game came out in 2011, I expected to be able to run it on Medium settings at the very least, but the auto-detect suggested the lowest settings possible.
Even for titles like "Gone Home", which aren't too graphically intense, the "Auto-detect" feature suggests Medium settings, yet I was easily able to crank everything to Max without any performance issues.
My questions are these:
1. How accurate are these "Auto-detect" settings, given that most of them don't seem to run any sort of stress test on the CPU/GPU? (My guess at how they actually work is sketched below.)
2. Which component should I upgrade first for the largest performance gain? (I imagine the GPU, which has gotten $25 cheaper since my initial purchase.)
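On question 1, my guess (and it's only a guess) is that most "Auto-detect" features never benchmark anything; they just match your GPU's name against a table the developers shipped, with a conservative fallback for hardware they don't recognize. Here's a minimal Python sketch of that idea, with entirely made-up table entries and preset names, just to show why the recommendations might feel so conservative:

```python
# Rough, purely illustrative guess at how a game's "Auto-detect" might
# pick settings: look the reported GPU name up in a shipped table
# instead of running any actual stress test.

# Hypothetical preset table -- these entries and tiers are made up.
GPU_PRESETS = {
    "radeon r7 260x": "Low",
    "radeon r9 280x": "Medium",
    "geforce gtx 770": "High",
}

def auto_detect_preset(gpu_name: str) -> str:
    """Return a preset for a known GPU, else a safe lowest fallback."""
    key = gpu_name.strip().lower()
    # Unknown (or newer-than-the-table) hardware falls back to the
    # lowest preset, which would explain overly cautious suggestions.
    return GPU_PRESETS.get(key, "Lowest")

print(auto_detect_preset("Radeon R7 260X"))   # -> "Low"
print(auto_detect_preset("Some Future GPU"))  # -> "Lowest"
```

If that's roughly how it works, a game older than your hardware may simply not have your card in its table at all. For reference, here's my current build: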
CPU: AMD FX-6300 Vishera 6-Core 3.5GHz
Graphics Card: SAPPHIRE Radeon R7 260X 2GB 128-Bit GDDR5
Motherboard: ASRock 960GM/U3S3 FX AM3
Memory: HyperX Black Series 8GB DDR3 (1600)
Hard Disk: Western Digital WD Blue 1TB 7200 RPM
Any step in the right direction is greatly appreciated! (Woot! First post!)
- Joey