FX-8320 vs. i5/i7 for future-proofing?

Tags:
  • Intel i5
  • CPUs
  • Games
  • Intel
  • Intel i7
March 1, 2014 10:42:22 AM

Hello everyone.
Can somebody explain to me the whole FX vs. i5/i7 thing? From what I've heard, the FX is overall better than Intel. Why? Let's look at these scenarios:
1) Older games run on one or two cores, so Intel undoubtedly performs better, but those games are very likely to be above 60 fps anyway, so what's the point of having 240 fps instead of 170?
2) Games that are very hardware-demanding are better optimized = AMD thrives due to its extra cores.

Another thing: Thief's recommended settings call for an 8-core AMD or an i7, so an i5 wouldn't be able to max it out. I know it has something to do with Mantle and AMD optimization.

But can somebody tell me which games AMD will suck at? I guess Skyrim and StarCraft would be the ones.

Please, no Intel fanboys; I know, a Nehalem i3 > an FX-9590...

March 1, 2014 10:47:45 AM

I run Skyrim just fine on an AMD 955BE at stock speeds at 1080p. Focus on the GPU, not the CPU.
March 1, 2014 10:51:20 AM

egilbe said:
I run Skyrim just fine on an AMD 955BE at stock speeds at 1080p. Focus on the GPU, not the CPU.


Cool story bro.
Now let's get back on topic.
March 1, 2014 11:41:09 AM

Don't know where the "older games run better on Intel" stuff keeps coming from. ALL games run better on my 4770k than on my friend's 8350. Literally, all games. I don't see many (honestly, any) gaming benchmarks that aren't for brand-new games, and they all favor Intel. The FX-8xxx CPUs are perfectly fine and have a good cost/performance ratio, but there's a reason they cost less than the 4670k and the 4770k: they aren't as fast. Otherwise AMD would charge more.

AMD CPUs do not just objectively "suck" at anything. It's all about what you play, what you expect from your hardware, and what you're willing to pay. Do you want an average of 100 fps in BF4, or are you fine averaging 60 fps? Some people just don't care. Or maybe you care, but you don't want to spend the extra money on a better CPU. Then it becomes a moot point.

There really aren't any secrets to CPU purchasing. The more you spend, the better performance you get.
March 1, 2014 12:00:05 PM

Romenfousek said:

1) Older games run on one or two cores, so Intel undoubtedly performs better, but those games are very likely to be above 60 fps anyway, so what's the point of having 240 fps instead of 170?


I think you're a bit confused here, OP. Games do not lock your fps; your hardware determines how many frames per second you get. That's why I get hundreds of fps in CS 1.6 with my current hardware. Also, 170 vs. 240 fps might not really matter, but if the difference is between averaging 70 fps and 100 fps, it makes a huge difference, and that's a much more likely scenario for current games. Remember, the higher your average fps, the less severe your drops will be, and drops below 60 fps get really annoying in competitive online games.
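
To put rough numbers on the average-vs-drops point, here is a minimal C++ sketch. The frame times are invented for illustration (not benchmark data); it just turns a list of per-frame render times into an average fps and a "1% low" fps, one common way to quantify how bad the drops are:

```cpp
#include <algorithm>
#include <cstddef>
#include <functional>
#include <iostream>
#include <vector>

int main() {
    // Hypothetical per-frame render times in milliseconds (made up for the example).
    std::vector<double> frame_ms = {12.0, 13.5, 11.8, 14.2, 25.0, 12.4,
                                    13.0, 11.9, 30.1, 12.7, 13.3, 12.1};

    // Average fps = frames rendered / total seconds.
    double total_ms = 0.0;
    for (double ms : frame_ms) total_ms += ms;
    const double avg_fps = 1000.0 * frame_ms.size() / total_ms;

    // "1% low" fps: keep only the slowest 1% of frames (here the single slowest
    // frame, since the sample is tiny) and see what fps they work out to.
    std::vector<double> sorted = frame_ms;
    std::sort(sorted.begin(), sorted.end(), std::greater<double>());
    const std::size_t worst_count = std::max<std::size_t>(1, sorted.size() / 100);
    double worst_ms = 0.0;
    for (std::size_t i = 0; i < worst_count; ++i) worst_ms += sorted[i];
    const double low_fps = 1000.0 * worst_count / worst_ms;

    std::cout << "average fps: " << avg_fps << "\n";
    std::cout << "1% low fps:  " << low_fps << "\n";
}
```

With these made-up numbers the average works out to roughly 66 fps while the 1% low is around 33 fps, which is exactly the kind of drop that a faster CPU tends to smooth out.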
March 1, 2014 3:19:49 PM

Romenfousek said:
egilbe said:
I run Skyrim just fine on an AMD 955BE at stock speeds at 1080p. Focus on the GPU, not the CPU.


Cool story bro.
Now let's get back on topic.


It's a stupid topic. CPU choice isn't the primary focus for anyone wanting to build a gaming computer. Focus on the GPU. Either choice of CPU will work for many years to come. Any game will suck if you're trying to run it on a budget GPU, but almost all games will run decently on a high-end GPU with a five-year-old CPU.
March 1, 2014 4:32:29 PM

egilbe said:
Romenfousek said:
egilbe said:
I run Skyrim just fine on an AMD 955BE at stock speeds at 1080p. Focus on the GPU, not the CPU.


Cool story bro.
Now let's get back on topic.


It's a stupid topic. CPU choice isn't the primary focus for anyone wanting to build a gaming computer. Focus on the GPU. Either choice of CPU will work for many years to come. Any game will suck if you're trying to run it on a budget GPU, but almost all games will run decently on a high-end GPU with a five-year-old CPU.


It completely depends on what you're playing. Any multiplayer game with a large player count will be very CPU-dependent; BF4, Planetside 2, and any MMO are good examples. Again, it comes down to what kind of performance you want and what you play. An i7 920 (five years old, top of the line at release) will not play any of the games I mentioned as well as a 4770k. So it does matter. The GPU matters as well, but it's a myth that the CPU doesn't.

March 1, 2014 5:16:35 PM

Honestly, either an FX-8320 or an i5/i7 will be basically future-proof, no matter which you choose (provided you're going to overclock the 8320). These are good processors and will stay strong for a number of years as long as you keep up with GPU upgrades, which matters more anyway. Game devs are bound to programming for consoles first, and the new-gen consoles have comparatively weak CPUs next to an 8320/8350 or an i5/i7, so you're already ahead of the game. Even Phenom II X4s are still decent for gaming when paired with strong GPUs.
March 1, 2014 6:14:50 PM

Romenfousek said:
Hello everyone.
Can somebody explain to me the whole FX vs. i5/i7 thing? From what I've heard, the FX is overall better than Intel. Why? Let's look at these scenarios:

2) Games that are very hardware-demanding are better optimized = AMD thrives due to its extra cores.


So far this is basically a myth. By the time a programmer has figured out how to use eight cores, the game engine is already smart enough to start as many threads or worker processes as work best on the given CPU: if four cores are available it can start four workers, and if more cores are available it can start more. Moreover, even if the game insists on using eight threads, that's not a big hindrance on a four-core CPU; the OS simply switches the threads/processes across the limited number of cores without a big performance hit. This is how a web server could handle dozens of client requests at once while running on a single-CPU (single-core) machine 15 years ago. The cost of switching threads is not very high, though context switching between processes can be more expensive.
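
As a concrete illustration of the "start as many workers as the machine actually has" idea, here's a minimal C++ sketch. The worker function is a made-up placeholder, not anything a real engine does; the point is only that the pool size comes from the hardware at runtime rather than being hard-coded:

```cpp
#include <iostream>
#include <thread>
#include <vector>

// Stand-in for a slice of per-frame work (AI, physics, audio, ...).
void do_work(unsigned worker_id) {
    std::cout << "worker " << worker_id << " running\n";
}

int main() {
    // Ask the machine how many hardware threads it has instead of assuming a
    // core count. hardware_concurrency() may return 0 if it can't tell, so
    // fall back to a sane default.
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 4;

    std::vector<std::thread> workers;
    for (unsigned i = 0; i < cores; ++i)
        workers.emplace_back(do_work, i);

    for (std::thread& t : workers)
        t.join();
}
```

On a quad-core this starts four workers, on an eight-core it starts eight, and if a program insists on eight workers anyway, the OS just time-slices them across whatever cores exist.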