Want to switch from AMD to Intel

drakd

Reputable
Jul 20, 2015
What kind of gaming and multitasking performance increases will I see switching from a FX-8320 to an i7-4790k? Like I said, I'm gaming heavily on this machine and I am streaming gameplay as well.
 

spdragoo

Splendid
Ambassador
Depends on the game. For example, in BF4 you'll gain maybe 3 FPS with a GTX 770 @ 1080p (http://anandtech.com/bench/CPU/1107)... and note that, in that title, the i7-4790K provides no performance improvement over an i5-4690K.


Now, sure, you'll see some larger improvements. For example, F1 2013 shows a 30 FPS improvement, using the GTX 770 @ 1080p (http://anandtech.com/bench/CPU/1103)...except that both provide you with a 100+ FPS experience. Unless you have a 100+Hz monitor, you won't even know there's a difference.

If you're having some issues, then there might be something else going on -- heat issues, lack of RAM, insufficient GPU, underperforming HDD, Internet connection, etc. -- that could be behind the problems. Knowing what those issues are might help us pinpoint what's causing them, & be able to suggest something other than "nuke your current system & start from scratch".
 

drakd

Reputable
Jul 20, 2015


It's not so much that I'm having any really big issues. I have an R9 290 and 16GB of Patriot Viper Xtreme RAM, and I'm playing on an ASUS VG248QE 144Hz monitor. I've just noticed that my FPS isn't as high as I think it should be with an R9 290. I should be able to play most games on High settings at 1080p at 60 FPS, right?
 
What games do you play? Usually, it isn't worth it to switch. Overclock an FX 8320 on a good board with good cooling and you'll match or come very close to i7 performance for less, especially in modern games that are well threaded. The difference in FPS is small, 3-5%. Sometimes that boils down to single-digit FPS differences, even down to 1-2 FPS in well optimized titles.

The only time you see a big difference is in OLD or poorly optimized games, and those still stay over 60 FPS anyway because of their age and hardware advances.

For example, I've been playing a bit of Fallout 3 recently. On Ultra at 1080p, everything maxed, it only uses 2 CPU cores and runs Havok physics, which are Intel-optimized, yet on AMD it STILL stays pegged at 60+ FPS. That's the pattern you'll see in most cases.

And spending less money on an AMD platform lets you buy a better GPU, which usually matters more for gaming performance than the CPU. Using Metro 2033 as an example, an FX 6350 + GTX 970 is faster than a 3770K + GTX 780. Granted, the FX in the example below is overclocked (4.8GHz is a common OC for this chip), but overclocking the i7 isn't going to get you 10+ FPS more.

Here's what I'm talking about:

https://www.youtube.com/watch?v=jV2Voo5h3eU
https://www.youtube.com/watch?v=4hmNltwUUsc

And the Metro stuff...

https://www.youtube.com/watch?v=4ZHlo0DAI38
https://www.youtube.com/watch?v=m3pCWzro4ZE

And you can also look at the Techno Kitchen channel for more. They have a ton of side-by-side comparisons showing how close most games will run between several platform and GPU combos.

I'm only saying this because I don't want you to listen to the hype, spend $400+ on a CPU + MoBo upgrade + the assembly time, only to be disappointed by a 2-8 FPS gain.

But... if you still want to spend your money, then I'd suggest considering an i5 instead of an i7. For a single high-end GPU, a K-series i5 is the best option; you don't get much benefit from an i7 unless you go CrossFire or SLI.
 

drakd

Reputable
Jul 20, 2015


I've been playing Far Cry 4, Witcher 3, Batman, and Tera.
 

teknobug

Distinguished
Feb 10, 2011
Let's not forget that you'll need to buy a decent cooler for overclocking an FX chip, which adds another $40-60 depending on the model, putting the total price about even with an Intel setup, so you really aren't going to save that much money going the aging AMD route. With Intel you can stay at stock clocks and do just as well as, or better than, an FX OC'd to 4.6GHz.

PS - the first video is hilarious, especially the comments section. I'm lucky to have had my i7 3770K before I sold it; it was and still is hands down the best processor you can get that isn't a socket 2011 monster. Not even an FX at 5GHz is going to take it on in most cases (I'm sorry to crush your fanboyism, buddy).
 


In well threaded games like Far Cry the difference is around 2 FPS. The Witcher is a little more, but still only a few. Batman depends on the version; the latest, Arkham Knight, had major issues on every platform. I played a little Tera but don't recall any slowness. Still, if it's like every other MMO out there, it isn't well optimized for multicore CPUs and you'll see a gain with Intel here... but the AMD will still put out good, very playable FPS.
 


Playing the fanboy card, eh? That's sad. You really have no idea how idiotic and weak it is to try and label me a fanboy. Just look through my other posts here; you'll see that I build both AMD and Intel systems, and recommend them equally based on posters' needs. So save the projection and trolling.

And I can tell you did not watch all those videos in their entirety. No rational mind could and then still continue to think Intel is far superior to AMD in gaming.

Here, have some more stuff to google:

Tom's Hardware FarCry 4 review
Tom's Hardware Shadow of Mordor review
AMD vs Intel ultimate gaming showdown FX 9590 5GHz vs i7 4960X (to give you an idea of how well a 5GHz FX 8-core REALLY runs)

Tom's is also looking at updating their CPU charts to include more multithreaded games in testing. Why? Because they are fair, objective, and want to give up-to-date advice and a clear picture of the current state of technology.

The days of poorly optimized games that favor Intel are over. AMD was ahead of its time with the release of 6- and 8-core CPUs. And now that devs are catching up, and these 4-year-OLD AMD CPUs are keeping up with the latest Intel CPUs in gaming, you have to wonder whether Intel can continue to justify a higher cost for such minimal gains.

That level of competition is what we should all be cheering on. It's good for us as consumers. Painting an Intel-biased picture, in spite of their overpricing, is not helping anyone, especially Intel fans.
 

teknobug

Distinguished
Feb 10, 2011
Tom's is also looking at updating their CPU charts to include more multithreaded games in testing. Why? Because they are fair, objective, and want to give up-to-date advice and a clear picture of the current state of technology.
Good, let Tom's add non-cherry-picked stuff.

I came from being an AMD fan; I'd been using AMD in my primary systems since the mid-90s, but when FX came along I looked for other options. Even then I built several AMD systems over the past 3 years and still couldn't find one I was comfortable with, and there's no way I'm going to sit and use an FX 9*** that sucks 200W+. That's the same as turning on a couple of lights in the house with halogen/standard bulbs (all of mine are CFL/LED now). Although I still have and use my FX 4350, and my Phenom II X6 1090T is still lying around.

Those Far Cry 4 and SoM reviews are GPU benchmarks. And in the first two videos, FX 6300 vs i5 4440, there's a 10-12 FPS difference, with the 6300 suffering more minimum-FPS dips.

When you actually use both Intel and AMD at all levels, you won't be singing the tune that you are right now.
 

Creme

Reputable
Aug 4, 2014
Depends on the games you play. In CPU-bound games you'll notice a big difference. But in games where the 8320 does something like 70 FPS and the i7 does 90 FPS, you won't notice a difference unless you have a higher-refresh-rate monitor.

Also, DX12 supposedly brings big gains for AMD FX CPUs, putting the 8350 on par with an i5 4690K. That doesn't account for all the DX11 games out there, though...

If you do switch to Intel only for gaming, the i7 is probably not worth it; the i5 offers 90% of the performance and is cheaper (though still expensive).
 
This review of Far Cry 4 shows a much different result. This is at 1920x1200, not 1080p, and uses a GTX 980; with the higher resolution, more stress should be on the GPU than at, say, 1080p. The i5s are putting out over 15 FPS more than an FX 8350. A bit more than a 'couple FPS' difference.

http://www.techspot.com/review/917-far-cry-4-benchmarks/page5.html

At 1080p ultra settings in BF4 multiplayer with a 770, even the i3 was outpacing the FX 8350. The i5s were getting close to 10 FPS more across the board, with AMD's average FPS closer to Intel's minimum FPS, and Intel's average FPS on par with AMD's maximum.
http://www.hardwarepal.com/battlefield-4-benchmark-mp-cpu-gpu-w7-vs-w8-1/

I wouldn't exactly say AMD's FX chips are 'keeping up'; in just about every game the i5 2500K is beating out the aging 4-year-old FX CPUs, and the 2500K itself is over 4.5 years old.

The only DX12 comparisons I've seen so far concern draw calls, which are just one aspect of gaming performance, not the whole picture. Even there, the FX 8350 is still falling behind the i5s. All CPUs gain from DX12; the theory that AMD somehow becomes competitive with DX12 isn't very accurate. We're also seeing DX12 in its beta stages, not yet paired with a fully DX12-compatible game. How fully devs implement DX12 in their games, and in which games, will be another factor.

Looking at the GTX 970 or the R9 290X, the FX 8350 is no closer to 'catching' the i5 in DX12 draw calls than it was in DX11 multithreaded. For instance, with the 970 on both CPUs, the i5 was 14% faster than the 8350 in DX11 multithreaded (not counting the much larger difference in single-threaded). In DX12, the i5 does 18.5% more draw calls than the FX 8350. So going by draw calls (which is what's been benched in DX12 so far), the AMD-to-Intel performance gap is getting worse, not better.

That's not to say the draw calls don't improve, from 2.1 million to 13.5 million for the FX chip, but comparing Intel to AMD, the Intels are pulling further ahead. How this translates to 'the FX becomes just as good as an i5 thanks to DX12' I'm not sure. Now factor in that draw calls are only a small portion of the gaming experience, and at the end of the day Intel still has stronger cores. I just don't see AMD narrowing the gap, not with the FX.
http://www.eurogamer.net/articles/digitalfoundry-2015-why-directx-12-is-a-gamechanger
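
For what it's worth, here's the arithmetic behind those percentages. A minimal sketch in Python, using only the draw-call figures quoted in this thread (millions of calls per second; treat them as approximate):

# Relative draw-call gaps from the figures cited in this thread.
i5_dx11, fx_dx11 = 2.4, 2.1    # DX11 multithreaded ceilings
i5_dx12, fx_dx12 = 16.0, 13.5  # DX12 ceilings

gap_dx11 = (i5_dx11 / fx_dx11 - 1) * 100  # ~14.3%, the "14%" above
gap_dx12 = (i5_dx12 / fx_dx12 - 1) * 100  # ~18.5%, the "18.5%" above
print(f"i5 ahead by {gap_dx11:.1f}% in DX11 MT and {gap_dx12:.1f}% in DX12")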
 
I'd say if the OP wants better performance than they're getting, they don't really have a choice. Any upgrade is going to cost money, and AMD has nothing to upgrade to. If their PC is doing well enough, keep it. They're at the point where AMD's performance is maxed out; 8320 vs 8350, 8370, 9590 makes no real difference. Aside from overclocking what they have, to essentially get the slightly higher performance of the 8350/8370, there are no better AMD CPUs available, so it really comes down to whether or not their current PC is enough. It's not as if AMD has a cheaper upgrade option; AMD has no upgrade option, period.
 
Far Cry 4 -> an FX-4320 beating an FX-8350. That says enough.

In BF4 you say an i3 was outpacing the FX-8. That may be, but it's also clear that when the i7 falls behind the i5, something is wrong with the optimization. Switch from Windows 7 to Windows 8 and the i3's advantage practically vanishes. Quoted:

"Here we see again something that has now begun to be a standard, the i7 and FX 8350 again manage to perform well only at Windows 8.1, while in Windows 7 they fall behind."

Read more at HardwarePal: Battlefield 4 Benchmark – Multiplayer CPU and GPU W7 vs W8.1 http://www.hardwarepal.com/?p=5248

As for your draw calls stuff... if you actually understood what draw calls do, you'd know that even though the FX's numbers are lower than the i5/i7's, DX12 turns the FX from a bottleneck back into a more than capable gaming CPU. That would make an upgrade a moot point right now.

As for the draw calls, I'll repeat what I said in another thread:
You're completely missing the point regarding the draw calls. The difference right now between the i5 and the FX in DX11, using the GTX 970 as a reference, is 0.3 million draw calls for multithreading: 2.4 million for the i5 vs 2.1 million for the FX. In single threading the difference is a bit bigger, 0.8 million: 2.0 million for the i5 vs 1.2 million for the FX.

Now, let me explain this as simply as I can. According to all you people, the i5 4690K is super ultra awesome and causes no bottlenecks, while the FX-8350 causes bottlenecks regularly. So for the FX to be a bottleneck and the i5 not to be, the draw calls logically need to be above 2.1 million (the FX's DX11 ceiling) but below 2.4 million (the i5's DX11 ceiling). If the draw calls are higher than 2.4 million, the i5 would bottleneck too.

Now, with DX12, the FX-8350 is capable of 13.5 million draw calls. If the i5 isn't bottlenecking at 2.4 million, the FX-8350 cannot possibly be a bottleneck with over 560% of the draw calls it can currently achieve in DX11 under the exact same circumstances. Whether the i5 can do 16 million is irrelevant; neither of them will be bottlenecking. Upgrading from an FX to whatever Intel CPU would be a waste of money in that case, because the GPU will hit its limit before the CPU reaches its 2.4-million-plus draw-call ceiling, and waaay before 13.5 million. Maybe we'll have to replace the FX in 3 years and the i5 in 4 years instead. But arguing that he should upgrade because the i5 can do more draw calls is not only wrong, it's blatantly deceptive and reeks of fanboyism.
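
To make the ceiling argument concrete, here's a minimal sketch in Python. The ceilings are the figures quoted above; the demand number is made up purely for illustration:

# A CPU only becomes the bottleneck when a game's draw-call demand
# exceeds that CPU's ceiling (millions of calls per second).
ceilings = {
    "FX-8350 DX11": 2.1,
    "i5 DX11": 2.4,
    "FX-8350 DX12": 13.5,
    "i5 DX12": 16.0,
}
demand = 2.2  # hypothetical game demand, millions of draw calls/sec

for cpu, ceiling in ceilings.items():
    status = "bottleneck" if demand > ceiling else "headroom"
    print(f"{cpu}: ceiling {ceiling}M vs demand {demand}M -> {status}")

At a demand of 2.2 million, the FX bottlenecks under DX11 while the i5 doesn't; under DX12 both have roughly 6x headroom, so the GPU becomes the limit first, which is exactly the point.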

It isn't for nothing that I tell people to wait. What's worse: losing one year of better performance by not upgrading, or upgrading now and wasting over $200 that could've been spent on a superior CPU down the line?