i7-4790K vs FX-8350

grimmjow660

Reputable
May 30, 2014
236
0
4,690
So after doing some research, I found the FX-8350 gives better performance than even a 4930K if Mantle is enabled. But other benchmarks say the 4790K has better performance than an 8350. So would getting an 8-core CPU with Mantle on an R9 290 be better than a 4790K with an R9 290?
It kind of feels like the 8-core would be better with its extra cores.
 
@OP, that's not quite right. If you have a weak CPU, regardless of the manufacturer, you get more performance from Mantle. It will raise the minimum fps in games that support it if the CPU is relatively weak compared to the GPU. If you have a strong CPU and GPU, you will get marginal gains.

If you run an 8350 with an R9 290 in a Mantle-supported game, you will get roughly 5-10 more fps than running the same setup on DirectX.

A 4930K/4790K with an R9 290 running a game that supports Mantle will give 3-7 more fps, as the CPU is a lot stronger and more efficient, so it doesn't take advantage of the optimizations because it doesn't need to. In fact, you would likely lose some performance if the system tried to.
But the gains only apply to minimum fps. If you were getting 100 fps max on DirectX, you would get 100, maybe 103, fps with Mantle regardless of the system it's running on.
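
To put rough numbers on why only the CPU-bound minimums move, here is a toy model in Python; the millisecond figures are invented for illustration, not measurements:

# the slower of the CPU stage and the GPU stage sets the frame rate
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# heavy scene, weak CPU: the CPU is the bottleneck, so lower API overhead helps
print(fps(cpu_ms=16.0, gpu_ms=12.0))  # ~62 fps with DirectX-style overhead
print(fps(cpu_ms=12.5, gpu_ms=12.0))  # ~80 fps with reduced CPU overhead

# light scene: the GPU is the bottleneck, so the same saving barely shows
print(fps(cpu_ms=8.0, gpu_ms=10.0))   # 100 fps
print(fps(cpu_ms=6.0, gpu_ms=10.0))   # still 100 fps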

Mantle is a gimmick, the same as Nvidia PhysX, and will remain so until a lot more devs start using it.

Lastly, the 8350 isn't even in the same price bracket as the 4930K, nor is it in the same performance delta where heavily threaded apps are concerned.
The 8350 is an 8-core, 4-module part (effectively the same as a quad core with Hyper-Threading, due to the limits of the shared FPU units); the 4930K is a hexa-core with 6 true cores that can run 12 threads. They're worlds apart: one is considered a gaming part, the other an enthusiast/workstation part.
 
An i5-4590 stomps all over the FX-8350... The i7 4790K is not even comparable. It's several times as powerful as an FX-8350, and can achieve much higher framerates in all games.

This crap about the "FX-8350 being an 8-core processor" needs to end. The FX-8350 has 8 integer cores and 4 floating-point cores, paired into 4 modules, and one AMD module is still weaker than one Intel core. AMD labelling it as an 8-core CPU is one small step away from false advertising, as they changed the definition of a core.

AMD is using clustered integer cores from 1996, not actual full cores. Doubling the number of integer cores from 4 up to 8 only provides Hyper-Threading levels of performance returns; they are not 8 full cores.

Everyone who has an FX-8350 functionally has a quad-core with worse per-core performance and worse overall performance than an i5.
Everyone who has an FX-6300 functionally has a tri-core with worse per-core performance and worse overall performance than an i3.

You want proof? Look at Watch Dogs. That game was supposed to be AMD's saving grace. All the fanboys clustered around it and said an FX-8350 might finally beat an i5 or i7. It recommends an "Intel i7" or an "AMD 8-core CPU", which at face value should have given AMD a heavy price/performance advantage. Instead, it turns out Intel CPUs are just so much stronger that they're still the better price/performance value, even in an 8-core-optimized game like Watch Dogs.

http://static.techspot.com/articles-info/827/bench/CPU_01.png
 
Solution

ohim

Distinguished
Feb 10, 2009
1,195
0
19,360


When will people understand that synthetic benchmarks, and those low-res ones, are useless? Nobody, I mean nobody, will ever run a game at low settings while owning one of the best CPUs on the market. And as for Cinebench... that program is so misleading. There are tons of guys out there with the best Intel CPUs and top graphics cards being beaten by lower-end ones. I mean, getting 90 fps with a Titan while a GTX 750 gets 120 fps in Cinebench R15 tells you a lot about how reliable it is. Just do real-world application benchmarks.

Yes, Intel chips are faster (the top ones, anyway, and you also pay a price for that speed), but these synthetic benchmarks are so misleading.

@Rationale: "An i5-4590 stomps all over the FX-8350... The i7 4790k is not even comparable." From my BF4 multiplayer experience, that i5 doesn't stomp the FX-8350.
 

vmN

Honorable
Oct 27, 2013
1,666
0
12,160

Well, in my perspective the FX-8350 is a 4-CMT-core CPU (which still shouldn't be confused with a regular 4-core CPU).
Multicore CPUs are built with CMP; CMT is NOT an implementation of CMP. CMT basically duplicates certain parts of a regular core (in the case of Piledriver, the ALU cluster).

CMT is a technique used to increase throughput while staying space-efficient. We are talking about a 60-70% throughput increase, where SMT (Hyper-Threading) gives only about 30%. However, SMT is used for a totally different reason.

A core is essentially what would have been a single-core CPU. Back in the day the definition of a core didn't exist (there were only single-core processors), so they were simply referred to as CPUs. A dual core is essentially two CPUs on the same die.

So each core should, in theory, be able to function standalone (each Piledriver core cannot, as the cores in a module share the front-end: branch prediction, fetch, and decode). Remember that the SIMD cluster wasn't in the original CPU design, and a CPU could (and did) function without it.

CMT is better for raw throughput when compared to SMT.

I would, however, still call it either a 4-CMT-core processor or an 8-core processor.
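
To put those scaling figures into rough numbers (a quick Python sketch using the 60-70% and 30% estimates quoted above; the exact factors are estimates, not measurements):

# throughput in "single-core equivalents"
modules = 4
cmt_throughput = modules * 1.65         # each CMT module ~1.6-1.7x one core -> ~6.6
cores_with_smt = 4
smt_throughput = cores_with_smt * 1.30  # each SMT core ~1.3x one core -> ~5.2
print(cmt_throughput, smt_throughput)

That arithmetic is why CMT can look good in heavily threaded benchmarks while each individual core stays weak.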

 


Those are graphics card comparisons. Why are you using them to guess at CPU performance?
 


Posted twice by accident.
 


First, AMD released those numbers. It says right in your link that they haven't been verified by anyone.

Second, that's an average framerate: it does not show minimums or maximums. It's usually the minimum framerate that matters most, and the minimum is what suffers most on CPUs that rely on many weak cores.
 

vmN

Honorable
Oct 27, 2013
1,666
0
12,160
Never trust any source from AMD or Intel.

The numbers are obviously picked to make their product look better (which we shouldn't blame them for; this is the standard in EVERY business).

Always wait for a third party to release some numbers, which can then be taken into consideration.

A company will always try to boost its product; it doesn't matter if it says Intel, AMD or IKEA on the front door. They need to sell, they need to convince you to buy. Take it with a grain of salt.
 

grimmjow660

Reputable
May 30, 2014
236
0
4,690


Ah, OK.
So basically: if I want to save money, get an AMD processor, but it won't be as good no matter what, even with Mantle; and if I want better performance, go Intel, but I'd have to spend more money.
 

vmN

Honorable
Oct 27, 2013
1,666
0
12,160
Remember that Mantle is a GPU feature, not a CPU feature. You will need a GPU with Mantle support; Mantle doesn't rely on a specific CPU.

However, Mantle does benefit "lower"-end CPUs more, as it reduces overhead (meaning the CPU needs to do less work to achieve the same result).
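
A toy way to picture that overhead (in Python; the figures are invented for illustration, not real driver measurements):

# CPU cost per frame = game logic + per-draw-call API/driver overhead
def cpu_frame_ms(logic_ms, draw_calls, overhead_us_per_call):
    return logic_ms + draw_calls * overhead_us_per_call / 1000.0

# a lower-overhead API mainly shrinks the per-draw-call cost, which matters
# most on a CPU that was struggling with that cost in the first place
print(cpu_frame_ms(6.0, 5000, 2.0))  # 16.0 ms of CPU work per frame (high-overhead API)
print(cpu_frame_ms(6.0, 5000, 0.5))  # 8.5 ms for the same scene with less overhead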
 


It depends on the game, mate. Some games give a decent boost when run on an 8350 even when Mantle isn't used. Tek Syndicate did a couple of benches that showed the 8350 pulling ahead of a 3570K by some margin in some games, and also in things like streaming to Twitch. AMD-optimized code really does work well on AMD CPUs. But like I said, it was a few games, not every game.
 

ohim

Distinguished
Feb 10, 2009
1,195
0
19,360
Bogus or not... are you really going to play Plants vs. Zombies? Because if that's what you're going to play, go AMD all the way. Will you be doing something else? Do more research. Personally, I wouldn't even take the 4930K into consideration because of the price, but that CPU is no slouch at all.

Try looking on YouTube for videos of actual gameplay, for example how I did in BF4 with my rig:

https://www.youtube.com/watch?v=rWSluVC6-v4 BF4 screen recording - FX8350+ R9-290 OC Vapor-X
https://www.youtube.com/watch?v=0KQSGgoqKG8 BF4 recording - FX8350+ R9-290 OC Vapor-X
https://www.youtube.com/watch?v=0KQSGgoqKG8 BF4 recording - FX8350+ GTX760 SC ACX

To be noted: the FPS impact on the Radeon is greater because of the Action recorder versus Nvidia's ShadowPlay, but with ShadowPlay I can't use in-game sound, since it records at variable FPS and I can't edit that in sync.

These are the kinds of tests you want to look at, not charts with benchmarks from Valley / Cinebench / 3DMark / whatever-mark. Those are only synthetics; real-world results might amaze you. In singleplayer, just about any CPU from a good dual-core up to a top-of-the-line i7 will pull good fps from your GPU; put them in multiplayer and everything changes. But you won't see charts about multiplayer, since multiplayer runs are not easy to reproduce.
 

canopus72

Distinguished
Nov 22, 2011
31
0
18,530
A lot of these guys (i.e. the critics) are talking total BS. They are just armchair theorists. My personal experience is something different. Only a few days ago, I 'upgraded' from an FX-9590, Crosshair V Formula-Z, 16GB @ 2400MHz and a pair of Sapphire Tri-X 290s in CrossFire to the new 4790K and an Asus Z97-WS mobo. I went for this mobo as it is dedicated to graphics power (four PCIe 3.0 slots, and you can run a pair of PCIe 3.0 graphics cards, both at x16 speeds).

Starting Friday 21/09/14, it took two days to build this damn Intel rig (everything the same except mobo and CPU). I had so many problems: wouldn't get into the BIOS, fatal error messages, chassis intrusion messages, etc. The damn thing kept crashing over and over. I finally got it up and running (in stark contrast, none of the AMD rigs I have built for myself and family/friends have ever given any trouble from first boot onwards).

Anyway, I did some benchmarking of the 4790K (Passmark performance score = 5304, versus 3355 for the FX-9590).

3DMark Fire Strike Extreme (4790K & 2x 290s = 7693, versus FX-9590 & 2x 290s = 7562). So very little improvement: a gain of under 2%.

I just ran Crysis 3 on ultra settings and I am getting 40-59 fps with the 4790K on the opening mission on the ship (I have a 144Hz monitor). BUT when I had my FX-9590, I was getting a solid 70-80 fps on the opening mission (in fact, it never dipped below 70 fps). So far I am unspeakably disgusted with this inhell-4790K.

It scores highly on some synthetic benchmarks, but when I test it with real-life benching (gaming being the main thing for me), it scores noticeably less than my FX-9590. I shelled out £250 on the 4790K and £250 on the Z97-WS. I fell for all of the Intel hype and was expecting a massive performance increase. I didn't get it. In fact, it is scoring a lot less than my FX-9590. That is £500 of my money down the toilet. I am going to run some more real-life bench tests and see what I get.

The only benefits I can see so far are that the 4790K runs substantially cooler than my FX-9590 (22°C idle versus 40°C idle), so my PC is much quieter now (fans no longer at full whack). Also, it is faster in single-threaded work than my FX-9590.

The Z97 platform will support the new Broadwell CPUs (TBR September 2015). I recommend you hold fire, bearing in mind my costly experience. I will run some more bench tests, see what I get, and then decide whether I am going to keep this new Intel platform or go back to my trusty, no-BS-hype FX-9590 platform.