Core i7-3960X Versus FX-8150
It’s a foregone conclusion that Core i7-3960X is faster than AMD’s FX-8150, but the comparison is an interesting one nevertheless.
With all benchmarks weighted evenly, we end up seeing Sandy Bridge-E best Zambezi by more than it beat the Sandy Bridge-based Core i5-2500K. This is a result of close finishes in the threaded apps and bigger blow-outs in titles like iTunes, WinZip, and LAME.
Unfortunately, pricing that remains way above AMD’s initial estimation means you pay 253% more for the Core i7-3960X, which averages 32% better results across our benchmark suite. Compare that to Core i5-2500K, a processor that fares better in both performance and pricing metrics.
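The price-versus-performance gap above is easy to sanity-check. A minimal sketch of the arithmetic, assuming Intel's $999 list price for the Core i7-3960X and a roughly $283 street price for the FX-8150 at the time (both figures are illustrative assumptions, not taken from the article):

```python
# Illustrative price/performance math behind the conclusion above.
# Both prices are assumptions for this sketch, not figures quoted in the review.

def percent_more(a: float, b: float) -> float:
    """Return how much more expensive a is than b, as a percentage."""
    return (a / b - 1) * 100

price_3960x = 999.0    # Intel's list price (assumed)
price_fx8150 = 283.0   # approximate FX-8150 street price (assumed)

premium = percent_more(price_3960x, price_fx8150)  # price premium, in percent
perf_gain = 32.0  # average benchmark advantage cited in the review, in percent

print(f"Price premium: {premium:.0f}%")       # ~253%
print(f"Performance gain: {perf_gain:.0f}%")  # 32%
```

A ~253% premium buying a ~32% average speed-up is exactly why the conclusion points value-minded buyers at the Core i5-2500K instead.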
Wow, lots of details and benchies. Great review as always, Chris!
So no SAS/full SATA 3 ports, but you do get PCIe 3.0... no Quick Sync, but you do get two more cores and the added cache... no USB 3.0, but you get quad-channel memory, which in real-life everyday computing is a minimal gain at best. Feels an awful lot like a weak trade if you ask me. I'm basically asked to buy the P67 chipset with sprinkles on top. And for $1,000 it feels like it falls short. For heavy workloads it's cheaper and faster to build yourself two systems based on 1155 or Bulldozer and render, fold, and chew numbers that way. X79 should have launched with an Ivy Bridge-based CPU inside and a better chipset to live up to its name.
What we have today is simply a platform for bragging rights, not a serious contender to the X38, X48, X58 family.

Enjoyed the review, Chris! Wow.
Not to take the review too much off topic, but it's worth bringing up because this review was so complete, as in covering a vast array of situations and programs. It's truly embarrassing for AMD that the FX-8XXX series is beaten not only by chips with half the cores, but half the cores that are a generation behind. In fact, as of this moment the FX lineup is almost inspiring in its lack of any value; at first glance at some of these marks, one could say that AMD's most expensive chip, at over $200, is one of its slowest, being beaten by both the X4 and X6 Phenoms.
Illfindu, you are beating a dead horse... Old news, let's move on (sorry, just tired of the same thing being said over and over, which will end in an AMD fanboy fight). Great review though!
This article tells me two things: either our current software is a total piece of crap since it has absolutely no clue about multi-core CPUs, or the future without AMD is so grim that Intel makes you pay 1,000 bucks for a CPU that doesn't really perform that fast... but for sure the software industry needs to take a better look at those multi-core optimizations.
I think Intel would be raking in the dough if they left all 8 cores enabled for the 3960X. I doubt that a later revision will enable them. 8c/16t will probably hit the desktop with IB-E (can't wait) :)
:| Well, AMD is fighting a losing battle... (in high-end CPUs, which I actually use for rendering, etc.)
I would LOVE to see them pick up their game and provide me with a worthy upgrade over my 4GHz i7 2600 (Non-K). I would swoop it up.
Look, BD has 4 modules with two "cores" each; each module is roughly equivalent to a Sandy Bridge core.

They should just combine both of those cores or make them a single core, so we get 4 threads.

Then create 4-, 6-, and 8-core versions of those CPUs.

Think about it: the FX-8150 is more of a 4-core CPU where the resources are halved, pretty much, so you get two threads per core. It would have been MUCH, MUCH better if they had just kept 4 strong cores.
Not sure why either but I always seem to start an AMD related comment :\
Great, but too expensive...
The labels are wrong on the graphs on this page; the last two should read DDR3-2133, shouldn't they?