AMD or Intel for new gaming build?

Christopher Addy

Honorable
Jul 6, 2013
6
0
10,510
So the FX-8350 vs. i5-3570K debate has been beaten to death on these forums, and I feel I've read a great deal of those threads, yet I still can't come to a good conclusion.
I can see that the i5 usually wins out on FPS in most games. However, I've seen the 8350 do well enough (and sometimes better) in most benchmark results. It seems they are roughly equal in prowess (Intel slightly ahead), with the AMD build being slightly cheaper.
Those chips being relatively equal in performance, which would you go with in the current game market? Consider that most games are ported to PC and that all the new consoles are using AMD chips with 8 cores. Within the next couple of years, will games finally begin to effectively utilize more than 4 cores? Will this give AMD chips the advantage in longevity? I'm thinking that as games develop to use more cores, the 8-core chips will begin to outpace the Intel equivalent. Would you agree?
I'm looking to build a PC that will last 3 to 5 years. I haven't seen much on what's ahead for AMD after Vishera that isn't an APU, which I'm not interested in because I'd rather have a discrete GPU. Has socket AM3+ topped out, with its most powerful processors already made? What about LGA 1155? How much more room does it have to develop? I've seen that AMD released the FX-9xxx series, but is that the ceiling before the next socket architecture? If I go with an i5, I can always upgrade to an i7 later, which seems to be a pretty good boost. It would seem that Intel offers a little more headroom for future upgrading at the moment.
So, between game code changing to suit the AMD chips, the (possible?) potential for the AMD chip to open up as code utilizes more cores, and the Intel chip that currently edges the AMD out slightly: which would you choose?
Thanks.
 
Solution
CPU tech is gonna start stagnating, so developers are gonna have to use current CPU resources more effectively. That leads me to believe that in the future (5+ years), more cores will be better.
BUT for anything under 5 years, it's a toss-up really... You will be equally happy with either.

Let's stop beating this dead horse...

fatboytyler

Distinguished
Jan 29, 2012
590
0
19,160
Go with the 8350 for a stable upgrade path, since the FM3+ socket is likely going nowhere anytime soon. The 1155 Ivy Bridge socket is dead, and the 1150 will likely be dead in a year or two.

However, the i5 is better for gaming at this moment.

As for the people who keep mentioning the whole console thing: there is no proof that PC ports will start supporting more cores. You have to consider that most games don't utilize more than 2 cores, and with desktop CPUs being much more powerful than their console counterparts, a quad core will still more than shred any game thrown at it.
 


Really???

Anyway... Upgrade paths mean NOTHING these days... Not that 1150 is worth ANYTHING above 1155, nor will the next one be such a huge jump. You say there's no proof and I agree with you, but nor is there any proof of the continued use of the AM3+ socket.
To be honest, I wish AMD would move on and stop constraining themselves to an ancient socket that is on the verge of being overloaded by the new FX-9xxx.
 

Tradesman1

Legenda in Aeternum

_________________

+1

AMD needs to look at the real world. They are still effectively using antiquated MCs (memory controllers) that max out at 32GB of 1333, and few can run more than 16GB of 1866... With both Bulldozer and Piledriver they artificially raised the multiplier to release them at a higher clock so they could claim 1866 as 'native', when the CPU is, by their own specs (their BIOS and Kernel Developer's Guide), native 1333. That claim soon went by the wayside, and they reverted to claiming they can run 1 stick per channel at 1866 (and that was with 4GB sticks).
 

fatboytyler

Distinguished
Jan 29, 2012
590
0
19,160


Yea I don't know what I was thinking putting FM... My bad.

I honestly feel like the AM3+ socket will stick around until they start downsizing to 22nm CPUs.
 


Even if they do, there are sooo many reasons for AMD to move on to a new socket.
 

Tradesman1

Legenda in Aeternum

_________________

a plus 2 on that
 
A little different viewpoint: money. The i5 IS the CPU for gaming, if the game is CPU intensive. Most aren't; they are GPU intensive.
On Newegg the i5s range in price from $179.99 up to $239.99; the 3570K is $219.99.
Still on Newegg, the AMD 8-cores are $149.99 to $199.99.
Nope, not a lot of difference, but some savings depending on your choice. I'd save the money and use it for graphics.
I don't do any video editing/compiling, recording, or detailed graphical work; what I do, when I am not writing in this forum, is play games. I went from a dual-core AMD 5600, to a Phenom X4, to my current 8120, and every time I saw an improvement in gaming. Did I wish I'd gone with an Intel? Sort of, but then I wouldn't have been able to afford two video cards, as I would have been out another ~$100.
AMD will give you plenty of "bang for your buck", and I don't think you'll see a difference between the two (i5 or 8350), ASSUMING you are using a good graphics setup.
And as fatboytyler and Tradesman1 said, the AM3+ socket will be around for a while. Intel has changed sockets 3 or 4 times in as many years.
Click on my avatar to see my specs. I did the 8120 as it was cheaper and if I want to run with an 8150 or 8350, I'll just OC it a little.
 
You can prove that today's AMD FX chips are effectively SMT parts.
- They only have one L1 instruction cache per 'pair' of 'AMD sub-cores'.

Under load, the work running on every even-numbered 'core' impacts the performance of the odd-numbered 'core' it is paired with.

I'm a software developer, and I know it is not viable to develop software that scales on AMD hardware without significant investment (in time, manpower, and cash to pay for it all).

It took 10 years to go from 16-bit to 32-bit, and it'll take 20 years to go from 32-bit to 64-bit (properly).

Coding for SMT for specific products is a giant waste of time and resources for any software company, even one with $100 million to throw at a project.

The next 'stepping' of these products may scale completely differently (from a low-level software perspective), so few applications (not all games) will be optimized for it. (And I mean optimized as in hand-coded inline assembly, not just targeting a specific CPU stepping with a compiler and switching code paths after a CPU ID check.)
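For anyone unfamiliar with the pattern, a CPU ID dispatch looks roughly like this sketch. MSVC's __cpuid intrinsic is real; the two render_* functions are just hypothetical stand-ins for vendor-tuned code paths:

// Minimal CPUID dispatch sketch (MSVC). The render_* paths are hypothetical.
#include <intrin.h>   // __cpuid
#include <cstdio>
#include <cstring>

void render_generic() { puts("generic x86 path"); }        // portable fallback
void render_amd_fx()  { puts("hand-tuned AMD FX path"); }  // imaginary tuned path

int main() {
    int info[4];
    char vendor[13] = {0};
    __cpuid(info, 0);                 // leaf 0: vendor string in EBX, EDX, ECX
    memcpy(vendor + 0, &info[1], 4);  // EBX
    memcpy(vendor + 4, &info[3], 4);  // EDX
    memcpy(vendor + 8, &info[2], 4);  // ECX
    __cpuid(info, 1);                 // leaf 1: family/model/stepping in EAX
    int stepping = info[0] & 0xF;
    printf("vendor=%s, stepping=%d\n", vendor, stepping);
    // A real dispatcher would also check family/model, not just the vendor.
    if (strcmp(vendor, "AuthenticAMD") == 0)
        render_amd_fx();
    else
        render_generic();
    return 0;
}

And that's the cheap version; the hand-coded assembly the big studios would need is another order of magnitude of effort on top.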

For gaming, if minimum frame rate is important: do not buy SMT parts!

If you don't believe me that the AMD FX parts are SMT, I can prove it to you, if you have Visual Studio 2010, 2012, or even 2013 installed.
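Here's the gist of the test, as a minimal sketch. It assumes Windows enumerates the paired 'sub-cores' as adjacent logical CPUs (0/1, 2/3, ...) and needs VS2012 or newer for std::thread. On a true 8-core chip the two timings should come out about equal; on FX, the same-module run is measurably slower:

// Pin two busy threads to the same FX module (CPUs 0,1), then to
// different modules (CPUs 0,2), and compare wall-clock times.
#include <windows.h>   // SetThreadAffinityMask
#include <chrono>
#include <cstdio>
#include <thread>

void spin() {  // plain integer busywork; volatile defeats the optimizer
    volatile long long acc = 0;
    for (long long i = 0; i < 400000000LL; ++i) acc += i ^ (i >> 3);
}

double run_pair(DWORD_PTR cpuA, DWORD_PTR cpuB) {
    auto t0 = std::chrono::steady_clock::now();
    std::thread a([=] { SetThreadAffinityMask(GetCurrentThread(), DWORD_PTR(1) << cpuA); spin(); });
    std::thread b([=] { SetThreadAffinityMask(GetCurrentThread(), DWORD_PTR(1) << cpuB); spin(); });
    a.join();
    b.join();
    return std::chrono::duration<double>(std::chrono::steady_clock::now() - t0).count();
}

int main() {
    printf("same module  (CPUs 0,1): %.2f s\n", run_pair(0, 1));
    printf("diff modules (CPUs 0,2): %.2f s\n", run_pair(0, 2));
    return 0;
}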

AMD needs to pull their head out of their arse if they want to recover to their former AMD Athlon 64 days.
- The money from the K6, K6-2/+ and K6-III/+ went into developing the Athlon, and then the Athlon 64, which really took Intel by surprise. Sadly, AMD seems unable to keep getting returns from R&D, nor do they seem willing to create one processor for gamers and another for servers (beyond simple rebranding of Opteron to Athlon or 'FX').

The 'FX' series was always just rebranded Opterons for enthusiast PCs.
- It still is today; it's just that the Opterons of today are designed purely for server and cloud workloads.
- Those workloads are completely different from gaming, and more in line with what Sun Microsystems had in mind with their UltraSPARC line-up.

The AMD FX processors of today are basically HyperThreaded parts with dedicated L1 data caches per thread; that alone is a massive mistake, as software engineers need dedicated L1 instruction caches per thread... why they designed their processor backwards is beyond me. (Heck, if they had a larger 'smart' L1 cache shared between the two 'sub-cores' and dedicated L1 code caches at half the size, they would actually scale!)

The resource contention in AMD's current FX processors is such a joke, and 99.9% of software developers won't even compile for a given CPU stepping target and then take dedicated code paths after a CPU ID check... let alone write inline assembly for a given product that will be obsolete by the time the project is finished.


I used to be a hardware guru here a decade or more ago; these days I lean massively towards software engineering.
 
By comparison:

The LGA 1150 processors (Haswell) have an extra integer ALU, which means that, per clock, software compiled targeting even older processors can magically run up to 4/3 as fast (+33%), with the slightly more advanced processor doing all the work to extract the extra IPC.

All the code requires is 4 independent integer operations in a given thread.
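Roughly, that means loop bodies shaped like this sketch (the function and names are just illustrative):

// Four independent accumulators, i.e. four dependency chains: no add waits
// on another, so a 4-ALU core (Haswell) can in principle retire all four
// per cycle, where a 3-ALU core (Sandy/Ivy Bridge) needs an extra cycle.
long long sum4(const long long* x, long long n) {
    long long a = 0, b = 0, c = 0, d = 0;
    for (long long i = 0; i + 3 < n; i += 4) {
        a += x[i];
        b += x[i + 1];
        c += x[i + 2];
        d += x[i + 3];
    }
    return a + b + c + d;  // the chains only merge once, at the end
}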

AMD did exactly this with the original Athlon's FPU (if memory serves), and as a result they very quickly became the market leader in most of the software that Tom's Hardware forum users talk about.

They did this even with their L2 caches running a step behind the Pentium III L2 cache in clock speed.


The Itanium 2 (which everyone here hated for some reason) can actually perform 6 instructions per clock cycle per core and has an incredible FPU; it's just that no one wanted to migrate from x86-32 to IA-64, and the processors were not designed to be cost-effective to produce (like the Pentium Pro production lines).
 

Vulpes vulpes

Honorable
Feb 3, 2013
216
0
10,710


There will be a lot of PC games supporting 8 cores, since games for the PS4 and Xbox One will be made on PC.
 




Some interesting and very informative posts here; just asking that you eventually choose a Best Answer for your question.
 


No, no rush!!! Sorry, but too often OPs forget to check a Best Answer. And none of my responses deserve that check mark; Tabris... has given some really important info and insight.
 
So the FX-8350 vs. i5-3570K debate has been beaten to death on these forums, and I feel I've read a great deal of those threads, yet I still can't come to a good conclusion.
- You've got to be fugging kidding me!

These consoles won't have 8-core performance if the chip mirrors the FX-8000s, even with FreeBSD.
- The only thing these 'eight' cores are good for is cooling, as the OS can park the idle 'sub-cores', of which there will be many, as no modern 'bleeding edge' game software is going to see a benefit versus four full cores.
- The lower power consumption might be good: if the CPU and GPU share a heatsink, it will let APU designs scale the GPU further, which in turn increases gaming performance.

This isn't because the eight 'sub-core' processor is doing more work; it's because its heat output is lower, and the maximum heat it can output is shared with the GPU in a slim console 'toy' device.

Consoles' main design considerations are power consumption and heat output; performance takes a back seat. (Nintendo knows this all too well.)

The other reason they use eight 'sub-cores' is that marketing gets a front-row seat over the hardware engineers when it comes to actually designing the hardware.
- Heck, why not just put 4-way SMT on each of the 'sub-cores' and call it a 64-'core' processor?

This is good news for PC Gamers though:
- Let the chumps buy their eight 'sub-core' toys.
- The consoles are basically low-end PCs designed by the marketing dept.
- This means that any half-decent PC tech could easily assemble a higher-performance machine.
- It also means that the software will be easier to port to PC without loss of performance, etc.

The 'eight' 'sub-core' optimized code running on these things will ultimately perform better on Core i5 4000-series systems, as the CPU and GPU won't be limited to a shared maximum power consumption and heat output.

Seriously, if the idea was so awesome, we would all have migrated to UltraSPARC systems on Solaris and be laughing at Windows users today. The same concept has come up every 2 years since 1998 in computing and gaming, and it NEVER takes off in the long run. (It just lets marketing departments design shitbox units with a good average selling price, while claiming to make a loss per unit, to sell in a closed-garden marketplace with 100% vendor lock-in.)
- Sounds like the almost perfect business model to rip people off?
 

fatboytyler

Distinguished
Jan 29, 2012
590
0
19,160


This is exactly what I've been trying to tell people for the past month... Yeah, they are 8-core CPUs, but they are essentially a modified mobile chipset.