
Is AMD pulling a big Rope-A-Dope on Intel?

February 4, 2013 9:00:25 PM

Hi all,

By starting this thread I'm not trying to spark a platform war between Intel and AMD fanboys; I just want to see what others think. Over the last couple of weeks I have been doing minor upgrades to my gaming computer to squeeze a little more life out of it, and I'm happy that I'll be able to game on it for the next year and a half to two years, or at least until the next-gen consoles are released. While upgrading and overclocking I researched the benefits and drawbacks of AMD and Intel processors, and ultimately decided to wait a couple of years before updating to a new motherboard and processor, since I don't think either line is "proofed" against the future consoles. I'm not saying the tech in the next-gen consoles will be outstanding, but their specs will set the tone for how games are produced for the next ten years.

Intel currently rules the computer gaming world. That is a fact. But Intel's weak spot is that its i5 and i7 processors are quad-core CPUs. They rule the gaming world because right now most games use at most around three cores, and Intel has the most powerful individual cores in the business.

AMD currently rules the multi-core application world of computing. In benchmark tests, the only place an AMD eight-core processor can beat the mighty quad-core i7 is in programs that fully utilize many threads. If a program can use all eight cores, the AMD processors will beat the i7's more powerful quad cores.

That brings me to my question: is AMD pulling a big rope-a-dope? The only way AMD could hope to bring its processors back to life in the gaming world is if games changed their programming to fully exploit heavily threaded, many-core designs. How would AMD make that happen? They already are: both the next-gen Xbox and the PS4 are expected to use AMD silicon. If both use underclocked eight-core processors, game producers would have to shift from maxing out three cores to eight-core programming that demands less of each individual core. IF the next-gen consoles come out with eight-core designs, that will almost certainly push game producers toward multi-threaded, multi-core gaming, and that would leave Intel (at least the i5 and i7) behind AMD's eight-core design.

Now, I know Intel is more than likely working on eight-core processors built on Sandy Bridge, Ivy Bridge, etc., but there aren't any in the mainstream desktop lineup right now. That means anyone who went through the expense of buying a motherboard, processor, and possibly RAM would have to buy them all again to play games designed for many threads. AMD has also been focusing on multi-core (6+ core) design for much longer than Intel, and with that focus may have the best eight-core design.

So I ask the question: could this be AMD's answer to Intel's gaming dominance? Is eight-core gaming the near future of gaming? Is this how AMD is going to get back on top?
February 4, 2013 9:32:16 PM

Interesting idea. The FX-83xx Vishera is a nice CPU already; it will be great to see how developers work to fully utilize the architecture.
February 4, 2013 10:11:05 PM

Benchmarks that give the (very slight) edge to AMD are irrelevant for gaming because they usually have very streamlined, predictable workloads that can utilize each thread to its fullest from start to finish. Such workloads are parallel by nature and therefore easy to code.
Games, on the other hand, are a different can of worms. CPU utilization spikes constantly no matter what game you play, because in any non-turn-based game (unlike, say, chess) no two situations are ever identical, let alone constantly identical.
When it comes to chess (let's say white is human and black is computer for argument's sake), it's always going to be the following:
1) White makes a move
2) Black assesses the situation and acts accordingly; more specifically, it tries as many moves as possible in the time the programmer allows, deducing the best move out of all the variations it managed to play, which again depends on the processing power of the computer
Step 2 on the drawing board will look like this:
http://i.imgur.com/svK0oy5.png
...and so on. As you can see from this poor schematic, such a game can benefit from parallelized coding. Sadly, most games we get to play nowadays don't.
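To make the chess example concrete: because each candidate move's subtree shares no state with the others, the root of the search splits cleanly across cores. A minimal sketch in Python (chosen here just for brevity; a real engine would be C/C++, and `evaluate_move` below is a toy stand-in for a deep tree search, not a real evaluator):

```python
from concurrent.futures import ProcessPoolExecutor

def evaluate_move(move):
    """Stand-in for a deep game-tree search: here just a toy scoring
    function, so the parallel structure stays visible."""
    return sum(ord(c) for c in move) % 100

def best_move(candidate_moves):
    """Root splitting: each candidate move's subtree is searched on a
    separate worker process (i.e. a separate core), because the
    subtrees are fully independent of one another."""
    with ProcessPoolExecutor() as pool:
        scores = list(pool.map(evaluate_move, candidate_moves))
    return candidate_moves[scores.index(max(scores))]

if __name__ == "__main__":
    # Four hypothetical candidate moves evaluated in parallel.
    print(best_move(["e2e4", "d2d4", "g1f3", "c2c4"]))
```

This is exactly the "embarrassingly parallel" shape that current action games lack: four independent jobs, zero shared state, and the speedup scales with the core count.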

The best thing developers can do with the design of current games is to split different tasks across cores, i.e. the main program executes on Core 0, AI functions run on Core 1, control input and sound output on Core 2. What's left for the CPU to do? Well, particle physics comes to mind, but unfortunately that task is easier to code for GPGPU (OpenCL, or PhysX which is based on CUDA) than it is in x86 (CPU) code.
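That task-per-core split can be sketched like this (Python for brevity; the subsystem names and job lists are invented for illustration). Note that each subsystem gets one long-lived worker thread rather than true data parallelism, which is exactly why this approach stops scaling once you run out of distinct tasks to hand out:

```python
import threading
import queue

def run_subsystem(name, inbox, results):
    """Drain this subsystem's work queue until a sentinel arrives.
    One dedicated thread per subsystem, as described above."""
    processed = 0
    while True:
        job = inbox.get()
        if job is None:        # sentinel: frame is over, shut down
            break
        processed += 1         # a real engine would do the work here
    results[name] = processed

def run_frame(jobs_per_subsystem):
    """Dispatch each subsystem's jobs to its own worker thread."""
    inboxes = {name: queue.Queue() for name in jobs_per_subsystem}
    results = {}
    threads = [threading.Thread(target=run_subsystem, args=(n, q, results))
               for n, q in inboxes.items()]
    for t in threads:
        t.start()
    for name, jobs in jobs_per_subsystem.items():
        for job in jobs:
            inboxes[name].put(job)
        inboxes[name].put(None)
    for t in threads:
        t.join()
    return results

if __name__ == "__main__":
    print(run_frame({"ai": ["path", "plan"], "audio": ["mix"], "input": []}))
```

With only three subsystems, a fourth core sits idle no matter how fast it is: the split is by task, not by data.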

Currently, the tasks other than the main program and particle physics done on the CPU in a few select games such as StarCraft II are too light on power requirements. One reason the gaming industry might prefer more cores over more powerful cores would be improved motion-detection input, with a) more sensors and b) higher-resolution sensors than today's Kinect, for example. Another would be focusing on the MMO genre, where many players introduce a much higher count of player inputs (or rather outputs) that need to be processed by both server and client. Unfortunately, the MMO is a dying breed; WoW's time has passed, and realistically speaking, no other MMO has ever been anywhere near as successful.

The bottom line is that as long as games don't perform repetitive tasks, developers won't be able to parallelize game engines efficiently enough to justify octo-cores in purely gaming machines, be they PCs or consoles.

I hope I'm wrong, it's been a long time since I started moaning about game developers going easy on the CPUs.

One other idea popped into my mind just as I was about to post. It's very unlikely to happen, but as things stand, the only thing that warrants extra cores in gaming today is streaming. So if MS/Sony/Nintendo wanted to push PC gaming even further away from the mainstream, they could launch a streaming service like Twitch.tv that would be viewable on the consoles too.

Edit: had to resort to paint
February 4, 2013 10:17:49 PM

You have to ask where the PC / gaming console / tablet market is going. If ARM/phone CPUs keep up their current pace, with the speed and power gains coming in each new generation of ARM chips, I don't think there's much life left in the old x86 CPU.
Myself, I see a home PC/server unit that not only streams movies but also plays games on an Android/iOS-style OS. I see AMD selling its CPU IP and engineering to the ARM phone makers to help them build more powerful low-wattage chips. I see AMD making GPU core logic for ARM and other chips. I see Intel's x86 leaving the home/PC market and staying in the server space, with Intel turning more into a fab and R&D operation.
February 4, 2013 11:06:33 PM

Now this is the kind of discussion I was hoping for :sol:  I hate it when you try to talk about the future of gaming and CPUs and all you get is fanboy nonsense. I agree, and I also hope gaming in the near future doesn't make quad cores obsolete. I only have a Phenom II 965 BE, but I gotta admit I love it. It has handled everything I've ever asked of it and then some, and with my 7970 GPU I can play anything on ultra, hopefully for the next two years at least.

I'm thinking the speculated eight-core processors in the next-gen consoles have a lot to do with motion controls, but they leave the door open to software producers too. I know it would take most studios several years working with the new hardware to produce games that fully exploit the available power. It would be interesting to see what a game built from the ground up to run on eight cores at once would be like...

I'm really hoping gaming isn't going in the direction of tablet gaming. I've played games on PCs, consoles, tablets, and smartphones, and while Angry Birds is fun on a tablet or phone, I'm very glad to power up my PC to play some "real" games. In my humble opinion, Kinect-type tech also has its drawbacks; for one, I really don't want my kids throwing kicks at my expensive LCD!! But the tech is interesting, and even enjoyable.

Thank you both for your input!! And if you have any other thoughts please post them. I love an open discussion with no "mine is bigger than yours" crap getting in the way :sol: 
February 5, 2013 2:59:54 AM

Intel has spare cores up its sleeve, though, so any time it wants it can deliver extra cores at no extra expense to itself. Sandy Bridge-E has eight cores with two disabled.

Mactronix :) 
February 5, 2013 3:24:45 AM

It's worth remembering that the Xbox 360 and PS3 are over seven years old, and both have what are ultimately 6(+) core CPUs. You can debate what a core is, of course, and whether those are genuine "cores", or whether the current AMD CPUs' cores are either, but consoles have been heavily threaded for a long time now.
Just because the next generation of consoles might utilise 8+ threads doesn't necessarily mean that PC games will follow suit.

Interesting topic though.
February 5, 2013 7:08:54 AM

Also, don't forget that multi-core chips have been out for years now and we still don't have games that take proper advantage of all the cores.

I think a game that would use an octo-core CPU to its full benefit is a long way off yet.

Mactronix :) 
February 5, 2013 7:49:17 AM

In fact, the way I see it, Intel's edge comes especially from the software side. They make good hardware, no question, but there is also the coding side of things, be it games or other software. That begins at the driver and OS level and continues with compiler optimisations. Intel is certainly investing more in software than AMD does. Lately, though, AMD seems to have seen the importance of software too. Ever since their Vision platform I've wondered whether they could leverage their stronger side (GPU and multithreading) for compute, and at last they are now talking about things like HSA.

The future of computing is about software and tricks to do more with less, not solely about hardware and more advanced process nodes. Of course, not every workload can be parallelized, but that's another topic. Intel has also released its own kind of chip for massively parallel workloads (Xeon Phi), along with software tools to make programming it easier. We'll see which implementation proves more "efficient" and which one software developers adopt more widely.
February 28, 2013 6:21:13 AM

I see this thread is old but I'd like to bring up a few points if anyone is interested.

1. Android/iOS games suck right now, yes. However, while the OUYA won't come anywhere close to the next gen of consoles on a TECHNICAL level, it does open up Android to game developers in a way that hasn't been explored before. With this avenue open, I think Android can blossom into a gaming ecosystem worthy of the true gamer's dollar (in other words, non-casual gamers). This might be even more true once the OUYA is updated to Tegra 4 and beyond.

2. Yes, AMD would love to have heavily threaded games be the norm. Keep in mind that although the 360/PS3 technically can take advantage of heavy threading, lazy porting causes most of our game thread issues. I'm looking at you, Ubisoft, you lazy sack of ----.
With both new high end consoles (not the Wii U) using AMD CPUs, ports will be a lot easier to do well and keep all advanced threading intact when going to the PC. It's x86 all around.

3. If HSA is a resounding success, it may be possible for Intel to be relegated to a server role like smorizio suggested. However, Intel's budget is huge, and they hate being pushed around. They will try to out-HSA AMD if it comes to that. Can Intel do it? I wouldn't put it past them.... However, even with their huge R&D budget, according to rumors, the Haswell GT3 iGPU will still need expensive GDDR on board to compete with Kaveri. I'm not an Intel fanboy, but I know it's a bad thing to piss them off.
They can always put the hurt on Nvidia and buy them out if they need engineers to make it happen.

4. There are already games out that utilize eight cores. See Guild Wars 2, which sees performance improvements all the way up to 16 cores. True, most of the gains come going up to six cores, with less past that, but still. It's possible.
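The shape of that scaling curve (big gains up to six cores, little after) is what Amdahl's law predicts whenever any serial portion remains. A quick sketch, where the 85% parallel fraction is an illustrative guess, not a measured figure for Guild Wars 2:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: the serial portion of the work caps the speedup
    no matter how many cores you add."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# With ~85% of the frame parallelizable, the gains flatten fast:
for n in (2, 4, 6, 8, 16):
    print(n, "cores ->", round(amdahl_speedup(0.85, n), 2), "x")
```

Going from one to six cores more than triples throughput under this assumption, while going from six to sixteen adds well under 50% more: the curve the Guild Wars 2 observation describes.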

5. Xeon Phi is a joke. Hopefully OpenCL relegates these things to another failed Intel project like Itanium.

6. Heavily threaded games will help Intel, too. All they have to do is enable HT on their i5s and presto: eight virtual cores. While HT only adds about 20% of a real core's throughput per core, Intel doesn't need that big a performance advantage to keep AMD in second place for gaming.
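Taking that ~20% figure at face value (it's a rough rule of thumb, not a measured constant), the arithmetic works out like this:

```python
def effective_cores(physical_cores, ht_gain=0.20):
    """Rough throughput estimate: a hyper-threaded core behaves like a
    bit more than one core, not like two. The 0.20 default is the
    rule-of-thumb figure from the post above, not a benchmark result."""
    return physical_cores * (1.0 + ht_gain)

# A quad core with HT enabled looks like roughly 4.8 cores of
# throughput to a heavily threaded game, not 8:
print(effective_cores(4))   # 4.8
```

So eight "virtual" cores on an i5 would be closer to five real cores' worth of work, which is the point: Intel wouldn't need a 2x advantage, just enough.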

7. I think many people forget the Oculus Rift when they think about the future of gaming. While I doubt the data output from the O.R. to the PC will need more cores, I do think it affects the resolution we demand from games. The current prototype isn't even 1080p, and apparently that's not even an issue; according to the Tom's article, your eye fills in data to complete the illusion of reality.
I'm almost certain the O.R. will be successful. The Tom's article was glowing, and I can see it becoming a mainstay for any gamer, regardless of platform, whether they prefer Android, PC, or console. This could push us away from 1440p/4K displays in the future and shift game developers' focus to mechanics and interactivity rather than pure graphical fidelity.
Oh, I'm sure graphics will still improve, but it will be things like better character models, advanced physics, lighting, and particles. Hmmm, all things done well by GPU compute...

8. Another point about ARM that so many people seem to have forgotten:
Nvidia was scorned by Intel and has a chip on their shoulder. Project Denver is well underway using ARM tech and I have a feeling we'll be hearing news on it soon. I believe that the PC world will be shaken to the foundations when Denver hits the reviewers. I am not an Nvidia fanboy, but I am incredibly excited to see what kind of secret sauce Nvidia has up their sleeve. Don't count them out of the CPU race just yet.
Also, I think it'd be funny to see Team Green fans react to this. They should love the chance to build a 100% green machine.
February 28, 2013 7:03:56 AM

The idea of a fully team-green machine is certainly interesting, yet AMD has had this ability for a while now and doesn't seem to be making as much of it as you'd expect.

Technically AMD should be in a better place for this in a couple of years' time, what with the console wins and all. However, it would be a very brave man who said Nvidia would be at any great disadvantage; they have a track record of leveraging advantages from game devs.
Also, I know this is a contentious point: personally I've had about two driver-related issues in 10+ years of ATI/AMD GPUs, but I do believe Nvidia has the better driver team to implement a closer synergy between CPU and GPU.

Mactronix :) 
February 28, 2013 9:36:01 AM

I think AMD is looking better and better vs Intel for enthusiast desktop CPUs. In newer games like Far Cry 3, their Phenom II chips like the 965 are looking better value than they ever did before, and in Crysis 3 they're beating Intel altogether. Intel could easily release 4-6 core CPUs at a lower price, but I don't expect them to any time soon. Haswell isn't expected to bring much of a performance increase over Ivy Bridge on the desktop, so I think AMD will have a good year in desktop CPUs anyway. Also, if they could release an APU with Radeon 7850-class performance (like the PS4 chip), Intel isn't going to beat that.
February 28, 2013 12:50:29 PM

Soda-88 said:
Benchmarks that give the (very slight) edge to AMD are irrelevant for gaming [...] The bottom line is that as long as games don't perform repetitive tasks, developers won't be able to efficiently parallelize game engines which would justify having octocores in purely gaming machines, be it PCs or consoles.


Agreed. The work games can offload to parallel processors is already offloaded to the GPU for the most part, which limits scalability. There are a few exceptions to the rule; some really GPU-heavy games push some effects back onto the CPU (Crysis 3 comes to mind), even though the GPU is better suited for those tasks. But for the most part, the majority of the work is not parallel in nature.

As noted, some tasks, such as chess, scale REALLY well. But these days that type of work is offloaded to the GPU because, again, it's better suited to such workloads.

You can see this thinking in the PS4 hardware: top-tier current GPU, very underwhelming CPU. Why? Because the GPU drives performance.
February 28, 2013 1:12:07 PM

smorizio said:
you have to ask where the pc/gaming console/tablet is going. [...] i see intel x86 leave the home/pc market and stay in the server workspace.


Rammy said:
It's worth remembering that the Xbox360 and PS3 are over 7 years old, and both have what are ultimately 6(+) core CPUs. [...] Just because the next generation of consoles might utilise 8+ threads doesn't necessarily mean that PC games will follow suit.


The thing is, the AMD APUs in these next-gen consoles will be x86, and games will be optimized for eight x86 cores rather than for an exotic architecture like the one the PS3 uses today.
February 28, 2013 1:45:16 PM

While it's AMD hardware, a lot of what makes it tick (if reports are correct) is Sony-owned, and as such we can't expect a straight drop-in port to work any better than ports do now.
Tom's own report states that it's all AMD intellectual property, though. I can't quite see Sony granting AMD the rights to optimizations that Sony has developed.

Mactronix :) 
February 28, 2013 2:25:41 PM

^^ But keep in mind: those "optimizations" are going to be built around the fact that on the PS4 there will almost certainly be some guaranteed amount of resources free for developer use at all times.

For example, the PS3 had 256 MB of main RAM and seven functional SPEs. About 200 MB or so (it varied by OS version) and six SPEs were free for developer use AT ALL TIMES. This allows VERY low-level optimizations to take place (fine-grained threading, etc.). On PCs this cannot happen, because at any given point in time you have no guarantee over any resources. Heck, you can't even guarantee your application is running at any given point in time!

So that nice hard-coded threading logic that will be created for the PS4? It gets junked in the PC port, with the logic redesigned around thread creation and not much else, leaving the OS scheduler to handle assigning threads to cores.

Hence why I doubt games will suddenly start to use more cores any more than they already do. You *might* see some threads that used to do very little work doing more processing due to a higher amount of graphical work, but I really doubt you'll see any real change in threading logic going forward.
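That "create threads and let the scheduler sort it out" model can be sketched as follows (Python for brevity, with a made-up worker function). On Linux a port *could* pin threads to cores with `os.sched_setaffinity`, but as argued above, PC games generally don't bother, since no resources are guaranteed anyway:

```python
import threading

def worker(results, index):
    # On a PC port we just spawn the thread; which core it ends up on
    # is entirely the OS scheduler's decision, unlike the console's
    # hard-coded core assignments described above.
    results[index] = index * index   # stand-in for real per-thread work

def pc_style_dispatch(n_threads):
    """Spawn n worker threads with no affinity hints at all and
    collect their results once the scheduler has run them."""
    results = [None] * n_threads
    threads = [threading.Thread(target=worker, args=(results, i))
               for i in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

if __name__ == "__main__":
    # The results are deterministic even though core placement isn't.
    print(pc_style_dispatch(4))
```

The console version of this code would bind each thread to a known core and tune around that; the PC version above is all a port can safely assume.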