CPU Performance Boosted 20% When CPU, GPU Collaborate
Researchers have managed to get CPUs and GPUs on a single chip to work together more efficiently and boost performance.
Engineers at North Carolina State University set out to improve how the CPU and GPU perform together by engineering a solution in which the GPU executes computational functions while the CPU cores prefetch the data the GPU needs from off-chip main memory. In the research team's model, the GPU and the CPU are integrated on the same die and share the on-chip L3 cache and off-chip memory, similar to Intel's Sandy Bridge and AMD's APU platforms.
"Chip manufacturers are now creating processors that have a 'fused architecture,' meaning that they include CPUs and GPUs on a single chip," said Dr. Huiyang Zhou, an associate professor of electrical and computer engineering who co-authored a paper on the research.
"This approach decreases manufacturing costs and makes computers more energy efficient. However, the CPU cores and GPU cores still work almost exclusively on separate functions. They rarely collaborate to execute any given program, so they aren’t as efficient as they could be. That’s the issue we’re trying to resolve."
Zhou's solution has the CPU do the legwork: it determines what data the GPU needs and retrieves it from off-chip main memory, leaving the GPU free to focus on executing the functions in question. The result of this collaboration is that the process takes less time; in simulations, the new approach improved fused-processor performance by an average of 21.4 percent.
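In rough terms, the division of labor described above resembles a producer/consumer pattern: one side stages data ahead of time so the compute side never stalls waiting on memory. Here is a minimal CPU-only sketch of that idea; all names are illustrative, and the paper's actual mechanism operates at the hardware and cache level, not with software threads.

```python
# Conceptual sketch only: a "prefetcher" stages data chunks the
# "compute" side will need, overlapping fetch latency with work.
import threading
import queue

def prefetcher(data_source, chunks, out_q):
    # CPU-side helper: walk the upcoming access pattern and stage
    # each chunk before the compute side asks for it.
    for lo, hi in chunks:
        out_q.put(data_source[lo:hi])   # simulated fetch from main memory
    out_q.put(None)                     # sentinel: no more work

def compute(out_q, results):
    # Stand-in for the GPU kernel: consumes staged chunks and never
    # issues its own memory fetches.
    while (chunk := out_q.get()) is not None:
        results.append(sum(x * x for x in chunk))

data = list(range(1000))
chunks = [(i, i + 100) for i in range(0, 1000, 100)]
staged = queue.Queue(maxsize=4)  # bounded: prefetch runs a few chunks ahead
results = []
t = threading.Thread(target=prefetcher, args=(data, chunks, staged))
t.start()
compute(staged, results)
t.join()
```

The bounded queue mirrors the key constraint: the prefetcher can only run a fixed distance ahead, much as a shared cache can only hold so many staged lines.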
The paper will be presented at the 18th International Symposium on High Performance Computer Architecture, in New Orleans, later this month. In the meantime, you can check out more details on the project here.

This is actually not true. Just FYI, credentials-wise, I am a software engineer who doesn't work in gaming but plays plenty of games. I have used OpenGL / OpenCL / etc.
PC game developers now have technology that lets them compute almost all game logic GPU-side (OpenCL / CUDA), where before it had to be done CPU-side. It is why a game like World of Addictioncraft used a lot of CPU resources when it came out: it did collision detection CPU-side because it was written against an OpenGL standard that didn't support general-purpose computation on GPUs outside of vector processing.
Today, with OpenCL (you don't want to write a CUDA path for Nvidia chips and something else for AMD when you can just write OpenCL and be cross-GPU), you can do a lot of parallelizable things GPU-side that were previously outside the fixed vectorization paradigm OpenGL restricted GPU processing to.
And the general pipeline of a game engine, at its basic roots, is: process input (user, network, in-engine message passing) -> update state (each agent reacts to game-world events on a tick) -> collision detection (to prevent overlapping models) -> GPU rendering of the game world. Today, everything but processing input can be offloaded to the GPU and done massively parallel through OpenCL / OpenGL.
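That frame loop can be sketched as follows. This is a toy CPU-only sketch: the `gpu_`-prefixed functions are hypothetical stand-ins for what would be OpenCL kernels, included just to show which stages are data-parallel and therefore offloadable.

```python
# Toy frame loop mirroring the pipeline above. The "gpu_" functions
# run on the CPU here; in a real engine they would be GPU kernels.
def process_input(events):
    # Stays CPU-side: user/network/engine messages are serial and branchy.
    return [e for e in events if e != "noop"]

def gpu_update_state(agents, commands):
    # Offloadable: each agent reacts independently per tick (data-parallel).
    return [a + len(commands) for a in agents]

def gpu_collision_detect(positions):
    # Offloadable: the all-pairs overlap test is embarrassingly parallel.
    return [(i, j)
            for i in range(len(positions))
            for j in range(i + 1, len(positions))
            if abs(positions[i] - positions[j]) < 1]

def frame(events, agents):
    cmds = process_input(events)
    agents = gpu_update_state(agents, cmds)
    hits = gpu_collision_detect(agents)
    return agents, hits  # render step omitted

agents, hits = frame(["move", "noop", "jump"], [0.0, 0.4, 5.0])
```

The point of the structure is that only `process_input` has inherently serial, branch-heavy logic; the other stages apply the same operation to many independent items, which is exactly what a GPU is built for.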
The next generation of games "should", if properly implemented, use so few CPU resources beyond file and texture streaming, key-event processing, and network packet handling that an extremely high-fidelity game might push the GPU to its limit while utilizing maybe 10 percent of one CPU core.
It also makes no sense to do any of those parallel tasks CPU-side: GPUs are orders of magnitude faster at that kind of work. It is why a $225 i5-2500K will last you a decade, while you can spend $1,500 on three 7970s in triple CrossFire and have them be outdated by 2015. Games are moving toward a completely GPU-driven architecture for everything, and it is a good thing: it hugely increases the performance you can get from a game.
Nothing, really. And don't game developers already know this and haven't they been doing it for some time?
It could, if programmers get behind it.
I hope not.
I know they are talking about the CPU and GPU being on the same chip in this article, but it seems to me better utilization could be taking place across the board.
You are completely missing it. This research proves separate GPUs are STUPID. I've been saying for at least a year now that the integrated GPU is the future. Little by little, the separate GPU is going to disappear until only the most extreme GPUs remain as discrete units. Eventually SoC will be the norm. RAM and BIOS will be integrated on a single die someday too.
When they get to the point where top video solutions are on the chip with the CPU, we'll all need one mother of a cooling solution.
First we had the CPU, and graphical calculations were done by it.
Then they made the GPU, and graphical calculations were shifted onto it.
Then they made Sandy Bridge and Llano, which calculated the graphics with the help of a secondary chip on board.
Now they're getting the CPU and the GPU onto one single die....
Lots of work... done... all for the same result....
I would think going the software route would not meet the efficiency angle these researchers seem to be looking at, but it might have better payoffs in terms of performance. Part of why CPUs with embedded graphics controllers aren't more powerful is that they generate too much heat, something that isn't really a problem when the GPU and CPU are on different dies, as in a gaming rig. Again, APUs are more about efficiency for low-power systems than about performance. So the first uses of the ideas in this article will likely be in smartphones, netbooks/ultrabooks, tablets, or gaming consoles; any device that goes the SoC route. In short, this means much more for the mobile industry than it does for enthusiasts and gamers.
http://www.guru3d.com/article/amd-a8-3850-apu-review/11
As long as you don't mind not being able to upgrade your graphics without upgrading your CPU, and can put up with lower fps and overall system performance... then yes, APUs are "the future". Oh wait, were you speaking of the idea some people have been throwing around that tablet gaming is the future? Yeah, no.