
CPU Performance Boosted 20% When CPU, GPU Collaborate

By Jane McEntegart | Source: NCSU | 49 comments

Researchers have managed to get CPUs and GPUs on a single chip to work together more efficiently and boost performance.

Engineers at North Carolina State University set out to improve how the CPU and GPU perform together by engineering a solution in which the GPU executes computational functions while the CPU cores pre-fetch the data the GPU needs from off-chip main memory. In the research team's model, the GPU and CPU are integrated on the same die and share the on-chip L3 cache and off-chip memory, similar to Intel's Sandy Bridge and AMD's APU platforms.

"Chip manufacturers are now creating processors that have a 'fused architecture,' meaning that they include CPUs and GPUs on a single chip," said Dr. Huiyang Zhou, an associate professor of electrical and computer engineering who co-authored a paper on the research.

"This approach decreases manufacturing costs and makes computers more energy efficient. However, the CPU cores and GPU cores still work almost exclusively on separate functions. They rarely collaborate to execute any given program, so they aren’t as efficient as they could be. That’s the issue we’re trying to resolve."

Zhou's solution has the CPU do the legwork: it determines what data the GPU needs and retrieves it from off-chip main memory, leaving the GPU free to focus on executing the functions in question. This collaboration makes the overall process faster; in simulations, the new approach improved fused-processor performance by an average of 21.4 percent.

The paper will be presented at the 18th International Symposium on High Performance Computer Architecture, in New Orleans, later this month. In the meantime, you can check out more details on the project here.

Follow @JaneMcEntegart on Twitter for the latest news.      

Top Comments
  • 22 Hide
    alvine , February 10, 2012 1:26 PM
    in other news SSDs make your system faster
  • 16 Hide
    pg3141 , February 10, 2012 12:46 PM
    Will this mean anything for the current generation of hardware?
  • 12 Hide
    zanny , February 10, 2012 3:14 PM
    Quote:
    warezme: nothing really, and don't game developers already know this and have been doing this for some time.


    This is actually not true. Just FYI, credentials-wise, I am a software engineer who doesn't work in gaming but plays plenty of games. I have used openGL / openCL / etc.

    PC game developers now have a technology that allows them to compute almost all game logic GPU side - openCL / CUDA - where before that had to be done CPU side. It is why a game like World of Addictioncraft used a lot of CPU resources when it came out, because it did collision detection CPU side because they wrote the game for an openGL standard that didn't support general computation outside vector processing on GPUs.

    Today, with openCL (you don't want to write CUDA for Nvidia chips and something else for AMD when you can just write openCL and be cross-GPU) you can do a lot of parallelizable things GPU side that were previously outside the vectorization paradigm that openGL restricts GPU processing to.

    And the general pipeline of a game engine, at its basic roots, is process input (user, network, in engine message passing) -> update state (each agent reacts on a tick stamp to game world events) -> collision detection (to prevent overlapping models) -> GPU rendering of the game world. Today, everything but processing input can be offloaded to the GPU and done massively parallel through openCL / openGL.

    The next generation of games "should", if properly implemented, use so few processor resources besides file and texture streaming and processing key events and handling network packets that you might get 10% of one CPU utilized in an extremely high fidelity game that pushes the GPU to the limit but barely uses any CPU resources.

    It also makes no sense to do any of those parallel tasks CPU side either - GPUs are orders of magnitude faster at that stuff. It is why an i5 2500k for $225 will last you a decade but you can spend $1500 on 3 7970s in triple Crossfire and have them be outdated by 2015. Games are moving into a completely GPU driven architecture for everything, and it is a good thing. It hugely increases the performance you can get from a game.
Other Comments
  • -9 Hide
    warezme , February 10, 2012 12:55 PM
    Quote:
    pg3141: Will this mean anything for the current generation of hardware?

    nothing really, and don't game developers already know this and have been doing this for some time.
  • 8 Hide
    outlw6669 , February 10, 2012 12:57 PM
    Quote:
    Will this mean anything for the current generation of hardware?

    It could, if programmers get behind it.
  • 4 Hide
    hoofhearted , February 10, 2012 1:22 PM
    I am sure the more experienced engineers at Intel and Nvidia are past what the college paper-writing academia oriented folks are doing.
  • 2 Hide
    hoofhearted , February 10, 2012 1:24 PM
    It is probably already done, just to be released at a later date determined by their marketing release schedule.
  • -7 Hide
    drwho1 , February 10, 2012 1:40 PM
    Could this mean no more dedicated video cards in the future?
    I hope not.
  • 6 Hide
    vittau , February 10, 2012 1:41 PM
    Well, transistors are reaching a physical limit, so we need any kind of optimization we can get. Let's hope this technology gets implemented soon...
  • 5 Hide
    stingray71 , February 10, 2012 1:57 PM
    Often when gaming, my GPUs and VRAM are at or near 100% while my system CPU and memory are nowhere near being maxed out. Seems to me the system memory and CPU could be given some of the workload.

    I know this article is talking about the CPU and GPU being on the same chip, but it seems to me better utilization across the board could be taking place.
  • 8 Hide
    digitalzom-b , February 10, 2012 2:05 PM
    Maybe lower budget gaming PCs will be able to get away with integrated video solutions in the future.
  • -4 Hide
    jaber2 , February 10, 2012 2:07 PM
    Isn't Intel and AMD already doing this?
  • -8 Hide
    loomis86 , February 10, 2012 2:16 PM
    Quote:
    digitalzom-b: Maybe lower budget gaming PCs will be able to get away with integrated video solutions in the future.


    You are completely missing it. This research proves separate GPUs are STUPID. I've been saying for at least a year now that the integrated GPU is the future. Little by little, the Separate GPU is going to disappear until only the most extreme GPUs will remain as separate units. Eventually SOC will be the norm. RAM and BIOS will be integrated on a single die someday also.
  • 5 Hide
    ubercake , February 10, 2012 2:41 PM
    Quote:
    drwho1: Could this mean no more dedicated video cards in the future? I hope not.

    When they get to the point where top video solutions are on the chip with the CPU, we'll all need one mother of a cooling solution.
  • 6 Hide
    alyoshka , February 10, 2012 2:42 PM
    Now this is an old idea but is becoming possible now....
    First we had the CPU, and graphical calculations were made by it.
    Then they made the GPU and graphical calculations were shifted onto it.
    Then they made the Sandy Bridge & The Llano which calculated the graphics with the help of a secondary Chip on board.
    Now they're getting the CPU and the GPU onto one single die....
    Lots of work... done... all for the same result....
  • 4 Hide
    DRosencraft , February 10, 2012 2:44 PM
    Proper integration of command structure is not only a hardware issue, but also a software one. For example, 3D rendering programs, which one would think would pull much of their data from the GPU during rendering, instead draw all their power from the CPU (note I'm talking specifically about rendering). I would imagine there should be some means of re-writing the code of a program like Maya or Modo so that the GPU helps out. It is a software instruction, after all, that allows multiple cores or even multiple CPUs to work together.

    I would think going the software route would not meet the efficiency angle these researchers seem to be looking at, but it might have better payoffs in terms of performance. Part of why CPUs with embedded graphics controllers aren't more powerful is that the heat generated is too much, something not really a problem when the GPU and CPU are on different dies as in a gaming rig. Again, APUs are more of an efficiency play for low-power systems, not really performance. So the first uses of the ideas in this article will likely be in smartphones, netbooks/ultrabooks, tablets, or gaming consoles; any device that goes the SoC route. In short, this means much more for the mobile industry than it does for enthusiasts and gamers.
  • -8 Hide
    hajila , February 10, 2012 2:44 PM
    But will it play Crysis?
  • 2 Hide
    shin0bi272 , February 10, 2012 2:52 PM
    Quote:
    loomis86: You are completely missing it. This research proves separate GPUs are STUPID. I've been saying for at least a year now that the integrated GPU is the future. Little by little, the Separate GPU is going to disappear until only the most extreme GPUs will remain as separate units. Eventually SOC will be the norm. RAM and BIOS will be integrated on a single die someday also.


    http://www.guru3d.com/article/amd-a8-3850-apu-review/11

    As long as you don't mind not being able to upgrade your graphics without upgrading your CPU, and can put up with lowered fps and overall system performance... then yes, APUs are "the future". Oh wait, were you speaking of the idea that some people have been throwing around that tablet gaming is the future? Yeah, no.
  • 3 Hide
    aznjoka , February 10, 2012 3:08 PM
    This is why AMD developed the APU.