
Half Of All Notebooks To Use gCPUs This Year

Source: Tom's Hardware US

The introduction of Intel's Sandy Bridge and AMD's Fusion processors will dramatically increase the penetration of graphics-enabled CPUs (gCPUs), market research firm IHS iSuppli said today.

According to a new forecast, 50% of notebooks and 45% of desktops will use gCPUs in 2011, up from 39% and 36%, respectively. By 2014, 83% of notebooks will use gCPUs, while the share of desktop PCs will hit 76%, the firm said. "With GEMs [graphics-enabled microprocessors] capable of generating the total graphics output of a PC, no additional graphics processor or add-in graphics card is needed," said Peter Lin, principal analyst for compute platforms at IHS. "Computers today are serving up ever-richer multimedia experiences, so the graphics capabilities of PCs have become more important, driving the rising penetration of GEMs."
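
As a quick back-of-the-envelope check (a sketch using only the percentages quoted above, in Python for illustration), the forecast implies roughly 10-11 percentage points of added gCPU penetration per year:

    # Illustrative arithmetic only, using the IHS iSuppli penetration
    # figures quoted above (2011 and 2014 shares per segment).
    shares = {"notebooks": (0.50, 0.83), "desktops": (0.45, 0.76)}
    years = 2014 - 2011

    for segment, (start, end) in shares.items():
        per_year = (end - start) / years * 100
        print(f"{segment}: +{per_year:.0f} percentage points per year")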

The obvious question is what the effect on discrete graphics cards will be, though AMD is unlikely to torpedo demand for its own products. IHS noted that "discrete graphics cards will remain the solution of choice for leading-edge graphics, providing high-end performance for applications such as games." As far as their graphics capability is concerned, GEMs are likely to be targeted mainly at mainstream and value PCs, IHS said.

Both AMD and Intel are positioning their gCPUs as a way to reduce the manufacturing cost of their chip solutions, as well as a way to reduce the influence of third-party graphics makers within their platform environments, since many users will perceive embedded graphics as good enough for their purposes. While Intel is relying on a single general gCPU approach, AMD is expected to release five application platforms with five GEM microprocessor categories.

Via is also part of the game, but its gCPU solutions cater to embedded and industrial applications, IHS iSuppli said.

Discuss
  • daygall, March 18, 2011 10:17 PM
    I realize I'm a gamer, but I can't see consumer numbers that high. True, the market share may be that high, but I can't see consumer numbers matching it exactly that way.

    Not until gCPUs are able to handle the same workload, that is.
  • rmmil978, March 18, 2011 10:24 PM
    Really shouldn't surprise anyone, since a gCPU is all you'd ever need to play Farmville or Plants vs. Zombies.
    I know that's a pretty typical spiteful PC gamer response, but when you really think about it, most gCPUs coming out this year are probably more powerful than an Xbox 360 or PS3 (graphics-wise probably, processor-wise definitely, considering those consoles are running five-year-old hardware). So games of console caliber should run reasonably well on a PC with a gCPU, which is just fine for most people. Sad, but true.
  • Thunderfox, March 18, 2011 10:29 PM
    Not surprising at all, considering all the mainstream parts are gCPUs. It's not about whether you want it or not; it's about whether you have a choice. And most people don't need more than that for what they do with a portable computer anyway.

    People who actually want to game on them will invest in something with a dedicated GPU, even though the CPU may have an unused graphics core built into it anyway.
  • pelov, March 18, 2011 10:36 PM
    The most intensive task my CPU does most of the time is gaming anyway. It'd be a good thing if the CPU's integrated GPU could pitch in alongside the discrete GPU; then you'd be seeing nice performance increases. There's already talk of this with the upcoming AMD processors.

    You could potentially have a 6850 CrossFire solution with a Llano/Zacate CPU providing an even bigger performance boost. It's just a matter of how well it works... it took them quite a while to get CrossFire right, so I can't imagine it'll be smooth sailing right out of the box.
  • agnickolov, March 19, 2011 2:56 AM
    Well, the integrated GPU used to be in the chipset; now it's in the CPU. Nothing has changed for those who use discrete graphics. All this article is really saying is that chipset-based graphics will practically disappear by 2014.
  • aftcomet, March 19, 2011 3:04 AM
    Quote (rmmil978): "Really shouldn't surprise anyone, since a gCPU is all you'd ever need to play Farmville or Plants vs. Zombies. [...] games of console caliber should be able to be played reasonably well on a PC running a gCPU, which is just fine for most people."

    Since when can integrated graphics play Crysis 2 on settings equal to consoles?
  • iam2thecrowe, March 19, 2011 3:55 AM
    Can someone tell me how this is any different from integrated graphics on the motherboard? Is it cheaper this way? Faster? More power efficient? Because it seems like the exact same thing, just moved closer to the CPU.
  • DavidC1, March 19, 2011 4:56 AM
    Quote (iam2thecrowe): "Is it cheaper this way? Faster? More power efficient?"

    All of the above.
  • jsc, March 19, 2011 12:58 PM
    This has been the trend since the early IBM PC days. The goal has been more and more integration.
  • Anonymous, March 19, 2011 1:17 PM
    What will be interesting to watch is how Intel handles the GPU refresh for its Sandy Bridge and Ivy Bridge CPUs. AMD has a continually evolving library of OpenCL-capable GPUs to draw on for future refreshes of Llano and Brazos. Integrated graphics makes Intel a GPU design house as well as a CPU design house, but one without a portfolio.

    AMD simply turns this year's star GPU into the APU.

    Designing GPUs does not come cheap, and AMD's library is already paid for. That should be a huge price or margin advantage for AMD.
  • shloader, March 19, 2011 1:49 PM
    Plus, don't forget that north bridge IGP access to memory is 32-bit (at least according to CPU-Z). I'm hoping that moving the IGP onto the CPU will widen that to 128-bit. Memory bandwidth will still be a severe limiting factor for modern gaming, since all modern cards pack GDDR5, but the problems some people hit with DXVA2 on an IGP when their CPU throttles down should be totally eliminated. (See the rough bandwidth math below.)
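
    A rough sketch of the bandwidth arithmetic at play here, assuming typical figures for the era (DDR3-1333 system memory, a 256-bit GDDR5 card at 4.0 GT/s; these clocks are assumptions, not numbers from the thread):

    # Peak theoretical memory bandwidth: (bus width in bytes) x (transfers/s).
    def bandwidth_gb_s(bus_width_bits, transfer_rate_mt_s):
        return (bus_width_bits / 8) * transfer_rate_mt_s * 1e6 / 1e9

    print(f"32-bit  DDR3-1333:      {bandwidth_gb_s(32, 1333):6.1f} GB/s")   # ~5.3
    print(f"128-bit DDR3-1333:      {bandwidth_gb_s(128, 1333):6.1f} GB/s")  # ~21.3
    print(f"256-bit GDDR5 4.0 GT/s: {bandwidth_gb_s(256, 4000):6.1f} GB/s")  # ~128.0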
  • danwat1234, March 19, 2011 6:30 PM
    gCPUs have a great energy-efficiency advantage... SLI laptops, for instance, can hopefully shut down both high-end cards when not gaming!
  • Anonymous, March 19, 2011 6:37 PM
    Good news and bad: Intel IGP sucks unless you're using it for video encoding. Also, AMD needs to release Llano. Brazos is cool, but that's for netbooks.
  • bystander, March 19, 2011 6:48 PM
    I do believe laptops will likely see this trend continue, but I have a hard time believing the desktop gCPU segment will grow this rapidly.
  • dalta centauri, March 20, 2011 3:03 AM
    Quote (aftcomet): "Since when can integrated graphics play Crysis 2 on settings equal to consoles?"

    The same integrated graphics from AMD that has shown it can handle games like Aliens vs. Predator under DX11. If the integrated graphics chip is capable of DX11, it already beats console graphics from a tech standpoint.
  • Anonymous, March 20, 2011 8:39 AM
    Contemporary integrated GPUs may beat console-grade graphics from a technical and specs standpoint, but the game will still not run better on a PC or laptop using them.
    Why?
    Because of porting and poor optimization for the PC.

    Ideally, any game made for consoles should be playable at maxed-out settings (console-level, that is) on a 2008 notebook with a mid-range discrete GPU, and yet numerous console ports require high-end PC hardware just to run at those settings (never mind higher ones).

    So the hardware is not the problem; it's the idiots who port games to PC.

    But where is the profit in any console port running well and looking good in the process? There isn't any... at least not for manufacturers of new desktops and laptops.

    It's all interconnected, to be honest.
    And seriously, unless a game can take full advantage of DX11, there will be no improvement in image quality between a DX10 and a DX11 game. You might notice some minor differences (to which you'd be oblivious 90% of the time), but not enough to justify the lag that occurs when the GPU has numerous computations to do at those settings (and for what? An obscure detail you probably won't be able to notice).
  • mateau, March 20, 2011 2:00 PM
    Sandy Bridge and AMD Fusion mean the end of Nvidia as we know it.
    Llano uses a one-year-old discrete GPU core design! If you look at the Radeon cores being used for AMD Llano, several things really stand out. The HD 6370 is a 7-watt core as a discrete GPU, so it probably draws much less as an APU. Clearly this is headed for laptops. These cores also have one other huge common denominator.
    Here's a link for Radeon releases: http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units
    They were all released in November 2010! If anybody does not believe that AMD Llano will eliminate the mid-price mass market for discrete GPUs, they really need to have a hard look at the facts.
    AMD is releasing the Radeon 6990 without quantity restrictions. Nvidia is releasing fewer than 1,000 GeForce 590s. The 590 is a cherry-picked dual-GPU board. It may equal or actually outperform the Radeon 6990, but what good is it if you can't buy it? Or is the market for bleeding-edge bragging rights also just not there?
    The mass market supports new GPU core development. Without the sales of millions of discrete GPUs for legacy upgrades, the next generation doesn't get designed, or if it does, without the prospect of mass sales volume it becomes a very expensive piece of silicon.
    A good example is the ATI FirePro and Nvidia Quadro brands. They simply do not have the mass sales volume to allow a purchase price below the $2,000-$3,000 point; the demand is simply not there. Product refreshes are also not as frequent as in the mass market, again due to demand.
    If AMD is using this year's top discrete GPU design for next year's Fusion APU, then the discrete GPU market is most certainly dead. Will there be a reason to upgrade a one-year-old Llano box with the latest discrete GPU? For what gain other than bragging rights? And what would discrete GPU demand look like going forward?
    The real question becomes: is that AMD's plan? And if so, how does Nvidia plan to keep the discrete market open? Does Nvidia license core designs to Intel?
    The other question is just what AMD plans to do with Bulldozer. It seems that Bulldozer will be the server, workstation, high-performance desktop and gamer's CPU. This is certainly not a mass-market CPU. In a server, obviously, graphics are not needed beyond a motherboard-integrated GPU. So there will be some demand for discrete GPU boards with Bulldozer.
    The next question becomes: when does AMD release Bulldozer with an on-die graphics core? Probably with Llano's replacement, the Trinity Fusion APU with 2nd-gen Bulldozer, because Bulldozer will be the only market left open for discrete GPUs. That would imply a Bulldozer development APU first.
    Of course, just how Intel intends to answer AMD will determine the future of Nvidia graphics. Arguably, Intel cannot compete with the AMD/ATI library. Every few months AMD releases new graphics silicon; they are continually evolving that product to meet present market demand. Intel is not a graphics design house, but now it has to be to keep its CPU business competitive. That means Intel is designing GPUs to penetrate a market that is owned 100% by AMD and Nvidia.
    AMD is now designing discrete GPUs with the intention of integrating that design on-die for an APU release ONE YEAR LATER! That has to be an optimized model, and how can Nvidia compete with that when it doesn't have the same insight into Intel's future architectures? Nvidia's only market will be on an Intel Inside box.
    Right now AMD is directing the future of CPU design. It has the edge over Intel with ownership of arguably the world's best graphics design portfolio and GPU design team, and it has the cost edge over Nvidia, since it simply sells a one-year-old core design on-die to millions of consumers as an APU. For Intel to remain competitive, it is forced into the same model, and this model shuts out Nvidia.
  • schmich, March 20, 2011 3:46 PM
    So technically what's the difference between gCPU and APU?
  • mateau, March 20, 2011 5:30 PM
    APU was coined by AMD to describe Fusion: a CPU with an on-die GPU. The author either does not know that or does not want to use AMD's nomenclature.

    What is NOT being said here is that Fusion and Intel Sandy Bridge will achieve 100% notebook penetration by next year. That is the end of Nvidia notebook graphics.

    When a notebook CPU has a Radeon HD 6550 on die, that is huge! Apple is using Radeon HD 6750M boards in the MacBook Pro now, and that GPU is only a few months newer. Apple has to go to Fusion.
  • Anonymous, March 21, 2011 1:17 AM
    So basically I will have to upgrade my CPU to get the latest graphics, paying for both the CPU and GPU at once. Very poor for desktop gamers; I upgrade my components at different times. It seems like choice and customization are slowly being driven out of this market. Of course the niche will always be there, but cost may drive most of us into the integrated market.