
CPU/GPU Hybrids And The Death Of The Graphics Business?

Talking Heads: VGA Manager Edition, September 2010
By Andrew Ku

Question: Will there ever come a time when integrated graphics make discrete graphics cards unnecessary?

  • There is still a lot of room for performance improvement in discrete graphics and 3D applications. As long as applications do not satisfy our need for realism, discrete graphics will still be necessary.
  • Discrete graphics cards will always have their place, especially in the high-end segment, whose power envelope will not fit into a CPU/GPU hybrid solution.
  • At the current stage, we don't think CPU/GPU hybrids can replace high-end discrete graphics. The architecture of CPU-based CPU/GPU hybrids cannot really compete with what discrete graphics can do.
  • The levels of integration necessary to deliver even today's high-end graphics performance, together with full CPU logic, are unrealistic for the foreseeable future. In any case, this is a moving target, as graphics performance continues to multiply year on year. There will always be specialist applications in design or animation, for example, where the highest levels of graphics performance are still not enough to deliver results in a short time, and multiple graphics arrays or even higher-performing GPUs are demanded.
  • Discrete graphics will always be necessary. The pace of development and power restrictions on integrated GPUs will always keep them a minimum of one generation behind. The average user will also see increased benefits from a discrete graphics card as more and more programs and operating systems take advantage of the parallel processing power of the GPU. I think these new applications, in addition to games of course, will continue to showcase the benefit of a graphics card. It is very similar to the reason "wireless" will never entirely replace "wired" networking: there is always more demand for bandwidth, and "wireless," or integrated graphics in this case, may be good enough...but will never fully support the latest generation.
  • Discrete graphics cards should still have better performance, and the end user should consider the total cost of the whole PC. A CPU/GPU hybrid solution should carry a higher cost premium than a normal one.


The outlier here really has us scratching our heads. While we were curious to see if anyone had an apocalyptic vision on the horizon, we honestly never expected an outright “yes.” The need for discrete graphics solutions harkens back to the very existence of these companies (or a very large portion of their business). That respondent didn’t provide details behind his answer, so it only raises more questions. While not a tier-one company, the respondent is still a large manufacturer/supplier in terms of worldwide unit sales. Does this mean it is planning to retreat from the graphics card market, either in the short term or the long term? Or does it see this as a very long-term technological shift at an unforeseeable date?

In our opinion, we don’t see the video card industry disappearing one, two, or even five years from now. One has to wonder a bit whether CPU/GPU hybrids are as much of a “game changer” as AMD and Intel hope them to be (or if they’re only game-changers for each company’s bottom line, as they cut costs via integration and maintain pricing by offering better performance). Even if they are, these folks have a valid point. The applications and demands made of graphics solutions will continue to multiply, most likely to a degree that outpaces the development of integrated graphics for some time to come.

Combine this with the desire to enable general-purpose computing on GPUs, and it is easy to see how these companies will exist over the long term. If you recall the emergence of 64-bit computing, Intel and AMD were both heavily invested in pushing adoption. Fast forward to the present day, and we still lack a concerted effort by the software development community to adopt 64-bit programming. There is still no 64-bit version of Firefox, and there is no ETA on a 64-bit Flash plug-in. While the benefits of 64-bit in these two scenarios may in fact be negligible, it shows how slowly the software community has moved relative to what today’s hardware provides. Only recently did Adobe update its suite of apps to support a 64-bit architecture, and we’ve already shown the effect to be massive.

If all of AMD’s and Intel’s wishes come to fruition, the video card industry will no doubt be shaken up. In a worst-case scenario, it seems likely that within a couple of years we will see the number of third-party board vendors contract (indeed, industry veteran BFG has already disappeared, and customers were told directly that the company wasn’t getting enough support from Nvidia). What doesn’t seem to be changing is the focus on the high-end graphics space. This market will never really be satisfied by integrated graphics.

This makes sense considering what one representative pointed out. “The pace of development and power restrictions on integrated GPUs will always keep them a minimum of one generation behind. The average user will also see increased benefits of a discrete graphics card as more and more programs/operating systems take advantage of the parallel processing power of the GPU.” With general-purpose GPU programming, there should still be a huge performance delta between a high-end discrete graphics card and a “high-end” IGP solution.
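
To make that parallel-processing point concrete, here is a minimal GPGPU sketch in CUDA C++. This is our own illustrative example, not code from any of the companies surveyed; the kernel name and launch parameters are arbitrary choices. Each element of the computation maps to its own GPU thread, which is exactly the kind of work where a discrete card’s hundreds of shader cores pull away from the handful in an IGP.

```
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread computes one element of y = a*x + y. A discrete GPU can
// keep thousands of these threads in flight at once; an IGP has far fewer
// execution units, which is where the performance delta comes from.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;                     // one million elements
    const size_t bytes = n * sizeof(float);

    // Host-side data.
    float *hx = new float[n], *hy = new float[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Copies living in video RAM.
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch 256 threads per block, with enough blocks to cover n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);              // expect 4.0

    cudaFree(dx); cudaFree(dy);
    delete[] hx; delete[] hy;
    return 0;
}
```

Compiled with nvcc, the launch spreads the work across however many cores the card has, so the same binary should simply get faster as you move up the GPU range without a single code change.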

If you talk to people in the motherboard industry, everyone is more confident in their own strengths after watching competitors exit the market amid what we might think of as the second tech bust. It was a wake-up call that many needed: focus on what really made brands stand out in the first place, namely overclockability, stability, feature set, pricing, quality of service, and so on.

In the short term, the only business really being threatened is the low-end discrete space. While there is volume here, there really isn’t much in it for the channel if you are looking at marginal profit. It is a decent revenue stream for tier-one and tier-two board vendors, simply because of the large volumes at lower contracted pricing. However, the real money is in the “sweet spot” from $125 to $175. If this means card makers are going to invest more time and money into making better cards in that range, everyone might be better off having IGP solutions take over the low end of the spectrum. The war for the low-end discrete space is going to drag out for at least a year and a half, because success depends on multiple factors (price, driver support, discretionary spending, economic climate, and so on), so there is ample time for all card makers to stay ahead of the curve.

Comments
  • 21 Hide
    Anonymous , September 3, 2010 6:40 AM
    Who knows? By 2020, AMD will have purchased Nvidia and renamed GeForce to "GeRadeon"... and will be talking about integrating the RAM, processor, graphics, and hard drive into a single chip named "MegaFusion"... But there will still be Apple selling Apple TV without 1080p support, and yeah, free bumpers for your iPods (which won't play songs if touched by hands!!!)
  • 20 Hide
    Kelavarus , September 3, 2010 6:50 AM
    That's kind of interesting. The guy talked about Nvidia taking chunks out of AMD's entrenched position this holiday with new Fermi offerings, but seemed to miss the fact that, most likely, by the holidays AMD will already be starting to roll out its new line. Won't that have any effect on Nvidia?
  • 2 Hide
    TheStealthyOne , September 3, 2010 6:58 AM
    I've been waiting for this article! Yes :D 
  • 23 Hide
    Anonymous , September 3, 2010 7:50 AM
    The problem I see is that while AMD, Intel, and Nvidia are all releasing impressive hardware, no company is making impressive software to take advantage of it. In the simplest example, we all have gigs of video RAM sitting around now, so why has no one written a program that allows it to be used when not doing 3D work, such as a RAM drive, or pools it in with system RAM? Similarly with GPUs: we were promised years ago that PhysX would lead to amazing advances in AI and game realism, yet it simply hasn't appeared.

    The anger that people showed toward Vista and its horrible bloat should be directed at all major software companies. None of them have achieved anything worthwhile in a very long time.
  • 3 Hide
    corser , September 3, 2010 9:21 AM
    I do not think that including an IGP on the processor die and connecting them means that discrete graphics vendors are dead. Some people will have graphics requirements that overwhelm the IGP, and they will connect an 'EGP' (External Graphics Processor). Hmmm... maybe I just created a whole new acronym.

    Since the start of that idea, I have believed that an IGP on the processor die could serve to offload math operations and complex transformations from the CPU, freeing CPU cycles for doing what it is intended to do.

    Many years ago, Apple did something similar with its Quadra models, which sported a dedicated DSP to offload some tasks from the processor.

    My personal view on all this hype is that we are moving to a different computing model: from a point where all the work was directed to the CPU, in small steps toward specialized processors around the CPU doing part of its work (think of the first fixed-function graphics accelerators, sound cards that offload the CPU, PhysX, and others).

    From a standalone CPU -> SMP -> A-SMP (Asymmetric SMP).
  • 1 Hide
    silky salamandr , September 3, 2010 9:30 AM
    I agree with Scort. We have all this firepower sitting on our desks, and it means nothing if there's no software to utilize it. While I love the advancement in technology, I really would like devs to catch up with the hardware side of things. I think everybody is going crazy adding more cores and having an arms race as a marketing tick mark, but there are no devs stepping up to write for it. We have all invested so much money into what we love, but very few of us (not me at all) can actually code. With that being said, most of our machines are "held hostage" in what they can and cannot do.

    But great read.
  • 2 Hide
    corser , September 3, 2010 9:55 AM
    Hardware has always arrived well before the software that takes advantage of it. It has been like this since the start of computing.
  • 7 Hide
    Darkerson , September 3, 2010 10:17 AM
    Very informative article. I'm hoping to see more stuff like this down the line. Keep up the good work!
  • 1 Hide
    jestersage , September 3, 2010 10:30 AM
    Awesome start to a very ambitious series. I hope we get more insights, and soon.

    I agree with Snort and silky salamandr: we are held back by developments on the software side. Maybe it is because developers need to take backwards compatibility into consideration. Take games, for example: developers would like to keep the minimum and even recommended specs down so that they can get more customers to buy. So we see games made tough for top-end hardware that, through tweaks and reduced detail, can still be played on a six-year-old Pentium 4 with a 128 MB AGP card.

    From a business consumer standpoint, given that I work for a company that still uses tens of thousands of Pentium 4s for productivity purposes, I figure that adoption of the GPU/CPU in the business space will not happen until 5-7 years AFTER launch. There is simply no need for a Core i3 if a Core 2-derived Pentium Dual-Core or Athlon X2 can still handle spreadsheets, word processing, email, research, and so on. Pricing will definitely play into the timeline as the technology ages (or matures), but both companies will have to get money to pay for all that R&D from somewhere, right?
  • -4 Hide
    smile9999 , September 3, 2010 12:01 PM
    Great article, btw. Out of all this, what I got is that the hybrid CPU/GPU model seems more of a gimmick than an actual game changer. The low-end market has a lot of players; IGPs are a major player in that field and are great at it, and if that weren't enough, there are still the Nvidia and ATI offerings. So I don't think it will really shake the water as much as they predict.
  • -1 Hide
    smile9999 , September 3, 2010 12:03 PM
    Quoting my comment above: by "Nvidia and ATI offerings," I meant the Nvidia and ATI dedicated low-end GPUs.
  • 2 Hide
    Onus , September 3, 2010 12:54 PM
    I also agree with scort. Part of the cause, however, I believe is nVidia's unwillingness to coexist. Whether it's their x86/chipset issues, or disallowing PhysX (minus 3rd party hacks) if there's an AMD card in the system, I suspect developers are averse to putting a lot of work into something that may not ever be standardized. I think GPGPU could give us the AI needed for vastly more immersive experiences, but it won't happen until the hardware vendors agree on (and/or accept) some standards.
  • 2 Hide
    NotYetRated , September 3, 2010 1:22 PM
    Death of the graphics card business? No. Way. Not for a long, long time. The sound card only began to die because modern processors are in no way fazed by the hit sound processing takes on the system. Sound cards have become almost obsolete for everyone except the audio enthusiast. (Though I am no enthusiast and still have an X-Fi...) We are nowhere near the point where CPUs can handle graphics without taking quite a hit. In fact, graphics cards themselves cannot keep up with the pace that software is pushing them to. I think it will be a long, long time before we begin to see the phase-out of dedicated graphics cards, at least for the gamers, engineers, and creative professionals out there... if we ever see the phase-out at all.
  • -2 Hide
    hundredislandsboy , September 3, 2010 1:43 PM
    If I can still get the same gaming experience, with the same visual quality and the same smooth, fast frame rates, from a CPU/GPU, then go ahead and pour huge investments into R&D to make this happen. Why? Because it's a win-win for everyone: users pay less, cases run cooler, electric bills get lower, etc., etc...
  • 2 Hide
    Aerobernardo , September 3, 2010 2:08 PM
    I have two ideas for this kind of topic:

    1 - Tom's starts a poll on which questions we want asked.
    2 - I kinda hoped to hear something about what's next. We know about NDAs, but it would be nice to hear whether AMD intends to adopt new features like a proprietary 3D solution, or what both AMD and Nvidia think a next-generation VGA should do (or which architecture it should have) to be successful from today's perspective.
  • 0 Hide
    LORD_ORION , September 3, 2010 2:43 PM
    If Nvidia can bring out the best $150 card by Black Friday, I think they will be fine.

    Basically, they need something that blows the 5770 out of the water and costs the same or less... because AMD will lower 5770 prices when that happens, so merely performing on par won't be enough.
  • 0 Hide
    rocky1234 , September 3, 2010 2:44 PM
    I do not think they will be getting rid of standalone graphics cards any time soon. The reason is that if they did, it would set performance back at least 10 to 12 years. Maybe by 2017 they will have the tech to do it, but right now it is not going to happen. Well, it could, I guess, if all of the save-the-planet people have their way; look at what the automakers are starting to do, rolling out four-bangers in mid-sized cars. GM just said that is what it plans to do, so I guess if enough pressure is applied, anything can happen. Which is too bad, because all it does is affect the consumers: we get less and pay more for it.
  • 4 Hide
    Moores Law , September 3, 2010 2:56 PM
    Nvidia better come up with something to counter the new AMD cards soon or this is going to get even uglier.
  • 1 Hide
    porksmuggler , September 3, 2010 2:57 PM
    Very good premise for an article, and an excellent write-up on the companies involved. I haven't read any articles from Andrew Ku before; is Tom's keeping the one guy who can spell hidden away? Beware, he may get criticism from the TL;DR crowd, though. Those were great questions, but let's be honest (*cough* NDAs), the answers were generic PR we already know. Without Ku's analysis, this series would sink, so thanks for putting the legwork into this.

    Oh, and the 100% pie chart for question 1 is too funny.
  • 3 Hide
    kilthas_th , September 3, 2010 3:13 PM
    High phone bill? If only there were some way to have our conversations or even conferences routed effectively and cheaply over the internet ;) 
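
One footnote on the top comment’s “use idle video RAM as a RAM drive” idea: the raw mechanism is already sketchable with GPGPU toolkits. Below is a minimal, hypothetical illustration in CUDA C++; VramStash, vram_stash, and vram_fetch are names we made up for this sketch, and a usable RAM drive would still need a block-device or filesystem driver on top, which no vendor has shipped and this code does not attempt.

```
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical sketch: park a host buffer in otherwise-idle video RAM and
// pull it back on demand. This is only the raw stash/fetch mechanism, not
// a real RAM drive (no block-device layer, no persistence across resets).
struct VramStash {
    void  *dev;   // allocation living in video RAM
    size_t size;
};

static bool vram_stash(VramStash *s, const void *data, size_t size)
{
    if (cudaMalloc(&s->dev, size) != cudaSuccess)
        return false;                       // no device or VRAM exhausted
    s->size = size;
    return cudaMemcpy(s->dev, data, size, cudaMemcpyHostToDevice) == cudaSuccess;
}

static bool vram_fetch(const VramStash *s, void *out)
{
    return cudaMemcpy(out, s->dev, s->size, cudaMemcpyDeviceToHost) == cudaSuccess;
}

int main()
{
    const char msg[] = "parked in video RAM";
    char back[sizeof msg] = {0};

    VramStash s;
    if (!vram_stash(&s, msg, sizeof msg)) {
        puts("no CUDA-capable card available");
        return 1;
    }
    vram_fetch(&s, back);
    printf("round trip: %s\n", back);       // expect the original string

    cudaFree(s.dev);
    return 0;
}
```

The practical catch is the PCIe bus: every stash and fetch crosses it, so VRAM-backed storage is fast relative to disk but far slower than real system RAM, which is probably why nobody has productized the idea.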