
Talking Heads: VGA Manager Edition, September 2010

Future Opportunities For VGA Vendors

Question: As Intel and AMD integrate more functionality into their host processors, what opportunities remain for VGA vendors to add value or differentiate their products?

  • Host processors won’t enable a compelling 3D visual experience. A better gaming experience also requires a powerful discrete GPU.
  • 3D visual reality applications
  • Host processing power is never enough. Applications are always asking for more CPU and GPU power. We will add more fancy functions whenever the processing power is ready.
  • Integrated [graphics] does not offer enough power to play games.
  • Better performance, more unique features, our own custom-designed PCBs and coolers.
  • I think making sure the industry takes advantage of the parallel processing capability found in a discrete graphics card with new software will be the best way for VGA vendors to show the value of a more powerful discrete graphics card.
  • Performance and reliability will always be the key. Improved cooling technologies, PCB layout, and individual component selection all improve performance and reliability. Added value comes with pushing this quality into the mainstream and also by adding additional functionality on the graphics card--we already see this with different output connection options for media devices or multi-monitor support, for example, and we will continue to see other functions added to the basic graphics card.
  • Great parallel computing capability is the major advantage discrete graphics have. Providing a more interactive gaming experience such as Nvidia's PhysX or AI (in the future) is what VGA vendors should do in the future.
  • OC stability of a discrete card is better than CPU/GPU hybrid.
  • Using stable thermal solutions or passive solutions.
  • Share CPU loading to get better performance.
  • Introduce high quality and better performance for the game.
  • Basically, VGA still performs the unique function that a hybrid can't replace. As to the end user, VGA does have the "simple to replace" capability when the user feels the need to upgrade. They don't have to swap in a new processor; simply change a VGA card for access to the desired performance level.
  • Yes, we admit, the hybrid solution might affect low- and mid-range graphic cards, since this range is normally for the home/office user. But there are still users that mainly play games, and these are the users that VGA vendors will need to focus on.
  • For VGA vendors, we can add value to our graphics cards by giving the user a more refined product in the mid- to high-end market. Besides, hybrid motherboards still support external graphics cards, and this helps too. With customized products and specialized features, discrete graphics will still have a place in the market.

As far as value goes, we are a bit surprised that stereoscopic 3D wasn’t discussed more often, given its anticipated role in PC-to-TV output and 3D on the PC. Only two respondents explicitly mentioned 3D. The technology is still maturing, and may simply be in its infancy in much the same way HD output was back in the early 2000s. Nvidia (more so than AMD) seems to be pushing it hard. That may prove to be a boon or dead weight, depending on how the market moves forward. Stereoscopic 3D adoption for the home television still seems slow, and at the moment that has more to do with a cost premium tempering consumer interest. And if our own coverage of 3D Blu-ray has told us anything, it’s that the most vocal of our readers currently see the technology as gimmicky rather than something they’d use regularly.

AMD has its own specific challenges with its upcoming APU designs. Moving forward, the company needs to find a way to position its integrated graphics performance while striking the right balance for graphics card vendors. AMD cannot simply trade past Radeon users for APU adopters; it needs to keep its own customers and win over the competition’s. At the same time, AMD will need to lay out its long-term plans carefully, weighing the cost of alienating the company’s discrete board partners with a potentially superior APU-based graphics solution.

Since Nvidia still doesn’t have an x86 license for chipset production and has no plans for a CPU/GPU hybrid, it probably feels pressure on two fronts: the potential loss of its chipset business and the need to remain competitive in graphics. The company will have to sketch out a few more plays on the blackboard if its video card vendors are going to be up to the challenge in 2011 (losing BFG doesn’t help there, we imagine). Nvidia has a rare opportunity to deliver a superior feature set and set the stage for compelling competition against Intel and AMD. Our behind-the-scenes talks with a few Nvidia engineers make us excited, but the proof is in the pudding. We'll find out more in 2011 as the company's plans unfold.

What does this mean for third-party video card vendors? Their offerings in 2011 will not look much different than they did in 2010. We should expect more evolutionary design: improved performance and, as a byproduct, innovative cooling solutions. In the end, discrete graphics cards can still deliver a more complete feature set and better performance, while lowering the total cost of ownership as users take advantage of component modularity.

In our opinion, video card vendors need a more serious commitment to GPGPU advancements, ultimately becoming the leading evangelists to the software community. Unlike previous IGP solutions that may have been aimed at replacing discrete solutions, GPGPU is AMD and Nvidia’s way to reassure graphics card partners. It is their way out of the argument that their partners will eventually be made obsolete or marginalized. Video cards add serious computing power even to an IGP platform. So rather than representing competition or a conflict of interest, discrete solutions serve as a powerful performance upgrade, even for IGP platforms, as the industry moves forward.
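The workload that makes the GPGPU argument concrete is a data-parallel operation such as SAXPY (y = a·x + y), where every element is computed independently — exactly the shape of problem that CUDA or Stream spreads across a GPU's hundreds of cores. A minimal CPU-side sketch in Python/NumPy, purely for illustration (the array size and values here are arbitrary assumptions, not anything from the article):

```python
import numpy as np

# SAXPY: y = a*x + y, the canonical data-parallel GPGPU kernel.
# Every output element is independent of the others, so a GPU can
# compute them all in parallel instead of looping one at a time.
a = 2.0                                 # arbitrary scalar
x = np.arange(4, dtype=np.float32)      # [0, 1, 2, 3]
y = np.ones(4, dtype=np.float32)        # [1, 1, 1, 1]

y = a * x + y                           # one vectorized pass over all elements
print(y)                                # [1. 3. 5. 7.]
```

On a discrete GPU the same one-line expression becomes a kernel launched over the whole array at once, which is the computing power the vendors above are arguing software should exploit.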

In the next year, the underlying battle between graphics vendors won’t really change, except for maybe the weapons they use to fight. Hopefully, they will be very powerful weapons indeed. The game is on.

  • Who knows? By 2020, AMD will have purchased Nvidia and renamed GeForce to GeRadeon... and talk about integrating RAM, processor, graphics, and hard drive into a single chip named "MegaFusion"... But there will still be Apple selling the Apple TV without 1080p support, and yeah, free bumpers for your iPods (which won't play songs if touched by hands!!!)
  • Kelavarus
    That's kind of interesting. The guy talked about Nvidia taking chunks out of AMD's entrenched position this holiday with new Fermi offerings, but seemed to miss on the fact that most likely, by the holiday, AMD is going to already be starting to roll out their new line. Won't that have any effect on Nvidia?
  • TheStealthyOne
    I've been waiting for this article! Yes :D
  • The problem I see is while AMD, Intel, and Nvidia are all releasing impressive hardware, no company is making impressive software to take advantage of it. In the simplest example, we all have gigs of video RAM sitting around now, so why has no one written a program which allows it to be used when not doing 3d work, such as a RAM drive or pooling it in with system RAM? Similarly with GPUs, we were promised years ago that Physx would lead to amazing advances in AI and game realism, yet it simply hasn't appeared.

    The anger that people showed toward Vista and its horrible bloat should be directed at all the major software companies. None of them have achieved anything worthwhile in a very long time.
  • corser
    I do not think that including an IGP on the processor die and connecting them means that discrete graphics vendors are dead. Some people will have graphics requirements that overwhelm the IGP and will connect an 'EGP' (External Graphics Processor). Uhmmmm... maybe I created a whole new acronym.

    Since the start of that idea, I have believed that an IGP on the processor die could serve to offload math operations and complex transformations from the CPU to the IGP, freeing CPU cycles for what the CPU is intended to do.

    Many years ago, Apple did something similar with its Quadra models, which sported a dedicated DSP to offload some tasks from the processor.

    My personal view on all this hype is that we're moving to a different computing model: from a point where all the work was directed to the CPU, we have been taking small steps toward specialized processors around the CPU doing part of its work (think of the first fixed-function graphics accelerators, sound cards that off-load the CPU, PhysX, and others).

    From a standalone CPU -> SMP -> A-SMP (Asymmetric SMP).
  • silky salamandr
    I agree with Scort. We have all this firepower sitting on our desks, and it means nothing if there's no software to utilize it. While I love the advancement in technology, I really would like devs to catch up with the hardware side of things. I think everybody is going crazy adding more cores in an arms race for a marketing tick mark, but there are no devs stepping up to write for it. We have all invested so much money into what we love, but very few of us (not me at all) can actually code. With that being said, most of our machines are "held hostage" in what they can and cannot do.

    But great read.
  • corser
    Hardware has to arrive well before software starts to take advantage of it. It has been like this since the start of computing.
  • Darkerson
    Very informative article. I'm hoping to see more stuff like this down the line. Keep up the good work!
  • jestersage
    Awesome start for a very ambitious series. I hope we get more insights, and soon.

    I agree with Snort and silky salamandr: we are held back by developments on the software side. Maybe that's because developers need to take backwards compatibility into consideration. Take games, for example: developers would like to keep the minimum and even recommended specs down so they can get more customers to buy. So we see games built for top-end hardware that, through tweaks and reduced detail, can still be played on a six-year-old Pentium 4 with a 128MB AGP card.

    From a business standpoint, and given that I work for a company that still uses tens of thousands of Pentium 4s for productivity purposes, I figure that adoption of the GPU/CPU hybrid in the business space will not happen until 5-7 years AFTER launch. There is simply no need for an i3 if a Core 2-derivative Pentium Dual-Core or Athlon X2 can still handle spreadsheets, word processing, email, research, and so on. Pricing will definitely play into the timeline as the technology ages (or matures), but both companies will have to get the money to pay for all that R&D from somewhere, right?
  • smile9999
    Great article, btw. Out of all this, what I got is that the hybrid CPU/GPU model seems more of a gimmick than an actual game changer. The low-end market has a lot of players; IGPs are a major player in that field and they are great at it, and if that weren't enough, there are still Nvidia's and ATI's offerings. So I don't think it will really shake the water as much as they predict.