I've been a journalist/reviewer in the 3D graphics industry for over a decade. I can still remember walking through Fry's Electronics and seeing Western Digital's Paradise Tasmania 3D and actually getting excited about the Yamaha-powered graphics chip. Chris Angelini, the managing editor of Tom's Hardware US, and I go way back, with our first jobs in online journalism traced back to 3DGaming.com more than a decade ago.
Having been there from the beginning, I've seen the rise and fall of countless graphics manufacturers: S3, 3DLabs, Rendition, 3dfx, as well as board manufacturers like Orchid, STB, Hercules, the original Diamond, and Canopus. But as wild and crazy as the last decade was for visual computing, the next decade is going to be even more exciting, not only in what technology will offer to consumers, but in the upcoming arms race in visual computing.
A lot has been said about the impending death of the dedicated GPU. If you look at the history of dedicated upgrade products for consumer PC technology, they all eventually reach the point of diminishing returns and then integration. However, while it is inevitable that the dedicated GPU will eventually disappear, it’s not going to happen in the next decade.
Integration of computer technology only happens after quality and performance reach the point of diminishing returns. We can see evidence of this with sound cards, video processing, and even monitors.
What follows is a discussion on the future of 3D graphics. Is the GPU on its death bed? Will AMD, Intel, and Nvidia continue to be relevant? This is purely an opinion piece, but it is based on more than a decade of experience.
Disclosure: per FTC guidelines, I am required to disclose any potential conflicts of interest. I own no shares of any company discussed in this editorial. Additionally, although I have received free engineering samples from AMD, Intel, and Nvidia in the past for editorial purposes, I have not received any products from these companies in the last year.


I'm surprised that you've completely missed the console factor.
The reason why devs are not coding newer and more powerful games has nothing to do with budgetary constraints or the lack thereof. It is because they are coding for an Xbox 360 / PS3 baseline hardware spec that is stuck somewhere in the GeForce 7800 era. Remember, only 13% of COD:MW2 units were PC (and probably less as a % of sales, given PC ASPs are lower).
So your logic is flawed, or rather you have the wrong end of the stick. Because software titles with more complex graphics are not being created (because of the console baseline), newer and more powerful GPUs will not continue to be produced.
Or to put it in more practical terms: because the most graphically demanding title you can possibly get is now three years old (Crysis), Nvidia has been happy to churn out G92 respins based on a 2006 spec.
Until the next generation of consoles comes through, there is zero commercial incentive for a developer to build an AAA title which exploits the 13% of the market that has PCs (or the even smaller slice of that which has a modern graphics card). Which means you don't get phat new GPUs, QED.
And the problem is the console cycle seems to be elongating...
J
After several pages of technology mumbo jumbo jargon, that was a perfect closing statement. =)
Wicked article Alan. Sounds like you've had an interesting last decade indeed.
I'm hoping we all get to see another decade of constant change and improvement to technology as we know it.
Also interesting is that although you almost seemed to be attacking every company, you still managed to remain neutral.
Everyone has benefits and flaws, nice to see you mentioned them both for everybody.
Here's to another 10 years of success everyone!
Hardware is moving so fast that game developers just can't keep pace with it.
I've already suspected for a long time that video cards are going to surpass CPUs.
You already see it at the moment: video cards get cheaper, while CPUs, on the other hand, keep getting pricier for the relative performance.
In the past I had the problem that upgrading my video card pushed my CPU to the limit, and thus I wasn't using the full potential of the video card.
In my view we're at that point again: you buy a system, and if you upgrade your video card after a year or a year and a half, you're most likely pushing your CPU to its limits, at least in the high-end part of the market.
Of course in the lower regions these problems are smaller, but still, it "might" happen sooner than we think, especially if the Nvidia design is as astonishing as they say while at the same time the pace of CPU development slowly stalls.
and +1 jontseng
A typical animated feature: 90 minutes.
The math: 90 minutes = 5,400 seconds; at 25fps that is a total of 135,000 frames to render. At 5 hours of render time per frame, that totals 675,000 hours = 28,125 days - that's 77 years. Even in parallel, that means it would take a year with 77 supercomputers to do just one animated feature. And I know a Disney guy (in DVD extra content) said it took about 8 months to make an animated feature after the story was done (i.e. the animation itself) - I doubt Pixar is so much slower.
P.S. Assuming 29.97fps, it's over 90 years.
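The back-of-the-envelope numbers in this comment can be checked with a short script. The frame rate and the 5-hours-per-frame figure are the commenter's assumptions (the latter taken from the article); the 1,000-machine farm size is a hypothetical figure added purely for illustration:

```python
# Render-farm arithmetic from the comment above.
# Assumptions: 90-minute feature, 25 fps, 5 hours of render time per frame.
RUNTIME_MIN = 90
FPS = 25
HOURS_PER_FRAME = 5

frames = RUNTIME_MIN * 60 * FPS          # 135,000 frames
total_hours = frames * HOURS_PER_FRAME   # 675,000 machine-hours
total_days = total_hours / 24            # 28,125 days
total_years = total_days / 365           # ~77 years on a single machine

# Parallelized across N machines, wall-clock time divides by N.
# A 1,000-node farm (hypothetical size) finishes in about a month:
machines = 1000
wall_clock_days = total_days / machines  # ~28 days

print(frames, total_hours, round(total_years, 1), round(wall_clock_days, 1))
```

This is why the per-frame render times quoted in the article are only workable with large clusters: the serial total is measured in decades, but it parallelizes almost perfectly because every frame can be rendered independently.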
That is where render farms come in. Hundreds of computers in clusters churning out frames day and night.
What follows is a citation from page four of the actual article.
"With Pixar-level budgets come the potential for Pixar-level graphics (and Pixar-level characters and stories). Given that Pixar films still require 5 to 6 hours to render a single frame on large supercomputer clusters, the answer is no, graphics have not reached the point of diminishing returns yet."
In my book a cluster of supercomputers is the same as a render farm.
I don't own that movie, and I'm not even sure I've seen it. Is it the one with a modern yellow muscle car and a wimpy teen? If so, I may have.
Anyway - a day per frame simply can't be an average - not even 5 hours can! It would simply take too many years to make a movie, and I'm sure Pixar doesn't rent all the Blue Gene servers in the world, or half of Cray's hardware, just to make one movie?