Opinion: AMD, Intel, And Nvidia In The Next Ten Years

AMD In The Next Decade

After an impressive Radeon HD 4000-series lineup, AMD has maintained its momentum with an equally impressive Radeon HD 5000-series.

One of the innovations AMD adopted in its GPU design strategy was the "small die" concept. Whereas Nvidia typically architects a flagship GPU and then tries to scale budget versions down by disabling processing cores, reducing clock speeds, or using fewer 64-bit memory interfaces, AMD elected to build around an optimized-die philosophy: develop the best mid-range GPU that can be built on a given manufacturing process, then pair two of those GPUs on a flagship card. So, while Nvidia has historically offered the highest-performing GPU in a given generation, the last two product cycles have seen AMD offering the best price/performance ratio.

ATI's (and now AMD's) approach to GPU design has always been gaming-focused yet conservative. While Nvidia was pushing 16/32-bit floating-point shader precision in its GeForce FX line-up, a feature still too slow to be useful in games, ATI stuck with fixed 24-bit shader precision in its Radeon 9700.
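
To put those shader formats in perspective, here is a minimal C sketch, assuming the commonly cited mantissa widths (10 bits for FP16, 16 bits for ATI's FP24, and 23 bits for FP32), that prints the rough relative error each one can resolve:

    #include <stdio.h>
    #include <math.h>

    /* Approximate "machine epsilon" for each shader format, derived only
     * from its mantissa width. This illustrates the precision gap between
     * the formats; it does not model actual GPU shader behavior. */
    int main(void)
    {
        const struct { const char *name; int mantissa_bits; } fmt[] = {
            { "FP16", 10 }, { "FP24", 16 }, { "FP32", 23 },
        };

        for (int i = 0; i < 3; i++)
            printf("%s: relative error ~ %g\n",
                   fmt[i].name, pow(2.0, -fmt[i].mantissa_bits));
        return 0;
    }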

We've seen the same "problems" with Nvidia's GT200 core (GeForce GTX 260/275/280/285/295) and its support for IEEE-754-compliant double-precision math. That capability is only just being introduced in the Radeon HD 5800-series (and is absent from the HD 5700-series). Nvidia's ambition will carry over to GF100, which many see as architected more for CUDA than for gaming.

What's important to recognize is that AMD has kept pace with the times. When hardware and software reached the point where FP32 shaders were needed, AMD had the feature available. Though Nvidia was first to market with IEEE-754-compliant double-precision math, now that this feature is growing in importance, the Radeon HD 5800-series makes the capability available as well. For the most part, Nvidia's first-to-market advantage on non-gaming features has not been a major driver of sales (more on this later).
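
To see why IEEE-754 double precision matters for compute workloads, consider a minimal C sketch (the loop count is arbitrary, chosen only to make the rounding visible): single-precision accumulation silently stops counting once the running sum reaches 2^24, while the double-precision sum stays exact.

    #include <stdio.h>

    int main(void)
    {
        float  sum_f = 0.0f;
        double sum_d = 0.0;

        /* Add 1.0 twenty million times. Once the float sum reaches 2^24
         * (16,777,216), adding 1.0f no longer changes it because the next
         * representable float is two units away; the double keeps counting. */
        for (long i = 0; i < 20000000; i++) {
            sum_f += 1.0f;
            sum_d += 1.0;
        }

        printf("float  sum: %.1f\n", sum_f);  /* 16777216.0 */
        printf("double sum: %.1f\n", sum_d);  /* 20000000.0 */
        return 0;
    }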

Looking Further Ahead

AMD's designs are leading toward the goal of Fusion. With Fusion, AMD plans to integrate traditional GPU technology into the CPU. Not only can this reduce latency while increasing bandwidth (faster), but the CPU and GPU can share a substantial set of resources (cheaper, less silicon). In much the same way that today's CPUs have floating-point units (FPUs), integrated components optimized for single- and double-precision math, rounding, and so on, AMD envisions a future in which the GPU is an integrated part of the CPU.

In the initial phase, AMD will most likely integrate existing CPU cores with GPU cores. In the same way the company built today's Radeon GPUs around multi-chip scalability, Fusion processors will scale by varying the number of processing elements. Down the line, the integration will likely become more thorough, with no clear boundary between the CPU and GPU portions of the chip. Instead, it'll be a CPU with an integrated FPU and "stream" cores, which, through a software driver, will act as a graphics chip. With proper power management, this promises excellent performance per watt. It will be possible to run a fully-capable desktop environment on the integrated GPU, saving battery life, while keeping on-demand access to a faster GPU for applications able to benefit from it.
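
That kind of on-demand routing already has a rough analogue in today's heterogeneous-compute APIs. As an illustrative sketch only (it assumes an OpenCL runtime and headers are installed, and the "routing" here is just a printout), a host program can enumerate the available compute devices and decide which one should get a given workload:

    #include <stdio.h>
    #include <CL/cl.h>

    /* List every OpenCL device on the system and report whether it is a GPU.
     * An application could use this information to send light desktop work
     * to an integrated GPU and heavy kernels to a discrete one. */
    int main(void)
    {
        cl_platform_id platforms[8];
        cl_uint num_platforms = 0;
        clGetPlatformIDs(8, platforms, &num_platforms);

        for (cl_uint p = 0; p < num_platforms; p++) {
            cl_device_id devices[8];
            cl_uint num_devices = 0;
            clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8,
                           devices, &num_devices);

            for (cl_uint d = 0; d < num_devices; d++) {
                char name[256];
                cl_device_type type;
                clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                                sizeof(name), name, NULL);
                clGetDeviceInfo(devices[d], CL_DEVICE_TYPE,
                                sizeof(type), &type, NULL);
                printf("%s: %s\n", name,
                       (type & CL_DEVICE_TYPE_GPU) ? "GPU" : "CPU/other");
            }
        }
        return 0;
    }

In a Fusion-style system, the integrated and discrete GPUs would simply show up as two entries in that list, and the driver or application could pick between them per workload.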

Nvidia tried something similar with its Hybrid SLI technology, but it did not work for the enthusiast market. While Hybrid SLI (GeForce Boost) let you split the 3D workload between the integrated GPU and the GPU on an add-in card, it was useless with a flagship GPU: the latency and processing overhead required to split the workload outweighed the performance benefit. AMD has a better chance of success with Fusion, thanks to the lower latency of the integrated component.

Whereas Fusion is about integrating graphics technology into the CPU, Torrenza is about providing direct-to-CPU connectivity via HyperTransport or a similarly advanced, high-bandwidth, low-latency interconnect protocol.

Direct-to-CPU add-ons are actually already shipping in niche markets. For example, you can drop an XtremeData XDI device into a standard AMD Opteron socket. It features two Altera Stratix II FPGAs with direct access to the primary Opteron CPU and its RAM. Even though these devices may not match the raw GFLOPS of an eight-core CPU or a GPGPU setup, their real-world results on compute-intensive algorithmic applications, such as financial market data analysis, data encryption, or military radar processing, are considerably better in both raw performance and performance per watt. You can also get HTX cards with similar FPGAs, or multi-port 10 GigE network adapters, with direct HyperTransport links to the CPU.

AMD's Outlook

AMD's graphics division has demonstrated its ability to compete at the top of the market. Its Radeon HD 4000-series cards offered better price/performance than Nvidia's GeForce GTX (GT200) line-up, and the Radeon HD 5000-series remains the uncontested DirectX 11 champion.

AMD's CPU division has also demonstrated an ability to design high-performance multi-core CPUs, from the Athlon 64 X2 through today's six-core, server-oriented Opterons. Throughout the last decade, AMD and ATI have each been either the performance leader or a close second. Entering the next ten years, AMD is the only company with the technical resources and a consistent track record in both CPU and GPU technology. When AMD talks about Fusion, one cannot help but believe that the company can succeed. We have high expectations for AMD in the upcoming decade.

  • anamaniac
    Alan Dang: "And games will look pretty sweet, too. At least, that's the way I see it."

    After several pages of technology mumbo jumbo jargon, that was a perfect closing statement. =)

    Wicked article Alan. Sounds like you've had an interesting last decade indeed.
    I'm hoping we all get to see another decade of constant change and improvement to technology as we know it.

    Also interesting is that even though you almost seemed to be attacking every company, you still managed to remain neutral.
    Everyone has benefits and flaws, nice to see you mentioned them both for everybody.

    Here's to another 10 years of success everyone!
  • False_Dmitry_II
    I want to read this again in 10 years just to see the results...
  • " Simply put, software development has not been moving as fast as hardware growth. While hardware manufacturers have to make faster and faster products to stay in business, software developers have to sell more and more games"

    Hardware is moving so fast that game developers just can't keep pace with it.
  • Ikke_Niels
    What I miss in the article is the following (well it's partly told):

    I've already suspected for a long time that video cards are going to surpass CPUs.
    You already see it at the moment: video cards get cheaper, while CPUs keep getting pricier for the relative performance.

    In the past I had the problem that upgrading my video card pushed my CPU to the limit, and thus I wasn't using the full potential of the video card.

    In my view we're at that point again: you buy a system, and if you upgrade your video card after a year or a year and a half, you're most likely pushing your CPU to its limits, at least in the high-end part of the market.

    Of course, in the lower regions these problems are smaller, but still, it "might" happen sooner than we think, especially if the Nvidia design is as astonishing as they say and, at the same time, the major development of CPUs slowly breaks up.


  • sarsoft
    Nice article. Good read....
  • lashton
    One of the most interesting and informative articles from Tom's Hardware. What about another story about the smaller players, like the Intel Atom and VLIW chips and so on?
  • JeanLuc
    Out of all three companies, Nvidia is the one facing the most threats. It may have a lead in the GPGPU arena, but that's rather a niche market compared to consumer entertainment, wouldn't you say? Nvidia is also facing problems at the low end of the market, with Intel now supplying integrated video on its CPUs, which makes low-end video cards practically redundant, and no doubt AMD will be supplying a similar product with Fusion at some point in the near future.
  • jontseng
    "This means that we haven't reached the plateau in 'subjective experience' either. Newer and more powerful GPUs will continue to be produced as software titles with more complex graphics are created. Only when this plateau is reached will sales of dedicated graphics chips begin to decline."
    I'm surprised that you've completely missed the console factor.

    The reason why devs are not coding newer and more powerful games has nothing to do with budgetary constraints or the lack thereof. It is because they are coding to an Xbox 360 / PS3 baseline hardware spec that is stuck somewhere in the GeForce 7800 era. Remember, only 13% of COD:MW2 units were PC (and probably less as a % of sales, given PC ASPs are lower).

    So your logic is flawed, or rather you have the wrong end of the stick. Because software titles with more complex graphics are not being created (because of the console baseline), newer and more powerful GPUs will not continue to be produced.

    Or to put it in more practical terms: because the most graphically demanding title you can possibly get is now three years old (Crysis), Nvidia has been happy to churn out G92 respins based on a 2006 spec.

    Until the next generation of consoles comes through, there is zero commercial incentive for a developer to build an AAA title that exploits the 13% of the market that has PCs (or the even smaller slice of that which has a modern graphics card). Which means you don't get phat new GPUs, QED.

    And the problem is the console cycle seems to be elongating...

    J
  • Swindez95
    I agree with jontseng above ^. I've already made a point of this a couple of times. We will not see an increase in graphics intensity until the next generation of consoles comes out, simply because consoles are where the majority of game sales are. And as stated above, developers are simply coding games and graphics for much older and less powerful hardware than the PC currently has available, due to these last-generation consoles still being the most popular venue for consumers.
  • Swindez95
    Oh, and very good article btw, definitely enjoyed reading it!