Opinion: AMD, Intel, And Nvidia In The Next Ten Years

Can Software Developers Keep Up?

What follows is a very long-winded way of saying: the video game market is large enough and profitable enough to support the development of titles with Pixar-level budgets of $150 to $175 million. Seriously. It’ll be very long-winded.

In order to keep up with hardware, software developers need huge development budgets. Today’s "quadruple A" games, such as Call of Duty: Modern Warfare 2, Halo 3, and Gran Turismo 5, have stratospheric development budgets of $50 to $60 million. Grand Theft Auto IV has even been reported as having a development budget of $100 million. With marketing costs factored in, CoD: MW2's budget is a reported $200 million!

Source: selectstartgames.files.wordpress.com

Ambitious "triple A" games built to establish a new franchise, such as BioShock or Mirror's Edge, have budgets in the $20 to $30 million range. These games have the potential to grow into major new franchises, and their budgets reflect that.

For 2K Games, BioShock was a great investment. EA's gamble, by contrast, produced a critically acclaimed game with a wonderful visual design and concept, but Mirror's Edge was a financial disappointment. If you haven't played it, you really owe it to yourself to try it at today’s bargain-bin price.

Source: www1.on-mirrors-edge.com

Historically, software development budgets have increased by an order of magnitude with each console generation. In the Super Nintendo/Genesis era, a big budget game was in the range of $300,000. By the GameCube/PS2 era, games were in the $10 million range with top-tier games like Final Fantasy warranting a $40 million budget. If this trend continues, we will be seeing a $600 million budget for Gran Turismo 7 on the PlayStation 4, a $6 billion budget for Gran Turismo 9 on the PlayStation 5, and a $60 billion budget for Gran Turismo X.  Obviously, this trend cannot be extrapolated forever.
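The extrapolation above is just repeated multiplication. As a quick sanity check, this toy Python loop reproduces those headline numbers; the ~$60 million starting point and the 10x-per-generation growth rate are the article's rough figures, not industry data:

```python
# Toy extrapolation of the "order of magnitude per console generation"
# budget trend -- the trend the article argues cannot continue.
budget = 60_000_000  # today's quadruple-A budget (the article's rough figure)
for console in ("PlayStation 4", "PlayStation 5", "PlayStation 6"):
    budget *= 10  # one order of magnitude per generation
    print(f"{console}: ${budget / 1e6:,.0f} million")
```

Running it yields $600 million, $6,000 million, and $60,000 million, matching the Gran Turismo figures above and making the absurdity of the trend explicit.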

Source: z.about.com

The exponential growth of software development budgets over the last two decades has indeed been a reflection of increased hardware performance. But it was also reflective of the expanding market size and the resulting increase in potential profits. The financier of a game isn’t budgeting the game "so that it looks cool." Rather, they’re considering the risk/reward of the title, and they are doing that across an entire portfolio of software titles. For every Guitar Hero or Madden NFL, there will be a Mirror’s Edge or Duke Nukem Forever. The sequels that reviewers like to criticize for being "more of the same" subsidize the budget for more ambitious and creative games.

The process of developing and selling games is similar to that of developing a Hollywood blockbuster. The per-unit economics differ, though: a game costs $50 to $60 while a movie ticket runs around $10, but a household buys a single copy of a game to share, whereas every person in the theater has to pay for a ticket.
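To see how those per-unit prices trade off, here is a back-of-the-envelope sketch using the article's round numbers (roughly $55 per game copy, $10 per ticket); the household sizes are illustrative assumptions:

```python
# Revenue per household: one game copy vs. one theater ticket per viewer.
# Prices are the article's round figures, not actual sales data.
GAME_PRICE = 55    # one copy serves the whole household
TICKET_PRICE = 10  # every viewer buys a ticket

for viewers in (1, 2, 4, 6):
    theater = TICKET_PRICE * viewers
    winner = "game" if GAME_PRICE > theater else "theater"
    print(f"{viewers} viewers: game ${GAME_PRICE} vs. theater ${theater} -> {winner}")
```

Under these assumptions a single game sale out-earns a theater outing for any household smaller than about six people.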

As a general rule, successful effects-driven Hollywood movies with budgets in the $150-175 million range bring in approximately 4x to 5x their budget in worldwide ticket sales. These are movies like Transformers, Casino Royale, and Pirates of the Caribbean. Successful lower-budget story-driven films can have a widely varying rate of return, starting from 3x for The Departed, 6x for 500 Days of Summer, 10x for Twilight, and 25x for Slumdog Millionaire. Movies like Children of Men can end up making less in ticket sales than their budget (10% deficit).

| Title | Estimated Budget (source: IMDbPro) | Worldwide Ticket Sales (source: IMDbPro) |
| --- | --- | --- |
| Transformers | $150 million | $701 million |
| The Dark Knight | $185 million | $1 billion |
| Ratatouille | $150 million | $616 million |
| Casino Royale | $150 million | $588 million |
| Pirates of the Caribbean: Curse of the Black Pearl | $140 million | $653 million |
| Pirates of the Caribbean: Dead Man's Chest | $225 million | $1.06 billion |
| Pirates of the Caribbean: At World's End | $300 million | $958 million |
| Iron Man | $140 million | $572 million |
| Star Trek (2009) | $150 million | $383 million |
| The Departed | $90 million | $289 million |
| (500) Days of Summer | $7.5 million | $55.3 million |
| Twilight | $37 million | $351 million |
| Slumdog Millionaire | $15 million | $362 million |
| Children of Men | $76 million | $68.3 million |
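The return multiples discussed above can be recomputed directly from the table. This short sketch uses only the table's figures (in millions of dollars, per IMDbPro) and shows the pattern: the effects-driven blockbusters cluster near 4x-5x, while the lower-budget films swing much more widely:

```python
# Return multiple = worldwide ticket sales / estimated budget,
# using a subset of the table's figures (millions of dollars).
films = {
    "Transformers": (150, 701),
    "Casino Royale": (150, 588),
    "The Departed": (90, 289),
    "(500) Days of Summer": (7.5, 55.3),
    "Twilight": (37, 351),
    "Slumdog Millionaire": (15, 362),
    "Children of Men": (76, 68.3),
}
for title, (budget, sales) in films.items():
    print(f"{title}: {sales / budget:.1f}x its budget")
```

Note that Children of Men comes out below 1x, matching the roughly 10% deficit mentioned in the text.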


You cannot use these numbers to talk about the actual profits that a company brings in, due to things like DVD/Blu-ray sales, broadcast rights, and marketing costs. But these numbers show that the risks a financier is willing to take are not shouldered arbitrarily. When committing nine figures before the first ticket is sold, the financier has to be sure of what he is doing in the long run. Just because several movies have reached the $1 billion mark in ticket sales doesn’t mean that you’ll see a $500 million pre-marketing development budget anytime soon. Likewise, a financier may be more comfortable committing eight figures toward a movie, knowing that losses will be smaller on a failure and the return multiple can be far bigger on a hit.

Financiers of games follow similar patterns. If you look at successful "big budget" games like BioShock, GTA IV, Halo 3, and Final Fantasy XII, sales seem to run 4x to 6x the budget (ignoring marketing costs). Again, that 4x to 6x estimate doesn’t say anything about what these games really bring back; it reflects the fact that successful "big budget" games ranging from $20 million to $100 million end up selling a number of copies roughly proportional to the investment. The financiers of games try to follow a similar unwritten risk profile.

Industry pundits will tell you that the video game industry is comparable in total financial size to the motion picture industry, if not larger. That’s not entirely true. In 2008, Media Control GfK International reported that global sales of video games reached $32 billion--more than the $29.8 billion amassed in DVD/Blu-ray sales.

The catch is that box office ticket sales accounted for another $28.1 billion. We haven’t even begun to touch on broadcast rights.

Moreover, the success of a Hollywood film is more predictable than that of a video game, thanks to those multiple revenue streams and a better understanding of the market. So, while game development budgets will continue to grow, it’s hard to imagine games reaching the same budget levels as Hollywood films. The risks are higher in game development, and therefore, for any given budget, financiers will want a higher return. Over the next ten years, I anticipate quadruple-A games having budgets in the region of $175 million, and ambitious games introducing new intellectual property having budgets of $60 to $80 million.

With Pixar-level budgets comes the potential for Pixar-level graphics (and Pixar-level characters and stories). And given that Pixar films still require 5 to 6 hours to render a single frame on large supercomputer clusters, graphics clearly have not reached the point of diminishing returns yet.
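To get a feel for the scale behind that per-frame figure, here is a rough estimate; the frame rate, runtime, and farm size are illustrative assumptions, not Pixar's actual numbers. The key is that frames render in parallel across a farm, which is what makes a five-plus-hour frame time workable:

```python
# Rough scale of the render job implied by "5 to 6 hours per frame".
# All constants below are illustrative assumptions.
FPS = 24                # theatrical frame rate
RUNTIME_MIN = 95        # a typical feature-length runtime
HOURS_PER_FRAME = 5.5   # midpoint of the article's 5-6 hour figure
FARM_NODES = 2000       # hypothetical cluster size

frames = RUNTIME_MIN * 60 * FPS
total_hours = frames * HOURS_PER_FRAME
print(f"{frames:,} frames x {HOURS_PER_FRAME} h = {total_hours:,.0f} machine-hours")
print(f"~{total_hours / (FARM_NODES * 24):.0f} days on a {FARM_NODES:,}-node farm")
```

Sequentially, that workload would take decades; spread across a couple thousand machines it fits in a production schedule, which is exactly why the work runs on clusters rather than a single box.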

This means that we haven’t reached the plateau in "subjective experience" either. Newer and more powerful GPUs will continue to be produced as software titles with more complex graphics are created. Only when this plateau is reached will sales of dedicated graphics chips begin to decline.

In addition, unlike the sound card world, in which the plateau was reached with a relatively small number of transistors, allowing rapid integration onto motherboards, GPUs are still pushing the limits of process technology and require massive cooling and power. That is, you could manufacture a CPU and GPU on a single physical package, but you’re going to have a heck of a time cooling that device and delivering the appropriate amount of power (Ed.: Intel's Clarkdale-based Core i5 and Core i3 are a perfect example of the limits of CPU/GPU integration today).

We’ve also only looked at graphics as the driving force for development. GPUs can be used to accelerate large, complex parallel mathematics. This is the other frontier for manufacturers like AMD, Intel, and Nvidia.

The software world continues to provide ripe opportunities for this merry trio, and allows them to sustain substantial R&D efforts. The state of the graphics industry is strong.

    Top Comments
  • jontseng
    This means that we haven’t reached the plateau in "subjective experience" either. Newer and more powerful GPUs will continue to be produced as software titles with more complex graphics are created. Only when this plateau is reached will sales of dedicated graphics chips begin to decline.

    I'm surprised that you've completely missed the console factor.

    The reason why devs are not coding newer and more powerful games is nothing to do with budgetary constraints or lack thereof. It is because they are coding for an XBox360 / PS3 baseline hardware spec that is stuck somewhere in the GeForce 7800 era. Remember only 13% of COD:MW2 units were PC (and probably less as a % sales given PC ASPs are lower).

    So your logic is flawed, or rather you have the wrong end of the stick. Because software titles with more complex graphics are not being created (because of the console baseline), newer and more powerful GPUs will not continue to be produced.

    Or to put it in more practical terms, because the most graphically demanding title you can possibly get is now three years old (Crysis), then NVidia has been happy to churn out G92 respins based on a 2006 spec.

    Until the next generation of consoles comes through there is zero commercial incentive for a developer to build a AAA title which exploits the 13% of the market that has PCs (or the even smaller bit of that has a modern graphics card). Which means you don't get phat new GPUs, QED.

    And the problem is the console cycle seems to be elongating...

    J
    33
  • False_Dmitry_II
    I want to read this again in 10 years just to see the results...
    31
  • anamaniac
    Alan Dang: "And games will look pretty sweet, too. At least, that’s the way I see it."

    After several pages of technology mumbo jumbo jargon, that was a perfect closing statement. =)

    Wicked article Alan. Sounds like you've had an interesting last decade indeed.
    I'm hoping we all get to see another decade of constant change and improvement to technology as we know it.

    Also interesting is that while you almost seemed to be attacking every company, you still managed to remain neutral.
    Everyone has benefits and flaws; nice to see you mentioned both for everybody.

    Here's to another 10 years of success everyone!
    22
  • Other Comments
  • Anonymous
    " Simply put, software development has not been moving as fast as hardware growth. While hardware manufacturers have to make faster and faster products to stay in business, software developers have to sell more and more games"

    Hardware is moving so fast that game developers just can't keep pace with it.
    1
  • Ikke_Niels
    What I miss in the article is the following (well it's partly told):

    I've already suspected for a long time that video cards are going to surpass CPUs.
    You already see it at the moment: video cards get cheaper, while CPUs keep getting pricier for the relative performance.

    In the past I had the problem of upgrading my video card, but with that pushing my CPU to the limit and thus not using the full potential of the video card.

    In my view we're at that point again: you buy a system, and if you upgrade your video card after a year or a year and a half, you're most likely pushing your CPU to its limits, at least in the high-end part of the market.

    Of course in the lower regions these problems are smaller, but still, it "might" happen sooner than we think, especially if the Nvidia design is as astonishing as they say while major CPU development slows down at the same time.
    -2
  • sarsoft
    Nice article. Good read....
    17
  • lashton
    one of the most interesting and informative articles from Tom's Hardware; what about another story about the smaller players, like Intel Atom and VLIW chips and so on
    7
  • JeanLuc
    Out of all 3 companies, Nvidia is the one facing the most threats. It may have a lead in the GPGPU arena, but that's rather a niche market compared to consumer entertainment, wouldn't you say? Nvidia is also facing problems at the low end of the market, with Intel now supplying integrated video on their CPUs, which makes low-end video cards practically redundant, and no doubt AMD will be supplying a similar product with Fusion at some point in the near future.
    6
  • Swindez95
    I agree with jontseng above ^. I've already made a point of this a couple of times. We will not see an increase in graphics intensity until the next generation of consoles come out simply because consoles is where the majority of games sales are. And as stated above developers are simply coding games and graphics for use on much older and less powerful hardware than the PC has available to it currently due to these last generation consoles still being the most popular venue for consumers.
    11
  • Swindez95
    Oh, and very good article btw, definitely enjoyed reading it!
    5
  • 1898
    Without much doubt, Nvidia is working on a x86 CPU simply because their life depends on it.
    and +1 jontseng
    2
  • mfarrukh
    I hope all of this is for the betterment of mankind
    -5
  • neiroatopelcc
    Page 4 says 5-6 hours to render a frame? Can't be true, really...
    a typical animated feature: 90 minutes
    the math: 90 minutes = 5,400 seconds; @ 25fps that is a total of 135,000 frames to render; at 5 hours per frame that totals 675,000 hours = 28,125 days; that's 77 years. Even in parallel, that means it would take a year with 77 supercomputers to do just one render of each frame; and I know a Disney guy (DVD extra content) said it took about 8 months to make an animated feature after the story was done (i.e. animation); doubt Pixar is so much slower.
    ps. assuming 29.97fps it's over 90 years
    0
  • Tohos
    @ neiroatopelcc
    That is where render farms come in. Hundreds of computer clusters churning out frames day and night.
    12
  • mfarrukh
    Excellent read. Thorough and great Experience speaking
    3
  • climber
    If you want to get a feel for how long it takes to render a frame in a modern movie. Check out the extended features content on the first Transformers movie. It's mentioned that it takes ~24 hrs to render a frame of movie footage with five transformers in it at full screen size. I might have some of the numbers slightly off, but it's serious computational time folks.
    4
  • neiroatopelcc
    Tohos: "@neiroatopelcc That is where render farms come in. Hundreds of computer clusters churning out frames day and night."

    What follows is a citation from the actual article page four.
    "With Pixar-level budgets come the potential for Pixar-level graphics (and Pixar-level characters and stories). Given that Pixar films still require 5 to 6 hours to render a single frame on large supercomputer clusters, the answer is no, graphics have not reached the point of diminishing returns yet."
    In my book a cluster of supercomputers is the same as a render farm.
    climber: "If you want to get a feel for how long it takes to render a frame in a modern movie, check out the extended features content on the first Transformers movie. It's mentioned that it takes ~24 hrs to render a frame of movie footage with five transformers in it at full screen size. I might have some of the numbers slightly off, but it's serious computational time folks."

    I don't own that movie, and I'm not even sure I've seen it. Is it the one with a modern yellow muscle car and a wimpy teen? If so, I may have.
    Anyway, a day per frame simply can't be an average; not even 5 hours can! It would simply take too many years to make a movie, and I'm sure Pixar doesn't rent all the Blue Gene servers in the world or half of Cray's hardware just to make one movie?
    -5
  • Anonymous
    That's what the writer was getting at. These 'supercomputers' are nothing but render farms. And while they don't take hours to render a frame, they do take a significant amount of time. Then again one must consider that these frames are 4096×2160 each, and you are not only doing raster operations, including high levels of AA, but also carrying out physics calculations on almost all of the contents of the scene. This is waaaay more than any GPU can hope to do right now. That is why you are seeing these scenes rendered on farms consisting of general-purpose CPUs. They can compute anything required for the rendering of the scene, and they are easy to assign tasks to over the network with existing software. I doubt scheduling rendering chores to CPUs AND GPUs on each of the nodes of the farm would work as well with current hardware and software.
    7
  • killerclick
    I'm happy I get to keep my 8800GT for another year. :)
    0
  • memeroot
    One of the great things about render farms is that they can render more than one image at a time; one of the sad things is that if you allocated the whole farm to a single image, it wouldn't be any faster.
    4