
Nvidia: The Future of Graphics Processing

Source: Tom's Hardware US | 47 comments

Nvidia's Tony Tamasi took ECGC 2011 attendees on a trip to the past, to the present, and to the future of both GPU rendering and mobile graphics.

During ECGC 2011, Nvidia senior vice president of content and technology Tony Tamasi made a startling prediction in his keynote presentation, "The Future of Graphics Processing." He claimed that GPU performance will increase 1,000 percent by 2015, allowing graphics cards to generate real-time ray tracing and procedurally generated smoke at 30 to 60 frames per second.

To put this into perspective, Nvidia's latest GPU can churn out the same photo-realistic graphics at 2 frames per second. Obviously that's not practical for gamers at this point. But for digital artists and product and automobile designers, this is a virtual holy grail. Gone are the days of making simple changes and then waiting an hour or two for the image to be redrawn. Instead, it could take mere seconds, depending on the artwork's complexity. In an FPS environment, though, one or two frames per second isn't even worth a glance.

To back up his claim, Tamasi presented a timeline of how the GPU has progressed since the days of GLQuake, using screenshots of several games (Quake 2, Call of Duty, Battlefield 3) to represent stages in the evolution. At the same time, he detailed hardware features that have been added along the way, including transform and lighting, programmable shading and so on.

But he also threw up a chart on the big screen that listed GPU specs from 2007, 2011 and 2015. In 2007, GPUs featured a texture performance of 12.3 giga-texels per second (GT/s), an antialiasing performance of 10.3 giga-samples per second (GS/s), a memory bandwidth of 63.4 gigabytes per second (GB/s), geometry throughput of 0.3 giga-triangles per second (Gtri/s), and floating-point performance of 228 gigaflops (Gflop/s). In 2011, Nvidia's latest GPU features a texture performance of 84.5 GT/s, an antialiasing performance of 37.0 GS/s, a memory bandwidth of 192.4 GB/s, geometry throughput of 3.1 Gtri/s, and floating-point performance of 2703 Gflop/s.

Now here's the kicker. Based on the compound annual growth rates between 2007 and 2011 (1.94, 1.56, 1.47, 2.34 and 2.35, respectively), Nvidia predicts that a 2015 GPU will feature a texture performance of 579.7 GT/s, an antialiasing performance of 133.8 GS/s, and a memory bandwidth of 584.1 GB/s. Geometry throughput will be at a staggering 37.2 Gtri/s, and floating-point performance will be up to 32039.8 Gflop/s.
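Tamasi's 2015 numbers can be sanity-checked: each projection corresponds almost exactly to applying the 2007-to-2011 growth factor over a second four-year span. A minimal Python sketch, with variable names of my own choosing and the spec values transcribed from the slide as quoted above:

```python
# Spec values from Tamasi's chart, as quoted in the article.
specs_2007 = {"texture_GTs": 12.3, "aa_GSs": 10.3, "mem_GBs": 63.4,
              "geom_Gtris": 0.3, "flops_Gflops": 228.0}
specs_2011 = {"texture_GTs": 84.5, "aa_GSs": 37.0, "mem_GBs": 192.4,
              "geom_Gtris": 3.1, "flops_Gflops": 2703.0}

def project_2015(v07, v11):
    """Apply the 2007->2011 growth factor once more to estimate 2015."""
    return v11 * (v11 / v07)

projected = {k: project_2015(specs_2007[k], specs_2011[k]) for k in specs_2007}
for metric, value in projected.items():
    print(f"{metric}: {value:.1f}")
```

The computed values (roughly 580.5 GT/s, 583.9 GB/s and 32,045 Gflop/s) land within a fraction of a percent of the slide's figures; only the geometry projection (about 32 Gtri/s versus the quoted 37.2) suggests Nvidia rounded its growth rate up for that metric.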

Given that a large portion of the audience probably owned Microsoft's Xbox 360 gaming console (guilty as charged), he didn't leave them out of the picture. The console, which launched in 2005, features a GPU with a texture performance of 8 GT/s, an antialiasing performance of 16 GS/s, a memory bandwidth of 22.4 GB/s, geometry throughput of 0.3 Gtri/s, and floating-point performance of 240 Gflop/s. Compared with a 2007 PC GPU, the console actually comes out ahead in antialiasing and floating point. But as Tamasi noted, the geometry numbers remained flat on both the PC and the Xbox 360 for years.

Looking over the charts and hearing Tamasi's prediction of real-time ray tracing at acceptable, nearly fluid levels in 2015, you have to wonder: is this the end of the road? Is it even possible to push graphics beyond photo-realism? When will the GPU run out of gas? When will performance taper off? Tamasi says he's asked that quite a lot.

"I don't know when it's going to be done," he admitted to the audience. "Which from my perspective, that's a good thing. And probably for all of us too because as soon as people see it as 'done,' then innovation starts to change. And it starts to go from being an innovation-driven industry to basically a lowest-common-denominator kind of cost-driven economy. There's innovation there but a different kind of innovation."

After the keynote, I wanted to take this topic a little further. Seeing the visual difference between Quake 2 and Battlefield 3 made me think of Jack Thompson and the term he seemingly likes to throw around: killer simulator. Quake 2 and Call of Duty look like games; they attempt to imitate reality (well, mostly CoD), but there's a clear difference. Battlefield 3 images border on realistic. With developers pushing for realism and Nvidia pushing technology to provide realism, when does it go too far? When do games cross the line from being a simple game for entertainment to a real-life simulator?

He agreed that you can definitely overdo it. You can have a movie that's focused on special effects with no story, and it's a crappy movie. Likewise, a game that's all graphics and no gameplay isn't much of a game. You can definitely have a great game that doesn't sport stunning eye candy. But certain genres, the FPS primarily, have come to require top-notch, bleeding-edge visuals; fans simply expect them in that genre.

In the following keynote, Mark Cerny spoke of a vicious cycle: consumers demand more, developers and hardware manufacturers produce more, budgets grow larger, and then consumers demand even more on top of that. On the FPS front at least, the genre has mostly matured from a gaming aspect to a simulator aspect, and doesn't appear to have any kind of "end" in sight.

And just as we entered a golden era by moving from pixels to polygons (and thus adding native OpenGL support), the mobile front is now entering a similar, exciting era. "[Mobile] gameplay innovation has been re-invigorated," he told me after the keynote, saying we'll essentially ride that new wave out until there's an overall standard, and then the graphics front should escalate dramatically. We're already experiencing a steep escalation now as it is thanks to a dramatic increase in mobile hardware performance.

Given the rapid advancement in mobile (smartphone, tablet) technology, will these devices actually replace netbooks in the near future? Netbooks will be wedged out, he said, but not notebooks because it's a form factor most consumers are familiar with. It has a larger display and an integrated keyboard. "There's a place in the universe for that form factor," he said.

While that may indeed be true, I saw a large number of tablets throughout the convention, both in the sessions and at the keynotes. Although there were notebook users present at the show (this one included), tablets far outnumbered the older form factor. But as he said, there's a place in the universe for notebooks, just as there's a place in the universe for a 1000-watt PC playing host to three Nvidia GeForce GTX 580 cards in SLI.

Getting back to the topic at hand, a large chunk of Tamasi's presentation focused on the road Nvidia has taken since the days of GLQuake, to where the GPUs stand now in terms of what they can crank out on a visual level. As mentioned in another article, Nvidia and Epic presented the DirectX 11-drenched "Samaritan" video in real-time behind closed doors, showcasing the current state of GPU technology in a 3-way SLI configuration. A video version was also shown during Tamasi's keynote, while the monster rig used in the private demo sat at his feet on stage like a dark, ferocious beast poised and ready for attack. I didn't see any Scooby snacks, either.

On a side note, he openly admitted that he was thrilled many people in the industry believed the demo to be pre-rendered like most cinematics. But it's not. It runs in real-time, and he believes Nvidia has reached a milestone where "many people's perception of what's possible in real time has been completely changed": what can be accomplished in real time today would have required an offline, pre-rendered video five to six years ago.

The second half of his keynote focused on mobile, claiming that the latest generation resides in the DirectX 9 class. But he then offered an interesting view of mobile's future: take all the "amazing" technological advances primarily manifested on the PC (as it tends to advance a level every year), the content developed for the consoles (because, let's face it, we're in the Era of Consoles whether you like it or not), and cram it all into a mobile form factor you can take with you wherever you go (as in stick it in your pocket). That is apparently Nvidia's vision of Tegra.

The next segment, covering Tegra's roadmap, was more of a rehash of what we already know; he whipped out the familiar Tegra slide listing upcoming SoCs named after DC and Marvel comics heroes: Kal-El in 2011 (5x faster than the current Tegra 2), Wayne in 2012 (10x), Logan in 2013 (50x) and Stark (75x). As seen on the slide, the CPU aspect of Kal-El outperforms the Intel Core 2 Duo T7200 processor and is within 4x of the current generation of gaming consoles... in a mobile form factor. That said, new mobile devices should pass current-gen consoles in computing performance within the next few years.

Tamasi said that things didn't really get interesting on the mobile front until programmable pixel shading was introduced. He also said that the first iPhone was "truly revolutionary" and completely turned the mobile smartphone industry around: absolutely a fantastic product that brought real computing to a mobile platform in a truly useful way. It also caused everyone outside Apple to completely rethink their mobile strategy. "No doubt about it, they'll all tell you the same thing," he said.

At the end of his keynote, Tamasi played Blizzard's awesome cinematic for World of Warcraft: Cataclysm, saying that some aspects will be possible to render in real-time within the next four to five years: the fire, the level of geometric complexity, a lot of the smoke simulations, and more. He told the audience to look back at the original Call of Duty and then watch the "Samaritan" video; you'll see it's not all that hard to imagine real-time ray tracing and whatnot within the next five years.

After the show, Tamasi said something interesting that made me realize there will probably never be an "end" as far as pushing the graphics boundary or pumping out the next level of hardware: you can't develop the next generation of gaming content on a 1-watt phone. As long as gamers demand more, software and hardware will supply the goods.

Comments (47; thread closed)
  • Anonymous, May 7, 2011 12:17 AM (-8)
    Well DUH, With the performance gains in GPUs the last 3 years I can't see WHY that wouldn't happen, although I think it will be AMD who gets it done and nVidia producing an inferior heat machine to compete.
  • kcorp2003, May 7, 2011 12:19 AM (+2)
    aren't the new consoles suppose to be out by 2015 too?
  • sceen311, May 7, 2011 12:22 AM (+3)
    Is it even possible to push graphics beyond photo-realism?
    We're a long ways from that yet. Even if you can reach photo realism it wont be that impressive until it's off a 2-d screen and completely surrounding me.
  • rohitbaran, May 7, 2011 12:32 AM (+1)
    Quote:
    Given the rapid advancement in mobile (smartphone, tablet) technology, will these devices actually replace netbooks in the near future? Netbooks will be wedged out, he said, but not notebooks because it's a form factor most consumers are familiar with. It has a larger display and an integrated keyboard. "There's a place in the universe for that form factor," he said.

    Well, nVidia ia definitely out of the notebook market since with Intel coming up with somewhat decent IGPs and AMD coming up with fusion, nVidia's solution don't seem to be the odd man out.
  • rohitbaran, May 7, 2011 12:34 AM (-1)
    kcorp2003: aren't the new consoles suppose to be out by 2015 too?

    Yeah, but then they will feature tech from now, just as Wii 2 is rumored to feature the R700 GPU, a chip 3 years old.
  • rohitbaran, May 7, 2011 12:34 AM (0)
    rohitbaran: Well, nVidia ia definitely out of the notebook market since with Intel coming up with somewhat decent IGPs and AMD coming up with fusion, nVidia's solution don't seem to be the odd man out.

    Oops, I meant netbook there.
  • JOSHSKORN, May 7, 2011 12:35 AM (0)
    kcorp2003: aren't the new consoles suppose to be out by 2015 too?

    I don't think anything, other than the successor to the Wii (which will be out in 2012), has been announced.

    Quote:
    You can have a movie that's focused on special effects and no story, and it's a crappy movie.

    OPINION: Star Wars Episode I, II and III
  • NightLight, May 7, 2011 1:18 AM (+1)
    oh come on, give us v-world allready!
  • 11796pcs, May 7, 2011 1:41 AM (+3)
    In 10 years I will be laughing at myself for ever thinking that Crysis was hard to run. Though over time I think developers will start to get lazy with their code as hardware gets so advanced.
  • Anonymous, May 7, 2011 2:14 AM (0)
    @11796pcs:

    Welcome to ten years ago.
  • rad666, May 7, 2011 2:26 AM (+5)
    Will the hardware exist to do these amazing things in 2015? Yes.

    Will consoles still exists and hold PCs back? Also yes.
  • schmich, May 7, 2011 3:11 AM (+4)
    Quote:
    While that indeed may be true, I saw a large number of tablets throughout the convention, seen both within the sessions and the keynotes

    Those are wannabe hipsters. I bet they had a keyboard addon to the tablet, right? Well a proper laptop (whether PC or Mac) is WAY more productive than a tablet. I just facepalm when I see someone use tablet for productivity in order to try to look cool.
    Quote:
    many people's perception of what's possible in real time has been completely changed

    This is due to most people only knowing "console graphics" and that we're soon at the end of the line for this generation of consoles. As simple as that. Every time a console gamer friend sees how games look on my PC they're always impressed.
    We won't be at the end of line in a long time. Why? Even the faster graphics cards won't run Battlefield 3 at full settings. There's just no way. Even an AMD 6950 barely keeps proper framerates (40+) during the most intensive graphics of BC2. Ray-tracing gets more complex the further away the ray has to travel. Ray-tracing in large open sandbox games is a lot more for the GPU to do VS one in a small enclosed room.
    To all this you can add larger resolutions (1080p is quite lame if you think about it really) and then 3D. Sure a lot of people who haven't had 3D gaming will say it's gimmicky but it's actually fun to turn on once in a while. 3D in gaming is true 3D.
    Quote:
    Kal-El outperforms the Intel Core 2 Duo T7200 processor

    No, no and no! How can you continue to spread this nonsense?! I'm an AMD person and I will defend Intel here, that's how wrong this is. You're supposed to be know your things and not spread lies! It's your job as a news poster on one of the larger Tech sites. Nvidia did the lame trick to give Kal-El an optimized version of Coremark for the benchmark whereas the T7200 didn't get it!

    A few Google searches can get you a long way. Here I did some homework for you about Kal-El vs Intel T7200: http://news.softpedia.com/news/Nvidia-s-Kal-El-Quad-Core-ARM-Chip-Is-Actually-Slower-Than-Intel-s-Core-2-Duo-T7200-185406.shtml

  • doorspawn, May 7, 2011 3:26 AM (+2)
    What exactly does photo-realism mean? The term has been trotted out for decades and the meaning keeps increasing. It seems the definition is the unreachable "significantly better than current quality".

    If you want to base it on whether people can tell the difference between a photo and the render, then scene complexity and resolution make all the difference. A GF2 could render a photo-realistic concrete wall at 640x480. A 100 exahertz GPU couldn't render a photo-realistic rainy tokyo arial view at 24000x16000 with atmospheric effects, thousands of people lit by hundreds of lights.

    You want to check out the angular resolution of your fovea. Judging by the writing around the visa logo on visa cards, mine can do around 1/10000 rad. So for a 1m by 1m screen 30cm away that's approx a billion pixels necessary.

    As for consoles, they're just a very successful DRM marketing drive from which everyone suffers.
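The angular-resolution estimate in the comment above checks out, at least roughly. A quick sketch of the arithmetic, under the commenter's stated assumptions (about 1/10000 rad foveal resolution, a 1 m square screen viewed from 30 cm, small-angle approximation throughout):

```python
# Back-of-envelope: how many pixels before the eye can't resolve them?
angular_res_rad = 1e-4   # claimed foveal resolution, ~1/10000 rad
distance_m = 0.3         # viewing distance
screen_side_m = 1.0      # square screen edge length

pixel_size_m = distance_m * angular_res_rad      # smallest resolvable feature
pixels_per_side = screen_side_m / pixel_size_m   # ~33,000 per side
total_pixels = pixels_per_side ** 2              # order of a billion
print(f"{total_pixels:.2e}")
```

This lands at roughly 1.1 billion pixels, consistent with the commenter's "approx a billion."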
  • SchizoFrog, May 7, 2011 3:31 AM (0)
    Even if they could, they wouldn't because they can make far more money taking they time with minor upgrades to the hardware.
  • pocketdrummer, May 7, 2011 3:39 AM (-1)
    rad666: Will the hardware exist to do these amazing things in 2015? Yes. Will consoles still exists and hold PCs back? Also yes.


    I, for one, think PC game developers should just leave consoles in the dust and develop more PC exclusives that utilize current technology. It's a bit pathetic that we have PCs that absolutely stomp consoles, and yet we have to dumb it down just to make it cross-platform. If they don't want to keep up, leave them behind...
  • ern88, May 7, 2011 3:41 AM (+1)
    Hey Schmich,

    That was a nice find. I always doubt these charts that are thrown around by a manufacturer and not tested by a site that isn't bias. I am sure that a lot of people buy into it blindly. Nvidia is very mis-leading to say the least. Not saying that other companies aren't. But consumers really need to do their homework on a product before splurging. Thanks again man.
  • Anonymous, May 7, 2011 4:18 AM (+1)
    ....since with Intel coming up with somewhat decent IGPs....

    i believe i have just entered a parallel universe
  • sykozis, May 7, 2011 4:46 AM (+1)
    The increase in GPU performance from 2007 to 2011 saw power consumption skyrocket...unless nVidia's partners plan on selling dedicated GPU power supplies with every card, nVidia will need to learn how to develop more energy efficient GPUs fairly soon, otherwise that "1000%" increase will result in a rather hefty power bill increase....
  • fir_ser, May 7, 2011 5:06 AM (+1)
    At the end of the day Moore’s law is just a guideline and not a holly rule carved into stone.
  • builder4, May 7, 2011 5:54 AM (+3)
    Does anyone remember this story?

    http://www.tomshardware.com/news/Nvidia-GPU-Huang-570x,8544.html

    nVidia has lowered its targets a little bit.