
Nvidia: The Future of Graphics Processing

Anonymous
May 7, 2011 12:17:58 AM

Well DUH. With the performance gains in GPUs over the last 3 years I can't see why that wouldn't happen, although I think it will be AMD who gets it done, with nVidia producing an inferior heat machine to compete.
May 7, 2011 12:19:22 AM

Aren't the new consoles supposed to be out by 2015 too?
May 7, 2011 12:22:07 AM

Is it even possible to push graphics beyond photo-realism?
We're a long way from that yet. Even if you can reach photo-realism, it won't be that impressive until it's off a 2D screen and completely surrounding me.
May 7, 2011 12:32:00 AM

Quote:
Given the rapid advancement in mobile (smartphone, tablet) technology, will these devices actually replace netbooks in the near future? Netbooks will be wedged out, he said, but not notebooks because it's a form factor most consumers are familiar with. It has a larger display and an integrated keyboard. "There's a place in the universe for that form factor," he said.

Well, nVidia is definitely out of the notebook market, since with Intel coming up with somewhat decent IGPs and AMD coming up with Fusion, nVidia's solutions seem to be the odd man out.
May 7, 2011 12:34:22 AM

kcorp2003: Aren't the new consoles supposed to be out by 2015 too?

Yeah, but then they will feature tech from now, just as the Wii 2 is rumored to feature the R700 GPU, a chip that is 3 years old.
May 7, 2011 12:34:56 AM

rohitbaran: Well, nVidia is definitely out of the notebook market, since with Intel coming up with somewhat decent IGPs and AMD coming up with Fusion, nVidia's solutions seem to be the odd man out.

Oops, I meant netbook there.
May 7, 2011 12:35:25 AM

kcorp2003: Aren't the new consoles supposed to be out by 2015 too?

I don't think anything, other than the successor to the Wii (which will be out in 2012), has been announced.

Quote:
You can have a movie that's focused on special effects and no story, and it's a crappy movie.

OPINION: Star Wars Episode I, II and III
May 7, 2011 1:18:55 AM

Oh come on, give us v-world already!
May 7, 2011 1:41:04 AM

In 10 years I will be laughing at myself for ever thinking that Crysis was hard to run. Though over time I think developers will start to get lazy with their code as hardware gets so advanced.
Anonymous
May 7, 2011 2:14:50 AM

@11796pcs:

Welcome to ten years ago.
May 7, 2011 2:26:36 AM

Will the hardware exist to do these amazing things in 2015? Yes.

Will consoles still exist and hold PCs back? Also yes.
May 7, 2011 3:11:46 AM

Quote:
While that indeed may be true, I saw a large number of tablets throughout the convention, seen both within the sessions and the keynotes

Those are wannabe hipsters. I bet they had a keyboard add-on for the tablet, right? Well, a proper laptop (whether PC or Mac) is WAY more productive than a tablet. I just facepalm when I see someone use a tablet for productivity in order to try to look cool.
Quote:
many people's perception of what's possible in real time has been completely changed

This is due to most people only knowing "console graphics" and the fact that we're nearing the end of the line for this generation of consoles. As simple as that. Every time a console-gamer friend sees how games look on my PC, they're impressed.
We won't be at the end of the line for a long time. Why? Even the fastest graphics cards won't run Battlefield 3 at full settings. There's just no way. Even an AMD 6950 barely keeps proper framerates (40+) during the most intensive graphics of BC2. Ray-tracing gets more complex the further a ray has to travel: ray-tracing a large open sandbox game is a lot more work for the GPU than ray-tracing a small enclosed room.
To all this you can add larger resolutions (1080p is quite lame if you think about it, really) and then 3D. Sure, a lot of people who haven't had 3D gaming will say it's gimmicky, but it's actually fun to turn on once in a while. 3D in gaming is true 3D.
Quote:
Kal-El outperforms the Intel Core 2 Duo T7200 processor

No, no and no! How can you continue to spread this nonsense?! I'm an AMD person and I will defend Intel here, that's how wrong this is. You're supposed to know your stuff and not spread lies! It's your job as a news poster on one of the larger tech sites. Nvidia pulled the lame trick of giving Kal-El an optimized version of CoreMark for the benchmark, whereas the T7200 didn't get it!

A few Google searches can get you a long way. Here I did some homework for you about Kal-El vs Intel T7200: http://news.softpedia.com/news/Nvidia-s-Kal-El-Quad-Cor...

May 7, 2011 3:26:32 AM

What exactly does photo-realism mean? The term has been trotted out for decades and the bar keeps rising. It seems the working definition is the unreachable "significantly better than current quality".

If you want to base it on whether people can tell the difference between a photo and the render, then scene complexity and resolution make all the difference. A GF2 could render a photo-realistic concrete wall at 640x480. A 100-exahertz GPU couldn't render a photo-realistic rainy Tokyo aerial view at 24000x16000 with atmospheric effects and thousands of people lit by hundreds of lights.

You want to check the angular resolution of your fovea. Judging by the writing around the Visa logo on credit cards, mine can do around 1/10000 rad. So for a 1m by 1m screen 30cm away, that's approximately a billion pixels needed.
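A quick back-of-envelope sketch of that arithmetic (Python, purely illustrative; the 1/10000 rad figure and the 1 m screen at 30 cm are just the rough assumptions stated above):

    import math

    angular_res = 1.0 / 10000   # radians per distinguishable pixel (rough estimate above)
    screen = 1.0                # screen width/height in metres
    distance = 0.3              # viewing distance in metres

    # Crude small-angle estimate: subtended angle ~ size / distance
    angle = screen / distance                        # ~3.3 rad (overestimates at wide angles)
    per_side = angle / angular_res                   # ~33,000 pixels per side
    print(f"crude: {per_side ** 2 / 1e9:.1f} billion pixels")                    # ~1.1 billion

    # Using the true subtended angle instead
    angle_true = 2 * math.atan((screen / 2) / distance)                          # ~2.06 rad
    print(f"exact: {(angle_true / angular_res) ** 2 / 1e9:.2f} billion pixels")  # ~0.42 billion

Either way it lands somewhere between roughly half a billion and a billion pixels, which is the ballpark claimed above.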

As for consoles, they're just a very successful DRM marketing drive from which everyone suffers.
May 7, 2011 3:31:14 AM

Even if they could, they wouldn't, because they can make far more money taking their time with minor upgrades to the hardware.
May 7, 2011 3:39:59 AM

rad666: Will the hardware exist to do these amazing things in 2015? Yes. Will consoles still exist and hold PCs back? Also yes.


I, for one, think PC game developers should just leave consoles in the dust and develop more PC exclusives that utilize current technology. It's a bit pathetic that we have PCs that absolutely stomp consoles, and yet we have to dumb things down just to make them cross-platform. If they don't want to keep up, leave them behind...
May 7, 2011 3:41:31 AM

Hey Schmich,

That was a nice find. I always doubt charts that are thrown around by a manufacturer and not tested by a site that isn't biased. I am sure that a lot of people buy into them blindly. Nvidia is very misleading, to say the least. Not saying that other companies aren't. But consumers really need to do their homework on a product before splurging. Thanks again, man.
Anonymous
May 7, 2011 4:18:31 AM

....since with Intel coming up with somewhat decent IGPs....

I believe I have just entered a parallel universe.
May 7, 2011 4:46:48 AM

The increase in GPU performance from 2007 to 2011 saw power consumption skyrocket... unless nVidia's partners plan on selling dedicated GPU power supplies with every card, nVidia will need to learn how to develop more energy-efficient GPUs fairly soon; otherwise that "1000%" increase will result in a rather hefty power bill increase...
May 7, 2011 5:06:58 AM

At the end of the day, Moore's law is just a guideline, not a holy rule carved in stone.
May 7, 2011 5:56:49 AM

Those are some serious claims... I hope they're true.
May 7, 2011 8:52:25 AM

Given just how powerful GPUs have become over the generations, this claim is quite believable. Whether the industry will taper off once we achieve photo-realism is open to interpretation, however.

I think we'll find a new target, or a new benchmark, to focus on once we achieve photo-realism. Just looking realistic is not the be-all and end-all.
May 7, 2011 12:00:11 PM

I won't be getting a "five hundred times a GTX 470" if I have no good game to actually use it on...

I guess what we are losing here is the real meaning of games: fun! Not simulation of reality, just taking you to another reality and making you dream, and have fun.

I don't want to be inside a war simulator. That feels uncomfortably like "training" to me.
May 7, 2011 3:40:20 PM

Um, 2012 people..... None of this will matter :) 
May 7, 2011 3:50:32 PM

"you can't develop the next generation of gaming content on a 1-watt phone."

Yeah, I see the point in that. In 2015 you can maybe do real-time photo-realistic games in 1080p, in 2017 in 2K... but that's at a 200 W TDP. Then it's something like 50 years of continued work to get that down to milliwatts and into mobile units :D 
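A rough sketch of what that trajectory would take (Python; the 200 W starting point is the figure above, and the efficiency-doubling cadence is only an assumption for illustration):

    import math

    start_watts = 200.0
    targets = {"1 W (a '1-watt phone')": 1.0, "10 mW": 0.010}

    for label, target_watts in targets.items():
        doublings = math.log2(start_watts / target_watts)   # efficiency doublings needed
        for years_per_doubling in (2, 3):
            years = doublings * years_per_doubling
            print(f"{label}: {doublings:.1f} doublings -> ~{years:.0f} years "
                  f"at one efficiency doubling every {years_per_doubling} years")

Getting from 200 W down to low milliwatts is roughly 14 efficiency doublings, so a "50 years" guess corresponds to an efficiency doubling only every 3-4 years.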
May 7, 2011 4:32:24 PM

robisinho: "you can't develop the next generation of gaming content on a 1-watt phone." Yeah, I see the point in that. In 2015 you can maybe do real-time photo-realistic games in 1080p, in 2017 in 2K... but that's at a 200 W TDP. Then it's something like 50 years of continued work to get that down to milliwatts and into mobile units

For that to happen... nVidia will have to stop producing increasingly power-hungry GPUs... they're nearing a 300 W TDP now...
Anonymous
May 7, 2011 4:36:37 PM

Really starting to hate how much tablets, smartphones, etc. are completely invading gaming and tech news. WE DON'T CARE! Go ahead and play your stupid Angry Birds as you sit there on the bus or train or in the waiting room. When "gamers" play games, they do it on a 30" monitor or 50" plasma, with surround sound. We have no interest in playing Crysis on a 4" display, and NVIDIA/ATI shouldn't be wasting their time pandering to such lucrative but short-sighted demands.

May 7, 2011 6:15:56 PM

Graphics can be photo-realistic, which basically means that if you take a screenshot and compare it to a photo, it will look real. But when it comes to gameplay, you can never hit a peak or limit. Why? Because to have gameplay you must simulate many more things than just graphics: motion, physics, lighting, AI, etc., etc.
It's like Grand Theft Auto: you might have awesome graphics, but you still have to calculate other things. If you crash, metal gets deformed, your character gets hurt, the AI reacts, you might bleed, the car might catch fire, your tire might go flat, all while the sun is moving across the sky or rain is pounding the town. Plenty to do. In the future gaming might just be realistic; maybe they will find a way to create feeling, taste, smell, etc. All in a holodeck-type room, or via a Matrix-style plug in the back of the head. Who knows.

Personally, given how things have been changing (just an opinion), I think his estimate for graphics power is an underestimate. Can't wait to see what the next 10 years are going to be like. Gaming is going to be like stepping into another dimension.
May 7, 2011 11:27:51 PM

JOSHSKORN: I don't think anything, other than the successor to the Wii (which will be out in 2012), has been announced. OPINION: Star Wars Episode I, II and III


Why is it people hammer on those movies, yet so lovingly speak of the originals, which, if you ask me, lacked plot, story and character development just as much as the newer prequels? The difference is that everyone has fond childhood memories of the originals, but in hindsight, after re-viewing the originals, I really can't say I see anything that stands out over the prequels. They fit like a glove: both were over-the-top effects, geared towards kids, and had their share of stupid characters (in the originals it was the robots and the damned Ewoks). So honestly, dude, if you are going to count the prequels you need to count the originals too, because they ARE every bit as stupid as the prequels.
May 8, 2011 1:34:08 AM

What's the point, if almost all games are developed for the crappy consoles first and then ported to PC with dumbed-down graphics? Unless... Nvidia opens a PC-exclusive game development studio :) 
Anonymous
May 8, 2011 2:13:35 AM

Does Intel's new 3D transistor figure into these calculations?
May 8, 2011 6:32:57 AM

11796pcs: In 10 years I will be laughing at myself for ever thinking that Crysis was hard to run. Though over time I think developers will start to get lazy with their code as hardware gets so advanced.


Developers are already lazy; can't blame them, though. Heck, I wouldn't do the extra work just so someone can have a rather minimal performance gain.
May 8, 2011 7:14:52 AM

Photo-realism with ray tracing in 4 years means capped ray-tracing algorithms mixed with direct calculations to get 30-60 fps gameplay. But there will be room for improvement in graphics hardware as you add more sophistication: maybe real 2D and 3D motion blur, real depth of field, subsurface scattering, etc., and things really escalate with sophisticated irradiance calculations. It will be very nice to have all this power available to create fantastic adventures. And I agree with those who say the story is what mostly keeps you in front of your game or movie.

I believe there will be a lot of room for improvement after ray tracing gets comfortable at 1080p/60 fps with good visual quality. There will be bigger resolutions in 7-8 years, presumably 6K and 8K, and it could go even higher in the not-so-distant future, since we already have multi-screen setups with crazy 6,000-pixel-wide resolutions today. So I think it is correct to assume the next 10 years are going to offer improvements we are all going to appreciate.

But the 10 years after that are a challenge for hardware vendors, in my opinion, as most consumers, even gamers, will see no justification for more powerful hardware after a certain point. With more screen density you get to a point where there is not much more to see on a screen, and if the screen is too big you will have to watch from a greater distance. This is already happening with smartphones getting 300 dpi screens. Tablets and laptops will follow, and at some point even a cinematic big screen will see no real benefit from more pixel density. Do we have 70 mm theaters everywhere? And that technology has been available at least since the 70s.

The screen, the speakers and the depth of color will finally reach the limits of what we can perceive with our own senses. So the real future is in content creation, and with time the hardware will decrease in importance. But obviously this is something we won't hear or read from graphics card vendor gurus at Nvidia or ATI. Most users today care less about hardware specs and are starting to pay more attention to tablets for functionality.
Anonymous
May 8, 2011 9:45:27 AM

So, in 5 years, Epic has to rename its engine from Unreal to Real?

And how long until we basically live in a pod somewhere, totally immersed in a completely simulated virtual world? And what do we do once we are in there? Run around with guns trying to kill each other? Seriously, why not just start WW3 now and we can have that kind of entertainment right away, with totally realistic physics and graphics and opponents better than any AI will ever be...
May 8, 2011 2:51:00 PM

This brings a whole new meaning to virtual reality. I remember hearing about the idea of putting something over your head and hooking stuff up to your hands, feet and fingers, and wondering what that would be like. This is much more than just that.
May 8, 2011 5:37:21 PM

Will the resolution of the IBM T221 (3840×2400) become standard? I still find full HD on a 15" screen too low!
May 8, 2011 7:00:31 PM

Wow, I had never seen that IBM monitor before; it looks amazing. Why are monitors still so far behind that? I saw an article from 2003 with that same monitor being tested. I know monitor sizes and resolutions have gotten a lot bigger and higher, but how high will they end up getting?
May 8, 2011 7:29:13 PM

rohitbaran: Well, nVidia is definitely out of the notebook market, since with Intel coming up with somewhat decent IGPs and AMD coming up with Fusion, nVidia's solutions seem to be the odd man out.


Exactly, and that's why 3D is just a passing fad IMO. The real future is going to be stereoscopic headsets that have a high-res LCD panel for each eye to generate the 3D effect and a Kinect-like position sensor so the system knows where you are looking, moving, and aiming, and provides you with the appropriate visuals. Now that's the real future of gaming IMO. After glasses will come mini projectors that project right onto your retina.
May 8, 2011 7:36:35 PM

It makes me laugh when I see people saying the next consoles from Microsoft and Sony will come out in 2015 because they said the current systems have a 10-year life cycle. Life cycle refers to how long the company and developers plan on supporting a console. For instance, the PS2 was still selling and games were still being made for it long after the PS3 was introduced. The PS2 had a 10-year life cycle, which didn't mean that the PS3 wasn't released during that time. Let's at least wait until E3 in June to see whether announcements are, or aren't, made about possible 2012 console releases, as they are usually announced a little over a year before going on sale.

Kind of what jecastej was saying: the law of diminishing returns will start coming into play later in this decade and into the next, where graphics improvements become less obvious and expensive computers no longer offer any practical edge over newer consoles, especially those released around the 2018-2020 timeframe. By then, consumers will be so used to extreme eye candy that I think developers will have to get back to making good plots in all media in order to keep people's attention. Sooner or later, the gee-whiz flash is going to wear thin. Audiences have historically been quick to tire of things that amused or wowed them at first but that they soon got used to.
May 9, 2011 10:47:58 AM

A 1000% increase in 4 years? That's only slightly better than the standard 10x in 5 years (which is what you get if you compound an 18-month doubling period).
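The compounding behind that first point, as a quick sketch (Python; the 18-month doubling period is the usual Moore's-law-style assumption):

    def growth(years, doubling_months=18):
        """Total performance multiple after `years` at one doubling every `doubling_months`."""
        return 2 ** (years * 12 / doubling_months)

    print(f"{growth(5):.1f}x over 5 years")   # ~10.1x, the 'standard 10x in 5 years'
    print(f"{growth(4):.1f}x over 4 years")   # ~6.3x at that same pace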

And the whole question of when the graphics problem will be solved is a bit silly. Even at the level of feature films, where minutes are spent rendering each frame, they still resort to many modeling and algorithmic tricks, as well as limiting shot choices. In other words, there will always be room for improvement and hard cases for the technology to handle.

IMO, the only useful or insightful thing in this keynote was their roadmap.
May 9, 2011 10:56:31 AM

How do they plan on handling the bus bandwidth to actually get the model and texture data to the GPU? Currently the biggest bottleneck is CPU/main-memory-to-GPU communication. You can throw any number of compute units and whatnot on the card, but once it has to sit idle waiting for data and instructions, you are busted.
May 9, 2011 1:43:00 PM

schmich: ...Nvidia did the lame trick of giving Kal-El an optimized version of CoreMark for the benchmark whereas the T7200 didn't get it! A few Google searches can get you a long way. Here I did some homework for you about Kal-El vs Intel T7200: http://news.softpedia.com/news/Nvi [...] 5406.shtml


This little nugget of truth goes to show the limits (or lack thereof) of propaganda and puffery meant to influence analysts and stockholders/buyers.

That, and the ugly little fact that nVidia has been pooh-poohing ray tracing for years. Does this mean nVidia has finally seen the light? (And shadows and reflections?)
May 9, 2011 3:22:04 PM

Also look at the power usage and heat levels between 2007 and 2011, and project that onto 2015. This is the same reason the P4 got curtailed and we went in an entirely different direction. Intel people were claiming they'd "reach 10 GHz with this design," which never happened. The same thing will happen here: we're at a maximum of heat and power on video cards, just as it was with the P4, and the companies that don't recognize that will be left to rot on the side of the road. AMD is already looking at other roads to push the performance-per-watt curve. They know what's happening, and they're going to come out of this far better than Nvidia if Nvidia doesn't get smart about it.
May 9, 2011 3:33:20 PM

popatim: Exactly, and that's why 3D is just a passing fad IMO. The real future is going to be stereoscopic headsets that have a high-res LCD panel for each eye to generate the 3D effect and a Kinect-like position sensor so the system knows where you are looking, moving, and aiming, and provides you with the appropriate visuals. Now that's the real future of gaming IMO. After glasses will come mini projectors that project right onto your retina.


I seriously doubt that. I have a feeling 3D won't catch on until we get something that is standalone. I already wear glasses, and I hate wearing extra glasses to see 3D. I avoid 3D movies, and I won't buy any 3D gear if I have to wear glasses; a headset would be even worse.

I see more of a future in either large screens that produce a 3D effect without glasses (it can be done) and/or Star Wars-style 3D displays that use a suspended mist and can be seen from all angles. Perhaps even crystal cubes that display 3D internally as an intermediate step.
May 9, 2011 6:03:12 PM

I don't even bother updating my dual 9800 GTXs. "Why?" you may ask. Because unless I'm benchmarking, there is no significant increase in real-life performance for the games that are out. I run everything at 1920x1200 and it all runs just fine. Unless an engine is poorly optimized (CryEngine 2 and GTA IV's engine come to mind), most games built nowadays are built to run on 6-year-old architecture: the consoles. Unless devs start focusing more on PCs and giving their games something that would make the upgrade worth it, there is no point. Until the next generation of consoles releases, we are shafted; the PC gaming community lost its edge when everyone went multi-platform with their titles.
May 9, 2011 7:01:10 PM

Quote:
Why is it people hammer on those movies, yet so lovingly speak of the originals, which, if you ask me, lacked plot, story and character development just as much as the newer prequels? The difference is that everyone has fond childhood memories of the originals, but in hindsight, after re-viewing the originals, I really can't say I see anything that stands out over the prequels. They fit like a glove: both were over-the-top effects, geared towards kids, and had their share of stupid characters (in the originals it was the robots and the damned Ewoks). So honestly, dude, if you are going to count the prequels you need to count the originals too, because they ARE every bit as stupid as the prequels.



Your complete lack of ability to communicate in a reasonable and effective manner renders your argument moot. Put your big-girl panties on and try again.
May 9, 2011 8:44:47 PM

Real-time photo-realism is simply when you can artificially render and reproduce visual objects that exist in nature such that they cannot be distinguished from the real objects, at a resolution that equals or surpasses the detectability of the human retina.