Questioning gaming realism (today and tomorrow)

AliasedGamer

Distinguished
Jul 17, 2010
4
0
18,510
Hello friends,

So I was wondering... On 16 June 2008 AMD showed us the Cinema 2.0 demos, with stunning visual realism: http://vivian.amd.com/us-en/Corporate/AboutAMD/0,,51_52_15438_15106,00.html

I was led to believe that, from now on, with super-capable GPUs like the HD 4800 series and beyond (including the Nvidia equivalents), we would get that kind of realism, or something close to it, in more and more games.
Still, two years later we are nowhere near it, with most games continuing to look like the DX9-level titles of 2006 and 2007 (with too few exceptions).

I'm asking anyone who knows the details of that demo (what DirectX version it's based on, what shader model it uses, how the motion capture was achieved) to share them with us. It came as a (horrible) surprise to me to see that DX11 titles still don't come anywhere close to the graphics shown in the Cinema 2.0 real-time demo.

So, if we already have GPUs that can do that in real time, how long will it take until it shows up in games? Another 2 years? 4 years? Will the next generation of consoles be capable of such graphics? Is the limitation we see in today's games caused by under-developed software?

I currently own an Nvidia 9800 GT and I can't justify upgrading: a GTX 460/470 or an HD 5850 offers more performance, but the graphics of the games themselves have stayed basically the same these last few years (maybe except Metro 2033).

/discuss mode on :)
 

hooray4boobies

Distinguished
Mar 6, 2010
111
0
18,710
I haven't seen that demo, and still can't, as it's blocked at work. But Metro 2033 still looks pretty *** if you ask me. It provides a great atmosphere... and sometimes you're just in shock at how beautiful some of the enclosed towns/stations are. But the outdoor and open spaces still look second rate.

The problem is the developers, I believe, and by extension piracy and 'console convenience'. The last game that really pushed the envelope was Crysis, which even now needs current-generation cards to run decently. Argue that CryEngine is badly coded or whatever, but when that game came out nothing was even close to it. Developers are concentrating more on consoles and just porting to PCs that can smash out the shitty Xbox/PS3 graphics without breaking a sweat, because that's where the money is. Consoles take up something like 80% (guesstimate) of the games market.

Without games being built to push PCs, GPU designers like Nvidia and ATI can get lazy. Which is sad, because even Crytek and id (once PC-exclusive) have gone console now.

A lot of people complain about, for example, Ubisoft's use of piracy protection, but if we want developers to be more interested in PCs and to really push their games generations ahead, it's what we need. You may argue the quality of games on PCs these days is shitty, but that's because profit margins are very low for developers.
 

djbenny1

Distinguished
Jul 20, 2010
42
0
18,530
Sounds like you've hit the nail on the head there boobies, that all makes perfect sense.

I read that the Xbox 360 is so similar to a PC in some ways that it's extremely easy for them to port from the 360 over to PC.


I suppose the answer then, if what boobies said is right, is that the quality of PC gaming will always be dictated by the quality of the best console gaming at the time.
 

AliasedGamer

Distinguished
Jul 17, 2010
4
0
18,510
I agree, the developers are not making more complex graphics because of the limitations the consoles have. But I wonder: if the next generation comes out in about 2 years (the current generation launched around 2006, so by 2012 it will have been in use for 6 years, and its full lifetime is rumored to be about 10 years, maybe more), does that mean those consoles will have DX11-level graphics? And that we will see more use of depth of field, tessellation, higher-quality lighting, and so on?

Considering the polygon counts and the recent breakthroughs in geometry complexity (with the HD 5xxx and GeForce 4xx series), I want to hope they'll make DX11-capable games that look almost like CGI cinematic footage. I'm thinking this because they have tessellation now, and within 1-2 generations of video cards (meaning about 2 years) raw power will not be an issue in creating extremely detailed models.
With every new generation of Radeon graphics cards I think the stream processor count has doubled, and the same goes for GeForce CUDA cores.
When will it be enough? Is it already enough, and all we need is to wait for the next-generation consoles to include the current-generation GPUs we find in PCs?
 
gamerk316

Frankly, rasterization has reached its limits. Sure, you can spruce it up, but it's gone about as far as it can graphically. And we don't have the processing power for real-time ray tracing yet (a rough sketch of the cost is below).

Personally, I'd rather see a decent physics engine (such as the one used in Backbreaker; YouTube it to get an idea of what I'm talking about) used in games for everything, not just limited interactions.
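
To make that concrete, here is a tiny toy example (my own sketch, not from any shipping engine or from the Cinema 2.0 demo): a brute-force CPU ray caster that fires one primary ray per pixel and tests it against every object in the scene. Even with no shading, shadows or bounces, the work is already pixels × objects intersection tests per frame, which is why the real thing needs so much horsepower. All names and numbers below are made up for illustration.

```cpp
// Minimal brute-force CPU ray caster: one primary ray per pixel, tested
// against every sphere in the scene. Purely illustrative -- real renderers
// add shading, shadows, bounces and acceleration structures on top.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    double dot(const Vec3& o) const { return x * o.x + y * o.y + z * o.z; }
};

struct Sphere { Vec3 center; double radius; };

// True if the ray (origin, unit direction) intersects the sphere.
// Ignores whether the hit is in front of the camera, for brevity.
bool hitSphere(const Vec3& origin, const Vec3& dir, const Sphere& s) {
    Vec3 oc = origin - s.center;
    double b = 2.0 * oc.dot(dir);
    double c = oc.dot(oc) - s.radius * s.radius;
    return b * b - 4.0 * c >= 0.0;  // discriminant of the quadratic
}

int main() {
    const int width = 640, height = 480;
    std::vector<Sphere> scene = {{{0, 0, -5}, 1.0}, {{2, 0, -7}, 1.5}};

    long hits = 0;
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            // Build a primary ray through this pixel (simple pinhole camera).
            double u = (x - width / 2.0) / width;
            double v = (y - height / 2.0) / height;
            double len = std::sqrt(u * u + v * v + 1.0);
            Vec3 dir = {u / len, v / len, -1.0 / len};
            for (const Sphere& s : scene)
                if (hitSphere({0, 0, 0}, dir, s)) { ++hits; break; }
        }
    }
    // Even this trivial pass costs width * height * scene.size() intersection
    // tests per frame; shadow rays, bounces and lights multiply that further.
    std::printf("%ld of %d primary rays hit something\n", hits, width * height);
    return 0;
}
```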
 

AliasedGamer

Distinguished
Jul 17, 2010
4
0
18,510
Thanks gamerk316 for mentioning rasterization and ray tracing.
I've done some research and I now understand much better than before what these rendering methods involve.
A really interesting article I found useful for understanding them is this:

http://software.intel.com/sites/billboard/archive/ray-tracing.php

QUOTE:
“Even 25 years or so ago,” Hurley continued, “when people implemented the first raster graphics algorithms, they were actually trying to implement the quality and the capabilities of ray tracing. We’ve reached a point where platform capabilities have evolved so that the two techniques can be on par, in terms of performance. From a content creator’s point of view, if they can use the technique that is physically accurate at the same performance level that they can use for approximation (and they don’t have to live with the artifacts that the approximation introduces), then it should be a no-brainer for them to want to move to ray tracing. It is not, however, because there is a lot of infrastructure invested in traditional raster graphics, so we have to win over their hearts and minds at a certain level.” - Jim Hurley, Intel Corporate Technology Group.

It is clear that real-time ray tracing requires enormous amounts of computing power, but we're getting there (a rough back-of-envelope estimate is below, after the links).
First we had the HD 4870, I believe, which broke the 1 TFLOPS barrier; in the next generation we have the roughly 5 TFLOPS HD 5970 (some say 4.64), and I think Nvidia is pretty close, at about 1.35 TFLOPS with the GTX 480.
I hope in about 10 years we'll reach the capabilities of this MONSTER (used in the auto industry, I've read):

http://www.geek.com/articles/chips/japanese-firm-hits-800-teraflops-with-new-ray-tracing-gpu-2009077/
and then we'll finally have real-time, high-quality ray tracing in our homes, like this:

http://11k2.files.wordpress.com/2009/03/090309raytracing.jpg
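
Just to put those TFLOPS figures in perspective, here is a rough back-of-envelope estimate. The 10 rays per pixel and 20,000 floating-point operations per ray are assumptions I picked purely for illustration (the real cost depends heavily on the scene and the acceleration structure), and peak FLOPS are never fully usable in practice, so treat the result as an order-of-magnitude guess at best.

```cpp
// Back-of-envelope estimate of the compute needed for real-time ray tracing
// at 1080p/60. The rays-per-pixel and ops-per-ray numbers are assumptions
// chosen for illustration, not measured values.
#include <cstdio>

int main() {
    const double width = 1920, height = 1080, fps = 60;
    const double raysPerPixel = 10;     // primary + shadow + bounce rays (assumed)
    const double flopsPerRay  = 20000;  // assumed per-ray cost, see note above

    double raysPerSecond = width * height * fps * raysPerPixel;
    double flopsNeeded   = raysPerSecond * flopsPerRay;

    std::printf("rays/s needed : %.2e\n", raysPerSecond);  // ~1.2e9
    std::printf("FLOPS needed  : %.2e\n", flopsNeeded);    // ~2.5e13, i.e. ~25 TFLOPS

    // Peak figures quoted in this thread, in FLOPS:
    const double hd5970 = 4.64e12, gtx480 = 1.35e12;
    std::printf("HD5970 shortfall: %.0fx\n", flopsNeeded / hd5970);
    std::printf("GTX480 shortfall: %.0fx\n", flopsNeeded / gtx480);
    return 0;
}
```

Under those (fairly generous) assumptions the gap is a handful of generations of raw throughput, which matches the feeling that we're getting there, but not soon.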



The first real-time ray-traced attempts look (of course) low quality and are too slow to be of use in games. They also mostly handle reflections rather than indirect and global illumination, and they are not purely ray traced but rendered in combination with rasterization (a tiny sketch of the hybrid idea follows the links below).

PS3 ray tracing:
http://www.youtube.com/watch?v=oLte5f34ya8

Nvidia ray tracing:
http://www.youtube.com/watch?v=pt1jKQpcAXM
http://www.youtube.com/watch?v=5tiBG7yI-Yo

Somewhat real-time raytracing showing just how much more computing power we need:
http://www.youtube.com/watch?v=vx_wO3ZTSjU

Intel ray tracing:
http://www.youtube.com/watch?v=blfxI1cVOzU

More on hybrid raytracing:
http://www.geeks3d.com/20081023/interactive-ray-tracing-with-cuda-and-opengl/
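
Here is a minimal sketch of what "hybrid" means in those links, as I understand it: a normal raster pass produces a G-buffer (normal, view direction, material per pixel), and only the pixels flagged as reflective spawn a secondary ray using the mirror-reflection formula R = D - 2(D·N)N. The code below is my own illustrative toy, not taken from any of the demos; the raster pass is stubbed out as a pre-filled array.

```cpp
// Sketch of a hybrid pipeline: primary visibility from a (stubbed) raster
// pass, secondary reflection rays only for the pixels that need them.
#include <cstdio>
#include <vector>

struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
    double dot(const Vec3& o) const { return x * o.x + y * o.y + z * o.z; }
};

// One G-buffer entry per pixel, as a rasterizer would produce it.
struct GBufferPixel { Vec3 normal; Vec3 viewDir; bool reflective; };

// Perfect mirror reflection: R = D - 2(D.N)N
Vec3 reflect(const Vec3& d, const Vec3& n) {
    return d - n * (2.0 * d.dot(n));
}

int main() {
    // Pretend the raster pass produced two pixels: one matte, one mirror.
    std::vector<GBufferPixel> gbuffer = {
        {{0, 1, 0}, {0, -0.7071, 0.7071}, false},
        {{0, 1, 0}, {0, -0.7071, 0.7071}, true},
    };

    for (int i = 0; i < (int)gbuffer.size(); ++i) {
        if (!gbuffer[i].reflective) continue;   // keep the rasterized shading
        Vec3 r = reflect(gbuffer[i].viewDir, gbuffer[i].normal);
        // A real hybrid renderer would now trace r against the scene
        // (e.g. in CUDA, as in the geeks3d article) and blend the result in.
        std::printf("pixel %d: reflection ray dir (%.3f, %.3f, %.3f)\n",
                    i, r.x, r.y, r.z);
    }
    return 0;
}
```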

Of course a more advanced physics algorithm is also necessary, and it must be applied to everything, from blades of grass to every item of clothing a character is wearing.
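
As a toy illustration of what "physics on everything" could mean at the lowest level: treat every simulated thing (a grass node, a cloth vertex, a ragdoll limb) as a particle and step them all with the same cheap integrator every frame. This is my own minimal sketch with arbitrary constants, not any particular engine's approach.

```cpp
// Every simulated object is a particle advanced by the same semi-implicit
// Euler step each frame: gravity plus a crude bounce off the ground plane.
// Constants are arbitrary example values.
#include <cstdio>
#include <vector>

struct Particle { double y, vy; };   // height (m) and vertical velocity (m/s)

int main() {
    const double dt = 1.0 / 60.0;    // one 60 Hz frame
    const double gravity = -9.81;

    std::vector<Particle> objects(3, Particle{2.0, 0.0});   // start 2 m up

    for (int frame = 0; frame < 120; ++frame) {              // two seconds
        for (Particle& p : objects) {
            p.vy += gravity * dt;                             // integrate velocity
            p.y  += p.vy * dt;                                // integrate position
            if (p.y < 0.0) { p.y = 0.0; p.vy *= -0.5; }       // bounce, lose energy
        }
    }
    std::printf("object 0 height after 2 s: %.3f m\n", objects[0].y);
    return 0;
}
```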