
Realistic rendering on an Ultra 8800

Last response: in Graphics & Displays
May 8, 2007 9:04:06 AM

Pretty good, though I doubt it is realtime... :p
May 8, 2007 9:07:42 AM

I doubt it too... but we're getting there. I'd like to see what the ATI card can do... hell, I'd settle for just seeing the ATI card on the market.
May 8, 2007 9:40:09 AM

By the looks of that realtime lighting demo that was posted, it could easily do the lighting in realtime...
May 8, 2007 12:06:42 PM

The problem is that there isn't a clear definition of what you can call realtime. Some aspects of the scene might have been pre-rendered, or specific optimisations pre-calculated... If that isn't the case (if the algorithm got the scene directly and started rendering the primitives with global illumination calculations and skin shaders...), then it is indeed great. Again, I have no idea!
May 8, 2007 12:10:47 PM

Yeah, you can only go by what they claim, but on the accompanying audio track the rep says there was no pre-calculation.
May 8, 2007 1:03:55 PM

I'm as unimpressed as I have always been with Nvidia's tech demos; sure, that one head is being rendered in real time, but I personally would rather see something like Crysis as a display of an 8800's power.
May 8, 2007 1:25:10 PM

Quote:
Pretty good, though I doubt it is realtime... :p


You must not have an 8800GTX...

Doing that in realtime is not that big a deal. If you have ever seen the Adrianne demo (a realistic runway model rendered in realtime), this head model is no more polygon-intensive than that is.
May 8, 2007 1:25:15 PM

That is so true. So what if it can render a face and have it running smoothly; will I be able to run Alan Wake and Crysis on it? I mean, at full settings with a minimum of 60 fps?
May 8, 2007 1:27:41 PM

There's nothing bad about that... just remember that everything you now see (or will see) in Crysis and other games was once demoed in a video or picture somewhere as part of a research effort. You're probably unimpressed because you don't get the details of what the GFX card is doing, and the algorithms involved in rendering true skin shaders.
May 8, 2007 1:39:08 PM

:roll: Nobody knows... but the demos of Alan Wake look fine from what I have seen.
May 8, 2007 1:42:33 PM

Looking forward to the day when in-game graphics make that look bad. Till then I will keep on dreaming.
May 8, 2007 2:07:55 PM

Quote:
There's nothing bad about that... just remember that everything you now see (or will see) in Crysis and other games was once demoed in a video or picture somewhere as part of a research effort. You're probably unimpressed because you don't get the details of what the GFX card is doing, and the algorithms involved in rendering true skin shaders.
You have the audacity to tell me I don't understand what they're doing? Nvidia is notorious for releasing tech demos and overexaggerating the power of their hardware. The GeForce 8800 Ultra is nowhere near capable of rendering a game at that quality.
May 8, 2007 2:25:12 PM

Audacity?? What am I supposed to be, scared of you or something? If you think what I'm saying is wrong, correct it! I am defending neither Nvidia nor ATI. What I said is that if you've ever tried to apply a realistic shader in realtime, you'd see that such a demo running in realtime is a step forward.
As to whether an 8800 Ultra is capable of rendering such details in a game, sure it can't. But if you had read the post, I was saying that even this simple scene may not have been rendered in realtime.
To wrap up this subject, Nvidia is also notorious for making research-oriented (as well as game-oriented) GFX cards (which is why they dominate the workstation area), with better support for Linux and OpenGL. I suspect that ATI has taken that path too (considering the number of shaders in their card, and the "weak" FPS compared to such an enormous count).
May 8, 2007 2:37:04 PM

Quote:
Pretty good, though I doubt it is realtime... :p


You must not have an 8800GTX...

Doing that in realtime is not that big a deal. If you have ever seen the Adrianne demo (a realistic runway model rendered in realtime), this head model is no more polygon-intensive than that is.
It's not a matter of being polygon-intensive or not... it's about rendering skin shaders. What this demo means is that animated movies will gain a notch in realism. In 4 or 5 years, we may even start seeing such characters inside games.
Oh, and I don't have an 8800GTX and probably never will. I mostly work on workstation versions of the graphics engines.
May 8, 2007 3:02:17 PM

Aren't skin shaders in Shader Model 4.0? Wouldn't that just be showing off those shaders... meaning it is doing exactly what they said it would do?

Also, nowadays programmers aren't worried about polygons; that is why they started using bump maps, which give the impression of more polygons than there actually are. However, I would like to start seeing a good (and well-supported) vector-based GPU... imagine a sphere that is actually rendered round, not just the illusion of it.
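The bump-map trick mentioned above can be sketched in a few lines: the geometry stays flat, but the normal used for lighting is perturbed by the gradient of a height map, so shading implies detail that was never modelled. This is a minimal illustration, not any particular engine's implementation; the height-map values are made up.

```python
import math

def bump_normal(heightmap, x, y, strength=1.0):
    """Perturb a flat surface normal (0, 0, 1) using finite-difference
    gradients of a height map. The mesh never changes; only the normal
    used for lighting does, which is what makes a flat quad look bumpy."""
    dhdx = (heightmap[y][x + 1] - heightmap[y][x - 1]) / 2.0
    dhdy = (heightmap[y + 1][x] - heightmap[y - 1][x]) / 2.0
    nx, ny, nz = -dhdx * strength, -dhdy * strength, 1.0
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)

def lambert(normal, light_dir):
    """Basic diffuse term: brightness follows the perturbed normal."""
    return max(sum(n * l for n, l in zip(normal, light_dir)), 0.0)

# A tiny height map sloping upward along y: the surface is flat,
# but the lighting darkens as if it were tilted away from the light.
hm = [[0, 0, 0], [1, 1, 1], [2, 2, 2]]
n = bump_normal(hm, 1, 1)
print(round(lambert(n, (0.0, 0.0, 1.0)), 3))  # → 0.707
```

A real GPU does the same thing per pixel in a fragment shader, sampling the height (or a precomputed normal map) from a texture.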
May 8, 2007 3:36:59 PM

Quote:
Aren't skin shaders in Shader Model 4.0? Wouldn't that just be showing off those shaders... meaning it is doing exactly what they said it would do?

Also, nowadays programmers aren't worried about polygons; that is why they started using bump maps, which give the impression of more polygons than there actually are. However, I would like to start seeing a good (and well-supported) vector-based GPU... imagine a sphere that is actually rendered round, not just the illusion of it.


I would actually like to see what a vector GPU at the 8800's speeds could do. Ironically, that is what limited the PlayStation 2's GPU; it was designed for vector graphics rendering.
May 8, 2007 5:03:39 PM

Quote:
Aren't skin shaders in Shader Model 4.0? Wouldn't that just be showing off those shaders... meaning it is doing exactly what they said it would do?

Also, nowadays programmers aren't worried about polygons; that is why they started using bump maps, which give the impression of more polygons than there actually are. However, I would like to start seeing a good (and well-supported) vector-based GPU... imagine a sphere that is actually rendered round, not just the illusion of it.


If you look at the wireframe mode in the Adrianne demo, there is a very high polygon count. Bump maps add more realism to low-polygon models, but in order to have a large number of models, even if they are low-polygon individually (like in a game with many, many objects), you still need to be able to push large numbers of polygons as a whole.
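The arithmetic behind that point is simple: even if every model is "low-poly", total throughput is models × triangles × frames per second. The numbers below are purely hypothetical, just to show the scale.

```python
def triangles_per_second(models, tris_per_model, fps):
    """Total triangle throughput the GPU must sustain per second
    when every model in the scene is drawn every frame."""
    return models * tris_per_model * fps

# Hypothetical scene: 500 "low-poly" objects of 2,000 triangles each
# at 60 fps still works out to 60 million triangles per second.
print(triangles_per_second(500, 2000, 60))  # → 60000000
```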
May 8, 2007 5:40:36 PM

Quote:
You have the audacity to tell me I don't understand what they're doing?


LOL! :wink:
May 8, 2007 6:38:46 PM

Quote:
What i said is that if you've ever tried to apply a realistic shader before in realtime, you'd see that such a demo in realtime is a step forward.


How is this a heavy load for the shaders? It seems like some pretty high-res texture maps, perhaps with some displacement mapping thrown in, with some lighting over top. The skin itself is pretty smooth and featureless; hair would be more impressive.
It's not as impressive as it seems if all it is is a higher-resolution map due to added 8x8 support.

Sure, it's somewhat impressive from a lighting perspective, but even then it's a pretty basic object to light; hair would be harder, and there's no diffusion, and the collision path is pretty linear.

The skin shading can be done through the same colour-gradient mapping from the PS1.4 era, just with a larger range now.
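The colour-gradient mapping mentioned here can be sketched as a 1-D ramp lookup: instead of using the Lambert term directly as brightness, it indexes into a small colour ramp, so shadowed skin picks up a warm tint instead of going grey. This is a generic illustration of the technique, not the demo's actual shader; the ramp colours are invented.

```python
def ramp_lookup(ramp, t):
    """Sample a 1-D 'texture' (a list of RGB tuples) at t in [0, 1],
    linearly interpolating between adjacent entries."""
    t = min(max(t, 0.0), 1.0)
    f = t * (len(ramp) - 1)
    i = int(f)
    if i >= len(ramp) - 1:
        return ramp[-1]
    frac = f - i
    return tuple(a + (b - a) * frac for a, b in zip(ramp[i], ramp[i + 1]))

def skin_diffuse(n_dot_l, ramp):
    """Gradient-mapped diffuse: remap the Lambert term through a ramp
    so the terminator gains a reddish tint rather than fading to grey."""
    t = n_dot_l * 0.5 + 0.5  # "wrap" the range so grazing light still indexes the ramp
    return ramp_lookup(ramp, t)

# Hypothetical ramp: dark red in shadow -> warm mid tone -> pale highlight.
ramp = [(0.2, 0.05, 0.05), (0.8, 0.4, 0.3), (1.0, 0.9, 0.85)]
print(skin_diffuse(0.0, ramp))  # terminator samples the warm middle entry
```

In a pixel shader the ramp would be an actual 1-D texture fetch, which is why this was already feasible in the PS1.4 era.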

While it's visually nice, I'm not sure it's technically impressive unless there are other things going on that they didn't mention.

Also, it would be easier to be critically objective if it were a high-resolution video that allowed you to see the subtleties that are the impressive parts of things like displacement mapping.

BTW, bump mapping is for kids. :twisted:
May 9, 2007 1:10:59 AM

Yeah, but isn't the whole thing about vector graphics that it can be scaled to an unlimited degree, and the only pixelation would come from the device it was rendered to? So it would do away with the performance hits at higher resolutions. Just seems like a good idea.
May 9, 2007 1:22:34 AM

The problem gets to be the number of vectors rendered. The bigger the screen space, the more pixels they have to be rendered to.
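That trade-off is easy to make concrete: a vector shape's description is resolution-independent, but its fill cost is not, because every covered pixel still has to be shaded. The resolutions below are just example figures.

```python
def fill_cost(width, height, coverage=1.0):
    """Pixels that must be shaded per frame for a shape covering a
    given fraction of the screen. The vector *description* doesn't
    grow with resolution, but this fill cost does."""
    return int(width * height * coverage)

# Doubling each screen dimension quadruples the pixels to shade,
# whether the source geometry is triangles or exact curves.
print(fill_cost(1024, 768), fill_cost(2048, 1536))  # → 786432 3145728
```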
May 9, 2007 2:27:36 AM

That looks like the most boring game ever.