Why do in-game cut-scenes/trailers look so good graphically?

Status
Not open for further replies.

xtcx

Distinguished
Jan 7, 2012
104
1
18,695
hi,
I have always wondered about this whenever I watch a game trailer. It looks splendid and awesome, with graphics that almost mimic real life. But when I play the game at max settings, I don't see the vivid graphics shown in the title/cut-scenes. What's happening? I play on an Asus GTX 560 OC, which is not such a bad card that it should hide details that show up fine in the trailers. Take GTA IV: the in-game cutscenes look superb. And in DiRT 3, the in-game cutscenes look far more real than anything I can actually play at ultra settings. Why is this? Do they add bloom/glow effects or something to make them look so vivid?

Or is my GPU just powerless :pfff:, or is it all custom-made video playback just for gamer attraction :love:?

Take any game, racing or adventure, and you will always see the difference.
 
Solution
Some cutscenes use the in-game graphics/engine; others are just movie clips, pre-rendered as others have said. Game graphics are rendered in real time, calculating polygons, textures, lighting, effects, etc., while a movie has already been rendered out as a sequence of finished images. Pre-rendered footage can pile on extra effects, more complicated realistic lighting, and high-poly models, all of which greatly increase render times. Pixar's latest released movie (Cars 2) took an average of 11.5 hours per frame on a render farm of over 12,000 cores. Movie renderers are primarily CPU-based, as GPUs can't handle the more complex calculations. So you can imagine how much is cut out of games to get you at least 30 frames per second.
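
To put rough numbers on that gap (taking the 11.5 hours per frame figure above at face value; the 30 fps target and everything else here is just illustrative arithmetic), a quick Python back-of-envelope comparison:

# Rough, illustrative numbers only: the offline figure is the Cars 2 average
# quoted above; the real-time figure assumes a 30 fps target.
offline_seconds_per_frame = 11.5 * 3600      # ~41,400 s per frame on the render farm
realtime_seconds_per_frame = 1.0 / 30.0      # ~0.033 s per frame in a game
ratio = offline_seconds_per_frame / realtime_seconds_per_frame
print(f"An offline render gets roughly {ratio:,.0f}x more time per frame")
# -> on the order of a million times more time per frame, which is why so much
#    lighting and geometry detail has to be cut from the real-time version.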

xtcx

Distinguished
Jan 7, 2012
104
1
18,695
What do you mean by "pre-rendered" here?
Do you mean adding more shadows, bloom, blur, etc.?
And do you think that is possible with the current-generation shader models used by all our GPUs?
 

joedjnpc

Distinguished
Nov 4, 2011
296
0
18,810
It's pre-rendered, and if you've played a game like Deus Ex: Human Revolution at full settings in 1920 x 1080, you'll realise that pre-rendered cutscenes are sometimes a pain in the ass. Going from a lovely, sharp game to an upscaled "movie" just looks bad.
 

sonnyd09

Distinguished
May 8, 2010
61
0
18,630
Basically, the developer renders the video on a server with lots of CPUs, and makes the textures, the shadows, and everything else really high-res.

Making a video look extremely lifelike (such as a game cutscene) takes a huge amount of processing power, which is not available even on today's top-end PCs.
 

sonnyd09

Distinguished
May 8, 2010
61
0
18,630


Yeah, it's a pain!
 

janiashvili

Distinguished
Dec 8, 2011
75
0
18,640

Haha, it is available, in the form of Nvidia Teslas.

 

thespieler

Distinguished
Nov 20, 2011
127
0
18,690



Rendering 3D graphics like those seen in Cars would take horribly long on CPUs alone and waste a lot of power. Most likely Pixar uses a cluster of servers with Intel Xeons and Nvidia Quadro or Tesla GPUs. Nvidia has a specific line of cards for 3D animation; even the best tri-SLI GTX 580 setup can't beat one Nvidia Quadro at rendering a 3D scene. Less time rendering = more money saved on production.
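
To give a sense of why render time translates so directly into money (the film length and frame rate below are assumed values, and the 11.5 hours per frame is just the farm-wide average quoted earlier in the thread), here is a rough Python estimate:

# Hypothetical back-of-envelope: assume a ~100-minute film at 24 fps and take
# the 11.5 hours-per-frame average quoted earlier at face value.
minutes, fps = 100, 24
frames = minutes * 60 * fps                   # ~144,000 frames
hours_per_frame = 11.5
total_hours = frames * hours_per_frame        # ~1.66 million render-hours
years = total_hours / (24 * 365)
print(f"{frames:,} frames -> {total_hours:,.0f} render-hours (~{years:.0f} years on a single machine)")
# Any speed-up in the renderer scales across every one of those frames.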
 

sonnyd09

Distinguished
May 8, 2010
61
0
18,630


Am I right in thinking that using one of those 3D Animation cards for gaming would cause horrible performance due to the different calculation types?
 

thespieler

Distinguished
Nov 20, 2011
127
0
18,690

Yes, an Nvidia Quadro is horrible for gaming. It's made for the specific job of rendering lighting effects and other related technologies, and I believe they also have a low core clock, which contributes to the poor gaming performance.
 
Now we are going completely off topic, but to clarify:

Pixar uses RenderMan, which they created themselves and which can have GPU acceleration. A render farm is a cluster of computers or servers, much like a supercomputer. Of course the pros are going to use server components for reliability and stability. The exact specs of Pixar's render farm are undisclosed, but the rendering is still mainly CPU-based because of the GPU shortcomings stated earlier. I never said it was just CPUs; I said mainly CPUs. This is my field, and I am well aware of the limitations of GPU vs CPU rendering, and of how the supercomputers and render farms of the world have made huge strides in efficiency with GPU acceleration ever since the move away from pixel pipelines to CUDA cores/stream processors.

Quadros are not that bad for gaming. I've done it, and performance is a bit lower than a GeForce card with an equivalent number of CUDA cores and an equivalent clock, due to drivers. It's the same calculation method; math is math no matter how you put it, and light has exactly the same formula for calculating its trajectory. But Quadro and GeForce are made for different fields, and each handles different software more efficiently than the other (like how the GTX is poor with OpenGL). A GeForce can beat a Quadro in rendering; it depends on the renderer. If it's CUDA-based, CUDA is exactly the same on a GTX or a Quadro, and a GTX usually offers far more GFLOPS at a fraction of the price. You also need to take memory usage into account: if you run out of VRAM, a render will suffer greatly. I could go on about GeForce vs Quadro vs Tesla and how they differ, but I will stop here.
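
On the VRAM point, here is a rough Python sketch (the texture counts, sizes, and the 1 GB card are made-up example values) of how quickly uncompressed texture memory alone can exceed a card's VRAM:

# Hypothetical scene: estimate uncompressed texture memory versus available VRAM.
bytes_per_texel = 4                                       # RGBA8, uncompressed
textures = [(4096, 4096)] * 50 + [(2048, 2048)] * 200     # example texture set
texture_bytes = sum(w * h * bytes_per_texel for w, h in textures)
vram_bytes = 1 * 1024**3                                  # e.g. a 1 GB card like a GTX 560
print(f"Textures alone: {texture_bytes / 1024**2:,.0f} MB vs {vram_bytes / 1024**2:,.0f} MB of VRAM")
# Once the working set no longer fits in VRAM, the GPU renderer has to page
# data over the PCIe bus (or fails outright), and render times suffer badly.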
 

xtcx

Distinguished
Jan 7, 2012
104
1
18,695


Good explanation. Do you mean to say that Quadros are only meant for 3D animation and rendering? I remember once using Maya for a 3D design, and I found that my Nvidia 8800 GT started to struggle a lot as I added more and more polygons to the objects. I used FRAPS to check the FPS, and it dropped drastically as the polygon count rose and as I added more objects to the scene.

I thought that was a limitation of the GPU/tool, but could it have been better if I had used a Quadro? I don't remember much about the rendering itself, but it consumed 100% of my CPU the whole way through and took a hefty amount of time for a 15-second animation at 20 FPS :pt1cable:.

 
It's like cars vs trucks: you could hook a trailer up to a car, but a truck is going to be better for hauling. They're capable of the same work but are made for different purposes. Especially in viewports, Quadros fare much better than GeForce cards. That said, soft-modding a GTX into a Quadro will put it almost on par with the card you're soft-modding it into. The default Maya renderer and mental ray are CPU-only.
 