Closed

Why do in-game cutscenes/trailers look so much better than actual gameplay?

Last response in Graphics & Displays
February 2, 2012 10:27:19 AM

hi,
I have always wondered when seeing a game trailor!!!. It really looks splendid and awesome. Almost graphics mimic real life. But when playing games at max settings, I could not see vivid graphics which is being shown in title\cut-scenes. What's happening?. I play using Asus GTX560 OC, which is not such a bad card to hide certain parameters which are shown well in title. Lets take GTA4, the in-game cutscenes really look superb. where as in DiRT3, I see the in-game cutscenes which are so so real than what I could actually play at ultra settings. Why is this?. Do they add any glooming effects or so to make it look vivid?

Or is it that my GPU is powerless :pfff: , or is it all custom-made video playback just for gamer attraction :love: ?

Just take any game, racing or adventure; you can always see the difference.
February 2, 2012 10:30:24 AM

It's because some game developers "pre-render" their cutscenes. Others use the in-game engine for their cutscenes, and vice versa. :)
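To illustrate the distinction, here's a toy sketch of the two strategies; all the names and fields below are made up for illustration, not from any real engine:

```python
# Toy sketch of the two cutscene strategies described above. Names and
# fields are hypothetical; real engines are far more involved.

def play_cutscene(scene: dict) -> str:
    if scene["prerendered"]:
        # Pre-rendered: just decode a video file. It looks as good as the
        # studio's render farm could make it, regardless of the player's GPU.
        return f"decode video: {scene['video']}"
    # In-engine: render every frame live, limited by the player's hardware
    # and the same graphics settings as regular gameplay.
    return f"real-time render: {scene['name']}"

print(play_cutscene({"prerendered": True, "video": "intro.bik"}))
print(play_cutscene({"prerendered": False, "name": "garage_scene"}))
```

The upside of the pre-rendered path is obvious quality; the downside, as mentioned below, is that it can look like an upscaled movie next to sharp native-resolution gameplay.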
February 2, 2012 10:36:33 AM

What do you mean by "pre-render" here?
You mean like adding more shadows, bloom, blur, etc.?
But do you think this is possible with the current-gen shader models used by all our GPU cards?
February 2, 2012 10:38:36 AM

It's pre-rendered, and if you've played a game like Deus Ex: HR at full settings, 1920 x 1080, then you'll realise that pre-rendered cutscenes are sometimes a pain in the ass. Going from a lovely sharp game to an upscaled "movie" just looks bad.
February 2, 2012 10:39:33 AM

Basically, the developer renders the video on a server with lots of CPUs, and they make the textures really high-res, the shadows really high-res, everything like that.

Making a video look extremely lifelike (such as a game cutscene) takes a lot of processing power, which is not available even on today's top-end PCs.
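For illustration, here's a minimal sketch of why a server with lots of CPUs helps: offline frames are independent of each other, so a farm can render many at once. Everything here (function names, frame naming, thread workers standing in for farm nodes) is hypothetical:

```python
# Toy sketch of why render farms parallelize well: each offline frame is
# independent, so frames can be handed out to many workers at once. Here
# threads stand in for farm machines; names are illustrative only.
from concurrent.futures import ThreadPoolExecutor

def render_frame(frame_number: int) -> str:
    # Stand-in for what would be hours of work per frame on a real farm.
    return f"frame_{frame_number:04d}.png"

def render_sequence(first: int, last: int, workers: int = 8) -> list[str]:
    # Farm the frame range out to the worker pool and collect results in order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(render_frame, range(first, last + 1)))

# A 2-second shot at 24 fps is 48 independent frames.
frames = render_sequence(1, 48)
print(frames[0], "...", frames[-1])
```

Because the work splits cleanly per frame, throwing more machines at a cutscene shortens wall-clock time almost linearly, which a game running live on one PC can never do.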
February 2, 2012 10:51:26 AM

joedjnpc said:
It's pre-rendered, and if you've played a game like Deus Ex: HR at full settings, 1920 x 1080, then you'll realise that pre-rendered cutscenes are sometimes a pain in the ass. Going from a lovely sharp game to an upscaled "movie" just looks bad.


Yeah it's a pain!
February 2, 2012 5:00:33 PM

sonnyd09 said:
Basically, the developer renders the video on a server with lots of CPUs, and they make the textures really high-res, the shadows really high-res, everything like that.

Making a video look extremely lifelike (such as a game cutscene) takes a lot of processing power, which is not available even on today's top-end PCs.

haha, it's available: Nvidia Teslas

February 2, 2012 5:01:58 PM

Yeah, but no one needs to know that ;D

Best solution

February 2, 2012 6:12:55 PM

Some cutscenes use the in-game graphics/engine; others are just movie clips, pre-rendered as others have said. Game graphics are rendered in real time, calculating polys, textures, lighting, effects, etc., while movies are already rendered as a sequence of single images. Movies have a lot of extra effects, more complicated realistic lighting, and high-poly models, which greatly increase render times. Pixar's latest released movie (Cars 2) took an average of 11.5 hours per frame on a render farm of over 12,000 cores. Movie renderers are primarily CPU-based, as GPUs can't handle the more complex calculations. So you can imagine how much is cut out of games to get you at least 30 frames per second.
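To put the numbers in that answer into perspective, here's a quick back-of-envelope calculation (the 11.5 hours/frame figure comes from the post above; the rest is plain arithmetic):

```python
# Rough comparison of real-time vs offline per-frame time budgets.

REALTIME_FPS = 30
frame_budget_ms = 1000 / REALTIME_FPS            # ~33.3 ms per game frame

offline_hours_per_frame = 11.5                   # Cars 2 average, per the post
offline_ms_per_frame = offline_hours_per_frame * 3600 * 1000

# How many times more compute time the offline frame gets:
ratio = offline_ms_per_frame / frame_budget_ms
print(f"Real-time budget: {frame_budget_ms:.1f} ms/frame")
print(f"Offline render:   {offline_ms_per_frame:,.0f} ms/frame")
print(f"Ratio: ~{ratio:,.0f}x more time per offline frame")
```

So an offline movie frame gets on the order of a million times more compute time than a 30 fps game frame, before even counting the render farm's extra hardware. That gap is exactly what pre-rendered cutscenes exploit.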
February 2, 2012 6:44:37 PM

k1114 said:
Some cutscenes use the in-game graphics/engine; others are just movie clips, pre-rendered as others have said. Game graphics are rendered in real time, calculating polys, textures, lighting, effects, etc., while movies are already rendered as a sequence of single images. Movies have a lot of extra effects, more complicated realistic lighting, and high-poly models, which greatly increase render times. Pixar's latest released movie (Cars 2) took an average of 11.5 hours per frame on a render farm of over 12,000 cores. Movie renderers are primarily CPU-based, as GPUs can't handle the more complex calculations. So you can imagine how much is cut out of games to get you at least 30 frames per second.



Rendering 3D graphics like those seen in Cars would take horribly long using CPUs and would waste a lot of power. Most likely Pixar uses a cluster of servers with Intel Xeons and Nvidia Quadro or Tesla GPUs. Nvidia has a specific line of cards for 3D animation; even the best tri-SLI GTX 580 setup can't beat one Nvidia Quadro at rendering a 3D scene. Less time rendering = more money saved on production.
February 2, 2012 6:46:30 PM

thespieler said:
Rendering 3D graphics like those seen in Cars would take horribly long using CPUs and would waste a lot of power. Most likely Pixar uses a cluster of servers with Intel Xeons and Nvidia Quadro or Tesla GPUs. Nvidia has a specific line of cards for 3D animation; even the best tri-SLI GTX 580 setup can't beat one Nvidia Quadro at rendering a 3D scene. Less time rendering = more money saved on production.


Am I right in thinking that using one of those 3D animation cards for gaming would cause horrible performance due to the different calculation types?
February 2, 2012 6:50:16 PM

sonnyd09 said:
Am I right in thinking that using one of those 3D animation cards for gaming would cause horrible performance due to the different calculation types?

Yes, an Nvidia Quadro is horrible for gaming; it's made for the specific job of rendering lighting effects and other related technologies. I believe they also have a low core clock, which also contributes to their poor gaming performance.
February 2, 2012 11:33:38 PM

Now we are going completely off topic, but to clarify:

Pixar uses RenderMan, which they created themselves and which can have GPU acceleration. A render farm is a cluster of computers or servers, much like a supercomputer. Of course the pros are going to use server components for reliability/stability. The exact specs of Pixar's render farm are undisclosed, but the rendering is still mainly CPU, because of the earlier shortcomings of GPU calculations I mentioned. I never said it was just CPUs; I said mainly CPUs. This is my field, and I am well aware of the limitations of GPU vs CPU rendering, and of how the supercomputers/render farms of the world have made huge strides in efficiency with GPU acceleration ever since the change away from pixel pipelines to CUDA cores/stream processors.

Quadros are not that bad for gaming; I've done it, and performance is a bit lower than a GeForce card with an equivalent CUDA core count and clock, due to drivers. It's the same calculation methods; math is math no matter how you put it, and light has the same exact formula for calculating its trajectory. But Quadro and GeForce are made for different fields, and each handles different software more efficiently than the other (like how GTX is horrible with OpenGL). A GeForce can beat a Quadro in rendering; it depends on the renderer. If it's CUDA-based, CUDA is exactly the same on GTX or Quadro, and GTX cards usually offer far more gflops at a fraction of the price. You also need to take memory usage into account: if you run out of VRAM, a render will suffer greatly. I could go on about GeForce vs Quadro vs Tesla and how they differ, but I will stop here.
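As a side note on the VRAM point, here is a rough, purely illustrative estimate of how quickly uncompressed textures eat video memory; the formula (RGBA8 texels plus a one-third mip-chain overhead) and the numbers are my own assumptions, not from any particular renderer:

```python
# Back-of-envelope VRAM estimate for one uncompressed texture, showing
# why film-quality assets overflow a 1 GB gaming card of this era.

def texture_bytes(width: int, height: int, bytes_per_texel: int = 4,
                  mipmaps: bool = True) -> int:
    base = width * height * bytes_per_texel   # e.g. RGBA8 = 4 bytes/texel
    # A full mip chain adds roughly one third on top of the base level.
    return base * 4 // 3 if mipmaps else base

one = texture_bytes(4096, 4096) / 2**20
print(f"{one:.0f} MiB per 4096x4096 RGBA texture with mips")
```

At roughly 85 MiB each, about a dozen such textures would fill the 1 GB of a GTX 560, which is why real-time assets get compressed and downsized while offline renders don't have to care.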
February 3, 2012 4:26:18 AM

k1114 said:
Now we are going completely off topic, but to clarify:

Pixar uses RenderMan, which they created themselves and which can have GPU acceleration. A render farm is a cluster of computers or servers, much like a supercomputer. Of course the pros are going to use server components for reliability/stability. The exact specs of Pixar's render farm are undisclosed, but the rendering is still mainly CPU, because of the earlier shortcomings of GPU calculations I mentioned. I never said it was just CPUs; I said mainly CPUs. This is my field, and I am well aware of the limitations of GPU vs CPU rendering, and of how the supercomputers/render farms of the world have made huge strides in efficiency with GPU acceleration ever since the change away from pixel pipelines to CUDA cores/stream processors.

Quadros are not that bad for gaming; I've done it, and performance is a bit lower than a GeForce card with an equivalent CUDA core count and clock, due to drivers. It's the same calculation methods; math is math no matter how you put it, and light has the same exact formula for calculating its trajectory. But Quadro and GeForce are made for different fields, and each handles different software more efficiently than the other (like how GTX is horrible with OpenGL). A GeForce can beat a Quadro in rendering; it depends on the renderer. If it's CUDA-based, CUDA is exactly the same on GTX or Quadro, and GTX cards usually offer far more gflops at a fraction of the price. You also need to take memory usage into account: if you run out of VRAM, a render will suffer greatly. I could go on about GeForce vs Quadro vs Tesla and how they differ, but I will stop here.


Good explanation. Do you mean to say that Quadros are only meant for 3D animation and rendering? I remember once using the Maya tool for a 3D design, and the Nvidia 8800GT started to struggle a lot as I added more and more polygons to the objects. I used FRAPS to check the FPS; it dropped drastically as the polygon count increased and as I added more objects to the scene.

I thought it was a downside of the GPU card/tool, but would it have improved if I had used a Quadro GPU? I don't remember much about the rendering, but it consumed 100% of my CPU all throughout rendering and took a hefty amount of time for a 15-second animation at 20 FPS :pt1cable: .

February 3, 2012 3:39:10 PM

It's like cars vs trucks: you could hook up a trailer to a car, but the truck is going to be better for hauling. They're capable of the same work but are made for different purposes. Especially in viewports, Quadros fare much better than GeForce cards. But you will find that soft-modding a GTX to a Quadro puts it almost on par with the card you're modding it to. The default Maya renderer and mental ray are CPU-only.
February 4, 2012 4:05:22 PM

Best answer selected by xtcx.
February 4, 2012 11:06:55 PM

This topic has been closed by Mousemonkey