Photorealistic Game Engine Tech Demos

The Human Race (Unreal)

At this year's GDC, Epic Games showed off the power of Unreal Engine once again, this time in partnership with Chevrolet. The two companies produced a short trailer called The Human Race, in which a human driver in the new Chevy Camaro ZL1 competes against an AI driver in the ultimate test of man versus machine. When the trailer was over, Chevy executive Sam Russell took the stage with a Lenovo Phab 2 Pro smartphone. With the movie playing again in the background, Russell used a Tango-based Chevy app on the phone to change the color of the Camaro within the movie in real time.
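Neither Epic nor Chevrolet detailed how the app talks to the engine, but conceptually all it has to do is push a material parameter to the running renderer between frames. Here is a minimal sketch of that idea in Python, assuming a hypothetical UDP listener inside the demo; none of these names come from the actual Chevy app.

```python
import json
import socket

# Hypothetical sketch: a real-time renderer exposes the car's paint color as
# a mutable material parameter, and a phone app changes it mid-playback by
# sending a datagram. This is not the actual Tango/Chevy app code.

CAR_MATERIAL = {"base_color": (0.8, 0.1, 0.1)}  # current paint (RGB, 0-1)

def listen_for_color_changes(host="0.0.0.0", port=5005):
    """Block on a UDP socket; each datagram carries a JSON RGB triple."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        data, _addr = sock.recvfrom(1024)
        r, g, b = json.loads(data)  # e.g. b"[0.1, 0.3, 0.9]"
        CAR_MATERIAL["base_color"] = (r, g, b)
        # The next frame the engine renders reads the new value, so the
        # repaint appears instantaneous on screen.
```

On the phone side, changing the paint would then amount to a single sendto() call carrying the new RGB values.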

To achieve the technological feat, the car used in the initial phase of shooting was the Blackbird, a buggy-like vehicle created by The Mill, a visual effects studio. Attached to it is a multi-camera rig that shoots 360 degrees of 3D video at 6K resolution, and the Blackbird itself is covered in tracking markers that Unreal Engine uses in post-production. Combined with footage from a second camera that filmed the Blackbird zipping through the mountains, Unreal Engine rendered the Camaro over the Blackbird and used its Sequencer feature to line up the real-world footage with the rendered car, producing a trailer that looks close to the real thing.
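The Mill's pipeline runs inside Unreal itself, but the per-frame logic is easy to sketch: solve the camera from the Blackbird's markers, render the Camaro from that viewpoint, and composite it over the plate. The helpers below are illustrative stubs, not the studio's actual tools.

```python
import numpy as np

# Illustrative only: solve_camera_pose and render_car stand in for Unreal's
# marker-based tracking and real-time rendering, stubbed so the compositing
# logic can actually run.

def solve_camera_pose(markers):
    """Stub: a real solver triangulates the Blackbird's tracking markers."""
    return markers.mean(axis=0)  # pretend the marker centroid is the pose

def render_car(camera_pose, height=4, width=6):
    """Stub: returns an RGBA frame; alpha is 1 wherever the CG car sits."""
    rgba = np.zeros((height, width, 4))
    rgba[1:3, 2:5] = (0.8, 0.1, 0.1, 1.0)  # a red 'car' rectangle
    return rgba

def alpha_over(fg_rgba, bg_rgb):
    """Standard 'over' operator: foreground blended by its alpha channel."""
    rgb, a = fg_rgba[..., :3], fg_rgba[..., 3:]
    return rgb * a + bg_rgb * (1.0 - a)

def composite_shot(plate_frames, marker_tracks):
    """Per frame: solve the camera from markers, render the car, composite."""
    return [
        alpha_over(render_car(solve_camera_pose(m)), plate)
        for plate, m in zip(plate_frames, marker_tracks)
    ]
```

In this analogy, Sequencer's contribution is keeping plate_frames and marker_tracks on the same clock so the overlay never drifts.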


MORE: Real-Time Cinematic VR Rendering With Epic Games And The Mill


MORE: Tom's Hardware's GDC 2017 Highlights

Hellblade: Senua's Sacrifice (Unreal)

Rendering a car in real time is one thing, but mimicking the real-life movements of a human face within a game is quite another. Epic Games, along with Cubic Motion and 3Lateral, showed that it's up to the task with last year's demo of Hellblade: Senua's Sacrifice.

The short clip features Senua facing personal nightmares as she searches for her beloved. The three giant screens within the room showed the demo as well as the face-capture session with Melina Juergens, who played the part of Senua. It wasn't until the end of the demo that we discovered Juergens' performance was being rendered into the game in real time; she had been performing, unseen, on the left-hand side of the stage.

With this new technology, game directors can more easily direct actors in specific scenes because they can see the character in the game environment in real time instead of against a green-screen backdrop. Developers can even add effects like fire, change the environment lighting, or manipulate the camera angle in real time. The result is a more realistic facial and overall motion-capture performance, all because the director can actually see the game environment as they shoot the scene.
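Conceptually, the capture side of such a setup reduces to streaming solved facial blendshape weights into the engine every frame. The names below are made up for illustration; Cubic Motion's and 3Lateral's solvers are proprietary.

```python
from dataclasses import dataclass, field

# Conceptual sketch of live performance capture driving a game character:
# each frame, a solver turns camera images of the actor into blendshape
# (morph target) weights, which the renderer applies the same frame.

@dataclass
class FaceRig:
    """Current weight (0-1) for each facial blendshape on the character."""
    weights: dict = field(default_factory=dict)

    def apply_capture_frame(self, solved_weights):
        # A solver emits e.g. {"jaw_open": 0.7, "brow_raise_L": 0.2} at
        # the camera's frame rate; applying it here animates the face live.
        self.weights.update(solved_weights)

rig = FaceRig()
rig.apply_capture_frame({"jaw_open": 0.7, "smile_L": 0.35})
print(rig.weights)  # the next rendered frame uses these values
```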


MORE: Unreal Engine Getting Impressive 3D Cinematic Tools, VR Editor


MORE: Tom's Hardware's GDC 2017 Highlights

  • falchard
    The trouble with realistic real-time engines is making it look realistic in gameplay. Something like micro-displacement mapping for lip movement is something you would need to do on a scene-by-scene basis instead of a modular approach. It would eat up a lot of space.
    It's also deceiving. Look at our realistic car in the foreground while you can't really see the shitty trees in the background. With most games today, you need to do everything at a higher quality, whereas for a simple driven animation you can concentrate all the detail in the foreground and skip most of the game's calculations.
    How it translates into gameplay will always be the final criterion. If it's just an animation with little player control, you might as well have packaged it as a movie.
  • d_kuhn
    RE: Agni's Philosophy: shaky-cam sucks when it's live action - and now I know it's just as bad in realtime rendered engines. It luckily didn't live long as a tool for cinematographers - though you still see it used for low-budget TV occasionally.
  • bit_user
    Thanks for posting. I always download and run the latest 3DMark and Unigine demos whenever I upgrade GPUs. It's amazing to see the evolution of realtime graphics.

    Truth be told, I once thought mainstream games might be on to realtime ray tracing by now.
  • TMTOWTSAC
    If only looking good translated directly into playing well, or being fun.
  • cryoburner
    I would say none of these demos look truly photorealistic, as in looking like a photo, or a live-action video. Out of these examples, the Unreal Loft demo probably comes closest, along with perhaps the Mizuchi Museum demo, though that has a very narrow focal range hiding any potential imperfections behind lots of lens blur. And of course, the thing these two demos have in common is that they are limited to a single room with little organic material. There are no humans or animals with unrealistic skin and chunky hair, no cloth or plants blowing in the wind, no complex character animations or collision detection between objects. They are not actually games. To be fair though, even big-budget films tend to struggle to make CG look realistic, despite it not even being rendered in real time.
  • quilciri
    Hey, Valve...what's up with Source 2? (you can still count to 2, right?) ;)
  • falchard
    It's halfway to 3, and they don't want to make that sort of commitment yet.
    Reply