Deus Ex: Human Revolution Performance Analysis

Image Quality And Anti-Aliasing

Deus Ex: Human Revolution uses a modified version of the Crystal Dynamics game engine, most recently used on the PC in the Tomb Raider spinoff Lara Croft and the Guardian of Light. The engine has been updated for Deus Ex with DirectX 11 features such as soft shadows and tessellation, along with FXAA and MLAA anti-aliasing support. Even without these features, the game is very attractive in DirectX 9 mode, as you can see in the following comparison:

Note the soft shadow effect on the bottom-left in DirectX 11 mode above. Tessellation and soft shadows are nice to have, but realistically, their effects are difficult to notice while you’re actually playing the game. So, owners of DirectX 9- and 10-class cards won’t feel compelled to upgrade for this title.

From a visual standpoint, the game looks great and the art direction is very slick. Our only complaint is that the character models look somewhat primitive at times, especially when the camera is focused on them during conversations.

The Detroit cityscape stirs memories of Blade Runner

FXAA and Morphological Anti-Aliasing (MLAA) in DirectX 11 mode

DirectX 9 mode has access to standard edge anti-aliasing, but DirectX 11 mode grants access to FXAA and MLAA. Both are post-process anti-aliasing filters that rely on the graphics card's pixel shader power to analyze the rendered frame, find aliased edges, and smooth them out as much as possible.
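
To make the idea concrete, here is a rough, hypothetical Python/NumPy sketch of how a post-process filter of this kind operates on a finished frame. It only illustrates the general approach (luma-based edge detection followed by blending); it is not the actual FXAA or MLAA code.

    import numpy as np

    def postprocess_aa(frame, contrast_threshold=0.0625):
        # Toy post-process anti-aliasing pass (illustrative only, not real FXAA/MLAA).
        # frame: H x W x 3 float array in [0, 1] -- the already-rendered image.
        # Because it works purely on final pixels, this class of filter costs
        # shader (here: CPU) time but needs no extra geometry samples.

        # 1. Estimate perceived brightness (luma) for every pixel.
        luma = frame @ np.array([0.299, 0.587, 0.114])

        # 2. Flag pixels whose local contrast marks them as a likely aliased edge.
        pad = np.pad(luma, 1, mode='edge')
        north, south = pad[:-2, 1:-1], pad[2:, 1:-1]
        west, east = pad[1:-1, :-2], pad[1:-1, 2:]
        local_min = np.minimum.reduce([luma, north, south, west, east])
        local_max = np.maximum.reduce([luma, north, south, west, east])
        edge_mask = (local_max - local_min) > contrast_threshold

        # 3. Blend edge pixels with their neighbors to soften stair-stepping.
        fpad = np.pad(frame, ((1, 1), (1, 1), (0, 0)), mode='edge')
        blurred = (fpad[:-2, 1:-1] + fpad[2:, 1:-1] +
                   fpad[1:-1, :-2] + fpad[1:-1, 2:] + frame) / 5.0
        out = frame.copy()
        out[edge_mask] = blurred[edge_mask]
        return out

Real filters are considerably more sophisticated, estimating edge direction and shape before blending, which helps explain why FXAA and MLAA treat textures differently in the comparison below.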

While Nvidia has offered its FXAA code to developers for a little while already, AMD's MLAA was previously only available on Radeon graphics cards by forcing the feature through the Catalyst driver. This marks the first time AMD has shared its MLAA code with a game developer.

There are two benefits to this approach: it's vendor-agnostic (meaning it can be used on both Radeon and GeForce graphics cards), and it allows the developer to exclude the post-process AA effect from text and other informational displays that the filter would otherwise blur.
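
As a hedged sketch of that second point, the frame flow below (with purely hypothetical function names, not any engine API) shows why in-engine post-process AA keeps HUD text sharp: the filter runs on the 3D scene first and the UI is composited afterward, whereas a driver-forced filter only ever sees the finished frame, HUD included.

    def compose_frame(render_scene, apply_post_aa, draw_hud):
        # Illustrative frame flow only; function names are hypothetical.
        scene = render_scene()        # 3D world: geometry, lighting, shadows
        scene = apply_post_aa(scene)  # FXAA/MLAA-style filter on the scene only
        return draw_hud(scene)        # crisp text/UI composited on top, untouched by the filter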

Having said all that, let’s compare the game’s AA modes: Edge AA, FXAA, and MLAA:

It’s interesting that MLAA output is somewhat similar to Edge AA in that textures usually remain aliased, except in rare cases. On the other hand, FXAA does a lot more work on aliased textures. Whether this is good or bad is a little subjective because FXAA does lose a little texture detail compared to MLAA. The good news is that the game accommodates both, and players can set their AA preference. As you’ll see in our benchmarks later, there isn’t a lot of difference between MLAA and FXAA with regard to performance.

  • Soma42
    Man, as much as I prefer PC gaming, I can't wait for the PS4 and Xbox 720 to come out so games will look noticeably better. Might actually want to upgrade my computer by then...

    Anywho, I didn't play the first two; am I missing anything if I wanted to pick this up?
  • festerovic
    @soma - I personally thought they were average to good for the time. Not sure if they would stand the test of time...

    Interesting to read that the dual-core HT chips outperformed real cores. Can we look forward to the 2600's HT being utilized in games before the next generation of CPUs comes out?
  • haplo602
    nice review, finally a new Deus Ex game for me :-)
  • Was the AMD Dual-Core Optimizer installed?
  • gerchokas
    Good article - I still remember when I first saw the 2000 Deus Ex graphics on my friend's then-brand-new PC, I thought 'maaan... this looks *friggin* REAL!' I instantly knew my old Pentium CPU needed replacing ASAP...
    11 years later, I praise the great graphics again... but this time they haven't caught me off-guard!
  • tacoslave
    I really wonder how far developers can take the graphics in 3 years.
  • aznshinobi
    Hmm... The Nvidia cards perform better than the AMD cards of equivalent rank. I'm not playing fanboy, but didn't AMD fund the studio? After all, Eyefinity was made use of.
  • th3loonatic
    Are there any typos? Because I see a GTX 560 Ti listed as a card used to test, but it doesn't appear in the results.
  • fyasko
    @festerovic: "I personally thought they were average to good for the time. Not sure if they would stand the test of time... Interesting to read that the dual-core HT chips outperformed real cores. Can we look forward to the 2600's HT being utilized in games before the next generation of CPUs comes out?"
    HT isn't the reason dual-core SB CPUs beat six-core Thubans. SB is a better architecture. Hurry up, Bulldozer!
  • mayankleoboy1
    @ Don Woligroski
    I want the CPU benchmarks at 1080p with the highest settings.
    Benches at 1024x768 are irrelevant; the gamer of today is at least at 1680, preferably 1080p.
    So please add them to the benches. Also, this would show the real impact of the CPU on FPS.