Crytek: It's Getting Hard To 'Wow' Gamers With Graphics

In a recent interview with Dark Side of Gaming, Nicolas Schulz, principal rendering engineer at Crytek's main studio, admitted that it's getting hard to impress gamers with real-time visuals.

Why? Perhaps because the visual aspect of gaming is beginning to level off: there's no huge jump like the ones we saw in the late 1990s and 2000s. Still, developers have tricks up their sleeves that should drop a few jaws as realism is pushed to its limits.

"As opposed to the times of the original Crysis, we as an industry have reached a quality level now where it is getting increasingly more difficult to really wow people," he said. "That said, there's still enough areas to explore and we will definitely keep pushing the boundaries as much as possible."

The interview pointed out that Ryse: Son of Rome runs at 900p and 30 frames per second on the Xbox One, whereas the PC version will support 4K and 60 frames per second. Schulz said that the current generation of high-end GPUs is still a long way from reaching 60 frames per second at 4K, adding that a 4K resolution means four times as many pixels to shade as 1080p.

"This is very quickly saturating the available bandwidth," he said in the interview. "The consoles are clearly behind high-spec GPUs in terms of raw horsepower. However, on the positive side, they share the same modern architecture which enables a wealth of interesting optimization techniques."
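Schulz's "four times the pixels" figure follows directly from the resolutions involved. A quick sketch of the arithmetic (assuming the common UHD 3840 x 2160 definition of "4K" against 1920 x 1080; the frame-rate comparison is an illustration, not a claim from the interview):

```python
# Pixel-count arithmetic behind the "four times the pixels" claim.

def pixels(width, height):
    """Total pixels the GPU must shade for one frame."""
    return width * height

p_1080 = pixels(1920, 1080)  # 2,073,600 pixels per frame
p_4k = pixels(3840, 2160)    # 8,294,400 pixels per frame

print(p_4k // p_1080)  # 4 -- 4K shades four times as many pixels per frame

# Moving from a 30 fps console target to a 60 fps PC target doubles the
# frames as well, so 4K at 60 fps is roughly 8x the per-second pixel
# throughput of 900p/1080p console rendering at 30 fps.
print((p_4k * 60) / (p_1080 * 30))  # 8.0
```

That multiplier applies to shading work and framebuffer traffic alike, which is why the extra pixels "very quickly saturate the available bandwidth."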

He admitted that, because of the hardware differences between the consoles and high-end PCs, the Ryse: Son of Rome team had to work harder on the final optimizations. The good news is that the team didn't have to sacrifice any of the product's visual quality.

Specifications for the PC version of Ryse: Son of Rome appeared back in September. This version will require, at minimum, a quad-core processor or a dual-core processor with Hyper-Threading technology. The game will also require 4 GB of RAM, a DirectX 11 graphics card with 1 GB of VRAM, a 64-bit operating system, and 26 GB of hard drive space.

Got a 4K monitor and a high-end PC? The game's recommended specs for 4K gaming include a quad-core or eight-core CPU, 8 GB of RAM, a DirectX 11 graphics card with 8 GB of VRAM, a 64-bit Windows operating system (Vista, 7, 8) and 26 GB of hard drive space.

Follow Kevin Parrish @exfileme. Follow us @tomshardware, on Facebook and on Google+.

  • soldier44
Been gaming on 4K for a few months now with two GTX 780 Classifieds. There have been some rough edges, but the drivers have finally smoothed things out; on BF4 at least I'm getting 70-80 fps on ultra.
    Reply
  • zero2dash
    Could just be me, but personally, I'm more about gameplay now than graphics (and have been for several years now).

I've had more fun and more play time on rinky-dink indie titles over the last year or so than on all the AAA games in my Steam library combined. For instance, I spent an hour or so playing Risk of Rain on Saturday night and had a blast. It kept my interest a lot more than the last several Call of Duty games have.

    Again though - could just be me.
    Reply
  • ram1009
    There's more to gaming than graphics. Just ask Naughty Dog.
    Reply
  • joneb
I wonder if game developers think everyone with a PC can afford a powerful gaming rig. I can't speak for anyone else, but I can't. I would love to play these games, but I'm falling behind big time because I can't afford the i7s, high-end GPUs or even a new operating system (still on Vista 32-bit).

I do think a lot of people are like me, and that by targeting such expensive high-end hardware, companies are missing out on revenue. The point I'm making is that it would be a big leap for people like myself, but it's far too expensive to make it.
    Reply
  • warezme
8 GB of VRAM, is that right? Maybe a Titan, but most cards don't have 8 GB of VRAM.
    Reply
  • Martell1977
For me it has always been more about gameplay and story than graphics. I'm pretty sure tons of people agree; why else would someone play World of Warcraft, lol. If the gameplay is fun and/or the story immersive, graphics are a much lower priority. Blizzard is the poster child for this: in-game graphics are decent (cinematics are awesome though), but the game is fun and the story is pretty good (historically the story has been better, but the current ones are decent).

    Putting graphic quality over gameplay and story is like putting perfume on a pig.
    Reply
  • alidan
Here's a fun thing to consider: since the PlayStation 2, I don't believe graphics have gotten much better in video games at all. This is going to take a little explaining.

Sure, the poly counts have gotten better, the textures have gotten better, anti-aliasing, shadows, all that crap has gotten better... However, look at a game like Psychonauts: would higher polygon counts make the game better? Would more lighting effects make the game just fantastic?

Take a look at Crysis: when everything is moving, tell me it's not harder to see your enemies than in a game where the scenery doesn't move. See, your enemies are crack shots; they can see you no matter what, so all the moving scenery only affects your ability to see them... Crysis is a very pretty-looking game, but it sacrifices some playability to look that good.

    Back to Psychonauts, all the game could really use is just higher resolution textures, that's it.

But then take a look at a game like Skyrim. Bethesda did such a crappy job texturing the game that even with their high-resolution texture pack it doesn't look that good. However, other people have re-textured the whole game with lower-resolution textures that look like they're higher resolution.

The further we push technology, the crappier and lazier devs are going to get about graphics. If they get four gigs of RAM, they're going to use that four gigs of RAM regardless of whether the game actually needs it. They're not going to try to make the smallest texture that looks the best; they'll use the bigger texture even if it looks like crap, just because it's bigger.
    Reply
I've always been an advocate for resolution over graphics. Two branches of the same tree. Which probably explains why I find the constant lowering of resolution on the modern consoles such a silly idea.

Graphics in a 3D title can look as good or as bad as they want, but unless the game is running at a high enough resolution, or preferably your screen's native resolution, it's just going to look blurry.

To give an example: Batman: Arkham Asylum. Definitely not the most visually impressive of titles, but it didn't really matter when things looked crisp and clear.

I would much rather developers dropped some of the eye candy in their console versions and kept them at 1080p than keep the effects and drop to 900p or 720p.

    Not that it matters as a PC gamer but hey, the poor sods need to experience 1080p sometime in the next decade, right?
    Reply
  • tolham
    14322127 said:
I wonder if game developers think everyone with a PC can afford a powerful gaming rig. I can't speak for anyone else, but I can't. I would love to play these games, but I'm falling behind big time because I can't afford the i7s, high-end GPUs or even a new operating system (still on Vista 32-bit).

I do think a lot of people are like me, and that by targeting such expensive high-end hardware, companies are missing out on revenue. The point I'm making is that it would be a big leap for people like myself, but it's far too expensive to make it.
The average gamer may not be able to play the latest AAA titles at max settings right away, but everyone eventually upgrades their computers. Over time, games and computer parts get cheaper. For example, the 780 Ti has dropped in price by a couple hundred dollars recently, and BioShock Infinite is now half the price it was at launch. Eventually, 4K screens will get cheaper, the GPUs needed to drive 4K gaming will get cheaper, and you'll be able to game at 4K without selling a kidney.
    Reply
  • chicofehr
Graphics can be pushed much further. We still don't have graphics like some of Nvidia's tech demos. Of course, 10 fps would suck.
    Reply