Dying Light: Performance Analysis And Benchmarks

Image Quality, Settings And Test Setup

Techland uses the new Chrome Engine 6 as Dying Light's heart, body and mind. It incorporates a number of improvements over Chrome Engine 5, including new weather conditions, spherical harmonics-based indirect lighting and atmospheric scattering. The game looks and feels like a good survival-horror title should, putting a nice twist on an already saturated market. Dying Light is actually the first title built on Chrome Engine 6, and it serves as a great showcase for what the technology can do.

The landscape consists of two different maps: the slums and an economically privileged area with much larger buildings. It is a pretty, yet dark game, and Techland does a good job with both environments, which remain scary and beautiful, day and night.

Our custom benchmark took a while to concoct. We settled on a spot that requires a fair bit of running from the home base, since there is no fast travel option. It begins in front of an abandoned school and continues to a bridge with a car that is on fire. This creates a dip in performance and shows the game's beautiful fire effects.

We decided to test the Low, Medium and Very High detail presets. We excluded the High setting, since it appears similar to Very High. Note that we disabled Nvidia's GeForce-specific settings to keep the rendering load equal.

Test System And Hardware

As always, we strive to represent results across a wide range of graphics hardware, testing every modern card we could get our hands on, from the Radeon HD 6450 to the GeForce GTX 980 and dual-GPU Radeon R9 295X2. We tried testing GeForce GTX 970 and 980 cards in SLI, but our samples aren't identical and SLI doesn't seem to work with boards from different manufacturers.

Our benchmarks include a number of resolutions, from 1280x720 to 3840x2160. The 4K resolution is equivalent to four 1080p monitors. But despite the massive number of pixels, Ultra HD screens are becoming more popular every day thanks to affordable options like Asus' PB287Q.

This 28" display is capable of driving a 3840x2160 signal at 60Hz over a single DisplayPort 1.2 cable. You can read more about the screen in Asus PB287Q 28-Inch 4K Monitor Review: Ultra HD For $650.

Test System
CPU: Intel Core i7-3960X (Sandy Bridge-E), 3.3GHz, Six Cores, LGA 2011, 15MB Shared L3 Cache, Hyper-Threading enabled
Motherboard: ASRock X79 Extreme9 (LGA 2011), Chipset: Intel X79 Express
Networking: On-Board Gigabit LAN controller
Memory: Corsair Vengeance LP PC3-16000, 4 x 4GB, 1600 MT/s, CL 8-8-8-24-2T
Graphics: GeForce GT 730 512MB GDDR5; GeForce GTX 650 2GB GDDR5; GeForce GTX 750 Ti 2GB GDDR5; GeForce GTX 660 2GB GDDR5; GeForce GTX 760 2GB GDDR5; GeForce GTX 970 4GB GDDR5; GeForce GTX 980 4GB GDDR5; Radeon HD 6450 512MB GDDR5; Radeon R7 240 1GB DDR3; Radeon R7 250X 1GB GDDR5; Radeon R7 260X 1GB GDDR5; Radeon R9 270 2GB GDDR5; Radeon R9 285 3GB GDDR5; Radeon R9 290X 4GB GDDR5; Radeon R9 295X2 8GB GDDR5
SSD: Samsung 840 Pro, 256GB, SATA 6Gb/s
Power: XFX PRO850W, ATX12V, EPS12V

Software and Drivers
Operating System: Microsoft Windows 8 Pro x64
DirectX: DirectX 11
Graphics Drivers: AMD Catalyst 14.12 Omega, Nvidia GeForce 347.25 WHQL

Benchmarks
Dying Light: In-game benchmark, Fraps run, 40 seconds
  • chimera201
    i5 in FPS chart, i7 in FPS over time chart. Which one is it?

    Edit: Gotta say that FX 9590 looks like a joke
    Reply
  • alidan
There is one setting in the game, I believe it's draw distance, that can halve or even cut the game's FPS to a third of what you'd get at minimum, and from what people have tested, it impacts gameplay in almost no meaningful way.

What was that set to?
Did you change it per benchmark?
Was this before or after the patch that reduced how far the game actually draws, even at max draw distance?

I know my brother's 290X (I don't know if he was running 1920x1200 or 2560x1600) was benching SIGNIFICANTLY higher than is shown here.

When you do benchmarks like this in the future, would you mind going through 3 or 4 setups, trying to get them to 60FPS, and listing which options you have to tick to get there? It would be SO nice having an in-depth analysis for games like this, or Dragon Age, which I had to restart maybe 40 god damn times to dial in the best mix between visuals and FPS...
    Reply
  • rush21hit
    Q6600 default clock
    2x2GB DDR2 800mhz
    Gigabyte g31m-es2L
    GTX 750Ti 2GB DDR5
    Res: 1366x768

Just a piece of advice to anyone in about the same boat as me (old PC + new GPU, wanting to play this): disable the Depth of Field and/or Ambient Occlusion effects (this also applies to most recent titles), and you're fine with your new GPU and its latest driver. Mine stays within the 40-60FPS range without any input lag, while running everything else on the Very High preset... just without those effects.

    Those effects are the culprits for performance drops, most of the time.
    Reply
  • Cryio
Was Core Parking taken into account when benchmarking on AMD hardware? It makes no sense that the FX 4170 is faster than the 9590.

The game runs rather meh on my 560 Ti and FX 6300 @ 4.5GHz. But once I mess with the core affinity in Task Manager, my GPU gets 99% usage and all is right with the world.
    Reply
  • ohim
Just shows how badly they optimize for AMD hardware... no wonder everything works faster on Intel. This comes from an Intel CPU user, BTW.
    Reply
  • xpeh
    Typical Nvidia Gameworks title. Anyone remember Metro: Last Light? Unplayable on AMD cards until 4A issued a game update a few months later. Can't make a card that competes? Pay off the game developers.
    Reply
  • Grognak
    @ xpeh - Couldn't agree more. A 750 Ti beating a 270X? 980 better than 295X2? I'm gonna stay polite but this is beyond ridiculous. This is pure, unabashed, sponsored favoritism.

    Edit: after checking some other sites it seems the results are all over the place. Some are similar to Tom's while others appear to be relatively neutral regarding both GPU and CPU performance (though the FXs still struggle against a modern i3)
    Reply
  • Amdlova
    remove the shadows of this nvidia game. soft shadows = suck!
    Reply
  • Empyah
Sorry guys, but you messed something up in these tests, because my 290X is getting higher averages than your 980 (I've got a 4930K and view distance at 50%). Everybody knows you're heavily biased towards Nvidia and Intel, to the point where it stops being sad and starts being funny how obvious it is. But if this test is deliberate, we've found a new low today. C'mon guys, we're in the same boat here: we love hardware and want competition.
    Reply
  • silverblue
Looks like very high CPU overhead with the AMD drivers, and really poor use of multiple cores in the CPU test. The former is easier to solve; the latter sounds like the developers haven't yet grasped the idea of more than two CPU cores working on a problem. Is this game really heavy on the L3 cache? That could explain the major issues for the 9590: it struggles to use the cache effectively with more cores, its higher clock speeds count for naught (and/or the CPU is being throttled), and since the L3 cache is weak on FX CPUs anyway, that would also hurt performance. It'd be worth testing a 7850 alongside a Phenom II X4/X6 to see whether removing the L3, or falling back to a CPU family with good L3, makes any difference to performance.
    Reply