Crysis 3 Performance, Benchmarked On 16 Graphics Cards

Test System And Graphics Hardware

As always, we strive to represent game performance across a wide range of graphics hardware. We're including cards from the low-end Radeon HD 6450 and GeForce GT 630 GDDR5 to the GeForce GTX 690, including multi-card Radeon HD 7970 CrossFire setups (note that we also wanted to include the low-end GeForce 210, but it's not DirectX 11-compatible and wouldn't run the game).

You'll also find benchmark results at 5760x1080, in case you're interested in Surround or Eyefinity performance.

We want to mention that PowerColor supplied us with a test sample of the new Radeon HD 7870 LE for this performance analysis. Unlike the Pitcairn-based Radeon HD 7870, the Radeon HD 7870 LE is built with a cut-down Tahiti GPU from the Radeon HD 7900 series. The card offers 1,536 shaders and 1,500 MHz memory, and while it can be purchased for the same $240 as a garden-variety Radeon HD 7870, it performs closer to the $300 Radeon HD 7950.

Testing Notes

Our goal is to represent worst-case real-world performance. There is no point in benchmarking an area of the game that subjects your system to a relatively light load. If we based our hardware recommendations on that, you'd make your buying decision based on our tests and then run into problematic frame rates in more demanding sections of the game. So, we sought out the most challenging sequence possible, loaded with enemies that'd exact maximum load from the CPU and GPU.

We ended up choosing an area in the vast "Welcome to the Jungle" level. It includes lush foliage, plenty of soldiers, and enough action to challenge our test system. We often see comments in the forums from people with similar hardware who experience higher frame rates than our benchmarks show; that's because they're testing a lighter load than we are.

We're including our new consecutive frame time metric, but we've modified the formula a little. Instead of simply reporting the variance between consecutive frames, we now compare each frame's time against an ideal frame time calculated from a range of preceding and following frames. We'll go into this in more detail in an upcoming story dedicated to our efforts, but this evolving approach removes some of the confusion caused by natural variances in frame rate. Note that we've also changed the nomenclature from 'latency' to 'variance' in order to more accurately describe our focus.
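The approach described above can be sketched roughly as follows. The window size and the use of a simple centered average as the "ideal" frame time are our own assumptions for illustration; the exact formula isn't published here.

```python
# Rough sketch of a frame-time variance metric: compare each frame's
# render time against an "ideal" time derived from its neighbors.
# Window size and plain averaging are assumptions, not the exact formula.
def frame_time_variances(frame_times_ms, window=5):
    """For each frame, return |actual - ideal| in milliseconds, where
    'ideal' is the average of up to `window` frames on each side."""
    n = len(frame_times_ms)
    variances = []
    for i in range(n):
        lo = max(0, i - window)
        hi = min(n, i + window + 1)
        # Neighboring frames, excluding the frame under test.
        neighbors = frame_times_ms[lo:i] + frame_times_ms[i + 1:hi]
        ideal = sum(neighbors) / len(neighbors)
        variances.append(abs(frame_times_ms[i] - ideal))
    return variances
```

A perfectly smooth run (every frame taking the same time) scores zero everywhere, while a single spiked frame stands out clearly even though it barely moves the average frame rate, which is the whole point of looking past average FPS.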

Keep in mind that we set all overclocked cards to reference specifications to represent the majority of products on the market.

Test System
CPU: Intel Core i7-3960X (Sandy Bridge-E), 3.3 GHz (3.9 GHz Max. Turbo), Six Cores, LGA 2011, 15 MB Shared L3 Cache, Hyper-Threading enabled
Motherboard: ASRock X79 Extreme9 (LGA 2011), Intel X79 Express chipset
Networking: On-board gigabit LAN controller
Memory: Corsair Vengeance LP PC3-16000, 4 x 4 GB, 1600 MT/s, CL 8-8-8-24-2T
Graphics:
GeForce GT 630 512 MB GDDR5
GeForce GTX 650 2 GB GDDR5
GeForce GTX 650 Ti 1 GB GDDR5
GeForce GTX 660 2 GB GDDR5
GeForce GTX 660 Ti 2 GB GDDR5
GeForce GTX 670 2 GB GDDR5
GeForce GTX 690 2 x 2 GB GDDR5
GeForce GTX Titan 6 GB GDDR5

Radeon HD 6450 512 MB GDDR5
Radeon HD 6670 512 MB DDR3
Radeon HD 7750 1 GB GDDR5
Radeon HD 7770 1 GB GDDR5
Radeon HD 7850 1 GB GDDR5
Radeon HD 7870 2 GB GDDR5
Radeon HD 7950 Boost 3 GB GDDR5
Radeon HD 7970 3 GB GDDR5
Hard Drive: Samsung 470-series 256 GB SSD
Power: ePower EP-1200E10-T2 1200 W, ATX12V/EPS12V

Software and Drivers
Operating System: Microsoft Windows 8
DirectX: DirectX 11.1
Graphics Drivers: AMD Catalyst 13.2 beta 6; Nvidia 314.07 beta (314.09 beta for GeForce GTX Titan)

Benchmarks
Crysis 3: v.1.0.0.1, "Welcome To The Jungle" level, 60-second Fraps run
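Since each Crysis 3 result comes from a 60-second Fraps run, the raw output is essentially a list of frame timestamps. A minimal sketch of turning that into the usual FPS figures (assuming cumulative millisecond timestamps, one per rendered frame, which is roughly the shape of a Fraps frametimes log) might look like:

```python
# Sketch: derive average and worst-case FPS from a Fraps-style frametimes
# log. Assumes cumulative millisecond timestamps, one per rendered frame.
def fps_stats(timestamps_ms):
    # Per-frame intervals between consecutive timestamps.
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    total_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    avg_fps = len(deltas) / total_s   # frames rendered / elapsed seconds
    min_fps = 1000.0 / max(deltas)    # worst single frame interval
    return avg_fps, min_fps
```

Note how one long frame drags the minimum down without moving the average much, which is exactly why we report frame-time variance alongside plain FPS numbers.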
    Top Comments
  • s3anister
    Cool article but every time I see an "ultra-mega-(insert specific game here)-gpu-performance-showdown" type article I can't help but feel that they are always lacking in comparison to older cards. It'd be nice if there were at least a few last gen cards tossed in for reference. Not everyone decided to upgrade from their HD 6970s or GTX 580s.
  • stickmansam
    Still feel that the game is unduly harsh for what it displays

    Also hope AMD comes out with better drivers soon
  • Immoral Medic
    I completed this game in 4.5 hours. I gotta say, having great graphics does NOT make a good game. It's sad when all you have to attract customers is "Best Graphics in a Game Yet". BUYBUYBUY. Don't even get me started on the absolutely terrible multiplayer...
  • Other Comments
  • will1220
    Why would you include the top of the line amd, middle of the line intel (ivy bridge i5) and not the top of the line ivy bridge i7 3770k?????????
  • johnsonjohnson
    Right on time. I kinda suspect the i3-3220 performance from Techspot was unusual..
  • rawrrr151
    I thought i3 3220 was IB, not SB?
  • hero1
    Time to make an i7 rig and pass my current system to wife because Crysis demands. Nice review and the 13.2 driver from AMD has really improved frame variance for their cards. Keep it up red team so green team can do the same. The better the drivers the better our gaming experience. After all, we pay pretty penny looking for better experience. Cheers!
  • DryCreamer
    I have a hand ful of benchmarks I ran when I upgraded to from the i3 3220 to the i7 3770K and I DEFINITELY noticed a jump in the minimum frame rates:

    http://www.tomshardware.com/forum/395367-33-crysis-benchmark-560ti

    Dry
  • aussiejunior
    Wheres the gtx 680?
  • xpeh
    The only thing this game has going for it are the graphics. I beat the game in under 6 hours. The story was simply tossed in the gutter. They should have stuck with fighting the Koreans instead of introducing Aliens.
  • iam2thecrowe
    toms, your method of monitoring frame times must be screwed up, the cards vary wildly and at some point the lowly gtx 650ti was showing an unbelievably good score, even better than the gtx 670. There is something wrong with your testing method. I have also noticed the same thing in previous benchmarks where you measured frame time, not consistent results. Please look into this.
  • JJ1217
xpeh: The only thing this game has going for it are the graphics. I beat the game in under 6 hours. The story was simply tossed in the gutter. They should have stuck with fighting the Koreans instead of introducing Aliens.


    While its no where near to Crysis 1, I don't understand the hate for C2/C3's campaign. I thought it was amazing, good fun, while crysis was just too serious. I loved jumping around in c2, sliding through hallways, spamming my shotgun.

    I do think that C2 and C3 shouldn't be C2 and C3, if you know what I mean, like it should be called something different, not in the same Crysis franchise.
  • mouse24
    Wonder if theres tessellation under the ocean in this one to.
  • cleeve
iam2thecrowe: toms, your method of monitoring frame times must be screwed up, the cards vary wildly and at some point the lowly gtx 650ti was showing an unbelievably good score, even better than the gtx 670. There is something wrong with your testing method. I have also noticed the same thing in previous benchmarks where you measured frame time, not consistent results. Please look into this.


    The method is fine, but the graphics load has a lot to do with the results. It's not cut and dry.

We're writing an article around it this month; it should explain a lot.
  • cleeve
aussiejunior: Wheres the gtx 680?


    In the high detail and triple-monitor benchmarks
  • de5_Roy
    there was something more into fx8350's 'higher' performance after all. proof that average fps don't tell the full story.
    i hope crytek can fix this with an update.
    i can totally see this game becoming a benchmark staple very soon. :D
  • Novuake
    The first graphics benching AA with GTX670???? Whats up with that?

    How can the minimum FPS be 30FPS but the average is 24 FPS?

    And its a little odd that the min FPS is so close across the the board... Explain?
  • slomo4sho
    I wanted to see if the hd 7870 or a 2gb 7850 would be able to support 5760x1080 on low settings.
  • JonnyDough
    Just like Crysis 1 and 2, I still don't care. I play TF2, Skyrim, and any other game that isn't brand new because I refuse to pay $60 for a game that I can't return to the store and I don't have time to play all the titles out there I want to anyway. Anyone who has to jump on the latest and greatest bandwagon doesn't understand what "good gameplay" is.
  • mouse24
JonnyDough: Just like Crysis 1 and 2, I still don't care. I play TF2, Skyrim, and any other game that isn't brand new because I refuse to pay $60 for a game that I can't return to the store and I don't have time to play all the titles out there I want to anyway. Anyone who has to jump on the latest and greatest bandwagon doesn't understand what "good gameplay" is.


    Some people don't understand that peoples opinions/gameplay/genres are different.