Results: Tomb Raider
Tomb Raider, one of AMD’s latest Gaming Evolved titles, is in my opinion one of the biggest hits thus far in 2013. It takes powerful graphics hardware to deliver playable performance at the Ultimate quality preset, which enables realistic TressFX hair. Having sunk quite a bit of time into both playing the game and analyzing its performance, I felt it would be beneficial to test two different levels for today’s story.
Our 45-second custom run of the “Chasm Monastery” level contains a short cinematic that punishes graphics processors far more forcefully than the built-in benchmark, although mostly when TressFX hair is enabled. This manual run-through is easily repeatable and great for testing graphics cards.
Of course, as with many games, the hardware demands fluctuate from one map to the next. Outdoor areas encountered in the “Mountain Village” level offer a far better look at the game’s CPU demands. You’ll see less of this one on Tom’s Hardware, since it demands more user control and also overwrites the saved game, forcing somewhat tedious save slot juggling before each run. But used together, these two benchmarks provide a worst-case look at both the game’s CPU and GPU requirements.
Only normal hair effects are enabled at the High quality preset, and that flat area in the middle of our line graphs, where the dual-core processors receive a performance boost, is the cinematic sequence. All of our tested dual-core processors fly through this part of the game, yielding similar performance.
However, frame rates plummet once we step outdoors and overlook “Mountain Village”. This performance hit is most noticeable on the dual-core processors, and at times the Core 2 Duo E8400 makes it difficult to precisely control Lara’s maneuvers.
The TressFX hair enabled by the Ultimate quality preset completely changes the flat cinematic portion of our run; the mighty Radeon HD 7970 drops to 30 FPS, no matter the processor pairing. Once the camera zooms away from Lara, frame rates spike before control is handed back to the player. Similar cinematic sequences are unavoidable and a big part of the game, which is why we’re taking the time to demonstrate this behavior within a CPU shootout.
Without a doubt, it takes powerful graphics hardware to crank out Ultimate details, but parts of this game really smack the processor as well. Game play is adversely affected by our two slowest dual-core chips; the Core i3 and overclocked Q9550 are about the least I’d want when playing through these areas of the game. But it’s the Core i5-3570K that earns respect for delivering far more consistent frame rates.