Tom's Hardware Graphics Charts: Performance In 2014

Tomb Raider And Hitman: Absolution

Tomb Raider

We’re going relatively easy on our test group with Tomb Raider. This game is typically made more demanding by enabling its compute-heavy TressFX feature, but we disable that AMD-biased capability, just as we aren't using PhysX in some of our other benchmarks. Fair is fair.

The benchmark runs three times, though our video depicts only one iteration. The first pass is discarded as a warm-up, and the next two are averaged together.
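That warm-up-and-average procedure is simple enough to express in a few lines. Here is a minimal Python sketch of it; the function name and the frame-rate values are made-up placeholders, not part of our actual tooling:

```python
def average_fps(runs: list[float]) -> float:
    """Average benchmark results, discarding the first run as a warm-up."""
    if len(runs) < 2:
        raise ValueError("need a warm-up run plus at least one scored run")
    _warm_up, *scored = runs  # the first pass is thrown away
    return sum(scored) / len(scored)

# Three runs per resolution: one warm-up, two scored (placeholder values).
print(average_fps([58.0, 61.5, 60.5]))  # -> 61.0
```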

We adjusted the settings once again so that we could test a wide and balanced range of boards at smooth frame rates.

Tomb Raider

Run 1: 1920x1080 (1080p)
API: DirectX 11 | Quality: Ultra | Anti-aliasing: FXAA | Texture Quality: Ultra | AF: 16x | Hair Quality: Normal | Shadows: Normal | Shadow Resolution: High | SSAO: Ultra | DoF: Ultra | Reflection Quality: High | LOD Scale: Ultra | Post-processing: On | High Precision RT: On | Tessellation: On

Run 2: 3840x2160 (2160p)
API: DirectX 11 | Quality: Ultra | Anti-aliasing: Off | Texture Quality: High | AF: 8x | Hair Quality: Normal | Shadows: Normal | Shadow Resolution: High | SSAO: Normal | DoF: Normal | Reflection Quality: High | LOD Scale: Normal | Post-processing: On | High Precision RT: On | Tessellation: Off

Loops: Three per resolution; two used for evaluation

Hitman: Absolution

Hitman: Absolution is also lightweight enough that it can be played on almost any graphics card (Ed.: in fact, poor scaling was why I pulled it from our graphics card launch coverage). It might not be the most recent game, but that accessibility is why we still like to include it.

Another three benchmark runs per resolution give us one warm-up and two results to average. The video showcases the sequence used for our test.

Once again, here are the settings we use:

Hitman: Absolution

Run 1: 1920x1080 (1080p)
MSAA: 2x | Texture Quality: High | AF: 16x | Shadows: Ultra | SSAO: Normal | Global Illumination: On | Reflections: High | FXAA: Off | LoD: Ultra | DoF: High | Tessellation: On | Bloom: Normal

Run 2: 3840x2160 (2160p)
MSAA: Off | Texture Quality: High | AF: 16x | Shadows: High | SSAO: Off | Global Illumination: On | Reflections: High | FXAA: Off | LoD: High | DoF: High | Tessellation: On | Bloom: Normal

Loops: Three per resolution; two used for evaluation
  • blackmagnum
    Thank you, Tom's team, for updating the charts. You're my go-to when I'm upgrading my rigs. I'll be waiting... Bring on yesterday's gems.
  • Pyree
    Awesome!
  • outlw6669
    Nice writeup; I look forward to seeing the new charts!
  • tomfreak
    The first thing Tom's needs to do is bench how PCIe 2.0 x8 vs. x16 performs on a modern top-end GPU. Since the 290X moves its CrossFire traffic from the bridge to PCIe, maybe it's time to check this again? As I recall, AMD does not recommend running 290X XDMA CrossFire on PCIe 2.0 x8. Please check this out.
  • cypeq
    First, it's great to see new charts.
    That said, I was never a fan of this style of benchmarking. It does give a clean graph of GPU capabilities, which we have always needed, but I would love to see a new bottleneck analysis, or at least a parallel test done on a midrange PC.

    Everyone should keep in mind that these charts represent the performance of <1% of the PC builds out there.

    13278215 said:
    The first thing Tom's needs to do is bench how PCIe 2.0 x8 vs. x16 performs on a modern top-end GPU. Since the 290X moves its CrossFire traffic from the bridge to PCIe, maybe it's time to check this again? As I recall, AMD does not recommend running 290X XDMA CrossFire on PCIe 2.0 x8. Please check this out.

    If I recall correctly, we are at this moment at the edge of PCIe 2.0 x8, which equals PCIe 1.0 x16 in bandwidth. The next generation or the one after will finally outdate PCIe 1.0 in single-GPU and PCIe 2.0 in dual-GPU configs, as there will finally be noticeable bottlenecks.
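For reference, that equivalence is easy to sanity-check from the published per-lane transfer rates. A minimal sketch (the helper name is hypothetical; the figures are spec values, not measurements):

```python
# Usable one-way PCIe bandwidth per lane, in GB/s, after encoding overhead.
# PCIe 1.0/2.0 use 8b/10b encoding; PCIe 3.0 uses 128b/130b.
PER_LANE_GBPS = {
    "1.0": 2.5 * (8 / 10) / 8,     # 0.25 GB/s per lane
    "2.0": 5.0 * (8 / 10) / 8,     # 0.50 GB/s per lane
    "3.0": 8.0 * (128 / 130) / 8,  # ~0.985 GB/s per lane
}

def link_bandwidth(gen: str, lanes: int) -> float:
    """One-way bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

print(link_bandwidth("2.0", 8))   # 4.0 GB/s
print(link_bandwidth("1.0", 16))  # 4.0 GB/s -- the equivalence above
print(link_bandwidth("3.0", 16))  # ~15.75 GB/s
```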
  • mitcoes16
    Any SteamOS or GNU/Linux benchmarks?
    It would be nice to add an OpenGL cross-platform game, such as an ioquake3-based one or something more modern, and test it under MS Windows and under GNU/Linux.

    Better yet, if it were the upcoming SteamOS, it would let us know the performance of the same game under MS Windows and under GNU/Linux.

    It would also be nice to test under MS Windows with and without antivirus; perhaps Avast, which is free, or any other of your preference.

    Last but not least, OpenGL and DirectX go through version changes, and being able to split card generations by OpenGL/DirectX version support would help, as would a current price/performance index based on the prices in your sponsored links.
  • mitcoes16
    No 720p tests?
    720p (1280x720 = 921,600 pixels) is more or less half of 1080p (1920x1080 = 2,073,600 pixels).

    And when a game is very demanding, or you prefer to play with better graphics settings, playing at 720p is a great option.

    Of course, the latest and best GPUs can play at 4K with full graphics, but when we read the benchmarks we also want to know whether our current card CAN play at 720p (1K), or what the best ones can do at 1K, so we can compare.

    Also, even if it is not standard or accurate, for benchmarking purposes calling 720p "1K", 1080p "2K", and 2160p "4K" would be easier to understand at a glance than UHD, FHD, and HD Ready; those could be used too: UHD (4K), FHD (2K), HD Ready (1K).
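The pixel arithmetic behind that comparison is quick to verify. A short sketch, with the 1K/2K/4K labels following the comment's informal naming rather than any standard:

```python
# Pixel counts behind the informal 1K/2K/4K shorthand above.
resolutions = {
    "720p (1K)":  (1280, 720),
    "1080p (2K)": (1920, 1080),
    "2160p (4K)": (3840, 2160),
}

base = 1920 * 1080  # 1080p as the reference point
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")

# 720p works out to ~0.44x the pixels of 1080p ("half, more or less"),
# and 2160p to exactly 4x.
```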
  • InvalidError
    13278758 said:
    No 720p tests?
    720p does not stress most reasonably decent GPUs much, and how many people would drop their resolution to 720p these days, with all the re-scaling artifacts that might add? In most cases, it would make more sense to stick with native resolution and turn some of the more GPU/memory-intensive settings down a notch or two. At least, I know I greatly prefer cleaner images over "details" that get blurred by the lower resolution and further distorted by re-scaling.

    Considering how you can get 1080p displays for $100, I would call standardizing the GPU charts on 1080p fair enough: the people who can only afford a $100 display won't care much about enabling every bell and whistle, and the people who want to max everything out likely won't be playing on $100 displays and $100 GPUs either.
  • 2Be_or_Not2Be
    I really like seeing charts of how much noise a video card's cooling fans make. That makes more of a difference to me; limiting a distracting noise I hear every time I game matters more than getting a louder card with 10 more FPS.

    I also like seeing how current cards stack up, performance-wise, against previous generations. That really helps when you're deciding whether or not to upgrade.
  • Ubrales
    Thank you! Good reference article!