Update: Radeon R9 295X2 8 GB In CrossFire: Gaming At 4K

Results: Thief

For the first time, we observe a significant difference between the FCAT- and Fraps-reported benchmark results using Thief’s in-game test. FCAT tells us that there’s a 62 FPS average, while Fraps spits back 77 FPS. But Fraps also tries convincing us that the game dips as low as 6 FPS and shoots as high as 1174 FPS, which surely throws the average out of whack. The FCAT number is far more believable, dropping to 16 FPS and peaking under 90 FPS. That’s scaling in the 38% range, which is not great.
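The gap between the two averages comes down to arithmetic: averaging per-frame instantaneous FPS lets a single 1174 FPS spike drag the mean upward, while dividing total frames by total elapsed time does not. Here is a toy Python sketch with made-up frame times (not the actual benchmark log) that reproduces the effect:

```python
# Hypothetical frame times in milliseconds: mostly ~62 FPS frames,
# plus one 6 FPS dip and one 1174 FPS spike, as outliers.
frame_times_ms = [16.0] * 100 + [1000.0 / 6, 1000.0 / 1174]

# Naive average: compute instantaneous FPS per frame, then take the mean.
# Fast outlier frames are weighted far too heavily.
naive_avg = sum(1000.0 / t for t in frame_times_ms) / len(frame_times_ms)

# Robust average: total frames divided by total elapsed time.
true_avg = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)

print(f"naive: {naive_avg:.1f} FPS, robust: {true_avg:.1f} FPS")
# naive ≈ 72.8 FPS, robust ≈ 57.7 FPS with these sample values
```

With just two outlier frames out of 102, the naive mean lands roughly 15 FPS above the time-weighted one, which mirrors the 77-vs-62 FPS split between Fraps and FCAT here.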

What’s up with the big difference between FCAT and Fraps in this title? Using four GPUs, the Thief benchmark exhibits strange behavior: it starts, chops through a few seconds, and then spits out rendered frames faster than real-time until it catches up with where the action is supposed to be. AMD suggests to us that this could be due to the app compiling thousands of shaders up front, affecting performance. If you play through the game for several minutes, the frame rate does even out a bit.

The big drop in performance happens at the end of the test for no clear reason.

Two Radeon R9 295X2s in CrossFire again yield the highest worst-case frame time variance, though I normally don’t consider results in the 6 ms range problematic. We do know, however, that results in the 5 ms range can be distinguished in blind testing, depending on the title.
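For readers wondering how a "worst-case frame time variance" figure is derived, here is a minimal sketch over hypothetical frame times. It measures the swing between consecutive frames, which is one common way to quantify variance from an FCAT or Fraps frame-time log; the sample values are invented, not taken from this review:

```python
# Hypothetical per-frame render times in milliseconds.
frame_times_ms = [16.7, 16.9, 16.5, 22.8, 16.6, 16.8, 23.1, 16.4]

# Absolute change from each frame to the next: a steady cadence yields
# small deltas, while stutter shows up as large swings.
deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]

worst_case = max(deltas)  # the single biggest consecutive-frame swing, in ms
print(f"worst-case frame time variance: {worst_case:.1f} ms")
```

A consistent 16.7 ms cadence (60 FPS) with deltas near zero feels smooth even at a lower average; large swings like the ~6 ms ones above are what register as stutter.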

Big frame time variance spikes are indicative of our biggest problem with the Thief benchmark: severe stuttering. As with Battlefield 4, the experience in Thief simply isn’t acceptable. The issue was confirmed when we went into the actual game and encountered the same stuttering.

69 comments
  • redgarl
I always said it, more than two cards takes too many resources to manage. Drivers are not there either. You are getting better results with simple Crossfire. Still, the way AMD corners Nvidia as the sole maker able to push 4K right now is amazing.
  • BigMack70
    I personally don't think we'll see a day that 3+ GPU setups become even a tiny bit economical.

For that to happen, IMO, the time from one GPU release to the next would have to be so long that users needed more than 2x high end GPUs to handle games in the meantime.

    As it is, there's really no gaming setup that can't be reasonably managed by a pair of high end graphics cards (Crysis back in 2007 is the only example I can think of when that wasn't the case). 3 or 4 cards will always just be for people chasing crazy benchmark scores.
  • frozentundra123456
    I am not a great fan of mantle because of the low number of games that use it and its specificity to GCN hardware, but this would have been one of the best case scenarios for testing it with BF4.

I can't believe the reviewer just shrugged off the fact that the games obviously look CPU-limited by just saying "well, we had the fastest CPU you can get" when they could have used Mantle in BF4 to lessen CPU usage.
  • Reynod
    Great article as always Chris ... sweet and to the point without bias.
  • west7
i wasn't expecting a 295x2 in crossfire review any time soon. well done toms
  • noobsaibot99
    Nothing to do here :D
  • Matthew Posey
    The first non-bold paragraph says "even-thousand." Guessing that should be "eleven-thousand."
  • EricJohn2004
    Lol, I notice that too.
  • Haravikk
How does a dual dual-GPU setup even operate under Crossfire? As I understand it, the two GPUs on each board are essentially operating in Crossfire already, so is there then a second Crossfire layer combining the two cards on top of that, or has AMD tweaked Crossfire to manage them as four separate GPUs? Either way it seems like a nightmare to manage, and not even close to being worth the $3,000 price tag, especially when I'm not really convinced that even a single one of those $1,500 cards is worth it to begin with. Drool-worthy, but even if I had a ton of disposable income I couldn't picture myself ever buying one.
  • EricJohn2004
I don't see how AMD cornered Nvidia at all. The 780Ti is still the fastest single GPU you can get, and if you put them both in crossfire/sli, they are about even. Are you talking about the fact that AMD cards have more memory than Nvidia? You can always get Nvidia cards with more memory if you want to.

But to say one company has another one cornered is a bit biased. Not a bit, just straight up biased. I like both companies, they are both doing great IMO.
  • ddpruitt
I'm curious as to the type of loads both the CPU and RAM are seeing. I wouldn't be surprised if we're getting to the point where the bottleneck is elsewhere. I'm guessing that the GPUs are just overpowering everything else and are starving, hence the wild numbers. It also looks like the driver is a beta; there's a lot of tuning to be done there.
  • CaptainTom
    Something was definitely bottle-necking those GPUs. Oh well. Some day I think we will see all crossfire/sli combinations work the way they should...
  • Steveymoo
    Would be interesting to see benchmarks with an 8 or 12 core xeon with workstation grade hardware. Just sayin'.
  • h2323
    You have to wonder sometimes if its the hardware/software or the people doing the testing....
  • Haravikk
    Regarding the possibility of software issues; it'd be nice to see a retrospective article that looks to see if that's the case. I'd definitely be interested to know if the stability is improved after a few new driver versions to see if that's the problem, or if the hardware (or concept) is truly at fault.
  • St0rm_KILL3r
I think it's still gonna be a while before games are made to utilize 4 gpus well at one time.
  • gsxrme
This has been, and will always be, an issue with 3-way and 4-way SLI or Crossfire. Nvidia and AMD both have extremely low support for this. Just spend the money on 2 cards and watercool/overclock them vs adding a 3rd or 4th card into the mix.

After my last burn with SLI GTX 295s, I will never go back to Quad SLI. I am still having an issue leaving my SLI GTX 680s @ 1300 MHz core / 7 GHz RAM setup. Then again i am still at 1080p like 99% of the gamers.

4K isn't ready until the refresh rate is bumped up from 60 Hz to 120 Hz and better HDMI standards arrive.
  • Xavier Corraya
    295x2 in crossfire?
    I think Tom went mad to catch Jerry .... :P
  • de5_Roy
    seems like there was/were bottleneck/s. either in software or in hardware. i wasn't expecting much performance improvement, but didn't expect regressions.

i woulda liked to see mantle benches for bf4 and thief to see if mantle was lessening any performance-limiting cpu bottleneck. mantle would be perfect for showcasing its potential in scenarios like high end quad gpu cfx. if amd haven't added such support in mantle... they're going against their propaganda of eliminating cpu/platform bottlenecks and supporting multicore cpus and multiple gpus.

    edit: can that ibuypower test rig play crysis 3... in 4k?
  • redgarl
    Quote:
I don't see how AMD cornered Nvidia at all. The 780Ti is still the fastest single GPU you can get, and if you put them both in crossfire/sli, they are about even. Are you talking about the fact that AMD cards have more memory than Nvidia? You can always get Nvidia cards with more memory if you want to. But to say one company has another one cornered is a bit biased. Not a bit, just straight up biased. I like both companies, they are both doing great IMO.


The 780 Ti is a good single card, but there is no point in putting them in SLI. Why? Because one is enough for 1440p, but they choke at 4K while AMD is offering a $125 cheaper card with better results at UHD. For that reason, the 780 Ti has no use in an SLI configuration. Also, I would even recommend two GTX 770s in SLI over a Ti for 1440p.

On the other hand, a pair of 290Xs with good custom HSFs makes sense. The benchmarks are there to prove it.
  • redgarl
    Quote:
    This has and will always be an issue with 3-way and 4-Way SLI or Crossfire. Nvidia and AMD both have extremely low support for this. Just spend the money on 2 cards and watercool/overclock them vs adding a 3rd or 4th card into the mix. After my last burn with SLI GTX295s, I will never go back to QuadSLI. I am still having an issue leaving my SLI GTX680s @ 1300MHzcore / 7Ghz Ram setup. Then again i am still at 1080p like 99% of the gamers. 4K isn't ready until refresh rate is bumped up 60Hz- 120Hz and better HDMI standards.


DisplayPort... it's already on your graphics card and provides 4K @ 60 fps.
  • burmese_dude
    Please send me that gaming rig. I need to thoroughly analyze myself.

    After I'm done, I'm not giving back.