
Results: Battlefield 4

Update: Radeon R9 295X2 8 GB In CrossFire: Gaming At 4K

Our results in Battlefield 4 change dramatically compared to the original story, which showed barely any scaling at all with a second Radeon R9 295X2. As you can imagine, that first round of numbers seemed implausible, and we retested multiple times across several drive images. Consistently, we ended up with a frame rate over time chart that largely tracked a single 295X2, but spiked and dipped much more severely.

After that piece went live, AMD shared its own results with us, prompting me to revisit this game (and, indeed, all of the others). At one point I noticed that, after installing the Catalyst beta package on the two-card system, CrossFire was reported as enabled, yet no scaling showed up in any test run. Then, after clicking the “Something requires your attention” pop-up, the technology suddenly showed up as disabled. Between this and another apparent issue, where toggling CrossFire on or off caused the screen to go dark until a soft reboot, there appear to be a couple of minor software bugs.

At any rate, toggling CrossFire off and back on seemed to help, producing much more impressive scaling. FCAT reports an average of 84 FPS; Fraps says 84 FPS as well. That match represents a 75% boost from the second card.
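As a quick sanity check on that arithmetic (a hypothetical helper, not output from either tool), a 75% boost over a single card is consistent with the lone 295X2 averaging roughly 48 FPS in this test:

```python
def scaling_percent(single_fps, multi_fps):
    """Percentage frame-rate gain from adding the second card."""
    return (multi_fps / single_fps - 1.0) * 100.0

# Figures from the text: the two-card average is 84 FPS; a 75%
# boost implies the single-card average was roughly 48 FPS.
print(round(scaling_percent(48, 84)))  # 75
```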

Charting frame rate over time exposes more dramatic changes in instantaneous performance, with dips to 50 FPS and peaks up to 100 FPS. Even at its slowest, however, the pair of Radeon R9 295X2 cards is faster than the fastest competition.

It’d be easy to call that the perfect example of why you’d spend $3000 on four Hawaii GPUs. But it’s not. There’s an experiential element that doesn’t show up in the average frame rate or frame rate over time charts, and that’s stutter. The stutter is so much more apparent with four GPUs than two, and I’d rather have the smoother game at lower frame rates than whatever two Radeon R9 295X2s give you. Battlefield 4 suffers from this more severely than any other title in our suite.

Nvidia’s GeForce GTX 780 Ti actually shows up in last place due to its 3 GB of memory per GPU, which isn’t enough for a smooth experience in this game. I’ve already done everything I can to dissuade you from 3 GB cards for 3840x2160, and that story remains intact.

More surprising to me is that the choppiness we experienced doesn't come through in the frame time variance numbers. Typically, a 95th percentile result in the 6 ms range isn't bad at all.
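That disconnect is easy to reproduce. A minimal sketch (hypothetical frame-time deltas, and a simple nearest-rank percentile rather than whatever FCAT uses internally) shows how a handful of severe hitches can sit entirely above the 95th percentile, leaving the headline figure looking healthy:

```python
def percentile_95(values):
    """Nearest-rank 95th percentile of a list of samples."""
    s = sorted(values)
    idx = min(len(s) - 1, int(round(0.95 * len(s))) - 1)
    return s[idx]

# Hypothetical frame-to-frame deltas (ms): 97 smooth frames plus
# 3 big hitches. The hitches make up under 5% of the samples, so
# the 95th percentile never sees them.
deltas = [0.5] * 97 + [20.0] * 3
print(percentile_95(deltas))  # 0.5
```

A perceptible hitch every couple of seconds can therefore coexist with an excellent 95th percentile number, which is why the line chart, and ultimately playing the game, still matter.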

Putting frame time variance on a line chart shows where there's a ton of difference between successive frames rendered by two GeForce GTX 780 Tis. You can see where the two Radeon R9 295X2s peek out from behind, though. And again, regardless of what the charts say, the stutter is impossible to ignore, and it's all the more bothersome coming from three grand's worth of graphics cards.

Top Comments
  • 11 Hide
    frozentundra123456 , April 21, 2014 5:24 AM
    I am not a great fan of mantle because of the low number of games that use it and its specificity to GCN hardware, but this would have been one of the best case scenarios for testing it with BF4.

    I can't believe the reviewer just shrugged off the fact that the games obviously look cpu limited by just saying "well, we had the fastest cpu you can get" when they could have used mantle in BF4 to lessen cpu usage.
Other Comments
  • 8 Hide
    redgarl , April 21, 2014 5:17 AM
    I always said it: more than two cards takes too many resources to manage. The drivers are not there either. You get better results with simple Crossfire. Still, the way AMD corners Nvidia as the sole maker able to push 4K right now is amazing.
  • 2 Hide
    BigMack70 , April 21, 2014 5:24 AM
    I personally don't think we'll see a day that 3+ GPU setups become even a tiny bit economical.

    For that to happen, IMO, the time from one GPU release to the next would have to be so long that users needed more than 2x high end GPUs to handle games in the meantime.

    As it is, there's really no gaming setup that can't be reasonably managed by a pair of high end graphics cards (Crysis back in 2007 is the only example I can think of when that wasn't the case). 3 or 4 cards will always just be for people chasing crazy benchmark scores.
  • 2 Hide
    Reynod , April 21, 2014 5:39 AM
    Great article as always Chris ... sweet and to the point without bias.
  • 2 Hide
    west7 , April 21, 2014 5:42 AM
    I wasn't expecting a 295X2-in-CrossFire review any time soon. Well done, Tom's!
  • -2 Hide
    noobsaibot99 , April 21, 2014 5:45 AM
    Nothing to do here :D 
  • 4 Hide
    Matthew Posey , April 21, 2014 6:07 AM
    The first non-bold paragraph says "even-thousand." Guessing that should be "eleven-thousand."
  • 0 Hide
    EricJohn2004 , April 21, 2014 6:35 AM
    Lol, I noticed that too.
  • -3 Hide
    Haravikk , April 21, 2014 6:37 AM
    How does a dual dual-GPU setup even operate under Crossfire? As I understand it, the two GPUs on each board are essentially operating in Crossfire already, so is there a second Crossfire layer combining the two cards on top of that, or has AMD tweaked Crossfire to manage them as four separate GPUs? Either way it seems like a nightmare to manage, and not even close to being worth the $3,000 price tag, especially when I'm not really convinced that even a single one of those $1,500 cards is worth it to begin with. Drool-worthy, but even if I had a ton of disposable income I couldn't picture myself ever buying one.
  • -7 Hide
    EricJohn2004 , April 21, 2014 6:38 AM
    I don't see how AMD cornered Nvidia at all. The 780 Ti is still the fastest single GPU you can get, and if you put both in crossfire/sli, they are about even. Are you talking about the fact that AMD cards have more memory than Nvidia's? You can always get Nvidia cards with more memory if you want to.

    But to say one company has another one cornered is a bit biased. Not a bit, just straight-up biased. I like both companies; they are both doing great IMO.
  • 3 Hide
    ddpruitt , April 21, 2014 6:55 AM
    I'm curious as to the type of loads both the CPU and RAM are seeing. I wouldn't be surprised if we're getting to the point where the bottleneck is elsewhere. I'm guessing that the GPUs are just overpowering everything else and starving, hence the wild numbers. It also looks like the driver is a beta; there's a lot of tuning to be done there.
  • 2 Hide
    CaptainTom , April 21, 2014 7:26 AM
    Something was definitely bottlenecking those GPUs. Oh well. Some day I think we will see all crossfire/sli combinations work the way they should...
  • 2 Hide
    Steveymoo , April 21, 2014 7:30 AM
    Would be interesting to see benchmarks with an 8 or 12 core xeon with workstation grade hardware. Just sayin'.
  • -5 Hide
    h2323 , April 21, 2014 7:49 AM
    You have to wonder sometimes if it's the hardware/software or the people doing the testing....
  • 2 Hide
    Haravikk , April 21, 2014 8:17 AM
    Regarding the possibility of software issues: it'd be nice to see a retrospective article that checks whether that's the case. I'd definitely be interested to know if stability improves after a few new driver versions, or if the hardware (or concept) is truly at fault.
  • 0 Hide
    St0rm_KILL3r , April 21, 2014 9:01 AM
    I think it's still gonna be a while before games are made to utilize 4 gpus well at one time.
  • 0 Hide
    gsxrme , April 21, 2014 9:15 AM
    This has been and always will be an issue with 3-way and 4-way SLI or Crossfire. Nvidia and AMD both have extremely low support for it. Just spend the money on two cards and watercool/overclock them vs adding a 3rd or 4th card into the mix.

    After my last burn with SLI GTX 295s, I will never go back to quad SLI. I am still having an issue leaving my SLI GTX 680s at 1300 MHz core / 7 GHz RAM. Then again, I am still at 1080p like 99% of gamers.

    4K isn't ready until the refresh rate is bumped from 60 Hz to 120 Hz and HDMI standards improve.
  • 3 Hide
    Xavier Corraya , April 21, 2014 9:46 AM
    295x2 in crossfire?
    I think Tom went mad to catch Jerry .... :p 