Battlefield 4 Beta Performance: 16 Graphics Cards, Benchmarked

Battlefield 4: We'll See You On The Battlefield!

The Battlefield franchise is back and better-looking than ever. No matter how you slice it, this is an iterative release. But that's the way a lot of franchise fans want to see things happen. However you felt about Battlefield 3, you'll probably feel the same way about Battlefield 4.

You don't need a monster graphics subsystem to play it, but low-end cards like the Radeon HD 6450 and GeForce 210 just aren't fast enough. At least spend the money on a Radeon HD 6570/6670 DDR3 or GeForce GT 630 GDDR5 to play this game using low details at 1280x720. Higher resolutions necessitate a Radeon HD 7770 or GeForce GTX 650, and that'd only be for 1680x1050.

Doctor Jones, oh please wake up!

If you really want to step up to Battlefield 4 at its High detail preset, the GeForce GTX 650 Ti or Radeon HD 7850 should be sufficient for 1680x1050. More enthusiasts have 1080p displays though, and we're recommending a $200 Radeon HD 7870 or $180 GeForce GTX 660. For those of you who just caught Angelini's AMD Radeon R9 280X, R9 270X, And R7 260X: Old GPUs, New Names, make that an R9 270X for at least $200.

Boom.

How about the most hardcore enthusiasts? The demanding Ultra preset compels you to go with a Radeon HD 7970/R9 280X or GeForce GTX 670/760 at 1920x1080. Stepping up to QHD means you don't want anything less than a 7970 or GTX 680/770, though an even faster graphics subsystem is recommended.

The good news is that a fairly inexpensive Core i3 or FX-4000-series CPU could be fast enough to handle the High preset without capping graphics performance. If you want to play at the Ultra preset, grab a Core i5 or FX-6000-class processor at the very least. Of course, we realize the irony: nobody with a $1000 GeForce GTX Titan goes cheap on a mid-range CPU. Chances are good that if you want to enjoy Battlefield 4 in its full glory, you're doing it on a beast of a machine. More power to you!

This game doesn't need the best of the best for playable performance, but it does want a capable gaming machine. We look forward to the commercial release at the end of the month, at which point we'll be revisiting it.

  • corvetteguy1994
    My system is good to go!


    ****EDIT BY TOM'S HARDWARE****
    Sorry, corvetteguy, you're the first, so I'm going to hijack your post to answer some common questions:

    - Why didn't you mention Mantle? I probably *should* have mentioned it, but at this point it seems a little early. We don't know that much about it, and we don't even know exactly when it arrives. Rest assured, when Mantle is rolled out we will cover it!

    - Why did you use a Titan in the CPU tests instead of the dual-GPU 690 or 7990? Dual-GPU performance can be tricky, and without FCAT working, I didn't want to report potential pie-in-the-sky FRAPS performance that is difficult to verify. Titan is the fastest single-GPU card we have.

    - Why no FX-6000 CPU? We benched the FX-4170 and FX-8350. The FX-6000 will fall in between; there wasn't a colossal spread, so it seems pretty straightforward.

    - For the love of everything good and pure, why did you use IE? Haha! Lots of comments on this. I used it because it was there - remember, we clean install for our benchmarks, so unless the test involves browsers, we don't bother investing time installing anything else. For the record, I feel dirty and violated having opened the software, but you should all know that my personal PC has both Firefox and Chrome installed. :)

    Hope that clarifies things!

    - Don Woligroski

    ****END OF EDIT BY TOM'S HARDWARE****
  • CaptainTom
    Looks about right. My 7970 @ 1165/1805 gets 50-60+ FPS. But no quad-core i7s?
  • itzsnypah
    Why did you use a Titan for the CPU benchmarks when you have the GTX 690 / HD7990 delivering ~30% more FPS?
  • slomo4sho
    Any particular reason why only the 2500K was overclocked in the CPU benchmarks and why the FX-4170 was benchmarked in place of the 4300 or 6300?
  • BigMack70
    Would love to see some more detailed CPU benchmarks on a full 64 man conquest server once the game comes out... from some other data out there it looks possible that BF4 multiplayer is the first game to actually benefit from Hyperthreaded i7s over their i5 counterparts.

    In 64 man conquest games, doing a FRAPS benchmark of an entire 30 minute round, I got a minimum framerate of 42, average of 74, and max of 118 on my rig (4.8 GHz 2600k || 780 SLI @ 1100/1500 || 16GB DDR3 2133c11) at 1440p with all settings maxed and 120 fov.

    Also interesting to see 2GB cards struggling at high res on this game. I really didn't think we'd see that so soon, given that the 780/Titan/7950/7970 are the only cards yet released with >2GB standard memory.
  • BigMack70
    Why would they mention Mantle in an article about beta performance?
  • loops
    I have a 2500K and a 7870 XT (7930). As long as I don't max out AA, I tend to be able to play at 45-50 fps with a mix of high/ultra at 1080p on a 24" screen.

    But no matter what, each time that main building is blown up I lose at least 5 fps for the rest of the round and have big-time fps/lag spikes.

    Imo you want a 7970/280X and a quad core to be able to play smooth.

    Also, I hear a lot about vram... what is the feedback on 2 GB vs. 3?
  • smeezekitty
    I think they focused too much on the bottom end cards (6450, 210). I think anybody that has less than a 6670 probably won't be buying BF4.

    I also wish they tested a Radeon and Geforce card that would be considered equal to see how it performs by brand.
  • nevilence
    I have a 7770 and an i5; runs pretty clean on high. Wouldn't want to bump up to ultra though, that would likely suck
  • slomo4sho
    11689688 said:
    Weird that there is absolutely no mention of Mantle when BF4 is going to be the first game to implement it.

    Considering that Mantle won't be available until December, why would it be mentioned? Especially considering the fact that none of the "new" AMD GPUs were included in the benchmarks...