
Medal Of Honor Warfighter Performance, Benchmarked

We take Medal of Honor Warfighter for a spin on 12 different graphics cards to figure out how much hardware you need to get this modern-day account of our Tier 1 operators' work running smoothly. Not surprisingly, the single-player campaign is GPU-heavy.

When Medal of Honor Warfighter landed on my desk, I decided to brush up on my knowledge of the series. Can you believe that this is the fourteenth Medal of Honor title since the original was released back in 1999? The first twelve were a part of the World War II craze that had such a profound impact on first-person shooters over the past decade. But the prior installment (named simply Medal of Honor, and released in 2010) put the action in present-day Afghanistan. That title's single-player component employed a modified version of Epic's Unreal Engine 3.

The Call of Duty and Battlefield franchises have also shifted away from the historical themes toward more modern stories. Warfighter even employs the same Frostbite 2 engine for its single-player campaign that DICE developed for Battlefield 3.


In the crowded world of first-person shooters, how does Medal of Honor Warfighter differentiate itself, aside from its great-looking graphics?

Well, the game follows the stories of Tier 1 operators (members of Special Mission Units in the U.S. Armed Forces) through a number of locations. "Preacher," a DEVGRU operator and one of the main characters from the previous title, is the character you play through much of the game.

The gameplay is fairly typical first-person shooter fare. Cutscenes often delve into the private lives of elite military personnel, including uncomfortable and all-too-real topics like struggling to keep a family together and burying a comrade. At its best, this game had me thinking about the life of a career soldier, along with the disconnect between the conscienceless elimination of "bad guys" and caring for loved ones. Danger Close followed a path that other developers really haven't (though one that did remind us of the Bandito Brothers' Act of Valor), and I think it approached these sensitive topics with the respect they deserve.

Preacher gets hurt, bleeds, and even gets admitted to the hospital.

At its worst, though, Medal of Honor Warfighter is just another slick shooter that invests more in keeping the missions fresh than in a cohesive narrative. Every level is unique, but the flow sometimes feels forced. There's the requisite driving level, the stealth level, an open-warfare level, an urban level, a jungle level, a level from a terrorist's perspective, and so on. The game's director clearly wanted to keep each mission distinct, tight, and polished. Those priorities sometimes come at the expense of the story's overall cohesion, though.

This level is seen through the eyes of a terrorist in training camp. Creepy.

As far as technical aspects go, I never saw debilitating drops in frame rate, and the title never crashed (a problem some folks apparently ran into). Playing the game after its massive day-one patch, along with the latest drivers from AMD and Nvidia, yielded a pleasant experience.

FPS: First-Person Smasher

Did I enjoy the game? It reminded me of its contemporaries from the Battlefield and Call of Duty franchises. I don't think it's as terrible as some reviews suggest, but it isn't a genre-defining masterpiece, either. If you love running and gunning through the typical modern military-themed first-person shooter, then you'll probably enjoy Medal of Honor Warfighter.

You know where you are? You're in the jungle, baby. You're gonna die.

Our job isn't to review games, though. We're more concerned with how they perform on your hardware, so that you know what you need to enjoy the latest titles. To that end, we didn't spend any time in the multiplayer component of this one, which was also developed by Danger Close Games using DICE's Frostbite 2 engine. Similar to what we experienced in Battlefield 3 Performance: 30+ Graphics Cards, Benchmarked, the single-player campaign has sequences that are repeatable, while multiplayer is much more random and dependent on the actions of others, making it harder to test.

Of course, in Battlefield 3, we saw that the single-player game was consequently very graphics-bound, taxing our graphics cards for all they were worth, while big multiplayer maps brought capable CPUs to their knees. Because Medal of Honor Warfighter uses the same Frostbite 2 engine, we expect it to behave similarly.
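
As an aside, here is a minimal sketch of how average and minimum FPS figures like the ones in this article can be derived from a per-frame frametime log (tools such as Fraps record similar data). The file name and the one-column CSV layout of per-frame durations are assumptions for illustration only, not the tooling we actually used.

# fps_summary.py -- a minimal sketch (not our actual benchmarking tooling) of
# deriving average and minimum FPS from a log of per-frame durations in
# milliseconds, assumed here to be a one-column CSV named frametimes.csv.
import csv

def load_frametimes_ms(path):
    """Read one frame duration (in milliseconds) per CSV row."""
    with open(path, newline="") as f:
        return [float(row[0]) for row in csv.reader(f) if row]

def fps_summary(frametimes_ms):
    """Return (average FPS, minimum FPS), taking minimum FPS as the lowest
    number of frames completed in any whole second of the run."""
    total_seconds = sum(frametimes_ms) / 1000.0
    average_fps = len(frametimes_ms) / total_seconds

    # Bucket frames into one-second bins by cumulative elapsed time.
    bins = {}
    elapsed_ms = 0.0
    for ft in frametimes_ms:
        elapsed_ms += ft
        second = int(elapsed_ms // 1000)
        bins[second] = bins.get(second, 0) + 1

    # Ignore the final (possibly partial) second so it doesn't skew the minimum.
    full_bins = [count for second, count in bins.items() if second < max(bins)]
    minimum_fps = min(full_bins) if full_bins else bins[max(bins)]
    return average_fps, minimum_fps

if __name__ == "__main__":
    frametimes = load_frametimes_ms("frametimes.csv")  # hypothetical log file
    avg, low = fps_summary(frametimes)
    print("Average: %.1f FPS, Minimum: %d FPS" % (avg, low))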

Comments
  • 4 Hide
    mayankleoboy1 , November 2, 2012 4:46 AM
    Nice review! :) 
    In CPU benchmark, it would have been better to see the continuous FPS graph , rather than just the single values of 'Average' and 'minimum' .

    Also, CPU frequency scaling is needed
  • 10 Hide
    esrever , November 2, 2012 5:39 AM
    Interesting that the 1gb on the 7850 starts showing signs of weakness at higher settings even at 1080p. The minimals went lower than the 7770 :o 

    I think nvidia's gpu boost is causing the nvidia cards to have higher average and lower minimals since it can render higher fps when less things are going on but they can only have so much performance when the rendering gets tough. I think GPU boost is a pointless feature because of that since why would anyone want high maximal fps and low minimal fps?
  • 22 Hide
    greghome , November 2, 2012 6:14 AM
    No 7850 2GB to see if it's a memory bottleneck ? :/ 

    and you're missing the 7870 and 7950 in them. just sayin'
  • 17 Hide
    JJ1217 , November 2, 2012 6:23 AM
    You put a 7850 1 GB, so now no one is going to buy a 7850 to play this game, as they'll get the wrong results due to memory bandwidth constraints. People who know about video ram will have no issue with this, but what about those looking for a good cheap video card to run games well? You pretty much just destroyed any chance of someone getting a 7850 for this game, due to the terrible gathering of results.

    Expected more from T.H to be honest.
  • 8 Hide
    JJ1217 , November 2, 2012 6:25 AM
    Woops didn't mean memory bandwidth, meant amount of memory ^.^
  • 6 Hide
    EzioAs , November 2, 2012 6:28 AM
    Quote:
    No 7850 2GB to see if it's a memory bottleneck ? :/ 

    and you're missing the 7870 and 7950 in them. just sayin'


    I'm curious as well, though in my opinion it's most probably a memory bottleneck at 1080p with ultra settings. BF3 already uses more than 1GB with max image settings with 4xAA as well, so if Warfighter uses an updated Frostbite 2 engine, it's highly plausible.

    On the other hand, I'm not fully satisfied that they didn't test the game with the 7870. And how about 560ti and 6870(the 2 very popular card from last-gen), I think at least a couple mid-range card from last gen should be tested
  • 4 Hide
    greghome , November 2, 2012 6:30 AM
    EzioAs: how about 560ti and 6870 (the 2 very popular card from last-gen), I think at least a couple mid-range card from last gen should be tested


    i miss my 6950 on benchmarks.......
    Story of my hardware life.

    First Year, Wow Top of the line
    2nd Year, Still in benchmarks
    3rd Year, Still performing good enough
    4th Year......I need an uphrade
  • 7 Hide
    the3dsgeek , November 2, 2012 6:54 AM
    Can you please do a performance benchmark comparison of NFS most wanted? its running like shit on my GTX670
  • 4 Hide
    ojas , November 2, 2012 7:40 AM
    Liked the way you ran benchmarks, covered all major resolutions with all major detail levels across a wide spectrum of cards.

    Anyway, didn't really read your game review, but Rock, Paper, Shotgun was extremely critical of the game, and i understand their sentiment, because BF3 is similar in some respects.
    http://www.rockpapershotgun.com/2012/10/29/wot-i-think-medal-of-honor-warfighter/

    P.S. Why you no benchmark Sleeping Dogs? It brings my GTX 560 down to 40 fps minimums at 1024x768 at the highest settings...It may be a CPU bottleneck though, have to look into that fully.
  • 7 Hide
    ojas , November 2, 2012 7:52 AM
    the3dsgeek: Can you please do a performance benchmark comparison of NFS most wanted? its running like shit on my GTX670

    Lol that's because it's a sucky console port.
  • -6 Hide
    mohit9206 , November 2, 2012 8:56 AM
    its great to see that entry level cards like 650, 7750 and 7770 are all a viable option even at 1080p at high setting !!! am so proud of my 7750 .. hehe..
    btw i dont agree with toms on the fact that a game becomes "UNPLAYABLE" if its minimum fps drops below 30.
    thats just a load of bulls**t.
  • 2 Hide
    captainblacko , November 2, 2012 9:03 AM
    Im shocked at the Pentium G860's FPS. that's pretty impressive for a £52 CPU!
  • 7 Hide
    Iastfan112 , November 2, 2012 9:08 AM
    I always give a big sigh when I see them acknowledge that the multiplayer is likely a CPU bottleneck....yet we're not going to make any sort of attempt to illustrate where it exists. It'd be lovely to know, for instance, does the 4170's four "cores" help it compared to the i3?

    I understand there would be a significantly greater margin of error compared to the repeatable SP benches but the information would still be pertinent and useful.
  • 12 Hide
    ojas , November 2, 2012 9:46 AM
    mohit9206: its great to see that entry level cards like 650, 7750 and 7770 are all a viable option even at 1080p at high setting !!! am so proud of my 7750 .. hehe.. btw i dont agree with toms on the fact that a game becomes "UNPLAYABLE" if its minimum fps drops below 30. thats just a load of bulls**t.

    Try playing the game (or any game) on a constant 60 and you'll see.

    Of course the level of comfort (as far as fps is concerned) varies from person to person, I personally don't enjoy it when the frame rates drop below 40, and sub 30 is intolerable.

    I guess what Don meant by unplayable was intolerable. And i guess most here, including me, would agree.
  • 6 Hide
    mayankleoboy1 , November 2, 2012 10:15 AM
    ojas: Try playing the game (or any game) on a constant 60 and you'll see. Of course the level of comfort (as far as fps is concerned) varies from person to person, I personally don't enjoy it when the frame rates drop below 40, and sub 30 is intolerable. I guess what Don meant by unplayable was intolerable. And i guess most here, including me, would agree.


    Playing on intel IGP + P4 for many years made me accustomed to 30FPS. :p 
  • 4 Hide
    Onus , November 2, 2012 10:41 AM
    Interesting. I too have to wonder about the 1GB HD7850. The results don't appear to extrapolate cleanly to my 2GB HD7870.
    I've noticed you've used the DDR3 version of the HD6670 in recent tests, and would really like to see the GDDR5 version instead. For those who can't quite afford a HD7750, it seems to me that even the most entry level card for games should be one with GDDR5. Particularly in this case, it looks like this change might cross the line back into "playable" on some settings.
    It is also rather remarkable that an old Athlon II X2 240 can play this game as well as it does. Even though objective measurement might not be possible, I think some subjective observations on its ability to handle Multi-player would be useful.
  • 3 Hide
    ojas , November 2, 2012 11:04 AM
    jtt283: Interesting. I too have to wonder about the 1GB HD7850. The results don't appear to extrapolate cleanly to my 2GB HD7870. I've noticed you've used the DDR3 version of the HD6670 in recent tests, and would really like to see the GDDR5 version instead. For those who can't quite afford a HD7750, it seems to me that even the most entry level card for games should be one with GDDR5. Particularly in this case, it looks like this change might cross the line back into "playable" on some settings. It is also rather remarkable that an old Athlon II X2 240 can play this game as well as it does. Even though objective measurement might not be possible, I think some subjective observations on its ability to handle Multi-player would be useful.

    I think their GPU chart puts the 6670 GDDR5 two tiers above the GDDR3...at par with a 9800GT.

    Also Tom's: Dishonored and Hitman: Absolution. One i know is resource intensive, the other one simple looks great, so i'm interested. :p 
  • 5 Hide
    Katsu_rap , November 2, 2012 11:25 AM
    Can't believe my 560ti isn't in the benches. It's not too long ago that I bought it and I believe many people who bought it a couple months ago aren't planning for another video card upgrade just yet.

    I'm not really complaining but you know, where's the value of mid-range cards if the next gen cards and new games comes out and they aren't even tested?