Mass Effect: Andromeda Performance Review

Hotly anticipated by fans of the franchise, Mass Effect: Andromeda is finally available on the PC, PlayStation 4, and Xbox One. Published by EA after five years of BioWare's development work, this game utilizes DICE's Frostbite 3 engine. Gone is Unreal Engine 3, which BioWare used for Mass Effect 3. Somewhat disappointing, then, is that Andromeda is only compatible with DirectX 11, even though Frostbite 3 can be made to work with DirectX 12 (as in Battlefield 1, for example). But enough chit-chat: let's see how Mass Effect: Andromeda performs on our mid-range gaming PC.

Minimum And Recommended System Requirements

EA suggests minimum and recommended system configurations expected to facilitate smooth frame rates in Mass Effect: Andromeda. Although the publisher's "Are you ready..." page lists predicted performance at 30 FPS, we interpret that as a floor rather than an average that would dip uncomfortably low. With that said, EA's minimum requirement targets 720p using Low quality settings, while the recommended spec aims for 1080p using High details. Based on the following components, this appears to be a fairly demanding title.

Configuration    | Minimum                                    | Recommended
Processor        | Intel Core i5-3570 or AMD FX-6350          | Intel Core i7-4790 or AMD FX-8350
Memory           | 8GB                                        | 16GB
Graphics Card    | GeForce GTX 660 2GB or Radeon HD 7750 2GB  | GeForce GTX 1060 3GB or Radeon RX 480 4GB
Operating System | Windows 7, 8.1, 10 (64-bit only)           | Windows 7, 8.1, 10 (64-bit only)
Disk Space       | 55GB                                       | 55GB
Audio            | DirectSound-compatible                     | DirectSound-compatible

Radeon vs GeForce

We begin with a graphics quality comparison between AMD's Radeon and Nvidia's GeForce cards. Since this is a multi-platform game, destined to run on AMD-based consoles and a diverse range of PCs, it would be disappointing if the two architectures didn't produce identical output.

[Screenshot comparison: GeForce vs. Radeon]

Aside from a handful of pixels, the images are largely identical. That's good news.

Graphics Settings

You have access to enough quality settings for a customized experience, but not so many that you're overwhelmed by technical terms and incomprehensible parameters.

Four presets are available: Low, Medium, High, and Ultra. Of course, you can also fine-tune the underlying options.

The most obvious differences between High and Ultra are in the terrain textures (specifically, the foliage beside paths, which is much more detailed in Ultra mode) and the distance at which decorative elements appear (for example, the greyish-blue block you can see in the Ultra screenshot, but not in the High one).

[Screenshot comparison: High vs. Ultra]

Dropping down to the Medium preset, shadows and textures are much less detailed. This preset also forces a 900p render resolution, which is upscaled to 1080p, resulting in a slightly pixelated look. Anti-aliasing is also less effective under the Medium preset.

The Low setting further chips away at fidelity, resulting in blurrier, more pixelated, and somewhat faded visuals. Shadows and the draw distance of decorative items are reduced to a minimum, while the render resolution falls to 720p before being upscaled to 1080p (see the quick calculation after the screenshots below). It goes without saying that you should only suffer the Low preset if your hardware can't handle anything higher.

[Screenshot comparison: Low vs. Medium]
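
To put those render resolutions in perspective, here's a quick pixel-count comparison (a standalone Python sketch; it assumes the standard 16:9 dimensions for 900p and 720p, which the game only names in shorthand). Shading cost scales roughly with rendered pixels, while the upscale to 1080p is comparatively cheap:

```python
# Rough pixel-count comparison of the internal render resolutions
# used by the presets described above (16:9 assumed throughout).
resolutions = {
    "High/Ultra (native 1080p)": (1920, 1080),
    "Medium (900p upscaled)": (1600, 900),
    "Low (720p upscaled)": (1280, 720),
}

native = 1920 * 1080

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / native:.0%} of native 1080p)")

# Expected output:
# High/Ultra (native 1080p): 2,073,600 pixels (100% of native 1080p)
# Medium (900p upscaled): 1,440,000 pixels (69% of native 1080p)
# Low (720p upscaled): 921,600 pixels (44% of native 1080p)
```

In other words, the Low preset asks the GPU to shade less than half as many pixels per frame as native 1080p, which is where most of its performance headroom comes from.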

Obvious Bugs

Although Mass Effect: Andromeda looks great, its physics engine seems a little lost at times. We've seen a number of collision problems and objects floating in the air where they shouldn't. Hopefully a future patch addresses these issues.

MORE: Ghost Recon Wildlands Performance Review

MORE: For Honor Performance Review

MORE: Resident Evil 7: Biohazard, Benchmarked

Comments from the forums
  • WINTERLORD
    Glad to see Tom's Hardware doing articles like this. I think it might have been cool to see a Radeon Fury in there, since it's limited to 4GB of RAM; I don't own one, but I'm sure somebody does. Sticking to the cards listed in the article, though, it would have been nice to see how the game plays at 4K resolution. At any rate, great article.
    2
  • IceMyth
    Hmmm... I don't think this article is accurate when it comes to GPU FPS. Yes, the 1050 and 460 are the worst for this game, but the other GPUs are not.

    The problem I see is that you didn't eliminate the CPU bottleneck, which affects GPU performance as well. For example, PC Gamer used a different setup to eliminate the CPU bottleneck, and all the GPUs used were MSI cards. The results they got show the MSI RX 480 getting slightly higher FPS than the MSI 1060 (which is clocked 300MHz higher) on Ultra, while on Medium settings the RX 480 is faster by around 10 FPS.

    I know this is not a GPU benchmark, but since you include FPS/CPU/memory numbers, it is a way of benchmarking computer hardware.
    http://www.pcgamer.com/mass-effect-andromeda-pc-performance-analysis/

    Are you sure Windows 10 has around 47% market share? Everything I've heard, and everything I find by googling, says the 47% share belongs to Windows 7, not Windows 10. If you mean Win10 64-bit vs. Win10 32-bit market share, then that is something else.
    1- https://www.neowin.net/news/windows-xps-market-share-takes-another-hit-as-windows-7-and-10-rise
    2- https://www.wincert.net/microsoft-windows/windows-10/windows-10-market-share-without-changes/
    3- https://betanews.com/2017/03/01/windows-10-loses-share-again/

    Regards,
    6
  • brandxbeer
    Where are the 2K and 4K benchmarks? If we wanted a 1080p 1050 review, I'd look at a PS4 Pro review. Stop being lazy with your benchmarks.
    -4
  • rantoc
    Personally, I've disabled the motion blur (you need to make a cfg file; google it).
    Changed to HALF16 instead of Compressed to get less washed-out graphics with more depth to them.
    Changed to double-buffering from triple-buffering to get rid of the console-like high latency in the game = much faster response, and it also appeared to give better frame pacing.

    3440x1440 running at around 80-90 fps depending on the area (G-Synced at 100Hz), with no stuttering after the above was fixed.

    The above, along with some minor tweaks to light/shadow, and the game is both responsive and really beautiful.
    2
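
For readers wondering about the cfg file rantoc mentions: the usual approach for Frostbite games is a plain-text user.cfg placed in the game's install directory, next to the executable. A minimal sketch follows; the console variables are the ones commonly cited for Frostbite titles, not settings confirmed for Mass Effect: Andromeda, so verify them before relying on this:

```
WorldRender.MotionBlurEnable 0
RenderDevice.TripleBufferingEnable 0
```

The second line corresponds to the double-buffering change described above; deleting the file reverts everything.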
  • Jan_26
    Anonymous said:
    Personally, I've disabled the motion blur (you need to make a cfg file; google it).
    Changed to HALF16 instead of Compressed to get less washed-out graphics with more depth to them.
    Changed to double-buffering from triple-buffering to get rid of the console-like high latency in the game = much faster response, and it also appeared to give better frame pacing.

    3440x1440 running at around 80-90 fps depending on the area (G-Synced at 100Hz), with no stuttering after the above was fixed.

    The above, along with some minor tweaks to light/shadow, and the game is both responsive and really beautiful.


    I think you mean you changed from double-buffering to triple-buffering, as triple-buffering is superior to double-buffering in every respect except GPU RAM requirements :-)
    3
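
The trade-off these two posts are debating can be captured in a toy model (plain Python, nothing engine-specific; the render times are invented inputs). With vsync and double buffering, a finished frame waits for the next refresh, so frame time rounds up to a multiple of the refresh interval; a third buffer lets the GPU start on the next frame immediately:

```python
import math

REFRESH_MS = 1000 / 60  # one 60Hz refresh interval, in milliseconds

def double_buffered_ms(render_ms):
    # Vsync + double buffering: the frame can only be shown on a refresh
    # boundary, so its effective time rounds up to a multiple of REFRESH_MS.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def triple_buffered_ms(render_ms):
    # A third buffer keeps the GPU busy; throughput is limited only by
    # render time or the refresh rate, whichever is slower.
    return max(render_ms, REFRESH_MS)

for render_ms in (10, 18, 25):
    double_fps = 1000 / double_buffered_ms(render_ms)
    triple_fps = 1000 / triple_buffered_ms(render_ms)
    print(f"{render_ms}ms frame -> double: {double_fps:.0f} fps, "
          f"triple: {triple_fps:.0f} fps")

# Expected output:
# 10ms frame -> double: 60 fps, triple: 60 fps
# 18ms frame -> double: 30 fps, triple: 56 fps
# 25ms frame -> double: 30 fps, triple: 40 fps
```

This idealized model backs Jan_26's general claim; rantoc's reply below is about a specific implementation (SLI plus temporal AA) where the extra buffering introduces frame-pacing problems the model ignores.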
  • Masterarms
    This game is a joke: a horribly rendered game with more bugs and glitches than content. The game is a flop; use something else.
    5
  • bgunner
    Quote "Then again, 600 years have passed since Mass Effect 3, so perhaps evolution is to blame?"

    This could also be the reason of the horrible facial expressions used. After having their faces in one position for so long, 600+ years, they don't have much muscle control to express themselves properly. lol

    As for the physics engine issues, they seem to pop up everywhere: dead bodies half in the ground, wildlife materializing inside the ground, floating objects, and the occasional shot that hits an enemy without registering. These can all be a turn-off, but when certain parts of the terrain are rendered just plain wrong, it's a full deal-breaker.

    I see this most commonly on the Tempest's bridge, looking into the escape pod room where Peebee hangs out. Most times it will show space (stars, nebulas, and blackness), then slowly switch over to the actual room view.

    GPU and CPU performance aside, this thing needs some serious patch work.
    4
  • dstarr3
    I'm hoping to get another year and a half or two years out of my 980Ti. And judging by the 1060/1070 here, it's good to see that I'll still be able to run this year's AAA games at 1080p/60/Ultra. Hang in there, buddy! Once the 1180Ti or whatever comes out, then you can take a well-deserved vacation!
    0
  • bgunner
    Anonymous said:
    Where are the 2K and 4K benchmarks? If we wanted a 1080p 1050 review, I'd look at a PS4 Pro review. Stop being lazy with your benchmarks.



    It is obvious the whole meaning of this article is completely lost on you, even though it was spelled out for you. And I quote: "How does it run on mainstream gaming hardware? We benchmark it on eight different graphics cards to find out." This was not a benchmark of the top GPUs and top CPUs, but a benchmark of mid-range hardware.

    For that, the article served its purpose, but I would have liked to see other CPUs added into the mix to show at what points the CPU bottlenecks appear.

    Because very few Steam users have 2K monitors, and even fewer have 4K, the mid-range segment sits at 1920x1080. For this game to run at 4K, more than one GTX 1080 would be necessary, which again puts it well outside the mid-range hardware this article was meant to cover.
    3
  • rantoc
    Anonymous said:
    Anonymous said:
    Personally, I've disabled the motion blur (you need to make a cfg file; google it).
    Changed to HALF16 instead of Compressed to get less washed-out graphics with more depth to them.
    Changed to double-buffering from triple-buffering to get rid of the console-like high latency in the game = much faster response, and it also appeared to give better frame pacing.

    3440x1440 running at around 80-90 fps depending on the area (G-Synced at 100Hz), with no stuttering after the above was fixed.

    The above, along with some minor tweaks to light/shadow, and the game is both responsive and really beautiful.


    I think you mean you changed from double-buffering to triple-buffering, as triple-buffering is superior to double-buffering in every respect except GPU RAM requirements :-)


    Actually, no. Proper triple buffering, as long as the VRAM can afford it, runs great on one card and is usually what I recommend myself, but ME:A's implementation with multiple cards / temporal AA causes frame time issues.

    Double-buffering / temporal AA works quite OK, considering temporal AA has to transfer the previous frame back and forth between the cards, and an Nvidia double SLI bridge seems to be enough for quite acceptable min/avg FPS.
    1
  • brandxbeer
    Anonymous said:
    Anonymous said:
    Where are the 2K and 4K benchmarks? If we wanted a 1080p 1050 review, I'd look at a PS4 Pro review. Stop being lazy with your benchmarks.



    It is obvious the whole meaning of this article is completely lost on you, even though it was spelled out for you. And I quote: "How does it run on mainstream gaming hardware? We benchmark it on eight different graphics cards to find out." This was not a benchmark of the top GPUs and top CPUs, but a benchmark of mid-range hardware.

    For that, the article served its purpose, but I would have liked to see other CPUs added into the mix to show at what points the CPU bottlenecks appear.

    Because very few Steam users have 2K monitors, and even fewer have 4K, the mid-range segment sits at 1920x1080. For this game to run at 4K, more than one GTX 1080 would be necessary, which again puts it well outside the mid-range hardware this article was meant to cover.


    No, I fully understand the point of it. I still find it lazy, though. Most of the benchmarks were done during the EA Origin early access period, without patches or driver updates. I was mostly hoping for proper benchmarks with all available GPUs to compare, plus a proper 1080 Ti benchmark. I also thought it would go much deeper into the graphics settings, with screenshots and the FPS gain/loss for each setting, like GeForce.com or GamersNexus do. I guess I just expected more from Tom's.
    The game is great so far, though. I haven't seen any major glitches yet, only some minor clipping issues.
    -2
  • cryoburner
    Anonymous said:
    Hmmm... I don't think this article is accurate when it comes to GPU FPS. Yes, the 1050 and 460 are the worst for this game, but the other GPUs are not.

    The problem I see is that you didn't eliminate the CPU bottleneck, which affects GPU performance as well. For example, PC Gamer used a different setup to eliminate the CPU bottleneck, and all the GPUs used were MSI cards. The results they got show the MSI RX 480 getting slightly higher FPS than the MSI 1060 (which is clocked 300MHz higher) on Ultra, while on Medium settings the RX 480 is faster by around 10 FPS.

    That is a good point, and it might have been nice if they had tested with a few different CPUs. Even an i5-6600 could have made a notable difference in performance if the RX480 was getting CPU-limited more often on the 6500 than the GTX 1060, and that's still very much a "mainstream" CPU. They tested with 8 different graphics cards with prices ranging from under $100 for the RX 460, to around $300 for that GTX 1060 Strix OC (and someone could have paid even more for a GTX 970 or R9 390), and yet they only tested with a single $200 CPU, in a game that's getting CPU limited with some of the cards. Why not likewise test with a $100, $200, and $300 CPU to round things out?

    Anonymous said:

    Are you sure Windows 10 has around 47% market share? Everything I've heard, and everything I find by googling, says the 47% share belongs to Windows 7, not Windows 10.

    Have a look at Steam's latest hardware survey, which should provide a better depiction of systems that are actually used for gaming...
    http://store.steampowered.com/hwsurvey?platform=pc
    Currently, Steam is showing Windows 10 64 bit at 52.22% and rising, while Windows 7 64 bit is at 31.20% and dropping, among Windows versions. Those articles are undoubtedly counting business systems in the mix, which tend to be slow to upgrade to avoid having to retrain staff, as well as any potential hardware and software conflicts that might arise from moving to a new OS.

    Anonymous said:

    As for the physics engine issues, they seem to pop up everywhere: dead bodies half in the ground, wildlife materializing inside the ground, floating objects, and the occasional shot that hits an enemy without registering. These can all be a turn-off, but when certain parts of the terrain are rendered just plain wrong, it's a full deal-breaker.

    I see this most commonly on the Tempest's bridge, looking into the escape pod room where Peebee hangs out. Most times it will show space (stars, nebulas, and blackness), then slowly switch over to the actual room view.

    That makes me wonder if that kind of pop-in could actually affect the results of these framerate tests. If a particular GPU took longer to load something in, it might not need to render that object until later, potentially resulting in increased framerates (or the opposite). Even when viewing Tom's video of their benchmark run, I noticed a character briefly "teleport in" at 0:13. Had that character loaded sooner, or not at all, it seems like they could have potentially affected the framerate in some way.
    0
  • ledhead11
    I've been playing this on both of my rigs. Yeah, the game is buggy. Smooth flowing is not something that really applies anywhere for more than a few seconds here and there. I do recommend installing it on an SSD. I tried it on SATA III RAID 0 platter arrays on both rigs, and when I put it on an SSD I saw a pretty big improvement in load times; all the annoying pauses between cutscene transitions were also greatly minimized. They're still there, but at least now it feels more like a dropped frame or two as opposed to feeling like something just crashed.

    Old rig: 2600K (4.2GHz), 16GB (1333MHz), Z68, 2x G1 970s in SLI, 1440p, 144Hz G-Sync
    New rig: 4930K (4.1GHz), 32GB (2133MHz), X79, 2x Xtreme 1080s in SLI, 4K, 60Hz V-Sync

    The old rig @ 1440p: all textures/shadows maxed, FXAA and G-Sync (all V-Sync off), averaging 40-80 FPS. It uses roughly 6-10GB of RAM, stays pinned at the 970's 3.5GB VRAM limit, and according to MSI Afterburner uses a pagefile of around 10-20GB. The 970s stay fairly cool in the 60-70C range. CPU usage averages 40-50%. It's obvious the game wants more VRAM than the 3.5GB the 970s give it, but it's otherwise very enjoyable thanks to G-Sync.

    New rig @ 4K/60Hz: everything maxed, V-Sync on. It averages 50-60 FPS most of the time and dips to ~40-45 FPS. It will use ~10-16GB of RAM, with VRAM mostly around 6-7GB and sometimes a little higher. Afterburner reports the same 10-20GB pagefile. This game makes my 1080s run HOT; I've never seen this in any of the other demanding games I own. They average 70-80C, and only in this game at this time. Witcher 3/GTA V/Fallout 4/Doom don't do this; they all hang around 60-70C. CPU usage is around 35-45%. The game recognized and enabled HDR10 for the HiSense 4K HDR TV I have connected; I didn't have to do anything other than leave it on auto. Pretty cool on that front.

    The play experience is nearly identical between both rigs now, other than resolution. Both rigs originally had the same RAID 0 platter setup (2x Seagate 500GB SATA III), but now the 2600K rig has a Toshiba OCZ 960GB SSD. I can't emphasize enough what a difference an SSD makes for this game. An SSD won't cure this game's bugs, but it will make them less painful.
    1
  • photonboy
    *Some of your CONCLUSIONS are probably misleading. The difference in VRAM capacity probably explains the differences in both system and video memory usage.

    For example, the RX 480 8GB may buffer more in video memory because it can (even though it may not NEED more than 6GB of what it buffers). Conversely, the GTX 1060 6GB probably buffers more in system memory because it doesn't have enough VRAM.

    **Where the CONFUSION comes in, over and over with different games, is the assumption that the game NEEDS all of that buffered data or issues happen. That is NOT necessarily true. Some games buffer as long as there's sufficient memory, and those that don't have it may simply swap data in during a level load or other transition, with minimal impact on gameplay.
    1
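
The "buffers more because it can" behavior photonboy describes is essentially opportunistic caching. Here's a toy sketch (plain Python; the asset names and sizes are invented for illustration and have nothing to do with Frostbite's actual streaming) showing why the same workload reports higher memory usage on a larger card:

```python
from collections import OrderedDict

class VramCache:
    """Toy model of opportunistic asset caching: fill whatever memory
    exists, and evict the least-recently-used asset only when a new
    one doesn't fit."""

    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.used = 0
        self.assets = OrderedDict()  # name -> size in MB, LRU order

    def request(self, name, size_mb):
        if name in self.assets:
            self.assets.move_to_end(name)  # hit: mark as recently used
            return
        while self.assets and self.used + size_mb > self.capacity:
            _, evicted_size = self.assets.popitem(last=False)  # drop LRU
            self.used -= evicted_size
        self.assets[name] = size_mb
        self.used += size_mb

# Identical workload (twelve 512MB textures, accessed in a loop) on a
# 4GB card and an 8GB card:
for capacity_mb in (4096, 8192):
    cache = VramCache(capacity_mb)
    for i in range(40):
        cache.request(f"texture_{i % 12}", 512)
    print(f"{capacity_mb}MB card: {cache.used}MB resident")

# Expected output:
# 4096MB card: 4096MB resident
# 8192MB card: 6144MB resident
```

Both runs handle the same scene; the larger card simply keeps more assets resident, which is why raw usage numbers shouldn't be read as hard requirements.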
  • Achoo22
    When I look at the system requirements, it's the CPU requirements that most pique my interest. I value the recommended AMD part much less than the Intel part listed for basic operation, so I would've liked to see the game benchmarked on AMD CPUs in addition to Intel iron.
    1
  • jerm1027
    This game punishes my old 2500K @ 4.2GHz, 8GB of 1866MHz RAM, and R9 380X. I play with customized settings that are probably between Medium and High: native 1080p, high shadows, high textures, and high AA, but most other things turned down to low. I'm particularly sensitive to aliasing and can't stand super low-res, blocky shadows. I also have FreeSync, so despite running at a higher resolution and having all my hardware pegged, I'm getting a pretty good experience, bizarre animation aside.
    0
  • cinergy
    Too feministic game for my liking. Pass.
    -4
  • Burstaholic
    That settles it: gotta be my CPU. My R9 Fury running at 1105/750 is still seeing 20-40 FPS at 1080/Ultra (automatic). My Phenom II X4 960T must be the culprit.

    Can't wait to jump on the Ryzen 5 release next week for a shiny new 1600x :D
    0
  • WyomingKnott
    Anonymous said:
    Too feministic game for my liking. Pass.


    Feministic how?
    0
  • bgunner
    Anonymous said:
    Anonymous said:
    Too feministic game for my liking. Pass.


    Feministic how?


    Have you noticed how the NPC humans stand? Chest and stomach puffed out, with a hand on the hip, like a woman. :lol:
    -1