Guild Wars 2: Your Graphics Card And CPU Performance Guide

Image Quality And Settings

The game’s art features lots of bright colors. It's almost cartoonish, but in the style of a graphic novel rather than the exaggerated caricatures of World of Warcraft. Guild Wars 2 doesn't push any stylistic envelopes, but it's still a great-looking game.

There are a number of different sliders available for tuning Guild Wars 2's graphics options, but we're focusing on the three main presets here today: Best Performance, Balanced, and Best Appearance.

The Best Performance preset applies no anti-aliasing, uses low-quality textures and details, and disables shadows and post-processing effects. As spartan as that sounds, the game still looks decent, and it runs well on low-end hardware (as you'll see in the benchmarks).

The Balanced preset also forgoes anti-aliasing, but it benefits from Medium texture detail, shadows, terrain and sky reflections, and Low post-processing effects. It looks markedly better than Best Performance, but understandably requires more graphics muscle to achieve adequate performance.

The Best Appearance preset enables FXAA anti-aliasing, high-quality textures, the Ultra shadows setting, High post-processing effects, and All reflections. Performance takes a significant hit at these settings, pushing many low- to mid-range graphics cards past their limits.
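For quick reference, the preset differences described above can be captured in a small lookup table. This is just an informal sketch in Python; the key names are our own shorthand, not the game's actual option labels, and the Balanced preset's shadow level is listed only as enabled because the game doesn't surface a specific tier for it in that preset's description.

```python
# Summary of the three Guild Wars 2 graphics presets as described above.
# Keys are informal shorthand, not the in-game option names.
PRESETS = {
    "Best Performance": {
        "anti_aliasing": "None",
        "textures": "Low",
        "shadows": "None",
        "reflections": "None",
        "post_processing": "None",
    },
    "Balanced": {
        "anti_aliasing": "None",
        "textures": "Medium",
        "shadows": "Enabled",          # level not specified by the preset
        "reflections": "Terrain & Sky",
        "post_processing": "Low",
    },
    "Best Appearance": {
        "anti_aliasing": "FXAA",
        "textures": "High",
        "shadows": "Ultra",
        "reflections": "All",
        "post_processing": "High",
    },
}

# Print a simple side-by-side comparison of the presets.
for setting in PRESETS["Balanced"]:
    row = "  ".join(f"{PRESETS[p][setting]:<14}" for p in PRESETS)
    print(f"{setting:<16}{row}")
```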

    Top Comments
  • tomfreak
    Guild wars 2 min system requirements is Core 2 Duo 2.0 Geforce 7800, I would like u to test base on that too.
    13
  • EzioAs
    Interesting. The less-than-$100-without-external-power-connector Radeon 7750 is a balance card providing appealing visual while still runs good framerates at 1080p. I imagined if you tinker with the Best Appearance preset a little bit you can get better image quality without framerates dropping below 30. I mean let's face it, who plays on their PC without tinkering the settings here and there, that's just stupid.

    Great review as always! Really appreciate it
    12
  • rdc85
    I'm wonder how my pII x4 955BE will perform, there none in the chart...

    Anyone know? at stock speed and at 3.8 O.C....
    11
  • Other Comments
  • cmcghee358
    Hmm.. shame I can never touch an MMORPG ever again...
    8
  • haplo602
    get your graphs and test setup to match:

    Radeon HD 6450 512 MB GDDR5
    Radeon HD 6670 512 MB DDR3
    Radeon HD 7770 1 GB GDDR5
    Radeon HD 6850 1 GB GDDR5
    Radeon HD 7870 2 GB GDDR5
    Radeon HD 7970 3 GB GDDR5

    where's the 6850 in the graphs ? There's a 6870 instead ...
    -8
  • stingstang
    I'm disappointed that this neglects the post processing bar when determining if the best appearance setting is enabled when taking in to account processing ability. I have an fx4100 and an hd 7950. How will that do at high grahhics settings?
    Duh, we want to know this stuff.
    2
  • EzioAs
    Anonymous said:
    get your graphs and test setup to match:

    Radeon HD 6450 512 MB GDDR5
    Radeon HD 6670 512 MB DDR3
    Radeon HD 7770 1 GB GDDR5
    Radeon HD 6850 1 GB GDDR5
    Radeon HD 7870 2 GB GDDR5
    Radeon HD 7970 3 GB GDDR5

    where's the 6850 in the graphs ? There's a 6870 instead ...


    Although it's probably a typo, there's probably no need to use the 6850 as well since the 7770 should perform similar
    3
  • dudewitbow
    Now I wonder where some people got the idea that GW2 was nvidia favored O_o
    0
  • dormantreign
    You gots me wanting to buy this game......
    3
  • serhat359
    stingstang said:
    I'm disappointed that this neglects the post processing bar when determining if the best appearance setting is enabled when taking in to account processing ability. I have an fx4100 and an hd 7950. How will that do at high grahhics settings?
    Duh, we want to know this stuff.

    It will be cpu limited, you'll get around 35-40fps
    5
  • dudewitbow
    Tomfreak said:
    Guild wars 2 min system requirements is Core 2 Duo 2.0 Geforce 7800, I would like u to test base on that too.

    gpu wise, the gt version of the 7800 will perform about the same level as the 6450 in question. a low end core 2 duo will be on the lower end of the cpu chart.
    1
  • burmese_dude
    Playing mine on EVGA GTX 570 SC... everything runs smoothly. The game is absolutely great. I was a 50/50 member of GW1 and I was hoping GW2 wouldn't disappoint. GW2 truly delivers fun experience and entertainment without having to shell out 10 or 15 monthly. My gaming laptop with GT 650M with output to 1080p display to a TV also performed smoothly in medium setting. I haven't tried max setting though on that.
    0
  • falchard
    Ugh really hate developers who do that. Its not just low performance with the Bulldozer cores. Its almost purpostantial performance loss, or a big error in coding. Heard that sometimes with Bulldozer everything gets piled onto 1 core like the engine has no idea what to do with the architecture.
    Also DX9, are they serious? THIS IS 2012. DX10 is 6 years old. Get with it already and learn to code a game engine. Its not like this is a multi-platform game.
    2
  • dudewitbow
    burmese_dude said:
    Playing mine on EVGA GTX 570 SC... everything runs smoothly. The game is absolutely great. I was a 50/50 member of GW1 and I was hoping GW2 wouldn't disappoint. GW2 truly delivers fun experience and entertainment without having to shell out 10 or 15 monthly. My gaming laptop with GT 650M with output to 1080p display to a TV also performed smoothly in medium setting. I haven't tried max setting though on that.


    the 650m will perform similarish to the 7750 in question

    falchard said:
    Ugh really hate developers who do that. Its not just low performance with the Bulldozer cores. Its almost purpostantial performance loss, or a big error in coding. Heard that sometimes with Bulldozer everything gets piled onto 1 core like the engine has no idea what to do with the architecture.
    Also DX9, are they serious? THIS IS 2012. DX10 is 6 years old. Get with it already and learn to code a game engine. Its not like this is a multi-platform game.


    being dx9, it allows users who still use windows XP to play without someone creating a mod or use the directx hack to force xp to run it. I mean skyrim also runs on DX9
    6
  • dudewitbow
    amuffin said:
    A dual core SB pentium outperforming an 8 core FX.........ROFL!

    Thats why I really hope that piledriver/steamroller pulls through.
    8
  • amuffin
    A dual core SB pentium outperforming an 8 core FX.........ROFL!
    3
  • Cryio
    You tested the game only in DirectX 9. Where is DirectX10 and 11? Or aren't they implemented yet (post-release patch)?
    7
  • mmstick
    This definitely shows us that the way GW2 was compiled does not favour AMD instructions or architecture, but is compiled in such an inefficient manner that it only supports Intel.
    -10
  • Cryio
    Quote:
    A dual core SB pentium outperforming an 8 core FX.........ROFL!


    That really makes you wander, if games/programs really know to put Bulldozer to work. I think it just sits there, idling at least 50% of processor raw power.
    8
  • alidan
    i have a question...

    intel has moved on from their core2 line, and came out with higher preforming parts, amd has moved from athlon and phenom line to... a new architecture, i dont know if they match the old one yet or not.

    but when you are doing a cpu test on a game like this where its very scaleable, it would be nice to see the core 2 dual and quad, also a phenom dual tri and quad core (from what i understand athlon and phenom for most gaming scenarios are the same) because many of us have the old dual core, and quad core cpus, and dont feel the need to upgrade because its just not nessassary for normal computer use yet.
    11