
BF3 6950 crossfire issue

Tags:
  • Graphics Cards
  • Battlefield
  • Crossfire
  • Graphics
January 9, 2012 11:29:04 PM

Hi Forum,

This post is primarily about Battlefield 3.

1. I play at 1080p.
2. CPU is currently OC'd to 3.6 GHz.
3. Deciding between 6950 2GB CrossFire or a single GTX 580 1.5GB only (no other setup/card).
4. Microstutter is the main focus.
5. I will use 2x-4x MSAA for BF3.
6. Purchase date: now to 2 weeks (not waiting for new cards from NVIDIA/AMD).



So, I'm hoping for insight from current/retired users of 6950 CrossFire (preferably ones who play FPS games, or even better BF3) regarding
microstuttering on the 6950 CF. I'd like to know how "bad" it is: how noticeable is the decrease in smoothness / increase in stuttering
when going from a single 6950 to crossfired 6950s?

Secondly, I hear SLI has less microstuttering than CrossFire. Is this true?


Thanks in advance.
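For anyone unfamiliar with the term: microstutter is uneven frame pacing, not low average FPS. AFR CrossFire/SLI renders alternate frames on each card, so the average can look great while individual frames arrive unevenly. A minimal sketch of why average FPS hides this (the frame times below are made-up illustrative numbers, not measurements):

```python
# Compare even vs. uneven frame pacing at the same average FPS.
# All numbers are invented for illustration.

def avg_fps(frame_times_ms):
    """Average FPS over a list of per-frame render times in milliseconds."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame_fps(frame_times_ms):
    """Instantaneous FPS of the slowest frame, which is what stutter feels like."""
    return 1000.0 / max(frame_times_ms)

single_card = [16.7] * 6           # steady ~60 FPS pacing
afr_crossfire = [8.4, 25.0] * 3    # alternating fast/slow frames, same average

print(avg_fps(single_card), worst_frame_fps(single_card))      # ~59.9, ~59.9
print(avg_fps(afr_crossfire), worst_frame_fps(afr_crossfire))  # ~59.9, 40.0
```

The two sequences report the same average, but the AFR-style sequence dips to an instantaneous 40 FPS on every other frame, which is what a player perceives as stutter.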


January 10, 2012 12:20:53 AM

I have 2x 6950s unlocked to 6970s. I play BF3 on my LG LH90 TV, which is supposed to be laggier than a monitor, and I don't feel any microstutter in the game. And yes, I play with everything on Ultra... I am a video quality freak. Now, my bro has 1x 6950 unlocked to a 6970, and I do get the feeling that his machine is somewhat smoother for some reason, with everything else being similar (except he has to turn MSAA off to get 55 FPS). My gameplay is not choppy at all; his is just smoother. It could be that I like to play with 240 Hz on my TV, so the input lag is a lot higher than on his monitor, which has a 2 ms response time.

You've listed that you don't want to wait for the new cards. Why? The 7950 will be out in February, and it is easily going to beat the GTX 580: HardOCP overclocked the 7970 by 30% and it came out 40% faster than an overclocked 580. So take the 7950, overclock it, and beat the GTX 580. You also get 3 GB of VRAM, so games like Crysis 2 or BF3 that want 1.5 GB+ of VRAM are not an issue, and you won't have problems with newer games that need more. I also have a feeling the 7950 is going to be just like the 7970; it looks identical, so it might have hidden shaders that can be unlocked. The street price is rumoured at $400-$450, same as the GTX 580. Why not wait?
January 10, 2012 2:09:44 AM

dharmenparikh said:
I have 2x 6950s unlocked to 6970s. I play BF3 on my lg LH90 tv which is supposed to be laggier than a monitor and I don't feel any microstutter in the game. [...] The street price is rumoured at $400-$450 same as GTX 580. Why not wait?



Thanks for your input. The reason I don't want to wait for the 7950 is that I plan to play BF3 with MSAA on. I've noticed in various benchmarks that once MSAA is on in BF3, AMD cards' FPS drops quite a bit, and not as much for the GTX 580. This leaves the 7970 with only a small FPS advantage over the 580, while costing an extra $200 for me here. So the 7950 will definitely be slower than the 7970, or roughly the same speed as the GTX 580 while likely costing as much if not more, which is why I figured I wouldn't wait.
So I'm trying to determine whether microstutter will be an issue with 6950 CF, which would perform better than the 7970 at a lower price.

http://www.guru3d.com/article/radeon-hd-7970-crossfire-...
http://www.anandtech.com/show/5261/amd-radeon-hd-7970-r...
http://www.hardwareheaven.com/reviews/1373/pg5/xfx-r797...
January 10, 2012 1:03:37 PM



http://www.hardocp.com/article/2011/11/02/battlefield_3...
Oh, if you can get the GTX 580 for $200 less, then definitely get that over the 7970/7950. However, there is a dirty little secret about BF3: per HardOCP's analysis, MSAA adds almost nothing to image quality if you have FXAA on. They say if you can afford to turn it on then do so, but it's not a deal-breaker at all. You might even consider a GTX 570 or a single 6970 and save money. My bro plays with MSAA off on a 6970 and gets 55 FPS on average in multiplayer: very smooth, no issues, no microstutter, nothing. And I'm sure you'd save another $50-$100 over the GTX 580.



http://www.hardocp.com/article/2011/11/22/battlefield_3...

Here is the summary from HardOCP on BF3 AA:

Anti-Aliasing in Battlefield 3
More and more, we feel that hardware multisampling is in danger of becoming an obsolete technology. Battlefield 3 gives us more ammunition to support that idea. As it is currently implemented, it is limited to old-school rendering pipelines without considerable hacking. It is workable with deferred shading models, but it still has its limitations. It can't address edge aliasing produced by lighting stages, and it inflicts a far more damaging performance penalty than its image quality improvement justifies. It is a nice old technology, and it has served us well, but there are better options.

Shader AA technology has gotten better. One of the first times we ever saw it was in S.T.A.L.K.E.R.: Shadow of Chernobyl. At that time it was, to be frank, ugly and ineffective. But times have changed and shader AA has steadily improved. With the advent of technologies like NVIDIA's FXAA and AMD's MLAA, computational AA is now a legitimate threat to the legitimacy of traditional multisampling. While we have seen FXAA and MLAA to be great systems, it still wouldn't have been possible without the rampant increase in computational horsepower we've witnessed in desktop GPUs. Even the inexpensive Radeon HD 6870 didn't even flinch when we enabled High FXAA at 1920x1200. In fact, it barely seemed to notice that there was an extra processing load.

In Battlefield 3, MSAA is disadvantaged. With a deferred shading engine, MSAA is challenged from the beginning. There are things it just can't do. It can't address edge aliasing that is exaggerated by the lighting stage, because it happens before lighting. It can't reduce aliasing due to shaders (sometimes referred to as "specular aliasing") or transparent textures without external help from AMD and NVIDIA control panel options. FXAA doesn't share any of these problems. It's not perfect, but FXAA does exactly what it sets out to do: it is an effective and very fast approximation of multi-sampling in a single-pass shader program. It smoothes geometry edges, alpha texture edges, lighting edges, and specular aliasing. And it does these things very quickly.

Let's not forget that MSAA and FXAA can be used together uniquely in this game. DICE talked up that possibility, and even mentioned that they "complement each other," but we feel that the reality of the situation does not warrant much excitement. Yes, they work together, but there is no immediately and persistently noticeable reason to do it. If you take still screenshots and zoom in a few hundred percent, it is easy to find differences side by side. But if you play the game, chances are you'll never actually see what is different with MSAA and FXAA as opposed to just FXAA.

In AMD's own review guide for this game, AMD recommends using FXAA instead of MSAA. This is a bold statement from AMD, since FXAA is the competitor's technology. This lends credence to the benefit and positive effect that today's shader-based AA technology provides. It is very easy to see that FXAA is more effective than MSAA in this particular game title.
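To make the "single-pass shader program" point from the quote concrete, here is a heavily simplified, hypothetical sketch of post-process AA: examine the luminance contrast around each pixel and blend where an edge is found. Real FXAA additionally estimates edge direction, searches along the edge, and handles subpixel aliasing; the threshold constant below is invented for illustration.

```python
# Toy post-process AA: blend a pixel toward its neighbors where local
# contrast is high (an edge). Greatly simplified vs. real FXAA, which
# also estimates edge direction and searches along the edge.

EDGE_THRESHOLD = 0.2  # hypothetical tuning constant

def smooth(image):
    """image: 2D list of grayscale values in [0, 1]. Returns a new image."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = image[y][x]
            neighbors = [image[y-1][x], image[y+1][x],
                         image[y][x-1], image[y][x+1]]
            contrast = max(neighbors + [c]) - min(neighbors + [c])
            if contrast > EDGE_THRESHOLD:
                # blend toward the neighborhood average to soften the edge
                out[y][x] = 0.5 * c + 0.5 * (sum(neighbors) / 4.0)
    return out

# A hard vertical edge: left half black, right half white.
img = [[0.0, 0.0, 1.0, 1.0] for _ in range(4)]
softened = smooth(img)
```

After the pass, the interior pixels on either side of the hard 0/1 edge move toward intermediate values, which is the visual "softening" FXAA performs in a single full-screen pass.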
January 10, 2012 11:33:04 PM

dharmenparikh said:
http://www.hardocp.com/article/2011/11/02/battlefield_3...
Oh if you can get GTX 580 for $200 cheaper then definitely get that over the 7970/7950. [...] It is very easy to see that FXAA is more effective than MSAA in this particular game title.



Thank you for the in-depth input, I appreciate it. I certainly agree that FXAA is very effective at what it does, and I personally feel it makes the game look more realistic by adding that blur and smoothing out jagged edges (let's face it, no one in real life sees objects with edges as clearly defined as 4x AA produces).

But I actually don't play with FXAA on, because for me it's a disadvantage: the blurriness makes it difficult to see clearly, especially far in the distance (I'm competitive when it comes to FPS games). That's why I use MSAA instead, which really makes things better (for me) by keeping everything crisp. On many occasions when I previously played with FXAA on, I'd have to squint at something for a while before realizing it was actually a prone enemy's head and not just some terrain object.


In the end I decided to settle on the overpriced MSI 7970, as it satisfies my needs: it has a 3 GB VRAM buffer to future-proof against games that eat up more VRAM, I can use Eyefinity, and I entirely avoid the microstuttering issues and driver delays of CF.

I've heard a lot of people say 1 GB or so is sufficient for gaming at 1080p, and while I was able to play with some Ultra settings such as texture quality (which uses about 500 MB of VRAM on its own), it produced noticeable stuttering, forcing me to dial settings down despite having around 60 FPS. The 7970 has eliminated this entirely and provides a very smooth experience. I'm currently running Ultra settings with 4x MSAA, maintaining a 65 FPS average, and I am very pleased.

For the benefit of people who'd like insight on overclocking/temperature/fan noise of the 7970: I'm currently running it at 1100 MHz (I haven't had time to test it at length), but it's stable for the time being. Temperatures are fine, I reached a max of 60 degrees C at 99% load, but the fan noise is like a vacuum cleaner. I've always gamed with earphones, though, so this isn't a problem for me whatsoever.
January 11, 2012 12:08:10 PM

Good choice. Enjoy
January 11, 2012 3:02:21 PM

I've been out of the hardcore PC realm, so I can't really comment on AA or anything technical like that, but I do have 2 Asus HD 6950 1GB CUII cards crossfired. I'm playing BF3 on a 23-inch monitor at 1920x1080 (max settings, Ultra everything). I haven't done any rigorous tests in-game, but from what I can see, it's quite smooth. I'm somewhere in the neighborhood of 100+ FPS average (according to render.drawfps, though I'm not sure how accurate that is) and the GPU temps hold steady for the most part in the 55-65 degrees C range.
January 11, 2012 8:06:48 PM

Hw4ng3r said:
I've been out of the hardcore PC realm so I can't really comment on AA or anything technical like that, but I do have 2 Asus HD 6950 1GB CUII cards xfired. [...] I'm somewhere in the neighborhood of 100+ fps avg (according to render.drawfps but not sure how accurate that is?) and the GPU temps hold steady for the most part somewhere in the 55-65 degrees C range.


That's a fantastic average FPS. Part of me knows better value would have come from 6970 CrossFire, but the chicken inside of me regarding microstutter and driver issues led me to purchase the 7970. I don't regret it though :lol:

On another note, I did some calculations and observed that with 4x MSAA on, the FPS drop was only about 30%. This is better than the 68xx and 69xx series, which see around a 40%+ drop.
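For clarity, that drop figure is just the percentage lost relative to the no-MSAA baseline. A quick sketch with illustrative numbers (not my actual benchmark results):

```python
def fps_drop_percent(fps_off, fps_on):
    """Percentage of FPS lost when enabling a setting, vs. the baseline."""
    return 100.0 * (fps_off - fps_on) / fps_off

# Hypothetical example: ~93 FPS without MSAA, ~65 FPS with 4x MSAA.
print(round(fps_drop_percent(93.0, 65.0), 1))  # 30.1
```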

Definitely picking up another 7970 down the line to crossfire.
January 11, 2012 10:36:54 PM

6950 CrossFire can definitely get you 60+ FPS... but I doubt you're getting a 100 FPS average on all Ultra; it's not possible.

I have 6970 CrossFire and a 2500K at 4.5 GHz, and I don't get more than 60-80 FPS on average. Even the websites that do reviews don't get 100. And no, I don't have anything against you; I just didn't want other folks to think 6950 CFX gets 100 FPS in BF3 Ultra at 1080p.