
Add Another GTX580 or Crossfire 7950s?

Last response: in Graphics & Displays
March 6, 2012 8:46:21 PM

Hello all,

I am in an interesting predicament here. I'm not sure which direction I'll go, but here is what I have in mind. Let me know your thoughts and/or ideas.

My current setup is:

Gigabyte P55-UD3 motherboard (8x/8x CrossFire only, no SLI)
Intel Core i7-860 @ 3.80 GHz (21 x 181 MHz, 1.325 V, Zalman CNPS10X Quiet, 24+ hours Prime95 stable)
8 GB OCZ DDR3 PC3-12800 @ 1448 MHz (8-8-8-22, 1T, 1.64 V)
PNY GTX 580 1.5 GB, overclocked 17%

I am toying with the idea of an upgrade. I could either purchase two 7950s and keep everything else the same (it would be pushing my PSU to the max, but let's leave that aside for now), or get a new CPU, mobo, PSU, CPU cooler, and another GTX 580 1.5 GB. The pricing is comparable.

2x7950: http://www.newegg.com/Product/Product.aspx?Item=N82E168... = $928 (after shipping)

or

CPU (2500k): http://www.microcenter.com/single_product_results.phtml... $180

Mobo (z68): http://www.newegg.com/Product/Product.aspx?Item=N82E168... = $130

PSU (PCP&C 950w): http://www.newegg.com/Product/Product.aspx?Item=N82E168... = $150

CPU Cooler (CM Hyper 212): http://www.newegg.com/Product/Product.aspx?Item=N82E168... = $27

PNY GTX580 1.5gb: http://www.newegg.com/Product/Product.aspx?Item=N82E168... = $475

Total = $993 (after shipping and local tax on CPU)

I run a dual-monitor setup but only play on one of them, at 1920x1080. BF3 does not quite run to my liking. I want to run the game at full res, 4xMSAA, and 80 FoV with a MINIMUM of 60 FPS. I cannot accomplish this even with the lowest settings in every other category, including setting high-performance options in my drivers. So here I am. Thoughts? Thanks in advance!

Best,

3Ball
March 6, 2012 9:09:12 PM

I can't speak for the 7950s, but two 580s will definitely max you out. I run two 580s @ 2560x1440 on full ultra (see below for the rest of the system) and never see it go below 60 fps; it usually runs around 80-120 fps in BF3.

Generally I see fewer issues overall with SLI than with CrossFire as well.

You will need new RAM for the 2500k too, since it uses 1.5 V RAM, but that's not a big deal; RAM is dirt cheap.

I do not suggest running two thirsty cards in demanding games like BF3 on a PSU that is maxed out. Something is going to give, and it will probably be the PSU, taking half the rest of your system with it when it fails. I'd also suggest a better-brand PSU, something from Corsair, Enermax, or another reputable brand; they have quality offerings at similar prices. Don't skimp on the PSU; it matters a lot more when you are drawing that kind of power. 900 W+ should be fine though. My system pulls around 750-800 W max with BF3 running. That PSU looks like it might be well made, but I am not familiar with them.

Hope that helps.
March 6, 2012 9:26:18 PM

I think BF3 has an issue with 4xMSAA.

But anyway,

IMO it would be a huge waste to get CF 7950s, and it would also be a huge waste to buy what is basically a new system. Just get a second 580 and, if necessary, a PSU. That is all.

EDIT: Ah nevermind I see the problem, no SLI on mobo. Well, that is teh sux...

I'd have to recommend the 7870 though. Seriously.

http://cdn5.tweaktown.com/content/4/5/4597_27_amd_radeon_hd_7870_2gb_reference_video_cards_in_crossfire.png

And they overclock like a mofo:

http://techreport.com/r.x/pitcairn/bf3-fps-oc.png
March 6, 2012 9:48:11 PM

wait for the GTX 670 Ti or the GTX 680, nvidia runs smoother than amd.
March 6, 2012 9:49:30 PM

tinnitus said:
wait for the GTX 670 Ti or the GTX 680, nvidia runs smoother than amd.


Bring facts if you're going to make a claim like that. I have sources to prove you false, but not much point when all you have is your opinion.
March 6, 2012 9:51:53 PM

wolfram23 said:
I think BF3 has an issue with 4xMSAA.

But anyway,

IMO it would be a huge waste to get CF 7950s, and it would also be a huge waste to buy what is basically a new system. Just get a second 580 and, if necessary, a PSU. That is all.

EDIT: Ah nevermind I see the problem, no SLI on mobo. Well, that is teh sux...

I'd have to recommend the 7870 though. Seriously.

http://cdn5.tweaktown.com/content/4/5/4597_27_amd_radeon_hd_7870_2gb_reference_video_cards_in_crossfire.png

And they overclock like a mofo:

http://techreport.com/r.x/pitcairn/bf3-fps-oc.png


Yeah, I have considered the 7970, and you are absolutely right about there being an issue with 4xMSAA in BF3. The issue is that BF3 (like StarCraft 2 and many other games these days) uses a deferred lighting engine, which causes HUGE problems with traditional AA methods. The reason BF3 stops at 4x is that 4x is the highest you can go on these engines before it starts visibly affecting objects on the screen in other odd ways. It is something I absolutely cannot stand, as both the Frostbite 1 and 2 engines suffer horribly from aliasing. At least Frostbite 1 could scale to higher AA settings.

Full res followed by 8xAA is how I prioritize all of my games' settings when I am optimizing them, so this is very disappointing for me in BF3, as I am not a fan at all of the blurring AA methods (FXAA and MLAA). Oh well, I guess this is just the road we are headed down. Going to have to adjust to it, I suppose. What I may end up doing is just building an Ivy Bridge system with 2x680s sometime later this year. That would probably be the wisest decision, though I'm not in the business of making the "wise" decision when it comes to my gaming needs ;)

Thanks all!

Best,

3Ball
March 6, 2012 11:27:31 PM

wolfram23 said:
Bring facts if you're going to make a claim like that. I have sources to prove you false, but not much point when all you have is your opinion.


It's not my opinion; it's just like that. Since 2008 I've been testing various cards: 4670, 5670, 5750, GTX 460, GTX 560 Ti, and now I'm running a 6850 CF. Nvidia runs smoother than AMD, but they are too high on price.
March 6, 2012 11:54:00 PM

The HD 7970/50 or HD 7870 are useless in your situation; I would consider upgrading your CPU and mobo, then adding a GTX 580.
March 6, 2012 11:58:30 PM

ilysaml said:
The HD 7970/50 or HD 7870 are useless in your situation; I would consider upgrading your CPU and mobo, then adding a GTX 580.


Yeah, I agree. A new motherboard that supports SLI and another 580 will be the cheapest route. Correct me if I'm wrong, but don't they make SLI boards for that CPU? A full jump to SB would just be a waste.

@tinnitus
You are either trying to start a flame war, trolling, or have a fanboy-based delusion that they are somehow smoother in games.
March 7, 2012 12:00:44 AM

i'd definitely go for 2nd 580

the i7 860 has a lot of life left in it

edit: facepalm.. i managed to miss the fact that your board doesn't support sli

guess 2500k route it is then
March 7, 2012 3:59:22 AM

omega21xx said:
Yeah, I agree. A new motherboard that supports SLI and another 580 will be the cheapest route. Correct me if I'm wrong, but don't they make SLI boards for that CPU? A full jump to SB would just be a waste.

@tinnitus
You are either trying to start a flame war, trolling, or have a fanboy-based delusion that they are somehow smoother in games.


i'm not trying to start nothing, if you try both brands you'll see that nvidia is smoother than amd, nvidia shows more like console speeds, amd looks more like frame-by-frame. you can see it on youtube videos (recorded from cameras). currently i'm using amd cause it's cheap.
March 7, 2012 4:14:40 AM

tinnitus said:
i'm not trying to start nothing, if you try both brands you'll see that nvidia is smoother than amd, nvidia shows more like console speeds, amd looks more like frame-by-frame. you can see it on youtube videos (recorded from cameras). currently i'm using amd cause it's cheap.

"console speeds"
Really?! You realize PS3 and 360 games barely hit their 60fps goal, at lower settings than PC graphics cards run, right?
Regardless, I HAVE Nvidia cards; I grew up with them actually, until I got an HD 2400 (low end, yes).
There is NO, seriously NO difference between them as far as perceived smoothness. As long as the cards you use from either brand run a minimum of 30-60fps in the game or benchmark, neither is better than the other.

I'm sure everyone would love to see some sort of proof of your claim, but you have to be pretty uninformed or inexperienced to think that way about them. If you somehow have issues with two comparable cards, then obviously it's user error on your part and not brand superiority or inferiority.
March 7, 2012 5:18:21 AM

omega21xx said:
"console speeds"
Really?! You realize PS3 and 360 games barely hit their 60fps goal, at lower settings than PC graphics cards run, right?
Regardless, I HAVE Nvidia cards; I grew up with them actually, until I got an HD 2400 (low end, yes).
There is NO, seriously NO difference between them as far as perceived smoothness. As long as the cards you use from either brand run a minimum of 30-60fps in the game or benchmark, neither is better than the other.

I'm sure everyone would love to see some sort of proof of your claim, but you have to be pretty uninformed or inexperienced to think that way about them. If you somehow have issues with two comparable cards, then obviously it's user error on your part and not brand superiority or inferiority.


i didn't mean fps, i mean looks like.

if you see this:

http://www.youtube.com/watch?v=ApFdUQGyq0g

right is more smoother.

but if you don't believe it, i don't gonna waste my time. my first 3d card experience was from a diamond monster 3d on 97. believe what you want.
March 7, 2012 5:28:11 AM

tinnitus said:
i didn't mean fps, i mean looks like.

if you see this:

http://www.youtube.com/watch?v=ApFdUQGyq0g

right is more smoother.

but if you don't believe it, i don't gonna waste my time. my first 3d card experience was from a diamond monster 3d on 97. believe what you want.


The 560 Ti competes with the 6950, not the 6870, so that comparison is flawed from the start.
March 7, 2012 2:11:10 PM

lol, this sure took an interesting turn. Nonetheless, I found a solution to get the most out of BF3 that my system can handle.

I turned off HT and bumped my CPU up to 4.2 GHz, which is the max I can get. My GPU is also already at its max, so I maxed out BF3 and measured my frames on the expansion maps (as they are more taxing) and found I ranged between about 38 and 85 FPS.

The fluctuations made the game quite annoying, so I put an FPS limiter on the game at 40 FPS, and now it runs extremely smoothly: it holds 40 FPS about 90% of the time, and when it does drop, I don't see or feel it like I did with the massive fluctuations between 38 and 85. There is zero choppiness.

As for the Nvidia vs ATI thing: I used to absolutely love ATI, and throughout my life have had maybe 2 Nvidia cards against roughly 10 ATI cards. I used to scoff at anyone who said ATI had driver issues, until I got my 5870. Once I got it, I convinced all of my friends to do the same, and some of them got CrossFire setups.

Since then, every single one of them has gone over to the Nvidia camp for the same reason: horrid driver issues. I mean issues that were unheard of, as in I couldn't launch 10 or 15 of my games. I often had games with graphical errors or sporadic drops in FPS. After Nvidia released the driver update for the GTX 4xx series that improved them by nearly 50% (which I saw first-hand on my friend's 480), I was sold.

I literally gave up my 5870 for a GTX 470, that's right, a "slower" card by benchmark standards. While my maximum frames were undoubtedly lower, my minimum frames had to have been at least 20% higher. I have had zero issues with any games running ever since. I then of course went from the 470 to the 580. So that's my story.

I would love to go back to the ATI camp, but it takes a lot. I am always open to getting new hardware from a different company if it benefits me the most, but one thing I NEVER worry about is power and heat. The GTX 4xx series (Fermi) gets a bad rap for this, but even with those "issues" it provided a much better experience for myself and 4 friends, including one guy you would have called an AMD/ATI fanboy; after he lost the ability to play his favorite game (couldn't even install it actually; the game was Far Cry) because of the ATI drivers, he is now running my old 470 and looking to get a 680 when they come out.

I really hope to go over to the ATI camp again someday, but I am honestly a bit afraid of it. lol, my next setup will probably be an Ivy Bridge + 670 or 680 SLI setup.

Best,

3Ball
March 7, 2012 2:31:20 PM

tinnitus said:
It's not my opinion; it's just like that.


:lol: 

Yeah... no.

As for Nvidia being "smoother," how do you qualify that? Because techreport has a nice method: they look at how long it takes to render each frame; the longer a frame takes, the "less smooth" it might seem. I say might, because below a certain threshold you wouldn't even notice a difference. Let's look at some examples of how much smoother Nvidia is...






And there's more. But basically, at worst they are even. As you can probably see, most of the "smoothness" comes down to how fast the card is and what sort of FPS it can maintain. Saying one brand or the other is smoother is basically nonsense.
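The frame-time method described above can be sketched in a few lines. The frame-time samples here are made up for illustration, not real benchmark data, and the 99th-percentile and over-budget metrics are my loose restatement of techreport's approach:

```python
# Sketch of frame-time analysis: instead of average FPS,
# look at how long each individual frame took. Sample data is hypothetical.
frame_times_ms = [14, 15, 16, 15, 45, 14, 16, 15, 50, 15, 14, 16]

# Average FPS hides the spikes entirely
avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

# 99th-percentile frame time: the slow frames that make a game feel "less smooth"
ordered = sorted(frame_times_ms)
p99_ms = ordered[min(len(ordered) - 1, int(0.99 * len(ordered)))]

# Time spent over a 16.7 ms (60 Hz) budget: how much of the run felt choppy
over_budget_ms = sum(t - 16.7 for t in frame_times_ms if t > 16.7)
```

Two runs can report the same average FPS while one has far worse spikes; the percentile and over-budget numbers expose that difference.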

Please note I'm talking about single card configs, we could get into CF vs SLI if you want though.

Quote:
console speeds are slow, as in capped @ 30fps, except MW2 which runs at a full 60fps. you really have no idea; in most new games consoles are a bit choppy, and the only saving grace is that they're bound to a ridiculously slow gamepad to match the ridiculously slow frame rates. Nvidia and AMD are both as smooth as their frame rates, you biased fanboy.

There are approximately 171 console games that run at 60 fps: http://www.giantbomb.com/60-fps-on-consoles/92-3223/gam...

However, they almost never run at 1080p, and in fact many run below 720p: http://forum.beyond3d.com/showthread.php?t=46241
March 7, 2012 2:44:17 PM

That's really interesting, Wolf. I've never seen that sort of analysis, but don't you think that backing up an argument with proof is kinda "old school"? ;)
March 7, 2012 2:53:17 PM

3Ball said:
That's really interesting, Wolf. I've never seen that sort of analysis, but don't you think that backing up an argument with proof is kinda "old school"? ;)


:whistle: 

Sadly, yes. Too many people bring anecdotes to the table like they're the be-all-end-all of debate. Might as well bring in "my dad says blah blah blah" like when we were in elementary school.
March 7, 2012 5:24:20 PM

This was an overlooked point made by HardOCP in their review of the 7950:

"There is always an aspect of gameplay performance that is hard to relate to gamers through a graph, or even words. We are talking about physically "feeling" a game as you play it. What people perceive as playable performance is not always attached to framerate. This seems to be a fact of CrossFireX that we've encountered in our gameplay testing. At times, the framerate being displayed on the screen doesn't match what we are "feeling" as we play the game.

For example, if 40 or 50 FPS is indicated, even though that should be playable since its above 30 FPS it won't necessarily feel playable. We have to shoot for higher FPS. We experience some kind of lag or choppiness in gameplay with CrossFireX even though the framerate indicates it should be playable. This means you cannot always rely on framerate alone to determine playable performance.

This is a difference that separates CrossFireX from SLI. With SLI we do not experience this phenomenon as much. With SLI, framerates seem smoother at lower framerates, than these do with CrossFireX. For example, we often find we need to aim for higher framerates in order for CrossFireX to feel like it’s playable. Whereas, with SLI we often find we can settle with lower framerates, because it feels playable at those framerates. Trust us, we do not go by framerates when evaluating how these cards actually game. The framerates a lie.

Some of this can be seen in the graphs, when we talk about consistency. We've shown it in this evaluation, look back at the Deus Ex or Skyrim graphs and you will see SLI producing a more consistent framerate. These are just facts between CrossFireX and SLI, but it makes it so that SLI feels smoother and better to us, than CrossFireX does often. This was the case a lot of the time testing Radeon HD 7950 CrossFireX versus GeForce GTX 580 SLI. We just felt GeForce GTX 580 SLI offered a smoother experience, in pretty much every game, even the ones where Radeon HD 7950 CrossFireX allowed higher in-game settings."
http://www.hardocp.com/article/2012/01/30/amd_radeon_hd...
March 7, 2012 5:31:03 PM

Yeah, CF and SLI are quite a bit different from single-card setups. Generally it seems like CF has a little more of a stutter issue, but it's not non-existent in SLI. More graphs!











March 7, 2012 9:26:53 PM

17seconds said:
This was an overlooked point made by HardOCP in their review of the 7950: [full HardOCP quote snipped; see above]
http://www.hardocp.com/article/2012/01/30/amd_radeon_hd...


Very nice find.
March 8, 2012 1:17:54 AM

17seconds said:
This was an overlooked point made by HardOCP in their review of the 7950: [full HardOCP quote snipped; see above]
http://www.hardocp.com/article/2012/01/30/amd_radeon_hd...


A good friend of mine has 7950s in CrossFire (he has always preferred ATI), and he noticed this issue as well, although not until he was at my house and played BF3 on my system with SLI 580s. The rest of our systems are mostly the same. He noticed it nearly right away and said it just felt smoother on my rig than on his. He figured mine was running at higher frames, so we tested it and found the frames were pretty much the same on both systems, but it still felt smoother in general on mine. He also noticed that mine didn't get that very slight "stutter" when buildings and other things blew up around him like his did.

He still loves ATI and isn't going to trade his cards in even though he can afford it, but he is somewhat mad that they aren't performing as well in the real world as they do on paper. It's not a "game breaker" and a lot of people will never even notice it; he's pretty sensitive to frame rate and even he didn't realize it until he compared it to something else. All things being equal, sometimes it's the little things like that that matter to me though.
March 8, 2012 2:12:07 PM

Quote:
Enable Vsync and stop being a fan boy.


:pfff: 

Vsync makes any sort of stutter worse.
March 8, 2012 2:32:10 PM

You tell me which will be smoother:


March 8, 2012 2:57:18 PM

Quote:
Vsync offers the best image quality and optimal performance for the monitor's max capability. I don't like frame rates jumping and spiking up and down all over the place; therefore, like many gamers, I prefer 60fps Vsync with no drops or increases in framerate. I would personally turn down settings in order to consistently maintain a 60fps minimum framerate with Vsync enabled; however, with 7850 CF I will not have to turn down settings.


I personally hate vsync and do not use it in about 99% of my games; it takes a very special case for vsync to be useful to me. The input lag imposed by vsync (no, triple buffering does not do away with it completely) is just not worth dealing with for me. I much prefer framerate limiters. Very similar to you, I would rather have a constant framerate than a variable one in most cases. For example, in BF3, which I am using a limiter for now since I cannot get a 60 FPS minimum, I set the FPS limiter to my minimum of 40 and have a very steady framerate. Framerate limiters impose this restriction with zero effect on input lag because they do not stop frames from being buffered; they only restrict the visual presentation. This is the best of both worlds imo. Unfortunately there isn't always a viable solution in this regard for every game.

Best,

3Ball
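The limiter behavior described above can be sketched as a simple loop that delays only presentation, never input sampling. This is an illustrative toy, not BF3's actual limiter; `render_frame`, the 40 fps cap, and the run length are all placeholder choices:

```python
import time

def run_capped(render_frame, cap_fps=40, duration_s=0.2):
    """Sketch of a framerate limiter: run game logic and input every
    iteration, but sleep so frames are presented no faster than cap_fps."""
    frame_budget = 1.0 / cap_fps
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        t0 = time.perf_counter()
        render_frame()                      # input is still sampled every frame
        spare = frame_budget - (time.perf_counter() - t0)
        if spare > 0:                       # only the presentation is delayed
            time.sleep(spare)
        frames += 1
    return frames / duration_s              # effective FPS

fps = run_capped(lambda: None)
```

Because the loop keeps rendering and sampling at full speed and merely waits before presenting, it caps the visible rate without queueing frames the way vsync does.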
March 8, 2012 3:16:43 PM

Quote:
I don't understand how some people claim mouse lag with Vsync on; I have never had that problem in any game.


Do you play on a 60 Hz monitor? If so, then either you are not very sensitive to the lag that is there, or you aren't playing many games that require quick reflexes. In some games it is very slight; in others it is worse. But that is the nature of it, and whether it happens is not really a matter of opinion: triple buffering exists for the sole purpose of reducing input lag, which it does, but it doesn't eliminate it entirely. That is why, in games that have a vsync option, there is often a greyed-out "reduce input lag" or "enable triple buffering" checkbox that only becomes available once you turn vsync on. One way to combat this is a higher refresh rate, like the 85 Hz or higher of the old CRT monitors, but those refresh rates are hard to come by these days.

You also need to realize whom you are talking to. I am extremely sensitive to input lag. I do not buy a TV or monitor without first either having the input lag tested or seeing a review where it was measured against a traditional CRT. No LCD is without it entirely, but on many it can be very low. For example, you can see in the review below that my Dell U2312HM monitor has exceptionally low input lag (click the input lag link near the top to be taken to the correct section).

http://www.tftcentral.co.uk/reviews/dell_u2312hm.htm

Also, for comparison to my sensitivity: the ViewSonic VX2739wm listed there averages 9.4 ms. I sold that monitor because the input lag drove me insane.

Best,

3Ball
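To put rough numbers on the refresh-rate point above (idealized arithmetic; real input lag has more contributors than the refresh interval alone):

```python
# With vsync, a finished frame waits for the next refresh tick, so each
# buffered frame can add up to one refresh interval of delay.
def refresh_interval_ms(hz):
    return 1000.0 / hz

wait_60hz = refresh_interval_ms(60)    # ~16.7 ms per buffered frame at 60 Hz
wait_85hz = refresh_interval_ms(85)    # ~11.8 ms on an 85 Hz CRT
wait_120hz = refresh_interval_ms(120)  # ~8.3 ms on a 120 Hz panel
```

This is why the old 85 Hz CRTs (and today's 120 Hz panels) make vsync's added delay noticeably smaller.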
March 8, 2012 3:49:19 PM

Quote:
I notice input lag as well; that's why I bought a Samsung SyncMaster 226BW back in '07 for $500. It is a 22" 1680x1050 monitor with a 2ms response time, and I never have any mouse lag, even with my Logitech G9X set to 5700 DPI.



Well, you are very fortunate then with your vsync experience, as more people will run into my experience than yours. It's just how it works: vsync alone is like feeding 20 people through a single door; with triple buffering it's like feeding them through 3 doors; and without either it's as if there is just a gaping hole in the wall, no doors required. Also, as a note: the ms response time has absolutely nothing to do with input lag. That is a measure of ghosting.

Best,

3Ball
March 8, 2012 4:52:42 PM

Vsync defeats the purpose of having a high-performance setup. There is also the very real possibility of cutting your framerate in half, down to 30 fps, due to the double-buffering effect (when the video card has to wait for the monitor's next refresh before presenting the next frame). And yes, it does introduce the possibility of mouse lag.
http://www.tweakguides.com/Graphics_9.html

By the way, forced triple buffering only works in OpenGL games and comes with its own host of potential issues.
http://www.tweakguides.com/Graphics_10.html

For me, Vsync only gets turned on in response to a specific problem, and screen tearing is almost never enough of a problem to prompt me to compromise on performance. Even though the screen only shows a max of 60 fps, the controls certainly feel smoother at the highest possible framerates, well over 60 if possible. If anyone is getting unbearable screen tearing due to high framerates, the first solution should be to enable higher settings, not turn on Vsync.
March 8, 2012 4:58:51 PM

Quote:
All Vsync does is match the framerate to the monitor's native refresh rate; there's nothing complex about it or that would degrade performance.


V-Sync is significantly more complex than you are making it seem. While it isn't the most complex thing in the world, it does much more than just match the framerate to the monitor's native refresh rate; what matters is how it does it.

Basically, the problem is that the game engine produces more frames than the monitor can display in a second. For example, your PC can run the game at 100 fps while your monitor can display 60, so while the monitor refreshes its vertical lines, some lines will show an older frame and some a newer one, and the result is a torn image.

If you turn vsync on, the game will still render 100 frames, but they go into a buffer, and the buffer sends the right amount (in this case 60) to the monitor so it displays every frame separately, without tearing. The mouse "lag" you get is because the frames have to wait for the monitor, so you get a bit of delay.

There is another option that you can use via 3rd-party programs or in the video driver, called OpenGL or Direct3D triple buffering. It means the video card uses three buffers instead of one, so the queue of waiting frames is much shorter; this reduces the input lag, nearly to zero in ideal situations, but still not entirely.

Also, specifically on degrading performance: vsync uses significantly more memory than the game would use with it off. If using up more memory doesn't degrade performance (this is quantifiable in benchmarks, assuming you are filling your entire frame buffer), then I don't know what does.

All of this is why a framerate limiter and vsync are different. When using a framerate limiter, even with frames set to 30 (or 40 in my case for BF3), you can (and I still do) see tearing on the screen from time to time, even though my FPS is well below my monitor's 60 Hz refresh rate. Console games locked at 30 FPS can have screen tearing for this exact same reason.

more info: http://www.tweakguides.com/Graphics_9.html

Best,

3Ball
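The double-buffering drop described in this exchange can be modeled in one line: with idealized double-buffered vsync, a frame is only shown on a refresh tick, so the effective rate is the refresh rate divided by the whole number of intervals a frame takes to render. This sketch ignores triple buffering and render-ahead queues:

```python
import math

def vsync_fps(render_fps, refresh_hz=60):
    """Idealized double-buffered vsync: the effective rate quantizes to
    refresh_hz / ceil(render_time / refresh_interval)."""
    interval = 1.0 / refresh_hz
    render_time = 1.0 / render_fps
    return refresh_hz / math.ceil(render_time / interval)

fast = vsync_fps(100)   # rendering above refresh: capped at the refresh rate
slow = vsync_fps(55)    # rendering just under refresh: halved
```

A card averaging 55 fps, barely missing the 16.7 ms budget, gets locked to 30 on a 60 Hz panel, which is the "cutting your framerate in half" effect mentioned above.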
March 8, 2012 5:07:07 PM

17seconds said:
For me, Vsync only gets turned on in response to a specific problem, and screen tearing is almost never enough of a problem to prompt me to compromise on performance. Even though the screen only shows a max of 60 fps, the controls certainly feel smoother at the highest possible framerates, well over 60 if possible. If anyone is getting unbearable screen tearing due to high framerates, the first solution should be to enable higher settings, not turn on Vsync.


This.

Vsync is *** and only to be used if I cannot otherwise control horrid screen tearing, even by forcing 8xSSAA and disabling one GPU.

Vsync also almost always flickers from 60 down to around 57-59 because of that buffering/waiting effect. If you graph your FPS like I do (100% of the time, on my 2nd monitor), it's very apparent that the FPS flickers at a constant interval. Keeping the FPS above 60 = the smoothest visual experience, unless you get screen tearing.
March 8, 2012 5:11:34 PM

17seconds said:
Vsync defeats the purpose of having a high performance setup. There is also the very real possibility of cutting your framerates in half down to 30 fps due to the double buffering effect (when the video card has to wait for a response from the monitor before rendering the next frame). And yes, it does introduce the possibility of mouse lag.
http://www.tweakguides.com/Graphics_9.html

By the way, triple buffering only works on OpenGL games and comes with its own host of potential issues.
http://www.tweakguides.com/Graphics_10.html

For me, Vsync only gets turned on in response to a specific problem. Screen tearing is almost never enough of a problem to prompt me to compromise on performance. For, while the screen is only showing a max of 60 fps, the controls certainly still feel smoother with the highest possible framerates, well over 60 if possible. If anyone is getting unbearable screen tearing due to high framerates, the first solution should be to enable higher settings, not turn on Vsync.


wolfram23 said:
This.

Vsync is *** and only to be used if I can't otherwise control horrid screen tearing, even by forcing 8xSSAA and disabling 1 GPU.

Vsync also almost always flickers from 60 down to around 57-59 because of that buffering/waiting effect. If you graph your FPS like I do (100% of the time on a 2nd monitor) it's very apparent that at a constant interval the FPS flickers. Keeping the FPS above 60 = smoothest visual experience, unless you get screen tearing.


+1
March 8, 2012 5:42:13 PM

I have two 6970s and blow any game out of the water at 1080p. I get over 80fps in everything.

I can imagine two 79xx series cards would kick some tail for a while to come. So stick with AMD.
March 8, 2012 5:46:07 PM

PCgamer81 said:
I have two 6970s and blow any game out of the water at 1080p. I get over 80fps in everything.

I can imagine two 79xx series cards would kick some tail for a while to come. So stick with AMD.

Do 2 7950s have microstuttering or did the 7xxx series do away with that?
March 8, 2012 5:50:32 PM

Quote:
The screen cannot display any faster than its native refresh rate; anything more is a placebo effect as far as input speed from the mouse goes. If you want faster input response, get a 75hz or 120hz monitor; then there will be less delay from mouse input to what you see on screen. It is physically impossible to push a monitor past its native refresh rate. Vsync = quality, like 1080i vs 1080p, and I did not pay for a top-end monitor and GPU to display an ugly image with up-and-down choppy framerates.


This entire discussion has been based on the performance impact of vsync, and you seem to just be ignoring everything that we have put out there explaining the specific technical reasons why it has negative effects. It's great that you don't have a single issue with vsync and that it provides the experience you want, but sitting here telling us that it does not cause the effects that it clearly does cause, and that we have demonstrated, is just being childish. I did not pay for a top-end monitor and top-end GPU in order to feel like I am floating on a raft when trying to aim in my games. We have different desires, that is all; nothing wrong with it.

Also, some games/game engines are designed to run at higher framerates; the physics and netcode rates are often affected by these things. In the case of Quake Live/Quake 3, the target framerate is 125. You are at a disadvantage if you cannot maintain a constant, steady 125 FPS. So there are benefits beyond perception that higher framerates yield.
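To put rough numbers on why a target like 125 fps matters beyond perception, here is an illustrative toy model (my own sketch, not actual engine code) of how integer-millisecond frame timing favors certain framerates:

```python
# Illustrative toy model (NOT actual Quake engine source): some older engines
# step the simulation once per rendered frame and round the frame time to
# whole milliseconds. Framerates whose frame time divides 1000 ms evenly
# (125 fps -> exactly 8 ms) keep simulated time in lockstep with real time;
# other framerates drift slightly every second.
def frame_time_ms(fps):
    return round(1000 / fps)

def simulated_ms_per_real_second(fps):
    return fps * frame_time_ms(fps)

for fps in (60, 90, 125):
    print(f"{fps:>3} fps: {frame_time_ms(fps)} ms/frame, "
          f"{simulated_ms_per_real_second(fps)} ms simulated per real second")
```

It's only a toy model, but it shows why 125 divides a second cleanly (8 ms frames) while 60 does not, which is part of why "magic" framerate caps became standard lore in Quake-engine games.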

Best,

3Ball
March 8, 2012 5:52:40 PM

Quote:
Any dual-card config can have microstuttering; in fact all do, and that includes Nvidia's, but that does not mean it cannot be mitigated.


This is the issue that has kept me from going dual GPUs thus far. I am really hoping that they can find a way to get rid of this in the future. They have certainly made strides since the original inception, so I hope that they can continue making progress.

Best,

3Ball
March 8, 2012 5:55:00 PM

3Ball said:
+1

3Ball, I just noticed your Unreal avatar. Do you play UT3 with PhysX? I played it for a long time without PhysX, then when I upgraded my machine, it was fun to see what I had been missing. Aircraft parts raining from the sky, flak bouncing off the walls, and the most surprising part for me, body parts splattering everywhere! Fun stuff.
March 8, 2012 6:06:25 PM

Quote:
See, what you are not realizing is that unless you have a 120hz monitor you will never see or feel the performance and smoothness benefits that 125 FPS will enable, and that is with Vsync on or off, because even when Fraps says the framerate is, let's say, 90 fps with Vsync off, the monitor is still bound to a 60 fps max.


:heink: 

No, what YOU are not realizing is that it DOES have an effect. What comes out of your display is one thing; what happens in your computer is another. Just because you don't see the extra frames doesn't mean things aren't getting done behind the scenes.
March 8, 2012 6:18:59 PM

17seconds said:
3Ball, I just noticed your Unreal avatar. Do you play UT3 with PhysX? I played it for a long time without PhysX, then when I upgraded my machine, it was fun to see what I had been missing. Aircraft parts raining from the sky, flak bouncing off the walls, and the most surprising part for me, body parts splattering everywhere! Fun stuff.


Yes, I have played it with PhysX, and it is great! I just wish the game itself wasn't so dead :(  Playing Tribes: Ascend, Quake Live, BF3, CSS, Dota 2 and SC2 mostly atm.

wolfram23 said:
:heink: 

No, what YOU are not realizing is that it DOES have an effect. What comes out of your display is one thing; what happens in your computer is another. Just because you don't see the extra frames doesn't mean things aren't getting done behind the scenes.


Just leave it alone man, he isn't going to open his mind up to it, and quite frankly he just doesn't understand that the FPS in a game is affected by, and affects, many more aspects of a game than just the picture on the screen.

Best,

3Ball
March 8, 2012 6:19:21 PM

I'll one up you.

Your eyes are the bottleneck of any rig.
March 8, 2012 6:24:50 PM

Quote:
PS: about the dual-card microstuttering thing, well, I have run two dual-card setups, back in '09 and in '11, and really it is not a big issue; it is game specific, and all the games that I owned and played at the time were just fine, if not immediately then after a patch or hotfix. You would never have known it was running in CF mode other than the really high framerates. I loved Far Cry 2 @ 120fps, LOL, and even though I was running a 60hz monitor the ample performance overhead was real nice.


You see, I recently played on a buddy's Crossfire 5870 setup and I felt quite a bit of microstuttering in Quake Live, even though he was running high fps. I think I need to try some other games on his rig, but it is still worrisome to me because I buy many games day one, meaning that I would likely deal with Crossfire or SLI issues for a while until a fix was made. I have another friend who buys very few games, usually only big-name titles, and not until after a price drop, so he usually gets them after the bugs have been ironed out.

I guess it's just one of those things that you have to adapt to. We will see; I have been toying with the idea, and I am fairly certain that before the end of this year I will be running my first dual-GPU setup.

Best,

3Ball
March 8, 2012 6:42:50 PM

Quote:
I know you will hate it but in order to mitigate the Micro stuttering you have to run Vsync in SLI / CF


This is literally only true for Fallout and Elder Scrolls games because of the stupid engine they use, and even then they still have stutter under many conditions. Not to mention that the stutter happens with single card setups in those games as well.
March 8, 2012 6:47:42 PM

Quote:
Elder Scrolls and Fallout NV have CPU scaling issues from buggy engines.


Yes... which is another issue altogether.
March 8, 2012 7:01:16 PM

Hmm, well, if I am going to have to enable vsync to get rid of microstuttering then I will just have to stay away from dual-card solutions, because I just flat-out cannot play with vsync enabled. For me, the difference between having vsync off and on is like going from a M&K to a controller. No joke. lol

Though if it were only for those two games then it wouldn't be an issue, as I don't play either; but even if I did, vsync wouldn't be a huge problem there, since they are SP games and not competitive ones.

Best,

3Ball
March 8, 2012 7:06:40 PM

Quote:
It's probably that Nvidiot card that's making your rig crappy. If the framerate is locked at 60fps then there is no room for the microstuttering that would have been noticeable otherwise, unless the framerate is rapidly moving between 59 and 60fps.


... I see. I'm sorry, I don't normally argue with mentally handicapped children. My bad. Would you like a lollipop?
March 8, 2012 7:16:08 PM

Quote:
Vsync is not slow; this is not '05. Edit: Vsync is as fast as your monitor's max capability.


You act as if I haven't used vsync before. Over the past 15 years the issue has remained the same with vsync. Every few months I give it another shot, in hopes that it has in some way been improved. Nothing has changed, though. Vsync is a no-go for me man; like I said earlier, it's just different for both of us. I and others provided you references and information on why the issue we see/feel happens. Once again, it is great that it works for you, but if I had to play games with vsync on, I literally would give up the hobby. I am an extremely competitive person; I play games for competitive reasons. Vsync (FOR ME) causes me to perform significantly worse.

Like I said earlier, I am done trying to convince you of this. I am only replying to you right now on a personal level because that is the only way you seem to be able to take it. You like vsync; I do not. That's how it is. I think we can still manage to be friends after this. It's not a huge deal. Just understand that you aren't going to sway me over to the side of vsync without two things: 1) show me hard data (like we showed you) as to why vsync absolutely causes no issues, and 2) detail a plan for applying that data to my rig so that my issues with input lag are completely negated.

Even after you do that, I still won't use vsync in some games, because the game engine (physics and netcode, not visuals) is designed around 125 FPS and specific rates in order to work to maximum potential. I do realize that my monitor is displaying 60 FPS even when I am getting 125 FPS in Fraps; what you need to realize is that those extra 65 FPS are not for my visual benefit, but for the benefit of the engine itself and my computer's ability to communicate with it.

Here is a slight idea of what I am referring to (there is more to it though): http://wiki.modsrepository.com/index.php/Call_of_Duty_:...
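And for what it's worth, here is a back-of-the-envelope sketch (a simplified model I put together, not a measurement of any real driver) of where the vsync input lag comes from:

```python
import math

# Simplified model (NOT a measurement): with double-buffered vsync, a finished
# frame waits in the back buffer until the next vertical refresh before it is
# scanned out, so display latency rounds up to a whole number of refresh
# intervals on top of the render time.
def worst_case_display_latency_ms(render_ms, refresh_hz, vsync):
    refresh_ms = 1000 / refresh_hz
    if vsync:
        # The frame is held until the next vertical blank after rendering ends.
        return math.ceil(render_ms / refresh_ms) * refresh_ms
    # Without vsync the frame is presented immediately (with possible tearing).
    return render_ms

# A 5 ms frame reaches the screen up to ~16.7 ms after input at 60hz with
# vsync, and up to ~8.3 ms at 120hz:
print(worst_case_display_latency_ms(5, 60, True))
print(worst_case_display_latency_ms(5, 120, True))
```

Even in this idealized model a fast 5 ms frame can sit a full refresh interval before reaching the screen when vsync is on, which matches the "floating" feel I describe; real drivers add their own buffering on top of this.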

Best,

3Ball
March 8, 2012 8:57:44 PM

Quote:
How quick you can flick your wrist and how high your mouse's DPI is; no matter how fast your hardware and Internet connection can process it all, it comes down to your monitor, which is where it will all become bottlenecked, or limited for lack of a better word. Does your hardware have to wait on your monitor to process your inputs? Absolutely not, but you have to wait on your monitor before you can make an accurate input, so what I am suggesting is that the monitor is holding back the potential of your hardware; but in reality it doesn't matter, it is all a placebo effect past 60hz. I am really trying hard to understand what you are suggesting with the game engine and how running it at its native framerate makes it faster even though your monitor cannot display it all.



Google is your friend, had you used it you wouldn't look silly and uninformed in public. Almost every post you made on the topic of Vsync was wrong.
March 8, 2012 11:33:01 PM

Quote:
When the next consoles come out is when the cards from the HD 68xx and GTX 4xx series and up will become crappy. The only game pushing hardware today seems to be BF3, which is because it was apparently made from the ground up for the PC and then ported to the consoles.

Maybe you misread my post, I have two 6970s, not two 6870s.

I think what you are saying is that all 68xx/4xx and equal or lesser GPUs will become crappy when the next generation of consoles arrives. I am not sure I agree with that.

You have to keep in mind, it is the PC that further advances console hardware; I hardly think that a generation of consoles would have any effect on PC hardware, but rather to the contrary.

With that said, I cannot say for sure exactly how today's mid-range PCs will stand up to tomorrow's next-gen consoles, but suffice to say that today's high-end PCs such as mine and better will outperform the next generation of consoles. When consoles are released, they are usually equal to or lesser than most mid to high end PCs of the day, with PC hardware usually remaining a good 3-4 years ahead of console hardware on average. Less at the start of the life cycle, more at the end.

All things considered, I think the hardware of the next generation of consoles will probably fall around the equivalent of an 8-core processor at around 2.00GHz per core, 4GB of RAM, and a GTX570. Of course, you will get a lot more out of the console in regards to gaming than you would a PC of equal specs, seeing as how that's all the console does. But in regards to actual hardware, I betcha that's about where it'll be.

I may have been too generous.
March 8, 2012 11:45:34 PM

tinnitus said:
It's not my opinion, it's just like that. Since 2008 I've been testing various cards: 4670, 5670, 5750, GTX 460, GTX 560 Ti, and now I'm with a 6850 CF. Nvidia runs smoother than AMD, but they are too high on price.


Ok. Now. Let's see. Through all the cards I've used, the nVidia cards have the more consistent framerates, but as for being smooth? Nothing runs smoother than my 3 6950s. So don't go fanboying us. Also, the 4670, 5670 and 5750 are all lower-end cards than a 460 and a 560 Ti, and the 6850s are very likely to microstutter. That's just what the 6800 series seems to do. And I haven't seen a single 6800 series Crossfire config that hasn't microstuttered. Please, just find someplace else to be a fanboy.
March 9, 2012 4:53:47 AM

V-sync seems quite different with a 120hz monitor vs 60hz. With the 120hz monitor, the latency seems to go away or at least it's too small to feel, but on a 60hz monitor, I do feel the difference at times.

Also, don't worry about the fact that DirectX doesn't offer a triple-buffering option, because with DirectX, triple buffering is built in. You can see it quite obviously through FPS. In OpenGL without triple buffering, if the game can't maintain over 60 FPS, you'll drop to 30 FPS instead, but in DirectX you'll get nearly the same FPS with or without v-sync on.
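A simplified sketch of why the framerate snaps like that (my own model, ignoring driver render-ahead queues):

```python
import math

# Simplified model (ignores driver render-ahead queues): with double-buffered
# vsync, each frame occupies a whole number of refresh intervals, so the
# framerate snaps to refresh_hz / n for an integer n. Triple buffering lets
# the GPU keep rendering into a third buffer, so fps stays near 1000/render_ms
# (capped at the refresh rate).
def vsync_fps(render_ms, refresh_hz=60, triple_buffered=False):
    refresh_ms = 1000 / refresh_hz
    if triple_buffered:
        return min(1000 / render_ms, refresh_hz)
    intervals = math.ceil(render_ms / refresh_ms)
    return refresh_hz / intervals

# A game rendering a frame every 18.2 ms (~55 fps) on a 60hz monitor:
print(vsync_fps(18.2))                                   # 30.0
print(round(vsync_fps(18.2, triple_buffered=True), 1))   # 54.9
```

With double buffering at 60hz the only possible rates in this model are 60, 30, 20, 15 and so on, so a game hovering just under 60 FPS collapses to 30; triple buffering removes that quantization at the cost of an extra buffer of latency.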

March 9, 2012 5:17:19 AM

Quote:
Agreed, but IMHO what you are experiencing between 120hz and 60hz is just the fact that 120hz is double the speed of 60hz. By the way, what GPUs do you run to manage 120 fps in today's games?


I don't manage 120 FPS, but I either make sure to get 60 fps in 3D Vision, or make sure to get at least 75 FPS without 3D. One of the differences between the two when it comes to latency is that with 120hz, the time the GPU must wait on the next vertical refresh is halved: if v-sync forces a frame to wait up to 1/60th of a second on a 60hz monitor, a 120hz monitor will halve that wait. Of course, if v-sync lines up well with rendered frames it might not actually cause latency, but it can.