
1280 x 1024 - Do I need SLI/Crossfire?

August 25, 2006 4:18:13 PM

Heyas,

Just curious actually, but if I only run games (preferably options maxed) @ 1280 x 1024, can the next generation of single GPUs handle this?

I feel like I've got myself completely caught up in the SLI/Crossfire craze. I own two 7800 GTXs and have only seen major performance changes with Oblivion. I'm just hoping a more powerful next-generation single GPU is all one really needs for beautiful graphics at good FPS. It's been so long since I've run a single card... I kind of feel out of touch, lol. How low an FPS can the human eye really detect? The older I get and the more I value my money, I'm not sure I need 120 fps when there would be no difference from 60 fps at the same settings.

I apologize if this is an age-old question, but I'd really appreciate some thoughtful (non-flaming) advice.

Thanks!
August 25, 2006 4:25:00 PM

I believe the human eye can only distinguish somewhere between 23-26 FPS, depending on the environment and the moving picture.

However, if you're gaming at that resolution, SLI or Crossfire would be a waste of money. Go for a single high-end GPU. Even then, it's pretty pointless to buy high end if that's the max your monitor supports.
August 26, 2006 12:18:38 AM

Quote:
I believe the human eye can only distinguish somewhere between 23-26 FPS, depending on the environment and the moving picture.

However, if you're gaming at that resolution, SLI or Crossfire would be a waste of money. Go for a single high-end GPU. Even then, it's pretty pointless to buy high end if that's the max your monitor supports.


Raven is correct. The human eye cannot detect differences above 24 frames per second.

SLI would definitely be a waste if you're only gaming at 1280x1024.
August 26, 2006 12:54:54 AM

Not to throw off the validity of the highest change in fps that's still detectable to the human eye (and maybe it's different from person to person, too), but in some games (I'll give Dungeon Siege as an example), even running at 30 FPS with vertical sync enabled in the GPU settings, the overall movement just doesn't 'feel' as smooth as running at, say, 60 FPS. You won't really see stuttering at 60 FPS, but the subtleties in how things move and interact with each other as the perspective rotates around don't seem as smooth at much less than that. That could be a result of that game's particular engine, but FPS above 30 definitely comes into play as far as being detectable, so I guess it depends a great deal on the game's internal engine too.
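A rough sketch of the vsync side of this, since the post mentions it (my own illustration, assuming a 60 Hz display with plain double buffering; none of these numbers come from the posts above): with vsync on, a frame that misses the refresh deadline waits for the next one, so effective frame times snap to multiples of the refresh interval, which is one reason "around 30 fps with vsync" can feel much less fluid than 60.

```python
# Illustrative only: 60 Hz display, plain double-buffered vsync assumed.
# A frame that isn't ready at a refresh waits for the next one, so the
# displayed frame time rounds up to a multiple of the refresh interval.

import math

refresh_ms = 1000 / 60  # 60 Hz display

for render_ms in (10, 15, 17, 25, 30):
    shown_every_ms = math.ceil(render_ms / refresh_ms) * refresh_ms
    print(f"render {render_ms:2d} ms -> displayed every {shown_every_ms:.1f} ms "
          f"(~{1000 / shown_every_ms:.0f} fps)")
```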
August 26, 2006 12:58:44 AM

Quote:
I believe the human eye can only distinguish somewhere between 23-26 FPS, depending on the environment and the moving picture.

However, if you're gaming at that resolution, SLI or Crossfire would be a waste of money. Go for a single high-end GPU. Even then, it's pretty pointless to buy high end if that's the max your monitor supports.


Raven is correct. The human eye cannot detect differences above 24 frames per second.

SLI would definitely be a waste if you're only gaming at 1280x1024.

Sorry guys, but that's wrong. I can see the difference between 60 and 100. Fighter pilots can distinguish things at a rate of up to 1/200th of a second in training conditions.

The reason movies and television are only 24 frames a second is that the camera aperture is open for the whole 1/24th of a second, capturing everything that happens during that time. Computers only render still images, so you need a lot more frames on a computer to keep up the illusion of fluid motion.

Most gamers find 60 frames and up to be the sweet spot; that's where Joe Average can no longer tell the difference.

Back to the topic at hand: YES, SLI is a waste of money. Unless you're gaming on a 1080p set with all the bells and whistles on, you're just throwing money away. I've got a single 7900 GT and it shreds everything I've thrown at it at 1280x1024. BF2 maxed out with 4xAA/8xAF runs like butter. Oblivion is the only game I see any visible slowdown in, but it can bring a quad-core, quad-GPU system to its knees with the right options enabled, so I'm not worried.
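To put rough numbers on the "no motion blur" point, here's a quick Python sketch (illustrative only; the object speed is a value I picked, not anything from the thread). Since each rendered frame is a sharp still, the distance an object jumps between consecutive frames is what reads as judder, and it shrinks as the frame rate goes up.

```python
# Rough illustration of the "no motion blur" argument: how far an object
# jumps between two consecutive frames at different frame rates. The
# speed (an object crossing a 1280-pixel-wide screen in one second) is a
# number chosen for illustration, not anything from the thread.

speed_px_per_s = 1280.0

for fps in (24, 30, 60, 100):
    frame_ms = 1000 / fps
    jump_px = speed_px_per_s / fps
    print(f"{fps:>3} fps: {frame_ms:5.1f} ms per frame, "
          f"object jumps {jump_px:5.1f} px between frames")
```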
August 26, 2006 1:08:28 AM

At 1280x1024 you only really need Crossfire and SLI for Oblivion; literally every other title is fine on a single GPU solution.
August 26, 2006 2:24:12 AM

Quote:
I believe the human eye can only distinguish somewhere between 23-26 FPS, depending on the environment and the moving picture.

However, if you're gaming at that resolution, SLI or Crossfire would be a waste of money. Go for a single high-end GPU. Even then, it's pretty pointless to buy high end if that's the max your monitor supports.


Raven is correct. The human eye cannot detect differences above 24 frames per second.

SLI would definitely be a waste if you're only gaming at 1280x1024.

Sorry guys, but that's wrong. I can see the difference between 60 and 100. Fighter pilots can distinguish things at a rate of up to 1/200th of a second in training conditions.

The reason movies and television are only 24 frames a second is that the camera aperture is open for the whole 1/24th of a second, capturing everything that happens during that time. Computers only render still images, so you need a lot more frames on a computer to keep up the illusion of fluid motion.

Most gamers find 60 frames and up to be the sweet spot; that's where Joe Average can no longer tell the difference.

Back to the topic at hand: YES, SLI is a waste of money. Unless you're gaming on a 1080p set with all the bells and whistles on, you're just throwing money away. I've got a single 7900 GT and it shreds everything I've thrown at it at 1280x1024. BF2 maxed out with 4xAA/8xAF runs like butter. Oblivion is the only game I see any visible slowdown in, but it can bring a quad-core, quad-GPU system to its knees with the right options enabled, so I'm not worried.


Sorry, but that's not wrong. It may not be an absolute truth, but it's not wrong, and I'd guess ~30 fps (steady) is good for most players. Also, I think part of the perception of 100 fps is psychological, due to that Fraps indicator in the corner of your screen...

As for the original question, if you had bought a single 7800 GTX you would be good to go at 1280x1024.
August 26, 2006 2:40:27 AM

Quote:

The reason movies and television are only 24 frames a second is that the camera aperture is open for the whole 1/24th of a second, capturing everything that happens during that time. Computers only render still images, so you need a lot more frames on a computer to keep up the illusion of fluid motion.


Actually, TV programs are shot at 30 fps.
August 26, 2006 2:55:39 AM

IMO, X-Fire/SLI is only worth it if you have a monitor that will go to 1600x1200 or more. At 1280x1024, you'll do fine with a single high-end card.
August 26, 2006 3:25:43 AM

Quote:
IMO, X-Fire/SLI is only worth it if you have a monitor that will go to 1600x1200 or more. At 1280x1024, you'll do fine with a single high-end card.


Heck, I'd say Crossfire/SLI doesn't come into its own until 1920x1200+ (unless you play Oblivion, of course :p ).

And by that point, you're in 24" LCD territory, and the people who buy those probably won't bat an eye at dropping a grand on their graphics solution.

My guess is that as LCDs get cheaper and GPUs get more powerful, you'll see the popularity of Crossfire/SLI decline. It may just be a temporary band-aid solution for the lucky few who game at 1920x1200+. I'm fairly confident that within a year or so you will see single-GPU solutions capable of handling those resolutions with aplomb. Perhaps dual-core GPUs (not SLI-on-a-stick like the 7950) will emerge to give us a boost.

Of course, predicting the future of technology is always tricky.

Best of luck with your choice.
August 26, 2006 3:30:44 AM

Part of the trouble here is that human eyes do vary, with some people able to detect differences up to 60 fps, but I think the real source of the problem is minimum frame rates. I like to have a GPU that delivers a minimum of 30 fps; in fact, my eyes like a minimum of 40 fps. With a minimum of 30 (or 40) fps, much of the gameplay will be well beyond that, showing 100 to 200 fps at times.

As for Crossfire/SLI, I think it's wasted money unless you've got a big monitor with a very high resolution. Perhaps as more games become like Oblivion or beyond, with eye candy galore, it might seem necessary. At the same time, as single cards keep getting better, a single card may handle everything anyway on smaller monitors at lower resolutions.
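A tiny sketch of the minimum-versus-average point (the frame times below are made-up numbers for illustration, not a benchmark from anyone in this thread): an average that looks comfortably above 60 fps can still hide dips that the eye catches.

```python
# Sketch of "minimum fps matters more than average". The frame times are
# invented for illustration; they are not measurements from the thread.

frame_times_ms = [8, 9, 10, 9, 8, 45, 50, 9, 8, 10]  # one rough patch

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000 / max(frame_times_ms)

print(f"average: {avg_fps:.0f} fps")   # looks comfortably high (~60)
print(f"minimum: {min_fps:.0f} fps")   # the dip you actually notice (20)
```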
August 26, 2006 4:02:29 AM

I can easily tell when a game is running under 30 fps, and for some reason under 60 on BF2. When I was running Fraps I noticed a big difference in fluidity between 85 fps and 60 fps; I can't tell on other games though. 30 fps seems a bit laggy to me when I play FEAR, and 30 fps is unplayable on BF2. I haven't a clue as to why :D  Maybe Fraps' fps counter is messing with my head.
August 26, 2006 4:39:59 AM

Quote:
I believe the human eye can only distinguish somewhere between 23-26 FPS, depending on the environment and the moving picture.

However, if you're gaming at that resolution, SLI or Crossfire would be a waste of money. Go for a single high-end GPU. Even then, it's pretty pointless to buy high end if that's the max your monitor supports.


OK, let's put this argument to bed.

You're right and you're wrong. In live motion capture the human eye can detect up to about 26-28 FPS, like you said; you got that point correct. However, what you haven't addressed is the context we are talking about: gaming. Gaming is an entirely different animal compared with movies and live motion capture, because each frame in a game stands on its own. That is to say, in gaming there is no motion blur (even the programmed motion blur in NFS is still just that: programmed).

I can definitely tell the difference between 30, 45, and 60 FPS; once you get beyond 60-70 FPS it does become a moot point, I agree. However, depending on the game, having 60 FPS instead of 30 FPS can make the difference between life and death. That split second of lag can cost you a snipe or cause you to be sniped (see the rough numbers below). However, if you get into RPGs and strategy games like Command & Conquer or WoW, anything over 30 FPS is gravy and not really necessary; believe me, I play both.

That being said, back to the point of this thread: SLI at anything less than 1600x1200 is a waste of money (Oblivion notwithstanding). Even at 1600x1200 the benefit isn't as great as I would like it to be for the cost. I used to have two 7900 GTXs on a DFI Expert SLI board, so I've had the experience. Looking ahead, I would say you only need one GPU if you intend to keep your res at 1280x1024; you will just have to keep buying the high-end card if you want to keep the level of detail at max. Well... depending on how the second-tier cards are configured, you may be able to get by with one of those at your res; you will have to wait and see.

I game at 1600x1200 with max details and I was still hard-pressed to justify my SLI (when I had it).
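For the split-second-lag argument above, here is the back-of-the-envelope arithmetic in Python (frame times only; input lag, display lag, and network ping would come on top of these numbers):

```python
# Gap between consecutive screen updates at each frame rate. These are
# pure frame times; other sources of latency are deliberately ignored.

for fps in (30, 45, 60):
    print(f"{fps} fps -> a new frame every {1000 / fps:.1f} ms")

extra_ms = 1000 / 30 - 1000 / 60
print(f"running at 30 fps instead of 60 adds up to ~{extra_ms:.1f} ms "
      f"before you see what just happened")
```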
August 26, 2006 6:15:12 PM

Quote:
I can easily tell when a game is running under 30 fps, and for some reason under 60 on BF2. When I was running Fraps I noticed a big difference in fluidity between 85 fps and 60 fps; I can't tell on other games though. 30 fps seems a bit laggy to me when I play FEAR, and 30 fps is unplayable on BF2. I haven't a clue as to why :D  Maybe Fraps' fps counter is messing with my head.


That was part of my point. Many people can tell when a game drops below 30 fps; for me, it's about 40 fps. With some games, as you point out with BF2, there can be enough going on that if the frame rate drops too much, the brain can seem to disconnect from what the eyes see. That causes a feeling something like vertigo, making the game unplayable for some people.

If a video card is good enough that the minimum frame rate is 30 when the absolute most is going on, then most of the game will run significantly faster. Observe how Oblivion can move along OK until an outdoor scene comes up and then the gameplay slows to a crawl. That's when the GPU is put under the most stress, hence the need for a very powerful GPU to keep the frame rate adequate.
August 26, 2006 6:20:21 PM

Quote:
I believe the human eye can only distinguish somewhere between 23-26 FPS, depending on the environment and the moving picture.

That was a joke, right? Most games are noticeably laggy at those frame rates.
August 26, 2006 8:23:50 PM

Quote:
I believe the human eye can only distinguish somewhere between 23-26 FPS, depending on the environment and the moving picture.

That was a joke, right? Most games are noticeably laggy at those frame rates.

Read my post... he is right in a live-action context, but not in gaming.
August 26, 2006 8:36:05 PM

What do you get in 3dmark05 with your sig rig?
August 26, 2006 9:06:58 PM

Quote:
What do you get in 3dmark05 with your sig rig?


No idea, I lost my benchies when I fried my hard drive and never bothered to redo them. I would run through them for you, but I am waiting on my replacement board from Newegg... I fried another DFI board, lol.

The first time it was a capacitor gone awry, and this time the BIOS ROM chip failed. Oh well, gives me time to clean the case? HAHA, like I clean my case... just my fans. 8O
August 29, 2006 12:55:17 AM

Quote:
I believe the human eye can only distinguish somewhere between 23-26 FPS, depending on the environment and the moving picture.

That was a joke, right? Most games are noticeably laggy at those frame rates.

Hence my comment about the environment... perhaps I should have used the phrase "game stage."

I can get the settings just right on WoW to show you that a constant 24 fps on a 9600 Pro looks very smooth... MMOs are something of an exception.
But in a 3rd/1st-person game (Oblivion, FEAR, etc.), of course you shouldn't play at anything below 30 fps.
August 29, 2006 1:15:54 AM

If you do want to go SLI or Crossfire, do NOT SLI a 7600 GT, and do NOT Crossfire an X1600 XT! You are better off getting a single high-end video card, such as an X1900 XT!
August 29, 2006 1:33:53 AM

I have SLI (two 6800 GTs) and game mostly on a Dell 19-inch CRT at 1024x768. Compared to just one 6800 GT, I can't see any difference, though I can feel the difference: it's warmer now, lol.
I have a 32-inch Sony HDTV, and BF2, FEAR, Oblivion, and CoD2 all looked great until I went SLI; now I can't get it to work properly on the Sony TV, with SLI that is. With one card it looked great.
August 29, 2006 1:40:56 AM

But hey, if you already have SLI, you could split the cards up and have two good computers. For a little while, that is. :)
August 29, 2006 1:48:37 AM

I'm just repeating what most everyone's already said, but...

Unless you game at 1600x1200 or above, SLI really isn't worth it. I've got two overclocked 7900 GTs, and the returns diminish vs. a single card when you go below that res. At 1280x1024 you're much better off on many levels with a single high-end card. I usually play at 1920x1200, where SLI really starts to stretch its legs.

Also, may I ask why you spent a large chunk of change on two cards, but not on a monitor that supports a higher res?
August 29, 2006 1:56:17 AM

Quote:
If you do want to go SLI or Crossfire, do NOT SLI a 7600 GT, and do NOT Crossfire an X1600 XT! You are better off getting a single high-end video card, such as an X1900 XT!


OK, well, let me add to the convo. I just built a nice Core 2 Duo system with the following:
X1900 XT
2 GB RAM
E6400

I play games like NFS: Most Wanted, FEAR, Dawn of War, and AoE3. I have an old, semi-crappy monitor that seems to have an issue with ghosting, but when I crank each game up to the max I get some stuttering. Even with my card, FEAR and NFSMW at 1280x1024 have some stuttering. Could it be my monitor? I believe it's rated at 25 ms response time and 350:1 contrast. My main question is that I am looking at getting a new monitor. If I go 20-inch, would it kind of be a waste since I couldn't run at that res? Am I better off getting a 19-inch? And should I be able to run those games at their fullest, or would it be normal to have a bit of stuttering?
NOTE: I tried that Catalyst Overdrive thing, and it did help, but I worry about using it...
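On the ghosting part of that question, a rough bit of arithmetic (not a diagnosis; the 25 ms figure is simply the response time quoted above): a panel that needs about 25 ms to change a pixel can't finish the transition inside one 60 Hz frame, which shows up as smearing or ghosting. Stutter from the GPU side is a separate issue.

```python
# Rough arithmetic on the 25 ms panel mentioned above. A pixel that
# needs ~25 ms to change can't complete its transition within a single
# frame at common refresh rates, so moving edges smear (ghosting).

response_ms = 25.0  # response time quoted in the post

for refresh_hz in (60, 75):
    frame_ms = 1000 / refresh_hz
    print(f"{refresh_hz} Hz: one frame lasts {frame_ms:.1f} ms -> "
          f"a transition spans ~{response_ms / frame_ms:.1f} frames")
```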
August 29, 2006 2:10:32 AM

Quote:
Quote:

Raven is correct. The human eye cannot detect differences above 24 frames per second.

SLI would definitely be a waste if you're only gaming at 1280x1024.


I'm sorry, but this is major BS. I can tell the difference in any game that drops below even 45 fps, much less 35 or lower.
August 29, 2006 2:16:11 AM

Can you tell the difference between 25 Hz and 5 Hz in sound? I mean low frequency, that is.

I don't think so; it can be measured but not heard.
August 29, 2006 2:19:53 AM

Quote:
Quote:

Raven is correct. The human eye cannot detect differences above 24 frames per second.

SLI would definitely be a waste if you're only gaming at 1280x1024.


I'm sorry, but this is major BS. I can tell the difference in any game that drops below even 45 fps, much less 35 or lower.

I can't tell the difference between 30 and 50 fps, but my d**k is bigger than yours.
August 29, 2006 2:25:36 AM

lol
August 29, 2006 2:40:18 AM

Quote:
If you do want to go SLI or Crossfire, do NOT SLI a 7600 GT, and do NOT Crossfire an X1600 XT! You are better off getting a single high-end video card, such as an X1900 XT!


OK, well, let me add to the convo. I just built a nice Core 2 Duo system with the following:
X1900 XT
2 GB RAM
E6400

I play games like NFS: Most Wanted, FEAR, Dawn of War, and AoE3. I have an old, semi-crappy monitor that seems to have an issue with ghosting, but when I crank each game up to the max I get some stuttering. Even with my card, FEAR and NFSMW at 1280x1024 have some stuttering. Could it be my monitor? I believe it's rated at 25 ms response time and 350:1 contrast. My main question is that I am looking at getting a new monitor. If I go 20-inch, would it kind of be a waste since I couldn't run at that res? Am I better off getting a 19-inch? And should I be able to run those games at their fullest, or would it be normal to have a bit of stuttering?
NOTE: I tried that Catalyst Overdrive thing, and it did help, but I worry about using it...

Depending on the resolution, your X1900 XT will handle any 20-inch monitor with full eye candy, except for Oblivion, which seems to be the exception to everything. Only when new games come out next year will you have to sacrifice some eye candy, but you'll still be able to play at that high a res, assuming it's 1600x1200.
August 29, 2006 11:10:11 AM

Quote:
I still get some stuttering at 1280x1024 with full graphics and anti-aliasing, etc., in FEAR and NFSMW. What can I look at to fix it? Should I be able to run them at 1600x1200? See this post if you could offer me a solution:
http://forumz.tomshardware.com/hardware/modules.php?nam...


If you are getting stuttering in NFSMW, go into the advanced video settings and turn down the reflection refresh rate. If I remember right, that was one of the biggest factors in FPS when I was playing (I beat the game a long time ago, so my memory is a bit rusty).


FREAKING crickets invaded my roof last night... I can't hear my own thoughts anymore!