How Necessary Will Dual Video Cards Be in the Near Future?

September 10, 2006 2:05:07 AM

I'm looking into getting a new machine and I'm wondering if I should get a motherboard with dual video card capability or just single video card capability. Will I not be able to play the games that come out two years from now with a single video card?

Thanks...
September 10, 2006 2:11:14 AM

They will never be necessary. Dual cards are only worthwhile with two top-of-the-line cards (one high-end card beats two mid-range cards), and not everyone can afford two $500 cards.
September 10, 2006 2:16:50 AM

Quote:
I'm looking into getting a new machine and I'm wondering if I should get a motherboard with dual video card capability or just single video card capability. Will I not be able to play the games that come out two years from now with a single video card?

Thanks...


A dual-GPU solution is a supreme waste of money unless you are gaming at high resolutions and have cash to burn. Such resolutions include 1600x1200 (4:3), 1920x1200 (16:10), 2048x1536 (4:3), etc.

In many (most) cases, a faster single GPU will beat out a dual-GPU solution of (even slightly) lower grade. The price you pay for two video cards does not balance well with the performance you end up getting. Until the technology matures further, it's something best passed over, especially considering the exorbitant cost of running two decent (or high-end) cards in a dual arrangement.
September 10, 2006 4:23:34 AM

I do agree with Arby. I would also like to add that I'm pretty impressed at how single cards (GPUs) are performing, even at the higher resolutions. I know this review at Anandtech (http://www.anandtech.com/video/showdoc.aspx?i=2821) is supposed to showcase ATI's top Xfire setup, but you can't help noticing the numbers the single cards are putting up at resolutions of 1600x1200 and above. Then again, I'm sure it's only a matter of a month or so until we see a game that brings all of those boards to their knees, even at 1280x1024. ;)
September 10, 2006 4:41:34 AM

My question is: knowing that the graphics load shifts from the CPU to the GPU at 1600x1200, how does the load work at 1680x1050 (my widescreen's native res)?

Yes, SLI/Xfire is costly and impractical for the general population, and the "gains" at resolutions under 1600x1200 are laughable in terms of cost. My main concern is gaming at 1680x1050: does SLI fall into the "costly but true gains" category or the "you paid $400-600 for 5 fps" category?

From what I understand, 1600x1200 = 1.92 million pixels, and 1680x1050 = 1.764 million pixels. Not too far apart. Has the load shifted yet?

Someone please help me understand this graphical load shift 8O
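For what it's worth, the pixel arithmetic is easy to check. A minimal sketch in Python, using just the resolutions mentioned in this thread:

```python
# Pixel counts for the resolutions discussed in this thread.
resolutions = {
    "1600x1200 (4:3)": (1600, 1200),
    "1680x1050 (16:10)": (1680, 1050),
    "1920x1200 (16:10)": (1920, 1200),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.3f} million pixels")
# 1600x1200: 1.920, 1680x1050: 1.764, 1920x1200: 2.304 (millions)
```

So 1680x1050 pushes about 8% fewer pixels than 1600x1200; whatever balance of CPU and GPU load you see at 1600x1200 should be close to what you get at 1680x1050.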
September 10, 2006 4:51:22 AM

Quote:
A dual-GPU solution is a supreme waste of money unless you are gaming at high resolutions and have cash to burn.


Now wait a minute... You just bought a 1950XTX and you're talking about dual GPU solutions being a waste of money? Value is not a simple quantity to define in many computing environments, but in the context of your recent purchase, this does sound a bit pot/kettle/black.

Quote:
Until the technology matures further, it's something best passed over, especially considering the exorbitant cost of running two decent (or high-end) cards in a dual arrangement.


It all depends on what you're trying to achieve and how you set your priorities. If money really matters most, then quit gaming and go to work instead! Regarding SLI and XFire, remember that power requirements and heat management can't be ignored. These configurations are overkill and cost a wad of cash.
September 10, 2006 5:13:29 AM

Dual-card solutions will most likely never be necessary; with each generation, the leap in performance is usually better than two of the previous generation's top-of-the-line cards.

Even though people are buying SLI and CrossFire solutions today, anyone on a budget should never consider SLI or CrossFire until adding a second graphics card gives at least a 90%-100% increase in performance. Once that happens, dual-card solutions still won't be necessary, but I would venture to say they would become the norm...

But if scaling never reaches that kind of increase over one card, then I doubt we'll ever see two graphics cards as the norm.
September 10, 2006 5:19:38 AM

Quote:
My question is: knowing that the graphics load shifts from the CPU to the GPU at 1600x1200, how does the load work at 1680x1050 (my widescreen's native res)?
I always thought it was 1280x1024? Or have things changed? I would be interested in this as well, because I'm looking at either a 20" WS (1680x1050) or a 24" WS (1920x1200). So far, for reviews, I've been guesstimating from the results at 1600x1200 and 1920x1440 to get an idea of performance.
September 10, 2006 5:35:40 AM

Quote:
I always thought it was 1280x1024? Or have things changed? I would be interested in this as well, because I'm looking at either a 20" WS (1680x1050) or a 24" WS (1920x1200). So far, for reviews, I've been guesstimating from the results at 1600x1200 and 1920x1440 to get an idea of performance.
If you are planning on playing all of your games at your native resolution of 1920x1200, I'd suggest getting a dual GPU setup. Two X1900XTs or 7900GTXs will provide a good gaming experience at this resolution.
September 10, 2006 5:47:44 AM

What about a resolution of 1680x1050? Which is more effective for performance, a single or dual graphics solution?
September 10, 2006 5:52:21 AM

Don't waste your time with dual-GPU setups, even for 1920x1200. I currently have a 24" widescreen with that native resolution. I have an X1900XTX, and I run HL2, BF2, and a number of other games at 8x to 16x FSAA (depending on the max the game allows) with no stutter or slowdown. Dual GPUs are unnecessary. I should mention I have 3 GB of RAM and a 4800+ X2 also. As far as I have ever seen in any benchmark, a dual-GPU setup is just something to brag about having; its utility isn't there.
September 10, 2006 6:03:06 AM

Ya, I know a single card can perform very well on current games. My new comp build includes one 7950GX2. I'm mainly concerned with future proofing (ya, ya, I know that's impossible, just leaving options open), as in being able to continue gaming at 1680x1050 without compromising much.

Point is, though, I can't seem to find much info on the load shifting from CPU to GPU at that specific resolution (I've been out of the loop for 5 years), so if someone has a link, I'd appreciate it so I can read up on it :wink:
September 10, 2006 6:30:03 AM

It's not impossible, and with video it's a lot harder than other aspects. However, dual GPUs just means your GPU gets outdated in 6 months, twice. The minimal bonus, if any (I have seen games perform worse under an SLI setup), isn't worth spending 2x more for. If you want to "future proof" your system, I suggest spending whatever it takes to get a good motherboard with a lot of upgrade options and building off of that. Make sure it has an SLI option, just in case SLI actually becomes worth getting in the future and actually gives a performance boost comparable to running two cards, instead of what you might expect from an overclocked single card.

BTW, I also upgraded from an X850XT PE, which also ran games on my old monitor at 1920x1440. I wasn't able to use all the FSAA options and such, but it ran games at more than playable frame rates. The resolution you want to run works on my old 9800XT at more than playable frames, even in most of today's games at 1600x1200, easy. How old is that card? Seriously, though, you do not need SLI for anything.

P.S. As far as the video card or CPU taking the load, it really depends on the engine. Unreal uses a lot of CPU power, while other engines couldn't care less about the CPU.
September 10, 2006 6:42:04 AM

:D I'm doing exactly what you're recommending, lol. My setup is a C2D E6600, OCZ 700W, P180B, and Corsair XMS2 DDR800. Currently I'm just waiting for my 7950GX2 card and motherboard (590 SLI I.E.) to be released. I'll just be using one card ("SLI on a stick") for now, and maybe in a year or two upgrade to quad core, SLI two 2nd-gen DX10 cards (I don't trust first gens), and Vista after a service pack is released. So ya, it's going to be a long time before I actually go "true" SLI (or quad, depending on future driver/game releases).

Nevertheless, I'm still interested in learning whether the CPU load shifts to the GPU at 1680x1050 or only at 1600x1200 and above. My only question, really 8)

EDIT: The games I'm looking forward to playing with lots of features on are Oblivion and F.E.A.R. I'm also trying to set my system up as overkill for now, in hopes of being able to fully enjoy future games like Crysis and Project Offset without having to change more than 1-2 components.
September 10, 2006 7:09:22 AM

Quote:
Don't waste your time with dual-GPU setups, even for 1920x1200. I currently have a 24" widescreen with that native resolution. I have an X1900XTX, and I run HL2, BF2, and a number of other games at 8x to 16x FSAA (depending on the max the game allows) with no stutter or slowdown. Dual GPUs are unnecessary. I should mention I have 3 GB of RAM and a 4800+ X2 also. As far as I have ever seen in any benchmark, a dual-GPU setup is just something to brag about having; its utility isn't there.
Interestingly enough, Battlefield 2 doesn't support 1920x1200, and if you force it, the field of view is unchangeable, so you actually lose viewing space. Try playing F.E.A.R. or Oblivion at 1920x1200 with 4xAA and you'll see what I mean.
September 10, 2006 7:31:24 AM

Interestingly enough, BF2's max resolution is 1600x1200; my monitor is 1920x1200. The FOV "hack" applies to resolutions where the vertical is less than 1200, which mine isn't. I simply removed the resolution lines in the video .con files and added a resolution line to the shortcut. You're right, though: it does NOT support that resolution, but from my tests, as long as your vertical is a minimum of 1200, I have seen no loss in viewing area (I could be wrong, but I have yet to see it). However, any game that supports 1920x1440 would, from what I understand, if you try to run 1920x1200, take the higher res and reduce the vertical, effectively taking 240 pixels away from your view even though it is still rendering them.

As far as FEAR or Oblivion, yes, I know the two other cards I mentioned would have a horrid time playing them. However, the newest ATI cards (and I'm sure the NVIDIA ones too) can play those games at really high settings with playable frame rates. But the extra, what was it, 5 or 10 frames from the SLI setup IMO isn't worth the price of, what, $500? I'm not trying to bash you or anything; I'm just trying to help him out.

As far as quad core, unless I missed something, Windows XP SP2 is only able to run two CPUs. I know in 2K you needed 2K Server to run four CPUs; not sure if XP is the same. I didn't really check into XP's CPU support very much, since when it came out there really wasn't much in the way of dual CPUs.

P.S. I don't know if this actually matters, but my monitor supports a 16:10 aspect ratio while running in widescreen, which might also be why it isn't cutting anything off in BF2.
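For anyone trying to follow the FOV side of this, here is a rough sketch of the trig involved. The 90-degree 4:3 baseline is a made-up illustration, not BF2's actual value; the point is just how the horizontal angle widens when a game holds the vertical FOV fixed as the aspect ratio changes:

```python
import math

def horizontal_fov(vfov_deg: float, aspect: float) -> float:
    """Horizontal FOV when the vertical FOV is held fixed at a given aspect ratio."""
    v = math.radians(vfov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

# Hypothetical baseline: 90 degrees horizontal at 4:3, i.e. ~73.7 degrees vertical.
vfov = math.degrees(2 * math.atan(math.tan(math.radians(90) / 2) * 3 / 4))

for name, aspect in [("4:3", 4 / 3), ("16:10", 16 / 10), ("16:9", 16 / 9)]:
    print(f"{name}: horizontal FOV {horizontal_fov(vfov, aspect):.1f} degrees")
# 4:3 -> 90.0, 16:10 -> 100.4, 16:9 -> 106.3
```

A game that instead locks the horizontal FOV does the reverse: at 16:10 or 16:9 it shrinks the vertical angle, which is the "losing viewing space" complaint above.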
September 10, 2006 7:37:14 AM

Quote:
As far as quad core, unless I missed something, Windows XP SP2 is only able to run two CPUs. I know in 2K you needed 2K Server to run four CPUs; not sure if XP is the same. I didn't really check into XP's CPU support very much, since when it came out there really wasn't much in the way of dual CPUs.
You are correct. But MS defines a cpu/processor as the physical chip you put in the socket. You can have "n" cores for any cpu/processor.
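If you want to see what the OS actually schedules across, a quick check (Python's os.cpu_count() returns logical processors, i.e. all cores on all sockets, not physical chips):

```python
import os

# Logical processor count: every core on every socket, plus
# Hyper-Threading siblings if present. The XP licensing limit counts
# sockets, but the scheduler still uses all the cores it sees.
print(os.cpu_count())

# On Windows the same figure normally appears in an environment variable.
print(os.environ.get("NUMBER_OF_PROCESSORS"))
```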
September 10, 2006 7:40:52 AM

Guess I need to read up more about it; I always just assumed Windows read it as two separate CPUs and would treat it as such. I'm more of a hardware guy, sorry; I'm not the person to ask when it comes to software. So if Windows will see a quad core and treat it as such, I can't wait till those come out.
September 10, 2006 8:01:41 AM

Quote:
I'm looking into getting a new machine and I'm wondering if I should get a motherboard with dual video card capability or just single video card capability. Will I not be able to play the games that come out two years from now with a single video card?

Thanks...


My crystal ball gets a bit cloudy after about 90 days... I think it's fairly safe to say that the AGP interface will be very scarce in 24 months; I'm not at all sure about the PCI-E interface. I'm also not sure what CPUs and motherboards/sockets will be available in 24 months, nor what games, OSes, etc., all of which might have an impact on anyone's decision.

I really don't think anyone here can accurately predict what'll be available in 24 months, either game-wise or hardware-wise. Get what you think is the best bang for your currency at the moment, because in 90 days it will probably be outdated. When it's no longer usable for what you need it for, get whatever's best for you at that time.

'You places your bet, and you takes your chances...'

Just my 2p worth.
September 10, 2006 8:18:02 AM

No worries. Their documentation is kind of ambiguous; I'm sure they don't mind getting people to shell out extra cash on an unnecessary OS upgrade. :D 
September 10, 2006 1:07:30 PM

Quote:
A dual-GPU solution is a supreme waste of money unless you are gaming at high resolutions and have cash to burn.


Now wait a minute... You just bought a 1950XTX and you're talking about dual GPU solutions being a waste of money? Value is not a simple quantity to define in many computing environments, but in the context of your recent purchase, this does sound a bit pot/kettle/black.

In the same post, I later explained my point. The performance gained by a dual-GPU arrangement is far from matched by the price you pay. You mention the X1950XTX ... two of those cards in CrossFire would cost upwards of $950, depending on where you purchased them. Dual X1950s would not deliver double the performance of a single X1950. It's only at high resolutions that the advantage becomes more obvious, and even then it's an advantage that doesn't justify spending nearly $1,000.
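To put rough numbers on that, a cost-per-frame sketch; the 60 fps baseline and the ~1.5x CrossFire scaling are hypothetical placeholders, and only the per-card price reflects the figures above:

```python
# Hypothetical numbers: a single X1950XTX at ~$475 doing 60 fps, and a
# CrossFire pair at ~$950 scaling to ~1.5x (placeholder, not a benchmark).
single_price, single_fps = 475, 60
dual_price, dual_fps = 950, 60 * 1.5

print(f"single card: ${single_price / single_fps:.2f} per fps")
print(f"CrossFire:   ${dual_price / dual_fps:.2f} per fps")
# single card: $7.92 per fps
# CrossFire:   $10.56 per fps
```

Doubling the price for a 1.5x gain makes every frame cost a third more.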

Quote:
Until the technology matures further, it's something best passed over, especially considering the exorbitant cost of running two decent (or high-end) cards in a dual arrangement.


It all depends on what you're trying to achieve and how you set your priorities. If money really matters most, then quit gaming and go to work instead! Regarding SLI and XFire, remember that power requirements and heat management can't be ignored. These configurations are overkill and cost a wad of cash.

What you say is fair, just as what I said was fair. I was simply giving onimusha2000 some advice and trying my best to explain myself. Your last sentence reflects my original "bottom line" well. :) 
September 10, 2006 1:30:39 PM

Dual cards will never be necessary, and they will ALWAYS be a waste of money. There's your answer.
September 10, 2006 2:38:26 PM

I would say get a dual-slot mobo, as single slots are rare and there isn't much difference in price. That way you can add another card in the future if you see the need for it.

I have an SLI setup but am currently using only one card; running both cards adds more heat than I like.

And I play at 1024x768 mostly, so I don't see much difference with SLI or not.

I do have a 32-inch HDTV monitor with 1920x1080 but can't get SLI to work properly with it. :cry: 
September 10, 2006 2:58:39 PM

I can get it to work, but I can't see all icons or the start menu, and I can't see all icons in games.

I have two Arctic coolers for the cards. One is on the top card, and it blows out the back of the case. The bottom card has the stock cooler, which blows air inside; the AC cooler won't fit in the bottom slot unless I raise my case and cut a hole in the bottom of it. :x
September 10, 2006 3:21:19 PM

I haven't messed with it a whole lot. It does have different options, though: some options are <treat as HD display> or not, and another is a 240 kHz refresh rate I haven't even messed with. It works great with one card, though. Oblivion looks awesome, but I can't see the in-game icons. Call of Duty at 1920x1080 is great; it is the only game that works well with SLI and 8x AA. I tried 16x AA, but it is a little choppy.

I'll try some more and let you know :) 
September 10, 2006 10:54:53 PM

It could be your aspect ratio; it looks like you have a 16:9 aspect ratio. If I'm not mistaken, HL2 has an option to check what kind of aspect ratio you have; maybe it's just not set up correctly? (I wouldn't be able to tell you if there is an option for that in Windows.) Also, you might want to move your mouse around and see if the screen pans; Windows may have set a resolution your monitor doesn't support. Normally it's as easy as deleting the default monitor to fix something like that.

And to the guy who said he's getting 50% to 80% fps increases (which I don't believe for a second): you, sir, defy all the benchmarks I have read in at least the past year. I wonder why they can't get consistent results over 10%, but you seem to do it with ease :-/ Of course, with a single Radeon X1900XTX I get about 60 to 100 frames in BF2 with everything jacked up at 1920x1200. I believe 60 fps is about the limit of what you can see, and 100 fps is (or was) typically all a game would render, so getting to about 70 to 110 isn't really something that's worth $500. I do consider myself one of those "I have to have the best computer" people, but damn, I like to pay for something that actually increases performance.
September 10, 2006 11:06:55 PM

Not sure; those seem lower than my average frames in those games. In Source games I typically cap out at 100 frames almost all the time (love well-coded engines, btw). From the looks of our specs, they are pretty close. No offence, but it sounds like a downgrade for me.
September 10, 2006 11:25:22 PM

I don't mean a downgrade for you; I meant for me, and your setup isn't slow. I think you're taking me wrong here: I don't mean to belittle your system or your upgrade. I'm only trying to tell the guy asking about it that it's simply not worth the money. Also, my bad, I didn't notice you said X1900XT and not XTX; guess I shall try reading harder when I wake up or am tired. I use the 3 gigs simply because I got 2 more gigs and decided to keep the other 1 gig (512 x2) in until I buy another 1024 x2 later. I run A LOT of stuff; normally when I play an MMO, I run 3 to 4 accounts at once at the max detail the game allows.

P.S. I didn't know the HL2 (Source) engine was highly CPU dependent; I thought it struck a nice balance between the two, since it's able to run on considerably lesser systems than most engines.
September 10, 2006 11:38:49 PM

Quote:
Dual cards will never be necessary, and they will ALWAYS be a waste of money. There's your answer.
Of course, that is all subjective. In most cases, SLI and CrossFire are not required, but that doesn't mean you should discourage someone from getting it if it fits their needs.
September 10, 2006 11:49:56 PM

I was thinking of DoD. Lost Coast, as far as I could see, didn't really change much, but I haven't tried it on my new video card, just my old X850XT PE and 3500+ Athlon 64, so I wasn't running 16x FSAA (on Lost Coast, mind you). I have recently downloaded Episode One, and once I actually get to play it, I'll let you know how it works at max details.

Reply to heyyou: he was simply asking if SLI was necessary, which it's 100% not, but I did tell him to get a mobo that can use all the latest SLI options, in case it does become worth the money.
September 11, 2006 12:02:58 AM

I would have to say that although two video cards are not necessary, having them does give you a significant performance increase. At least I've noticed it in my 3DMarks as well as in gaming, but that is me. A single high-end card will work fine, but I do have to say that I believe in the future two cards may very well be the mainstream. Anyways, I hope I've helped you in some way. Good luck.

Dahak

EVGA VF4 SLI MB
X2 4400+@2.4 S-939
2 7800GT'S IN SLI MODE
2X1GIG DDR400 MEMORY IN DC MODE
WD300GIG HD
520WATT PSU
EXTREME 19IN.MONITOR
3DMARK05:11,582
September 11, 2006 12:56:09 AM

Yes, my monitor is 16:9 or 4:3; it does both. It is actually a Sony Trinitron 32-inch HDTV. It displays HD in widescreen, meaning black bars at the top and bottom of the screen. It has composite in, component in, S-Video in, and DVI in. It is a 200 lb behemoth flat-screen CRT, about 4 years old.

I have tried 800x600, 1024x768, 1280x1024, 1600x1200, and 1920x1080.

With those settings the desktop is FUBAR, meaning I can't see all of the desktop: the start menu, some icons, etc. With one card I can see everything fine, but at 1166x788 or something close to that.

When I try SLI, I can't get the whole desktop to show at any resolution.

I haven't tried it but twice; I take my comp into the living room to hook it up, as it is normally in my bedroom on a 17-inch Dell CRT. Of course, on the 17-inch there is not much difference with SLI: Oblivion is faster with SLI, all other games are the same, and FEAR isn't any faster either way.

The 1.5 patch for FEAR made my single GPU seem twice as fast, as I had never updated FEAR since I bought it new 6 months ago, or whenever it became available for sale.
September 11, 2006 1:24:49 AM

Well, like I said before, I don't know if Windows has an option, or the game; try setting it to a 16:9 aspect ratio. Mine doesn't run 16:9, it runs 16:10; it could be that simple. Normally, however, 16:9 eliminates those black bars; you could be running in 16:10 :-/ and I'm not even sure 1920x1080 is a valid resolution for your setup. Do you mean "all" as in left to right, or do you use a double toolbar like I do and can't see the bottom toolbar? This is highly unlikely since it's a CRT, but have you checked whether the monitor supports all those resolutions, or perhaps your video card? Some video cards tend to support different resolutions. I'm sure you're using the DVI in, correct? Your problem sounds very strange indeed. Don't suppose your monitor or TV has computer drivers? Sounds like a nice TV, btw.

As far as the SLI thing goes, in the future, when stuff is actually coded to use it and the tech is more mature, I expect the performance gains will be HUGE, which is why I told him to get a board that will support it for future use. Assuming he's not like me and builds a whole new $2000 computer every few months to a year, that is.

Oh, I forgot: for the last hour I have been playing Lost Coast. The graphics are very nice, and my frames hardly moved. It's still maxed out; I haven't seen it dip far enough to even tell that the frames had dropped.
September 11, 2006 1:27:54 AM

Quote:
A dual-GPU solution is a supreme waste of money unless you are gaming at high resolutions and have cash to burn.


Now wait a minute... You just bought a 1950XTX and you're talking about dual GPU solutions being a waste of money? Value is not a simple quantity to define in many computing environments, but in the context of your recent purchase, this does sound a bit pot/kettle/black.

In the same post, I later explained my point. The performance gained by a dual-GPU arrangement is far from matched by the price you pay. You mention the X1950XTX ... two of those cards in CrossFire would cost upwards of $950, depending on where you purchased them. Dual X1950s would not deliver double the performance of a single X1950. It's only at high resolutions that the advantage becomes more obvious, and even then it's an advantage that doesn't justify spending nearly $1,000.

I think you missed it. My cynical point was that when one buys a hot GPU right at release, one must be prepared to pay a price penalty. You have chosen to pay said penalty for the 1950 you bought. I'm not faulting you for that in the least, OK? The 1950 looks good to me too - even XFired 1950s look good... But I'm not the one advising the OP to think about cost effectiveness - you are. If you were worried about cost effectiveness, all you would have had to do is wait a couple of months, and you could have snagged a deal on a 1950. That's more cost effective than being an early adopter, just like buying a single graphics card is more cost effective than doing SLI or XFire.