How Necessary Will Dual Video Cards Be in the Near Future?

onimusha2000

Distinguished
Sep 10, 2006
2
0
18,510
I'm looking into getting a new machine and I'm wondering if I should get a motherboard with dual video card capability or just single video card capability. Will I not be able to play the games that come out two years from now with a single video card?

Thanks...
 

prozac26

Distinguished
May 9, 2005
2,808
0
20,780
They will never be necessary. It's only worthwhile with two top-of-the-line cards (one high-end card > two mid-range cards), and not everyone can afford two $500 cards.
 

ArbY

Distinguished
Aug 17, 2004
346
0
18,780
I'm looking into getting a new machine and I'm wondering if I should get a motherboard with dual video card capability or just single video card capability. Will I not be able to play the games that come out two years from now with a single video card?

Thanks...

A dual-GPU solution is a supreme waste of money unless you are gaming at high resolutions and have cash to burn. Such resolutions include 1600x1200 (4:3), 1920x1200 (16:10), 2048x1536 (4:3), etc.

In many (most) cases, a faster single GPU will beat out a dual GPU solution of (even slightly) lower grade. The price you pay for two video cards does not balance well with the performance you end up getting. Until the technology further matures, it's something better overlooked ... especially considering the exorbitant cost to run two decent (or high end) cards in a dual arrangement.
 

function9

Distinguished
Aug 17, 2002
657
0
18,980
I do agree with Arby. I'd also add that I'm pretty impressed at how single cards (GPUs) are performing, even at the higher resolutions. This review at AnandTech: http://www.anandtech.com/video/showdoc.aspx?i=2821 is supposed to showcase ATI's top CrossFire setup, I know, but you can't help noticing the numbers the single cards are putting up at 1600x1200 and above. Then again, I'm sure it's only a matter of a month or so until we see a game that brings all of those boards to their knees, even at 1280x1024. ;)
 

KaozDragon

Distinguished
Aug 28, 2006
76
0
18,630
My question is: knowing that the graphics load switches from CPU to GPU at 1600x1200, how does the load work at 1680x1050 (my widescreen's native res)?

Yes, SLI/CrossFire is costly and impractical for the general population, and the "gains" at resolutions under 1600x1200 are laughable in terms of cost. My main concern is gaming at 1680x1050: does SLI fall into the "costly but true gains" category or the "you paid $400-600 for 5 fps" category?

From what I understand, 1600x1200 = 1.92 million pixels, and 1680x1050 = 1.764 million pixels. Not too far apart. Has the load shifted yet?

Someone please help me understand this graphical load shift 8O
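For what it's worth, the pixel-count comparison above checks out. Here's a quick back-of-the-envelope sketch (the resolution list is just the ones mentioned in this thread); the rough idea is that GPU fill-rate load scales with total pixels per frame, so 1680x1050 should stress the GPU almost as much as 1600x1200:

```python
# Total pixels per frame for the resolutions discussed in the thread.
resolutions = {
    "1280x1024": (1280, 1024),
    "1600x1200": (1600, 1200),
    "1680x1050": (1680, 1050),
    "1920x1200": (1920, 1200),
}

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.3f} Mpixels")
```

1680x1050 works out to about 92% of the pixels of 1600x1200, so whatever bottleneck behavior you see at 1600x1200 should be a close proxy for 1680x1050.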
 

clue69less

Splendid
Mar 2, 2006
3,622
0
22,780
A dual-GPU solution is a supreme waste of money unless you are gaming at high resolutions and have cash to burn.

Now wait a minute... You just bought an X1950 XTX and you're talking about dual-GPU solutions being a waste of money? Value is not a simple quantity to define in many computing environments, but in the context of your recent purchase, this does sound a bit pot/kettle/black.

Until the technology further matures, it's something better overlooked ... especially considering the exorbitant cost to run two decent (or high end) cards in a dual arrangement.

It all depends on what you're trying to achieve and how you set your priorities. If money really matters most, then quit gaming and go to work instead! Regarding SLI and CrossFire, remember that power requirements and heat management can't be ignored. These configurations are overkill and cost a wad of cash.
 

IcY18

Distinguished
May 1, 2006
1,277
0
19,280
Dual-card solutions will most likely never be necessary; with each generation, the leap in performance usually beats two top-of-the-line cards from the previous generation.

Even though people are buying SLI and CrossFire setups today, anyone on a budget should never consider SLI or CrossFire until adding a second graphics card gets you at least a 90%-100% increase in performance. Once that happens, dual-card solutions won't just be necessary; I'd venture to say they'd be the norm.

But if it never comes to that kind of increase over one card, then I doubt we'll ever see two graphics cards as the norm.
 

function9

Distinguished
Aug 17, 2002
657
0
18,980
My question is: knowing that the graphics load switches from CPU to GPU at 1600x1200, how does the load work at 1680x1050 (my widescreen's native res)?
I always thought it was 1280x1024? Or have things changed? I'd be interested in this as well, because I'm looking at either a 20" WS (1680x1050) or a 24" WS (1920x1200). So far, for reviews, I've been guesstimating from the results at 1600x1200 and 1920x1440 to get an idea of performance.
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
I always thought it was 1280x1024? Or have things changed? I'd be interested in this as well, because I'm looking at either a 20" WS (1680x1050) or a 24" WS (1920x1200). So far, for reviews, I've been guesstimating from the results at 1600x1200 and 1920x1440 to get an idea of performance.
If you are planning on playing all of your games at your native resolution of 1920x1200, I'd suggest getting a dual GPU setup. Two X1900XTs or 7900GTXs will provide a good gaming experience at this resolution.
 

enforcer22

Distinguished
Sep 10, 2006
1,692
0
19,790
Don't waste your time with dual-GPU setups, even for 1920x1200. I currently have a 24" widescreen with that native resolution. I have an X1900 XTX, and I run HL2, BF2, and a number of other games at 8x to 16x FSAA (depending on the max the game allows) with no stutter or slowdown. Dual GPUs are unnecessary. I should mention I also have 3 GB of RAM and a 4800+ X2. As far as I have ever seen in any benchmark, a dual-GPU setup is just something to brag about having; the utility isn't there.
 

KaozDragon

Distinguished
Aug 28, 2006
76
0
18,630
Ya, I know that a single card can perform very well in current games. My new build includes one 7950 GX2. I'm mainly concerned with future-proofing (ya, ya, I know that's impossible; just leaving options open), as in being able to continue gaming at 1680x1050 without compromising much.

Point is, though, I can't seem to find much info regarding the load shifting from CPU to GPU at that specific resolution (I've been out of the loop for 5 years), so if someone has a link, I'd appreciate it so I can read up on it :wink:
 

enforcer22

Distinguished
Sep 10, 2006
1,692
0
19,790
It's not impossible, and with video it's a lot harder than with other components. However, dual GPUs only means your GPU gets outdated in 6 months, twice. The minimal bonus, if any (I have seen games perform worse under an SLI setup), isn't worth spending 2x more for. If you want to "future proof" your system, I suggest spending whatever it takes to get a good motherboard with a lot of upgrade options and building off of that. Make sure it has an SLI option, just in case SLI actually becomes worth getting in the future and actually gives a performance boost comparable to running two cards, instead of what you'd expect from an overclocked single card.

Btw, I also upgraded from an X850 XT PE, which also ran games on my old monitor at 1920x1440. I wasn't able to use all the FSAA options and such, but it ran games at more than playable frame rates. The resolution you want to run at runs on my old 9800 XT at more than playable frame rates, even in most of today's games at 1600x1200, easy. How old is that card? Seriously, though, you do not need SLI for anything.

P.S. As far as whether the video card or the CPU takes the load, it really depends on the engine. Unreal uses a lot of CPU power, while others couldn't care less about the CPU.
 

KaozDragon

Distinguished
Aug 28, 2006
76
0
18,630
:D I'm doing exactly what you're recommending, lol. My setup is a C2D E6600, OCZ 700W, P180B, Corsair XMS2 DDR2-800. Currently I'm just waiting for my 7950 GX2 card and motherboard to be released (590 SLI I.E.). I'll be using just one card ("SLI on a stick") for now, and maybe in a year or two upgrade to quad-core, SLI two 2nd-gen DX10 cards (I don't trust first gens), and Vista after a service pack is released. So ya, it's going to be a long time before I actually go "true" SLI (or quad, depending on future driver/game releases).

Nevertheless, I'm still interested in learning whether the CPU load shifts to the GPU at 1680x1050 or only at 1600x1200 and above. My only question, really 8)

EDIT: The games I'm looking forward to playing with lots of features on are Oblivion and F.E.A.R. I'm also trying to set my system up as overkill for now, in hopes of being able to fully enjoy future games like Crysis and Project Offset without having to change more than 1-2 components.
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
Don't waste your time with dual-GPU setups, even for 1920x1200. I currently have a 24" widescreen with that native resolution. I have an X1900 XTX, and I run HL2, BF2, and a number of other games at 8x to 16x FSAA (depending on the max the game allows) with no stutter or slowdown. Dual GPUs are unnecessary. I should mention I also have 3 GB of RAM and a 4800+ X2. As far as I have ever seen in any benchmark, a dual-GPU setup is just something to brag about having; the utility isn't there.
Interestingly enough, Battlefield 2 doesn't support 1920x1200, and if you force it, the field of view is unchangeable, causing you to actually lose viewing space. Try playing F.E.A.R. or Oblivion at 1920x1200 with 4xAA and you'll see what I mean.
 

enforcer22

Distinguished
Sep 10, 2006
1,692
0
19,790
Interestingly enough, BF2's max resolution is 1600x1200; my monitor is 1920x1200. The FOV "hack" applies to resolutions where the vertical is less than 1200, which mine isn't. I simply removed the resolution lines in the video .con files and added a resolution line to the shortcut. You're right that it does NOT support that resolution, but from testing, as long as your vertical is a minimum of 1200, I have seen no loss in viewing area (I could be wrong, but I have yet to see it). However, any game that supports 1920x1440 would, from what I understand, if you try to run 1920x1200, take the higher res and reduce the vertical, effectively taking away 240 pixels from your view even though it is still rendering them.

As far as F.E.A.R. or Oblivion, yes, I know the two other cards I mentioned would have a horrid time playing them. However, the newest ATI cards (and I'm sure the NVIDIA ones too) can play those games at really high settings with playable frame rates. But the extra, what was it, 5 or 10 frames from the SLI setup IMO isn't worth the price of, what, $500? I'm not trying to bash you or anything; I'm just trying to help him out.

As far as quad core: unless I missed something, Windows XP SP2 is only able to run two CPUs. I know with 2000 you needed 2000 Server to run 4 CPUs. I'm not sure if XP is the same; I didn't really check into XP's CPU support very much, since when it came out there really wasn't much in the way of dual CPUs.

P.S. I don't know if this actually matters, but my monitor supports a 16:10 aspect ratio while running in widescreen, which might also be why it isn't cutting anything off in BF2.
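The aspect-ratio arithmetic behind this can be sketched quickly (just the numbers from the posts above; note 8:5 is the same ratio as 16:10):

```python
from fractions import Fraction

# Reduce a resolution to its aspect ratio in lowest terms, to see why
# 1920x1200 (16:10) differs from 1920x1440 (4:3), and where the
# 240-pixel vertical difference comes from.
def aspect(w, h):
    f = Fraction(w, h)  # Fraction reduces to lowest terms automatically
    return f"{f.numerator}:{f.denominator}"

print(aspect(1920, 1200))  # 8:5, i.e. 16:10 widescreen
print(aspect(1920, 1440))  # 4:3
print(1440 - 1200)         # vertical pixels at stake between the two modes
```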
 

function9

Distinguished
Aug 17, 2002
657
0
18,980
As far as quad core: unless I missed something, Windows XP SP2 is only able to run two CPUs. I know with 2000 you needed 2000 Server to run 4 CPUs. I'm not sure if XP is the same; I didn't really check into XP's CPU support very much, since when it came out there really wasn't much in the way of dual CPUs.
You are correct, but MS defines a CPU/processor as the physical chip you put in the socket. You can have "n" cores per CPU/processor.
 

enforcer22

Distinguished
Sep 10, 2006
1,692
0
19,790
Guess I need to read up more on it; I always just assumed Windows would read a dual core as two separate CPUs and treat it as such. I'm more of a hardware guy; sorry, I'm not the person to ask when it comes to software. If Windows will see a quad core as one CPU and treat it as such, I can't wait till those come out.
 

croc

Distinguished
BANNED
Sep 14, 2005
3,038
1
20,810
I'm looking into getting a new machine and I'm wondering if I should get a motherboard with dual video card capability or just single video card capability. Will I not be able to play the games that come out two years from now with a single video card?

Thanks...

My crystal ball gets a bit cloudy after about 90 days... I think it's fairly safe to say that the AGP interface will be very scarce in 24 months; I'm not sure at all about the PCIe interface. I'm also not sure what CPUs and motherboards/sockets will be available in 24 months, nor what games, OSes, etc., all of which might have an impact on anyone's decision.

I really don't think anyone on here can accurately predict what'll be available in 24 months, either game-wise or hardware-wise. Get what you think is the best bang for your currency at the moment, because in 90 days it will probably be outdated. When it's no longer usable for what you need it for, get whatever's best for you at that time.

'You places your bet, and you takes your chances...'

Just my 2p worth.
 

function9

Distinguished
Aug 17, 2002
657
0
18,980
No worries. Their documents are kind of ambiguous; I'm sure they don't mind getting people to shell out extra cash on an unnecessary OS upgrade. :D
 

ArbY

Distinguished
Aug 17, 2004
346
0
18,780
A dual-GPU solution is a supreme waste of money unless you are gaming at high resolutions and have cash to burn.

Now wait a minute... You just bought an X1950 XTX and you're talking about dual-GPU solutions being a waste of money? Value is not a simple quantity to define in many computing environments, but in the context of your recent purchase, this does sound a bit pot/kettle/black.

In the same post, I later explained my point. The performance gained by a dual-GPU arrangement is far from matched by the price you pay. You mention the X1950 XTX: two of those cards in CrossFire would cost upwards of $950, depending on where you purchased them, and their combined performance would not be double that of a single X1950. It's only at high resolutions that the advantage becomes more obvious, and even then it's an advantage that doesn't justify spending nearly $1,000.

Until the technology further matures, it's something better overlooked ... especially considering the exorbitant cost to run two decent (or high end) cards in a dual arrangement.

It all depends on what you're trying to achieve and how you set your priorities. If money really matters most, then quit gaming and go to work instead! Regarding SLI and CrossFire, remember that power requirements and heat management can't be ignored. These configurations are overkill and cost a wad of cash.

What you say is fair, just as what I said was fair. I was simply giving onimusha2000 some advice and trying my best to explain myself. Your last sentence reflects my original "bottom line" well. :)
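The cost-versus-scaling argument can be put in numbers. This is a hypothetical sketch: the ~$475-per-card figure comes from the thread's ~$950 pair estimate, and the 1.5x CrossFire scaling factor and 60 fps baseline are assumptions for illustration, not measurements:

```python
# Hypothetical frames-per-dollar comparison for one card vs. two in CrossFire.
single_card_price = 475    # ~ $950 / 2, per the thread's estimate
single_card_fps = 60       # assumed baseline frame rate
crossfire_scaling = 1.5    # assumed scaling; well short of the ideal 2.0x

dual_price = 2 * single_card_price
dual_fps = single_card_fps * crossfire_scaling

print(f"single: {single_card_fps / single_card_price:.4f} fps per dollar")
print(f"dual:   {dual_fps / dual_price:.4f} fps per dollar")
# Doubling the price buys only 50% more frames, so fps-per-dollar drops.
```

The exact numbers don't matter much; any scaling factor below 2.0x means the second card delivers fewer frames per dollar than the first.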
 

sirheck

Splendid
Feb 24, 2006
4,659
0
22,810
I would say get a dual-slot mobo, as single-slot boards are rare and there isn't much difference in price. That way you can add another card in the future if you see the need for it.

I have an SLI setup but am currently using only one card; running both cards adds more heat than I like.

And I play at 1024x768 mostly, so I don't see much difference with SLI or not.

I do have a 32-inch HDTV monitor at 1920x1080, but I can't get SLI to work properly with it. :cry:
 

sirheck

Splendid
Feb 24, 2006
4,659
0
22,810
I can get it to work, but I can't see all the icons or the Start menu, and can't see all the icons in games.

I have two Arctic Cooling coolers for the cards. One is on the top card and blows out the back of the case; the bottom card has the stock cooler, which blows air inside the case. The AC cooler won't fit in the bottom slot unless I raise my case and cut a hole in the bottom of it. :x