
To hell with dual graphics cards !!!!!!!!!!!!!!!!!

Last response: in Graphics & Displays
October 28, 2005 4:10:22 AM

I'm annoyed by the direction in which we're heading, guys.

Dual graphics cards. Crossfires, SLIs....

It's the corporate world swooping in on what used to be a fairly grounded community. All of a sudden it's who has dual graphics cards and who hasn't. And with the release of FEAR it seems we won't be able to play with maximum settings on just one high-end graphics card (EVEN though one of these cards alone is worth the price of a standard computer).
So now we need to fork out extra bucks...

We pay extra bucks so that game developers can slack off, write loose code, and then slap on spec requirements that are out of this world. FEAR could have been less demanding if the programmers had done a better job. There's a good reason Half-Life and Call of Duty were so nice to run. They were written well. TIGHT as a hot girl's ass.

Well, I'm not getting involved with these fat cats. I'm not gonna run around trying to keep up anymore. It's a waste of time and money. I ain't gonna pay up unless they throw in alloy wheels and power steering.

I suggest anyone who's thought about going dual refrains, and boycotts the very idea of spending double. You may have the spare cash now, but the very idea of having to buy two graphics cards every time you want to upgrade is ridiculous.

Tom's Hardware should also refrain from entertaining the idea in its reviews.


NOPE, I ain't doing it!

I'm gonna settle for playing previous-generation games now. It's cheaper and easier. I'll play Far Cry and HL2 NOW, and come the end of next year I'll upgrade economically and play FEAR and BF2.
October 28, 2005 4:36:52 AM

It's called progress, or technological advancement if you like. Different strokes for different folks, as always. Donate, like I do, the tech toys you've got to spare or have no use for; that way at least someone less equipped out there might actually get to know of your plight while playing/surfing on it. BTW, F.E.A.R. rocks!
October 28, 2005 5:06:24 AM

Shut up.

Some people are like slinkies....
Not really good for anything but you cant help smile when you see one tumble down the stairs.
October 28, 2005 5:25:34 AM

Exactly!

Seriously the Luddites are Reproducing!

Fear technology, Fear change!

Buy a GD Etch-A-Sketch for Cripes' Sake! :tongue:


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internët account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil: 
October 28, 2005 11:35:12 AM

Technology advancements are always a good thing because you get the "older" parts cheaper.
October 28, 2005 12:43:24 PM

First they start killing AGP, now it's dual PCI-e. It's the end of good times Grape. :wink:

I was tempted to do one of my big game orders. GoGamer has <A HREF="http://www.gogamer.com/cgi-bin/GoGamer.storefront/SESSI..." target="_new">Quake 4 DVD $34.90</A>, Fear 29.90, GTR $29.90.

Then I thought logically. I have yet to install Riddick, UT2004, and a couple of other games I bought during a previous buying spree. All my gaming time goes into BF2. I do want GTR, though, but probably come Nov 17th I'll be racing NFSMW. Q4 would be good for benching, I guess; I was never into any of the others. I was an original UT fan.

Edit: sorry for dragging this thread OT :tongue:


<A HREF="http://service.futuremark.com/compare?3dm05=658042" target="_new">3DMark05</A> <A HREF="http://service.futuremark.com/compare?2k3=3781954" target="_new">3DMark03</A>
<P ID="edit"><FONT SIZE=-1><EM>Edited by Pauldh on 10/28/05 08:45 AM.</EM></FONT></P>
October 28, 2005 2:09:31 PM

Welcome to my world. I have about 7+ unopened boxes of games (KillSwitch, COD [played demo], Matrix, XIII [played demo], NFSU, and a bunch of lesser titles, which have all been on sale for $9.99 or less at some point at FutureShop [Matrix and KillSwitch were $4.99]). I even have FartCry, which will stay opened, and now even installed, yet unplayed, because I'm finally going to finish DOOM3 after watching the movie and saying "I have to put this title to rest!"

It's nice to have a large collection, but it'd also be nice to have the time to play as well.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internët account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil: 
October 28, 2005 3:00:57 PM

While I can identify with your frustration, I don't think your opinion is well thought out.

You can either push the envelope further, or stagnate. And as the envelope gets pushed further, budget solutions for the common man (like the people you're championing) become MUCH more powerful; cards like the 6600GT become a reality... $150 cards that can challenge the $400 cards of only a year and a bit ago.

If tech developed a bit slower, the best available at any price might be something equivalent to the 6600GT; the only difference is that it would cost $500... we'd be spending our $150 on GeForce 5200s.

So how would the poorer folk be the winners if tech developed slower?

________________
<b>Geforce <font color=red>6800 Ultra</b></font color=red>
<b>AthlonXP <font color=red>~3300+</b></font color=red> <i>(Barton 2500+ o/c 412 FSB @ 2266 Mhz)</i>
<b>3dMark05: <font color=red>5,275</b>
October 28, 2005 3:05:43 PM

or <A HREF="http://www.newegg.com/Product/Product.asp?Item=N82E1681..." target="_new">$127</A> after MIR.

*Steam rising*
|<font color=red>(\__/)</font color=red>|
|<font color=red>(='.'=)</font color=red>|
|<font color=red>(")_(")</font color=red>|
~~~~~
BUNNY STEW FOR DINNER!!
October 28, 2005 8:30:41 PM

Who said that eye-candy was for everyone? It's a reward for those who spend too much $$$ on their rigs and expect a return on them.

But I do know what you are talking about. Most of the games released recently would tax the hell out of a 7800 or X1800. It's done by intent, and it's a symbiosis between the hardware and software industries.

<i><font color=red>Only an overclocker can make a computer into a convectional oven.</i></font color=red>
October 29, 2005 1:04:13 AM

First, FEAR is overrated. The levels are boring and feel like everything before, just rearranged. And the multiplayer ain't shiat compared to CS, HL2:DM, BF2, DoD, etcetera, etcetera. And it's f'in ridiculous how the graphics industry has tricked everyone into thinking dual graphics cards are a novelty. Think about it: is it really that great an idea? Let's see, what makes more money: R&D on incredible technology, or getting our lab-rat devotee gamers to grab two of our products, enabling us to cut down the R&D department and make twice as much revenue... Jesus, the graphics card industry must have signed a pact with the devil to pull some shiat like this on consumers.
October 29, 2005 5:01:23 AM

compare the dual setups:

<A HREF="http://graphics.tomshardware.com/graphic/20050926/cross..." target="_new">http://graphics.tomshardware.com/graphic/20050926/cross...</A>

to their single counterparts:

<A HREF="http://graphics.tomshardware.com/graphic/20051006/ati_e..." target="_new">http://graphics.tomshardware.com/graphic/20051006/ati_e...</A>

Unless I missed something, I would guess the dual cards average about 30% more fps on high settings but are actually slower at lower settings. The test rigs seem fairly even in those two reviews. Thanks, Tom's.

You will probably have to buy another $400 card, a new motherboard, a new processor, new LOUD cooling fan(s), a new LOUD power supply... don't forget your new electric bill. I'll pass right now.

<P ID="edit"><FONT SIZE=-1><EM>Edited by picture_perfect on 10/30/05 09:39 AM.</EM></FONT></P>
October 29, 2005 5:39:16 AM

Quote:
All my gaming time goes into BF2.


Yup. I have yet to check out FEAR, Q4, or any of the other new games. I stopped playing GTA:SA and haven't gone back.

<font color=red><b>Long live Dhanity and the minions scouring the depths of Wingdingium!</b>

XxxxX
(='.'=)
(")_(") Bow down before King Bunny
October 30, 2005 1:25:16 AM

Wusy, relax. Discussing whether the future of gaming technology goes dual is actually quite important.
It's a turning point.

And whilst many of you would be happy to put me on the firing line, most are not even considering the ideology behind it. In my opinion, the wool is being pulled over our eyes, and no one is questioning it.

Dresden elaborated on it better than me. It's a lot more profitable to send us down the trail of dual cards than to improve the technology itself. At first, you might be fooled into thinking dual means an improvement in technology. But it ain't. It's not like a dual-core CPU, where technology is employed to improve multitasking (IN THE SAME PRODUCT). It's simply whacking on an extra PCI-E slot and letting the kids go out and buy two retail-priced cards. If you think that is for the better, then by all means, proceed. But don't tell me no one is putting a gun to my head. That is just a lazy reply, guys.

So what are the implications? Well, for one thing, research & development can now rely on twin turbo. Quantity over quality. Game developers can now fall back on ridiculous specs without thinking twice about tweaking performance. AND with QUAD PCI-E slots on the horizon, where does that leave us? Like sheep following the sun. Guess who is laughing.

Look at FEAR. Like Addiarmadar said, a single-card X1800XT (not even released yet) will sweat and bleed to run it on max. Done intentionally? To push consumers into the dual setup? I don't remember the last time a new high-end card was challenged before it was even released. And don't confuse FEAR with 'improving technology'. Right now, game developers can make games that require 10 graphics cards without a problem. It's MAKING a great game within the constraints of the hardware that is their challenge. That is their technology.

Now let's discuss money. Technology improves, and with it better cards are introduced. Keeping up to date has meant upgrading every year, and the cost of upgrading has stayed reasonably the same each time. It has always cost me around $4000 AUD to get a new computer with the latest and greatest gadgets. E.g. an 80GB hard drive cost me $300 in 2002; a 300GB hard drive costs $300 in 2005.

There lies the trend of technology and its consumers. Technology is related to time, not PRICE. We were prepared to pay up every year to stay afloat. And there lies the flaw in this DUAL graphics card concept. We have to pay more each year. WHY!!!!!????! Because you are not paying for technology! You are paying for two of the same product (over the counter). You are paying for quantity.
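The flat-spend argument above can be sketched in a few lines of Python (the drive figures are the post's own anecdotal numbers, not market data):

```python
# Anecdotal figures from the post: the same $300 spend, three years apart.
drives = {
    2002: {"capacity_gb": 80, "price_usd": 300},
    2005: {"capacity_gb": 300, "price_usd": 300},
}

for year in sorted(drives):
    d = drives[year]
    per_gb = d["price_usd"] / d["capacity_gb"]
    print(f"{year}: ${per_gb:.2f}/GB")  # 2002: $3.75/GB, 2005: $1.00/GB
```

Same yearly outlay, roughly a quarter of the cost per gigabyte three years later; the complaint is that the dual-card trend breaks this pattern by doubling the outlay instead.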

So everyone involved is laughing. R&D can decrease spending. You have to increase it.

You can put me on the cross and rightly say you know more about graphics cards than me. But you are not looking at the big picture.

Spare me the insults and the one liners.

<P ID="edit"><FONT SIZE=-1><EM>Edited by ashkon52 on 10/29/05 11:40 PM.</EM></FONT></P>
October 30, 2005 1:54:12 AM

Look at how many people have dual graphic card setups. End of discussion.

Some people are like slinkies....
Not really good for anything but you cant help smile when you see one tumble down the stairs.
October 30, 2005 2:09:28 AM

Action Man,
It won't stay like that if in a couple of months Tom's Hardware starts showing graphs of Far Cry 2, with only dual cards able to meet the requirements of 4xAA, etc.
October 30, 2005 2:36:50 AM

I understand that.

NEED NEED NEED.
Junkies NEED a hit too.
But at what cost?
R&D bumps up the specs again, so that you need QUAD. It's a dead end. Soon enough the wallet runs dry, the dealers get rich (doubling... quadrupling their margins), and you're left with a hangover.

I'm saying it's the wrong direction. What we NEED are SINGLE cards with better output.

Make no mistake, it's gonna happen: the day all of us NEED dual cards. And no one is going to remember the good ol' days of single cards, when technology advancements weren't just about consumers doubling up on their retail purchases. When it was about improving the technology, not doubling it.

It's like a car. What would happen if carmakers started improving their specs by making you buy two engines? Who wins? Not the consumer. The consumer has to fork out the costs, whilst R&D can sit back and not worry about tweaking engines anymore.

At this stage it's not a major concern. Dual cards are for bragging rights. But slowly the market will sway in that direction, and soon enough, if you want to play the latest games, dual cards will be necessary. It won't even be an option.
DOES ANYONE get what I'm saying?
Can anyone see how this direction means consumers have to pay double for the same technology?
<P ID="edit"><FONT SIZE=-1><EM>Edited by ashkon52 on 10/30/05 00:55 AM.</EM></FONT></P>
October 30, 2005 4:03:41 AM

Let me explain it in another way.

When a product is released, its cost is primarily based on labor costs, material costs and research & development costs. Now when you are producing something on a mass level, the labor and material costs of 6600 and 6800 cards are more or less the same. So then why does the 6800 cost more? It's not because the 6800 is made of gold. It's because of the research and development costs.
You would be amazed at the R&D costs. A lot goes into it: hundreds of millions of dollars spent by the manufacturers. The better the technology, the higher the R&D costs. That cost is fairly transferred onto the consumer price at the end of the day. So you end up paying more for the more technologically powerful 6800GT. Fair is fair. In fact, most of your money goes to covering the costs of R&D.

For example, let's say a card costs the consumer $550 to buy.

It costs the manufacturer $50 for materials, $50 for labor,
and $300 for the research and development to invent it.
Thus it costs the manufacturer $400 to make the card.
That means they make a net profit of $150 for kids to get what they want.

Now what if everyone needs TWO cards?
It costs the consumer $1100 (2 x $550).

The manufacturer spends 2 x $50 for materials,
2 x $50 for labor,
BUT still only spends 1 x $300 for research and development to invent it.
Therefore it costs the manufacturer $500 to make two cards for each individual.
Their net profit... a whopping $600 now.

Can you see, if the trend is towards two cards for everyone, how profitable it is for those bastards? How we end up paying more for the same R&D?

That's why I believe there needs to be an immediate stop to this direction.
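The example above boils down to a toy profit model. A minimal Python sketch, using the post's own invented figures (not real manufacturer data; the `net_profit` helper is hypothetical, purely for illustration):

```python
# Toy model: R&D is a one-off cost per design; materials and labor
# scale with each unit sold, so selling two cards per customer
# doubles revenue without adding any R&D cost.
def net_profit(cards_per_customer, price=550, material=50, labor=50, rnd=300):
    revenue = cards_per_customer * price
    costs = cards_per_customer * (material + labor) + rnd  # R&D paid once
    return revenue - costs

print(net_profit(1))  # single-card customer: 150
print(net_profit(2))  # dual-card customer: 600
```

On these made-up numbers, moving a customer from one card to two quadruples the profit on that customer while the R&D bill stays fixed, which is the whole complaint.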
October 30, 2005 4:28:10 AM

And the corporate world knows and understands this. They will force everyone in this direction: bump up the specs on games so that two cards are necessary to play in 4xAA mode. We are already seeing this happen.

You know, one day in the near future, when you're planning an upgrade, you will be running through the costs, and on your checklist you will have TWO Nvidia 9800 cards.
And even though you never thought twice about it before, you will have a moment of realisation. You might even remember me whinging about it, all those years ago. You'll think back, wondering about the days when it didn't cost an arm and a leg to stay afloat.

I'm getting a headache.
October 30, 2005 6:35:57 AM

Sigh. That won't happen. Look at the Steam hardware survey: most people have sh!tty cards because they're cheap.

Some people are like slinkies....
Not really good for anything but you cant help smile when you see one tumble down the stairs.
October 30, 2005 6:42:54 AM

Quote:
Now when you are producing something on a mass level, labor and material costs of 6600 and 6800 cards are more or less the same.

Wrong. The 6800 is much more expensive to make.

Quote:
In fact most of your money is spent on the costs of R&D.

No, most of the money goes into materials, like silicon.

R&D isn't that expensive, with architectures being used for more than one gen; the R300 is still going. Obviously it's changed a lot with the R520, but it's still based on it.

Why am I even bothering with this?

Some people are like slinkies....
Not really good for anything but you cant help smile when you see one tumble down the stairs.
October 30, 2005 7:21:06 AM

Quote:
the R300 is still going obviously its changed alot with the R520 but its still based on it.

That statement is just plain wrong; the R520 is not based on the R300. The R520 is a completely new design, and is based on the R300 only in the way that it would equally be based on the Radeon 1.
Even if it weren't, each additional feature and each switch of process requires additional R&D investment. The R520 had both of those factors, the most troubling being the move to 90nm.

And while parts are the most expensive part of any card, the top cards are all about marketing, because they are trying to sell at multiples of cost to cover R&D. The shorter the life of a card, the less opportunity to recoup those costs. Supposedly with the R520 it was over $300 million just for that one chip (not its offspring). And while many progeny come from these initial cards, each of them carries a large R&D price tag too: for every feature like extra pipes (R9800XT -> X800XT), a change of process (X800XT -> X800XL), or moving from PS2.0 to PS2.0B support.

It may seem trivial to some, but even just the move to low-k dielectrics in the R9600XT probably cost them an amount of money that would seem foolish to us for such an insignificant move.

And just to say it again, the R520 is no more 'based' on the R300 than the G70 is 'based' on the NV30.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internët account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil: 
October 30, 2005 8:13:49 AM

Quote:
That statement is just plain wrong

Hmmm, perhaps 'based' isn't the word I was looking for; 'borrows' would be better. Too tired.

Quote:
The R520 had both of those factors inlcuding the most troubling being the move to 90nm.

Well, the move to 90nm itself wasn't so troublesome; it was the soft-ground problem that was. Xenos/R500/C1 didn't have many, if any, problems on 90nm, so yeah.

Quote:
The R520 is a completely new design

Not really. It borrows a lot of things from Xenos and the R420 and implements some new things. I guess it depends on how one defines 'completely new'.

Quote:
Supposedly with the R520 it was over $300million just for that one chip (not it's offspring).

About that, but once they had it going it wouldn't cost that much more to make the RV530 and RV515.

Quote:
And just to say it again, the R520 is no more 'based' on the R300 than the G70 is 'based' on the NV30.

Jeez, do you really need to throw in the 'based'?

Some people are like slinkies....
Not really good for anything but you cant help smile when you see one tumble down the stairs.
October 30, 2005 8:28:53 AM

Quote:
Xenos/R500/C1 didn't have many if any problems on 90nm so yeah.

Xenos has 3 benefits: it came out after they'd already spun R520s, it's almost half the transistors, and it doesn't need to reach any speed target, since it exists in a vacuum with no competition to beat for the spec. Also, ATI only had to spin the A01 silicon as a proof of concept; everything else is in M$'s hands.
So the move to 90nm for the R520 was R&D for the Xenos, and far more of an issue.

Quote:
Not really. It borrows alot of things from Xenos



Very little. Arguably it borrows less from the Xenos than it does from the NV30 & NV40.

Quote:
and the R420 and implements some new things. I guess it depends on how one defines completely new.

Yes, if you want to be that general in your terms, then it's really just an extension of the original 3D Rage, like I said before, although the cutoff was a little earlier in my previous example.

Quote:
About that but once they had it going it wouldnt cost that much more to make the RV530 and RV515.

Actually, the initial tape-outs alone would cost a few million, and the designs themselves would likely cost that much. While it's a fraction of the R520's cost, it's still far from peanuts.

Quote:
Jez do you really need to throw in the 'based'?

Well, considering your statement that <i>"it depends on how one defines completely new"</i>, I'd say 'based' was quite appropriate, and in retrospect a pre-emptive statement of that very train of thought.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internët account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil: 
October 30, 2005 8:57:37 AM

Quote:
it came out after they'd already spun R520s


Really? I thought Xenos was a couple of months before it. Hmmm, oh well.

Quote:
it's almost half the transistors


70% actually.

Quote:
and it doesn't need to reach any speed targets since it exists in a vaccum there is no competition to beat for the spec.


PS3. Plus, since the ALU arrays change state per clock, a good clock speed is necessary.

Quote:
Also Ati only had to spin the A01 silicon as a proof of concept, everything else is in M$' hands.


Don't ATI still handle it? I'm pretty sure they still have people working on the Xenos silicon and whatnot.

Quote:
Very little.


The complex scheduler and decoupled texture units; that's stuff from Xenos.

Quote:
Yes, if you want to be so general in terms,


Too slack.

Quote:
Actually the initial tape outs alone would cost a few million, and the designs themselves would likely cost that mcuh. While it's a fraction of the R520's cost, it's still far from peanuts.


Indeed, I'm too slack to type that up.

Quote:
Well considering your statement about "it depends on how one defines completely new.", then I'd say 'based' was quite appropriate, and in retrospect a pre-emptive statement of that very train of thought.


Meh.

Some people are like slinkies....
Not really good for anything but you cant help smile when you see one tumble down the stairs.
October 30, 2005 9:40:02 AM

Forget Nvidia's SLI; what's wrong with ATI's CrossFire? When your first top-of-the-range card goes weak, you buy the next gen, voila: you now have a better system, with your old card and new card working together. That's not a bad thing now, is it? It saves you money in the long run, plus you will be hitting the top spec in games.

As others have already said, software/game designers have to push the envelope to match the industry's top-spec technology.

Tell me, what's the point of sticking everything on max and still getting 180 FPS in a game when you have an X1800/7800GTX and a 19-inch TFT which refreshes at 60Hz?

When you are playing a game you want the most out of your spanking-new TOTR card; you don't want to max all the settings and then see that if you'd bought a 6600GT you could have achieved the same. Wouldn't that piss you off more?

It's good that game developers are pushing the mark. 3-4 years ago they just could not match TOTR cards, and it pissed me off that I could have bought a lesser card and still got the same experience.

<font color=purple> Dont expect Miracles humans made them. The blind are easily led by the blind </font color=purple>
October 30, 2005 12:25:53 PM

Sorry, I forgot to put in my [url's]. SLI is a fad, but at least it's now an option.

<font color=green>If you wanna be worshipped then go to india and moo.</font color=green><P ID="edit"><FONT SIZE=-1><EM>Edited by picture_perfect on 10/30/05 09:42 AM.</EM></FONT></P>
October 30, 2005 3:34:32 PM

Quote:
70% actually.

LOL! OK, 72%, but that's close enough to half for this discussion.

Quote:
PS3.



The RSX isn't even spec'd out to us yet, and I doubt it was much of anything specific just over a year ago. The Xenos has been in development almost as long as the R520 series (3 vs 3+ years). At that time it was still unknown whom Sony was going to choose for their part; even the announcement of nV+Sony came long after. And unlike the R500/C1/Xenos, you can see that the RSX appears to be nothing more than a 90nm shrink of one of their desktop parts.

Quote:
Plus since the ALU arrays change state per clock a good clock speed is necessary.


Yeah, but nowhere near that of the desktop part, which like I said has a lot more transistors to warm up. The Xenos needs to hit its target to function as expected; the R520 needs to hit its target to sell more than the competition. And while you mention the PS3: unlike the desktop market, the graphics core is wrapped inside the rest of the rig, which hides the performance of just the one product, so there is no way to compare the RSX to the C1 straight up, and thus not the same pressure.

Quote:
Don't ATI still handle it? I'm pretty sure they still people working on the xenos silicon and whatnot.

The level of commitment now is like OnStar with GM: ATi will give M$ some support and be sure to take feedback about design issues and such, but production is specifically out of their hands. M$ is contracting TSMC (or whomever they choose if TSMC gets problematic) to produce the chip, not ATi.

Quote:
Complex scheduler and decoupled texture units, they're stuff from xenos.

If you look at the design, it looks more like it's the other way around, with the R520 design flowing more along the line of development, and Xenos taking it as an extension, at least for the scheduler/Ultra-threader.
In any case they both look like results of the dropped R400 project, and I'd say they both owe more to that than either owes to the other.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internët account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil: 
October 30, 2005 7:47:18 PM

Quote:
The RS isn't even spec'd out yet to us and I doubt it was much of anything specific just over a year ago. The Xenos has been in development almost as long as the R520 series (3 vs 3+ years). At that time it was still unknown who Sony was going to choose for their part, even the announcement of nV+Sony came long after, and unlike the R500/C1/Xenos you can see that the RS appears to be nothing more than a 90nm shrink one of of their desktop parts.


Well, the Sony 90nm process is a lot better than TSMC's; they've got the RSX up to 550MHz, which is a pretty impressive feat. Obviously they were expecting Sony to use their own part. Also, it has to compete with and complement the R520, so yeah.

Quote:
The level of commitment now is like OnStar with GM, ATi will give M$ some support and be sure to take feedback about design issues and such, but the production is specifically out of their hands M$ is contracting TSMC (or whomever they chose if TSMC gets problematic) to produce the chip, not ATi.


The 65nm process from TSMC isn't that far away, so wouldn't ATI's people be handling that rather than TSMC? The 65nm process can also handle eDRAM, so I would imagine they'd merge the parent and daughter dies into one.

Quote:
If you look at the design it looks more like it's the other way around with the R520 design flowing more along the line of development, and Xenos taking it as an extension at least for the scheduler/Ultra-threader.
In any case they both look like results of the dropped R400 project, and I'd say they both owe more to that than either owes to each other.


Yeah. The scheduler/ultra-threader seems like a unified-shader sort of thing, and it seems like it's meant to bridge the gap between it and the R600 and to increase efficiency, or something like that.

The threader would also have to be modified a fair bit: with the R520 being non-unified and the R400 being unified, the R400's scheduler would be more suited to Xenos, sort of.

I hope some of that makes sense. Good points, though.

Some people are like slinkies....
Not really good for anything but you cant help smile when you see one tumble down the stairs.
October 31, 2005 12:19:43 AM

In reply to Wusy:
"6800 were the first to introduce SLi, then came 7800 (which also comes with SLi). Comparing the performance of a single card 6800GT vs. 7800GT, do you think nV has slacked off???"

You're right. If I responded to that honestly, I would open the gates of hell. The 7800 is a good step up. But I will say the X1800XT was a disappointment. I expected more pipelines, even if the new pipelines have more grunt.

I'm just thinking: if the new generation's cards can only pull off so much, the games that follow should be restricted to those specs, and not spill over into dual-setup requirements. I'm only extrapolating from what I see with FEAR to conclude that games in 3-6 months' time will need more than just one card to play.

In response to Action Man:
"That won't happen. Look at the Steam hardware survey: most people have sh!tty cards because they're cheap."

But that is an oversimplification of the truth. HL2 was marketed to the general gamer, so I wouldn't be surprised by that. In fact, high-end gamers make up only a small portion of the gamer pool. The dual card is targeting these people, the ones who have been willing to buy expensive cards in the past. The marketing people know this, and know they can push these people around. Because when you're really passionate about something, you're less concerned with the expenses. You've paid for one, you'll pay for another.

"Wrong. The 6800 is much more expensive to make..... No, most of the money goes into materials, like silicon."

I don't know the exact breakdown of manufacturers' expenses with regard to graphics cards. That was an example. Based on my mum's experience in the R&D department for washing machines, I can only assume things are similar across the board. Especially when you're dealing with cutting-edge technology (that changes on a yearly basis), the R&D costs would be massive. I would be really interested in the actual figures for a graphics card, though.

In response to RX8:
"...when your first top-of-the-range card goes weak, you buy the next gen, voila: you now have a better system, with your old card and new card working together... it saves you money in the long run, plus you will be hitting the top spec in games"

The advantage of buying a second card rather than a new-gen card is a bit debatable.

I understand what you mean: not getting the maximum out of your brand-new card pisses me off too, but I'm thinking about this in the scope of things to come. I can live with the idea of people having the option of going dual just to get some bragging rights. What I can't live with is if, in two years' time, I am forced to buy TWO cards to play the latest games at maximum resolution.
October 31, 2005 12:50:59 PM

what a soapbox


Oh, btw, thanks for clearing up your sig, Wusy. It didn't make much sense before.

"Who is General Failure, and why is he reading my drive?"
P4 3.0C HT, Intel D865GBF, 1GB Crucial PC3200 DDR, 2x WD 36GB Raptor 10kRPM, BBA Radeon X800XT PE, SB Audigy, Hauppage WinTV
October 31, 2005 1:08:47 PM

Game publishers will always target games at the average Joe-<3 Dell-Schmoe. That's where most of the money is. Any game-maker that wants people to actually play the game will spend tons of effort and money optimising for the people who don't even know what SLI is.

Not every game will run on average hardware, but most of the good ones will.

And I'm not talking about maximum settings. Maximum settings are there as an incentive for the hard-core gamer, and can be set so high that no game will run on any PC.

I'll put it this way: if the makers lowered the possible maximum but made the game playable at those settings, would that make you happy?



I’m an emo kid, non-conforming as can be
You’d be non-conforming too if you looked just like me.
November 1, 2005 6:15:19 PM

Wusy hit the nail on the head: SLi is really just for peeps who have the money to have the latest and greatest. But everyone else in the fu*!ing world is drooling over SLi/Xfire. And there is no question about the performance gains. But this is a discussion about the tech of graphics cards, and it should be obvious that a dual-card setup is a pussy a$$ solution. Desperation for money comes to mind.

And it seems that games are almost starting to surpass what cards can do, combined with high resolutions. The biggest ATI/nVidia cards right now are pretty much creamed corn when dealing with FEAR at the highest resolutions, and will probably struggle with COD2 too. And what's up with only a "30%" increase in framerates with dual-card solutions????? That's kinda weak, considering all the power under the hood.

Oh, and one last thing, in response to an old comment by Action_Man: when you said that production costs far exceed R&D costs... that's BS, man. And who told you silicon was expensive????? It's one of the most abundant resources on this planet. It's not the materials, it's the manufacturing that can get expensive. And that depends only on how many units are produced.
November 1, 2005 8:35:37 PM

It makes you wonder if part of the problem IS sloppy coding. Reading the article that I linked in Cleeve's post on Havok FX, it seems that they want to put even more stress on the GPU and would love to get more GPU access via SLI or CF.

*Steam rising*
|<font color=red>(\__/)</font color=red>|
|<font color=red>(='.'=)</font color=red>|
|<font color=red>(")_(")</font color=red>|
~~~~~
BUNNY STEW FOR DINNER!!
November 1, 2005 9:01:06 PM

Quote:
And who told you silcon was expensive?????

<A HREF="http://www.theinquirer.net/?article=22451" target="_new">Link 1</A>
<A HREF="http://www.beyond3d.com/forum/showthread.php?t=22624&pa..." target="_new">Link 2</A>

It's making the chip that's expensive. Poorly worded stuff from me. Don't post whilst tired; it could kill you... or is that driving?

But it is $60 for a 5750 and $150 for an R520. And I never said "far exceeds", just "exceeds".

Also, space your posts out so they're readable.

Some people are like slinkies....
Not really good for anything, but you can't help smiling when you see one tumble down the stairs. (Edited by Action_Man on 11/01/05 06:02 PM.)
November 2, 2005 12:58:35 AM

I really miss the days when my previous R9600Pro could run anything at high settings plus a little AA/AF.

But now...

AMD A64 3200+ Newcastle @ 2.4 Ghz
MSI K8N Neo Platinum
1GB Corsair DDR400 ValueSelect (CL2.5) @ DDR440
GeCube R9800XT 128mb @ C400Mhz/M700Mhz
November 2, 2005 3:52:08 AM

Good link, Action_Man!

But I think the article helps support my argument. My ratios could have been more accurate, but the figure of hundreds of millions of dollars spent on R&D ($400 million) was right. From the figures, again, you can see that all the other costs are fixed. They are inevitable costs, but R&D is the variable: by adjusting their R&D costs, profits can be increased.
No doubt, dual cards can help reduce R&D costs.
Will the manufacturers then be inclined to push the market in that direction? You bet your ass!

Finally some people are seeing my point of view. Irrespective of what the advantages/disadvantages are (at this stage) with dual cards, the future of card technology is bleak if we go down this road.

I reckon TomsHardware should do an article on this matter. Let the manufacturers know there is concern within the community. I DON'T WANT TO GO THERE.
DO YOU?
Join the army now! (insert pic of man pointing at you)
November 2, 2005 2:14:01 PM

I wonder how these companies are making money these days.

Say they spend $400,000,000 on R&D alone, and the average card price is $500.
They must sell 800,000 cards to break EVEN on the R&D, and that doesn't include any additional costs for parts and labour, let alone a profit margin.

Hypothetically, we'll say that $300 of the $500 is the cost of goods sold, so in reality only $200 of each sale goes toward R&D.

Now 2 million cards must be sold to break even, and that still doesn't include profit. To actually earn a profit, I'm estimating that NV or even ATI must sell at least 2.6 million of their top-line cards.
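The back-of-the-envelope math above can be sketched as a tiny calculation. To be clear, the $400M R&D spend, $500 price, and $300 cost of goods sold are the thread's hypothetical numbers, not actual ATI/nVidia figures:

```python
# Break-even arithmetic from the post above.
# All dollar figures are the thread's hypotheticals, not real company data.

def break_even_units(rd_cost, contribution_per_card):
    """Cards that must be sold before per-card contribution covers R&D."""
    return rd_cost / contribution_per_card

rd = 400_000_000   # assumed R&D spend ($)
price = 500        # assumed average card price ($)
cogs = 300         # assumed cost of goods sold per card ($)

# If the whole sale price counted against R&D:
print(break_even_units(rd, price))         # 800,000 cards

# With only the $200 left after cost of goods sold:
print(break_even_units(rd, price - cogs))  # 2,000,000 cards
```

The 2.6 million figure in the post is then just the break-even count plus an estimated margin on top.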
November 2, 2005 4:54:45 PM

Or they sell a lot more low-to-mid-range OEM cards with higher production/profit margins and vastly reduced R&D cost (if any at all).



November 2, 2005 5:59:45 PM

Good, we are in agreement then: it's not the silicon, it's the actual production that can be expensive.

Sorry about the congested threads I put up, I'm new to the game.

You know what would be great is if our friends at ATI and nVidia would lower the prices of the monsters they are putting out, so more people could get an SLi/Xfire setup. With the dual-card option out there, they stand the chance of making many more sales, if they'd only loosen their stance on pricing.

I'm only dreaming, though. I think the number of people who demand a dual-card setup is incredibly marginal, so it probably isn't worth their time to market this stuff. Also, most people have too much common sense to waste that much money.
November 2, 2005 6:28:58 PM

I didn't read this whole thing, but let me ask you this...

If dual cards are a bad idea to produce, even if the companies are still making good single cards... does that apply to all things?

Should twin-turbo cars only have a single turbo?
Should light fixtures with multiple bulbs only have one? (Should a fixture with 5 60W bulbs now have 1 300W bulb?)
Should people like myself, with two monitors, have one GIANT monitor?

You do realize none of the things I listed are NEEDED as part of the service they provide. I could have a single turbo, I could have one light in my dining room and not 5, and I could have only one monitor.

But guess what... I made a choice to do it this way, and I am glad that companies have provided options. THEY PROVIDE OPTIONS! They aren't like M$, dictating what we have to do. If they didn't make the options, people like you would be up in arms about companies not caring about the end user and not making what people want.

I am not a fan of SLI at all... but if I had an unlimited budget from the lotto, it would be hard not to buy one, because it does help. Just not enough for me to buy without some insane amount of money.

_____________________________
Chaintech VNF3-250/A64 2800+/1GB(512x2) OCZ VX GOLD 2-2-2-5/BFG 6800GT/Thermaltake 420W/WD 200GB/Maxtor 300GB/Soon to include Z-5500...any ideas on which SC??
November 2, 2005 11:20:30 PM

Quote:
If dual cards are a bad idea to produce, even if the companies are still making good single cards... does that apply to all things?

Nope, because things aren't linear like that, just like the benefits of 2 cards are not linear either. Sometimes it's good, sometimes it's useless or worse (regardless of which dual-card system you choose).

Quote:
Twin turbo in a car should only be single turbo?

Twin turbos try to counteract a problem with a single turbo (turbo lag). You might have an argument for two larger turbos instead of the three in some setups, but even then it's specific to what they are trying to achieve. For plain 1 versus 2, it's not a similar argument without other add-ons.

Quote:
Light fixtures with mutiple bulbs should only have one?(a fixture with 5 60W bulbs should now have 1 300W bulb?)

No, because the effect is different: most multiple-bulb setups diffuse the light over the space, trying to achieve an equal level of brightness around the area. A single bright spot doesn't achieve the same effect.

While these arguments can be applied to those examples, it's like someone from the silly wagon trying to string a dozen disposable flashlights together to achieve the same effect as the lowest-powered of spotlights.

Quote:
You do realize none of the things I listed are NEEDED as part of the service they provide,

For their purpose it is needed; for the arguments often used to push SLi, SLi itself is irrelevant to achieving that goal.

SLi, when argued on the st00pid and base 3Dmark, e-penis, FPS/res level, doesn't have this differentiation. Argue for it based on additional features and then it'd be similar; I respect people's arguments that follow that logic, but not the ones far too many SLi fans promote. Especially the arguments they use that don't actually work at the time they make the statement.

Quote:
but guess what...i made a choice to do it this way, and i am glad that companies have provided options. THEY PROVIDE OPTIONS!

Your choice doesn't make it a wise choice; it makes it your choice. Someone else's rant about it may not be right or wrong, but they might get far more agreement from others on their position. However, that really shouldn't have any influence on yours.

Quote:
They arn't like M$ dictating what we have to do,

BS! Both ATi and nVidia dictate what you can and can't do with SLi/Crossfire. What the heck are you talking about?!?

Quote:
if they didn't make the options people like you would be up in arms about companies not caring about the end user and not making what people want.

Sure they would, like with the FX series and the X700 AGPs, getting SS-AA on Windows while it's been on Apple the whole time, the floptimizations and FUD from both. Did either listen to the consumer to any great extent?

Quote:
I am not a fan of SLI at all....but if i had an unlimited budget from the lotto, it would be hard not to buy one because it does help. Just not enough for me to buy without some insane amount of money.

I agree with that philosophy to a great extent, depending on the areas where people expect a benefit. But the lotto doesn't usually involve 'wise' choices, and I think that's the issue most people are having. Personally, my biggest issue is the people who said it would be better to focus on upgrading to 2 GF6800GTs instead of selling their GF6800GT and buying a new GF7800GTX for the same or less. Like I've always said, there's a place for SLi; it is primarily a niche product solution. However, should we encounter process-shrink barriers sooner rather than later, then perhaps it will become a far more common and necessary thing in order to get closer to the photo-realistic rendering we are all looking forward to.

Just my view from the sensible, "I like those cool n' quiet solutions" crowd. Of course, like I've always said, if Oblivion comes out and requires an SLi'ed or Crossfired rig in order to play it, then that's what I'll do; just don't expect me to smile while signing the Visa bill. :wink:


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internët account)</i> ! - <A HREF="http://www.redgreen.com/" target="_new"><font color=green>RED </font color=green> <font color=red> GREEN</font color=red></A> GA to SK :evil: 
November 3, 2005 2:34:33 AM

Quote:
because things aren't linear like that, just like the benifits of 2 cards are not linear either. Sometimes it's good sometimes it's useless or worse (regardless of which dual-card system you choose).

Isn't that exactly the point I was trying to make by saying that? I was saying that a lot of the time 2 is better than 1, but not twice as good.

Quote:
Twin Turbos tries to counteract a problem with single turbo (turbo-lag). You might have an argument for larger 2 twins instead of the 3 turbos in some setups, however even then it's specific to what they are trying to achieve. But for that 1 versus 2 not a similar argument without other add-ons.

Exactly... twin turbo is better than single turbo, but not twice as good.

Quote:
No, becuase the effect is different, most multiple bulb setups diffuse the light over the space trying to achieve an equal level of brightness around the area, a single bright spot doesn't achieve the same effect.

Point taken that it's not a great example, but the point is that a 300W bulb with the proper housing could provide the same lighting solution. (Granted, not easily and probably not as well, but if you wanted it to be, you could make it close.)

Quote:
BS! Both ATi and nVidia dictate what you can and can't do with SLi/Crossfire, what the hack are you talking about?!?

True, they dictate what you can and can't do with SLI and CF, but they don't care what games you play. They don't care if you want to play 16x12 with no AA or 1024x768 with max settings. They don't care... I wasn't trying to say they don't limit you, but I am saying that they are providing an option. If you want a certain % increase for a high cost... they can do that. It's an OPTION; they aren't forcing it on you. (This is a HUGE topic, and it's hard to explain something without writing a book.)

Quote:
Sure they would, like in the FX series and the X700AGPs, getting SS-AA on Windows while it's been on Apple the whole time, the floptimizations and FUD from both, did either listen to the consumer to any great extent?

You proved my point slightly: if they didn't make an SLI/CF option, people would have one more thing to add to that list. Granted, they don't do everything we want... and frankly we want too much. Yes, being dishonest is bad, no, it's HORRIBLE (floptimizations), but we ask the companies to do everything at higher quality, with faster production times, all at equal or lower cost... we ask for too much in general... although they shouldn't be lying bastards like they are.



Lol, yes, lotto winners are usually never wise about purchases. But I wouldn't buy a Bentley and a house and all that crap. I would buy reasonable things... Maybe an SLI system isn't the best sweet spot for price/performance. But it has a small market, and I think it's nice to have that option for those who can afford it.

You are right, people say SLi is so great for stupid-ass reasons. But that's like all things. I actually know someone who, when asked why he was bragging about his gaming machine having an Intel chip, said, "Well, I don't like AMD. They have pretty good chips, and they might beat my chip in all the tests, but to me the Intel feels faster. And a lot of those measuring systems are bogus."

We have all run into that person who says stupid stuff like that. But just because SLi isn't great doesn't make it terrible.



Side note: if you have a 6800GT and you are wondering if you should SLi... shoot yourself in the face. Sell your card and buy a 7800GTX. It's just plain foolish to SLi in that case, unless you were given a free 6800GT to couple with it.

November 3, 2005 6:43:27 PM

Pickxx, you have a point, in some way. But that isn't what I'm arguing. Everyone here knows two is better than one; I don't even know why you bothered with so many damn examples. It's a fairly easy concept to grasp.

2>1.

OK, but here's my problem with the dual-card solution... IT'S SPOILING THE MARKETPLACE, SCREWING WITH MY WALLET, AND DEMANDING THAT I HAVE IT IN ORDER TO PLAY A DAMN GAME. If companies want to showcase graphics in their games, code the game so more kids can see how good the game looks. Look at HL2: a beautiful game that is available to most gamers. I guarantee that this dual-card scenario will push some gamers to jump on the console bandwagon and abandon the PC realm.

Do ATI or nVidia care? Lololol, they make products for consoles too. So in the end, WE = SCREWED.
November 3, 2005 8:05:33 PM

You don't have to have SLi/CF for any game. Maybe if you want to turn on some sparkles, bells, or whistles you will have to...

HL2 is a great example showing that it's not the GPU makers who are at fault. HL2 is coded so it scales very nicely for people who can't afford an upgrade.

I don't think SLI is spoiling the marketplace. It's an option; if you don't want it... don't buy it.

If you are mad at someone, ask yourself this:
Why does HL2 scale down well, even to 9800 Pros, when games like FEAR and BF2 don't?

November 3, 2005 8:12:45 PM

I just don't see how this is a chip-maker issue. I see it as a game-maker issue.

Games that require crazy systems have always existed and always will, but they remain mostly marginal things that people look at, go 'wow', and forget. SLi is a good way to take money from people with lots of it, but it doesn't affect the game industry as much as I think you think it does.

For instance, CIV came out. The recommended settings are laughably low compared to FEAR, but I predict, using my super powers to see the future, that CIV will be one of the most popular games in years. Furthermore, if CIV required a 10^10OMFGXTHAXXOR card to play, it would be vastly less popular and a company would be out millions of dollars.

Most people don't have the time to waste on badly written graphics demos. And software companies that offer nothing else disappear quickly.

Or are you one of those gamers that simply <i>have</i> to play every single one of the hundreds of badly coded games that come out every year? Or does it rankle that a game exists that won't run smoothly on your system? Does an e-penis really matter that much?





November 3, 2005 8:16:15 PM

HL2 played pretty well on my old 4200ti...



November 3, 2005 11:19:37 PM

Just a quick comment: not that this isn't for the most part correct, it just ignores some of our points.

Quote:
Everyone here knows two is better than one. .... 2>1.

Not always, not if the performance difference doesn't come with it. Because while you take on the power consumption, heat generation, and additional noise of a second card, you aren't always guaranteed a benefit. Also, for those people gaming on 1280x1024 monitors it's seriously a huge waste of money, but they don't notice it: while they weren't experiencing any game issues before they got SLi/X-fire (other than a driver bug or something), afterwards they can see that their Bungholiomarks went up, so obviously it must mean their system is now ubber-elite and able to rock the Kazbah in WoW.

For many situations it will be better, but sometimes it's just a waste. I like it when people can defend their purchase, because there are a hell of a lot of reasons to get SLi and X-fire; the problem is that those are rarely the people we encounter defending/praising/gloating over SLi rigs. The ones who usually have a good reason to get SLi are secure enough not to try to sell other people on it to justify their purchase, and they are also secure enough to say "ahh, whatever" to anyone who may mention it because they have it in their sig or are asking a related question or something.

People who can defend their purchases usually don't feel the need.


November 4, 2005 2:39:54 AM

Pickxx,

Quote:
You don't have to have SLi/CF for any game. Maybe if you want to turn on some sparkles, bells, or whistles you will have to...

I planned not to reply, in order to avoid repeating myself. But here goes.

Your opinions about dual cards are limited by your lack of vision into the future. Sorry, mate. Whilst it's certainly true that dual cards provide a mere option (or niche avenue) at this stage, that likely won't be true in the months/years to come. You haven't looked one day ahead, mate! All you see is now.

With the profitability of dual cards, manufacturers will steer us in the direction of mandatory dual setups. They will bump up the specs of loosely coded games to require dual cards in order to play them at a decent framerate. And the only option you'll be left with is... do I buy two cards, or just forget it?

You don't have as many options as you think. Your only real option is whether you want it in blue, green, or red.


Quote:
I don't think SLI is spoiling the marketplace. It's an option; if you don't want it... don't buy it.

There's always someone who makes this statement... don't want it... don't buy it.
Ignorance is bliss.

And... last time I checked, TWIN TURBO doesn't mean you have to buy two single turbos straight out of the box.


DUAL GRAPHICS CARDS =/= BETTER TECHNOLOGY
November 4, 2005 3:18:55 AM

If, in the future, they solder two cards together and sell them as one technology (like dual-core chips), at roughly the same price as one high-end card now (+/- a few hundred dollars), then my take on the matter would come crumbling down.
... remember, a dual-core chip is roughly the same price as the preceding single-core chip sold last year.

The only thing we can hope for is that next-generation consoles (with single cards) will hold back the PC dual setup. But that's never stopped them before.

DUAL GRAPHICS CARDS =/= BETTER TECHNOLOGY
!