Nvidia Readies Dual-Chip, Single-Chip 9800 GX2, 9800 GTX and 9800 GT

January 4, 2008 12:26:11 PM

Wow ATI is really in for it bad!

Nvidia Readies Dual-Chip, Single-Chip High-Performance Graphics Cards.
Nvidia GeForce 9800 GX2, GeForce 9800 GTX, GeForce 9800 GT Approaching

http://www.xbitlabs.com/news/video/display/200801032235...

http://enthusiast.hardocp.com/article.html?art=MTQ0MCwx...

Category: Video

by Anton Shilov

[ 01/03/2008 | 10:34 PM ]


Nvidia Corp., the world’s largest designer of discrete graphics processing units (GPUs), reportedly plans to update its lineup of expensive graphics cards with at least two new offerings later in the quarter. The most powerful of the novelties will carry two graphics chips, whereas the others will feature single-chip designs.


The new top-of-the-range graphics card from Nvidia is called the GeForce 9800 GX2 and is based on two as-yet-unannounced 65nm graphics chips, each with 128 unified shader processors inside. The board, according to the [H]ard|OCP website, will be about 30% faster than the Nvidia GeForce 8800 Ultra and will enable 4-way multi-GPU configurations. The novelty will have 256 stream processors in total, but will rely on driver support to demonstrate its potential, just like any multi-GPU solution.

The less expensive solution – the Nvidia GeForce 9800 GTX – is projected to be released in late February or early March and is claimed to be based on one GPU. The new 9800 GTX will replace the existing GeForce 8800 GTX and thus should offer performance on par with the GeForce 8800 Ultra and support 3-way SLI configurations. In addition, there will be a least expensive version of the GeForce 9 series, called the GeForce 9800 GT, due in March or April.

Based on information reported earlier, Nvidia GeForce 9800-series graphics processors will support the DirectX 10.1 feature set along with a powerful video encoding engine and post-processor.

Even though the new GeForce 9800 GX2 is projected to offer performance only about 30% higher than the Nvidia GeForce 8800 Ultra, and the new GeForce 9800 GTX should outperform the 8800 GTX by a similar margin, the new lineup represents a great threat to the ATI Radeon HD 3870 X2.

At present Nvidia sells the GeForce 8800 Ultra for $849 in retail, whereas the GeForce 8800 GTX costs about $549 - $649. Provided that the new solution from the graphics product group of Advanced Micro Devices offers the performance of the GeForce 8800 Ultra, AMD’s new dual-chip graphics card will have to cost the same amount of money as the new GeForce 9800 GTX. Unfortunately, dual-chip configurations offer performance advantages over a single-chip ATI Radeon HD 3870 only in cases when the driver can take advantage of multi-GPU ATI CrossFireX technology. Therefore, in all other cases the GeForce 9800 GTX will be faster than ATI’s dual-chip solution, making it very hard for the ATI Radeon HD 3870 X2 to find its place on the market.

Nvidia did not comment on the news-story.
January 4, 2008 12:43:07 PM

Is it just me or does that performance level suck? 30% for the GX2? A single-GPU card merely equal to the Ultra? I want 75% better for a GX2 and 50% for the single-chip card, or Crysis is still going to suck.

That's what happens when you have no competition, folks.
January 4, 2008 12:43:29 PM

Well, it seems Nvidia is milking the G92, creating an 8800 GTX-class G92 product called the 9800 GTX. Now Nvidia will put two G92 9800 GTX GPUs on one card with only 30% more performance than an 8800 GTX. While I believe a single 9800 GTX will have 30% more graphics power than an 8800 GTX, the dual-chip card might exceed that by 60%. I guess time will tell.

Where the hell is the card said to be 2x the power of the 8800 GTX?
January 4, 2008 12:53:33 PM

Not worth the $$$ imo...

I'll keep my 8800GT tyvm.
January 4, 2008 1:00:03 PM

systemlord said:
Well, it seems Nvidia is milking the G92, creating an 8800 GTX-class G92 product called the 9800 GTX. Now Nvidia will put two G92 9800 GTX GPUs on one card with only 30% more performance than an 8800 GTX. While I believe a single 9800 GTX will have 30% more graphics power than an 8800 GTX, the dual-chip card might exceed that by 60%. I guess time will tell.

Where the hell is the card said to be 2x the power of the 8800 GTX?


I still don't get it. For one, an 8800 Ultra is not 30% faster than an 8800 GTX, as a similarly clocked GTX is equal to an Ultra, both being essentially the same chip.

By that reasoning, a 9800 GX2 with a 30% advantage over a single 8800 Ultra would have about a 40% advantage over a single 8800 GTX.

A 9800 GTX with a 30% advantage over an 8800 GTX would only have about a 20% advantage over an 8800 Ultra.

If you already have a pair of 8800 GTXs in SLI or Ultras in SLI, you already have more performance than a 9800 GX2, but not more than two 9800 GTXs. Which brings me to my point: the power increase sucks, because we all know SLI'd Ultras and SLI'd 8800 GTXs can't run Crysis with everything maxed at the highest resolutions, therefore these new cards won't either.
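To make those percentage chains concrete, here is a quick back-of-the-envelope sketch in Python. The ~10% Ultra-over-GTX gap is an assumption for illustration; the 30% figures are the rumored numbers, and nothing here is measured:

gtx_8800 = 1.00               # baseline: GeForce 8800 GTX
ultra_8800 = 1.10 * gtx_8800  # assume the Ultra's clock bump is worth ~10%

gx2_9800 = 1.30 * ultra_8800  # rumor: 9800 GX2 is 30% over the Ultra
gtx_9800 = 1.30 * gtx_8800    # rumor: 9800 GTX is 30% over the 8800 GTX

print(f"9800 GX2 vs 8800 GTX:   +{gx2_9800 / gtx_8800 - 1:.0%}")   # ~ +43%
print(f"9800 GTX vs 8800 Ultra: +{gtx_9800 / ultra_8800 - 1:.0%}") # ~ +18%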
January 4, 2008 1:02:03 PM

None of that makes sense to me.

#1 - The 8800 GTS is faster than the 8800 GTX. How could Nvidia come out with a 9800 GTX that is not any faster than the 8800 GTS?
I can't imagine the only thing they would add is support for 3-way SLI.

#2 - They will be coming out with a 9800 GT that is cheaper/slower than the 9800 GTX, which would make it slower than the current 8800 GT/GTS cards? Totally illogical.

#3 - The 9800 GX2 being only 30% faster than an 8800 GTX could actually make some sense. Since they will be sticking two G92 GPUs together, this would be a heat/power monster unless they did something such as cutting back the speed the GPUs run at to seriously cut their power usage. I could easily see basically two 8800 GTS cards running at 65% speed to give a 130% performance level while still not requiring massive PSUs and massive cooling (see the sketch below).
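For what it's worth, the 65%-clock idea works out like this in a tiny Python sketch. It assumes performance scales linearly with clock and that the two GPUs scale perfectly, both of which are optimistic simplifications:

single_card = 1.00     # one full-speed 8800 GTS (G92) as the baseline
clock_fraction = 0.65  # hypothetical down-clock to tame heat and power
gpus = 2

effective = gpus * clock_fraction * single_card
print(f"Estimated dual-GPU performance: {effective:.0%} of one stock card")
# -> 130%, i.e. the ~30% boost over a single card discussed above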
January 4, 2008 1:10:13 PM

TBH, I am going to wait this all out and see what ATI has to offer with their 3870 X2 chip. Maybe it will be able to scale better. That would be funny to put on my Nvidia 780i mobo... er... when and if it comes in.
January 4, 2008 1:11:22 PM

warezme said:
I still don't get it. For one, an 8800 Ultra is not 30% faster than an 8800 GTX, as a similarly clocked GTX is equal to an Ultra, both being essentially the same chip.

By that reasoning, a 9800 GX2 with a 30% advantage over a single 8800 Ultra would have about a 40% advantage over a single 8800 GTX.


I think they got their percentages wrong; I'm thinking that the single-chip 9800 GTX is 30% faster than the 8800 GTX. The reason behind this is that when the newer 8800 GTS 512MB cards came out, they were about 30% faster than the older 8800 GTS 640MB card. Well, that's my logic.
January 4, 2008 1:17:56 PM

systemlord said:
I think they got their percentages wrong; I'm thinking that the single-chip 9800 GTX is 30% faster than the 8800 GTX. The reason behind this is that when the newer 8800 GTS 512MB cards came out, they were about 30% faster than the older 8800 GTS 640MB card. Well, that's my logic.


But that is not the 50% faster they promised for the GTX.
January 4, 2008 1:21:50 PM

I just bought an EVGA 8800GTS 512.

Looks like I may be in a good position for the step-up program!

I know it seems, based on this article, that the 9800 series of cards will not be much of a boost, but then, none of the information we are reading is official. I am still willing to believe that Nvidia will not disappoint us.

I hope.

Edit: Also, I imagine that the 9800 series cards will all support Tri-SLI. Probably with better scaling than we are seeing now. Not that I could afford 3 9800 GX2 video cards.
January 4, 2008 1:29:09 PM

systemlord said:
Wow ATI is really in for it bad!

Nvidia Readies Dual-Chip, Single-Chip High-Performance Graphics Cards.
Nvidia GeForce 9800 GX2, GeForce 9800 GTX, GeForce 9800 GT Approaching

http://www.xbitlabs.com/news/video/display/200801032235...

http://enthusiast.hardocp.com/article.html?art=MTQ0MCwx...


The ATI R680 is supposed to be faster than the Ultra, and that means it is in line with the 9800 GTX. The R680 is a first step toward their newer technology, preparing the R700 for the market. AMD in for it bad? Hardly. A few months after Nvidia's launch we will see the R700, if not sooner. This will be a good year for us, because competition is back.



January 4, 2008 1:30:06 PM

rallyimprezive said:
Edit: Also, I imagine that the 9800 series cards will all support Tri-SLI. Probably with better scaling than we are seeing now. Not that I could afford 3 9800 GX2 video cards.



OOooooo 3 9800 GX2s..... 6 GPUS!!!!

That should be able to run Crysis on full. I hope.
January 4, 2008 1:31:11 PM

spaztic7 said:
But that is not the 50% faster they promised for the GTX.


Yeah, tell me about it.
January 4, 2008 1:32:40 PM



The ATI R680 is supposed to be faster than the Ultra, and that means it is in line with the 9800 GTX. The R680 is a first step toward their newer technology, preparing the R700 for the market. AMD in for it bad? Hardly. A few months after Nvidia's launch we will see the R700, if not sooner. This will be a good year for us, because competition is back.
Sorry, I meant to quote.
January 4, 2008 1:33:19 PM

systemlord said:
Well, it seems Nvidia is milking the G92, creating an 8800 GTX-class G92 product called the 9800 GTX. Now Nvidia will put two G92 9800 GTX GPUs on one card with only 30% more performance than an 8800 GTX. While I believe a single 9800 GTX will have 30% more graphics power than an 8800 GTX, the dual-chip card might exceed that by 60%. I guess time will tell.

Where the hell is the card said to be 2x the power of the 8800 GTX?


That's what I think it is: just milking the G92. Since ATI has nothing to milk, these people are just milking what they've got.
January 4, 2008 1:33:36 PM

This is quite sad; they're saying the 9800 GTX probably won't be much faster than the 8800 Ultra. At the rate we're going, I'm never going to find a card that can play Crysis well. :( 
January 4, 2008 1:34:37 PM

spaztic7 said:
OOooooo 3 9800 GX2s..... 6 GPUS!!!!

That should be able to run Crysis on full. I hope.



No, you can either run tri-SLI with the single-chip 9800 GTX only, or quad SLI with just two 9800 GX2s in quad mode.
January 4, 2008 1:36:18 PM

Another thought, pure conjecture of course, but I'm willing to bet that when Nvidia releases its next-gen video card, it will no longer use the current numbering scheme. I think a 10800 GTX would be a bit silly.

It will be something like... Nvidia GeForce 10X and 10X2... or something.
January 4, 2008 1:42:28 PM

Remember that SLI does NOT scale 1:1 performance-wise; you usually gain anywhere from 10% to 70% more performance. So if they cut clocks to maybe 500MHz for heat constraints, they'd lose about 16% performance versus the 8800 GT and 23% versus the GTS (G92). Assuming that SLI gives a sufficient performance boost, they're still only looking at a boost for the 9800 GX2 over the 8800 GT of 0% to ~35% (rough math below). The supposed 9800 GTX doesn't look like anything but PERHAPS a 700MHz G92 with 128 SPs. Overall, I think this announcement sounds like BS and blind numbers. It doesn't make sense: why would they make their 9th generation a simple speed boost and process shrink of their 8th gen? Sure, ATI did it. Doesn't mean it's a great idea...
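Here is that arithmetic as a small Python sketch. The 600MHz and 650MHz core clocks are the stock 8800 GT and 8800 GTS (G92) figures, the 500MHz GX2 clock is this post's hypothetical, and performance is assumed to scale linearly with clock:

gt_clock, gts_clock = 600, 650  # stock 8800 GT / 8800 GTS (G92) core MHz
gx2_clock = 500                 # hypothetical down-clocked GX2 core MHz

per_chip_vs_gt = gx2_clock / gt_clock    # ~0.83, i.e. ~17% slower per chip
per_chip_vs_gts = gx2_clock / gts_clock  # ~0.77, i.e. ~23% slower per chip

# SLI delivers 110%..170% of a single card's performance (10%-70% gain)
for sli_total in (1.10, 1.40, 1.70):
    boost = per_chip_vs_gt * sli_total - 1
    print(f"SLI factor {sli_total:.2f}: GX2 vs one 8800 GT {boost:+.0%}")
# prints roughly -8%, +17%, +42% -- in the ballpark of the 0% to ~35% above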
January 4, 2008 1:52:20 PM

cpburns said:
Remember that SLI does NOT scale 1:1 performance-wise; you usually gain anywhere from 10% to 70% more performance. So if they cut clocks to maybe 500MHz for heat constraints, they'd lose about 16% performance versus the 8800 GT and 23% versus the GTS (G92). Assuming that SLI gives a sufficient performance boost, they're still only looking at a boost for the 9800 GX2 over the 8800 GT of 0% to ~35%. The supposed 9800 GTX doesn't look like anything but PERHAPS a 700MHz G92 with 128 SPs. Overall, I think this announcement sounds like BS and blind numbers. It doesn't make sense: why would they make their 9th generation a simple speed boost and process shrink of their 8th gen? Sure, ATI did it. Doesn't mean it's a great idea...



I am aware of the current SLI limitations. Doesn't mean they can't ever change or be further optimized. :) 
January 4, 2008 1:55:30 PM

"Don’t be confused by the new “98XX” model numbers as they don’t signify much more than the die shrink to 65nm."

A production practice otherwise known as "perceived obsolescence".
January 4, 2008 2:02:59 PM

If the 9800 GTX is based on the G92 core, I am guessing that it will AT LEAST have a wider memory bus (minimum 384-bit) and possibly more stream processors than the G92 8800 GTS 512MB (256-bit bus and 112 SPs). The addition of some GDDR4/5 on a wider bus (512-bit?) and more SPs on a G92 could yield a card that is 30% faster than the current 8800 GTX; a rough bandwidth comparison is sketched below.

The 9800 GX2 would probably have two of these modified G92 cores, underclocked a bit for heat reasons.
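As a sanity check on the bus-width speculation, peak memory bandwidth is just (bus width in bits / 8) x effective data rate. A minimal Python sketch; the first two lines use the stock 8800 GTS 512 and 8800 GTX memory specs, while the 512-bit GDDR4 line is purely hypothetical:

def bandwidth_gbs(bus_bits, data_rate_gts):
    # Peak memory bandwidth in GB/s from bus width and data rate (GT/s).
    return bus_bits / 8 * data_rate_gts

print(bandwidth_gbs(256, 1.94))  # 8800 GTS 512 (G92): ~62 GB/s
print(bandwidth_gbs(384, 1.80))  # 8800 GTX (G80): ~86 GB/s
print(bandwidth_gbs(512, 2.20))  # hypothetical 512-bit GDDR4: ~141 GB/s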
January 4, 2008 2:09:25 PM

I believe the 8800GTS 512 has 128 SPs.
January 4, 2008 2:17:56 PM

rallyimprezive said:
I believe the 8800GTS 512 has 128 SPs.

You are correct; I was thinking of the top-of-the-line G80 8800 GTS. Based on that, I'm not sure if they can get more than 128 SPs out of that architecture.

Good table comparing released GPU specs:
http://www.legitreviews.com/article/610/2/


January 4, 2008 2:31:02 PM

Personally I don't have a problem waiting for Tom's to do their full review of a card before I jump to any conclusions...
January 4, 2008 2:46:51 PM

HoldDaMayo said:
Personally I don't have a problem waiting for Tom's to do their full review of a card before I jump to any conclusions...


But that is so boring!

Besides, Tom won't have a review until a few weeks after the official release. :ange: 
January 4, 2008 3:07:02 PM

That's because you are thinking like an American.

In Australia their clocks move backwards, just like their toilets flush.
Australian counter-clockwise is American clockwise.
January 4, 2008 3:08:57 PM

rallyimprezive said:
I cannot for the life of me get that thing to look like it's spinning counter-clockwise.

LOL - me too. Try using your peripheral vision and back away, works for me. That seems to have some double meaning.
January 4, 2008 3:17:43 PM

systemlord said:
I think they got their percentages wrong; I'm thinking that the single-chip 9800 GTX is 30% faster than the 8800 GTX. The reason behind this is that when the newer 8800 GTS 512MB cards came out, they were about 30% faster than the older 8800 GTS 640MB card. Well, that's my logic.


Yup, based on the G92 die-shrunk GTs and GTSes, I would think the 9800 GTX (G92) = 30% faster than the 8800 GTX.

But all in all it's just the same old die-shrunk crap and NO new next-gen hardware.

I'm definitely not getting any new video cards until next-generation silicon with new architecture and features slams down. If ATI hadn't released such weak early offerings, this next round would have been a next-gen product and not recycled crap. :pfff: 
January 4, 2008 3:19:24 PM

It's clockwise for me also, can't get it to move counter. T__T
January 4, 2008 3:20:59 PM

zenmaster said:
That's because you are thinking like an American.

In Australia their clocks move backwards, just like their toilets flush.
Australian counter-clockwise is American clockwise.



I sure hope you don't believe that....


And I did figure out how to get it to spin the other way, in fact, I can get her upper body to spin the opposite direction of the lower body.

Fun!

Edit: In order for you to see it spinning one direction or the other, what changes is how you perceive the leg that is sticking out.

You see the leg come into the foreground, then spin into the background... if you reverse that thinking, it spins the other way! Then you can do the same with the arms, and if you tweak your brain enough, you can get the body to twist. :) 
January 4, 2008 3:40:34 PM

spaztic7 said:
OOooooo 3 9800 GX2s..... 6 GPUS!!!!

That should be able to run Crysis on full. I hope.


Is the increase in your electric bill going to be worth it, lol? Not to mention that maybe even a 1kW PSU may not handle that. I heard a system with three GTXs uses about 700-800 watts total with everything else included. I think it was an Intel quad CPU, but still, the power needed would be insane.
January 4, 2008 3:42:05 PM

That's quite good, badgtx1969, a much-needed respite, as my head was starting to spin trying to work out just where (which segment, price/performance) Nvidia is trying to aim this.
I know that 2 GPUs = twice the heat in theory, but it seems to me that everybody is ignoring the die shrink. They seem to be holding their own against ATI on heat/power, and that's before they shrink, so I don't think they will have to throttle back as much as some people think might be the case; just my thinking, though. So, assuming I'm off the mark here, does this mean they will have a crazy amount of headroom for overclocking? And if so, are we heading back to the 1000-watt PSU being needed? Or, and this is quite an interesting thought I just had, are these being made from chips that are not up to spec for the next gen of cards, so there will be little headroom, or has this card always been planned?
To be honest, I'm not that sure how a dual-GPU card works as far as the architecture goes; does anyone know of a good guide?
Mactronix
January 4, 2008 3:42:11 PM

Now wait just a damn minute!! She is spinning counter-clockwise.

Notice that the shadow of her leg that is sticking out comes into the foreground only once in each rotation.
January 4, 2008 3:44:43 PM

I really wouldn't get my hopes up for the multi-GPU scene for another 1.5 years... the drivers are crap and performance is different for every game. If SLI is anything like CF, we have a long way to go.
January 4, 2008 4:00:17 PM

rallyimprezive - didn't mean to get you staring at your computer screen for 1+ hrs!
On topic - to get rid of the driver issues and scaling problems on multi-GPU setups, there needs to be some better hardware development. I think ATI is moving in the right direction with the dual GPUs on one PCB (R680). However, there needs to be a better interconnect than the PLX PCIe switch.
January 4, 2008 4:00:21 PM

At first she's clockwise for me until I see the shadow. That's how I got it to change too.
January 4, 2008 4:03:04 PM

She's spinning both ways. I saw clockwise first, but if you focus on her lower foot, you can switch it really easily.

I think our lovely GPU race that's been going on for a while is really dragging down.
ATI's just not what it used to be. They have to get the next series out. Nvidia has no reason to move forward.

And I'm sick of recycled crap, too.

This is disappointing.
January 4, 2008 4:09:21 PM

rallyimprezive said:
Now wait just a damn minute!! She is spinning counter-clockwise.

Notice that the shadow of her leg that is sticking out comes into the foreground only once in each rotation.


...and that is why you are a left-brained user. ;) 

Right-brained ppl see the "whole picture"; they see the picture as one object. Left-brainers actually break it up into smaller chunks (subconsciously) and see it as a sum of those parts, not a single object.

The "default" spin you see when you first pull the picture up is what your brain orientation is... and that shadow plays a role in it. By dissecting the shadow itself (left-brain) you are not taking the perceived angle in 3-D space of the girl to the "floor" in to account etc... (or you are even dissecting those points further into smaller parts and removing much perspective) There is a host of ways to "see" this if you dissect each part, that is why it is a decent metric on how you think. Right-brainers just take the whole thing in and say "cool"... and probably start all kinds of thoughts on the girl and why she is spinning in mid-air. ;) 

That is not saying one is better, just it points at different thinking processes.

January 4, 2008 4:11:56 PM

I've got everyone in the office arguing about the TRUE direction of the spin, using the shadow as a reference.

I wonder, if the 9800GX2 is real, would it have the same limitations as two single GPUs running SLI?

If that makes sense....
January 4, 2008 4:15:02 PM

sojrner said:
...and that is why you are a left-brained user. ;) 

Right-brained ppl see the "whole picture"; they see the picture as one object. Left-brainers actually break it up into smaller chunks (subconsciously) and see it as a sum of those parts, not a single object.

The "default" spin you see when you first pull the picture up is what your brain orientation is... and that shadow plays a role in it. By dissecting the shadow itself (left-brain) you are not taking the perceived angle in 3-D space of the girl to the "floor" in to account etc... (or you are even dissecting those points further into smaller parts and removing much perspective) There is a host of ways to "see" this if you dissect each part, that is why it is a decent metric on how you think. Right-brainers just take the whole thing in and say "cool"... and probably start all kinds of thoughts on the girl and why she is spinning in mid-air. ;) 

That is not saying one is better, just it points at different thinking processes.



Very interesting, that certainly makes sense.

To be completely honest though, the very first thing I noticed was that you can see her nipples.
January 4, 2008 4:22:50 PM

I hope this is someone’s idea of a bad joke, because I’m not laughing. My X1900 XT is on the point of oblivion and I have been holding out for the next generation of video cards to hit the market for several months now. The only straw I can hold onto is the fact that nothing official has been mentioned about the hardware, but I’m not going to hold my breath.
January 4, 2008 4:22:59 PM

frozenlead said:
She's spinning both ways. I saw clockwise first, but if you focus on her lower foot, you can switch it really easily...
Me too. On the initial look I saw clockwise. After a few seconds' study it suddenly switched to counter-clockwise and I couldn't go back! Then, looking down at the foot, I've learned to switch the direction at will. :pt1cable: 
January 4, 2008 4:25:26 PM

Who the hell said 30% boost? Its boost will be the same boost that SLI gives any two cards, because it IS just two cards together as one.

I'm expecting it to be a big success; done this way, the power consumption and heat output are much, much lower than those of two individual cards, plus it needs only one PCI-E slot.


The old quad-SLI rubbish performance will be gone, since with DX10 they can raise the pre-render limit from 3 to 4, and thus all four GPUs can be utilized (see the sketch below). Still only good at higher resolutions, though, and it will depend on the game, just like SLI and tri-SLI.

It's basically two cards together, though, two PCBs. Just like the 7950 GX2, but with lots of things removed.

The GX2 was a beast anyway; it rocked: low power consumption, low heat output, overclocked like mad, etc. I'm expecting this one to be using 8800 GTS 512MB G92 chips, but down-clocked, probably to a 500-600MHz core, just to make heat and power a little less of a problem.


Just assumptions, but I have a strong opinion that this will be a very good product. It'll beat the Ultra as badly as 2x 8800 GT SLI beat an Ultra.
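On the pre-render point: quad SLI typically uses alternate-frame rendering (AFR), where frames are handed out round-robin, so the driver's pre-render queue has to be at least as deep as the GPU count or some GPUs never get fed. A toy Python illustration of that claim; the frame-assignment model is deliberately simplified, not how the driver actually schedules:

def afr_assignments(frames, gpus, prerender_limit):
    # Only min(gpus, prerender_limit) GPUs can have a frame in flight.
    active = min(gpus, prerender_limit)
    return [(frame, frame % active) for frame in range(frames)]

for frame, gpu in afr_assignments(8, gpus=4, prerender_limit=3):
    print(f"frame {frame} -> GPU {gpu}")  # GPU 3 never appears

# With prerender_limit=4 the round-robin covers all four GPUs.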
January 4, 2008 4:32:38 PM

IMHO, I don't think it's fair to compare SLI to a dual-GPU single-slot solution. Perhaps I should not have asked the question.

It just seems that there are too many differences. Two slots vs. one slot. The bridge method between GPUs may be different. The way the driver utilizes the GPUs may be different.
And lots of other technical shenanigans that I don't know anything about.
January 4, 2008 4:35:45 PM

drysocks said:
Me too. On the initial look I saw clockwise. After a few seconds' study it suddenly switched to counter-clockwise and I couldn't go back! Then, looking down at the foot, I've learned to switch the direction at will. :pt1cable: 


yup, but it's that initial look that tells you whether you are a rightie or leftie.


and @rallyimprezive: ya, lol... right or left, there are still nipples there. :D 
January 4, 2008 4:40:01 PM

rallyimprezive said:
IMHO, I don't think it's fair to compare SLI to a dual-GPU single-slot solution. Perhaps I should not have asked the question.

It just seems that there are too many differences. Two slots vs. one slot. The bridge method between GPUs may be different. The way the driver utilizes the GPUs may be different.
And lots of other technical shenanigans that I don't know anything about.


Nope, as far as Nvidia goes it IS "just" SLI, whether it's two GPUs on one card or two cards... the GeForce 7 GX2s were showing the same performance as "traditional" SLI. The only benefit is single slot vs. dual. You still had to disable dual monitors and do all the rest of the SLI rigmarole to get it working on unsupported games.

As for ATI, I'm unsure what their dual-on-one will do, but as mentioned in the Tom's article, the ATI version seems to scale better than Nvidia's. Granted, I think even older versions of CrossFire scaled better than SLI clock for clock.
January 4, 2008 4:51:54 PM

sojrner said:
Nope, as far as Nvidia goes it IS "just" SLI, whether it's two GPUs on one card or two cards... the GeForce 7 GX2s were showing the same performance as "traditional" SLI. The only benefit is single slot vs. dual. You still had to disable dual monitors and do all the rest of the SLI rigmarole to get it working on unsupported games.

As for ATI, I'm unsure what their dual-on-one will do, but as mentioned in the Tom's article, the ATI version seems to scale better than Nvidia's. Granted, I think even older versions of CrossFire scaled better than SLI clock for clock.



Oh ok, good to know. Thanks for the information. I am now smarterer.

And I have an elevated level of consciousness because I can make the woman swing both ways...AND stare at her tits.

I am 26 by the way, and still enjoy the juvenile excitement of animated boobies. :ange: 