
Nvidia SLI, SLI's back with a vengeance

Anonymous
June 28, 2004 4:17:18 AM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video

http://media.hardwareanalysis.com/articles/large/11206....
http://media.hardwareanalysis.com/articles/large/11208....
http://media.hardwareanalysis.com/articles/large/11207....

http://www.hardwareanalysis.com/content/article/1728/

____________________________________________________________________________
___
Nvidia SLI, SLI's back with a vengeance Jun 28, 2004, 07:30 AM

By: Sander Sassen

I'm sure many of you remember the days of 3dfx: the first Voodoo
Graphics back in 1996 and, about a year later, the introduction of the
Voodoo2. The Voodoo2 ensured that 3dfx reigned supreme for quite some
time, as two cards could be combined in a so-called SLI (Scan Line
Interleave) configuration. Each card rendered half of the image's scan
lines, which resulted in double the performance of a single board and
the ability to play OpenGL games such as Quake 2 at a 1024x768
resolution. To date no manufacturer has come up with a similar
concept, simply because modern graphics accelerators are all AGP
based, there are no dual-AGP motherboards, and PCI simply doesn't have
the bandwidth to handle modern graphics accelerators. With the arrival
of PCI-E things have changed, though: a number of workstation
motherboards featuring the Tumwater chipset will have dual PCI-E x16
slots, making dual graphics accelerators a possibility again. Nvidia
steps up to the plate today with the re-introduction of the SLI
concept on the GeForce 6800 series, again using the SLI moniker but
now with a different approach to the same principles that made Voodoo2
SLI a huge success.




Two PCI-E GeForce 6800 Ultra graphics cards running in an SLI
configuration.

Whereas Voodoo2 SLI used a ribbon cable internally between the two
Voodoo2 cards and a pass-through VGA cable externally to distribute
the analog signal, Nvidia's implementation is done entirely in the
digital domain. Both 6800 series PCI-E cards are connected by means of
an SLI (Scalable Link Interface), dubbed the MIO port: a high-speed
digital interconnect which attaches to a connector on top of both
cards. Through this MIO port the cards communicate with each other and
distribute the workload, aided by dynamic load-balancing algorithms.
In essence the screen is divided into two sections; one graphics card
renders the upper section and the second graphics card renders the
lower section. The load-balancing algorithms, however, allow the
workload to be shifted between the graphics processors. Initially both
start out at 50%, but this ratio can change depending on the load.
Although Nvidia has remained tight-lipped about what exactly makes
their SLI implementation tick, it is clear that both hardware and
software contribute to making SLI work. Most of the dynamic load
balancing between the two graphics processors is handled in software,
and thus SLI needs driver support, drivers which are as yet
unreleased, to work.
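The split-frame scheme described above can be sketched roughly as follows. This is purely a hypothetical illustration of the principle; Nvidia has not disclosed its actual load-balancing algorithm, and the timing numbers are made up:

```python
# Hypothetical sketch of split-frame rendering with dynamic load
# balancing: nudge the split line toward the slower GPU's half so
# both GPUs finish their portion of the frame at about the same time.

def rebalance(split, time_top, time_bottom, step=0.02):
    """Return a new split ratio (fraction of the screen given to the
    top GPU), shifted away from whichever GPU took longer."""
    if time_top > time_bottom:
        split -= step   # top GPU is slower: give it fewer scan lines
    elif time_bottom > time_top:
        split += step   # bottom GPU is slower: give it fewer
    return min(max(split, 0.1), 0.9)  # keep the split within sane bounds

split = 0.5  # both GPUs start out rendering 50% of the screen
# Simulated per-frame render times in ms for (top half, bottom half):
frames = [(9.0, 7.0), (8.8, 7.1), (8.5, 7.4), (8.1, 7.8)]
for t_top, t_bottom in frames:
    split = rebalance(split, t_top, t_bottom)
print(round(split, 2))  # → 0.42: the top section has shrunk
```

The idea is simply feedback control: the ratio converges toward whatever division of the screen equalizes the two cards' render times.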




The MIO port connector that is used to connect two PCI-E GeForce 6800s
together in SLI.

Exact performance figures are not yet available, but Nvidia's SLI
concept has already been shown behind closed doors by one of the
companies working with Nvidia on the SLI implementation. On early
driver revisions, which only offered non-optimized dynamic
load-balancing algorithms, their SLI configuration performed 77%
faster than a single graphics card. Nvidia has told us, however, that
prospective performance numbers should show an increase closer to 90%
over that of a single graphics card. There are a few things to take
into account, though, when you're considering buying an SLI
configuration. First off, you'll need a workstation motherboard
featuring two PCI-E x16 slots, which will also require the more
expensive Intel Xeon processors. Secondly, you'll need two identical
(same brand and type) PCI-E GeForce 6800 graphics cards. For
workstation users it is also a nice extra that an SLI configuration
can drive a total of four monitors off the respective DVI outputs on
the graphics cards, a feature we'll undoubtedly see pitched as a major
selling point for the Quadro version of the GeForce 6800 series SLI
configuration.
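To put those scaling percentages in perspective, here is the arithmetic with a made-up single-card frame rate (the 60 fps figure is purely illustrative, not a benchmark result):

```python
# Illustrative only: what 77% and 90% SLI scaling mean in frame rates,
# compared against the perfect doubling Voodoo2 SLI was known for.
single_fps = 60.0                      # hypothetical single-card frame rate
early_driver = single_fps * 1.77       # 77% faster on early drivers
claimed = single_fps * 1.90            # ~90% claimed for mature drivers
ideal = single_fps * 2.0               # perfect scaling (doubling)
print(round(early_driver, 1), round(claimed, 1), round(ideal, 1))
# → 106.2 114.0 120.0
```

Even the early, non-optimized drivers land much closer to doubling than to single-card performance.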




The high-speed digital MIO port bridge connecting the two PCI-E cards
together.

The dual PCI-E x16 motherboard will mean a significant investment;
still, two PCI-E GeForce 6800GT cards could make more sense than a
single PCI-E GeForce 6800 Ultra or Ultra Extreme, as the performance
increase will be much larger. Also, workstation motherboards carry a
hefty price premium over consumer products; fortunately they do not
require dual Xeons, as a single Xeon will work just as well. All in
all, Nvidia's SLI implementation brings back fond memories of the 3dfx
days and has all the right ingredients to once again revolutionize 3D
graphics, provided you're willing and able to pay the hefty price tag
associated with it. Unlike Voodoo2, there's no simple upgrade to
double your 3D performance; apart from a second PCI-E GeForce 6800
you'll need a new motherboard, memory and CPU(s). That doesn't do much
to dampen our spirits, though: the best 3D performance available comes
at a price, much like driving a Porsche or Ferrari, and it doesn't
come cheap. Kudos to Nvidia for once again raising the bar and making
the hearts of many gamers rejoice; SLI is back, and with a vengeance.

Sander Sassen.
____________________________________________________________________________
___


I can't wait to see ATI's response to this. MAXX could be back!

Don't forget that ATI has had the ability to scale up to 256
R300/Radeon 9700 VPUs since 2002. Both E&S and SGI have taken
advantage of this. Now hopefully consumers can get in on the fun.
Anonymous
June 28, 2004 1:35:07 PM


"SLIisBACK" <SLIrocks@aol.com> wrote:

>I can't wait to see ATI's response to this. MAXX could be back!
>
>don't forget that ATI has had the ability to scale upto 256 R300-Radeon 9700
>VPUs
>since 2002. Both E&S and SGI have taken advantage of this. now hopefully
>consumers can get in on the fun.

Learn how to post, dorky.
Anonymous
June 28, 2004 1:35:58 PM


"SLIisBACK" <SLIrocks@aol.com> wrote in message >Nvidia steps up to the
plate today with the re->introduction of the SLI
> concept on the GeForce 6800 series

What cpu(s) can keep up with it?
Should the electrician connect it straight to Edison's substation,
or will any gas-powered generator do?
What will this over the top money pit be worth a year
after purchase?

I guess I'll pass.
Anonymous
June 28, 2004 4:54:12 PM


THIS IS A JOKE

3DFX?

REMEMBER VOODOO 6000

NO ONE WILL MAKE A 16X PCI DUAL SLOT MOTHER BOARD OTHER THEN SERVER SYSTEMS
AND YOU WILL NEED 800 WATTS POWER
O WOW
NVIDA HAS LOST i ALREADY HAVE MY ATI X800 XT PRO INSTALLED AND RUNNING FOR 2
WEEKS AND NOONE CAN EVEN GET AN ULTRA FROM NVIDIA
http://public.fotki.com/Tejas/


"SLIisBACK" <SLIrocks@aol.com> wrote in message
news:SrWdnT9rj8hANELdRVn-hA@comcast.com...
> [quoted article snipped]
June 28, 2004 5:26:34 PM


Great, as long as you work for Pixar or ILM, I suppose.
"SLIisBACK" <SLIrocks@aol.com> wrote in message
news:SrWdnT9rj8hANELdRVn-hA@comcast.com...
> [quoted article snipped]
Anonymous
June 28, 2004 7:09:44 PM


I think your ati card has corrupted your keyboard. It seems that when the
ati driver is installed, the shift key is automatically pressed.



--
Mickster

Visit my website and see my arcade!!

http://mickster.freeservers.com

"FatDaddy" <KillinU@worm.com> wrote in message
news:lNadnZeHaLGl0H3dRVn-uw@comcast.com...
> THIS IS A JOKE
>
> 3DFX?
>
> REMEMBER VOODOO 6000
>
> NO ONE WILL MAKE A 16X PCI DUAL SLOT MOTHER BOARD OTHER THEN SERVER SYSTEMS
> AND YOU WILL NEED 800 WATTS POWER
> O WOW
> NVIDA HAS LOST i ALREADY HAVE MY ATI X800 XT PRO INSTALLED AND RUNNING FOR 2
> WEEKS AND NOONE CAN EVEN GET AN ULTRA FROM NVIDIA
> http://public.fotki.com/Tejas/
> [nested quote of the original article snipped]
Anonymous
June 28, 2004 7:46:17 PM


"SLIisBACK" <SLIrocks@aol.com> wrote in
<SrWdnT9rj8hANELdRVn-hA@comcast.com>:

>http://media.hardwareanalysis.com/articles/large/11206....
>http://media.hardwareanalysis.com/articles/large/11208....
>http://media.hardwareanalysis.com/articles/large/11207....
>
>http://www.hardwareanalysis.com/content/article/1728/
>
>_________________________________________________________________________
>___ ___
>Nvidia SLI, SLI's back with a vengeance Jun 28, 2004, 07:30 AM
>
>By: Sander Sassen
>
>I'm sure many of you can remember the days of 3dfx, the first Voodoo
>Graphics back in 1996 and about a year later the introduction of the
>Voodoo2. Voodoo2 actually made sure that 3dfx reigned supreme for
>quite some time as two cards could be combined in something called an
>SLI, Scan Line Interleave, configuration. Each card rendered half of
>the image scan lines which resulted in double the performance of a
>single board and the ability to play OpenGL games such as Quake 2 in a
>1024x768 resolution. To date no manufacturer has come up with a
>similar concept simply because modern graphics accelerators are all
>AGP based, there's no dual AGP motherboards and PCI simply doesn't
>have the bandwidth to handle modern graphics accelerators. With the
>arrival of PCI-E things have changed though, a number of workstations
>motherboards featuring the Tumwater chipset will have dual PCI-E-x16
>slots making dual graphics accelerators a possibility again. Nvidia
>steps up to the plate today with the re-introduction of the SLI
>concept on the GeForce 6800 series, again using the SLI moniker but
>now with a different approach to the same principles that made Voodoo2
>SLI a huge success.
>
>
>
>
>Two PCI-E GeForce 6800 Ultra graphics cards running in a SLI
>configuration.
>
>Whereas Voodoo2 SLI used a ribbon cable to be connected between two
>Voodoo2 cards internally and a pass through VGA cable externally to
>distribute the analog signal Nvidia's implementation is all done in
>the digital domain. Both 6800 series PCI-E cards are connected by
>means of a SLI, Scalable Link Interface, dubbed the MIO port, a
>high-speed digital interconnect which connects to a connector on top
>of both cards. Through this MIO port both cards communicate to each
>other and distribute the workload which is accelerated by dynamic
>load-balancing algorithms. In essence the screen is divided vertically
>in two parts; one graphics card renders the upper section and the
>second graphics card renders the lower section. The load balancing
>algorithms however allow it to distribute the load across the graphics
>processors. Initially they'll both start out at 50% but this ratio can
>change depending on the load. Although Nvidia has remained
>tight-lipped about what makes their SLI implementation tick exactly it
>is clear that both hard- and software contribute to making SLI work.
>Most of the dynamic load balancing between the two graphics processors
>is handled in software and thus SLI needs driver support, drivers
>which are as of yet unreleased, to work.
>
>
>
>
>The MIO port connector that is used to connect two PCI GeForce 6800s
>together in SLI.
>
>Exact performance figures are not yet available, but Nvidia's SLI
>concept has already been shown behind closed doors by one of the
>companies working with Nvidia on the SLI implementation. On early
>driver revisions which only offered non-optimized dynamic
>load-balancing algorithms their SLI configuration performed 77% faster
>than a single graphics card. However Nvidia has told us that
>prospective performance numbers should show a performance increase
>closer to 90% over that of a single graphics card. There are a few
>things that need to be taken into account however when you're
>considering buying an SLI configuration. First off you'll need a
>workstation motherboard featuring two PCI-E-x16 slots which will also
>use the more expensive Intel Xeon processors. Secondly you'll need two
>identical, same brand and type, PCI-E GeForce 6800 graphics cards. For
>workstation users it is also a nice extra that with a SLI
>configuration a total of four monitors can be driven off of the
>respective DVI outputs on the graphics cards, a feature we'll
>undoubtedly see pitched as a major feature for the Quadro version of
>the GeForce 6800 series SLI configuration.
>
>
>
>
>The high-speed digital MIO port bridge connecting the two PCI-E cards
>together.
>
>The dual PCI-E x16 motherboard will mean a significant investment;
>however, two PCI-E GeForce 6800GT cards could make more sense than a
>single PCI-E GeForce 6800 Ultra or Ultra Extreme, as the performance
>increase will be much larger. Also, workstation motherboards run at a
>hefty price premium over consumer products; fortunately they do not
>require dual Xeons, as a single Xeon will work just as well. All in
>all, Nvidia's SLI implementation brings back fond memories of the 3dfx
>days and has all the right ingredients to once again revolutionize 3D
>graphics, provided you're willing and able to pay the hefty price tag
>associated with it. Unlike Voodoo2 there's no simple upgrade to double
>your 3D performance; apart from a second PCI-E GeForce 6800 you'll
>need a new motherboard, memory and CPU(s). That doesn't do much to
>dampen our spirits though; the best 3D performance available comes at
>a price, much like driving a Porsche or Ferrari, and it doesn't come
>cheap. Kudos to Nvidia for once again raising the bar and making the
>hearts of many gamers rejoice; SLI is back, and with a vengeance.
>
>Sander Sassen.
>_________________________________________________________________________
>___ ___
>
>
>I can't wait to see ATI's response to this. MAXX could be back!
>
>don't forget that ATI has had the ability to scale upto 256 R300-Radeon
>9700 VPUs
>since 2002. Both E&S and SGI have taken advantage of this. now
>hopefully consumers can get in on the fun.
>
>
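The split-frame load balancing the article describes (one GPU renders the top portion of the screen, the other the bottom, and the dividing line moves each frame so both finish at the same time) can be sketched in a few lines of Python. This is a hypothetical model of the idea only, not Nvidia's actual driver logic; all names here are illustrative:

```python
def rebalance(split, t_top, t_bottom):
    """One load-balancing step: estimate each GPU's per-line cost from
    the last frame's timings and pick the split that equalizes them."""
    cost_top = t_top / split               # render time per unit of height
    cost_bottom = t_bottom / (1.0 - split)
    return cost_bottom / (cost_top + cost_bottom)

def frame_times(split, c_top=1.0, c_bottom=2.0):
    """Simulated frame where the lower half is twice as expensive."""
    return split * c_top, (1.0 - split) * c_bottom

split = 0.5                    # both GPUs start with half the screen
for _ in range(5):
    t_top, t_bottom = frame_times(split)
    split = rebalance(split, t_top, t_bottom)
# split converges toward ~0.667: the top GPU takes two thirds of the lines
```

Under this toy model the split settles where both cards finish together; the gap between the 77% measured and the ~90% Nvidia projects presumably reflects synchronization and driver overhead that no split ratio can remove.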

The military already uses linked-up ATI cards for their simulators...
yup, multiple R300 cores together for their simulators!

ATI can do it... but it's damn expensive... thus only the army has bought it!
Anonymous
a b U Graphics card
June 28, 2004 8:14:51 PM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

I was being loud sorry

"Mickey Johnson" <mickster@derbyworks.net> wrote in message
news:2kbc1sFaq6cU1@uni-berlin.de...
> I think your ati card has corrupted your keyboard. It seems that when
the
> ati driver is installed, the shift key is automatically pressed.
>
>
>
> --
> Mickster
>
> Visit my website and see my arcade!!
>
> http://mickster.freeservers.com
>
> "FatDaddy" <KillinU@worm.com> wrote in message
> news:lNadnZeHaLGl0H3dRVn-uw@comcast.com...
> > THIS IS A JOKE
> >
> > 3DFX?
> >
> > REMEMBER VOODOO 6000
> >
> > NO ONE WILL MAKE A 16X PCI DUAL SLOT MOTHER BOARD OTHER THEN SERVER
> SYSTEMS
> > AND YOU WILL NEED 800 WATTS POWER
> > O WOW
> > NVIDA HAS LOST i ALREADY HAVE MY ATI X800 XT PRO INSTALLED AND RUNNING
FOR
> 2
> > WEEKS AND NOONE CAN EVEN GET AN ULTRA FROM NVIDIA
> > http://public.fotki.com/Tejas/
> >
> >
> > "SLIisBACK" <SLIrocks@aol.com> wrote in message
> > news:SrWdnT9rj8hANELdRVn-hA@comcast.com...
> > > <snip>
> >
> >
>
>
Anonymous
a b U Graphics card
June 29, 2004 2:42:00 AM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

> I was being loud sorry

And you are wrong. There will be dual PCI-E 16X boards. Alienware are
preparing to launch machines with this very setup.
Anonymous
a b U Graphics card
June 29, 2004 2:42:01 AM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

Dirk Dreidoppel wrote:

>> I was being loud sorry
>
> And you are wrong. There will be dual PCI-E 16X boards. Alienware are
> preparing to launch machines with this very setup.

I'm still waiting for my PCI/Microchannel board from Zeos. Believe one of
these small vendors will deliver an "innovative" product when you see it.

Note by the way that the Alienware board is not going to support arbitrary
video boards--according to their press release it is going to be tied to
specific models and combines the video using a third board. In other words
they've done something nonstandard.

--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
Anonymous
a b U Graphics card
June 29, 2004 6:00:32 AM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

On 28 Jun 2004 15:46:17 GMT, no_spam@spamnet.nl (Dark Avenger) wrote:

>"SLIisBACK" <SLIrocks@aol.com> wrote in
><SrWdnT9rj8hANELdRVn-hA@comcast.com>:
>
>><snip>
>>Nvidia SLI, SLI's back with a vengeance Jun 28, 2004, 07:30 AM
>>

        By the time this setup's price would come into most game
players' budgets, the next great video card chipset would hit the
market, boosting performance to rival, or surpass, the dual-card setup.
Unless the mainboard manufacturers start producing desktop mainboards
with dual PCI-X slots, I don't see this as something that will take
off in the home user market.
        Hell... By the time dual 12MB Voodoo2 SLI cards came into my
price range, the GF2 cards were dominating, and the GF3 cards were
about to hit the market.
June 29, 2004 9:46:26 AM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

<snip>

>____________________________________________________________________________
>___
>Nvidia SLI, SLI's back with a vengeance Jun 28, 2004, 07:30 AM

<snip>

I never did like the SLI configuration...took up two PCI slots
and kept you from installing anything else....all for what???

....1024x768?? You've got to be kidding???

Why not make a long PCI card with multiple snap-on GPU daughter cards,
like a RAM extender.

That would save us the few and precious PCI-E slots.

So I guess with so many extra GPU cards...they'll all need two dedicated
molex connectors?

I think I'll pass on that...


Redbrick..who Loves his CLK
Anonymous
a b U Graphics card
June 29, 2004 10:16:41 AM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

On Mon, 28 Jun 2004 12:54:12 -0400, "FatDaddy" <KillinU@worm.com>
wrote:

>THIS IS A JOKE
>
>3DFX?
>
>REMEMBER VOODOO 6000
>
>NO ONE WILL MAKE A 16X PCI DUAL SLOT MOTHER BOARD OTHER THEN SERVER SYSTEMS
>AND YOU WILL NEED 800 WATTS POWER
>O WOW
>NVIDA HAS LOST i ALREADY HAVE MY ATI X800 XT PRO INSTALLED

Where ?

Does it fit ?

Your mouth seems too big for it, since you have kept repeating the
glorious news that you have some sort of X800 for the past 2 weeks.

Ati claim to make the following:-
X800 Pro
X800 XT
X800 XT Platinum.

Don't see a X800 XT Pro anywhere.............

Anyway, if the X800 ?? does not fit your mouth, there are other
quite suitable apertures

John Lewis
Anonymous
a b U Graphics card
June 29, 2004 1:28:07 PM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

Larry Roberts <skin-e@juno.com> wrote in message news:<h042e0p3afea5j7t3qf53v2e1ijrag2upv@4ax.com>...
> On 28 Jun 2004 15:46:17 GMT, no_spam@spamnet.nl (Dark Avenger) wrote:
>
> >"SLIisBACK" <SLIrocks@aol.com> wrote in
> ><SrWdnT9rj8hANELdRVn-hA@comcast.com>:
> >
> >><snip>
> >>Nvidia SLI, SLI's back with a vengeance Jun 28, 2004, 07:30 AM
> >>
>
> By the time this setup's price would come into most of the
> game player's budget, the next great videocard chipset would hit the
> market, boosting performance to rival, or surpass the dual card setup.
> Unless the mainboard manufactuers start producing desktop mainnboards
> with dual PCI-X slots, I don't see this as something that will take
> off in the home user market.
> Hell... By the time dual 12MB Voodoo2 SLI cards came into my
> price range, the GF2 cards where dominating, and the GF3 cards where
> about to hit the market.

I don't see why I should spend $1000 or more for 2 GeForce 6800s when
I will be able to get an NV50 next year with similar or better
performance and newer features for about half the price
Anonymous
a b U Graphics card
June 29, 2004 3:47:23 PM

Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

On Mon, 28 Jun 2004 21:57:36 -0400, "J. Clarke"
<jclarke@nospam.invalid> wrote:


>Note by the way that the Alienware board is not going to support arbitrary
>video boards--according to their press release it is going to be tied to
>specific models and combines the video using a third board.

In essence, couldn't you say the same for Nvidia's solution? They are
tying together two cards with a third PCB.
Anonymous
a b U Graphics card
June 29, 2004 3:49:57 PM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

On Tue, 29 Jun 2004 05:46:26 GMT, redbrick@fastermail.com (Redbrick)
wrote:

><snip>
>
>>____________________________________________________________________________
>>___
>>Nvidia SLI, SLI's back with a vengeance Jun 28, 2004, 07:30 AM
>
><snip>
>
>I never did like the SLI configuration...took up two PCI slots
>and kept you from installing anything else....all for what???
>
>...1024x768?? You've got to be kidding???
>
>Why not make a long PCI card with multiple GPU daughter card snap-on like a
>ram extender.
>
>That would save us the few and precious PCI-E slots.
>
>So I guess with so many extra GPU cards...they'll all need two dedicated
>molex connectors?
>
>I think I'll pass on that...
>
>
>Redbrick..who Loves his CLK

What are you saving those PCI slots for anyway? The only thing
currently occupying my five PCI slots is a sound card. In these days
of onboard NIC's, onboard RAID, etc., etc., there is less of a need
for PCI slots.
Anonymous
a b U Graphics card
June 29, 2004 5:31:35 PM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

Folk wrote:

> On Tue, 29 Jun 2004 05:46:26 GMT, redbrick@fastermail.com (Redbrick)
> wrote:
>
>><snip>
>
> What are you saving those PCI slots for anyway? The only thing
> currently occupying my five PCI slots is a sound card. In these days
> of onboard NIC's, onboard RAID, etc., etc., there is less of a need
> for PCI slots.

It's not a "PCI slot", it's a "PCI _Express_" slot. There's a difference.

--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
Anonymous
a b U Graphics card
June 29, 2004 5:41:45 PM

Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

Folk wrote:

> On Mon, 28 Jun 2004 21:57:36 -0400, "J. Clarke"
> <jclarke@nospam.invalid> wrote:
>
>
>>Note by the way that the Alienware board is not going to support arbitrary
>>video boards--according to their press release it is going to be tied to
>>specific models and combines the video using a third board.
>
> In essence, couldn't you say the same for Nvidia's solution? They are
> tying together two cards with a third PCB.

The Alienware design doesn't use a bridgeboard between the two video boards,
it feeds the video outputs of the two into a third PCI analog video merger
board that has the video output to the monitor on it. Further, the output
is analog-only. And, incidentally, it's been demonstrated with 5900
boards.

--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
Anonymous
a b U Graphics card
June 29, 2004 6:28:01 PM

Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

>> And you are wrong. There will be dual PCI-E 16X boards. Alienware are
>> preparing to launch machines with this very setup.

Well no, actually it's not quite the same setup, although you are
correct that Alienware will use two PCI-X slots. Alienware's
technology does not use a molex connector between the cards. They use
a proprietary chipset to do the connection. They are in it with Wicked
3D, remember them? Nvidia has the patents from 3dfx and is a chipset
manufacturer themselves; I doubt VERY much that they won't be the ones
to develop the motherboard SLI enabling chipsets. Now ATI, not ones to
sit back and watch Nvidia walk all over them, will surely either come
up with an SLI compatible card or team up with Alienware (doubt they
would put all their eggs in one basket tho..) to counter it.

JMHO of course..

G Patricks
Anonymous
a b U Graphics card
June 29, 2004 6:33:32 PM

Archived from groups: alt.comp.periphs.videocards.ati (More info?)

>On Tue, 29 Jun 2004 05:46:26 GMT, redbrick@fastermail.com (Redbrick) wrote:
>That would save us the few and precious PCI-E slots.

AFAIK the PCI-X slots will only be fully utilized by video cards
anyway, which is why there would only be one slot normally. With
todays motherboards I only use the AGP slot and *maybe* one PCI slot
for a high end sound card, everything else is built in..
>
>So I guess with so many extra GPU cards...they'll all need two dedicated
>molex connectors?

Two cards, one connector between them.. thats it.

G Patricks
Anonymous
a b U Graphics card
June 29, 2004 8:27:57 PM

Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

Unzar Jones wrote:

> "SLIisBACK" <SLIrocks@aol.com> wrote in message >Nvidia steps up to the
> plate today with the re->introduction of the SLI
>
>>concept on the GeForce 6800 series
>
>
> What cpu(s) can keep up with it?
> Should the electrician connect it straight to Edison's sub-
> station or will any gas powered generator do?
> What will this over the top money pit be worth a year
> after purchase?
>
> I guess I'll pass.
>
>

Maybe you're too young, but back when the V2s came out, nearly the same
things were said about them. Overpriced, slot-hogging, hot bastards. But
the legacy of the V2 SLI still lives to this day. I can remember when
even the 2nd generation of AGP cards came out (TNT2, various Rage cards,
S3, etc), people were very hesitant to give up the raw power of 2 V2 cards.
Anonymous
a b U Graphics card
June 29, 2004 9:45:23 PM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

On Mon, 28 Jun 2004 00:17:18 -0500, "SLIisBACK" <SLIrocks@aol.com>
wrote:

><snip>
>Nvidia SLI, SLI's back with a vengeance Jun 28, 2004, 07:30 AM
>
>By: Sander Sassen
>


See the new Anandtech article:-
http://www.anandtech.com/video/showdoc.html?i=2097

Not the SLI that we knew and loved, but a totally different
patented technique for sharing GPU loads in an intelligent manner.
They have the rights, inherited from 3dfx, to the name "SLI" for
GPU load-sharing.

It is pretty obvious that nVidia's current dual board "SLI" exercise
is a Trojan horse to test the functionality of the GPU shared-port and
build optimizing drivers/compilers in preparation for multiple GPUs on
one board, a la 3dfx V5 5500. This situation will become a reality
with the current round of design exercises at both nVidia ( and Ati )
to shrink the 6800 ( and X800 ) on to a smaller geometry process,
probably 110nm, and later 65nm. For nVidia, the resulting reduction
of the power per chip will certainly make a dual-GPU/board a
reality, or maybe even a dual GPU on a hybrid substrate.
Parallel graphics processing is becoming a mandatory requirement.
Process shrinks are not occurring fast enough to keep up with
the signal processing demands. Parallelism is the way to keep
the chip-yield up and the on-chip dissipation/unit-area within
reasonable bounds.

Intel is actively working on CPU dual-core parallel-architectures
too, beyond HT. They have realised that they are getting near
the limits of pushing clock-rate, and have to come up with other
solutions.

John Lewis
Anonymous
a b U Graphics card
June 29, 2004 10:23:13 PM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

Larry Roberts <skin-e@juno.com> wrote in
<h042e0p3afea5j7t3qf53v2e1ijrag2upv@4ax.com>:

>On 28 Jun 2004 15:46:17 GMT, no_spam@spamnet.nl (Dark Avenger) wrote:
>
>>"SLIisBACK" <SLIrocks@aol.com> wrote in
>><SrWdnT9rj8hANELdRVn-hA@comcast.com>:
>>
>>><snip>
>>>Nvidia SLI, SLI's back with a vengeance Jun 28, 2004, 07:30 AM
>>>
>
> By the time this setup's price would come into most of the
>game player's budget, the next great videocard chipset would hit the
>market, boosting performance to rival, or surpass the dual card setup.
>Unless the mainboard manufactuers start producing desktop mainnboards
>with dual PCI-X slots, I don't see this as something that will take
>off in the home user market.
> Hell... By the time dual 12MB Voodoo2 SLI cards came into my
>price range, the GF2 cards where dominating, and the GF3 cards where
>about to hit the market.

Well, that indeed is a point... why spend so much money... I guess it's
for bragging rights... to say that you are the fastest bitch on the
market FOR THAT MOMENT.

Yup, I know people like that... gamers... hardcore gamers... go for
nothing less than the fastest... and see their money's worth plummet in
mere months...

Myself, well, I buy products when they are at a good price... a price I
am willing to pay... I guess the most expensive thing about my rig is
the memory... 1 GB of PC3200 DDR... Kingston also (not HyperX
though)... so even there I didn't invest too much.
June 30, 2004 2:17:08 AM

Archived from groups: alt.comp.periphs.videocards.ati (More info?)

On Tue, 29 Jun 2004 14:33:32 +0000, Icer wrote:

>>On Tue, 29 Jun 2004 05:46:26 GMT, redbrick@fastermail.com (Redbrick) wrote:
>>That would save us the few and precious PCI-E slots.
>
> AFAIK the PCI-X slots will only be fully utilized by video cards
> anyway, which is why there would only be one slot normally. With
> todays motherboards I only use the AGP slot and *maybe* one PCI slot
> for a high end sound card, everything else is built in..
>>

Be careful, PCIe and PCI-X are entirely different, though I can easily see
why people would want to call it PCI-X.

K
Anonymous
a b U Graphics card
June 30, 2004 4:19:10 PM

Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

On Tue, 29 Jun 2004 13:31:35 -0400, "J. Clarke"
<jclarke@nospam.invalid> wrote:

>Folk wrote:
>
>> On Tue, 29 Jun 2004 05:46:26 GMT, redbrick@fastermail.com (Redbrick)
>> wrote:
>>
>>><snip>
>>
>> What are you saving those PCI slots for anyway? The only thing
>> currently occupying my five PCI slots is a sound card. In these days
>> of onboard NIC's, onboard RAID, etc., etc., there is less of a need
>> for PCI slots.
>
>It's not a "PCI slot", it's a "PCI _Express_" slot. There's a difference.

Since a PCI Express video card uses a completely different design
(16X) and form factor than a 'regular' PCI Express slot, what makes
you think that having two video slots will block a 'regular' slot?
Anonymous
a b U Graphics card
June 30, 2004 11:16:52 PM

Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

Folk wrote:

> On Tue, 29 Jun 2004 13:31:35 -0400, "J. Clarke"
> <jclarke@nospam.invalid> wrote:
>
>>Folk wrote:
>>
>>> On Tue, 29 Jun 2004 05:46:26 GMT, redbrick@fastermail.com (Redbrick)
>>> wrote:
>>>
>>>><snip>
>>>
>>> What are you saving those PCI slots for anyway? The only thing
>>> currently occupying my five PCI slots is a sound card. In these days
>>> of onboard NIC's, onboard RAID, etc., etc., there is less of a need
>>> for PCI slots.
>>
>>It's not a "PCI slot", it's a "PCI _Express_" slot. There's a difference.
>
> Since a PCI Express video card uses a completely different design
> (16X) and form factor than a 'regular' PCI Express slot, what makes
> you think that having two video slots will block a 'regular' slot?

No, a PCI Express video card does _not_ use "a completely different design
and form factor than a 'regular' PCI Express slot". Any PCI Express board
is _supposed_ to work in any PCI Express slot with the same number of lines
or higher. So a PCI Express 1X board is supposed to plug into that 16x
slot and work fine.

Further, Alienware says that they use all the available PCI Express lanes on
the Intel 7525 chipset in their dual processor design. The 7525 has 24 PCI
Express lines arranged as 1 16x and 1 8x that can be split into 2 4x. So
they don't even have a full 16x port for the second video board, let alone
any lanes left over for other devices.

--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
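The lane budget J. Clarke describes for the Intel 7525 (24 PCI Express lanes, arranged as one x16 link plus one x8 link that can be split into two x4 links) is easy to sanity-check in a few lines. A hypothetical sketch; `TOTAL_LANES` and `valid` are illustrative names, not anything from Intel's documentation:

```python
TOTAL_LANES = 24   # per the 7525 figures quoted above

def valid(slot_widths):
    """A slot layout is feasible only if its combined lane widths
    fit within the chipset's total lane budget."""
    return sum(slot_widths) <= TOTAL_LANES

assert valid([16, 8])        # one x16 link plus one x8 link
assert valid([16, 4, 4])     # the x8 link split into two x4 links
assert not valid([16, 16])   # a second full x16 slot exceeds 24 lanes
```

This is why, on that chipset, the second video board cannot get a full x16 link, let alone leave lanes over for other devices.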
July 1, 2004 4:10:40 PM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

"SLIisBACK" <SLIrocks@aol.com> wrote in message
news:SrWdnT9rj8hANELdRVn-hA@comcast.com...
> <snip>
> Nvidia SLI, SLI's back with a vengeance Jun 28, 2004, 07:30 AM
>

Smells like desperation to me. Seems like they can't keep up with ATI's
technology, so they're going for the brute-force approach. Ironically,
Nvidia criticized 3dfx for the same thing back in the late nineties.
Anonymous
a b U Graphics card
July 1, 2004 4:16:00 PM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

>
> Smells like desperation to me. Seems like they can't keep up ATI's
> technology so they're going for the brute force approach. Ironically,
> NVidia criticized 3dfx for the same thing back in the late nineties.
>
>

Desperation or not, I do not see anything wrong with this. If they have
this leverage, why not use it? ATI will find some technology, or else
try to trail as close behind as it can until they come up with
something. This is the cruel and brutal side of business. Technology
advances as rivals find new ways to be better than each other.

CapFusion,...
Anonymous
a b U Graphics card
July 1, 2004 4:38:03 PM

Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

On Wed, 30 Jun 2004 19:16:52 -0400, "J. Clarke"
<jclarke@nospam.invalid> wrote:


>> Since a PCI Express video card uses a completely different design
>> (16X) and form factor than a 'regular' PCI Express slot, what makes
>> you think that having two video slots will block a 'regular' slot?
>
>No, a PCI Express video card does _not_ use "a completely different design
>and form factor than a 'regular' PCI Express slot". Any PCI Express board
>is _supposed_ to work in any PCI Express slot with the same number of lines
>or higher. So a PCI Express 1X board is supposed to plug into that 16x
>slot and work fine.
>
>Further, Alienware says that they use all the available PCI Express lanes on
>the Intel 7525 chipset in their dual processor design. The 7525 has 24 PCI
>Express lines arranged as 1 16x and 1 8x that can be split into 2 4x. So
>they don't even have a full 16x port for the second video board, let alone
>any lanes left over for other devices.

OK, I've been wrong before, but look here:

http://www6.tomshardware.com/motherboard/20040301/alder...

That shows the difference between a 1X and a 16X PCI slot. Now most
mobos (discounting exotic designs like Alienware) are going to come
with a single 16X and one or more 1X slots. You're saying that a 1X
card will fit in a 16X slot, and that may be true, but I doubt anyone
will actually do that.

I haven't seen any board layouts yet, but it's certainly possible that
a board with two 16X slots to accommodate an SLI setup would have the
1X slots positioned far enough away from the dual 16X slots to make
the concept of "wasting a slot" disappear. Wouldn't that make sense
to you too?
Anonymous
July 1, 2004 4:38:04 PM

Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

"Folk" <Folk@folk.com> wrote in message
news:l3f8e0d03u2vo02rqnncn8f3umbldf13md@4ax.com...
>
> OK, I've been wrong before, but look here:
>
> http://www6.tomshardware.com/motherboard/20040301/alder...
>
> That shows the difference between a 1X and a 16X PCI slot. Now most
> mobos (discounting exotic designs like Alienware) are going to come
> with a single 16X and one or more 1X slots. You're saying that a 1X
> card will fit in a 16X slot, and that may be true, but I doubt anyone
> will actually do that.
>
> I haven't seen any board layouts yet, but it's certainly possible that
> a board with two 16X slots to accommodate an SLI setup would have the
> 1X slots positioned far enough away from the dual 16X slots to make
> the concept of "wasting a slot" disappear. Wouldn't that make sense
> to you too?
>

Both cards will each need a 16X PCI-E slot, plus the bridge -
http://graphics.tomshardware.com/graphic/20040628/index...

CapFusion,...
Anonymous
July 1, 2004 6:50:21 PM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

This is actually a very good technique. Dividing the screen into two
load-balanced halves means there's no redundant texture memory usage like
3dfx's scanline interleaving, and no mouse lag like ATi's alternate-frame
rendering. Wicked3D had something similar a couple of years ago, but the
immature drivers back then produced a black line between the two image
halves.
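The load-balancing idea described above can be sketched in a few lines: move the split scanline each frame so both cards finish at roughly the same time. This is only an illustration of the principle, not NVIDIA's actual driver logic; the function name, the smoothing `gain`, and the timing inputs are all my own assumptions.

```python
# Hypothetical sketch of dynamic load-balanced split-frame rendering.
# The top GPU renders rows [0, split), the bottom GPU rows [split, height).
# After each frame, the boundary moves toward the point where both halves
# would have taken equal time, damped to avoid oscillation.

def rebalance_split(split, height, t_top, t_bottom, gain=0.25):
    """Return the new boundary row given last frame's render times."""
    # Observed per-row cost for each half last frame.
    c_top = t_top / split
    c_bot = t_bottom / (height - split)
    # Boundary where both halves would take equal time at those costs.
    ideal = height * c_bot / (c_top + c_bot)
    # Move only part of the way there, then clamp to a valid row.
    new_split = split + gain * (ideal - split)
    return max(1, min(height - 1, round(new_split)))

# Balanced frame: the split stays put.
print(rebalance_split(540, 1080, 10.0, 10.0))  # -> 540
# Top half was twice as slow: the top region shrinks.
print(rebalance_split(540, 1080, 20.0, 10.0))  # -> 495
```

Because each half renders a contiguous block of rows, neither card needs the other half's textures resident, which is the advantage over scanline interleaving noted above.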

The real card to watch for is the 6800GT, which may actually be affordable
in SLI config.

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."


"SLIisBACK" <SLIrocks@aol.com> wrote in message
news:SrWdnT9rj8hANELdRVn-hA@comcast.com...
> http://media.hardwareanalysis.com/articles/large/11206....
> http://media.hardwareanalysis.com/articles/large/11208....
> http://media.hardwareanalysis.com/articles/large/11207....
>
> http://www.hardwareanalysis.com/content/article/1728/
>
Anonymous
July 1, 2004 7:24:01 PM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

On Thu, 01 Jul 2004 14:50:21 GMT, "First of One" <daxinfx@yahoo.com>
wrote:

>This is actually a very good technique. Dividing the screen into two
>load-balanced halves means there's no redundant texture memory usage like
>3dfx's scanline interleaving, and no mouse lag like ATi's alternate-frame
>rendering. Wicked3D had something similar a couple of years ago, but the
>immature drivers back then produced a black line between the two image
>halves.
>
>The real card to watch for is the 6800GT, which may actually be affordable
>in SLI config.
>

Agreed !!

Once you have a PCI-Express chipset that will support 2 or more
PCI-Express sockets, that is. The nForce4 chipset, still in design at
nVidia, is very likely to do just that. And it may for the first time make
me think very seriously about leaving the Intel camp. Athlon 64 FX53
939-pin -- unlocked overclock, plus nForce4, plus dual 6800GT
PCI-Express in SLI configuration; the thought makes me really drool
(and my pocket-book wilt).

As far as the enthusiast community goes, Intel has really lost its
way in the past year. Besides power-hungry Prescott, the latest
Intel misstep is to DELIBERATELY build a 10% overclock limit
into the 915/925 chipsets. Intel has again become arrogant - they
periodically do that until the threat of real competition beats them
over the head.

John Lewis


>--
>"War is the continuation of politics by other means.
>It can therefore be said that politics is war without
>bloodshed while war is politics with bloodshed."
>
>
>"SLIisBACK" <SLIrocks@aol.com> wrote in message
>news:SrWdnT9rj8hANELdRVn-hA@comcast.com...
>> http://media.hardwareanalysis.com/articles/large/11206....
>> http://media.hardwareanalysis.com/articles/large/11208....
>> http://media.hardwareanalysis.com/articles/large/11207....
>>
>> http://www.hardwareanalysis.com/content/article/1728/
>>
>
>
Anonymous
July 1, 2004 10:57:58 PM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

CapFusion wrote:

>>
>> Smells like desperation to me. Seems like they can't keep up ATI's
>> technology so they're going for the brute force approach. Ironically,
>> NVidia criticized 3dfx for the same thing back in the late nineties.
>>
>>
>
> Desperation or not, I do not see anything wrong with this. If they have
> this leverage, why not use it? ATi will find some technology of its own, or
> else try to trail as closely as it can until it comes up with something.
> That is the cruel and brutal side of business. Technology advances as
> rivals find new ways to outdo each other.

Or the market will look at it and snore.

> CapFusion,...

--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
Anonymous
July 1, 2004 10:57:59 PM

Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

"J. Clarke" <jclarke@nospam.invalid> wrote in message
news:cc25240t9f@news2.newsguy.com...
>
> Or the market will look at it and snore.
>

It may or may not. It really depends on users and how they deal with
it. If this technology ends up with the same image as Intel's famous RDRAM
[PC800 etc.], then it may become a flop, with users either totally rejecting
it or embracing it very slowly. During this time, maybe ATi will have
something new to combat this SLI from nVIDIA. Only time will tell.

CapFusion,...
July 1, 2004 11:07:30 PM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

OMG. *drooooooll* What a beast!!!
Anonymous
July 2, 2004 12:44:57 AM

Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

Folk wrote:

> On Wed, 30 Jun 2004 19:16:52 -0400, "J. Clarke"
> <jclarke@nospam.invalid> wrote:
>
>
>>> Since a PCI Express video card uses a completely different design
>>> (16X) and form factor than a 'regular' PCI Express slot, what makes
>>> you think that having two video slots will block a 'regular' slot?
>>
>>No, a PCI Express video card does _not_ use "a completely different design
>>and form factor than a 'regular' PCI Express slot". Any PCI Express board
>>is _supposed_ to work in any PCI Express slot with the same number of lines
>>or higher. So a PCI Express 1X board is supposed to plug into that 16x
>>slot and work fine.
>>
>>Further, Alienware says that they use all the available PCI Express lanes on
>>the Intel 7525 chipset in their dual processor design. The 7525 has 24 PCI
>>Express lines arranged as 1 16x and 1 8x that can be split into 2 4x. So
>>they don't even have a full 16x port for the second video board, let alone
>>any lanes left over for other devices.
>
> OK, I've been wrong before, but look here:
>
> http://www6.tomshardware.com/motherboard/20040301/alder...
>
> That shows the difference between a 1X and a 16X PCI slot.

What of it? Look closely and you'll see that there is nothing that prevents
a 1x board from being plugged into a 16x slot. The spacing of the contacts
is the same. There is one crosspiece in the slot and it is the same
distance from the left end on both. If you plug a 1x (or 4x or 8x) board
into a 16x slot, it will fit fine, it just won't use all the contacts,
anymore than a typical AGP board uses all the contacts in an AGP Pro slot.

> Now most
> mobos (discounting exotic designs like Alienware) are going to come
> with a single 16X and one or more 1X slots. You're saying that a 1X
> card will fit in a 16X slot, and that may be true, but I doubt anyone
> will actually do that.

Of course they will. Consider some future machine that has all 16x slots.
Or somebody who gets the Alienware machine and then decides later that he
wants to use it as a server and pulls one of the video boards to use in
another machine, leaving its slot available for his gigabit board.

> I haven't seen any board layouts yet, but it's certainly possible that
> a board with two 16X slots to accommodate an SLI setup would have the
> 1X slots positioned far enough away from the dual 16X slots to make
> the concept of "wasting a slot" disappear. Wouldn't that make sense
> to you too?

I'm sorry, but if you have three slots and two of them are taken up by video
then there is only one additional slot available. The fact that it is
available does not mean that the slot with the second video board is also
available, thus the video board is tying up a PCI Express slot that could
be used for other purposes.

Regardless of any of this, the Alienware machine is _not_ going to have any
1X slots. They state that they are using the Intel 7525 chipset. The docs
for the 7525 are available on the Intel site (go to "workstation and
server" then "chipsets"). If you read them you will find that the 7525
does not support 1x slots, it supports a single 16x and either a single 8x
or two 4x slots. Alienware states that they are using all the available
lanes for video, which means that they are using the 8x slot for the second
video board and there will be no 4x slots, let alone 1x.
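The lane arithmetic in the paragraphs above can be checked mechanically. This little sketch only mirrors the figures quoted in the thread (an Intel E7525 budget of 24 lanes, arranged as x16 + x8 or x16 + two x4); the helper names are mine, not anything from Intel's documentation.

```python
# PCI Express lane bookkeeping as discussed in the thread.

E7525_LANES = 24  # the chipset figure quoted above: one x16 plus one x8

def fits_budget(slot_widths, budget=E7525_LANES):
    """True if the listed slot widths fit within the chipset's lane budget."""
    return sum(slot_widths) <= budget

def card_works_in_slot(card_lanes, slot_lanes):
    """Per the compatibility rule cited above: a card is supposed to work
    in any slot at least as wide as itself (a x1 card in a x16 slot is fine,
    but a x16 card won't go into a x8 slot)."""
    return card_lanes <= slot_lanes

# The E7525 layouts mentioned in the post:
print(fits_budget([16, 8]))       # x16 + x8 (the dual-video case) -> True
print(fits_budget([16, 4, 4]))    # x16 + two x4 -> True
print(fits_budget([16, 16]))      # two full x16 slots don't fit -> False
```

Which is the crux of the argument: with both slots consumed by video boards, no lanes remain for any other PCI Express device.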

--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
Anonymous
July 2, 2004 7:03:31 AM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

On Thu, 1 Jul 2004 12:10:40 -0400, "Tim" <argybargy@hotmail.com>
wrote:

>
>"SLIisBACK" <SLIrocks@aol.com> wrote in message
>news:SrWdnT9rj8hANELdRVn-hA@comcast.com...
>> http://media.hardwareanalysis.com/articles/large/11206....
>> http://media.hardwareanalysis.com/articles/large/11208....
>> http://media.hardwareanalysis.com/articles/large/11207....
>>
>> http://www.hardwareanalysis.com/content/article/1728/
>>
>>
>____________________________________________________________________________
>> ___
>> Nvidia SLI, SLI's back with a vengeance Jun 28, 2004, 07:30 AM
>>
>
>Smells like desperation to me. Seems like they can't keep up with ATI's
>technology

In what way ? Please explain ?

I thought that it was nVidia that had overcome the significant
intricacies of a DX9.0c implementation, but maybe I am
reading the wrong technical literature.

> so they're going for the brute force approach.

Not quite. What nVidia is doing is a simple microcosm, for
desktop computers and graphics applications, of the shared
processing approach used world-wide by number-
crunching super-computers. nVidia has had the foresight
to implement the sharing mechanism in their current
silicon. Not exactly a new concept. In a similar domain a
few years ago, I was involved in the design of chips for
time-simultaneous processing of the 3 channels of
component video (Y, Cr, Cb), each with a link-port for
accurate synchronization and to coordinate task-
sharing with the other two chips.

John Lewis

> Ironically,
>NVidia criticized 3dfx for the same thing back in the late nineties.
>

Yes, solely for marketing reasons, never technical.

John Lewis


>
>
Anonymous
July 2, 2004 9:40:38 AM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

Unlike the 3dfx VSA-100, nowadays a single 6800 Ultra is competitive with
the X800XT, so this SLI thing is really just a matter of image and bragging
rights. Seriously, 0.5 GB video RAM, 32 textures in a single cycle, four
expansion slots...

The most thumpingly expensive setup is Quadro SLI. Total cost is at least
$5000.

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."


> Smells like desperation to me. Seems like they can't keep up with ATI's
> technology, so they're going for the brute force approach. Ironically,
> NVidia criticized 3dfx for the same thing back in the late nineties.
Anonymous
July 2, 2004 2:11:20 PM

Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

On Thu, 01 Jul 2004 20:44:57 -0400, "J. Clarke"
<jclarke@nospam.invalid> wrote:


>I'm sorry, but if you have three slots and two of them are taken up by video
>then there is only one additional slot available. The fact that it is
>available does not mean that the slot with the second video board is also
>available, thus the video board is tying up a PCI Express slot that could
>be used for other purposes.

I think you just like to argue.

Your initial complaint was about 'wasting' a PCI slot with an SLI
configuration. You have yet to see a mobo that will accommodate a
dual vid card setup, yet you seem convinced that when one does
surface, the board designers will stupidly put the other PCI slots so
close to those two that they will be unusable.

>Regardless of any of this, the Alienware machine is _not_ going to have any
>1X slots. They state that they are using the Intel 7525 chipset. The docs
>for the 7525 are available on the Intel site (go to "workstation and
>server" then "chipsets"). If you read them you will find that the 7525
>does not support 1x slots, it supports a single 16x and either a single 8x
>or two 4x slots. Alienware states that they are using all the available
>lanes for video, which means that they are using the 8x slot for the second
>video board and there will be no 4x slots, let alone 1x.

And why do you keep mentioning server chipsets and Alienware rigs?
That's not what the lion's share of this group's readers will be
interested in. Most enthusiasts will be using either the 925X or 915
chipsets, not some exotic Alienware or server class setup.

But whatever.... you seem to be one of those persons that is "always
right" and loves to argue about it. It just seemed odd to me that you
would immediately dismiss SLI as a blocker of PCI slots, when you
haven't even seen a real board design that accommodates SLI. Are you
psychic, or just a troll?
Anonymous
July 2, 2004 9:00:18 PM

Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

Folk wrote:

> On Thu, 01 Jul 2004 20:44:57 -0400, "J. Clarke"
> <jclarke@nospam.invalid> wrote:
>
>
>>I'm sorry, but if you have three slots and two of them are taken up by
>>video
>>then there is only one additional slot available. The fact that it is
>>available does not mean that the slot with the second video board is also
>>available, thus the video board is tying up a PCI Express slot that could
>>be used for other purposes.
>
> I think you just like to argue.

Pot, kettle.

> Your initial complaint was about 'wasting' a PCI slot with an SLI
> configuration. You have yet to see a mobo that will accommodate a
> dual vid card setup, yet you seem convinced that when one does
> surface, the board designers will stupidly put the other PCI slots so
> close to those two that they will be unusable.

I'm sorry, but I have at no point expressed concern that any slot will be
any particular distance from any other slot. The Alienware machine has two
and only two PCI Express slots. With the dual video there is nowhere to
plug in a third PCI device. It is not a matter of the third slot being "so
close to those two", it is that THERE IS NO THIRD SLOT.
>
>>Regardless of any of this, the Alienware machine is _not_ going to have
>>any
>>1X slots. They state that they are using the Intel 7525 chipset. The
>>docs for the 7525 are available on the Intel site (go to "workstation and
>>server" then "chipsets"). If you read them you will find that the 7525
>>does not support 1x slots, it supports a single 16x and either a single 8x
>>or two 4x slots. Alienware states that they are using all the available
>>lanes for video, which means that they are using the 8x slot for the
>>second video board and there will be no 4x slots, let alone 1x.
>
> And why do you keep mentioning server chipsets and Alienware rigs?

Because that looks to be the _only_ machine coming in the near future that
will have a provision for PCI Express dual video?

> That's not what the lion's share of this group's readers will be
> interested in. Most enthusiasts will be using either the 925X or 915
> chipsets, not some exotic Alienware or server class setup.

So let's see, you're going to plug your super hotrod PCI Express 6800 board
into a 4x slot to use SLI? And how well do you think that that is going to
work? Assuming of course that someone actually _makes_ a 915 or 925X
machine that has the 4 PCI Express lanes on the ICH6 configured as a single
4X slot instead of 4 1x slots. And if the machine _is_ configured with the
4x slot then it will have two and only two PCI Express slots both of which
have video boards plugged into them and there will be no way to attach a
third PCI Express device.

Now why, precisely, do you think that Alienware chose to use a workstation
chipset instead of a desktop chipset on their machine? Hmmm? Maybe a 4x
second slot that is the largest possible on the 925X and 915 doesn't have
enough bandwidth to make the dual video worth the effort?

> But whatever.... you seem to be one of those persons that is "always
> right" and loves to argue about it.

I'm quite happy to admit that I am wrong when I am. But I haven't seen you
provide any information that would lead me to that belief.

> It just seemed odd to me that you
> would immediately dismiss SLI as a blocker of PCI slots, when you
> haven't even seen a real board design that accommodates SLI. Are you
> psychic, or just a troll?

Who said anything about "a blocker of PCI slots"? First, nobody said
anything about PCI slots except YOU. It was PCI EXPRESS (got that,
EXPRESS, not the same as PCI) that was the concern, and the concern was not
that some slot would be "blocked" because it was "too close" to some other
slot, it was that the machines have damned few PCI Express slots to begin
with and the second video board is physically inserted into one of them.

As to being "psychic or just a troll", I'm neither. Just someone who has
read the spec sheets for the chipsets and understands their implications.

You seem determined to misunderstand the issue.

To simplify, the chipsets you favor are not capable of providing two 16x
slots. The _best_ they can do is a 16x and a 4x, and in that configuration
those are the _only_ PCI Express slots. If you accept that the utility of
an SLI machine with one video board in a 16x slot and one in a 1x slot is
virtually nonexistent, then you have to agree that if SLI is going to be
used with a 925X or a 915, then there will be exactly two PCI Express
slots, a 16x and a 4x, with a video board plugged into each slot, and with
NO other PCI Express slots anywhere in the machine. Do you begin to
understand the issue?


--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
Anonymous
July 3, 2004 4:53:22 AM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

On Fri, 02 Jul 2004 05:40:38 GMT, "First of One" <daxinfx@yahoo.com>
wrote:

>Unlike the 3dfx VSA-100, nowadays a single 6800 Ultra is competitive with
>the X800XT, so this SLI thing is really just a matter of image and bragging
>rights. Seriously, 0.5 GB video RAM, 32 textures in a single cycle, four
>expansion slots...
>
>The most thumpingly expensive setup is Quadro SLI. Total cost is at least
>$5000.
>

Yep.

Pros pay $1000 where consumers pay $100 for almost the same
thing nowadays in the technology markets.

I do freelance video work and ensure maximum quality for
my capital-equipment-buck by very judiciously mixing pro
and 'high-end-domestic" tools and hardware.

John Lewis

>--
>"War is the continuation of politics by other means.
>It can therefore be said that politics is war without
>bloodshed while war is politics with bloodshed."
>
>
>> Smells like desperation to me. Seems like they can't keep up with ATI's
>> technology, so they're going for the brute force approach. Ironically,
>> NVidia criticized 3dfx for the same thing back in the late nineties.
>
>
Anonymous
July 3, 2004 1:06:43 PM

Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

On Fri, 02 Jul 2004 17:00:18 -0400, "J. Clarke"
<jclarke@nospam.invalid> wrote:

>Do you begin to understand the issue?

Heh. Finally.

I'm so used to hearing people complain about vid cards with overly
large fan solutions *blocking* PCI slots, that's what my brain heard
when you complained about a *lack* of PCI (e) slots. It finally took
you posting several thousand words of explanation for me to see the
distinction.

I'm a dumbass sometimes... just move along. <g>
July 5, 2004 9:45:30 PM

Archived from groups: alt.comp.periphs.videocards.nvidia,alt.comp.periphs.videocards.ati,comp.sys.ibm.pc.hardware.video (More info?)

"John Lewis" <john.dsl@verizon.net> wrote in message
> >>
> >
> >Smells like desperation to me. Seems like they can't keep up with ATI's
> >technology
>
> In what way ? Please explain ?
>

It just seems to me that they can't compete with ATI in their price range, so
they're creating a "halo" product to garner prestige for the company. I'm
sure they don't expect to sell many; it's more for promoting the brand
name anyway.

Video graphics technology is advancing so rapidly that I suspect this
dual-card solution will be integrated into a single-card, realistically
priced product long before this version reaches the end of its own life
span.

> >NVIDIA criticized 3dfx for the same thing back in the late nineties.
> >
>
> Yes, solely for marketing reasons, never technical.
>

Pretty much the same thing, do you think? NVidia's marketing department
seemed to try to convince us that 3dfx was inferior for technical reasons.
Also, I have found NVidia's own product descriptions so filled with
marketing jargon and double-talk that I need a third-party source just to
understand what they're talking about.