
SLI - The innovation of 3DFX

September 14, 2012 2:28:31 PM

3DFX was an amazing company, ahead of their time. SLI in 1998 was a beautiful thing, but it did not mean "Scalable Link Interface." Today SLI means double the quality, not double the power: effectively double the AA, AF, etc. RAM is not combined, and processing power is not combined to double performance. It's more of a novelty; in fact, I tend to get better FPS than most SLI setups with only one card.

In 1998, back in the days of my dual Voodoo 1's, SLI stood for "Scan-Line Interleave," which did exactly what the name suggests: one card rendered the even scanlines, the other rendered the odd ones, and the two sets were interleaved into a single image. Combined with today's Vsync (or perhaps an enhanced SLI Vsync), this would rock! SLI back then allowed for higher resolutions, for instance 1152x768 instead of the average 800x600, but there was a problem: the average gamer, and the average computer, were not ready for it. It required a stronger power supply, there weren't many expansion slots at the time, and very few people were custom-building their PCs. We needed room for things like sound cards and modems too, so running two or more cards often meant giving up a device that was important (internet access, sound, etc.), or moving backwards to an ISA sound card instead of PCI, because nothing was integrated into the motherboard back then. Managing IRQs was tricky enough with one card in Windows 95/98, let alone two.

Software support was poor at the start, as it is with any new tech, but back then snags like that weren't gracefully handled and fixed over time; they were abandoned. Development never had a chance to mature and iron out the kinks before AGP arrived over the next two years. But when it worked, it worked well! Nvidia then acquired 3DFX around 2001 and never built on the tech 3DFX began... after all, why would they future-proof us by letting us run double the specs when they can force us to upgrade every year?
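To make the scan-line split concrete, here is a minimal C sketch of the idea under stated assumptions: render_scanline() and the frame layout are hypothetical stand-ins, not any real 3DFX driver API. Each card owns every other scanline, so the per-card fill load is roughly halved.

/* Hypothetical sketch of 1998-style Scan-Line Interleave:
   card 0 renders the even scanlines, card 1 the odd ones, and the
   output stage interleaves them back into one frame. */
#include <stdio.h>

#define WIDTH  800
#define HEIGHT 600

/* Stand-in for one card rasterizing a single scanline. */
static void render_scanline(int card, int y, unsigned char *row)
{
    for (int x = 0; x < WIDTH; x++)
        row[x] = (unsigned char)((x + y + card) & 0xFF); /* dummy pixel data */
}

int main(void)
{
    static unsigned char frame[HEIGHT][WIDTH];

    /* Scanline y goes to card (y % 2), halving each card's workload. */
    for (int y = 0; y < HEIGHT; y++)
        render_scanline(y % 2, y, frame[y]);

    printf("frame assembled from two cards, %d scanlines each\n", HEIGHT / 2);
    return 0;
}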


September 14, 2012 3:11:11 PM

Just kidding. SLI has come a long way.

mmmm...


September 14, 2012 3:19:17 PM

It does look good though, doesn't it? ;)  Nah, but man, imagine if we were running literally double the power. It can be done, but it's not good for their marketing model. They don't want us gaming on two GTX 680's for the next 5-10 years.
September 14, 2012 3:24:02 PM

Clearly you have had trouble with SLI on your system. A single video card is not going to outperform two cards comparable to that single card, unless the CPU is holding things back or something is set up wrong. There are very few games these days that don't support SLI.

As to your last comment:
"Development never had a chance to mature and iron out the kinks before AGP arrived over the next two years. But when it worked, it worked well! Nvidia then acquired 3DFX around 2001 and never built on the tech 3DFX began... after all, why would they future-proof us by letting us run double the specs when they can force us to upgrade every year?"

The tech they use now works even better than the original, and they still use some of the old tech too, depending on the game. With the advent of triple buffering, splitting the screen in half is rarely the best approach; having each card render every other frame is usually better. But old OpenGL programs will still do it as they used to.
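For contrast with the scanline split, here is a minimal sketch of the every-other-frame approach described above (alternate frame rendering). render_frame() is a hypothetical stand-in, not a real driver call:

/* Minimal sketch of Alternate Frame Rendering (AFR): with buffering,
   card (frame % 2) renders each whole frame instead of splitting one
   frame between cards. */
#include <stdio.h>

static void render_frame(int card, int frame)
{
    /* In a real driver, the chosen GPU would rasterize the full frame here. */
    printf("frame %d rendered on card %d\n", frame, card);
}

int main(void)
{
    for (int frame = 0; frame < 6; frame++) {
        int card = frame % 2;       /* AFR: alternate whole frames */
        render_frame(card, frame);  /* a buffer flip would present it */
    }
    return 0;
}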

As for the future-proofing comment, it has NOTHING to do with the graphics card company that no card is future-proofed. They could increase performance 1000 times this year, and in 2-3 years you'd still have to upgrade, because game developers will always push the limits of the systems available today. When new tech comes out, developers take advantage of it, leaving old systems behind.
September 14, 2012 4:10:25 PM

^Or because Crytek decided that a tessellated ocean underneath Crysis 2 looked really badass...
September 14, 2012 4:12:29 PM

bystander said:
Clearly you have had trouble with SLI on your system. A single video card is not going to outperform two cards comparable to that single card, unless the CPU is holding things back or something is set up wrong. There are very few games these days that don't support SLI.

As to your last comment:
"Development never had a chance to mature and iron out the kinks before AGP arrived over the next two years. But when it worked, it worked well! Nvidia then acquired 3DFX around 2001 and never built on the tech 3DFX began... after all, why would they future-proof us by letting us run double the specs when they can force us to upgrade every year?"

The tech they use now works even better than the original, and they still use some of the old tech too, depending on the game. With the advent of triple buffering, splitting the screen in half is rarely the best approach; having each card render every other frame is usually better. But old OpenGL programs will still do it as they used to.

As for the future-proofing comment, it has NOTHING to do with the graphics card company that no card is future-proofed. They could increase performance 1000 times this year, and in 2-3 years you'd still have to upgrade, because game developers will always push the limits of the systems available today. When new tech comes out, developers take advantage of it, leaving old systems behind.



To be fair, I'm not talking about my system with one card vs. my system with two. There is a small but notable gain with two cards vs. one, depending on the title. I'm talking about how SLI gamers frequently post their max settings and FPS, and half the time I do better on a single-card system. The gain isn't as justifiable as it was back in the day. SLI doesn't equate to running double the resolution, and you can't use double the VRAM, etc. It's like opting for a pure mirrored RAID setup instead of RAID striping: striping doubled the HDD performance, while mirroring was best for backup purposes. Not sure how RAID is these days; that's how it was back when I ran it. I know some changes have been made.
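To illustrate the RAID analogy, a tiny C sketch under the same assumptions (two disks, a hypothetical block layout, not a real RAID implementation):

/* RAID 0 (striping) alternates blocks across two disks for roughly
   double throughput; RAID 1 (mirroring) writes every block to both
   disks for redundancy instead of speed. */
#include <stdio.h>

#define BLOCKS 4

int main(void)
{
    for (int b = 0; b < BLOCKS; b++) {
        printf("block %d: RAID 0 -> disk %d only; RAID 1 -> disks 0 and 1\n",
               b, b % 2);
    }
    return 0;
}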

True about the developers, but it's no secret that they want you to upgrade every year. The answer lies in their drivers and how they lock out certain features on older cards and unlock them for the latest high-end cards, even though there's no reason you couldn't achieve the same on last generation's hardware. Maybe they don't do as much of that now, but I recall, not 4-5 years ago, that you had to have the latest card just to run 16x AF or 8x MSAA, and people had to run hacks to unlock the higher options on previous-gen cards, even though nothing stops you from running those settings on an older card by trading off another effect or shadow setting.
September 14, 2012 4:17:59 PM

Long live VOODOO, VOODOO II, and VOODOO III. It was different back then: you needed a dedicated video card, and the 3DFX card was an add-on card. You could not run two dedicated video cards or two Voodoo cards.

September 14, 2012 4:35:53 PM

geekapproved said:
Long live VOODOO, VOODOO II, and VOODOO III. It was different back then: you needed a dedicated video card, and the 3DFX card was an add-on card. You could not run two dedicated video cards or two Voodoo cards.



I agree with the "Long live Voodoo"! Too bad they sold out and ditched their fans for money. ;)

Sure you could; there were bugs with it, but just like today's CrossFire setups, you had a ribbon cable that ran between the two Voodoos, with one card designated as the slave.
September 14, 2012 4:40:48 PM

You had to have a video card to use a Voodoo card; you couldn't just use two Voodoo cards. I had my Voodoo connected to an Nvidia TNT 16MB 3D card. Then I had the TNT2 32MB 3D card with a Voodoo II attached. Unreal Tournament BEAST!
September 14, 2012 4:53:44 PM

bernardblack said:
To be fair, I'm not talking about my system with one card vs. my system with two. There is a small but notable gain with two cards vs. one, depending on the title. I'm talking about how SLI gamers frequently post their max settings and FPS, and half the time I do better on a single-card system. The gain isn't as justifiable as it was back in the day. SLI doesn't equate to running double the resolution, and you can't use double the VRAM, etc. It's like opting for a pure mirrored RAID setup instead of RAID striping: striping doubled the HDD performance, while mirroring was best for backup purposes. Not sure how RAID is these days; that's how it was back when I ran it. I know some changes have been made.

True about the developers, but it's no secret that they want you to upgrade every year. The answer lies in their drivers and how they lock out certain features on older cards and unlock them for the latest high-end cards, even though there's no reason you couldn't achieve the same on last generation's hardware. Maybe they don't do as much of that now, but I recall, not 4-5 years ago, that you had to have the latest card just to run 16x AF or 8x MSAA, and people had to run hacks to unlock the higher options on previous-gen cards, even though nothing stops you from running those settings on an older card by trading off another effect or shadow setting.


To me it looks like almost double performance...

September 14, 2012 6:03:26 PM

bernardblack said:
To be fair, I'm not talking about my system with one card vs. my system with two. There is a small but notable gain with two cards vs. one, depending on the title. I'm talking about how SLI gamers frequently post their max settings and FPS, and half the time I do better on a single-card system. The gain isn't as justifiable as it was back in the day. SLI doesn't equate to running double the resolution, and you can't use double the VRAM, etc. It's like opting for a pure mirrored RAID setup instead of RAID striping: striping doubled the HDD performance, while mirroring was best for backup purposes. Not sure how RAID is these days; that's how it was back when I ran it. I know some changes have been made.

True about the developers, but it's no secret that they want you to upgrade every year. The answer lies in their drivers and how they lock out certain features on older cards and unlock them for the latest high-end cards, even though there's no reason you couldn't achieve the same on last generation's hardware. Maybe they don't do as much of that now, but I recall, not 4-5 years ago, that you had to have the latest card just to run 16x AF or 8x MSAA, and people had to run hacks to unlock the higher options on previous-gen cards, even though nothing stops you from running those settings on an older card by trading off another effect or shadow setting.


For that to be true, you have to have a problem with your system, or you have a CPU bottleneck.
September 14, 2012 7:28:56 PM

I remember having two Diamond Monster II's (12MB) OC'd to 95 or 100MHz. Those cards were great, and gaming at "high resolution" (1024x768 @ 16-bit color) was just awesome. I don't think there was ever a time when a company was so far ahead of the competition that there wasn't actually any real competition.

Then everyone started hating on Glide and moving to OpenGL and DX. Then Nvidia came out with the TNT, and the downfall started.
September 14, 2012 7:45:42 PM

I had my Voodoo card for 3D and a Lightspeed 128 to do the 2D rendering. That was the top-of-the-line setup for computer gaming. The 2D card had all of 1.25 megs of RAM, and I think the Voodoo card had 4 megs, or something equally uber. Games were so smoooooth :o
September 21, 2012 1:46:13 AM

ish416 said:
I remember having two Diamond Monster II's (12MB) OC'd to 95 or 100MHz. Those cards were great, and gaming at "high resolution" (1024x768 @ 16-bit color) was just awesome. I don't think there was ever a time when a company was so far ahead of the competition that there wasn't actually any real competition.

Then everyone started hating on Glide and moving to OpenGL and DX. Then Nvidia came out with the TNT, and the downfall started.


Yes, I believe I had the 8MB Diamond Monster back in the AVP 1 days (Alien Versus Predator 1). I remember being excited running that game even at 800x600. Then the Voodoo 1 days allowed me to run that same game at something like 1024x768, and even 1152x768.
September 21, 2012 1:49:41 AM

...but I remember Unreal I (best Unreal game ever, best storyline and atmosphere). I don't recall which card I was running, I think it was my Voodoo 2, but walking through that busted-up prison ship, with the dangling wire sparking near me and reflecting off the hallway walls and the puddle below it... it felt so real back then.