
New Voodoo3 3000 driver!

Anonymous
November 12, 2000 11:28:48 PM

OK, several weeks ago I downloaded 3dfx's then-new V3 driver 1.05.00. Anyway, this new driver had FXT1, or at least the option to enable it. I thought that was kind of odd, so I, along with several other THF members, emailed 3dfx with the obvious question: why?

Well, it seems it wasn't supposed to be in that driver. Now I've just finished installing 3dfx's new V3 driver 1.06.00. They took the liberty of removing that option, but they have also included a new feature. This is what now comes with the Voodoo3 driver:

Depth Precision - This feature optimizes the precision used for depth calculations in 16-bit modes to improve performance, especially at higher resolutions. This optimization could also produce some display artifacts or visual defects in some programs.

Disable - Select this option to use normal depth precision.
Fast - This setting provides optimized depth precision, and may produce some visual defects in some programs.
Faster - This setting provides maximum depth precision optimization for faster performance, but it may produce more visual defects.
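
To picture the trade-off, here's a rough sketch of what trading depth-buffer bits for speed does. This is hypothetical Python for illustration only, not the actual 3dfx driver logic:

# Hypothetical illustration of the Depth Precision trade-off -- NOT the
# real driver code, just the general idea: fewer effective depth bits
# are faster, but nearby surfaces can collapse into the same depth value.

def quantize_depth(z, bits):
    """Map a depth value z in [0.0, 1.0) to an integer depth with `bits` bits."""
    return int(z * (1 << bits))

near_surface = 0.50000   # two surfaces that sit very close together
far_surface  = 0.50002

for mode, bits in [("Disable (full 16-bit)", 16),
                   ("Fast (reduced)", 14),
                   ("Faster (most reduced)", 12)]:
    a = quantize_depth(near_surface, bits)
    b = quantize_depth(far_surface, bits)
    # If both land in the same bucket, the card can't tell which surface
    # is in front -> the "display artifacts" the release notes warn about.
    print(f"{mode}: {a} vs {b} -> {'ARTIFACT' if a == b else 'ok'}")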

Anyone care to comment on this new option?


January 3, 2007 7:27:12 PM

Who's 3DFX? ;-)
January 3, 2007 7:40:21 PM

That post was made in November of 2000, six days before you were even a member of the forum. :) 
January 3, 2007 7:45:55 PM

Bah, I still have a Voodoo3 3000. I was hoping it was a driver that would let me try Oblivion at max settings! :lol: 
January 4, 2007 3:36:13 AM

Quote:
Bah, I still have a Voodoo3 3000. I was hoping it was a driver that would let me try Oblivion at max settings! :lol: 

Actually, it will. It pwns G80 and R600 with no problems. ;) 
January 4, 2007 3:58:11 AM

Hmm... it should pwn the X1950XTX... but the G80 and R600 are slightly out of its league. That's why the monstrosity known as the Voodoo5 6000 was born.
January 4, 2007 4:06:53 AM

Man, I loved the Voodoo3. I still have two Voodoo3 3500 TV cards. 16 MB of video RAM (or frame buffer); I'm still drooling. It kicked ass in NFS (4 and 5) and Half-Life; heck, it even got decent performance and detail in Return to Castle Wolfenstein, especially compared to the 8 MB nVidia TNT I had in another computer.
January 4, 2007 4:14:42 AM

I've decided I'm going to buy one card from each 3dfx generation off eBay. I'm already bidding on a Voodoo1 and a Voodoo2; then I'll move on to the 3000 and 5000 series.
January 4, 2007 5:20:24 AM

Quote:
3DFX!

http://www.youtube.com/watch?v=fyVOCV8QeLo

http://www.youtube.com/watch?v=NrERPOBMO04

http://www.youtube.com/watch?v=JdvhNnM850w

Yeah, run Oblivion at full settings on a Voodoo? I don't think so :-D Maybe at 0.01 fps.


LMAO. Those are classic. Too bad 3dfx died... it really is. ATI and nVidia are almost completely devoid of any humor.

Woooooooooaaaaaaaaahh, this thread is old. Yeah, I really liked those commercials, but I wasn't technologically savvy when 3dfx was around, so I guess I missed out =/
January 4, 2007 3:33:52 PM

Well too bad, because 3dfx made some really killer products in their day. I liked their idea of including multiple GPUs on one card (um, yeah, SLI or 4 GPUs on a single card? What do you think?).
January 4, 2007 4:12:23 PM



Yeah, kind of scary though.
January 4, 2007 4:36:45 PM

Actually, nVidia bought 3dfx; that's why they can use SLI as the name of their multi-GPU configuration (well, it's based on the 3dfx design anyway, which might explain why it works better than Xfire).

Btw, nVidia did do SLI on a single card... the 7950.
January 4, 2007 5:00:24 PM

This thread is old, lol. The only Voodoo I owned was a Voodoo2.

It really helped my Pentium 100 running NFS 2 SE :D 
January 4, 2007 5:18:10 PM

Good riddance to 3dfx. I bought a card a week before they went under; they were good for a brief time in the 20th century, LOL.
January 4, 2007 5:31:35 PM

Had to do it, didn't you...

:) 
January 4, 2007 5:38:23 PM

Quote:
Btw, nVidia did do SLI on a single card... the 7950.


Yeah, and it sucks... I bought one and I want to shoot myself in the head. I honestly regret not going with 7900 GTs or 7950 GTs in SLI.
January 4, 2007 6:06:32 PM

I still have a Voodoo3 3500 somewhere, with the huge adapter for it... blah.
January 4, 2007 6:53:55 PM

I wonder how a Voodoo5 6000 would compare to my PCI GeForce FX 5500. I'd be pissed if it was like three years ahead of its time.
January 4, 2007 7:09:37 PM

Um, the 7950 GX2 doesn't really count, because it pretty much IS two cards.

BTW, I have a 3500 and it's still working (gave it to a friend for his budget rig). That thing was a tank. :D 
January 4, 2007 7:24:00 PM

Quote:

How would that fit in any computer case? :S And what is that waste of board at the end that looks like it has nothing on it? :S Weird design.


That extra length of board was so the card was "full length". Old school. It let the card reach the front of the case and slip into the expansion slot guides, supporting the card at both the front and the back instead of just at the back in a tower system. Ship a system with that card not supported at both ends and it would snap the AGP slot off the board. :lol: 

The expansion slot guides are the gray things at the front of the case that line up with the cards in this pic:
http://www.madshrimps.be/articles/TitanRobelaH20case-Li...
(cut 'n paste that link if you're curious)
January 4, 2007 7:52:08 PM

Quote:
Who's 3DFX? ;-)


You know, I saw a Voodoo5 5500 being sold for 29 euros on eBay a few days ago. That's about $35. It broke my heart to see such a magnificent piece of computer history go for such a low price.

BRING BACK 3DFX! :twisted:
January 4, 2007 7:58:15 PM

I remember when a friend of mine bought a Voodoo3... I was jealous. My first GPU was a Riva TNT2 Ultra. Bad a$$ back in the day.
January 4, 2007 8:06:58 PM

Quote:
Good riddance to 3dfx. I bought a card a week before they went under; they were good for a brief time in the 20th century, LOL.


Hey man, without 3dfx we might not have the level of advanced graphics that we do today.

Still got a Voodoo3 2000 that's alive and kickin' arse. Played Half-Life and Quake 3 like a gem.
January 4, 2007 8:30:16 PM

Quote:
Good riddance to 3dfx. I bought a card a week before they went under; they were good for a brief time in the 20th century, LOL.


Hey man, without 3dfx we might not have the level of advanced graphics that we do today.

Still got a Voodoo3 2000 that's alive and kickin' arse. Played Half-Life and Quake 3 like a gem.
Yeah, my 3500 rocked it up in Unreal Tournament like nobody's business, man! :D 
January 4, 2007 8:33:28 PM

Quote:
I remember when a friend of mine bought a Voodoo3... I was jealous. My first GPU was a Riva TNT2 Ultra. Bad a$$ back in the day.


Technically, GPUs didn't exist until the GeForce 256.
January 4, 2007 8:38:45 PM

Quote:
I remember when a friend of mine bought a Voodoo3... I was jealous. My first GPU was a Riva TNT2 Ultra. Bad a$$ back in the day.


Technically, GPUs didn't exist until the GeForce 256.
Don't you mean the TERM GPU didn't exist? The TNT2 was doing what a GPU does; I think the terminology just hadn't been coined yet.

It's like how Texas Instruments' processors in calculators COULD be considered CPUs, because they are the central processing unit of the calculator, but of course when we use the term CPU we are usually referring to the CPUs in personal computers.

Correct me if I'm wrong.
January 4, 2007 9:06:49 PM

Damn, I was like 10 years old back then, but I seem to recall that the fact that the GeForce 256 had hardware T&L is what made it a GPU...
January 4, 2007 9:11:46 PM

Quote:
I remember when a friend of mine bought a Voodoo3... I was jealous. My first GPU was a Riva TNT2 Ultra. Bad a$$ back in the day.


Technically, GPUs didn't exist until the GeForce 256.
Don't you mean the TERM GPU didn't exist? The TNT2 was doing what a GPU does; I think the terminology just hadn't been coined yet.

It's like how Texas Instruments' processors in calculators COULD be considered CPUs, because they are the central processing unit of the calculator, but of course when we use the term CPU we are usually referring to the CPUs in personal computers.

Correct me if I'm wrong.

Actually, the introduction of the GeForce series had something to do with a programmable graphics chip, something that hadn't been around before, and thus the term GPU was coined.
January 4, 2007 9:43:11 PM

*sigh*

http://en.wikipedia.org/wiki/Graphics_processing_unit

Please read that.

Quote:
A Graphics Processing Unit or GPU (also occasionally called Visual Processing Unit or VPU) is a dedicated graphics rendering device for a personal computer, workstation, or game console. Modern GPUs are very efficient at manipulating and displaying computer graphics, and their highly parallel structure makes them more effective than typical CPUs for a range of complex algorithms.

A GPU implements a number of graphics primitive operations in a way that makes running them much faster than drawing directly to the screen with the host CPU. The most common operations for early 2D computer graphics include the BitBLT operation (combines several bitmap patterns using a RasterOp), usually in special hardware called a "blitter", and operations for drawing rectangles, triangles, circles, and arcs. Modern GPUs also have support for 3D computer graphics, and typically include digital video-related functions as well.
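
If the BitBLT bit sounds abstract, here's a toy sketch of the idea (hypothetical Python, nothing like a real blitter's interface):

# Toy sketch of BitBLT: combine a source bitmap into a destination
# using a raster operation (RasterOp), only where a mask is set.
# Real blitters do this in dedicated hardware, not Python.

def bitblt(dest, src, mask, rop):
    """Combine src into dest wherever mask is set, using raster op `rop`."""
    return [rop(d, s) if m else d for d, s, m in zip(dest, src, mask)]

# One row of 1-bit pixels, for illustration
dest = [0, 0, 1, 1, 0, 1]   # existing framebuffer row
src  = [1, 1, 1, 0, 0, 1]   # sprite row being blitted in
mask = [1, 1, 1, 1, 0, 0]   # where the sprite is opaque

xor_rop = lambda d, s: d ^ s             # XOR: a classic RasterOp
print(bitblt(dest, src, mask, xor_rop))  # -> [1, 1, 0, 1, 0, 1]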


Sealboy might have something with the Hardware T&L being a requirement for the GPU though...

Quote:
The first company to develop the GPU is NVIDIA Inc. Its GeForce 256 GPU is capable of billions of calculations per second, can process a minimum of 10 million polygons per second, and has over 22 million transistors, compared to the 9 million found on the Pentium III. Its workstation version called the Quadro, designed for CAD applications, can process over 200 billion operations a second and deliver up to 17 million triangles per second.


But that doesn't seem entirely accurate to me, because the Amiga had a dedicated chip for graphics processing, so why doesn't that qualify?

Anyway, bah. Whatever. It's a stupid terminology argument anyway so meh. :D 
January 4, 2007 9:52:01 PM

Actually, I think the GeForce 3 was the first programmable GPU.

The term GPU was more of a marketing gimmick back then. I remember a lot of people refused to use it. But the GeForce was one of the first cards to integrate a lotta crap into one chip... including a memory controller.

Regardless, the kick ass performance didn't really come until the GeForce 2, which was released a few short months later.

And of course, the GeForce 3 was the ultimate card. That's when graphics really took a leap forward.
January 4, 2007 10:03:38 PM

Quote:

How would that fit in any computer case? :S And what is that waste of board at the end that looks like it has nothing on it? :S Weird design.


Back when I was in high school, I used to 'acquire' old computers from people or businesses - take them apart, reuse the parts, etc.
Some of the old ISA 14.4 (or slower) data/fax modems rivaled the size of an 8800 GTS.
January 4, 2007 10:15:39 PM

Apparently the marketing gimmick worked, because now I have people telling me that I don't know what a GPU is and that nVidia created the first GPU, when it just stands for Graphics Processing Unit (which doesn't even imply 3D, let alone hardware T&L). ;) 
January 4, 2007 10:19:19 PM

Quote:
*sigh*

http://en.wikipedia.org/wiki/Graphics_processing_unit

Please read that.

A Graphics Processing Unit or GPU (also occasionally called Visual Processing Unit or VPU) is a dedicated graphics rendering device for a personal computer, workstation, or game console. Modern GPUs are very efficient at manipulating and displaying computer graphics, and their highly parallel structure makes them more effective than typical CPUs for a range of complex algorithms.

A GPU implements a number of graphics primitive operations in a way that makes running them much faster than drawing directly to the screen with the host CPU. The most common operations for early 2D computer graphics include the BitBLT operation (combines several bitmap patterns using a RasterOp), usually in special hardware called a "blitter", and operations for drawing rectangles, triangles, circles, and arcs. Modern GPUs also have support for 3D computer graphics, and typically include digital video-related functions as well.


Sealboy might have something with the Hardware T&L being a requirement for the GPU though...

Quote:
The first company to develop the GPU is NVIDIA Inc. Its GeForce 256 GPU is capable of billions of calculations per second, can process a minimum of 10 million polygons per second, and has over 22 million transistors, compared to the 9 million found on the Pentium III. Its workstation version called the Quadro, designed for CAD applications, can process over 200 billion operations a second and deliver up to 17 million triangles per second.


But that doesn't seem entirely accurate to me, because the Amiga had a dedicated chip for graphics processing, so why doesn't that qualify?

Anyway, bah. Whatever. It's a stupid terminology argument anyway so meh. :D 

Thanks! Typical case of saying one thing and thinking another.
Well, as I see it, I was wrong on both counts. To me, a GPU needed to be programmable. But the term GPU includes the word processor, which suggests processing, and processing doesn't require programmability. That brings me to where I was wrong: assuming a processor has to be programmable to be a processor. It doesn't. I guess I was thinking of PC CPUs.
Basically, I confused programmable chips and non-programmable chips. Are there terms to differentiate between the two?
I mean, as mpjesse pointed out, the GeForce 3 was programmable. That differentiates it from the rest of the GPU crowd, but it's still just a "GeForce 3", or an NV20. So while we're at it, what is a programmable chip called?
January 4, 2007 10:21:45 PM

Well, apparently it's not an easy argument to lay to rest, seeing as how Wikipedia and Webopedia say two pretty different things. :D 
January 4, 2007 10:33:02 PM

I might be wrong, but I think the pixel and vertex shaders are what made the GeForce 3 the first truly programmable GPU.
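
Roughly, "programmable" means the developer hands the card little functions to run per vertex and per pixel, instead of toggling fixed options. A toy Python sketch of that idea (obviously nothing like real shader code):

# Toy sketch of what programmable vertex/pixel shaders mean: the
# pipeline is fixed, but the per-vertex and per-pixel functions are
# supplied by the developer. Pure Python illustration, not real shaders.

def my_vertex_shader(pos):
    # developer-supplied: scale every vertex toward the origin
    return tuple(c * 0.5 for c in pos)

def my_pixel_shader(color):
    # developer-supplied: warm up the color a bit
    r, g, b = color
    return (min(r + 40, 255), g, max(b - 40, 0))

def render(vertices, colors, vertex_shader, pixel_shader):
    """Fixed pipeline skeleton that runs whatever shaders it's handed."""
    return ([vertex_shader(v) for v in vertices],
            [pixel_shader(c) for c in colors])

verts, pixels = render([(2.0, 4.0, 6.0)], [(200, 180, 120)],
                       my_vertex_shader, my_pixel_shader)
print(verts, pixels)   # -> [(1.0, 2.0, 3.0)] [(240, 180, 80)]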
January 4, 2007 10:43:32 PM

Quote:
Actually, nVidia bought 3dfx; that's why they can use SLI as the name of their multi-GPU configuration (well, it's based on the 3dfx design anyway, which might explain why it works better than Xfire).

Btw, nVidia did do SLI on a single card... the 7950.


Actually, 3dfx's SLI isn't exactly the same thing as nVidia's SLI. For nVidia, SLI stands for Scalable Link Interface, while for 3dfx it stood for Scan-Line Interleave. I'm not sure whether they worked the same way or not.

P.S. 7950 GX2 = failure
January 4, 2007 11:28:57 PM

Technically speaking, SEALBoy is correct.

Prior to the introduction of the GeForce 256 and hardware T&L, video cards were referred to as graphics accelerators, nothing more. This was due in large part to the limited number of things they actually accelerated or offloaded from the CPU.

Hardware T&L allowed the graphics core to accelerate other things, such as OpenGL's lighting and matrix operations.

Definition of: T&L

(Transformation & Lighting) Moving 3D objects on screen and changing the corresponding lighting effects. Transforming these 3D matrices many times per second as the objects move and recomputing their shadows each time takes an enormous amount of processing. Hardware T&L offloads these functions from the system CPU into the display adapter or some other board, which enables a greater number of polygons to be processed to create a more realistic effect.
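
To make that concrete, the "transformation" half is basically matrix-times-vertex math, repeated for every vertex of every object, every frame. A bare-bones hypothetical Python sketch of what got offloaded:

# Bare-bones sketch of the "transformation" part of T&L: multiply every
# vertex by a matrix each frame. This is the per-vertex math hardware
# T&L moved off the CPU. Hypothetical Python, for illustration only.
import math

def rotate_y(angle):
    """Build a 4x4 rotation matrix about the Y axis."""
    c, s = math.cos(angle), math.sin(angle)
    return [[ c, 0, s, 0],
            [ 0, 1, 0, 0],
            [-s, 0, c, 0],
            [ 0, 0, 0, 1]]

def transform(matrix, vertex):
    """Multiply a 4x4 matrix by an (x, y, z, 1) vertex."""
    return tuple(sum(matrix[r][i] * vertex[i] for i in range(4))
                 for r in range(4))

# Rotate one vertex 90 degrees around Y; a GPU does this for millions
# of vertices per second, which is exactly the point of hardware T&L.
print(transform(rotate_y(math.pi / 2), (1.0, 0.0, 0.0, 1.0)))
# -> roughly (0.0, 0.0, -1.0, 1.0)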


Later generations, such as the GeForce3/GeForce4, brought fully programmable GPUs with hardware vertex shaders and fragment shaders. With the introduction of the shaders came Cg and corresponding shader languages, such as HLSL and GLSL. The rest is history.

As an afterthought, I currently have both a Diamond TNT2 Ultra and an Asus GeForce 256 DDR in storage. Oh, the memories.
January 4, 2007 11:51:05 PM

Quote:
For nVidia, SLI stands for Scalable Link Interface, while for 3dfx it stood for Scan-Line Interleave.


nVidia SLI splits the work in one of two ways, either:
* the upper half of the screen is rendered by one card and the bottom half by the other,

OR

* odd frames on one card, even frames on the other.


3dfx SLI:
* one card renders the odd-numbered scanlines, the other the even-numbered scanlines.
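
A quick hypothetical Python sketch of how those three schemes split the work between two cards (illustrative only, not how any driver is actually written):

# Illustrative comparison of the work split in each SLI scheme.
# Hypothetical Python only -- no driver actually looks like this.

HEIGHT, FRAMES = 8, 4   # a tiny 8-scanline screen, 4 frames

# nVidia split-frame rendering: top half vs bottom half of each frame
sfr = {"gpu0": list(range(0, HEIGHT // 2)),
       "gpu1": list(range(HEIGHT // 2, HEIGHT))}

# nVidia alternate-frame rendering: even frames vs odd frames
afr = {"gpu0": [f for f in range(FRAMES) if f % 2 == 0],
       "gpu1": [f for f in range(FRAMES) if f % 2 == 1]}

# 3dfx Scan-Line Interleave: even vs odd scanlines of EVERY frame
sli_3dfx = {"gpu0": [y for y in range(HEIGHT) if y % 2 == 0],
            "gpu1": [y for y in range(HEIGHT) if y % 2 == 1]}

print("SFR rows:  ", sfr)       # {'gpu0': [0,1,2,3], 'gpu1': [4,5,6,7]}
print("AFR frames:", afr)       # {'gpu0': [0,2], 'gpu1': [1,3]}
print("3dfx lines:", sli_3dfx)  # {'gpu0': [0,2,4,6], 'gpu1': [1,3,5,7]}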

Amazing that I still remember this.

Quote:

As an afterthought, I currently have both a Diamond TNT2 Ultra and an Asus GeForce 256 DDR in storage.


I still have a GeForce2 MX 200 32MB and a Savage 4 16MB. I remember playing Counter-Strike beta 6 on the Savage 4, and it was sweet.
January 5, 2007 1:24:29 AM

Quote:
P.S. 7950 GX2 = failure


Trust me... I know.
January 5, 2007 2:02:23 AM

Oh so that IS you in the avatar :wink:
January 5, 2007 2:29:43 AM

Quote:
Oh so that IS you in the avatar :wink:


Finally, someone understands why the H E L L I picked that.
January 5, 2007 2:58:59 AM

Quote:
I've decided I'm going to buy one card from each 3dfx generation off eBay. I'm already bidding on a Voodoo1 and a Voodoo2; then I'll move on to the 3000 and 5000 series.


I still own two Voodoo1s, two Voodoo2 12MBs in SLI, and one or two Voodoo3s :p 

3Dfx seems like the only company where fanboys unite :lol:  :lol:  :lol: 
January 5, 2007 3:40:41 AM

I want to say they stopped making those when I was learning to walk.
January 5, 2007 3:52:41 AM

My videocards to date:
1. Some prehistoric Trident ISA video card.
2. Creative Riva TNT 16MB PCI (yes, they did make graphics cards).
3. ELSA Riva TNT2 32MB AGP (anyone remember ELSA and their 3D-glasses?)
4. Visiontek GeForce2 MX400 64MB AGP
5. Sapphire Radeon 9000 64MB DDR AGP
6. MSI Radeon 9250 128MB DDR AGP
7. nVidia GeForce 6100 IGP 128MB shared (current :cry:  :cry:  :cry:  )
8. Upgrade soon... I hope.
January 5, 2007 11:04:17 AM

Haha! See what I started by saying GPU?

I really only typed it because I was too lazy to type Graphics Accelerator.

Hmm..
I think I started on a Trident 4x AGP card with 4 MB of memory,
then the Diamond Riva TNT2 Ultra 32MB, then my 9800 SE 128MB. The 9800 is what I bought for my current rig; the previous two were in my Pentium II 300. I used a laptop for a long time with a GeForce2 Go.