GeForce2 Upgrade scores less than Voodoo2?

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
Before I upgraded, one of my systems had the following configuration:

Gigabyte Motherboard w/ ALi Aladdin V Chipset
K6-2 450MHz
256 MB PC133 SDRAM
Trident Blade 3D 8MB (Don't even consider gaming with this!)
Creative 3D Blaster Voodoo2 12 MB (My gaming card)

This system is currently used to play home LAN games like UT, CS, & HL; these games aren't exactly new, but they can't be considered ancient either. (Hey, we still LOVE ROTT!) After installing the Day of Defeat mod for HL, I finally decided to eradicate the video bottlenecks. From studying several benchmark tests, I concluded that installing a video card any faster than a GeForce DDR on a K6-2 500MHz results in very marginal or no performance gain. As a result, the bargain eVGA GeForce2 MX 400 64MB was my pick.

Knowing that Epic Games developed the Unreal engine mainly with 3dfx chipsets in mind, I expected only 5 or 10FPS tacked onto my previous 25FPS with the Voodoo2. (Of course, the quoted average is entirely dependent on the map played, # of bots, and graphics settings.)

To my surprise, there was NO performance gain. In fact, the framerate dropped to approximately 20FPS or less! This is what a Pentium 233MMX should score! Considering this system is souped up compared to other Super Socket 7 platforms, the framerate was UNACCEPTABLE for a decent 64MB SDRAM graphics card.

HL Deathmatch performed like a Pentium 233MMX as well. Surprisingly, when I launched the Cherbourg map on HL Day of Defeat, I actually saw a tremendous performance gain.

In both HL & UT, Direct3D freezes the system after a few seconds, or sometimes a couple of minutes, after the game is launched. So basically I am limited to OpenGL, even though I did get some glimpses of slow Direct3D performance as well.

I originally compared an eVGA GeForce2 MX 400 (1280x1024x32) against the Voodoo2 (800x600x16) as an experiment on my Dad's office system. Using maximum detail, 16 bots, & Fractal, the eVGA GF2 SMASHED the V2 into the dust, although I must say the V2 actually had very high framerates, around 50FPS with 4 bots. I know eVGA isn't one of the most well-known NVidia board producers, but I decided to buy an eVGA GeForce2 MX 400 for myself anyway because I liked the card's performance on Dad's office system. His configuration is the following:

Gigabyte GA-7VTX-H w/ KT266A Chipset
Athlon Thunderbird 1333 MHz
256 MB PC2100 266MHz DDR SDRAM
eVGA GeForce2 MX 400 (Guess you got part one down by now)
Windows XP Professional (I like LINUX better though!)

Finally, what must I do to get my GF2 to perform like it should on this system, & why would my system be performing so poorly? My resources are over 95% free, and I even reformatted and did a fresh install of Windows 98 SE. Any suggestions would be GREATLY appreciated.
 

svol

Champion
First: the GeForce2 MX400 is known to be a poor performer.
I don't think your mobo supports AGP 4x, so that also brings the performance down. The Voodoo2 is a PCI card (right?), so it isn't dependent on the AGP bus.
Also, be sure that you have the latest drivers from NVidia's website.

My case has so many fans that it hovers above the ground :eek: .
 

Matisaro

Splendid
Mar 23, 2001
6,737
0
25,780
Were you sure to remove all traces of the Voodoo drivers from your system?


Also make sure your BIOS is set up properly.

"The Cash Left In My Pocket,The BEST Benchmark"
No Overclock+stock hsf=GOOD!
 
