Your question

Gainward GF3 Ti200 64MB or 128MB?

in Graphics & Displays
Anonymous
March 20, 2002 9:51:11 PM

How much of an advantage does the 128MB Gainward GeForce 3 Ti200 offer over the 64MB version of the same card? I have read a lot about Gainward being really good for overclocking; is this true for the 128MB version as well? I have an ECS K7S5A mobo, Athlon XP 1600+, 192MB SDRAM (one 128MB stick @ PC133 and one 64MB PC100 stick; however, 3DMark2001 says I have a 66MHz FSB, and I'm not sure what that's about, so if someone knows please tell me) and a 40GB Western Digital HD. Would it be worth spending an extra 30 or so dollars for the extra memory, or is it not used very efficiently in games?
Anonymous
March 20, 2002 10:00:05 PM

Not much, really, as modern games are made for a max of 64MB, meaning by the time 128MB is in use, the Ti200 will long be unusable.
March 20, 2002 10:17:08 PM

You're better off going for a Radeon 8500 than the 128MB Ti200. The price is similar, and the 8500 tops the Ti500.

Bad trolls Bad trolls... Whacha gonna do... Whacha gonna do when they post here too...
Anonymous
March 20, 2002 11:22:28 PM

No, it doesn't.
March 21, 2002 4:08:21 AM

With the 6025 drivers, it came close to or beat the Ti500 in every benchmark in the new <A HREF="http://www.tomshardware.com/graphic/02q1/020304/geforce..." target="_new">GF4 review</A> (check all the benches). The 6043 drivers are better still, giving at least a 3% increase across the board, and over 10% in OpenGL.

On top of that, the GF3 is a dead product; they are stopping production. Nvidia tends to stop tweaking drivers on products it considers dead (the TNT, even the GeForce 256), and even if it weren't dead, pretty much all of the performance has already been squeezed out of it. The 8500 still has a lot of room to grow, and is cheaper than the Ti500.

Bad trolls Bad trolls... Whacha gonna do... Whacha gonna do when they post here too...
March 21, 2002 4:53:26 AM

I have to disagree. The Ti500 is much more reliable compared to the 8500, and I also believe it is faster. And no, the 8500 can't be pushed further than the GF3.
March 21, 2002 10:30:08 AM

I have to disagree. I've never had driver problems. I've personally seen a huge improvement in performance between the 3286 drivers and the latest 6043 and 6052. Also, ATI has faster driver support than nVidia, which means if you have a problem it'll probably be solved faster with an ATI card than an nVidia card. The R8500 is technically superior to the GF3 in nearly every way, and even superior to the GF4 in some aspects.

AMD technology + Intel technology = Intel/AMD Pentathlon IV; the <b>ULTIMATE</b> PC processor
March 21, 2002 1:47:52 PM

Also, as Crashman has pointed out in a different post, Nvidia's drivers have been getting more and more unstable.

To those who are saying the Ti500 is better: what do you see that makes you think the 8500 can't be pushed past the Ti500, if it hasn't already been? Do you think there is a lot more room for improvement in the GeForce 3 line? Nvidia doesn't think so, or else they probably wouldn't be abandoning it.

What I see, and have experienced personally, is a very stable card that continues to make leaps and strides in performance with each driver release, nearly six months after it was put on the market. I see a card that, with faster memory, is easily pushed to faster speeds without extra cooling. I see a card with extra features standard that are not available on all GF3s, some of which aren't even an option on the GF3 line (dual monitor support). I also see a card whose price point is fantastic.

So, please let me know what you think. I've never been happier about a hardware purchase than I am about my 8500, but I am willing to listen to what you all have to say. Don't get me wrong, the GF3 is a good card line and not a horrible purchase; I just feel that the 8500 is money better spent.

Bad trolls Bad trolls... Whacha gonna do... Whacha gonna do when they post here too...