FX5200, should it be this bad?

Guest
Archived from groups: alt.comp.periphs.videocards.nvidia

I recently built a P4 3.0 computer. The place was out of 5700 Ultras, so I got
a GeForce FX5200 128MB; I figured it would still be on a par with, or better
than, the two-year-old Ti 4200 I had in the previous system.

Imagine my surprise to see GTA3 (an old game that recommends a 700MHz CPU and
a 16MB D3D card) running like a slide show on the new computer. I proceeded to
run the game X2: The Threat in benchmark mode to compare the new computer to
the old one (Athlon 2800 with the Ti 4200):

Old Athlon with 4200TI - 53 frames per second
New P4 3.0 with FX 5200 - 10 FPS
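For what it's worth, the gap in those two figures is easy to put in numbers (a trivial sketch using nothing but the reported results):

```python
# Quick arithmetic on the two benchmark figures reported above
# (X2: The Threat benchmark mode; nothing assumed beyond those numbers).
ti4200_fps = 53.0   # old Athlon 2800 with the Ti 4200
fx5200_fps = 10.0   # new P4 3.0 with the FX 5200

ratio = ti4200_fps / fx5200_fps
drop = (1 - fx5200_fps / ti4200_fps) * 100
print(f"Ti 4200 is {ratio:.1f}x faster; the FX 5200 loses {drop:.0f}% of the frame rate")
```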

I went through the normal procedure of reinstalling DirectX, getting new
drivers, etc., totally convinced something was seriously wrong, but it seems
this really is how bad the 5200 is! I swapped the cards and the Athlon
performed just as badly with the 5200; in fact, either computer with the 5200
was half as fast as my daughter's Athlon 1000 with a GF2 Ultra.

OK, I know the 5200 is not exactly top-of-the-range and it didn't cost me a
lot, but with figures like 10 FPS it is frankly unusable. I really can't see
how Nvidia can still sell a card that is vastly slower than one they were
selling four years ago; hell, I have a 3dfx Voodoo5 5500 in the cupboard
upstairs that beats it hands-down.

Anyone else have the misfortune to have owned one of these "video cards"?
Are they supposed to be this bad?

Note the use of the past tense, as I can't believe anyone who plays games more
demanding than Minesweeper still uses one.
 
Guest

"Red Activist" <Red@ctivistREMOVE.fsnet.co.uk> wrote in message
news:cdc2r3$fg$1@news7.svr.pol.co.uk...
> I recently built a P4 3.0 computer. The place was out of 5700 Ultras, so I
> got a GeForce FX5200 128MB; I figured it would still be on a par with, or
> better than, the two-year-old Ti 4200 I had in the previous system.
> [snip]
> Old Athlon with 4200TI - 53 frames per second
> New P4 3.0 with FX 5200 - 10 FPS
> [snip]
The FX 5200 is the lowest-end card NVIDIA currently makes. That was not the
case with the Ti 4200. Many people need video cards to do mostly 2D and
occasionally 3D without playing a lot of games, and that is who the 5200 is
for.
 
Guest

"Red Activist" <Red@ctivistREMOVE.fsnet.co.uk> wrote in message
news:cdc2r3$fg$1@news7.svr.pol.co.uk...
> OK, I know the 5200 is not exactly top-of-the-range and it didn't cost me a
> lot, but with figures like 10 FPS it is frankly unusable. I really can't see
> how Nvidia can still sell a card that is vastly slower than one they were
> selling four years ago; hell, I have a 3dfx Voodoo5 5500 in the cupboard
> upstairs that beats it hands-down.

You have learnt a universally important lesson: always do some background
research before spending money! (Especially if you care much about money,
that is...) With the internet at your hands nowadays, it should be a piece
of cake to get quick, accurate info :D
 
Guest

The Ti4200 has a MUCH faster clock rate than the slowww FX5200. The 5200 is
the current entry-level (slowww) card from Nvidia and is pretty much agreed
to be a waste of money for gamers.
(However, the 4200 is NOT DX9 capable, while the 5200 is.)

--
DaveW



"Red Activist" <Red@ctivistREMOVE.fsnet.co.uk> wrote in message
news:cdc2r3$fg$1@news7.svr.pol.co.uk...
> [snip]
 

ceebee

"Red Activist" <Red@ctivistREMOVE.fsnet.co.uk> wrote in
alt.comp.periphs.videocards.nvidia:

> I recently built a P4 3.0 computer. The place was out of 5700 Ultras, so
> I got a GeForce FX5200 128MB; I figured it would still be on a par with,
> or better than, the two-year-old Ti 4200 I had in the previous system.
> [snip]



A 1969 Lamborghini Miura eats any 2004 Kia for breakfast. You can't
compare apples and oranges - an old high(er)-end card with a newer low-end
card.

A short info spree around the Internet would have given you that.

The FX5200 is a low-end card, not suited for regular gamers who need
high frame rates, fast refresh rates and lots of tiny details at high
res recalculated every nanosecond.
It is an excellent budget choice for the user who plays a casual game
with the resolution not set too high. Remember that "PC user" doesn't
equal "gamer".

Instead of gloating over the bad performance of the card, one could gloat
over your bad performance in getting info and choosing a suitable card
for your specs before buying one.


--
CeeBee


"I don't know half of you
half as well as I should like;
and I like less than half of you
half as well as you deserve."
 

augustus

> Anyone else have the misfortune to have owned one of these "video cards"?
> Are they supposed to be this bad?
>
> Note the use of the past tense, as I can't believe anyone who plays games
> more demanding than Minesweeper still uses one

They're really that bad. A friend had a 64-bit MSI FX5200 installed on a dual
channel Barton XP3200+ with a gig of RAM. 3DMark01 score... 6400. The only
one worth having is the FX5200 Ultra... which still gets its ass kicked by
a Ti4200 of any kind. The plain 128-bit FX5200's performance is barely above
the standard Radeon 9200 series, itself a bottom feeder. The lowest FX card
I'd touch is the 5700 Ultra.
 
Guest

"Augustus" <tiberius@weeik.com> wrote in message
news:OBkKc.43577$iw3.20604@clgrps13
>> Anyone else have the misfortune to have owned one of these "video
>> cards"? Are they supposed to be this bad?
>>
>> Note the use of the past tense, as I can't believe anyone who plays games
>> more demanding than Minesweeper still uses one
>
> They're really that bad. Friend had a 64bit MSI FX5200 installed on a dual
> channel Barton XP3200+ with a gig of RAM. 3DMark01 score.....6400.
.....

This is a tad more than my GF4 MX440 - I get around 6000-6100 on my TB
XP2200+

St.
 
Guest

"Red Activist" <Red@ctivistREMOVE.fsnet.co.uk> wrote in message
news:cdc2r3$fg$1@news7.svr.pol.co.uk...
> I recently built a P4 3.0 computer. The place was out of 5700 Ultras, so I
> got a GeForce FX5200 128MB; I figured it would still be on a par with, or
> better than, the two-year-old Ti 4200 I had in the previous system.
> [snip]
> Old Athlon with 4200TI - 53 frames per second
> New P4 3.0 with FX 5200 - 10 FPS


Considering that the FX5200 is the newer version of the GF4 MX420, what more
did you expect?

About the only difference between the GF4 MX420 and the FX5200 is DX9
support on the FX. The FX5200 is the lowest card in Nvidia's current
offerings.
 
Guest

>The FX5200 is the lowest card in Nvidia's current
>offerings.

They should rename it to GeForce 4-5200 MX+
 
Guest

"PRIVATE1964" <private1964@aol.com> wrote in message
news:20040717232955.15985.00001204@mb-m21.aol.com...
> > The FX5200 is the lowest card in Nvidia's current
> > offerings.
>
> They should rename it to GeForce 4-5200 MX+

Why should they? It's not a GF4 chip-based card, it's a GeForce FX chip-based
card. Since about the GF2 days, nVidia has marketed many cards in the same
series, ranging in price and speed/memory bandwidth. That has not changed.
The FX5200 is just the lowest card. Their cards in the FX line range from
the 5200 all the way up to the 5950 Ultra, the only differences being memory
speed, clock speed, memory bus width (64- or 128-bit), and video RAM.
 
Guest

Red Activist wrote:
> I recently built a P4 3.0 computer. The place was out of 5700 Ultras, so
> I got a GeForce FX5200 128MB; I figured it would still be on a par with,
> or better than, the two-year-old Ti 4200 I had in the previous system.
> [snip]

I just upgraded from a GeForce4 MX420 to an FX5200 last week. My system has
the same CPU as yours, but I don't know about the motherboard and memory
setup. Overall I got about a 30% performance increase over the MX420 with
the FX5200. Of course, the MX420 couldn't do pixel shaders of any kind, so a
fully DX9 card was a good buy for £10, I thought. However, it is certainly
not a high flier; in some tests in 3DMark2001SE my old MX420 outperformed
it, only just though.

I'm getting a mate's old GeForce4 Ti 4200 128MB with AGP 8x on Tuesday, and
I expect it to be a lot better than the 5200, and to completely piss over
the MX420. Based on research I've done and the replies I've had from people
on here, if you want a card in the FX range, don't go lower than an FX5700
with your system. Other than that, the GeForce4 Ti range will perform better
than the low-end FX range.

Matt
--
Collection: http://users.ign.com/collection/GLYTCH_2K4
MSN: GLYTCH_2K4(at)msn.com
Xbox Live: (Coming September)
 
Guest

Red Activist wrote:
> I was running GTA 3 at 640x480 with lowest possible settings and it
> was barely playable lol

That's kinda weird though. I play GTA:VC (practically the same game) on my
5200 with no trouble at all, at 1024x768x32 with the frame limiter off, and
it's very smooth. Even my old GF4 MX420 could handle it well; a small amount
of slowdown when the s##t hits the fan, but that's about it. What are the
rest of your system specs? Seems very odd that you're getting THAT bad
performance. My PC is a P4 (Northwood) 3GHz, 1GB DDR400 dual channel, Intel
D865PERL motherboard, 2x Western Digital 800JB HDDs.

Matt
--
Collection: http://users.ign.com/collection/GLYTCH_2K4
MSN: GLYTCH_2K4(at)msn.com
Xbox Live: (Coming September)
 
Guest

On Sat, 17 Jul 2004 21:40:05 +0100, "Red Activist"
<Red@ctivistREMOVE.fsnet.co.uk> wrote:

>I recently built a P4 3.0 computer. The place was out of 5700 Ultras, so I got
>a GeForce FX5200 128MB; I figured it would still be on a par with, or better
>than, the two-year-old Ti 4200 I had in the previous system.
>[snip]
>Old Athlon with 4200TI - 53 frames per second
>New P4 3.0 with FX 5200 - 10 FPS
>[snip]

Sounds like you got one of those FX 5200s that only have a 64-bit
memory interface. It cripples an already low-end card to the
"performance" levels you are currently seeing. FX 5200s with the
128-bit memory interface fare much better (even if they still aren't
speed demons).
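The difference the interface makes is easy to put in numbers. A rough sketch (the memory clock is an assumption - a typical FX 5200 ships with roughly 400 MHz effective DDR memory, but boards vary, so check your own card's specs):

```python
# Theoretical peak memory bandwidth = bus width (in bytes) x effective memory clock.
# ASSUMPTION: ~400 MHz effective (200 MHz DDR) memory, typical for an FX 5200.
EFFECTIVE_CLOCK_HZ = 400_000_000

def peak_bandwidth_gb_s(bus_width_bits: int, clock_hz: int = EFFECTIVE_CLOCK_HZ) -> float:
    """Peak memory bandwidth in GB/s for a given memory bus width."""
    return (bus_width_bits / 8) * clock_hz / 1e9

print(f"64-bit  FX 5200: {peak_bandwidth_gb_s(64):.1f} GB/s")   # 3.2 GB/s
print(f"128-bit FX 5200: {peak_bandwidth_gb_s(128):.1f} GB/s")  # 6.4 GB/s
```

Halving the bus halves the peak bandwidth at the same clocks, which is why the 64-bit boards fall so far behind.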
 
Guest

>Why should they? It's not a GF4 chip-based card, it's a GeForce FX chip-based
>card.

It was a joke; get a sense of humor.

They should at least put an easy-to-understand speed rating system on the box
for the general population.
 
Guest

PRIVATE1964 wrote:
>>Why should they? It's not a GF4 chip-based card, it's a GeForce FX chip-based
>>card.
>
> It was a joke; get a sense of humor.
>
> They should at least put an easy-to-understand speed rating system on the box
> for the general population.
>
You forgot to put a <g> or ;-) after the "joke". Some of us struggle a
little harder with not-so-obvious jokes. :)
 

pip

GLYTCH (A.K.A. PYRO-Maniak) wrote:
> Red Activist wrote:
>
>>I was running GTA 3 at 640x480 with lowest possible settings and it
>>was barely playable lol
>
>
> That's kinda weird though. I play GTA:VC (practically the same game) on my
> 5200 with no trouble at all, at 1024x768x32 with the frame limiter off, and
> it's very smooth. Even my old GF4 MX420 could handle it well.
> [snip]

Unfortunately, Rockstar did a bad job of optimising GTA3 for the PC
after porting it.
GTA:VC is another matter; it plays way better than GTA3 did. A friend
also has an FX5200 (he got it free, and it was better than the onboard
graphics he was using before) and it mostly plays fine as long as the
settings are pretty low.

I've not played GTA3 since I got my 9800 Pro a month or so ago, but if
the settings were maxed like I have with VC, I seriously wouldn't be
surprised to see it slow down noticeably at times.
 
Guest

>Some of us struggle a
>little harder with not so obvious jokes. :)

; - ) better?
That was another joke by the way.
 

jafar

On Sun, 18 Jul 2004 11:35:53 +0100, GLYTCH (A.K.A. PYRO-Maniak) wrote:

> Red Activist wrote:
>> I was running GTA 3 at 640x480 with lowest possible settings and it
>> was barely playable lol
>
> Thats kinda weird though. I play GTA:VC (practically the same game) on my
> 5200 with no troubles at all, in 1024x768x32 with the frame limiter off and
> its very smooooth.

Maybe the difference is that yours is the version with the 128-bit
memory interface?

--
Jafar Calley
-----BEGIN GEEK CODE BLOCK-----
d+ s-:+ a C++++ L++ E--- W++ N++ w-- PE- t* 5++ R+ !tv D+ G e* h---- x?
------END GEEK CODE BLOCK------
Registered Linux User #359623
http://fatcatftp.homelinux.org
 
Guest

jafar wrote:
> Maybe the difference is that yours is the version with the 128-bit memory
> interface?

Probably. I've no idea if mine is 64 or 128; how can I tell? There are no
stickers on the card at all and no markings to suggest either way.

Matt
--
Collection: http://users.ign.com/collection/GLYTCH_2K4
MSN: GLYTCH_2K4(at)msn.com
Xbox Live: (Coming September)
 
Guest

"Humga" <Humga@no-spam.com> wrote in message
news:OOmdnac8_o3hBGTd4p2dnA@eclipse.net.uk...
>
> "Red Activist" <Red@ctivistREMOVE.fsnet.co.uk> wrote in message
> news:cdc2r3$fg$1@news7.svr.pol.co.uk...
> > [snip]
>
> You have learnt a universally important lesson: always do some background
> research before spending money! (Especially if you care much about money,
> that is...) With the internet at your hands nowadays, it should be a piece
> of cake to get quick, accurate info :D
>

This is easily said... but remember, time is a valuable resource many of us
don't have for video cards. My experience is that the internet provides
mountains of 'trash information' one must swim through before reaching
something that is 'usable'. The frustration level is often so high that I
wonder if a study won't be done someday on how computerization affects blood
pressure in the masses. Often we are 'forced', due to the technicality of a
thing, to involve ourselves far more than we have a real interest in doing.
Graphics cards are one thing I do not have a lot of time for... but I'm
forced to take the time to wade through 'useless' information [for me
anyway] just to make a simple purchase at the store.

I'm quite happy that a lot of you have such time to burn... or perhaps it
involves your 'profession' anyway [techies and hardware pros etc.]... or
perhaps you're just young and computers are still toys you and your
classmates play with, and are therefore more intimate with. But it's not the
real world to expect the typical buyer/user to be so intimate with what
amounts to labels and complex 'model numbers'. Exactly what is the
difference between a GeForce 1 and a GeForce 4, or a UF model from an XT
model? See... to the common buyer, who can't actually 'sit in the
Lamborghini' or kick the tires, but has only the box to go on [and whatever
the box reports, most of it foreign garble], there's no way of actually
knowing if it's really that Kia or not [until, of course, they get it home
and it does not do what they 'thought' it would... namely run the damn game
better than the old junker sitting in the driveway]. Oops, getting analogies
cross-referenced... hehe.

What gripes me is getting DX9.0b compatibility for tomorrow's games. I hate
buying something 'obsolete' before I even install it. But now we have
DX9.0c... so where does it end?
 

ceebee

"GLYTCH \(A.K.A. PYRO-Maniak\)" <GLYTCH_2k4@msn.com> wrote in
alt.comp.periphs.videocards.nvidia:


> Probably, ive no idea if mine is 64 or 128, how can i tell? theres no
> stickers on the card at all and no markings to suggest.
>
> Matt

The 64-bit version is sparsely available. IIRC the 128-bit version has a
fan while the 64-bit has passive cooling.
There are some general utilities around that will tell you the memory
bus width. The now sadly defunct AIDA32 had that capability - again,
IIRC.

--
CeeBee


"I don't know half of you
half as well as I should like;
and I like less than half of you
half as well as you deserve."
 
Guest

>You have learnt a universally important lesson: always do some background
>research before spending money! (Especially if you care much about money,
>that is...) With the internet at your hands nowadays, it should be a piece
>of cake to get quick, accurate info :D

How people can purchase items costing hundreds and sometimes thousands of
dollars without doing *any* research whatsoever is BEYOND ME.
 
Guest

>How people can purchase items costing hundreds and sometimes thousands of
>dollars without doing *any* research whatsoever is BEYOND ME.

I agree to a certain point that someone should do some research before an
expensive purchase.
In the case of buying a graphics card, someone who is just a casual gamer is
not gonna know what information to look for on the box - and the card makers
know that for sure.
How many 5200s do you think they would sell if the box had a graph comparing
its performance to a 4200? Especially when there are still 4200 cards on the
shelf, and cheaper.
 

jafar

On Sun, 18 Jul 2004 17:47:26 +0100, GLYTCH (A.K.A. PYRO-Maniak) wrote:

> jafar wrote:
>> Maybe the difference is that yours is the version with the 128-bit memory
>> interface?
>
> Probably. I've no idea if mine is 64 or 128; how can I tell? There are no
> stickers on the card at all and no markings to suggest either way.

You could do a Google for your card's model number. I'm sure the specs
will be there somewhere.

--
Jafar Calley
-----BEGIN GEEK CODE BLOCK-----
d+ s-:+ a C++++ L++ E--- W++ N++ w-- PE- t* 5++ R+ !tv D+ G e* h---- x?
------END GEEK CODE BLOCK------
Registered Linux User #359623
http://fatcatftp.homelinux.org
 

augustus

"CeeBee" <ceebeechester@start.com.au> wrote in message
news:Xns952AE7CECDECBceebeechesterstartco@195.121.6.74...
> "GLYTCH \(A.K.A. PYRO-Maniak\)" <GLYTCH_2k4@msn.com> wrote in
> alt.comp.periphs.videocards.nvidia:
>
>
> > Probably, ive no idea if mine is 64 or 128, how can i tell? theres no
> > stickers on the card at all and no markings to suggest.
> >
> > Matt
>
> The 64-bit version is sparsely available. IIRC the 128-bit version has a
> fan while the 64-bit has passive cooling.
> There are some general utilities around that will tell you the memory
> bus width. The now sadly defunct AIDA32 had that capability - again,
> IIRC.

Aida32 is now called Everest, and it's still free to d/l. It will tell you
what the memory bus width is, amongst many other things.