Upgrade problems

Archived from groups: alt.comp.periphs.videocards.nvidia

I recently upgraded from an AMD Athlon XP 2400+ CPU to an AMD Athlon XP
2800+ with a 333 MHz FSB.

The old system had an NVIDIA GeForce4 Ti 4200 card with 128 MB, and Far Cry
ran pretty well at the "Medium" setting that the Auto Detect selected for my
system.

I picked up an NVIDIA GeForce FX 5200 card with 256 MB thinking it would
improve visuals and speed in the game, but when I tried it the Auto Detect
set the graphics at "Low" and performance was terrible. I put the Ti4200
back in and things went back to the way they had been. BTW, I used the
NVIDIA 6.1.7.7 drivers in each case.

I took the card back and the store gave me an NVIDIA GeForce FX 5700LE with
256 MB of DDR at no extra cost. I took that home, put it in, and the Auto
Detect again set things at "Low" and the performance was awful.

So, I went online, found a patch for the game, installed it, and tried
again. Auto Detect still wants to set things at "Low", and "Medium" works
so-so, though not quite as well as with the Ti4200 I had originally.

Does anyone have any comments on this seemingly contradictory situation? Is
it possible the old Ti4200 w/128 MB is still a better card than the 5700LE
w/256 MB? I'm quite confused.

dvus
 
Archived from groups: alt.comp.periphs.videocards.nvidia

> Does anyone have any comments on this seemingly contradictory situation? Is
> it possible the old Ti4200 w/128 MB is still a better card than the 5700LE
> w/256 MB? I'm quite confused.

Can't blame you for thinking a newer card would be faster--a lot of
people get bit by that bug. I remember reading on this group a couple
of years ago about the people who "upgraded" from a GF3 Ti to a GF4 MX
and wondered why things actually got worse. There's no substitute for
finding benchmarks for your particular card. Tom's VGA charts have
historically been a good reference, but this latest one runs all the
games at maxed out quality settings, making it look like you need a $400
card just to get playable frame rates--so pay attention to the settings,
and don't base your buying decision on just one or two reference points.

Anyhow, speed-wise the Ti4200 is about on par with a GeForce FX 5700 (no
suffix).
 
Archived from groups: alt.comp.periphs.videocards.nvidia

dvus wrote:
> I recently upgraded from an AMD Athlon XP 2400 CPU to an AMD 2800+ w/333
> MHz fsb.
>
> The old system had an NVIDIA TI 4200-128 Mb card and Far Cry ran pretty well
> at the "Medium" setting that the Auto Detect selected for my system.
>
> I picked up an NVIDIA GeForce FX 5200-256 Mb card thinking it would improve
> visuals and speed in the game, but when I tried it the Auto Detect set the
> graphics at "Low" and performance was terrible. I put the Ti4200 back in and
> things went back to the way they had been. BTW, I used NVIDIA 6.1.7.7
> drivers in each case.
>
> I took the card back and the store gave me an NVIDIA GeForce FX 5700LE with
> 256 Mb of DDR at no extra cost. I took that home and put it in and the Auto
> Detect again set things at "Low" and the performance was awful.
>
> So, I went online and found a patch for the game, installed it, and tried
> again. Auto Detect still wants to set things at "Low", but "Medium" seems to
> work so-so, but not quite as good as with the Ti4200 I had originally.
>
> Does anyone have any comments on this seemingly contradictory situation? Is
> it possible the old Ti4200 w/128 MB is still a better card than the 5700LE
> w/256 MB? I'm quite confused.
>
> dvus
>
>
Yes, the Ti card is a much better card than the 5700LE or the 5200. The
5200 is probably one of the worst video cards NVIDIA ever made, and the
5700LE is a dumbed-down version of the 5700. If I were you, I'd stick
with the Ti card for now, unless you want to spend the extra bucks to go
to a 5950 Ultra or 6800 GT.
 
Archived from groups: alt.comp.periphs.videocards.nvidia

> It seems to indicate that for Far Cry, at least, the Ti4200 outperforms even
> the FX 5950U by a small margin.

I saw that too, and I think it smells funny. The Ti4200 is a great
card, no doubt... but it can't compare to an FX5900-series card. It's
outclassed in both GPU speed and memory bandwidth. My guess is that
Tom's wasn't running them on a level playing field--the FX5900-series
cards were probably using SM2.0, while the Ti4200 was of course only
using SM1.1.
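
Just to illustrate what I mean -- a rough Python sketch, not Far Cry's
actual code, and the function and version numbers are only for
illustration. Many DX-era games pick their render path from the highest
pixel shader version the card reports, so two cards in the same chart may
not be running the same workload at all:

# Hypothetical sketch: the render path is chosen from the reported
# pixel shader version, so heavier SM2.0 work only lands on newer cards.
def pick_render_path(pixel_shader_version):
    """Return the shader path a game might choose for a given card."""
    if pixel_shader_version >= 2.0:
        return "SM2.0 path (more effects, heavier per-pixel work)"
    if pixel_shader_version >= 1.1:
        return "SM1.1 path (simpler shaders, lighter workload)"
    return "fixed-function fallback"

print(pick_render_path(2.0))  # FX5900-series card -> SM2.0 path
print(pick_render_path(1.3))  # Ti4200 (PS 1.3)    -> SM1.1 path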
 
Archived from groups: alt.comp.periphs.videocards.nvidia

The 4200 is better; it's clocked faster than those other two cards you
tried. But the 4200 is not DX9 capable.

--
DaveW



"dvus" <dven1@adelphia.net> wrote in message
news:2soo90F1o17rgU1@uni-berlin.de...
>I recently upgraded from an AMD Athlon XP 2400 CPU to an AMD 2800+ w/333
>MHz fsb.
>
> The old system had an NVIDIA TI 4200-128 Mb card and Far Cry ran pretty
> well at the "Medium" setting that the Auto Detect selected for my system.
>
> I picked up an NVIDIA GeForce FX 5200-256 Mb card thinking it would
> improve visuals and speed in the game, but when I tried it the Auto Detect
> set the graphics at "Low" and performance was terrible. I put the Ti4200
> back in and things went back to the way they had been. BTW, I used NVIDIA
> 6.1.7.7 drivers in each case.
>
> I took the card back and the store gave me an NVIDIA GeForce FX 5700LE
> with 256 Mb of DDR at no extra cost. I took that home and put it in and
> the Auto Detect again set things at "Low" and the performance was awful.
>
> So, I went online and found a patch for the game, installed it, and tried
> again. Auto Detect still wants to set things at "Low", but "Medium" seems
> to work so-so, but not quite as good as with the Ti4200 I had originally.
>
> Does anyone have any comments on this seemingly contradictory situation?
> Is it possible the old Ti4200 w/128 MB is still a better card than the
> 5700LE w/256 MB? I'm quite confused.
>
> dvus
>
 
Archived from groups: alt.comp.periphs.videocards.nvidia

DaveW wrote:

> The 4200 is better; it's clocked faster than those other two cards you
> tried. But the 4200 is not DX9 capable.

It does not accelerate DirectX 9, which may or may not be an issue depending
on what one is doing with it and how much one values eye-candy.


--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
 
Archived from groups: alt.comp.periphs.videocards.nvidia

Lachoneus wrote:

>> It seems to indicate that for Far Cry, at least, the Ti4200
>> outperforms even the FX 5950U by a small margin.
>
> I saw that too, and I think it smells funny. The Ti4200 is a great
> card, no doubt... but it can't compare to a FX5900-series card. It's
> outclassed in both GPU speed and memory bandwidth. My guess is that
> Tom's wasn't running them on a level playing field--the FX5900-series
> cards were probably using SM2.0, while the Ti4200 was of course only
> using SM1.1.

I did a little belated research and see that the Ti4200-8x and the FX5700LE
have identically low core clock speeds of 250 MHz. In the memory clock
department the 5700LE runs out of the box at 400 MHz while the old 4200
starts off at 514 MHz! Of course, the FX5700 has features that the older
Ti4200 doesn't, so it can take advantage of some of the newer graphics
programming techniques.
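
Just as a back-of-the-envelope check (a rough Python sketch; I'm assuming
both cards use a 128-bit memory bus, and the clocks are the effective DDR
rates above), that memory clock difference translates directly into
bandwidth:

# Rough peak memory bandwidth from effective memory clock and bus width.
# Assumption: both cards use a 128-bit memory bus.
BUS_WIDTH_BITS = 128

def peak_bandwidth_gb_s(effective_clock_mhz, bus_width_bits=BUS_WIDTH_BITS):
    """Peak bandwidth in GB/s = transfers per second * bytes per transfer."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print("Ti4200:   %.1f GB/s" % peak_bandwidth_gb_s(514))  # ~8.2 GB/s
print("FX5700LE: %.1f GB/s" % peak_bandwidth_gb_s(400))  # ~6.4 GB/s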

I will say, though, that the FX5700LE runs quite stably with the memory
overclocked to 510 MHz and the core to 310 MHz. That provides about a 15%
frame-rate boost in Doom 3, so it's worth the few little "glitches" one gets
when running that game overclocked.
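
For what it's worth, that overclock works out to roughly these percentages
(just simple arithmetic on the numbers above):

# Simple arithmetic on the stock vs. overclocked figures above.
stock_mem, oc_mem = 400.0, 510.0    # MHz, effective
stock_core, oc_core = 250.0, 310.0  # MHz

print("Memory: +%.1f%%" % ((oc_mem - stock_mem) / stock_mem * 100))     # +27.5%
print("Core:   +%.1f%%" % ((oc_core - stock_core) / stock_core * 100))  # +24.0%
# The observed Doom 3 gain was only about 15%, so the game apparently isn't
# limited purely by either clock on this card.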

dvus
 
Archived from groups: alt.comp.periphs.videocards.nvidia

"dvus" <dven1@adelphia.net> wrote in message
news:2su6afF1odptbU2@uni-berlin.de...
> Lachoneus wrote:
>
>>> It seems to indicate that for Far Cry, at least, the Ti4200
>>> outperforms even the FX 5950U by a small margin.
>>
>> I saw that too, and I think it smells funny. The Ti4200 is a great
>> card, no doubt... but it can't compare to a FX5900-series card. It's
>> outclassed in both GPU speed and memory bandwidth. My guess is that
>> Tom's wasn't running them on a level playing field--the FX5900-series
>> cards were probably using SM2.0, while the Ti4200 was of course only
>> using SM1.1.
>
> I did a little belated research and see that the Ti4200-8x and the
> FX5700LE have identical low chip clock speeds of 250 MHz. In the memory
> clock department the 5700LE runs out of the box at 400 MHz while the old
> 4200 starts off at 514 MHz! Of course, the FX5700 has features that the
> older Ti4200 doesn't, so it can take advantage of some of the newer
> graphic programming techniques.
>
> I will say, though, that the FX5700LE will run quite stable with the
> memory overclocked to 510 MHz and the chip core to 310 MHz. This provides
> about a 15% frame-rate boost in Doom 3, so it's worth the few little
> "glitches" one gets when running that game overclocked.
>
> dvus
>

Coming from your Ti4200, you're trying to compare across different classes
or levels. The FX5200 is in the class of the MX series, while the FX5700 is
about the class of the Ti4200. The only obvious difference between the Ti
and FX lines is the added DirectX 9 support and other new features as the
technology grows.

When the whole 6xxx series comes out, you can't assume the FX5700 compares
to a GeForce 6100LE either; the 6100LE would be considered MX class, like
the MX was within the GeForce4 series.

CapFusion,...
 
Archived from groups: alt.comp.periphs.videocards.nvidia

CapFusion wrote:
> "dvus" <dven1@adelphia.net> wrote in message
> news:2su6afF1odptbU2@uni-berlin.de...
>> Lachoneus wrote:
>>
>>>> It seems to indicate that for Far Cry, at least, the Ti4200
>>>> outperforms even the FX 5950U by a small margin.
>>>
>>> I saw that too, and I think it smells funny. The Ti4200 is a great
>>> card, no doubt... but it can't compare to a FX5900-series card. It's
>>> outclassed in both GPU speed and memory bandwidth. My guess
>>> is that Tom's wasn't running them on a level playing field--the
>>> FX5900-series cards were probably using SM2.0, while the Ti4200 was
>>> of course only using SM1.1.
>>
>> I did a little belated research and see that the Ti4200-8x and the
>> FX5700LE have identical low chip clock speeds of 250 MHz. In the
>> memory clock department the 5700LE runs out of the box at 400 MHz
>> while the old 4200 starts off at 514 MHz! Of course, the FX5700 has
>> features that the older Ti4200 doesn't, so it can take advantage of
>> some of the newer graphic programming techniques.
>>
>> I will say, though, that the FX5700LE will run quite stable with the
>> memory overclocked to 510 MHz and the chip core to 310 MHz. This
>> provides about a 15% frame-rate boost in Doom 3, so it's worth the
>> few little "glitches" one gets when running that game overclocked.
>
> Coming from your Ti4200, you're trying to compare across different
> classes or levels. The FX5200 is in the class of the MX series, while
> the FX5700 is about the class of the Ti4200. The only obvious difference
> between the Ti and FX lines is the added DirectX 9 support and other new
> features as the technology grows.
>
> When the whole 6xxx series comes out, you can't assume the FX5700
> compares to a GeForce 6100LE either; the 6100LE would be considered MX
> class, like the MX was within the GeForce4 series.

This would all be very educational if I knew the basic differences between
an "MX" and an "FX" class card, but I'm not sure I do. I think this may be
why a regular dummy like me has such a hard time shopping for a new graphics
card: I don't mind spending the money if I'm getting something for it, but
all too often the differences are hidden in technical specs that even the
salespeople are unaware of.

One of you graphics geniuses ought to write some sort of "Graphics Adapters
for Dummies" that we can take to the store in order to get our money's
worth.

dvus