Comparison between TI4280 and FX5600XT

Archived from groups: alt.comp.periphs.videocards.nvidia (More info?)

Hi guys,

The seller of my PC replaced, under warranty, my old TI4280 VIVO (128
MB) with an FX 5600 XT (256 MB) and, as expected, performance has
degraded heavily.

The shop claims that the 5600 XT has equal or better performance than
the 4280 (hah!) and that they are obliged to replace hardware under
warranty with whatever available hardware has comparable characteristics.

Since they no longer have the TI 4280 in stock, they used the FX
5600 XT to replace it, claiming that the two cards have the same
characteristics.

It is quite evident to me that the FX 5600 XT is worse than the 4280,
but of course it is easy for the shop to say that this is a subjective
impression.

I want to counter this position with objective data.

For this reason, I would like to find some benchmarks comparing the
TI4280 and the FX5600XT. I have been searching the web but cannot find
any real answer.

Is anyone aware of any benchmark available on the net?

Can anyone provide some information on the issue that I can use as
objective data, to make the shop admit that they are not complying with
the warranty conditions?

Thanks in advance,

Francesco
 

DaveL


They ripped you off. The 5600XT is like the 5200 in that it comes in both
128-bit and 64-bit varieties. All of the 256 MB 5600XT cards that I have
seen are 128-bit. That's the good news. The bad news is they are about the
same speed as a GF3 Ti200. They are not even close to the performance of a
Ti4200. To get close to equal performance they would have to give you at
least a 5600 Ultra.
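The bus-width point matters because peak memory bandwidth scales linearly with it. A quick back-of-the-envelope sketch (the 400 MHz effective clock is an illustrative figure, not a quoted spec for either card):

```python
# Peak theoretical memory bandwidth = (bus width in bytes) x effective clock.
# A 64-bit board therefore has exactly half the bandwidth of a 128-bit board
# running the same memory clock.

def peak_bandwidth_gbs(bus_bits: int, effective_clock_mhz: float) -> float:
    """Peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return (bus_bits / 8) * effective_clock_mhz * 1e6 / 1e9

print(peak_bandwidth_gbs(128, 400))  # 6.4 GB/s
print(peak_bandwidth_gbs(64, 400))   # 3.2 GB/s
```

So two cards sold under the same "5600XT" name can differ by a factor of two in bandwidth, which is why the board, not just the GPU, has to be checked.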

DaveL


"Francesco" <ilpacchi@libero.it> wrote in message
news:58690731.0404090938.106f508d@posting.google.com...
> <snip>
 
Cuzman

"Francesco" <ilpacchi@libero.it> wrote in message
news:58690731.0404090938.106f508d@posting.google.com...

" Is anyone aware of any benchmark available in the net? "


I take it that you want a comparison between a Ti4200 and a 5600XT.
http://www.digit-life.com/articles2/over2003/
 

teqguy


Cuzman wrote:

> "Francesco" <ilpacchi@libero.it> wrote in message
> news:58690731.0404090938.106f508d@posting.google.com...
>
> " Is anyone aware of any benchmark available in the net? "
>
>
> I take it that you want a comparison between a Ti4200 and a 5600XT.
> http://www.digit-life.com/articles2/over2003/





I'd take quality over performance... even though the XT could probably
hold its own in both areas.



If you can, do whichever costs less... since you'll probably want to
upgrade when the NV40 comes out anyways.
 

teqguy


DaveL wrote:

> <snip>




Uh, I don't think you know what you're talking about...


The memory pipelines on all of the FX series are either 128-bit or
256-bit... not 64-bit.

http://www.nvidia.com/page/fx_5200.html



That would make no sense, since manufacturers would have to use
multiple RAM configurations per line of video cards, which isn't
cost-effective.



Not to mention, the Geforce 4 440MX has a 128-bit memory pipeline; that
would put a $60 card above a $120 card? Not possible.





The Geforce 4 4200Ti is a good card once you overclock it, which
requires decent cooling.

The same is applicable to any piece of hardware, though.


Because the FX series is such a heat monger, and compensates by
lowering the clock speed when heat rises, I can't say that you're
going to get the same percentage of performance increase through
overclocking.
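That downclock-on-heat behavior can be sketched as a simple threshold throttle. The temperatures, step sizes, and floor below are made-up illustrative numbers, not NVIDIA's actual algorithm:

```python
def throttled_clock(temp_c: float, base_mhz: int = 400,
                    limit_c: float = 90.0, step_mhz: int = 50,
                    floor_mhz: int = 200) -> int:
    """Drop the core clock in steps once the GPU passes a temperature limit.
    All values here are hypothetical; real cards use vendor-tuned numbers."""
    if temp_c < limit_c:
        return base_mhz
    steps = int((temp_c - limit_c) // 5) + 1  # one step per 5 C over the limit
    return max(base_mhz - steps * step_mhz, floor_mhz)

print(throttled_clock(70))   # 400 - below the limit, full clock
print(throttled_clock(95))   # 300 - two steps down
```

The upshot for overclocking is that any extra clock you dial in can be silently taken back whenever the card crosses its thermal limit, which is why better cooling matters more on these cards.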


If you have the proper cooling, however, the 5600XT is definitely the
way to go... almost reaching the potential performance of the 5700
Ultra.



But we're simply comparing apples and oranges here.


The graphical quality of the 5200 beats the 4200 significantly, and
that's really what enjoyment comes down to.


Sure, you could get 230 fps in a benchmark without AA, aniso, or high
resolution... but your eyes can't even function properly at that
framerate. In a real-world study you'd only be able to see about 150
frames, and only have the cognitive span to remember 30 of them.



It's better to have the eye candy and savor it than have nice
benchmarks. If you're going to go and build a system for benchmarking,
you're basically building it for everyone but yourself.







Francesco, your performance has probably degraded because of your
drivers; try updating them. Clean out the older ones first, just so
there aren't any leftovers.
 

DaveL


Dude, you'd better go do your homework. We're talking about cards, not
chips. Just because Nvidia made the GPU with a 128-bit memory interface
does not mean the card manufacturers have to use all that bandwidth. In
fact, they very often don't.

I'll let this slide this time, but in the future you should be damn sure of
your facts before you accuse someone of not knowing what they are talking
about.

DaveL


"teqguy" <teqguy@techie.com> wrote in message
news:_bGdc.20573$TS3.5@nwrddc02.gnilink.net...
> <snip>
 

teqguy


DaveL wrote:

> Dude, you better go do your homework. Were talking about cards, not
> chips. Just because Nvidia made the GPU with a 128 bit memory
> interface does not mean the card manufacturers have to use all that
> bandwidth. In fact, they very often don't.
>
> I'll let this slide this time, but in the future you should be damn
> sure of your facts before you accuse someone of not knowing what they
> are talking about.
>
> DaveL



You will "let this slide"?
Who the hell do you think you are?




The memory interface does not control bandwidth consumption... the
bandwidth is equal across all channels. The only change is in how
many pipelines are addressed.



The FX GPUs' memory interfaces are either 128-bit or 256-bit, not 64-bit.

There are no FX cards with memory capacities of 64 MB or memory
pipelines with 64-bit addressing.

I don't understand where you pulled 64-bit from... because even the
Geforce 4 MX uses 128-bit memory addressing.


Show me an FX card that uses a 64-bit memory pipeline and I'll admit I'm
wrong.


But the fact is, the bus width is derived from the memory controller on
the GPU, not the GPU itself.

The FX GPUs are ALL 256-bit.

Show me anything that mentions a 64bit memory interface on this page:


http://www.nvidia.com/page/pg_20040109440047.html




If you can provide substantial evidence, then so be it.


But until then, you're simply confused, wrong, or miscommunicating,
and I'm not understanding you.
 

teqguy


dino wrote:

> basically they did rip you off... the FX5600-256 is not worth the
> price of a Ti4200. I had one... kept it for less than a week and
> returned it... it is another of Nvidia's flops. Check out Tom's
> Hardware for benchmarking... they gave you an inferior product.
> http://www.tomshardware.com/graphic/20031229/vga-charts-03.html#unreal_tournament_2003



Like I said, graphical quality sometimes supersedes performance.


If there are playability issues, they need to be fixed.


But if you're able to run at 40 fps with 8X AA and aniso and that's
perfectly fine for you... then I'd take the higher quality.


The DVD quality is noticeably better on the FX series, not to mention
the faster RAMDACs improve multi-monitor display.
 
Guest

> Like I said, graphical quality sometimes supercedes performance.

And this person obviously disagrees. He feels cheated, and I would be
inclined to agree. The first FX card that beats a Ti4200 in frame rate is
the FX 5700 Ultra. Anything below that is a downgrade.
 
Guest

Do a Google search; FX cards with a 64-bit memory interface are on the
market, from the 5200 through the 5600.

There were also 64 MB versions of the 5200 floating around.

Use google before opening your mouth and putting your own foot in it.

RF

"teqguy" <teqguy@techie.com> wrote in message
news:csQdc.21103$TS3.541@nwrddc02.gnilink.net...
> <snip>