Do ATI cards really offer better 2D than NVidia cards?

Woodman

Distinguished
May 8, 2002
867
0
18,980
Or is this just (and I don't mean to offend anyone with this) some rumor started long ago to comfort existing ATI users due to NVidia's (then) grip on power?

Now I'm not trying to bash ATI or NVidia or anything; I'm just curious why so many seem to believe this is so. Is there some kind of 2D benchmark around that shows this?


Thanks.
 

vacs

Distinguished
Aug 30, 2002
239
0
18,680
All ATI cards (beginning with the 7500, I think) do offer better image quality (2D as well as 3D) than GeForce cards, because those Radeons, like the Matrox cards, include a 400 MHz RAMDAC for enhanced sharpness and a clearer image.

But the upcoming GeForce FX (available in 2 or 3 weeks) will be the first NVidia card to offer the same 400 MHz RAMDAC as the competition (instead of the standard 350 MHz RAMDAC), so this advantage of ATI and Matrox will vanish.

One shouldn't forget that if you don't own a Trinitron CRT monitor, you won't notice much difference between a 350 and a 400 MHz RAMDAC...


Update: Since the RAMDAC is responsible for converting the digital image rendered by the graphics card into an analog signal for the monitor, the clock speed of the RAMDAC has nothing to do with how fast graphics can be rendered. The more MHz a RAMDAC has, the better the quality of the converted image. This is comparable to the sampling rate of audio files: a 48 kHz MP3 does NOT play faster than a 44.1 kHz one, it only has better "high fidelity"...
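The audio analogy can be sketched in a few lines of Python (the function and variable names here are purely illustrative, not from any real audio library):

```python
# A higher sample rate stores more samples for the SAME playback time;
# it does not make the clip play faster, just as a faster RAMDAC does
# not make the card render faster -- it only converts more precisely.

def playback_seconds(num_samples: int, sample_rate_hz: int) -> float:
    """Playback duration is simply samples divided by rate."""
    return num_samples / sample_rate_hz

# One second of audio captured at two different rates:
clip_44k = 44_100   # samples in a 1-second clip at 44.1 kHz
clip_48k = 48_000   # samples in a 1-second clip at 48 kHz

print(playback_seconds(clip_44k, 44_100))  # -> 1.0
print(playback_seconds(clip_48k, 48_000))  # -> 1.0
```

Both clips last exactly one second; the 48 kHz clip just describes that second with more data points.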
<P ID="edit"><FONT SIZE=-1><EM>Edited by vacs on 01/05/03 04:27 PM.</EM></FONT></P>
 

heffeque

Distinguished
Dec 9, 2002
181
0
18,680
ATI has always had better image quality. The thing is...

<b> <font color=red> Can anybody give me some money for free to buy myself a new computer?! </font color=red> </b>

:lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol:

----
PII 266
256 RAM
GeForce2MX 32MB + Sony Trinitron
4+3GB
Benq 56x + TDK 12x10x32x
<font color=green> ISA sound card </font color=green> (Now that's old! :wink: )
Windows 98SE (System too slow for XP)
 

Woodman

Distinguished
May 8, 2002
867
0
18,980
Thanks, Vacs :). So apart from the RAMDACs, is there anything else ATI/Matrox have that provides better 2D than NVidia?

Thanks once again.
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
It should also be mentioned that, as far as image quality is concerned, insufficient RAMDAC speed generally only degrades the picture at high resolutions, above 1600x1200. Displaying 1600x1200 at a 75 Hz refresh only requires a pixel clock of roughly 200 MHz, which doesn't even challenge modern RAMDACs.
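A quick back-of-the-envelope check of that figure, assuming the common rule of thumb of about 32% blanking overhead on top of the visible pixels (exact VESA timings per mode differ slightly):

```python
# Rough pixel-clock estimate for a CRT display mode. The 1.32
# blanking factor is a rule-of-thumb assumption, not an exact timing.

def pixel_clock_mhz(width: int, height: int, refresh_hz: int,
                    blanking_factor: float = 1.32) -> float:
    """Approximate RAMDAC frequency (MHz) needed to drive a mode."""
    return width * height * refresh_hz * blanking_factor / 1e6

needed = pixel_clock_mhz(1600, 1200, 75)
print(round(needed, 1))   # -> 190.1 (MHz, by this rule of thumb)
print(needed < 350)       # -> True: well within even a 350 MHz RAMDAC
```

So even a generous estimate for 1600x1200@75 leaves plenty of headroom on a 350 or 400 MHz RAMDAC.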

A bigger source of video-quality problems is the RGB RF filters (low-pass filters fitted for emissions compliance) used on all VGA video cards. (Build those filters from cheap components and you get poor results.) Some GeForce2 and GeForce3 cards were absolutely horrible; you could get blurry text at even 800x600.

Supposedly, with the GeForce4 line, nVidia began requiring a certain level of quality control on the filters from its board partners. I think you will find that GeForce4 cards mostly have very good visual quality. (I don't have one myself, so I don't know for sure.)

Here is a good article about RAMDACs.

<A HREF="http://grafi.ii.pw.edu.pl/gbm/matrox/ramdac.html#What" target="_new">http://grafi.ii.pw.edu.pl/gbm/matrox/ramdac.html#What</A>

Here is a good description of the RF filter problem with nVidia cards. (It's a very old problem.)

<A HREF="http://www.geocities.com/porotuner/imagequality.html#26sep2000" target="_new">http://www.geocities.com/porotuner/imagequality.html#26sep2000</A>
Basically, the article is a description of how to remove or bypass the filters and it works. I've done the mod on a Geforce256, a Geforce 2 (which was one of the cards that was blurry at 800x600), and I adapted the mod to an old ATI All-in-Wonder (original). From what I understand the problem is rare in ATI cards but this one was blurry at 1280x1024 (60hz) but not any more.

<b>99% is great, unless you are talking about system stability</b>
 

starwrs3

Distinguished
Jul 11, 2001
167
0
18,680
phsstpok got it right: the ATI and Matrox cards use higher-quality filters, and this is what gives them better image quality than NVidia cards. The RAMDAC just allows higher refresh rates and higher resolutions.
 

vacs

Distinguished
Aug 30, 2002
239
0
18,680
The RF filter on retail GeForces is independent of the nvidia reference board: some GeForce vendors use cheap filters, some don't... The RAMDAC, however, is the same across all nvidia graphics cards...
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
Unfortunately, there is no way to tell how good a particular video card's filters are without using and testing the video card.

ATI and Matrox do have a reputation for good quality. Time will tell for nVidia's partner cards. For any other manufacturer's card you should see for yourself, or at least try to get reports from others on that particular model. Quality varies from manufacturer to manufacturer and from card to card. If anyone says a card is blurry at any resolution of 1600x1200 or below, it probably has inferior filters and should be avoided, unless you only plan to use lower resolutions.

<b>99% is great, unless you are talking about system stability</b>
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
I didn't mean to imply that all nVidia cards had problems, only some. Plus, I did say that nVidia has addressed the problem.

For those who doubt that the filters matter, ask a Matrox user, particularly one running dual displays. The second RAMDAC on G400 and G450 cards is only the 230 MHz variety. This limits the maximum resolution of the second display, but the visual quality is still good at the usable resolutions.
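To see roughly what a 230 MHz RAMDAC can still drive, one can invert the usual estimate (again assuming a rule-of-thumb ~32% blanking overhead; real VESA timings vary slightly per mode):

```python
# Rough estimate of the highest refresh rate a RAMDAC can sustain at a
# given resolution. The 1.32 blanking factor is an assumed rule of thumb.

def max_refresh_hz(width: int, height: int, ramdac_mhz: float,
                   blanking_factor: float = 1.32) -> float:
    """Highest refresh rate (Hz) the RAMDAC can drive at this mode."""
    return ramdac_mhz * 1e6 / (width * height * blanking_factor)

# Matrox G400/G450 secondary RAMDAC: 230 MHz
print(round(max_refresh_hz(1600, 1200, 230)))  # -> 91  (still fine)
print(round(max_refresh_hz(1280, 1024, 230)))  # -> 133
print(round(max_refresh_hz(2048, 1536, 230)))  # -> 55  (too low for CRT)
```

So 230 MHz comfortably handles 1600x1200 and below, which matches the observation that the second head's limit only bites at the very highest resolutions.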

<b>99% is great, unless you are talking about system stability</b>