need quality

Archived from groups: alt.comp.periphs.videocards.nvidia

how does nvidia stack up against ati as far as texture quality?
 

"Mega Man" <nobody@nobody.com> wrote in message
news:1A8Wc.3015$VY.2459@trndny09...
> how does nvidia stack up against ati as far as texture quality?


Both good. I think it depends on which drivers you're using with each one.

Gary
 

Mega Man wrote:

> how does nvidia stack up against ati as far as texture quality?
>
>

Excellent, if not significantly better. ATI's recent hardware has skimped
on internal texture filtering precision as a way of reducing total chip
workload. That improves overall chip efficiency, but the side effect is
increased aliasing.

Most image quality comparisons incorrectly interpret aliasing and
lightly filtered texels as "sharpness" in screenshots. That, however, is
not how the image should be rendered.

Internally, the R3xx and above use 5-bit blend precision for mipmap
levels beyond the first. This allows for many fewer gradients and overall
lower quality filtering as the angle of incidence to a plane decreases.
NVIDIA uses the de facto standard of 8-bit precision (started by SGI's
OpenGL reference).
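
A rough sketch of what that precision gap means (my own illustration in
C, not vendor code; quantize_frac is a hypothetical helper): a blend
fraction between two mip levels quantized to 5 bits allows only 32
distinct steps, versus 256 at 8 bits.

#include <stdio.h>

/* Quantize a trilinear blend fraction in [0,1) to n bits of precision. */
float quantize_frac(float frac, int bits)
{
    int steps = 1 << bits;            /* 32 steps at 5 bits, 256 at 8 bits */
    return (float)((int)(frac * steps)) / (float)steps;
}

int main(void)
{
    float lod_frac = 0.371f;          /* fractional part of the LOD */
    printf("5-bit blend: %f (32 gradients)\n",  quantize_frac(lod_frac, 5));
    printf("8-bit blend: %f (256 gradients)\n", quantize_frac(lod_frac, 8));
    return 0;
}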

ATI also uses an adaptive trilinear technique that cannot be disabled
by the user or developer. This can yield performance increases of up to
30% when anisotropic filtering is enabled. NVidia has some similar
techniques (using a different method, though), BUT with one big
difference: there is a setting that disables either the trilinear or the
anisotropic optimizations.
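
A sketch of the general idea behind such adaptive trilinear (the commonly
described approximation, not either vendor's actual formula; band is a
hypothetical tuning parameter): snap the blend toward pure bilinear
except in a narrow band around each mip transition, which saves work most
of the time but can band and alias in motion.

/* Assumed approximation of adaptive ("brilinear") filtering. */
float adaptive_blend(float frac, float band)
{
    if (frac < band)        return 0.0f;   /* sample lower mip only  */
    if (frac > 1.0f - band) return 1.0f;   /* sample higher mip only */
    /* re-stretch the remaining range so the blend stays continuous */
    return (frac - band) / (1.0f - 2.0f * band);
}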

Differencing algorithms do not show a great difference between older
R250 in-game screenshots (which have no adaptive trilinear) and the R3xx,
but overall it contributes to much greater texture aliasing, especially
during movement.
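
For reference, the kind of differencing meant here (a minimal sketch of
my own, not any specific tool): a per-pixel absolute difference of two
same-sized screenshots. A static diff like this understates temporal
aliasing, which only shows up in motion.

#include <stdlib.h>

/* Byte-wise absolute difference of two equal-sized RGB screenshots. */
unsigned char *diff_image(const unsigned char *a, const unsigned char *b,
                          size_t n_bytes)
{
    unsigned char *out = malloc(n_bytes);
    if (out == NULL)
        return NULL;
    for (size_t i = 0; i < n_bytes; i++)
        out[i] = (unsigned char)abs((int)a[i] - (int)b[i]);
    return out;
}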

Currently I'm using an FX 5600 (the series that introduced "brilinear"
filtering as an aniso optimization), but with the newest drivers I can
disable both the trilinear and AF optimizations and it looks absolutely
beautiful.

Brilinear has evolved in the GF6 series, however, and you shouldn't
think twice about using the optimization on those cards, as it looks
MUCH, MUCH better and doesn't detract from image quality when moving.
 

> ......This allows for many fewer gradients and overall lower quality
> filtering as the angle of incidence to a plane decreases.
But doesn't aliasing become less apparent as the angle decreases anyway?
That's what I've read and observed. So why not give up some filtering
quality that's not needed to gain extra performance?
Right now I have 2 ATI cards and 2 nVidia cards (Radeon 8500, Ti4400, Radeon
9800Pro and eVga 6800GT.) I figure I'll become brand loyal when one of those
companies becomes Me loyal.
One of the things I liked about ATI cards was that they seemed to have more
gradients (smoother transitions between colors) than the nVidia cards. Not
just in games, but in 2D also, in pictures and the desktop. Low quality
pictures, especially, looked better on the ATI.
Regardless, that's changed with the 62.xx series drivers, especially with
the 6800 series cards. The Ti4400 with 65.62 drivers is very nice, and the
6800gt is even better. I can honestly say that I think the texture quality
between ATI and nVidia is even now. Until ATI does something about the
poor performance of their OpenGL drivers, nVidia is definitely the way to
go for any newer games that use that API.
Gary

--
Tweaks & Reviews
www.slottweak.com
 

GTX_SlotCar wrote:
>>......This allows for many fewer gradients and overall lower quality
>>filtering as the angle of incidence to a plane decreases.
>
> But doesn't aliasing become less apparent as the angle decreases anyway?
> That's what I've read and observed. So why not give up some filtering
> quality that's not needed to gain extra performance?
> Right now I have 2 ATI cards and 2 nVidia cards (Radeon 8500, Ti4400, Radeon
> 9800Pro and eVga 6800GT.) I figure I'll become brand loyal when one of those
> companies becomes Me loyal.
> One of the things I liked about ATI cards was that they seemed to have more
> gradients (smoother transitions between colors) than the nVidia cards. Not
> just in games, but in 2D also, in pictures and the desktop. Low quality
> pictures, especially, looked better on the ATI.
> Regardless, that's changed with the 62.xx series drivers, especially with
> the 6800 series cards. The Ti4400 with 65.62 drivers is very nice, and the
> 6800gt is even better. I can honestly say that I think the texture quality
> between ATI and nVidia is even now. Until ATI does something about the
> poor performance of their OpenGL drivers, nVidia is definitely the way to
> go for any newer games that use that API.
> Gary
>

Perhaps I worded it poorly, but by decreasing angle I'm referring to a
"sun on the horizon" situation. At a certain point, all the mipmaps
become too small anyhow. But generally, in games with a 90-degree FOV,
we're observing at about a 30-degree angle to the floor plane, and that's
where I'm coming from.

The absolute ideal would be a raytraced scene taking many point samples
for each given pixel and oversampling, but trilinear filtering is used as
an abstraction of that concept (the contribution of neighboring texels as
distance increases and apparent resolution decreases, such as the ability
to discern details at X number of arc seconds).
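
In textbook form, that abstraction looks something like this (a minimal
sketch; bilinear_sample is a hypothetical helper standing in for a real
texture fetch): blend the two nearest mip levels by the fractional LOD,
approximating what many point samples per pixel would average out to.

typedef struct { float r, g, b; } Color;

/* Hypothetical bilinear fetch from one mip level of a texture. */
Color bilinear_sample(int mip_level, float u, float v);

/* Standard trilinear filtering: lerp between adjacent mip levels. */
Color trilinear_sample(float u, float v, float lod)
{
    int   level = (int)lod;        /* integer mip level       */
    float frac  = lod - level;     /* blend fraction in [0,1) */
    Color lo = bilinear_sample(level,     u, v);
    Color hi = bilinear_sample(level + 1, u, v);
    Color out = {
        lo.r + (hi.r - lo.r) * frac,
        lo.g + (hi.g - lo.g) * frac,
        lo.b + (hi.b - lo.b) * frac,
    };
    return out;
}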

3dcenter.org has an excellent article detailing ATI's filtering methods.

You won't have to worry about 2D and overall image quality anymore with
NVidia-based cards. NVidia as a company has never actually built cards;
it has always sold chip designs. The chips are made by a foundry (TSMC,
IBM) and sold to card makers, and the manufacturers then create their
cards from the reference design. As such, NVidia historically had little
quality control over the final retail product.

That has changed, however. There are now set standards for which
components (2D RF filters, RAMDACs, etc.) and specifications a given
third-party manufacturer can choose. This has led to a huge increase in
overall consistency of quality. This system was implemented around the
time of the GeForce FX and has been in place ever since.