Misrach

Distinguished
Mar 25, 2006
61
0
18,630
I recently upgraded from an nVidia 7800GTX 256MB card to an ATi X1900XT. As expected, the ATi card is significantly faster (a 20-50% increase in framerates), but the textures seem a bit blurrier or muddier. I've noticed this in both F.E.A.R. and Oblivion. At one point in Oblivion, the ground textures inside an Oblivion gate were almost Doom 1 quality.

Okay, maybe not that bad; maybe more like Doom 3 on the Xbox (textures close-up). Still, it was noticeably low-rez. I've seen this to a lesser extent on other objects as well, such as rocks, doors, and grass. Also, when anti-aliasing is turned on, its effect doesn't seem as pronounced as it was on my nVidia card (which would chug at extremely slow framerates with AA on). It's almost as if the textures were too small for the resolution, resulting in blockier textures that don't smooth out as easily.

I've tried both the ATi and Omega drivers with the same results (6.4 and 6.3, respectively). Is this an issue with ATi drivers in general? I'm running all games at 1920x1200 -- which is also the native rez of my monitor -- with almost everything turned up. The texture and mip-map settings in the ATi control panel are set to Highest Quality. Specifically, when playing Oblivion, I had HDR on and AA off. With HDR off, the chameleon effect on my sword resulted in a screen-door effect.

Please, only relevant answers. I'm not trying to start a flame war, just trying to get the most out of my new videocard.
 

NumenorLord

Distinguished
Mar 13, 2006
248
0
18,680
Turn AA and shadows up. The shadows were horribly blocky in Oblivion until I turned self-shadows up. Beyond that, I think you want to set texture size to Large. I have a 256MB 6800 vanilla at 400/750, and Oblivion looks beautiful at medium-high settings (resolution of 1680x1050). Needless to say, your X1900XT should kill my GPU.
 

Misrach

Distinguished
Mar 25, 2006
61
0
18,630
I have AA off because the PC version won't, at this time, run HDR and AA together. Also, self-shadows have a known conflict with several cards, though it was worse on my 7800GTX. My texture settings are already set to Large.

The game looks very good for the most part, though jaggies do seem a bit more pronounced than they did on my 7800GTX, particularly on objects and buildings in the distance. Also, the muddy textures only show up when viewing objects close-up.

In F.E.A.R., I can still see jaggies even with 2x and 4x AA. I know that they're working, however, because some of the most severe offenders disappear with each AA sampling.

Framerates are very high in all games with eye candy turned up.
 
I have AA off because the PC version won't, at this time, run HDR and AA together.

Yes it will; use the Chuck patch.

As for the chameleon issue, there are known problems with the chameleon implementation: supposedly it changed between the pre-release version ATi had to test with and the final version. It seems neither nVidia nor ATi was ever given the updated version to test before launch.

The issues are supposedly going to be addressed as part of the unified driver in the 6.5 release.
 

cleeve

Illustrious
I'd be really interested in seeing these differences. Could you take some screenshots of the blurry textures?

At the risk of sounding overly skeptical: if you haven't seen the two side-by-side -- I assume you upgraded and don't still have the 7800 installed -- it might simply be a matter of you noticing a low-detail texture that you didn't notice before...
 

Misrach

Distinguished
Mar 25, 2006
61
0
18,630
I didn't want to apply the Chuck patch until I'd had the card up and running for a few weeks, especially now that I've noticed this "low-rez" issue. Since I just finished Oblivion over the weekend, I'm in no rush. Testing Far Cry with HDR and AA is my next step.

I'll try to get some screenshots ASAP, but it might not be until the weekend. I'm hoping that I just didn't notice the textures before, or that I have some setting at the wrong level (I've fiddled with it a lot, so I doubt that). Don't get me wrong: the nVidia card would reveal lower-resolution textures at close range as well. They just looked a little sharper on the nVidia card, and AA seemed to have a more pronounced effect.

To be honest, it won't bother me enough to RMA or upgrade the card. I just want to make sure that I've done everything to correct the matter, if it is indeed an issue.
 

choknuti

Distinguished
Mar 17, 2006
1,046
0
19,280
Did you try editing the ini file? Totally changed the experience for me.
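For anyone curious, these are a few of the detail-related lines people commonly tweak in Oblivion.ini (found under My Documents\My Games\Oblivion). The values below are illustrative defaults/examples, not a recommended config -- verify the key names against your own file before changing anything, and back up the original first, since the game will regenerate a default ini if it's deleted:

```ini
[General]
; Number of exterior cells loaded around the player (default 5).
; Raising this sharpens nearby terrain detail but is known to cause instability.
uGridsToLoad=5
uGridDistantCount=25

[Display]
; 0 = full-size textures; higher values skip mip levels (blurrier textures).
iTexMipMapSkip=0

[Grass]
; Distances at which grass fades/ends; higher = denser-looking ground cover.
fGrassStartFadeDistance=2000.0000
fGrassEndDistance=3000.0000
```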

@ NumenorLord
I know that this is off topic but where did you find that pic of Aribeth ?
 

Misrach

Distinguished
Mar 25, 2006
61
0
18,630
No, I didn't try editing the .ini file, but that shouldn't matter: I didn't edit the file under the nVidia card either.

I think part of the problem could lie in the D3D compressed textures. With my nVidia card, I couldn't find any option for D3D compressed textures. When I disabled the option on the ATi card, texture quality at close range improved slightly.

That said, when I disable D3D compressed textures with the X1900XT card, the load times for Oblivion increase significantly; they were never that high with the nVidia card. Also, the game can sometimes chug at unplayable framerates. And F.E.A.R. wouldn't play at all without D3D compressed textures enabled.
 

Vokofpolisiekar

Distinguished
Feb 5, 2006
3,367
0
20,790
Yes, I'd also like to see some screenshots for comparison (if I can remember where I post my pictures to). My texture quality is supreme in Oblivion (we're talking HDR lighting off bump-mapped textures). To date, my X1900 has been churning out stunning-quality textures.