ATi Hits Back (Again) with Mid-Range X700 Line, Driver Tweak

Catalyst A.I., Continued

A.I. also analyzes the textures a game loads and decides which "optimizations" should be applied to get the best performance from that game. Again, the user is forced to rely on ATi's promise that image quality isn't noticeably degraded in any way. ATi even goes so far as to claim that image quality may actually be improved in some cases:

"CATALYST A.I. makes use of ATI's new texture analyzer technology to optimize performance in a 3D application while maintaining or even improving image quality."

Sounds great, but again we need to read between the lines. The "improved" image quality in this case is not measured against the reference images from Microsoft's reference rasterizer (Refrast), which are in effect "unoptimized" images, but against the optimization techniques ATi has been applying so far. In UT2003/2004, for example, A.I. ensures that texture stage 0 is no longer the only one to receive tri- or "brilinear" filtering; depending on A.I.'s analysis, higher texture stages may now be filtered as well. This is certainly an improvement over the current optimizations, but just as certainly it can't be better than full, real trilinear filtering on all texture stages.
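To make the texture-stage point concrete, here is a minimal Direct3D 9-style sketch of what a game like UT2003 asks the API for; the function name and stage count are our own illustration, not ATi's driver code, which intercepts these requests internally:

```cpp
#include <d3d9.h>

// Illustrative only: what an application *requests* when it wants full
// trilinear filtering on every texture stage. The driver optimization
// described above silently downgrades some of these requests.
void RequestTrilinearOnAllStages(IDirect3DDevice9 *device, DWORD stageCount)
{
    for (DWORD stage = 0; stage < stageCount; ++stage) {
        device->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
        device->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        // D3DTEXF_LINEAR on the mip filter is what turns bilinear into
        // trilinear. The old optimization honored this only on stage 0 and
        // substituted "brilinear" (a bilinear/trilinear mix) elsewhere;
        // A.I. now decides per texture which stages get the real thing.
        device->SetSamplerState(stage, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
    }
}
```

The point is that the downgrade happens behind the application's back: the game has no way to know which of its filtering requests were actually honored.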

Aside from features such as disabling FSAA in applications that are incompatible with it, it would be nice to see the graphics chip makers use application detection in ways that benefit only the user. One possibility would be to substitute a different FSAA mode, making FSAA available in problematic games after all. Another would be to genuinely improve image quality, applying better filtering methods or texture anti-aliasing in games that would benefit from them.
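As a thought experiment, such a user-friendly profile system might look something like the sketch below. Everything here (the type names, modes, and executable names) is invented for illustration; the point is simply that every override raises quality or restores a feature, never the reverse:

```cpp
#include <map>
#include <string>

// Hypothetical driver-side application profiles that only ever *raise*
// quality or restore broken features, never trade image quality for speed.
enum class FsaaMode { AppDefault, Multisample4x, Supersample4x };

struct AppProfile {
    FsaaMode fsaa;       // e.g. fall back to supersampling where MSAA breaks
    bool forceTrilinear; // opt a game *into* better filtering, not out of it
};

// Executable names and settings are made up for the sake of the example.
static const std::map<std::string, AppProfile> kProfiles = {
    { "msaa_broken_game.exe", { FsaaMode::Supersample4x, false } },
    { "shimmering_game.exe",  { FsaaMode::AppDefault,    true  } },
};

AppProfile LookupProfile(const std::string &exeName)
{
    auto it = kProfiles.find(exeName);
    return it != kProfiles.end() ? it->second
                                 : AppProfile{ FsaaMode::AppDefault, false };
}
```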

It makes you wonder what will come next. Will we see drivers that replace an object's entire polygon model, because a lower-poly model still looks "equivalent" while improving performance? Most game developers are clear on this matter: they want the cards to render exactly what they specified. Of course, reality shows us a different picture (quite literally). Both ATi and NVIDIA have been employing performance optimizations for quite a while now (see also ATI's Optimized Texture Filtering Called Into Question). No product from either company can be said to render truly "correctly"; if anything, the results they produce are "equivalent". At least NVIDIA has learned from last year's protests by users and press alike and now allows these optimizations to be disabled. ATi has so far proved quite stubborn in this respect, but the company is taking a step in the right direction by making A.I. something that can be switched off.

Don't think that this optimization craze is confusing only for users. Classic apples-to-apples comparisons haven't been possible for some time, since only the chip makers and their driver teams actually know what is being "optimized", where, and to what extent. The new "adaptive" performance optimizations that ATi will be introducing with this driver make a precise analysis next to impossible.