I qualify it as a better choice per my reasoning above, restated simply as:
When two video cards both get 100+ FPS in standard benchmarks at high detail settings, they are, for all intents and purposes, equal... the difference in framerates cannot be perceived by the naked eye.
i.e. if a Ti4600 gets 159 FPS in Quake 3 at 1600x1200 and the 9600 PRO gets 114 FPS, I will not perceive a difference, so who gives a crap?
BUT... when image-enhancing features like FSAA and anisotropic filtering are applied and the framerates drop well below 100 FPS, a sizable gap between how the cards perform makes a definite and perceivable difference.
i.e. if a Ti4600 gets 48 FPS in UT2003 at 1024x768 with 4xFSAA and 8x anisotropic filtering, and the 9600 PRO gets 64 FPS, I will see a difference and therefore, definitely, give a crap.
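If you want that rule of thumb spelled out, here's a quick sketch. The 100 FPS cutoff and the 10% "sizable" margin are my own rough numbers, nothing scientific:

```python
PERCEPTION_CUTOFF = 100.0  # above this, framerate differences don't register
SIZABLE_MARGIN = 0.10      # 10% relative difference counts as "sizable"

def who_gives_a_crap(fps_a: float, fps_b: float) -> bool:
    """True if the gap between two cards is actually worth caring about."""
    if min(fps_a, fps_b) >= PERCEPTION_CUTOFF:
        return False  # both fast enough that the eye can't tell them apart
    return abs(fps_a - fps_b) / min(fps_a, fps_b) > SIZABLE_MARGIN

print(who_gives_a_crap(159, 114))  # Quake 3 @ 1600x1200 -> False
print(who_gives_a_crap(48, 64))    # UT2003 w/ 4xFSAA + 8xAF -> True
```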
Note the following benchmarks in support of my opinion:
http://firingsquad.gamers.com/hardware/ati_radeon_9600_pro_review/page5.asp
On a side note, I do have to agree with Eden that addiarmadar's suggestion that 8x AGP makes a performance difference is unfounded.
The only time I have seen 8x AGP make a difference in a benchmark is when bandwidth was stretched to the limit, with a top-of-the-line Radeon 9800 PRO doing 4xFSAA at super-high resolutions.
Every benchmark I've ever seen shows that most cards, even fairly strong ones like the Ti4600, see no benefit from running 8x AGP.
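Just to put rough numbers on why: 8x AGP doubles the theoretical peak over 4x, but most cards never come close to saturating even 4x, presumably because textures mostly sit in local video memory anyway. A quick back-of-the-envelope, assuming the standard 66 MHz AGP clock and 32-bit bus:

```python
BASE_CLOCK_HZ = 66.66e6  # standard AGP base clock
BUS_WIDTH_BYTES = 4      # 32-bit bus

for mult in (4, 8):
    gb_s = BASE_CLOCK_HZ * BUS_WIDTH_BYTES * mult / 1e9
    print(f"AGP {mult}x theoretical peak: ~{gb_s:.2f} GB/s")
# AGP 4x theoretical peak: ~1.07 GB/s
# AGP 8x theoretical peak: ~2.13 GB/s
```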