I just managed to buy an ATi Radeon X800XL off eBay (and at the retail price of $299 US, no less!). I've been checking its benchmarks over the past week and they are very good; the X800XL is more or less equivalent to the 6800 GT (which I couldn't afford to begin with), so it's a good buy for me.
However, I noticed that the X800XL (in fact, all current ATi cards) only have DirectX 9.0b support whereas the current nVidia cards all have 9.0c. Does anyone know what the difference is between the two?
Who knows what the future holds for 9.0c and SM 3.0; these cards might handle future games a lot better than 9.0b cards.
Yes, there is a difference in texture quality. I own an X800XT as well as a 6800 128MB, and some games like Far Cry look better on the 6800 than on my X800XT. Using the command console, the 6800 has more graphical options and lighting enhancements that my X800XT just cannot touch.
Even The Chronicles of Riddick looks slightly better on a 6800.
I still use the X800XT over the plain 6800, though, because the 6800 only has 128MB compared to the X800XT's 256MB; the 6800 judders from time to time because of the memory.
But the 6800GT, in my opinion, is a good card because of its SM 3.0 support and 256MB of memory.
Don't worry about SM3.0; you'll get better performance per dollar out of the X800XL, with the same image quality in most games compared to the IQ of SM3.0-enabled cards.
True, wusy is correct. Bucks for power, I would go for the X800XL above any card on the market today.
With games like Far Cry, only Superman's eyes could spot the IQ difference between the two.
That's not true at all. Only someone who can compare the two side by side can say that, and in this case you cannot, Mr 9800.
Plus, if you saw a comparison on some site months ago, you should try the game today with the latest patches and drivers. I play Far Cry a lot and tweak the graphics heavily to get the most out of it; the 6800 has better picture quality.
Sorry to necromance a three-year-old thread, but some of us are still very content with reliable Socket 478 motherboards that use the AGP 8x bus. :-)
SM3.0 = Shader Model 3.0, correct? I have seen it appear as a recommended system requirement (e.g. on COD4), but the minimum system requirements for game titles are usually just DirectX 9.0 and sometimes DirectX 9.0c.
Are applications (really games!) specifying DirectX 9.0c as a minimum requirement typically going to fail to run (or run poorly) if I have installed a video card that only boasts DirectX 9.0b support?
My current video card is the nVidia GeForce FX 5600 256MB DDR 128-bit, which boasts support for DirectX 9.0.
The various video cards I am considering an upgrade to are:
(i) nVidia Geforce 7600GT 512MB DDR2 128bit
(ii) nVidia GeForce 7300GT 512MB DDR2 128bit
(iii) ATI RADEON X800XT 256MB DDR3 256bit
(iv) ATI RADEON X800PRO 256MB DDR3 256bit
(i) and (ii) claim DX9.0c support and are priced around AUD$145, while (iii) and (iv) claim only DX9.0 support (however, a Wikipedia comparison of ATI graphics processors indicates that these GPUs support DX9.0b) and are priced around AUD$115.
A not-so-subtle feature of the two ATI cards is a much larger-diameter fan than on the two nVidia-based cards. I presume this will yield a quieter cooling solution.
So everything about the X800XT/X800PRO looks better to me, except maybe the annoying absence of explicit DirectX 9.0c support.
Does anyone think I will regret choosing (iii) or (iv) over (i) or (ii)?
I recently got two 7800GTXs on eBay for $50.00 USD. When I SLI'd them, I can't really say DX9 looks bad in Sins of a Solar Empire. I would go for the 7900GTX or X800PRO if you can find them. The majority of games are still DX9.0c; Red Alert 3 and Fallout 3 are also going to be DX9.0c.