2900XT review on VR-Zone

Guest
I guess Nvidia prices are not coming down then. What do you guys think?
 
I've been reading them, but I haven't pulled up the test platform yet. What CPU did they use? The 3DMark scores look OK, but the comment about a possible CPU bottleneck gives me concern. Also, they say huge driver improvements are needed, and that performance jumped as they went from one driver to the newer one.
 
Guest
They used an X6800. The card requires more power than the GTX, and performance is only about as good as the 8800GTS, in some cases less than the 8800GTS 320. Yeah, performance did increase with drivers, but I don't think it will increase that much more with drivers. What do you think? I am kind of disappointed now after reading this review. All the wait and the R600 hype was not worth it.
 
Guest
I mean, come on, ATI is even losing in the image quality tests. ATI was always better at that than Nvidia.
 

SEALBoy
Wow the pages are taking forever to load...

On the first page it says the X2600 draws only 45W of power. That's insane. If ATI can get it to perform faster than the abysmal nVidia 8600 cards, they can really take over the mid-range market.
 
I hate to say this, but according to FUaD, the new drivers really help. They (VR) said they saw improvements of 5 to 30 percent, and that's still not the newest drivers. And Oblivion saw great improvements. I think in the end what we will all see is that it'll be better than the GTS, nip the GTX in a few games, and be priced very competitively.
 

xela
Well.. Nvidia still might drop prices, and honestly.. I think they already did :)

I am planning to buy a new computer and was waiting for decent 2900 benches before ordering. Three weeks ago I couldn't get a GTS 320 for any less than 325 Euro; it's +/- 250 Euro now, and you can get a 640MB version for around 350, so I would call that a considerable price drop.

They didn't have to drop prices.. they will, however (in my opinion), stab ATI/AMD like Intel would when they are at their weakest. AMD/ATI had better get their act together or we will be stuck with an Intel/Nvidia monopoly before long, and god knows Intel has enough cash to buy three Nvidias....

I'll wait until the end of the month, and if ATI doesn't release a good driver by then.. they've lost another client :(
 
Guest
I'm hoping you're right, man. But if I'm right, Nvidia has some improvements pending for their drivers too. I think the drivers will mature in a month or two, and then we'll really know.
 
I mean, come on, ATI is even losing in the image quality tests. ATI was always better at that than Nvidia.
These are VERY immature drivers.





[ATi Radeon 2000 Series Launch: X2900XT Review]
Page Title: Test Platform, Drivers
Category: GPUs & Graphic Cards
Type: Reviews
Posted By: Shamino
Date: May 14, 2007, 3:55 am
Source: ATi




When I first started testing the card, I was using the 8.36 Catalyst Drivers.






Then, after I had finished the test runs for the X2900XT, the 8.37 drivers came out and I had to rerun everything.






Thus, I took the chance to also compare these two minor driver updates. On the 8.37, there is no 'High Quality' option under the AF settings with the X2900XT, while there is on the 8.36. The option is also there when I put in the X1950XTX.





Seeing that the High Quality option for anisotropic filtering was missing on the X2900XT with the 8.37 drivers when it was present on the 8.36 drivers, I guessed that AF was automatically set to best quality when enabled for the X2900XT on this new set of drivers. So I ran a comparison between the two drivers in Oblivion to check out the anisotropic filtering: 1600x1200, 16x AF (High Quality where the option was available), Temporal Anti-Aliasing at the 8x level and the Wide-Tent Filter set at 16x sampling.






The filtering on the 8.37 is definitely at least on par with, or even better than, the High Quality setting on the 8.36. You get the faint impression that textures are slightly more detailed on the 8.37, so I didn't really mind that the High Quality option was missing on the 8.37 drivers with the X2900XT.


--------------------------------------------------------------------------------





Platform Test Setup

CPU: Intel Core 2 Duo Extreme Edition X6800 overclocked @ 9 x 366MHz = 3.3GHz
Motherboard: ASUS P5K Deluxe (Intel P35 Chipset)
Memory: 2 x 1GB GSkill F2-8000PHU2-2GBHZ DDR2 @ CL5-5-5-15, DDR2 915MHz, 5:4 divider
Graphics Cards:
- ATi HD X2900XT 743/828MHz (USD$399)
- ASUS EN8800GTS 640MB 513/792MHz (USD$399)
- Inno3D 8800GTX 575/900MHz (USD$529)
- EVGA 8800GTS 320MB Superclocked 576/860MHz (USD$299)
- ASUS 1950XTX 648/1000MHz (USD$433)
Hard Disk Drives: Seagate 80GB and 250GB Barracuda SATA
PSU: SilverStone Zeus ST85ZF
Operating System: Windows XP Pro
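As a quick aside, the overclock figures in that setup are easy to sanity-check; here is the arithmetic as a small illustrative Python snippet (the input values come from the table above, the snippet itself is just for demonstration):

```python
# Sanity check of the overclock arithmetic in the test setup above
# (illustrative only; the input values are taken from the table).

FSB_MHZ = 366          # front side bus clock
CPU_MULTIPLIER = 9     # multiplier used on the X6800 in the review
MEM_DIVIDER = 5 / 4    # the 5:4 memory divider

cpu_mhz = FSB_MHZ * CPU_MULTIPLIER       # 3294 MHz, i.e. roughly 3.3GHz
mem_clock_mhz = FSB_MHZ * MEM_DIVIDER    # 457.5 MHz actual memory clock
ddr2_effective = mem_clock_mhz * 2       # DDR doubles the effective rate -> 915

print(f"CPU: {cpu_mhz} MHz (~{cpu_mhz / 1000:.1f} GHz)")
print(f"Memory: DDR2-{ddr2_effective:.0f}")
```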


One look and you can tell which segment this video card is gunning for: the USD$399 price point where the GeForce 8800 GTS, its direct competitor, resides.

The driver used on the X2900XT and X1950XTX is Catalyst 8-37-4-070419a. The driver used for the 8800GTS 320/640MB and 8800GTX is ForceWare 158.22.

The MipMap Detail setting in all drivers was set to the maximum level of High Quality, and 16x anisotropic filtering was turned on.





As of the time of testing, we did not have the latest build, issued just 3 days before the NDA was lifted. We were running 8-37-4-070419a.
The latest 8.37.4.2_47323 driver is supposed to implement a new intelligent algorithm that increases FPS while delivering similar image quality when running Adaptive Anti-Aliasing. In Oblivion, performance several times faster than with previous drivers was claimed to have been achieved using the new adaptive AA algorithm. New optimizations for HDR applications in general are said to have resulted in a 5-30% increase in performance.

The 8.37.4.2_47323 is actually a pre-alpha driver, but it includes a preview of the new 12xAA and 24xAA modes. These modes use an advanced edge-detection filter that delivers edge quality while eliminating blurring.
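For anyone wondering what an "edge detection filter" for anti-aliasing means in principle, here is a minimal sketch of the general idea, written as illustrative Python with NumPy; this is not ATi's actual algorithm, and the threshold and filter width are made-up parameters. The point is simply that wide, blur-like filtering is applied only where an edge is detected, so flat surfaces keep their sharpness:

```python
# A rough illustration of edge-detect filtering for an AA resolve
# (NOT ATi's actual implementation; parameters are made up).
import numpy as np

def edge_detect_resolve(samples, edge_threshold=0.1):
    """samples: H x W x N array of N AA samples per pixel (greyscale for simplicity)."""
    center = samples.mean(axis=2)               # plain box resolve of all samples
    gy, gx = np.gradient(center)                # simple gradient-magnitude edge detector
    edges = np.hypot(gx, gy) > edge_threshold
    h, w = center.shape
    padded = np.pad(center, 1, mode="edge")     # 3x3 wide filter, used only on edges
    wide = sum(padded[y:y + h, x:x + w] for y in range(3) for x in range(3)) / 9.0
    return np.where(edges, wide, center)

# Hypothetical usage: 4 samples per pixel on a 64x64 tile
tile = np.random.rand(64, 64, 4)
resolved = edge_detect_resolve(tile)
```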



As you read through this, you'll see that the drivers are the quality problem. Give it time; the release drivers will be OK, but there'll still be huge improvements as we go along. If I remember right, the 1900 struggled against the 7900 when first released, then pulled away as the drivers matured.
 

Periander
The conclusion:

In many non-Anti-Aliasing, high-definition game settings, you have seen the X2900XT push ahead of the performance of its closest competitor, the GeForce 8800GTS 640MB, sometimes by quite a large margin, sometimes falling behind or ahead by a small percentage. In a select few games the GTS is slightly faster, and vice versa. When Anti-Aliasing is turned on, the X2900XT showed that it carries it off with great efficiency in games the drivers are optimized for, performing significantly better than the GTS, while the AA efficiency is piss-poor in some games due to the raw driver, which has not fully blossomed to take advantage of ATi's new GPU technology. Just take a look at how performance has jumped from the 8.36 to the 8.37 drivers; that shows the potential for performance growth... a whole lot of it to reap.


It is slightly off tradition that the GPU company's flagship product sails off not to meet the flagship of its competitor, but one target lower. Then again, the lower we go down the price pyramid, the bigger the audience and the more people with the budget to spend. I'd say there is no clear winner between the 8800 GTS and the X2900XT: the GTS displayed more consistent performance behavior, while the X2900XT fluctuates due to the immature drivers. I would say that despite the heat thrown out by the GPU, the X2900XT overclocks better than the 8800GTS by 8-10%, but that means putting out even more heat and drawing even more power than it already consumes. So this is something potential XT buyers should take note of: the heat produced by the card is no small amount, nor is the power consumed by it - more than 60W over the GTS. What you would be investing in is a higher potential of upcoming performance boosts (including the latest pre-alpha 8.37.4.2_47323 Catalyst released just 3 days before this review), plus full HDCP support with an integrated audio controller, and of course the new programmable tessellation technology, which we will probably not see supported in games until much later.

Not the fastest video card on the market for sure, but it definitely holds its own at its current price point. We only hope that supply will be adequate and not lead to an indirect increase in prices due to shortages. We hope to see some interesting implementations from the various card partners as well, be it overclocked specifications or improved coolers.

Should be no surprise to anyone.
 

SEALBoy
Considering the card's architecture (512-bit bus, 64 unified shaders), I suppose it was just a matter of drivers that dragged it back so much.
 

Greaper08
These are VERY immature drivers.

And you are very much in denial :roll: You have an argument against every benchmark that comes out showing that the R600 is not going to be worth the wait after all.

Get over yourself already FANBOY :roll:
 
I ran 3D Mark 05 with a 'modestly' overclocked quad core at 4.2GHz and, at 1GHz core and 1030MHz memory, got well over 24,000 in 3D Mark 05! This card is strong in this benchmark. I noticed the benchmark became really CPU-bottlenecked even at 4.2GHz, as scores went up only a little even as I overclocked the GPU a lot. A 5.2GHz CPU could perhaps take it really close to the 30,000 mark.
I'm thinking that there's more to the story, as they also said the 2900 will OC 8-10% better than the GTS.
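To make the CPU-bottleneck reasoning in that quoted overclocking result concrete, here is a rough way to gauge it, with made-up numbers rather than the review's actual figures: if a large GPU overclock produces only a small score gain, the benchmark is being limited by the CPU rather than the card.

```python
# Rough CPU-bottleneck check with hypothetical numbers (not from the review):
# how much of the GPU clock gain actually shows up as a score gain?

def scaling_efficiency(score_before, score_after, clock_before, clock_after):
    """Roughly 1.0 when the score scales with the GPU clock, near 0 when CPU-limited."""
    score_gain = score_after / score_before - 1
    clock_gain = clock_after / clock_before - 1
    return score_gain / clock_gain

# Example: a ~12% core overclock that only yields ~2% more points
print(scaling_efficiency(24000, 24500, 890, 1000))  # ~0.17 -> heavily CPU-limited
```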
 
Guest
I don't know, man. First it was that the boards are not mature, and now the drivers are not mature. We can say the drivers are not mature, but drivers won't bump up the performance by a huge amount, or would they? Nvidia's drivers are not mature either, so I guess we have a driver war now.
 
Guest
And please, we are here to discuss, not to blame each other, so keep it simple.
 

blade85
I don't know, man. First it was that the boards are not mature, and now the drivers are not mature. We can say the drivers are not mature, but drivers won't bump up the performance by a huge amount, or would they? Nvidia's drivers are not mature either, so I guess we have a driver war now.

Well, according to that review, going from Catalyst 8.36 to Catalyst 8.37 resulted in performance going up by 11% in COH and 42% in Quake 4. Sooo, you never know :)
 

Slobogob
I don't know, man. First it was that the boards are not mature, and now the drivers are not mature. We can say the drivers are not mature, but drivers won't bump up the performance by a huge amount, or would they? Nvidia's drivers are not mature either, so I guess we have a driver war now.
You should try running an 8800 GTX with the standard Windows VGA driver and then come back and repeat what you just said. 8)