2900XT review on VR-Zone

Everyone, check out the review of the 2900 XT on VR-Zone. The 2900XT is up against the 8800GTS, not the GTX, and in some tests the 2900XT is beaten by the 8800GTS 320MB.

http://vr-zone.com/?i=4946&s=1
  1. I guess Nvidia prices are not coming down then. What do you guys think?
  2. I've been reading them, but I haven't pulled up the test platform yet. What CPU did they use? The 3DMark scores look OK, but the comment about a possible CPU bottleneck is a concern. They also say huge driver improvements are needed, and that performance jumped as they went from one driver to the newer one.
  3. VR-Zone seems to be a bit slow today, as I can't pull up anything but the first page.

    Best,

    3Ball
  4. They used an X6800. The card requires more power than the GTX, and performance is only about as good as the 8800GTS, in some cases less than the 8800GTS 320. Yes, performance did increase with the newer drivers, but I don't think it will increase that much more from drivers alone. What do you think? I'm kind of disappointed after reading this review; all the wait and the R600 hype were not worth it.
  5. Quote:
    I guess Nvidia prices are not coming down then. What do you guys think?

    They probably wouldn't have come down any further even if the 2900XT were better. The X2900XT had better come down in price, though; otherwise no one will buy it.
  6. Yeah, I guess everybody is jumping on that review.
  7. I mean, come on, ATI is even losing the image quality tests, and ATI was always better than Nvidia at that.
  8. Wow, the pages are taking forever to load...

    On the first page it says the X2600 draws only 45W of power. That's insane. If ATI can get it to perform faster than the abysmal nVidia 8600 cards, they can really take over the mid-range market.
  9. I hate to say this, but according to Fuad the new drivers really help. VR-Zone said they saw improvements of 5 to 30 percent, and that's still not the newest drivers, and Oblivion saw great improvements. I think in the end we'll all see that it'll be better than the GTS, nip at the GTX in a few games, and be priced very competitively.
  10. Well, Nvidia still might drop prices, and honestly, I think they already did :)

    I'm planning to buy a new computer and was waiting for decent 2900 benchmarks before ordering. Three weeks ago I couldn't get a GTS 320 for any less than 325 euros; it's around 250 euros now, and you can get a 640MB version for around 350, so I would call that a considerable price drop.

    They didn't have to drop prices, but in my opinion they will stab ATI/AMD, like Intel would, while they're at their weakest. AMD/ATI had better get their act together, or we will be stuck with an Intel/Nvidia monopoly before long, and god knows Intel has enough cash to buy three Nvidias...

    I'll wait until the end of the month, and if ATI doesn't release a good driver by then, they've lost another customer :(
  11. I'm hoping you're right, man. But if I'm right, Nvidia has some driver improvements pending too. I think the drivers will mature in a month or two, and then we'll really know.
  12. Quote:
    I mean, come on, ATI is even losing the image quality tests, and ATI was always better than Nvidia at that.
    These are VERY immature drivers.
    Quote:
    [VR-Zone > ATi Radeon 2000 Series Launch: X2900XT Review - Page 10: Test Platform, Drivers. Posted by Shamino, May 14, 2007. Source: ATi]

    When I first started testing the card, I was using the 8.36 Catalyst drivers.


    Then, after I had finished the test runs for the X2900XT, the 8.37 drivers came out and I had to rerun everything.


    So I took the chance to also compare the difference between these two minor driver updates. On the 8.37 there is no 'High Quality' option under the AF settings for the X2900XT, while there is on the 8.36. The option is also present when I put in the X1950XTX.


    Since the High Quality option for anisotropic filtering was missing for the X2900XT on the 8.37 drivers but present on the 8.36 drivers, I guessed that AF was automatically set to best quality when enabled for the X2900XT on the new drivers. So I ran a comparison between the two drivers in Oblivion to check the anisotropic filtering: 1600x1200, 16x AF (High Quality where the option existed), Temporal Anti-Aliasing at the 8x level and Wide-Tent Filter set at 16x sampling.


    The filtering on the 8.37 is definitely at least on par with, or even better than, the High Quality setting on the 8.36. You get the faint impression that textures are slightly more detailed on the 8.37. So I didn't really mind that the High Quality option was missing on the 8.37 drivers with the X2900XT.


    --------------------------------------------------------------------------------


    Platform Test Setup

    CPU: Intel Core 2 Duo Extreme Edition X6800, overclocked @ 9 x 366MHz = 3.3GHz
    Motherboard: ASUS P5K Deluxe (Intel P35 chipset)
    Memory: 2 x 1GB GSkill F2-8000PHU2-2GBHZ DDR2 @ CL5-5-5-15, DDR2-915MHz, 5:4 divider
    Graphics Cards:
      ATi HD X2900XT 743/828MHz (USD$399)
      ASUS EN8800GTS 640MB 513/792MHz (USD$399)
      Inno3D 8800GTX 575/900MHz (USD$529)
      EVGA 8800GTS 320MB Superclocked 576/860MHz (USD$299)
      ASUS X1950XTX 648/1000MHz (USD$433)
    Hard Disk Drives: Seagate 80GB and 250GB Barracuda SATA
    PSU: SilverStone Zeus ST85ZF
    Operating System: Windows XP Pro
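
    (An aside, not part of the quoted review: the DDR2-915MHz figure follows from the 366MHz FSB, the 5:4 divider, and DDR2's double data rate. A minimal sketch of that arithmetic, assuming the usual FSB x ratio x 2 convention:)

    ```python
    # Sketch of the clock arithmetic behind the test setup above (not from the review).
    # Assumes the usual convention: DRAM clock = FSB x divider, DDR2 effective rate = 2x DRAM clock.

    fsb_mhz = 366              # front-side bus clock the X6800 was overclocked to
    cpu_multiplier = 9
    fsb_to_dram_ratio = 5 / 4  # the "5:4 divider" on the memory line

    cpu_clock_mhz = fsb_mhz * cpu_multiplier      # 3294 MHz, i.e. the quoted "3.3GHz"
    dram_clock_mhz = fsb_mhz * fsb_to_dram_ratio  # 457.5 MHz actual DRAM clock
    ddr2_effective_mhz = 2 * dram_clock_mhz       # 915 MHz, matching "DDR2-915MHz"

    print(cpu_clock_mhz, ddr2_effective_mhz)      # 3294 915.0
    ```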


    One look and you can tell which segment this video card is gunning for: the USD$399 price point where the GeForce 8800 GTS, its direct competitor, resides.

    The driver used on the X2900XT and X1950XTX is Catalyst 8-37-4-070419a; the driver used for the 8800GTS 320/640MB and the 8800GTX is ForceWare 158.22.

    The MipMap Detail setting in all drivers was set to the maximum level of High Quality, and 16x anisotropic filtering was turned on.


    As of the time of testing, we did not have the latest build, which was issued just 3 days before the NDA was lifted; we were running 8-37-4-070419a.
    The latest 8.37.4.2_47323 driver is supposed to implement a new intelligent algorithm that increases FPS while delivering similar image quality when running Adaptive Anti-Aliasing. In Oblivion, performance several times faster than the previous drivers was claimed using the new adaptive AA algorithm. New optimizations for HDR applications in general resulted in a 5-30% increase in performance.

    The 8.37.4.2_47323 is actually a pre-alpha driver, but it includes a preview of the new 12xAA and 24xAA modes. These modes use an advanced edge-detection filter that delivers edge quality while eliminating blurring.
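
    (Editorial aside, not from the review: ATi has not published how those edge-detect modes work. Purely as an illustration of the general idea of an edge-detect resolve, and NOT ATi's actual algorithm, a toy sketch might look like this: detect edges from local luminance gradients and apply the wide filter only where edges are found.)

    ```python
    # Toy illustration only: a hypothetical "edge detect" resolve, NOT ATi's driver code.
    # Idea: estimate the local luminance gradient per pixel and apply the wide (tent-style)
    # filter only along detected edges, leaving flat interior areas sharp.
    import numpy as np

    def edge_detect_resolve(lum, wide_filtered, threshold=0.1):
        """lum: HxW luminance of the frame (0..1); wide_filtered: the same frame after a
        wide tent-style blur. Returns a frame that uses the blurred samples only on edges."""
        gx = np.zeros_like(lum)
        gy = np.zeros_like(lum)
        gx[:, 1:-1] = lum[:, 2:] - lum[:, :-2]   # central-difference gradient in x
        gy[1:-1, :] = lum[2:, :] - lum[:-2, :]   # central-difference gradient in y
        edges = np.hypot(gx, gy) > threshold     # boolean edge mask
        return np.where(edges, wide_filtered, lum)
    ```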



    As you read through this, you'll see that the drivers are the quality problem. Give it time; the release drivers will be OK, but there'll still be big improvements as we go along. If I remember correctly, the X1900 struggled against the 7900 when it was first released, then pulled away as the drivers matured.
  13. The conclusion:

    Quote:
    In many non-anti-aliased, high-definition game settings, you have seen the X2900XT push ahead of its closest competitor, the GeForce 8800GTS 640MB, sometimes by quite a large margin, sometimes falling behind or ahead by a small percentage. In a select few games the GTS is slightly faster, and vice versa. When anti-aliasing is turned on, the X2900XT showed that it carries it off with great efficiency in games the drivers are optimized for, performing significantly better than the GTS; while AA efficiency is piss-poor in some games due to the raw driver, which has not fully blossomed to take advantage of ATi's new GPU technology. Just take a look at how performance improved from the 8.36 to the 8.37 drivers; that shows the potential for performance growth... a whole lot of it to reap.


    It is slightly off tradition that the GPU company's flagship product sails off not to meet the flagship of its competitor, but one target lower. Then again, the lower we go down the price pyramid, the bigger the audience and the more people with the budget to spend. I'd say there is no clear winner between the 8800 GTS and X2900XT: the GTS displayed more consistent performance behavior, while the X2900XT fluctuates due to the immature driver. I would say that despite the heat thrown out by the GPU, the X2900XT overclocks better than the 8800GTS by 8-10%, but at the cost of putting out even more heat and drawing even more power than it already consumes. So this is something potential XT buyers should take note of: the heat produced by the card is no small amount, nor is the power consumed by it - more than 60W over the GTS. What you would be investing in is a higher potential for upcoming performance boosts (including the latest pre-alpha 8.37.4.2_47323 Catalyst released just 3 days before this review), full HDCP support with an integrated audio controller, and of course the new programmable tessellation technology, which we will probably not see supported in games until much later.

    Not the fastest video card on the market for sure, but it definitely holds its own at its current price point. We only hope that supply will be adequate and will not lead to an indirect increase in prices. We hope to see some interesting implementations from various card partners as well, be it overclocked specifications or improved coolers.




    Should be no surprise to anyone.
  14. Considering the card's architecture (512-bit bus, 64 unified shader units), I suppose it was just the drivers that dragged it back so much.
  15. I can't even see any pictures :(
  16. Quote:
    These are VERY immature drivers.


    And you are very much in denial :roll: You have an argument against every benchmark that comes out showing that the R600 is not going to be worth the wait after all.

    Get over yourself already FANBOY :roll:
  17. Quote:
    I ran 3DMark05 with a 'modestly' overclocked quad core at 4.2GHz, and at 1GHz core / 1030MHz memory the card scored well over 24,000 in 3DMark05! This card is strong in this benchmark. I noticed the benchmark became really CPU-bottlenecked even at 4.2GHz, as scores went up only a little even as I overclocked the GPU a lot. A 5.2GHz CPU could perhaps take it really close to the 30,000 mark.
    I'm thinking there's more to the story, as they also said the 2900 will overclock 8-10% better than the GTS.
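
    A crude way to read that CPU-bottleneck observation is to compare how much the score moves against how much the GPU clock moves. A minimal sketch, with made-up numbers purely to show the method (they are not figures from the review):

    ```python
    # Hypothetical numbers, just to show how a CPU bottleneck shows up in overclock scaling:
    # if the GPU clock rises a lot but the score barely moves, the GPU is not the limit.

    def scaling_efficiency(score_before, score_after, gpu_clock_before, gpu_clock_after):
        score_gain = score_after / score_before - 1.0
        clock_gain = gpu_clock_after / gpu_clock_before - 1.0
        return score_gain / clock_gain   # near 1.0 = GPU-limited, near 0 = CPU-limited

    # Made-up example: core 743 -> 850 MHz while the 3DMark05 score goes 24000 -> 24400
    print(scaling_efficiency(24000, 24400, 743, 850))   # ~0.12, i.e. heavily CPU-bound
    ```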
  18. Did someone say something?
  19. I don't know, man. First it was that the boards weren't mature, and now the drivers aren't mature. We can say the drivers aren't mature, but drivers won't bump up the performance by a huge amount, or would they? Nvidia's drivers aren't mature either, so I guess we have a driver war now.
  20. Quote:
    I can't even see any pictures :(

    same here :cry:
  21. And please, we're here to discuss, not to blame each other, so keep it simple.
  22. Quote:
    I don't know, man. First it was that the boards weren't mature, and now the drivers aren't mature. We can say the drivers aren't mature, but drivers won't bump up the performance by a huge amount, or would they? Nvidia's drivers aren't mature either, so I guess we have a driver war now.


    Well, according to that review, going from Catalyst 8.36 to Catalyst 8.37 resulted in performance going up by 11% in COH and 42% in Quake 4. Sooo, you never know :)
  23. There will be performance boosts, and not just from drivers. The 8800s didn't do that well when they first came out either. In a few games they did, but overall, no.
  24. Quote:
    I don't know, man. First it was that the boards weren't mature, and now the drivers aren't mature. We can say the drivers aren't mature, but drivers won't bump up the performance by a huge amount, or would they? Nvidia's drivers aren't mature either, so I guess we have a driver war now.

    You should try running an 8800 GTX with the standard Windows VGA driver and then come back and repeat what you said. 8)
  25. Noooo, I can't even read the damn thing now... lol

    Now I get:

    Warning: mysql_connect() [function.mysql-connect]: Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2) in /home/vrz/public_html/start.php on line 2
    Database error: Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
  26. You're just way too mature for this thread.
  27. Quote:
    There will be performance boosts, and not just from drivers. The 8800s didn't do that well when they first came out either. In a few games they did, but overall, no.


    Absolute rubbish. Here was Anand's take on the G80 when it came out.

    http://www.anandtech.com/video/showdoc.aspx?i=2870&p=29

    Quote:
    A single GeForce 8800 GTX is more powerful overall than a 7900 GTX SLI configuration and even NVIDIA's mammoth Quad SLI. Although it's no longer a surprise to see a new generation of GPU outperform the previous generation in SLI, the sheer performance we're able to attain because of G80 is still breathtaking. Being able to run modern day games at 2560x1600 at the highest in-game detail settings completely changes the PC gaming experience. It's an expensive proposition, sure, but it's like no other; games just look so much better on a 30" display at 2560x1600 that it makes playing titles at 1600x1200 seem just "ok". We were less impressed by the hardware itself than by gaming at 2560x1600 with all the quality settings cranked all the way up in every game we tried, and that is saying quite a lot. And in reality, that's what it's all about anyway: delivering quality and performance at levels never before thought possible.

    Architecturally, G80 is a gigantic leap from the previous generation of GPUs. It's the type of leap in performance that's akin to what we saw with the Radeon 9700 Pro, and given the number of 9700 Pro-like launches we've seen, they are rare. Like 9700 Pro, we are able to enable features that improve image quality well beyond the previous generation, and we are able to run games smoothly at resolutions higher than we could hope for. And, like 9700 Pro, the best is yet to come.



    I can dig up tons more reviews that echoed those sentiments.
  28. I'm checking VR-Zone again, and it's way faster now.
  29. You have to remember a few things: these aren't the drivers we will get, Overdrive wasn't even an option, and this isn't their real, full review. Did anyone see anything about the HD sound at all? There'll be more, trust me.
  30. So they finally activated the review?
  31. Yeah, the sound in the 2900XT is a real plus point.
  32. So you're saying that, aside from a brand new architecture, the IMPROVEMENTS from the drivers from launch until now are next to nil? Don't get me wrong here, I think the GTX was an incredible leap in performance and IQ for nVidia, but that isn't what I was saying; my point is to look at the BENCHES from launch to now, NOT the ARCH.
  33. One thing more, man: so I guess the 2900 XT will have the same or a little bit more performance than the 8800GTS. Will the extra 128MB of memory on the 8800GTS help at higher resolutions, since you need more memory there? What do you think?
  34. Quote:
    So you're saying that, aside from a brand new architecture, the IMPROVEMENTS from the drivers from launch until now are next to nil?


    I'm saying the 8800s kicked ass from day 1, delivering major performance and quality improvements the day they were launched. You said they "didn't do that well when they first came out either" and "In a few games they did, but overall, no". Those statements were absolute rubbish.

    I understand that maturing drivers can certainly boost performance. That does not change the fact that when the G80s launched they delivered large boosts right away. People who skipped the G80s to wait for the R600s ended up waiting over six months and are now being asked to wait still longer in the hope that someday they won't suck as much as they do now compared to the stuff the competition put out over six months ago.

    It's a bad joke really.
  35. Most reviews/tests I've seen regarding VRAM usage show the most demanding games maxing out at around 460MB, though there may be a few newer ones since then that use a little more.
  36. Quote:
    So you're saying that, aside from a brand new architecture, the IMPROVEMENTS from the drivers from launch until now are next to nil?


    I'm saying the 8800s kicked ass from day 1, delivering major performance and quality improvements the day they were launched. You said they "didn't do that well when they first came out either" and "In a few games they did, but overall, no". Those statements were absolute rubbish.

    I understand that maturing drivers can certainly boost performance. That does not change the fact that when the G80s launched they delivered large boosts right away. People who skipped the G80s to wait for the R600s ended up waiting over six months and are now being asked to wait still longer in the hope that someday they won't suck as much as they do now compared to the stuff the competition put out over six months ago.

    It's a bad joke really.

    Whoa... I wasn't slamming the GTX. I never have. I was keeping to the context of the drivers and what one usually sees in performance as drivers mature. I'll state it here: the GTX is one whoop-ass card, no problem with that; I'm just staying on the driver issue. And like I said earlier, the drivers we see in these tests are not the ones buyers of the cards will get.
  37. Where and when did the press briefing day in this article take place? Was it the one in Africa? The date on the article says...huh, May the 14th?!
  38. I believe some of it was taken from back then/there in April; the rest was done two days ago.
  39. Quote:
    Everyone, check out the review of the 2900 XT on VR-Zone. The 2900XT is up against the 8800GTS, not the GTX, and in some tests the 2900XT is beaten by the 8800GTS 320MB.

    http://vr-zone.com/?i=4946&s=1
    GG ATI. :(
  40. Quote:
    Nvidia's drivers aren't mature either, so I guess we have a driver war now.


    Indeed, the poor drivers are the real problem for us Vista users. I hope one of them gets mature drivers soon.
  41. The way I see it, there really isn't a DX9 game that a single GTX can't run well, in both looks and performance. The 2900 will be nipping at the heels of the GTX, so it's the same story. The next hurdle will be the DX10 games and how well each card performs in them.
  42. Yeah, because that's what these cards are for. But one thing I read, and this review seemed to support it, is that the 2900XT will be faster than the GTX in fill rate because of its shader units. Then again, there aren't many games that will use that yet. I'll find that review and post it here; it was just a hardware review of the R600 and G80.
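
    For what it's worth, the back-of-the-envelope math behind those rate claims is just unit counts times clocks. A rough sketch below, using the commonly quoted unit counts and clocks for these parts (treat them as approximate; they are not figures from this review). Note that raw ROP pixel fill actually favors the GTX; the 2900XT's theoretical edge is in shader arithmetic.

    ```python
    # Back-of-the-envelope theoretical rates: unit count x clock. The counts and clocks
    # below are the commonly quoted specs for these parts, approximate and not from this review.

    def pixel_fill_gpix_s(rops, core_mhz):
        return rops * core_mhz / 1000.0        # gigapixels per second

    def shader_madd_gflops(alus, shader_mhz):
        return alus * 2 * shader_mhz / 1000.0  # 2 flops per MADD per ALU per clock

    # HD 2900XT: 16 ROPs and 320 stream processors, all at ~742 MHz
    print(pixel_fill_gpix_s(16, 742), shader_madd_gflops(320, 742))    # ~11.9 GPix/s, ~475 GFLOPS
    # 8800 GTX: 24 ROPs at 575 MHz core, 128 scalar SPs at 1350 MHz shader clock
    print(pixel_fill_gpix_s(24, 575), shader_madd_gflops(128, 1350))   # ~13.8 GPix/s, ~346 GFLOPS
    ```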
  43. I remember reading that, but these two don't share the same architecture; they're each taking their own path. The superscalar design on the 2900 may be a nice surprise down the road, but that's speculation.
  44. lol, I was just browsing around for more R600 reviews, and man, like 80% of them have DailyTech's scores but with different, fancier graphics, and most of them post screenshots of the 1GB DDR OEM version that's clocked SLOWER than the retail card...

    Quote:
    Everyone, check out the review of the 2900 XT on VR-Zone. The 2900XT is up against the 8800GTS, not the GTX, and in some tests the 2900XT is beaten by the 8800GTS 320MB.

    http://vr-zone.com/?i=4946&s=1
    GG ATI. :(

    Weren't these tests using the first .36 driver? (.38 is supposed to be out by tomorrow, Monday.)
  45. I saw a picture of a sound card connected to the 2900XT. I thought it had sound onboard.
  46. Quote:
    The 8800s didn't do that well when they first came out either. In a few games they did, but overall, no.


    You're speaking UTTER and complete BS now :roll: Besides a few bugs with the first XP driver, the performance was outstanding from the get-go. I should know, as I actually owned one from release; did you? :roll:
  47. Quote:
    lol, I was just browsing around for more R600 reviews, and man, like 80% of them have DailyTech's scores but with different, fancier graphics, and most of them post screenshots of the 1GB DDR OEM version that's clocked SLOWER than the retail card...

    Everyone, check out the review of the 2900 XT on VR-Zone. The 2900XT is up against the 8800GTS, not the GTX, and in some tests the 2900XT is beaten by the 8800GTS 320MB.

    http://vr-zone.com/?i=4946&s=1
    GG ATI. :(

    Weren't these tests using the first .36 driver? (.38 is supposed to be out by tomorrow, Monday.)

    Hopefully things will get better with newer drivers. Power consumption is also higher than the 8800GTX's, which is not something most people are going to like given the current DirectX 9 performance.
  48. Quote:
    There will be performance boosts, and not just from drivers. The 8800s didn't do that well when they first came out either. In a few games they did, but overall, no.


    Absolute rubbish. Here was Anand's take on the G80 when it came out.

    http://www.anandtech.com/video/showdoc.aspx?i=2870&p=29

    Quote:
    A single GeForce 8800 GTX is more powerful overall than a 7900 GTX SLI configuration and even NVIDIA's mammoth Quad SLI. Although it's no longer a surprise to see a new generation of GPU outperform the previous generation in SLI, the sheer performance we're able to attain because of G80 is still breathtaking. Being able to run modern day games at 2560x1600 at the highest in-game detail settings completely changes the PC gaming experience. It's an expensive proposition, sure, but it's like no other; games just look so much better on a 30" display at 2560x1600 that it makes playing titles at 1600x1200 seem just "ok". We were less impressed by the hardware itself than by gaming at 2560x1600 with all the quality settings cranked all the way up in every game we tried, and that is saying quite a lot. And in reality, that's what it's all about anyway: delivering quality and performance at levels never before thought possible.

    Architecturally, G80 is a gigantic leap from the previous generation of GPUs. It's the type of leap in performance that's akin to what we saw with the Radeon 9700 Pro, and given the number of 9700 Pro-like launches we've seen, they are rare. Like 9700 Pro, we are able to enable features that improve image quality well beyond the previous generation, and we are able to run games smoothly at resolutions higher than we could hope for. And, like 9700 Pro, the best is yet to come.



    I can dig up tons more reviews that echoed those sentiments.


    Thank you, and well said. We need more people like you and me putting these dumb fanboys in their place.
  49. Yeah, that's why I'm confused too. It has less memory than the GTX and the chip is 65nm, so why does it use way more power?