
My experience with HD2900XT @ 1680 x 1050 w/ 4xaa

May 25, 2007 10:29:29 PM

This morning I installed what I believe is Catalyst driver version 8.38 RC7 and took a walk through the woods in Oblivion @ 1680 x 1050, 4xaa and 16af forced at the driver level, with HDR enabled in the game. I have been testing informally and subjectively from the same save point in the game with different drivers over the past few days.

Is there an improvement from one driver to the next? The answer is no. This card simply chokes on scenes with a lot of trees with the above settings and the frame rate drops into "this is laggy" or “what a bummer” territory pretty consistently. This is the third driver I have tried and there is really no subjective improvement. If there is an improvement there I don't see it, or at least it isn’t meaningful. The issue is clearly the trees, specifically, the tops of the trees with all the leaves. That is without a doubt what the card is struggling with at 4xaa. It is very disappointing.

Note that the card performs better in the Shivering Isles expansion pack because oversized mushrooms replace a lot of the trees, at least in the areas I have played through. Mushrooms do not have lots of leaves to render, obviously, so that makes sense.

Also note that in Battlefield 2, where you would expect the card to handle such an old game with ease, 8x antialiasing is also laggy when there are lots of trees with lots of leaves. On most maps, 8xaa is not really an option. As this is the only in-game option higher than 4x available to me (and forcing it at the driver level doesn't work for me), this card does not yield any image quality advantage over my old X1800XT, only better framerates.

In both games if you pan down and just look at the grass the frame rate jumps up, but when you pan back up so that lots of trees are in the frame the frame rate slows to a crawl. If you turn the grass distance all the way down in Oblivion, you still get laggy frame rates because of those trees with 4xaa.

At first I couldn't figure out why some reviews of this card were so positive while [H]'s review was so harsh. I think I get it now. [H]'s review, by stressing maximum playable settings as opposed to frame rates and resolutions, highlights what this card's weakness is—antialiasing. Antialiasing, to me, is a critical feature, because my monitor’s native resolution is 1680 x 1050, which is high, but not high enough to make going without AA an option. It is a must have feature for me, and I suspect it is a must have feature for a lot of other gamers who are similarly situated.

Other reviews tend to do no aa or 4xaa and then compare frame rates. Some reviews do some super high resolutions as well. After playing around with the card myself, I think [H]’s methodology is the best and their conclusions are also pretty accurate.

At the end of the day, the only thing that matters is the maximum playable settings that a card can provide in the games that you play. I have an image quality standard that I require of all games without exception—1680 x 1050 with 4xaa and 16xaf, 50 fps average. This card doesn’t cut it—not for Oblivion at least.

The HD2900XT has a lot going for it but for some reason I cannot explain, ATi gimped this card with respect to antialiasing. If I understand the hardware correctly, the 2900 has 16 ROPs, the GTS has 20 ROPs, and the GTX has 24. My understanding of the architecture of these cards is limited, but my understanding is that the ROPs are the part of the card that has the biggest impact on AA performance. There is just no way around the fact that the card is lacking in this department and it shows in games.

It also has fewer texture units. As [H] explained:
Quote:
The ATI Radeon HD 2900 XT has 16 texture units and can perform 16 bilinear filtered FP16 pixels per clock. In comparison the GeForce 8800 GTX has twice as many texture units, 32 and does 32 FP16 pixels per clock, and the GTS has 50% more with 24 FP16 pixels per clock. It seems that ATI is focusing more on shader processing like they did with the Radeon X1K architecture. The GeForce 8800 GTS and GTX seem to have much higher texture filtering performance available.

Further:
Quote:
There are also 16 ROPs in the ATI Radeon HD 2000 series. The GeForce 8800 GTS has 20 ROPs and the GTX has 24. The Radeon HD 2900 XT can perform 32 pixels per clock for Z, the GeForce 8800 GTS can do 40 and the GTX does 48.
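
To put those per-clock numbers in perspective, here is a quick back-of-envelope comparison I threw together (just a rough sketch; the core clocks are my own assumption based on the reference specs, and none of this accounts for how the AA resolve is actually performed):

```python
# Rough fillrate comparison from the per-clock figures quoted above.
# Core clocks are assumed reference clocks, not something [H] stated.
cards = {
    "HD 2900 XT": dict(z_per_clock=32, fp16_tex_per_clock=16, core_mhz=742),
    "8800 GTS":   dict(z_per_clock=40, fp16_tex_per_clock=24, core_mhz=500),
    "8800 GTX":   dict(z_per_clock=48, fp16_tex_per_clock=32, core_mhz=575),
}

for name, c in cards.items():
    z_fill = c["z_per_clock"] * c["core_mhz"] / 1000.0            # GSamples/s of Z work
    tex_fill = c["fp16_tex_per_clock"] * c["core_mhz"] / 1000.0   # GTexels/s of FP16 filtering
    print(f"{name:10s}  Z: ~{z_fill:4.1f} GSamples/s   FP16 filtering: ~{tex_fill:4.1f} GTexels/s")
```

Even giving the 2900XT credit for its higher core clock, its FP16 filtering rate comes out no better than the GTS and well behind the GTX, and that is before you get into how the AA resolve itself is handled.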

I could be wrong, but I have come to the conclusion that this is the reason the GTS and GTX perform so much better in Oblivion. I have not used either of the Nvidia cards personally, so I can't really comment on them. I find it hard to believe that any driver update is going to rectify this kind of poor performance. Future drivers will probably improve things somewhat, but unless someone can explain to me how a driver update can close the gap with respect to ROPs and AA performance, this card is going back to Newegg.
The point I’m trying to make is that as an ordinary enthusiast/gamer, I think [H]’s analysis of this card is correct. Some of the other reviews and especially the 3DMark benchmarks are very misleading, because the fact of the matter is that once you enable AA this card’s performance takes a giant nose dive.
When I think about how much foliage will be in Crysis (not to mention the inevitable next installment of the Battlefield series) this card just seems like too much of a gamble. I am also very concerned that with Crysis still a good 6 months away (I think) a new ATi card will appear with more ROPs and texture units.
I don’t doubt that this card does some things better than the GTS (geometry). It is also possible this card will perform better with DX10 games. But I simply do not see how this card will ever close the gap with the GTS with respect to antialiasing. I have never made an upgrade that yielded this little in terms of actual image quality.
For these reasons I will be returning the HD2900XT. What I replace it with is still up in the air—probably a GTX.
May 25, 2007 10:45:08 PM

Sorry to hear about your bad experience; I own an 8800GTX @ 633MHz/1050MHz, and the lowest my framerate drops in the hardest areas is the low 30s at 1680x1050 with 4xAA, 16xAF, and texture filtering at max quality.
Anonymous
May 25, 2007 10:49:39 PM

I can't believe it can't run Battlefield 2 at 8x antialiasing. My 8800GTS can do that easily. That's strange.
May 25, 2007 11:06:06 PM

That does suck. You should RMA the card. I'm running the HIS 2900XT with a C2D E6600, 4GB G.Skill DDR800, a Gigabyte DS3, and a kinda cheap 750W PSU, and it definitely runs better than my 320MB GTS system at work with the same setup. So... it should at least be acceptable. Do you have the grass distance all the way up? It doesn't matter what kind of card you have, it will lag there. Perhaps there is a driver installation problem. Did you do a fresh Windows install when you swapped such a major piece of hardware? What kind of PSU do you have? You didn't make it clear in your long post.
May 25, 2007 11:16:40 PM

That is disappointing. I am still hoping that a driver will fix the performance issues of the R600. Competition is always good for the consumer, and at this point it doesn't seem like Nvidia has much of it.
May 26, 2007 12:15:10 AM

Sorry, I forgot my specs aren't listed on this forum. Here are my system specs.

Silverstone TJ07 (added front intake for videocard)
Intel E6600 @ 3.4 GHz (Tuniq Tower 120)
2GB OCZ Gold DDR2 1:1 @ Cas3
Asus P5W DH Deluxe Mobo
PC Power and Cooling Silencer 750 Quad
X-Fi
NEC 20WMGX2 w/ Opticlear
May 26, 2007 1:19:55 AM

Quote:
I just came to know something. Another lie from AMD, check this out:

http://www.techreport.com/onearticle.x/12552

This isn't even related to the discussion at hand, so why post this? Because he's a foolish Nvidia fanboy?
May 26, 2007 1:56:36 AM

I'd love to reply to this thread right now, but I'm heading home.

I think you've got some good insight, especially first-hand, into the effect of ATi's (remember, they designed it, not AMD) continued focus on front-end power over the back end.

To get a greater feel for the complexity of the issue, especially for AA, check into B3D's and TheTechReport's reviews of the HD2900; both do a very good job of specialized testing of the shaders' strengths/limits, the TU limiting factors, and the issue with the ROP design (not just the number of ROPs).

I just wish we had performance results in these tests for the HD2600/2400 since it would offer more insight due to their different balancing of resources.
May 26, 2007 2:15:59 AM

Every time I read one of Ape's posts I have to go read a bunch of white papers. Damn You!
May 26, 2007 2:20:21 AM

Ape-

Right, I'm just trying to give some end user subjective type feedback. Obviously this is a complex piece of hardware and I don't understand it fully. It is possible that they made design choices that might pay off with future software, but what I'm trying to figure out is why the AA performance is so bad. It seems to be that it lacks ROPs and texture modules when compared to the 8800 series cards. So setting aside the tessellation and all that audio passthrough functionality, where is the advantage over an 8800 series card?
May 26, 2007 2:34:11 AM

No advantage that I can see, whether from a power consumption standpoint or performance. By the time ATI/AMD get the drivers worked out, they will hopefully have a better card that addresses those issues, and some real DX10 games as well. IMO it's not worth it.
May 26, 2007 2:35:01 AM

I guess their plan was for shader AA to catch on, too bad it hasn't been invented yet. Hopefully the drivers will help some, because if AA performance stays as it does, this will be a disaster for ATI. For me, though, the card runs awesomely in Oblivion. That may be because I am running at 1280x1024 though. :lol:  :lol: 
May 26, 2007 3:11:33 AM

Quote:
I guess their plan was for shader AA to catch on, too bad it hasn't been invented yet. Hopefully the drivers will help some, because if AA performance stays as it does, this will be a disaster for ATI. For me, though, the card runs awesomely in Oblivion. That may be because I am running at 1280x1024 though. :lol:  :lol: 
It's still a fast card and should be able to keep up with the 8800GTS in a good number of games. :) 
Anonymous
May 26, 2007 3:16:53 AM

Now don't go making personal attacks. I just pointed out that AMD said the 2900 XT had something it doesn't actually have, if you read that article. Don't go around disrespecting other people. If you don't have respect for yourself because you were just kissing Great Ape's ass in another form even while he was disrespecting you... then you guys say you're mature. What's wrong with you?
May 26, 2007 3:21:18 AM

Thanks for posting. It's an interesting take on the 2900XT.
May 26, 2007 3:44:29 AM

Quote:
Now don't go making personal attacks. I just pointed out that AMD said the 2900 XT had something it doesn't actually have, if you read that article. Don't go around disrespecting other people. If you don't have respect for yourself because you were just kissing Great Ape's ass in another form even while he was disrespecting you... then you guys say you're mature. What's wrong with you?
You constantly post negative comments towards AMD/ATI products, and are always posting how great their Nvidia counterparts are. Also, you may call it ass kissing, but Ape knows more about graphics cards than just about every other member on the forum.
May 26, 2007 4:11:34 AM

Thanks for taking the time to write this post. It's nice to see how people actually feel about this product with hands-on experience. I hope the drivers improve a lot for ATI here soon, because I would sure hate for Nvidia to try and take advantage of the situation. Graphics cards are already expensive enough; we need this competition to heat up again. Thanks guys.
May 26, 2007 5:16:32 AM

I heard Catalyst 7.5 will have a new type of AA, even for the non-R600 cards. Hold on to that card, as this might be a saving grace or a last stand.
May 26, 2007 5:30:55 AM

Quote:
Now don't go making personal attacks. I just pointed out that AMD said the 2900 XT had something it doesn't actually have, if you read that article. Don't go around disrespecting other people. If you don't have respect for yourself because you were just kissing Great Ape's ass in another form even while he was disrespecting you... then you guys say you're mature. What's wrong with you?
You constantly post negative comments towards AMD/ATI products, and are always posting how great their Nvidia counterparts are. Also, you may call it ass kissing, but Ape knows more about graphics cards than just about every other member on the forum.

I strongly disagree :!:
May 26, 2007 6:45:56 AM

Quote:
Now don't go making personal attacks. I just pointed out that AMD said the 2900 XT had something it doesn't actually have, if you read that article. Don't go around disrespecting other people. If you don't have respect for yourself because you were just kissing Great Ape's ass in another form even while he was disrespecting you... then you guys say you're mature. What's wrong with you?
You constantly post negative comments towards AMD/ATI products, and are always posting how great their Nvidia counterparts are. Also, you may call it ass kissing, but Ape knows more about graphics cards than just about every other member on the forum.

you know, I was imagining hangman was another clone of ROB....
but then he lacks the classic spam of insults rob does all the time to show his superiority and e-penis of his dual 8800GTX :p 
May 26, 2007 7:21:23 AM

Seriously I should stop wasting my time at this forum. I don't see why the guy that started this thread with his long winded post is getting such bad numbers, it just doesn't add up to me. I am starting to think he doesn't even own a 2900xt. Ok anyways this forum is just full of product backing and fanboyism. If I need some serious non-objective reviews and advice I deffinately go elsewhere, and if I want to get into some lame ass flame war or take any position for any company and argue to a deaf retard about it... this is the place. Don't bother with replies, I am gone and for good. I'm sure some of you will love it. I don't care either way. This site sucks.
May 26, 2007 7:26:21 AM

Quote:
Seriously I should stop wasting my time at this forum. I don't see why the guy that started this thread with his long winded post is getting such bad numbers, it just doesn't add up to me. I am starting to think he doesn't even own a 2900xt. Ok anyways this forum is just full of product backing and fanboyism. If I need some serious non-objective reviews and advice I deffinately go elsewhere, and if I want to get into some lame ass flame war or take any position for any company and argue to a deaf retard about it... this is the place. Don't bother with replies, I am gone and for good. I'm sure some of you will love it. I don't care either way. This site sucks.


nana na na

nana na na

hey hey hey gooood bye
May 26, 2007 7:52:08 AM

I just found this explanation as to why the R600 was designed the way it was and why the AA performance is so bad, at least for the time being. It seems this was a deliberate design choice. What is confusing me now is that ATi seems to think this card is more future-proof than the 8800 series. This explanation at least revives the notion that the 2900XT will perform better in future games, but is this just P.R. spin?

Quote:

As most gamers will want AA and AF enabled in games, the HD 2900XT's poor performance with these processing options enabled is a serious problem for the card and ATi. We asked ATi to comment on this surprising result and the company revealed that the HD 2000-series architecture has been optimised for what it calls 'shader-based AA'. Some games, including S.T.A.L.K.E.R., already use shader-based AA, although in our tests the 640MB 8800 GTS proved to be faster than the HD 2900XT.

We asked Richard Huddy, Worldwide Developer Relations Manager of AMD's Graphics Products Group, to go into more detail about why the Radeon HD 2000-series architecture has been optimised for shader-based AA rather than traditional multi-sample AA. He told us that 'with the most recent generations of games we've seen an emphasis on shader complexity (mostly more maths) with less of the overall processing time spent on the final part of the rendering process which is "the AA resolve". The resolve still needs to happen, but it's becoming a smaller and smaller part of the overall load. Add to that the fact that HDR rendering requires a non-linear AA resolve and you can see that the old fashioned linear AA resolve hardware is becoming less and less significant.' Huddy also explained that traditional AA 'doesn't work correctly [in games with] HDR because pixel brightness is non-linear in HDR rendering.'

While many reviews of the HD 2900XT have made unflattering comparisons between it and Nvidia's GeForce 8800-series, Huddy was upbeat about AMD's new chip. 'Even at high resolutions, geometry aliasing is a growing problem that can only really be addressed by shader-based anti-aliasing. You'll see that there is a trend of reducing importance for the standard linear AA resolve operation, and growing importance for custom resolves and shader-based AA. For all these reasons we've focused our hardware efforts on shader horsepower rather than the older fixed-function operations. That's why we have so much more pure floating point horsepower in the HD 2900XT GPU than NVIDIA has in its 8800 cards... There's more value in a future-proof design such as ours because it focuses on problems of increasing importance, rather than on problems of diminishing importance."


I think you could read this as an argument that Oblivion performance is not going to be a good predictor of Crysis/UT3/Bioshock performance. Are they basically arguing that in future games the 2900 series cards will be able to get rid of jaggies, just in a different way? How are jaggies a problem of decreasing importance? I'm not saying I don't buy it, I just don't get it.

I would like to hang onto this card; I'm just concerned that the 8800 series is handling the aliasing problem so much better in today's games. Very confusing.

http://www.custompc.co.uk/custompc/news/112773/amd-explains-radeon-hd-2900xts-poor-aa-performance.html
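
In case it helps anyone else follow the argument, here is a toy illustration (made-up numbers in Python, not real shader code or anything ATi actually does) of what I think Huddy means when he says HDR needs a non-linear AA resolve:

```python
# Toy example of why a plain linear AA resolve misbehaves with HDR.
# Hypothetical numbers: one edge pixel with 4 MSAA samples, one very bright
# (HDR sky/sun) sample and three dim samples from a tree branch.

def tonemap(x):
    # Simple Reinhard-style operator mapping HDR radiance into the 0..1 display range.
    return x / (1.0 + x)

samples = [20.0, 0.05, 0.05, 0.05]  # linear HDR radiance values (made up)

# "Old fashioned" fixed-function resolve: average the samples in linear space, then tone map.
average_then_tonemap = tonemap(sum(samples) / len(samples))

# Non-linear (shader-based) resolve: tone map each sample first, then average.
tonemap_then_average = sum(tonemap(s) for s in samples) / len(samples)

print(f"average then tonemap: {average_then_tonemap:.3f}")  # ~0.83, edge pixel stays nearly sky-bright
print(f"tonemap then average: {tonemap_then_average:.3f}")  # ~0.27, edge pixel is a proper blend
```

With a plain linear average, the one very bright sample swamps the edge pixel and the jaggie stays almost as bright as the sky; blending after the tone map gives a proper gradient. Doing that per-sample math is the kind of work that gets pushed onto the shaders. At least that is my reading of it, and I could be off.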
May 26, 2007 8:12:44 AM

Quote:
Ape-

Right, I'm just trying to give some end user subjective type feedback.


Yep, and I'm saying it actually seems in line with what many others are experiencing and what is expected for the situations you're talking about.

Quote:
but what I'm trying to figure out is why the AA performance is so bad.


There are two things involved in the AA issue. One of them you already touched on, and that's the ROP count, but the question becomes why it takes a greater AA hit than the X1950, which has just as many ROPs and they're clocked slower. Well, it looks like either ATi changed their hardware AA in the ROPs or it's borked, but the result is the same in that the ROPs aren't handling the AA resolve and that the card relies on the shader to do that, which is inefficient for situations that don't need a shader based AA (like geometry and displacement). There's still not enough information on what's really going on (another reason to look forward to the next design, which should tell us if the AA is borked or if it's by design alone). Needless to say, both sides are being pushed by reviewers, and the AMD people like WaveyDave aren't really giving enough insight to truly discount these rumours.

Quote:
It seems to be that it lacks ROPs and texture modules when compared to the 8800 series cards.


The TUs have nothing to do with AA performance, but they do affect AF performance, and really the R600 is mostly deficient in this area compared to the G80 for current games. That's the area most people expect to see focus on for the extra space on a 65nm refresh.
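
To make the resolve point a bit more concrete, here's a rough conceptual sketch (plain Python, nothing like actual driver or GPU code) of what the AA resolve step is. The per-pixel blend below is what dedicated ROP hardware does essentially for free on the X1900/GF80, and what the R600 appears to be issuing to its shader ALUs instead, where it competes with the game's own shader work:

```python
# Conceptual sketch of a 4x MSAA box-filter resolve (illustration only, not real GPU code).
def resolve_pixel(subsamples):
    # Average each colour channel across the pixel's sub-samples.
    return tuple(sum(ch) / len(subsamples) for ch in zip(*subsamples))

def resolve_framebuffer(msaa_buffer):
    # msaa_buffer: rows of pixels, each pixel a list of N (r, g, b) sub-samples.
    return [[resolve_pixel(px) for px in row] for row in msaa_buffer]

# Tiny one-row example with 4 sub-samples per pixel (made-up values):
frame = [[
    [(1.0, 0.0, 0.0)] * 4,                                                  # interior pixel, all samples agree
    [(1.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)],   # edge pixel, half-covered
]]
print(resolve_framebuffer(frame))  # edge pixel blends to (0.5, 0.0, 0.5)
```

Multiply that little loop by every pixel on a 1680x1050 screen every frame and you can see why it matters where it runs.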
May 26, 2007 8:23:18 AM

Quote:
I just found this explanation as to why the R600 was designed the way it was and why the aa performance is so bad.


Great, all that typing I did for nothing. :wink:

That's what I was talking about, yet that shouldn't have kept ATi/AMD from putting in hardware resolve as well, since you can do both and the transistor cost is minimal. Some reviewers aren't buying that they deliberately left it out and think they simply have a broken back end as well. I'm not convinced, because to me it just looks like an extension of ATi's miscalculation of how quickly these requirements would be exploited.

If you think about what they're looking towards and consider the R600 as a design that's supposed to land at the beginning of the DX10 game era, not in the pre-DX10 era, this shader-based AA feature that's a requirement for D3D10.1 is a good approach, but right now it's just not cutting it compared to the dedicated hardware in the X1900 and GF80.
May 26, 2007 8:29:09 AM

Quote:
...ROPs aren't handling the AA resolve and that the card relies on the shader to do that, which is inefficient for situations that don't need a shader based AA (like geometry and displacement).


Ok that makes a lot of sense, thanks! Did you see the quote from ATi I posted above? They claim it is a deliberate design choice. So if I understand this correctly, basically the card is geared toward using the shaders and not the ROPs to reduce the jaggies, but that approach is only superior in certain situations? When you say situations, is it a question of whether developers tap into this capability? In other words, is ATi responding to what game developers are doing or do the game developers have to take affirmative steps to implement this when they design the game?

So the programmable shaders let the developers deal with the jaggies and nip the problem in the bud as opposed to resolving it at the end? lol, this is so complicated.
May 26, 2007 8:57:08 AM

Quote:
Did you see the quote from ATi I posted above?


Yeah it's similar to the ones I've seen before on the subject as we were discussing it elsewhere. There's still scepticism out there, but I believe it.... as in I believe that ATi was so myopic as to just do the shader based resolve and not have a backup plan for the legacy support for performance. :?

Quote:
but that approach is only superior in certain situations? When you say situations, is it a question of whether developers tap into this capability?


No, it's not about developers enabling it; it's that you can't apply AA correctly to certain situations (it breaks or reduces the effects). They point out HDR as an example, and it makes sense, although I doubt people notice the difference when they've been doing it over the past year. A few other issues are geometry shaders, soft shadows, and displacement maps, all of which pose AA problems because of the way the effect is handled in the shader and how applying AA outside the shader would essentially distort the effect.

Quote:
In other words, is ATi responding to what game developers are doing or do the game developers have to take affirmative steps to implement this when they design the game?


Well, supposedly it will be available if the game has the option for it, and it will also be offered as a forced feature for some titles (similar to how the Chuck patch was implemented), but it's still too early to tell. Really, it shouldn't be difficult for the devs to implement, since it doesn't require much work on their part; the R600 should handle the work, it just needs the cue to know it can do it.

Quote:
So the programmable shaders let the developers deal with the jaggies and nip the problem in the bud as opposed to resolving it at the end? lol, this is so complicated.


Yeah, it seems inefficient now, and while it is obviously the way to go in the future, it's still not practical yet, which sucks, especially if we think it will be useful in the near future. D3D10.1 requires this shader-based AA option, but I still think ATi should've designed for both for now, then dropped hardware resolve later. They may have done that and it's borked, but it just sounds like they took another leap of faith like they did with the X1900 shader imbalance, and this time they missed on that aspect.

Anywhoo, hope that helps, gotta go sleep now.
May 26, 2007 9:21:56 AM

Can the 8800 series cards do all this programmable AA stuff as well?
May 26, 2007 9:25:01 AM

Stalker uses shader AA? This is what ATI calls the future? Oh no, the future of image quality is doomed... :( 
May 26, 2007 4:02:01 PM

Quote:
I think you could read this as an argument that Oblivion performance is not going to be a good predictor of Crysis/UT3/Bioshock performance.

I'm glad they can read the future, it will make things a lot easier on me.
May 27, 2007 2:26:58 PM

Damn, I get tired of war in this forumz!
1st of all, as the Heyyou mentioned:
Quote:
Ape knows more about graphics cards than just about every other member on the forum.

It's so simple!
But if you are new to this forumz! you better start to understand this rule! If you just can't understand this simple rule, better F**K off and die somewhere! OK?
2nd, better show some respect to the older users of TG Forumz, if you can't do this, go to the last line of the 1st rule!(Don't mind my posts count as i have been on this forumz for some years and lots of usernames such as PX7800GT/BOSS/Lord_Farhang/... & other would probably know me! :wink: )
Better start to use these two very simple rules if you do not want to be like ROB! :wink:
Now back to the thread!
Seems nothing more to answer... :tongue:
Quote:
Stalker uses shader AA? This is what ATI calls the future? Oh no, the future of image quality is doomed... :( 

Yeah, pathetic!
S.T.A.L.K.E.R.'s graphics suck! And since it doesn't support AA, I'd rather have the Half-Life 2 engine! :( 
May 28, 2007 4:22:07 AM

Well, it supports some weird dithering instead of AA, but at full it looks like normal 2x AA. Plus it doesn't support AF, what's up with that?
May 28, 2007 5:12:14 AM

Quote:
Seriously I should stop wasting my time at this forum. I don't see why the guy that started this thread with his long winded post is getting such bad numbers, it just doesn't add up to me. I am starting to think he doesn't even own a 2900xt. Ok anyways this forum is just full of product backing and fanboyism. If I need some serious non-objective reviews and advice I deffinately go elsewhere, and if I want to get into some lame ass flame war or take any position for any company and argue to a deaf retard about it... this is the place. Don't bother with replies, I am gone and for good. I'm sure some of you will love it. I don't care either way. This site sucks.

Leave. Good riddance. Don't let the door hit you on the way out. You're far worse than the nV fanboys, most of them at least acknowledge that ATI has had good/superior products in the past, regardless of the G80 vs R600 outcome.

This moron wants "non-objective reviews"... what? He claims to "hate nVidia" but complains about fanboys? This is the same genius who posted this crapola in another thread:
Quote:
How could this possibly be true. Well, I feel like I must start another useless post on this horrible forum that I decide to check on once in a while. Seriously though does anyone think that Nvidia will have an answer to ATi's 65nm chips. I seriously think not. There is no way Nvidia could possibly have the resourses to do it on thier own. Well, all of you Nvidia fanboy retards... well I shouldn't clump you together like that. All of you sheep who jump on the latest and greatest bandwagon look out. Nvidia just may go the way of 3dfx and the rest. Whatever though, with all of the useless ATi bashing posts on this site I thought I'd just waist your time. Get bent.


And just who the f*ck are you?

Hi, I'm erocker. I hate Nvidia, and I have reasons. By the use of your language, I just may hate you too. :D 
May 28, 2007 5:21:26 AM

Quote:
Damn, I get tired of war in this forumz!
1st of all, as the Heyyou mentioned:
Ape knows more about graphics cards than just about every other member on the forum.

It's so simple!
But if you are new to this forumz! you better start to understand this rule! If you just can't understand this simple rule, better F**K off and die somewhere! OK?
2nd, better show some respect to the older users of TG Forumz, if you can't do this, go to the last line of the 1st rule!(Don't mind my posts count as i have been on this forumz for some years and lots of usernames such as PX7800GT/BOSS/Lord_Farhang/... & other would probably know me! :wink: )
Better start to use these two very simple rules if you do not want to be like ROB! :wink:
Now back to the thread!

I wouldn't judge too much by post count; some users with lots of posts, like Ape here in graphics and JumpingJack over in CPUs, really know their stuff, whereas some other 10,000+ posters are complete idiots and/or fanboys. Look how fast you can rack posts up in a flamewar, for example. I bet ROB's up to 10,000 posts if you combine all his accounts. I suggest respecting all the users until they prove they don't deserve it by posting fanboy crap or FUD.