
Cat 7.9 and 2900XT 1GB crossfire -- huge improvement

Last response: in Graphics & Displays
September 10, 2007 11:55:07 PM

ATI still has some compatibility issues to finish up in these drivers, but dang, they just gave me a 50% boost in Quake IV and about 30% in Lost Planet.

Looking for an image quality sacrifice now -- need to go back to Cat 7.7 and compare; can't notice anything obvious.

Need to see how FSX SP1 does...

Rob.

Under Vista x64 with KB's applied.
September 11, 2007 12:18:42 AM

That's pretty impressive.

Best,

3Ball
September 11, 2007 6:04:43 AM

Pretty good so far, but Titan Quest still has problems -- no shading/shadows at all. Rise of Legends and AOE III are fine. FSX SP1 is about the same -- no real Crossfire support there.

Need to run 3DMark06 and 3DMark05 again...

Rob.
September 11, 2007 11:38:27 AM

A 50% boost is quite a jump in frame rates. Do you mind posting benchmarks?
September 11, 2007 1:04:02 PM

Nice. Good for AMD/ATI.
Hopefully they can hurry NV up on the 9xxx.
:bounce: 

FSX is a CPU-bound game.
It runs about the same on a 7600 as it does on an 8800 Ultra.
If you have a quad, SP1 WILL use 100% of each core.
:hello: 
Pretty cool.
September 11, 2007 2:43:53 PM

If AMD keeps these gains up and gets all the little problems completely worked out by Christmas, I will be considering the 2900XT for my next card...which I really hope they do.

Best,

3Ball
September 11, 2007 3:29:05 PM

But what about the AA performance boost? Have you noticed any improvements there?
September 11, 2007 3:50:55 PM

marvelous211 said:
A 50% boost is quite a jump in frame rates. Do you mind posting benchmarks?


Not to be cynical but I agree. We need a comparison. Were your frame rates in those games poor to begin with? How did they compare to 8800? Are you saying ATI has found a magic setting or something? We need more info.
September 11, 2007 4:16:06 PM

notherdude said:
Not to be cynical but I agree. We need a comparison. Were your frame rates in those games poor to begin with? How did they compare to 8800? Are you saying ATI has found a magic setting or something? We need more info.
I was wondering the same thing; when Lost Planet first came out, the HD 2900XT was barely breaking 20 FPS, while at the same quality the 8800GTX was getting around 70. Now if the HD 2900XT is, say, performing this much better in a title that already worked fine, then I'll be impressed.
September 11, 2007 4:45:28 PM

I'll get some more detailed numbers up this weekend, don't have time to do a good job of that right now.

Please, please do NOT make this an nVidia vs. ATI debate. I was just posting my quick initial findings using the Cat 7.9 drivers on two HD 2900XT 1GB cards in Crossfire on a P35-based (Asus P5K) board vs. my older Cat 7.7 (I skipped Cat 7.8).

Plus, this is on my Vista x64 box where other Microsoft KBs were installed manually (since they aren't on Windows Update yet), so those may have also improved performance, which may have skewed the results somewhat.

ATI still has some work to do on the compatibility front with various titles -- just happy to see the drivers moving in the right direction without any image quality loss, but still some AA issues remain in certain titles.

Also, Intel has new chipset drivers out for the P35, which I need to install.

But don't worry, I'll get some numbers up.

P.S. - Heyyou27, can you tell me what settings you used to get 70 fps in the Lost Planet DX10 version? I've tested my older 8800GTX 768 card at 1920 x 1080 and was getting a low of 10 fps to a high of 32 fps (after unplugging all my other USB devices) -- again on Vista x64 with the 163.xx beta.
September 11, 2007 8:02:41 PM

notherdude said:
Not to be cynical but I agree. We need a comparison. Were your frame rates in those games poor to begin with? ... Are you saying ATI has found a magic setting or something? We need more info.


Well, they did just finish work on their FireGL cards, which focus on OGL drivers, so really, I wouldn't be surprised at all if QIV got a big boost, especially since these were supposed to be the drivers prepped for the Quake Wars demo launch. Lost Planet sounds like they finally got their optimizations in after finally being able to play with the game. These changes don't happen quickly, and unlike Q4/QW they didn't have much time with LP, so depending on the work required it usually takes 2-3 releases to see the new additions. Just look at how long D3 and Oblivion took to go from performance patches to being rolled into the drivers, and the Chuck patch took about 5 or more releases to be put into the drivers (due to WHQL).

I don't think it's magic (the G80s also improved greatly over time), but I would like to see IQ comparisons since both companies still believe in optimizations especially in the AF area, and without the ability to change it in the 2K series, I'd want to check things like catalyst AI style optimizations and be sure they stay on the side of quality and not performance.

But like V8 says the high water mark shouldn't be compared to the G80 it should be compared to the previous quality/performance of that setup, and see what they've improved/done.
September 13, 2007 6:09:53 PM

Quick update on 7.9 vs 7.7

Cat 7.9 & 2900XT 1GB in Crossfire still has problems with Lost Planet DX10 (flashing; this is a known issue), while the DX9 version is fine. From Cat 7.7 to Cat 7.9 the performance has gone from 12 fps average to 22 fps average in Lost Planet. My single 8800GTX was doing 13 fps average. But the 8800GTX did NOT have any graphical issues, while the 2900XT has a ton of issues with Lost Planet (especially in Crossfire mode) -- has me a little concerned about DX10 and ATI cards -- hoping that further driver releases can resolve the image problems. On a side note, I really do NOT like the blur effect used in Lost Planet on either my nVidia or ATI card -- don't know if this is a poor implementation of blur, a limitation of DX10, or a driver problem in both camps -- either way, blur seems overused and is not realistic at all.

So far, big performance improvements in Quake IV and Doom 3 -- no surprises there (frame rates never drop below 60, so I don't really care).

All testing is being done in 1920 x 1080 (except 3DMark06 which is 1280 x 1024) on Vista x64 with Asus P5K deluxe (P35) on 3.2Ghz (8 x 400) X6800 CPU with 4GB RAM DDR2.

3DMark06 score went from 11325 to 13502 (Cat 7.7 to Cat 7.9). 8800GTX (single) was 8409. Tested in 1280 x 1024 with control panel settings set to default for both cards and "use application".
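For anyone squinting at those scores, the relative gains are simple arithmetic; here's a quick sketch using only the numbers quoted in this post:

```python
# Relative gains from the 3DMark06 scores quoted above.
def pct_gain(old, new):
    """Percentage change going from score `old` to score `new`."""
    return (new - old) / old * 100

cat77, cat79, gtx = 11325, 13502, 8409  # scores from this post

print(f"Cat 7.7 -> Cat 7.9:      {pct_gain(cat77, cat79):.1f}%")  # ~19.2%
print(f"vs single 8800GTX (8409): {pct_gain(gtx, cat79):.1f}%")   # ~60.6%
```

So the Crossfire setup's 3DMark06 gain from the driver update alone is about 19%, smaller than the 50% seen in Quake IV -- which fits the per-title pattern described in this thread.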

FSX SP1 maintains 24 fps solid (locked at 24) -- this is WAY up over Cat 7.7, where I'd average 15 fps. On the single 8800GTX I'd get wild fluctuations from 6 to 24 fps -- nVidia has some work to do there with FSX SP1 under Vista x64.

Race07 Demo -- still a dog, and it has been a dog on both ATI/nVidia -- it should be measured in seconds per frame -- not sure what is going on with this title.

Race -- 60+ fps (don't care)

GTR2 -- 60+ fps (don't care)

rFactor -- 60+ fps (don't care - resolution selection issue)

Supreme Commander, AOE III, C&C 3, Oblivion -- still need to Fraps them.

There are just too many individual graphics settings for each and every game (along with control panel settings) -- I'll try to get those specific settings up this weekend.

Overclocking results to come... (back on an air-cooled-only setup; the TEC would just heat up the entire HOUSE way too much -- a case where I would need to install some pipes, put the radiators and fans outside, and add some serious water pumps and a quick-connect outlet in my gaming room).

Vista x64 seems to carry about a 10% performance hit on average.


September 13, 2007 8:36:49 PM

Ok, I think I found a more serious problem with Cat 7.9 for Vista x64 -- I installed the drivers using the Suite install approach. After the initial boot into Vista from the S3 power state, an executable called CCCPrev.EXE runs and consumes 50% of my CPU processing cycles -- this is a 32-bit app, not 64-bit. As far as I can tell, CCCPrev is the code used to display the differences in graphics settings (the 3D preview window) in the Cat Control Panel.

On my system the default location of this file is:
C:\Program Files (x86)\ATI Technologies\ATI.ACE\Graphics-Previews-Vista

It appears to be safe to End Process on CCCPrev.exe via Task Manager -- which brings the CPU back to the idle 2-3% range.

Better start my testing all over again....ugh!

I've submitted a Ticket with AMD/ATI.


September 13, 2007 9:14:10 PM

2x 2900XT's with 1G each, who does that. We can't really even see anything past ~25FPS anyway. Let's see...more power than 2 8800GTX's and lower FPS too. In fact a single 640 GTS is good enough, less power, less heat, less price than 2 2900XT's, few issues.

I'm drunk have a nice day...! :) 
September 14, 2007 4:52:25 AM

T8RR8R said:
2x 2900XT's with 1G each, who does that. We can't really even see anything past ~25FPS anyway. Let's see...more power than 2 8800GTX's and lower FPS too. In fact a single 640 GTS is good enough, less power, less heat, less price than 2 2900XT's, few issues.

I'm drunk have a nice day...! :) 



If you can't see a difference over ~25 FPS then you must be drunk. I would agree with you if you said something more like 60 or so.

Best,

3Ball
September 14, 2007 5:18:26 AM

I know I've played games where the framerate drops from 60 to 55 and I notice it. To say we can't see above 25 is almost absurd. I got 20 fps in Oblivion on my X200 and it still played choppy, so I know you can perceive above 25 frames/second.
September 14, 2007 6:15:21 AM

V8VENOM said:
Quick update on 7.9 vs 7.7
...
3DMark06 score went from 11325 to 13502 (Cat 7.7 to Cat 7.9). 8800GTX (single) was 8409.
...

Good results. I managed over 9000 on my 8800GTS 640MB; just wondering how come you got under 9000 with the GTX. I'm running a C2D E6600 and Corsair 2x 1GB 6400 800MHz RAM.
September 14, 2007 3:50:34 PM

I don't know why I got below 9000 on my 8800GTX 768MB (seemed low to me also -- I think the average for that card was 10000-11000) -- are you running Vista x64? I had the latest nVidia beta drivers but also tried the WHQL with pretty much the same results in 3DMark06. But as with my ATIs, I'm about 10-20% below what various web sites are reporting.

Microsoft had released a few Vista x64 hot fixes that could affect 3D performance: http://www.nvidia.com/object/windows_vista_hotfixes.htm...

This fix did smooth out FPS in FSX SP1 -- still about the same fps, just no wild fluctuations.

But I'm not trying to compare to other folks' systems/configurations -- it's hard enough getting a good comparison on the same system, let alone someone else's -- which is what I'm trying to do, with a focus on going from Cat 7.7 to Cat 7.9.

Also discovering that I'm having to set up specific ATI profiles for just about every game I'm testing (this is getting tedious). In many cases the best game performance (as reported by Fraps 2.9.2) and visual quality come when AA in the game is set to NONE and the ATI control panel is used to force the AA setting -- so far this seems to be the key for many games. Also, enabling Adaptive AA in the ATI control panel seems to be a bad idea -- it causes many issues with 3D objects disappearing then re-appearing, along with a horrible decrease in frame rates.

I don't care about anything that averages above 60 fps because my 1080p monitor refreshes at 60hz so anything above that is pointless even if I pretended my eyes could tell. But if anyone can get FSX SP1 to run at 60 fps average @ 1920 x 1080 with everything turned up, please contact me -- I'd pay for that info.

My goal is 60 fps @ 1920 x 1080 with good dose of AA & AF, my acceptable minimum is 24 fps.
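Those two targets (60 fps goal, 24 fps floor, on a 60 Hz panel) translate directly into per-frame time budgets; a minimal sketch of the arithmetic:

```python
# Converting fps targets into per-frame time budgets in milliseconds.
# 60 fps matches a 60 Hz refresh; 24 fps is the stated acceptable minimum.
def frame_budget_ms(fps):
    """Milliseconds available to render one frame at a given fps."""
    return 1000.0 / fps

for fps in (60, 24):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# 60 fps -> 16.7 ms per frame
# 24 fps -> 41.7 ms per frame
```

Put another way: hitting the 60 fps goal leaves the GPU only about 16.7 ms per frame, versus roughly 41.7 ms at the 24 fps floor, which is why heavy AA/AF at 1920 x 1080 eats into the goal so quickly.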

Why am I doing this? Pretty obvious -- I want AMD/ATI to survive. Without them, the world of graphics performance and CPU performance will come to a halt (that is ALWAYS the case when competition is removed or seriously downsized). I got these ATI graphics cards knowing full well nVidia's older 8800 SLI offerings had the upper hand in performance. With that said, I'm actually more impressed that ATI's cards are doing as well as they are on my system -- of course, given all the AMD/ATI downplaying and negative posts, I had low expectations to begin with.

I got both of these cards for $880 shipped to my door with tax (to be honest you can probably do as well going with the two 512MB DDR3 cards rather than two DDR4 1GB cards). Two 8800GTX 768 would be shipped to door for $1090. Two 8800GTS 640MB shipped to door is $790. Is it worth the extra $90 for ATI's cards, don't know -- I haven't tested two 8800GTS 640MB cards.
September 14, 2007 4:14:25 PM

I'm pretty sure the average for a GTX is 13000-14000. I get almost 11k with my 8800GTS 320MB and I haven't tested since the last two BioShock patches.
September 14, 2007 4:46:44 PM

I was using the 3DMark06 test results database; based on my setup, it had my 8409 score just below the average, which was around 10000. At least that's what the online site presented to me based on the test I ran.

A quick search shows scores for the 8800GTS 320 ranging from a low of 6597 to a high of 9756 (overclocked) -- so your 11,000 score is very impressive, but again, without knowing your driver and exact settings for both graphics and 3DMark06, it would be hard to compare.

September 14, 2007 5:00:18 PM

V8VENOM said:
P.S. - Heyyou27, can you tell me what settings you used to get 70 fps in the Lost Planet DX10 version? I've tested my older 8800GTX 768 card at 1920 x 1080 and was getting a low of 10 fps to a high of 32 fps (after unplugging all my other USB devices) -- again on Vista x64 with the 163.xx beta.
I was using Lost Planet as an example of why simply posting % gains doesn't always present an accurate picture of current performance. Back at the PC demo's release, the HD 2900XT performed very poorly due to driver issues; now, if these gains with Catalyst 7.9 occurred in Oblivion or other titles that already had relatively good performance, then it's great news.
September 14, 2007 5:33:53 PM

Not sure I follow you -- I posted % gains on specific games and it does indeed vary from game to game. Not sure why Oblivion requires anything different? Don't think it is good to use any "single" game as the definitive benchmark.

Lost Planet is one of the early DX10 games, and neither ATI nor nVidia is doing well with it on my system. But DX10 is pretty new to developers, so as they come up to speed on how to use the technology, both software and hardware, I think you'll see more and more games taking advantage of the strengths of each hardware solution.

Not sure why AMD/ATI isn't pushing the "best played on" startup ads like nVidia does in many games. Even when ATI had the upper hand, they never pushed that, which seems odd. One would think it is in AMD/ATI's best interest to work very closely with game developers.
September 14, 2007 5:52:20 PM

Just under 13000 on my GTX.

Under 9000 with your machine (quad 3200 and all) does not seem possible.

Keep the info coming. Interesting to see the driver improvements.
September 14, 2007 6:23:49 PM

X6800 is not a quad, it's a dual.

Still have much testing to do -- haven't even done any testing without Crossfire enabled yet. The Asus P5K (P35) only permits 16x/4x for PCI-E -- seems strange, but it may not be relevant with Crossfire (I just don't know what usage the bus gets vs. the Crossfire connectors) -- anyone, feel free to fill me in here.
September 14, 2007 7:01:17 PM

My bad, I thought I read "Q"X6800....my bad.....
September 14, 2007 8:42:11 PM

I hate all you guys!!!! I get 7000 in 3DMark06 with my 7600GT SLI :fou: 
September 14, 2007 8:54:10 PM

Well, ATI has escalated the CCCPrev.exe issue -- good news.
September 14, 2007 10:41:50 PM

SpeedyVV, you are lying; I really don't think you could get more than 4500 with a GeForce 7600GT SLI.
September 15, 2007 12:20:51 AM

cristip60 said:
SpeedyVV, you are lying; I really don't think you could get more than 4500 with a GeForce 7600GT SLI.


http://service.futuremark.com/compare?3dm06=2282113

I did scale back my OC to 2.5 now, though, unless I am playing DiRT.

EDIT- Oh, and I am not the fastest with 7600GT in SLI.
September 15, 2007 12:49:10 AM

Actually, all kidding aside cristip60, I always wondered if that score was as good as it looks.

It is our 1st build, and on top of that, my 11-year-old daughter and 9-year-old son did the build.

Were we just lucky? For a n00b's 1st build/OC experience, anyway.

Oh yeah and if you think I am lying about the kids doing the build, take a look:

http://www.youtube.com/watch?v=f2XVeEDqGUc
September 15, 2007 3:32:48 AM

Just to add to the whole "25 fps thingy":
The "not being able to see more than 25 fps" idea comes from the movie industry.
There is a process called 'motion blur' which makes images look better.

This doesn't exist in gaming, hence the numbers need to be higher.
More importantly, however, the idea is to have a system that runs at about 30 fps in the WORST CASE SCENARIO.
This means you are running roughly double that most of the time, just to keep up when it's crunch time.

That's the way I do it anyway.
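That worst-case idea is easy to see from frame-time logs (the kind Fraps captures). A small sketch with made-up sample frame times -- the numbers below are hypothetical, not from this thread:

```python
# Average fps vs worst-frame fps from a per-frame time log.
# The frame times (in ms) are invented sample data: mostly smooth
# frames with a couple of heavy spikes, as happens at "crunch time".
frame_times_ms = [14, 15, 16, 15, 40, 14, 16, 35, 15, 14]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000 / max(frame_times_ms)  # the slowest single frame

print(f"average: {avg_fps:.0f} fps, worst frame: {min_fps:.0f} fps")
```

With this data the average works out to roughly 52 fps while the worst frame dips to 25 fps -- about a 2:1 ratio, which is exactly why sizing the rig for a ~30 fps worst case means it cruises near double that the rest of the time.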
September 15, 2007 4:28:31 AM

The reason 16x/4x doesn't hinder performance compared to 8x/8x is that in Crossfire, most of the data communicated to the second card goes via the Crossfire connection, not the PCI-E slot.

Oh, btw, about the 25 fps thing: our eyes don't see in frames per second. Our eyes perceive images in an "analogue" fashion, not a "digital" one -- we constantly see things that blend into each other, rather than heaps of different pictures with transition pictures in between. On a TV, the transition pictures give the perception of smooth motion, but in real life the human eye can perceive up to about 70 different changes in light per second, so you can perceive about 70 fps. And btw, everyone is unique, so this figure may vary.
September 15, 2007 8:34:17 AM

I recall reading something about 70 fps as being the human limit and hence why IMAX used that speed for some of their filming.

Ran some quick tests with BioShock DX10 -- 1920 x 1080 with all details set to High (there doesn't appear to be any AA or AF setting in game). ATI Control Panel settings of AA 8X / AF 16X or Application Controlled seem to have no effect on image quality or frame rates. Running the game with Fraps 2.9.2, it never drops lower than 30 and maxes at 60 fps. Didn't notice any graphical problems; in fact it really looks nice (better than Lost Planet IMHO) -- water, smoke, shadows, reflections are very impressive -- this brings back some hope for DX10 titles.

September 15, 2007 9:09:11 AM

V8VENOM said:
I recall reading something about 70 fps as being the human limit and hence why IMAX used that speed for some of their filming.

Ran some quick tests with BioShock DX10 -- 1920 x 1080 with all details set to High (there doesn't appear to be any AA or AF setting in game). ATI Control Panel settings of AA 8X / AF 16X or Application Controlled seem to have no effect on image quality or frame rates. Running the game with Fraps 2.9.2, it never drops lower than 30 and maxes at 60 fps. Didn't notice any graphical problems; in fact it really looks nice (better than Lost Planet IMHO) -- water, smoke, shadows, reflections are very impressive -- this brings back some hope for DX10 titles.


Forcing AA/AF using CCC does not enable the effects at all in BioShock. In fact, from what I've read, AA/AF cannot be forced/enabled at all in BioShock's DX10 mode. nVidia users are apparently able to force AA using DX9 and XP (note: not Vista) using their native control panel. This, apparently, is a limitation of the Unreal Engine 3 in Vista and with DX10. After reading all the cries on the 2K Games forums about not having in-game AA/AF control, it may be taken into consideration by the design team for the first patch (..... when they get around to releasing it :p ).

P.S..... Your FPS may max at 60 because Bioshock tends to enable Vsync automatically every time the game is started. Disable it in the settings tab (and also in CCC if you have it set) and you should, given you have the horsepower, exceed the 60 FPS cap.

Cheers
September 15, 2007 4:46:52 PM

Anyone know the reason why 2K prevents AA/AF in DX10? Seems like a pretty unusual move, unless the developers "discovered" something during DX10 coding and basically turned it OFF for now.

60 fps is fine with me, honestly I don't need to go any higher -- maybe you folks can tell the difference beyond 60 fps, but I can't.

Still working on getting Lost Planet running well; just when I thought I had a good combo (averaging 34 fps with a max of 60 fps), it triggered a reboot! And I thought my PC Power & Cooling 1000 Watt PSU would be enough for two 2900XT 1GBs; looks like I need to rethink -- maybe go with the Koolance 1200 Watt water-cooled PSU.

September 15, 2007 6:09:24 PM

Sorry for that, but I have in one of my computers a 7600GS OC'ed to 7600GT level and I didn't score more than 2400 in 3DMark, so I didn't think it was possible for a 7600GT SLI to get more than 5000.
September 15, 2007 7:25:49 PM

cristip60 said:
Sorry for that, but I have in one of my computers a 7600GS OC'ed to 7600GT level and I didn't score more than 2400 in 3DMark, so I didn't think it was possible for a 7600GT SLI to get more than 5000.
He also has a Core 2 Duo at 3GHz.
September 15, 2007 8:30:50 PM

My bad didn't see that.
September 15, 2007 8:47:48 PM

Don't worry, V8Venom, even with the 7.9 drivers (which freaking rock) you are still going to find a blue screen every now and then, so getting about 38 fps is perfectly normal; they will eventually get everything working just right. I remember with my old 9600XT, even 2 years afterward the drivers were still improving the card.
September 16, 2007 12:00:02 AM

cristip60 said:
SpeedyVV, you are lying; I really don't think you could get more than 4500 with a GeForce 7600GT SLI.


Hey cristip60, I want a public apology for calling me a liar. Come on be a man (or woman) and admit your mistake!!!! :kaola: 

Oops I just missed your apology above. Oh well, you are a man!! :hello: 
September 16, 2007 12:20:02 AM

SpeedyVV said:
Hey cristip60, I want a public apology for calling me a liar. Come on be a man (or woman) and admit your mistake!!!! :kaola: 

Oops I just missed your apology above. Oh well, you are a man!! :hello: 


For future reference, before any OC (CPU or GPU) I got 3607 with 1 GPU and 6191 in SLI. Mind you, the XFX 7600GT XXX is factory OCed.
September 16, 2007 2:59:31 PM

I deserved that.
September 18, 2007 4:11:31 PM

Well I didn't get to doing much testing over the weekend (out racing at Infineon).

On the negative side: Lost Planet is still a lost cause, unfortunately -- is anyone actually playing this game?? I went to the Capcom website seeking information and the place is virtually dead -- the thread count is very low. The Lost Planet keyboard issue returned (continuous rapid scrolling at the graphics settings, which made it impossible to do further testing) and I just didn't feel like unplugging all my USB devices to make it work right (I hate bad ports).

On the plus side: BioShock is still working well (getting over my motion sickness).

Another plus: in Crossfire mode I was able to overclock the GPUs to 830 MHz and the 2GB of DDR4 to 1160 MHz, all on stock air cooling -- the fans do kick in and temps go up to 73 degrees C (ouch!). Got me to 14007 in 3DMark06 (4% increase) and added about 3 fps average to FSX SP1 (12% increase).

Tried to get the 2nd beta of Crysis, but must have missed the 30-second window of opportunity.


September 22, 2007 6:52:26 AM

Loaded up World in Conflict -- even with just 4X AA and 4X AF at 1920 x 1080 on Max High, I averaged 8 fps -- tried playing around with control panel settings, but the only ones that worked were "Application Controlled".

It's almost as if this game doesn't work with Crossfire at all. But seeing as it is another "nVidia title" I'm not surprised -- but WTF is going on with ATI -- do they not "coordinate" with other game developers??

Maybe ATI does have AA/AF problems with DX10 titles. AA/AF works great with older DX9 stuff.

The only positive news is that using "uncompressed textures" has no effect on fps.

But I must admit, I'm beginning to wonder if ATI has contacted any of the current DX10 dev teams.

Rob
September 22, 2007 7:16:27 AM

WIC has a few settings that affect performance most. Water reflection size is one. Water quality is another. Seeing a pattern here? The game has nice-looking water with huge reflection sizes, but runs badly. Also, running pixel shaders on Low gives a huge performance increase, but you lose all of the post-processing effects as well as general quality, particularly when the nuke goes off.
September 22, 2007 3:55:53 PM

Ok, this really has nothing to do with a 2900XT, but here it goes...

On my Dell E1505 laptop I have an ATI Mobility Radeon X1300 and I would love to install Catalyst 7.9, but Dell doesn't have it on the download page for my laptop. I tried to install the regular desktop drivers and I still have the old version... so I'm not really sure what is going on. If anybody has any advice, or knows a place where I can download the notebook version, please share.
October 4, 2007 1:12:07 PM

I just got a 2900 PRO. What a piece of junk. I am sending it back. My 600W PSU is not powerful enough, even though it is fine for my 8800.

The 2900 PRO gives me a blank screen and beeps. I would send it back even if it worked -- it is so extremely loud. I mean extremely loud; it is like having a vacuum cleaner in the PC. How can you guys tolerate it?
October 4, 2007 1:38:06 PM

64-bit Vista? Crossfire? Man, this guy is a glutton for punishment... there's a reason they call it bleeding-edge technology. Even if I had that kind of money to throw around on a rig (actually, I do now that I think about it) I'd want a more mature platform... that machine is depreciating like mad and I'd want to play it 24/7 to maximize my value... not spend my time troubleshooting anomalies.
October 5, 2007 12:29:36 AM

leckig said:
I just got a 2900 PRO. What a piece of junk. I am sending it back. My 600W PSU is not powerful enough, even though it is fine for my 8800.

The 2900 PRO gives me a blank screen and beeps. I would send it back even if it worked -- it is so extremely loud. I mean extremely loud; it is like having a vacuum cleaner in the PC. How can you guys tolerate it?

What a load of BS. Your PSU is crap or the card is simply DOA. I could run a 2900XT (which uses more power) on my 520 watt PSU if I had the money to pay for it, so your system is just more screwed than mine. Assuming you are talking about the 8800GTX/Ultra (you act like you have the cash for them), the 2900 PRO uses less power than those. And of course it's loud; all ATI cards are loud, what did you expect? Do some research and stop complaining.