
I need some help, confused about two things with ATi

September 9, 2006 3:47:01 PM

Hi.

1) What's the difference between the x8xx series and the 1xxx from ATI? I thought an x800 would be an old model and the x1600+ would be the new ones. Is the difference simply that the 1000 series have HDR? The x850 has more than double the frame rate of the x1600, etc.

2) I'm choosing between an nVidia 7900GT and an x1900GT. The 7900 is slightly more expensive, but seems to have better performance. What are the other differences between them? Which one is noisier and runs hotter?

Thanks for any help.


September 10, 2006 12:05:31 PM

Quote:
Hi.

1) What's the difference between the x8xx series and the 1xxx from ATI? I thought an x800 would be an old model and the x1600+ would be the new ones. Is the difference simply that the 1000 series have HDR? The x850 has more than double the frame rate of the x1600, etc.

2) I'm choosing between an nVidia 7900GT and an x1900GT. The 7900 is slightly more expensive, but seems to have better performance. What are the other differences between them? Which one is noisier and runs hotter?

Thanks for any help.


I believe the x1xxx series from ATI obviously has newer technology, including Shader Model 3 support, and of course better performance. The thing is, the x850 is the top dog of the older series, so it will still beat the mid-range contenders of the newer series.

I'm not sure about the 7900GT and X1900GT, though.
September 10, 2006 1:19:29 PM

Remember to check the THG VGA charts, but it depends on what you want to do with it. The 7900GT pretty much outperforms the X1900GT in almost all games save Oblivion, and given that the difference there is only about 3 frames average at 1600x1200 outdoors, you'd be better off going with the slightly cheaper 79. Out of curiosity, what motherboard do you have?

As for your question about heat and noise, I know that all the X1900 series cards run incredibly hot, and apparently the fan is pretty loud. That said, I can only suppose that the 7900 is cooler and quieter; I don't have any experience with NVIDIA.
September 10, 2006 7:47:36 PM

Yup, the 7900GT is a much better choice than the x1900GT, which is a crippled card.

The top x1800 card is about the same price and far better than the x1900GT.

The x1900XT, on the other hand, is a better card than any of the above, including the 7900GT, but it's slightly more expensive.

On the other question: the 'x' series (x300, x600, x800, x850) is the 'equivalent' of the Geforce 6xxx series, and the 'x1' series (x1300, x1600, x1800, x1900) is equivalent to the Geforce 7xxx series.

Just as a Geforce 6800U will walk all over a 7300GT, and probably a 7600GT in a lot of benchmarks, the x850 is a faster card than the low-end x1xxx GPUs.
September 10, 2006 11:49:59 PM

I haven't got the motherboard yet, as it may depend on the card (ie SLI or crossfire).

So basically, an x is old tech, but still pretty fast,

The x1 is new tech, with better capabilities, better for the future and playing newer games.

The x850 is still selling for X1900 and 7900GT prices, so that's why I was a little confused. Who would be buying x850s when they are out of date?

The 7900GT would be cooler and quieter, and slightly faster.

Thanks for all the help, I'm gonna go 7900GT. I'd only heard Crossfire was easier and more compatible than SLI, which was why I was wavering. The THG VGA charts do show the 7900GTs with a 10-15 frame increase, and they only cost slightly more.
September 11, 2006 12:14:35 AM

The 'x' is meant as the Roman numeral 'x', or 10, because otherwise after the 9xxxs (the same generation as the Geforce FX 5xxxs) we'd have been on Radeon 10xxxs, and ATi decided that was too long.

Hence we now have 'x' for 10 and 'x1' for 11 :) 

The x850 is still sold as an AGP card, as it's the fastest ATi AGP card; the only other decent one is the x1600, and that's a low-end card (while the x850 was the fastest of its generation). As far as I know, all x850 PCI-E cards you can find are old stock.

If you want to go Intel, you are probably slightly better off with Crossfire because the 975X supports it, and that's an amazing chipset, while the nVidia ones are not as good on the Intel side.

Motherboard considerations aside, Crossfire is no easier than SLi these days; if anything it's more complex.

Most Crossfire solutions need a Crossfire Master card plus any normal card; there is generally only one master card per category, and the pair run at the speed of the slowest.

For example, the x1900 Crossfire is the same speed as an x1900XT. If you pair those two up, that's fine, but if you pair it with an x1900GT, then the extra pipelines in the Crossfire master are shut down and wasted, and if you pair it with an x1900XTX, the XTX is slowed down to the x1900 Crossfire speed. You also need a nasty external dongle thingy.

SLi *can* run cards from different vendors etc. nowadays, even cards with different stock clocks (the faster card slows down) or different amounts of RAM (the extra on the larger card is shut down, as with Crossfire).

All that sounds the same as Crossfire, but that's 'just in case'; with SLi you can buy an identical 2nd card, rather than choosing from a limited range of master cards.

Also the equivalent of the dongle, the SLi bridge, is internal, and optional.

That advantage is moot on Intel, imho, until nVidia releases drivers that allow SLi on the 975.
September 12, 2006 3:58:28 AM

Now there's a 7600GT AGP version which is comparable to the X850 XT. It's about $50-$60 more expensive than the AGP X1600, though.
September 12, 2006 4:22:21 AM

Crossfire setups usually don't put much stress on the CPU, as much of the information is handled by the cards themselves; SLI, however, uses the processor.
September 12, 2006 6:04:02 PM

Quote:
Crossfire setups usually don't put much stress on the CPU, as much of the information is handled by the cards themselves; SLI, however, uses the processor.


Eh?

Please show some information to back that up. I'm positive that this is completely baseless.

There is no more of a CPU load from SLi than from Single card or from Crossfire.
September 12, 2006 9:56:18 PM

From: http://www.atomicmpc.com.au/article.asp?SCID=15&CIID=24...

SLI vs. CrossFire

Now that we’ve dealt with the theory, it’s much easier to understand what ATI and NVIDIA have brought to the table. Despite all the fanfare on each side, NVIDIA’s SLI and ATI’s CrossFire have a lot in common.

Both Crossfire and SLI support alternate frame rendering. This is no surprise as it’s the simplest method to implement. Since it scales both vertex and pixel shading, you can expect most games to be using AFR.

CrossFire and SLI also both support split frame rendering. NVIDIA currently only supports horizontal split frame rendering. ATI supports both horizontal and vertical splits. Since load balancing is a critical issue for SFR, NVIDIA has implemented an on-the-fly algorithm to determine where to split the frame. As this is done after each frame, it will adapt itself to the game environment.

ATI’s split frame rendering uses a predetermined value to split the screen. The optimal split location is determined by ATI after they’ve profiled the game.

Right here!!!
:::::They believe this is more effective since there’s no CPU overhead associated with calculating the load after each frame. This method, however, is not the preferred way. ATI has long perfected the tile based splitting strategy for use in massive visualisation systems. ::::::


Now perhaps I've insinuated that it is a 'MAJOR CRUSHING SYSTEM PERFORMANCE ROBOT OF DOOM'. But it isn't; it's just how they work. Both are fast, but the ATI card tends to do more on its own.
September 12, 2006 10:40:54 PM

Since it's obvious you're looking for a cheaper card, get an X1800XT or get owned.
September 12, 2006 11:12:20 PM

Remember guys, the OP's needs are what's important here, not your preferences on video cards.

X1800XT is not cool, nor quiet, though it is a great card.

The 7900GT doesn't deal with eye candy like ATI GPUs do, but it has a good frame-rate-to-price ratio, and if cool/quiet is an important issue, it's a no-brainer.

Though if you do go 7900GT, check out eVGA or XFX so replacing the stock fan doesn't void your warranty (i.e. Arctic Cooling Accelero X1 or Zalman VF900, unless you're an H2O freak).

A little out of date, but I couldn't find anything reliable and more recent:

http://www.xbitlabs.com/articles/video/display/gpu-cons...

And yes, the x850 isn't worth it at that price. Though the x800XL PCI-E for $100 is almost as good a deal as the 7600GT, but no SM3.0...
September 13, 2006 5:52:38 PM

Quote:


X1800XT is not cool, nor quiet, though it is a great card.

The 7900GT doesn't deal with eye candy like ATI GPUs do, but it has a good frame-rate-to-price ratio, and if cool/quiet is an important issue, it's a no-brainer.



Depends on the fan speed you run with. I know at 50% fan speed it wasn't anywhere near loud, and it still kept the GPU within a comfortable temp range.
*A great card it is indeed*

You're right though, it is a no-brainer --> the X1800XT would get my vote. :lol:  :lol:  :lol:  :lol: 
September 13, 2006 7:07:09 PM

The 7900GT and the X1900GT trade blows but the 7900GT edges it out. If the 7900GT is only "slightly" more expensive, then I'd choose it. If the 7900GT was a lot more expensive, then I'd get the X1900GT. If you can find an X1800XT cheaper than the 7900GT or at the same price as either the X1900GT or 7900GT, then get it.

Quote:
Just as a Geforce 6800U will walk all over a 7300GT, and probably a 7600GT in a lot of benchmarks, the x850 is a faster card than the low-end x1xxx GPUs.

If an X850XT=7600GT, then a 6800U is at best, barely equal to the 7600GT.
September 15, 2006 12:10:23 AM

Quote:
Just as a Geforce 6800U will walk all over a 7300GT, and probably a 7600GT in a lot of benchmarks, the x850 is a faster card than the low-end x1xxx GPUs.

If an X850XT=7600GT, then a 6800U is at best, barely equal to the 7600GT.

I hadn't really thought about it any further than 16 pipes vs. 12 pipes, but I suppose the 7600GT is clocked *much* higher.

Either way, my point was that the latest-generation card isn't always faster than the best of the last generation :) 
September 28, 2006 12:29:41 PM

Thanks for the help everyone!
September 28, 2006 1:14:13 PM

Quote:
Thanks for the help everyone!


Just for reference, I just got my x850XT PE for $100 from newegg. It fits my budget and performance needs
September 28, 2006 7:30:08 PM

Quote:

Just as a Geforce 6800U will walk all over a 7300GT, and probably a 7600GT in a lot of benchmarks, the x850 is a faster card than the low-end x1xxx GPUs.

If an X850XT=7600GT, then a 6800U is at best, barely equal to the 7600GT.
No, an x850XT is not a 7600GT. It has double the memory interface width (256-bit against the puny 128-bit of the 7600GT) and 4 more pipelines (16, compared to the 12 of the 7600GT).

The higher clocks do not do anything. My x850XT runs stable at 575MHz core (compared to the 580MHz core of the 7600GT) and at 1250MHz mem (compared to the 1500MHz RAM). At just 5MHz core and 250MHz RAM away, it blows any 7600GT away.
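
As a rough back-of-the-envelope check using those same figures, peak memory bandwidth is bus width in bytes times effective memory clock: 256-bit / 8 x 1250MHz is roughly 40GB/s for the overclocked x850XT, versus 128-bit / 8 x 1500MHz, roughly 24GB/s, for the 7600GT, so the wider bus more than offsets the lower memory clock on bandwidth alone.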
September 28, 2006 8:11:53 PM

Regarding the heat remarks:

Yes and no. I've owned both the 7900GT from eVGA, with the special KO cooler, and my current X1900XT. Both great cards.

Now, the X1900XT can throttle down its clock speeds when not being used in a 3D application, like a game. The 7900GT cannot, so it runs at the same speed all the time.

Currently in Windows, with no game running, the X1900XT is running at a cool 55C (well, cool for a video card :roll:). It's actually been as low as the mid 40s, I think. The 7900GT stayed at around 60C.

In a game, I've seen the Radeon get up to a little over 80C. The GeForce only got up to about low-mid 70s.

So, the Radeon can run hotter, but it's actually cooler when it's not being used.

Oh, and the X1900XT is faster, but it's also a little louder.

Oh, and if it's between the X1800XT, the X1900GT, and the 7900GT, I'd take the nVidia. Now, if it came down to the X1900XT and the 7900GT, I'd go with the Radeon.

Either way, if you do go for nVidia, get an eVGA. Their warranty lets you overclock or change the stock cooler without voiding it, which is why I got one of their cards originally. Then I broke it, RMA'd it to newegg, not eVGA, and got the Radeon because I wanted a two-slot cooler, no hassle, and more video memory for a 1680x1050 resolution.
September 29, 2006 5:16:03 PM

Hahahahah!!! Thanks for the laugh.

You're full of BS.

This issue was debated and solved long before you got here.
September 29, 2006 5:54:37 PM

Quote:
All that sound the same as Crossfire, but thats 'just in case', with SLi you can buy an identical 2nd card, rather than a limited range of master cards.

Also the equivalent of the dongle, the SLi bridge, is internal, and optional.


Sorry, why is the dongle optional for SLI? I thought you'd need it to link the cards together?
September 29, 2006 5:55:00 PM

LOL.

I'd take the 7600GT over the X850 any day. No question.

X850 series were good cards...in their day. That day has passed. Still powerful enough to keep up with today's midrange cards, but they show their age when you play a game with all the latest tech, like HDR.

Granted, not even an X1950XTX Crossfire setup can play Oblivion maxed out in outdoor environments, but then again, neither can the 7600GT.

Wouldn't it be embarrassing for ATI if it could?
September 29, 2006 7:04:58 PM

I didn't think the 7600GT could do HDR; I thought the main difference was SM2 vs SM3? I went with the x850XT PE because I got it for $100 and figured it'd do. Of course, I plan to upgrade when DX10 arrives.
September 29, 2006 8:23:48 PM

I'm the same, I can't see the logic in getting a high end card now that won't be DX 10 compatible in half a year, or whenever it comes out. Sure, a "low end card" will be defunct in a year, but so will all the expensive, glossy bell and whistle graphic cards that won't be able to keep up with modern games. Will I cry because I can't get SM3? No, but I admit it looks good...but not $100 good.
September 29, 2006 8:51:47 PM

The 7600GT can and does handle HDR very well.

You didn't make a bad choice with the X850XT PE, as it is a stellar card for $100.00. For quite some time 7600GTs and X850XTs could be had for about the same price, and people were debating it all over the place. It boiled down to what games you played, as the two cards duke it out with each other. agi_shi was simply wrong and needed to be corrected, just in case someone else came along and believed his BS on the issue. Big deal, he can OC his X850XT and "blow away" a 7600GT. You can OC the 7600GT (volt mods too) and bring the performance right back up. Both cards are solid. Get whichever one fits your games and budget.
September 29, 2006 9:13:39 PM

The 7600GT can do HDR.

What you might be talking about is nVidia's cards and their current inability to do HDR + AA at the same time.
September 29, 2006 10:01:11 PM

For those of us who don't know jack (like me), can someone quickly run through the difference between

a) SM2 (better detail?)
b) SM3 (even better detail?)
c) HDR (cool lighting?)

HDR and SM3 (7600 and up, x1xxx? and up)
HDR and SM3 are on these same cards
SM2: anything less than these
nVidia can't do HDR and AA, but can do HDR and SM3
ATI can do HDR, AA and SM3 together on x1xxx? and up?

Thanks.
September 29, 2006 11:12:07 PM

Cleeve, Genetic Weapon, Pauldh or TheGreatGrapeApe could help you out better, but I'll take a stab at it.

The difference between SM2.0 and SM3.0 is HDR, or more to the point OpenExr HDR (true HDR as the more informed believe). You can read a little about the differences between Shader Model 2.0 and 3.0 here and about OpenExr here, but they might be a little technical.

In other words:

SM2.0 is not equal to OpenExr (true) HDR
SM3.0 is equal to OpenExr (true) HDR

The exception to the rule is Valve's Source Engine used in Half-Life 2, as you can enable HDR in games using this engine. However, it is NOT OpenExr. Put simply, it is "fake" HDR. ATI cards older than the X1XXX series can display this "fake" HDR.

nVidia cards supported SM3.0 since the 6xxx series of cards. ATI supported it with the X1XXX series of cards.

nVidia cards cannot render HDR and AA at the same time, but ATI cards can when using special patches, commonly referred to as "Chuck" patches. The patches only work with ATI cards because the ability to display both OpenExr HDR and AA is based at the hardware level of a graphics card, not the software level, meaning ATI cards are physically capable of it while nVidia cards aren't.
September 29, 2006 11:57:52 PM

Quote:
I'm the same, I can't see the logic in getting a high end card now that won't be DX 10 compatible in half a year, or whenever it comes out. Sure, a "low end card" will be defunct in a year, but so will all the expensive, glossy bell and whistle graphic cards that won't be able to keep up with modern games. Will I cry because I can't get SM3? No, but I admit it looks good...but not $100 good.


>> that won't be DX 10 compatible in half a year, or whenever it comes out.

The nVidia 8800GTX and 8600 are both planned for release in mid-November (6 weeks).
September 30, 2006 12:50:09 AM

Quote:
I haven't got the motherboard yet, as it may depend on the card (ie SLI or crossfire).

So basically, an x is old tech, but still pretty fast,

The x1 is new tech, with better capabilities, better for the future and playing newer games.

The x850 is still selling for X1900 and 7900GT prices, so that's why I was a little confused. Who would be buying x850s when they are out of date?

The 7900GT would be cooler and quieter, and slightly faster.

Thanks for all the help, I'm gonna go 7900GT. I'd only heard Crossfire was easier and more compatible than SLI, which was why I was wavering. The THG VGA charts do show the 7900GTs with a 10-15 frame increase, and they only cost slightly more.
Actually, SLI is generally easier to use due to its maturity, but Crossfire is still improving.
September 30, 2006 5:35:55 AM

I say it's due to its simplicity.

Got a 7900GT? Great.

Want SLI? Buy another 7900GT.

Doesn't even have to be the same brand, as long as the actual processors are the same.

I think I heard somewhere that once nVidia implements physics processing into the drivers, you won't even need to buy two of the same cards. In that case...

Got a 7600GT? Cool.

Want SLI? Get a 7900GTX.

Move the 7600GT down to the 2nd slot, put in the GTX, and game on.
September 30, 2006 6:55:08 AM

Quote:
Complete and utter rubbish, and you should know better.

Seriously, I was expecting better than that garbage.

Really.....and I expected better of you.


Quote:
OpenEXR is one form of HDR but not true HDR. It hasn't really got anything to do with SM3.0 apart from using the same hardware, AFAIK.

Talk to TGGA about this one, as I've read his replies on the subject. Perhaps I did not remember them correctly. I'm not debating whether or not it's "true" HDR with you, as that is out of my scope. Perhaps I should've stated it as: SM2.0 doesn't support OpenExr but SM3.0 does. I apologize for that. I meant to keep the explanation basic so that if Cleeve, GW & Co. dropped in they could correct any errors or fill in any blanks. Basically, as far as the average person is concerned, that is the difference between the two. If you'd like, you can briefly review the HDR explanations from HardOCP when they touch upon the subject while discussing the FarCry 1.3 patch, or perhaps Elite Bastards' brief mention of it while discussing the graphics in Oblivion. You'll also notice I posted this link, which covers the differences between SM2.0 and SM3.0 in full with regard to DX9.0c and which answered the poster's question fully, albeit maybe a bit more technically than the poster was looking for.

Quote:
The only game that needs the Chuck patch is Oblivion, because the makers disabled it so as not to make its partner nVidia look stupid.

I apologize, I made a generalization. I suggest you read up on the FarCry 1.4 patch. They're essentially the same thing, and I thought I saw one of the other forum members (maybe Cleeve or Pauldh, I cannot remember) coin it as a "Chuck patch". If I'm wrong, then I apologize for the error.

Quote:
I cannot remember the exact reason for nVidia not doing HDR and AA, but it is to do with the hardware not being able to do both at the same time due to the range of values it takes to do HDR, or something. Probably complete crap myself, but better than that effort.

Quote:
The patches only work with ATI cards because the ability to display both OpenExr HDR and AA are based at the hardware level of a graphic card, not the software level. Meaning ATI cards are physically capable of it while nVidia cards aren't.

No explanation needed on that one as you practically repeated what I stated earlier. Perhaps I should have worded it more eloquently like yours.
September 30, 2006 11:47:12 AM

Quote:
The difference between SM2.0 and SM3.0 is HDR, or more to the point OpenExr HDR (true HDR as the more informed believe).

In other words:

SM2.0 is not equal to OpenExr (true) HDR
SM3.0 is equal to OpenExr (true) HDR

The exception to the rule is Valve's Source Engine used in Half-Life 2, as you can enable HDR in games using this engine. However, it is NOT OpenExr. Put simply, it is "fake" HDR. ATI cards older than the X1XXX series can display this "fake" HDR.

nVidia cards supported SM3.0 since the 6xxx series of cards. ATI supported it with the X1XXX series of cards.

nVidia cards cannot render HDR and AA at the same time, but ATI cards can when using special patches, commonly referred to as "Chuck" patches. The patches only work with ATI cards because the ability to display both OpenExr HDR and AA is based at the hardware level of a graphics card, not the software level, meaning ATI cards are physically capable of it while nVidia cards aren't.


Quote:
OpenEXR is one form of HDR but not true HDR. It hasn't really got anything to do with SM3.0 apart from using the same hardware, AFAIK.

I cannot remember the exact reason for nVidia not doing HDR and AA, but it is to do with the hardware not being able to do both at the same time due to the range of values it takes to do HDR, or something. Probably complete crap myself, but better than that effort.


To summarize, then,

SM2 (fake HDR) before x1000, before 6xxx
SM3 (real HDR) x1000 and up, 6xxx and up.
SM3 (real HDR) and AA can be done with special patches for x1000 and up, and 6xxx can't do both.

OpenExr is a "method" or way of doing HDR?

SM2 and SM3 replace AF?

So what, in layman's terms does

SM2 (funky lighting version 2?)
SM3 (funky lighting version 3?)
HDR (funky lighting, part of SM3,
...or not?)
OpenExr (funky lighting part of HDR?)
AA (texture edge improvement?)
AF (texture filtering?)

do to the picture on screen?

Thanks for that, and please don't fall out over my question! Another question: if ATI had the tech to do AA and SM3, why wouldn't it be on by default to give them a market edge?
September 30, 2006 12:34:15 PM

LMAO, are you kidding me? The x1900 owns even the 7900GTX in shader-heavy games, and that's what counts this time around.

Here's the kicker: whenever you see HDR and AA benchmarks, nVidia cards aren't running both, but ATI cards are, so take that into account.
September 30, 2006 2:27:25 PM

I've updated my "summary" above. I've carefully read everything, but there are arguments and conflicting statements, so I tried to simply sum up what I've been told so far.

I've also included what you just said.

I only want to end up with something that makes sense, although I realize that this is going off topic to what I originally posted.
October 1, 2006 2:02:01 PM

Quote:


To summarize, then,

SM2 (fake HDR) before x1000, before 6xxx

SM3 (real HDR) x1000 and up, 6xxx and up.
SM3 (real HDR) and AA can be done with special patches for x1000 and up, and 6xxx can't do both.



No such thing as 'fake HDR'; that's an nV marketing ploy. Show me in the Siggraph presentations detailing HDR on a PC where integer+FP based HDR is 'fake', let alone show me early statements about it from photography, which detailed HDR long before it hit computers. And thus there's no such thing as 'real HDR' either, although that term has spread very widely. However, ask the early 'fake/real' people about it now and they back-pedal, because their darling can't do their version of 'real HDR' with AA in hardware.

Bloom and overbright radiance are components of HDR, not the only functions, but they are components, and if people want to be anal about what is and isn't HDR then why not go whole hog and demand FP64-per-channel HDR? Oh yeah, that's because no VPU can render that; only the host CPU can, very VERY slowly, meaning it's not to anyone's marketing benefit to say that, but it was to someone's benefit last year to try and define 'true HDR' in their own terms.

Really the true discussion of 'real and fake' HDR should involve display technologies, which are currently so limited as to provide only a portion of what HDR is capable of. LCDs are very limited, CRTs are slightly limited. There are a bunch of monitors still in the development stage that promise the needed contrast range, but they still come up short on the colour side. Current displays are limited in both ways. LED-backlit LCDs will alleviate the contrast portion, but still won't have the full colour range.

Quote:
(except StrangerStranger says OpenExr isn't part of HDR, so HDR must be something else...?)


That's not what he said; re-read and absorb, don't re-interpret. It makes it harder for others, and you cause further confusion trying to 'summarize' when all you're doing is perverting his statements.

SS said OpenEXR HDR isn't the only version, not that it wasn't a version; he said specifically that it IS one method.

What he was stressing is that the OpenEXR style of HDR, using 16-bit floating-point values per channel, is not part of the SM3.0 spec, and unfortunately people get confused because the hardware that can currently handle FP16 blending is primarily SM3.0 hardware. But FP16 blending is not part of the SM3.0 spec, so saying SM3.0 is required or involved really just confuses matters, as is the case with most discussions of HDR.

Perfect examples of this exception are the GF6100-6500 and GF7100, all of which comply with the SM3.0 spec, but all of which lack FP16 blending in their ROPs, unlike the higher model numbers. They cannot do OpenEXR-style FP16 HDR in hardware.
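
To make that concrete, here is a minimal, purely illustrative Direct3D 9 sketch (my own, not taken from any particular game; it assumes a Windows box linked against d3d9.lib) showing that SM3.0 support and FP16 blending/filtering support are reported through separate queries, which is why a part can pass the SM3.0 check and still fail the FP16 ones:

#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main()
{
    // Create the D3D9 object so we can query the primary adapter's capabilities.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Shader Model 3.0 support is reported through the device caps...
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    bool sm3 = caps.PixelShaderVersion >= D3DPS_VERSION(3, 0);

    // ...while FP16 (A16B16G16R16F) blending and filtering are separate format queries,
    // so a chip can report SM3.0 yet still fail these checks.
    bool fp16Blend = SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING, D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F));
    bool fp16Filter = SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_QUERY_FILTER, D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F));

    printf("SM3.0: %d  FP16 blending: %d  FP16 filtering: %d\n", sm3, fp16Blend, fp16Filter);
    d3d->Release();
    return 0;
}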

Quote:
SM2 and SM3 replace AF?


Huh? That doesn't make sense, and neither does the rest of what you wrote after that.

Want to see what HDR looks like? Use Google, especially Google Video for the 'dynamic' part of HDR.

Quote:
if ati had the tech to do AA and SM3, why wouldn't it be on by default to give them a market edge?


Because a game must either tell the hardware it supports it (like HL2, AOE3, SS2, 3Dmk06, and FartCry's patch) or the hardware drivers must tell the card to force it on in a method that exploits the hardware and software (like in Oblivion w/ the ChuckPatch).

The reason that nV can't do FP16 HDR+AA in hardware is that in order to apply AA and calculate HDR they must go through the ROPs fully and then return for a second pass, and the card doesn't output 16-bit info out the other end of the ROP, only the 8-bit target info used to display the image; that means that while it can do FP16 HDR or AA, the application of the next layer is based on the integer target.
ATi, on the other hand, can render the FP16 AA into a buffer and then swap it back in to use as texture information for the HDR; it doesn't need to exit the render path in order to apply one or the other.
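
As a purely illustrative sketch of that ATi-style path (assuming a D3D9 device and hardware/drivers that actually expose multisampled FP16 render targets, which the X1K does and the GF6/7 does not; error checking omitted), the flow looks roughly like this:

#include <windows.h>
#include <d3d9.h>

// Hypothetical helper: render the scene with HDR+AA the way the X1K can, then tone map.
void RenderHdrWithAA(IDirect3DDevice9* dev, UINT w, UINT h)
{
    IDirect3DSurface9* msaaFp16 = NULL;     // multisampled FP16 colour buffer
    IDirect3DTexture9* resolvedTex = NULL;  // single-sample FP16 texture for tone mapping
    IDirect3DSurface9* resolvedSurf = NULL;

    // 4x multisampled 64-bit floating-point render target;
    // this creation fails on GF6/7-class hardware.
    dev->CreateRenderTarget(w, h, D3DFMT_A16B16G16R16F, D3DMULTISAMPLE_4_SAMPLES,
                            0, FALSE, &msaaFp16, NULL);

    // Plain FP16 texture that the AA'd result gets resolved into.
    dev->CreateTexture(w, h, 1, D3DUSAGE_RENDERTARGET, D3DFMT_A16B16G16R16F,
                       D3DPOOL_DEFAULT, &resolvedTex, NULL);
    resolvedTex->GetSurfaceLevel(0, &resolvedSurf);

    // Draw the scene with an HDR pixel shader into the multisampled FP16 buffer.
    dev->SetRenderTarget(0, msaaFp16);
    // ... scene rendering goes here ...

    // Resolve (downsample) the AA'd FP16 buffer, then bind it as a texture and
    // tone map it to the 8-bit back buffer with a full-screen pass.
    dev->StretchRect(msaaFp16, NULL, resolvedSurf, NULL, D3DTEXF_NONE);
    dev->SetTexture(0, resolvedTex);
    // ... full-screen quad with a tone-mapping shader goes here ...

    resolvedSurf->Release();
    resolvedTex->Release();
    msaaFp16->Release();
}

Whether that CreateRenderTarget call succeeds is exactly the question a real app would ask up front via CheckDeviceMultiSampleType on D3DFMT_A16B16G16R16F.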

ATi also has support for FX10 rendering on the X1K and Xenos/X360, which allows for quicker HDR with less overhead than full FP16 while providing nearly the same results visually (especially since most current monitors [especially LCDs] aren't good enough to expose the differences). This gives them one additional option, although it really hasn't been exploited much in games, only demos.

Now this isn't to say that nV can't do HDR+AA, just not FP16 throughout in hardware. What they can do is either AA or HDR on the host system, and then apply the other in the render pipeline (which is how AOE3 works, doing AA in software and HDR on the GPU); this is the only way to get full FP16 HDR+AA on the GF6/7 series cards. This can cause a bit of a performance penalty, and depending on whether or not this host-based AA penalty is present on both, you might see a large difference in performance between the two methods. The other drawback is that AA levels are restricted to what is preprogrammed by the game developers, so SLi AA is out, as is special AA like adaptive/transparent AA.

You could also get FP16+int8 HDR+AA by taking two passes through the G7 series, running the HDR through first and then applying either MSAA or SSAA to the int8 result. This also carries a performance penalty, because you have to go through the pipeline twice (no early out) in order to add both layers of attributes.

This is similar to the possible FP16 HDR workaround for the X800 series, which would've required 3 full passes to emulate FP16 hardware blending and would've been terribly slow (although likely only about 70-80% as slow as the GF6 series, which was traditionally slower than the X8 series but with HDR enabled was about half as fast as without). However, this avenue was never pursued, as it obviously wasn't felt to be enough of a killer app on just FartCry at the time to dedicate effort to making it work, which IMO is the reason why nV isn't dedicating time to making HDR+AA work on the GF6-7 series. It isn't enough of a concern to try and make it work, especially for something that will still be second banana. So they are up against either not having it, or having it as a checkbox feature that is much slower than the competition. I'm sure they think like I do, in that it's not worth the effort to have a mediocre solution when there are other areas they could be focusing their efforts on.

For more information check google, wiki, bit-tech and a few other sources. Try and ignore the PR and look for the hard information. Beyond that you'll need to understand how things work in order to understand the difference between architectures and applications.

But the main thing you need to remember is that HDR can be output by all of the current cards, what's limited is what can be done on the cards themselves and what needs to be done on host/CPU resources.
October 1, 2006 3:11:18 PM

Well, Cleeve, Heyyou27, Nottheking and myself discussed this in a lot of detail many times when trying to figure out nV's HDR+AA issues, Oblivion and the Chuck patch.

Probably should put them together into something. I was thinking that while writing the above, and wishing the search engine worked better so I could just point to our previous discussions instead of all that writing.

Might be time to consolidate it all into a sticky complete with pics and links.

EDIT: By changing and refining my search, I found one of the previous discussions (which mentions the even earlier ones); this is the stuff about AOE3's interesting implementation of HDR+AA using software+hardware:

http://forumz.tomshardware.com/hardware/modules.php?nam...
October 1, 2006 5:53:24 PM

This is replying to everyone who's tried to explain and help me out with understanding this.

I apologize if I've got confused and misinterpreted what people have said. I was just trying to get a simple description of these terms and what they actually mean on screen. I'm no expert on this, which is why I just wanted a layman's description, but perhaps there isn't one, as people seem to have their own interpretations of what these terms mean.

If somebody is able to help write out a quick dumbo's description, if there's such a thing (!) then I'd really appreciate it. Thanks again.
October 1, 2006 5:55:47 PM

Quote:


if ati had the tech to do AA and SM3, why wouldn't it be on by default to give them a market edge?


Because a game must either tell the hardware it supports it (like HL2, AOE3, SS2, 3Dmk06, and FartCry's patch) or the hardware drivers must tell the card to force it on in a method that exploits the hardware and software (like in Oblivion w/ the ChuckPatch).

But the hardware exists, so wouldn't leading-edge games be using the hardware available, unless of course they are all sponsored by nVidia, who wouldn't want a potential disadvantage of their cards to be marketed?
October 1, 2006 6:28:42 PM

Quote:

But the hardware exists, so wouldn't leading-edge games be using the hardware available, unless of course they are all sponsored by nVidia, who wouldn't want a potential disadvantage of their cards to be marketed?


Well, there are only a few games that support HDR anyway; most are listed above.

SS2 supports FP16 HDR+AA; it's had intimate support from both ATi and nV.

HL2 / Day of Defeat supports integer + FP based HDR, with lots of tone mapping to help display HDR on LDR monitors; ATi was heavily involved in & supportive of it, so it supports HDR+AA on everything from the R9500 and up.

FartCry supports integer+FP based HDR with AA, but it's still a kinda buggy beta so far. HDR is limited to the GF6, GF7 and X1K, and HDR+AA is limited to FP16 on the X1K and integer on the GF6/7, although the latter hasn't seen the light of day.

Oblivion supports FP HDR and integer bloom, and was developed almost exclusively on ATi hardware (it shows in the issues with the FX series and the many nV-related bugs), but was then enveloped into the nV TWIMTBP program, and suddenly the HDR+AA that was so strongly promoted in the fall launch-schedule interviews was strangely felt to be too stressful for the desktop and made into an Xbox 360-only feature, despite the X1900, and especially 2xX1900, being more powerful than the Xenos. Many Elder Scrolls fans felt that the reason for this was obvious: the new sponsor couldn't use the feature.

AOE3 is a Microsoft title, and they just wanna sell games. Their view of HDR+AA seems to be one of trying to please everyone, which is why IMO the game looks good on X800s as well as GF6800s. But their implementation is different from everyone else's and requires added effort to do the AA in software. Few people have the resources; IMO M$ has more than enough.

Splinter Cell 3 is another TWIMTBP title, which not only had a limited HDR implementation but also restricted the render paths to PS3.0 or PS1.1, thus focusing on the nV series of cards at the expense of ATi's R9500-X850. They only added PS2.0 support in patch 1.2, after ATi had released their SM3.0 parts. It's not surprising that they didn't add native support for HDR+AA; this title was mentioned as the next possible target for the Chuck patch, but it seems not enough interest was generated, so I don't know if it will ever get it.

Remember, HDR itself is novel, and HDR+AA is sort of a niche of the novelty market, so focusing on it is not likely a priority for game makers during development, but it is worth the afterthought for some, as Crytek and FartCry show. You may see more patches in the future, like the recent one for Return to Castle Wolfenstein. I still haven't looked into its HDR implementation, but it's a good candidate for additional focus, as the update was launched for Apple, WinPC and Linux.

Really the best thing to do is start with Google and Wiki, and Bit-Tech's investigations, like their look at HL2:

http://www.bit-tech.net/gaming/2005/06/14/hl2_hdr_overv...
http://www.bit-tech.net/gaming/2005/09/14/lost_coast_sc...
http://en.wikipedia.org/wiki/High_dynamic_range_imaging
October 1, 2006 9:13:11 PM

Quote:
heres the kicker, when ever u see hdr and aa benchmarks, nvidia cards arent running both, but ati cards are so take that into account


No, they're not. Benchmarks will say, before they show the graphs, that they know ATI cards can run HDR+AA, but that to keep things equal AA will be taken out of the equation. FUD.
October 1, 2006 10:25:17 PM

Thanks Great Grape Ape, that should keep me busy reading for a while!
October 2, 2006 12:59:51 AM

Quote:
Now why didn't I say that? Is there not a sticky about this? Because there should be.

I agree. I thought I had a decent grip on this whole thing but now I see I don't. This is good info and a sticky would be handy as there is a lot of wrong info from uninformed posters (myself included) floating around the forum.

I was trying to provide a simple explanation based on the small amount of info I knew about this, but after reading TGGA's posts (and some info he PM'd me) I can see that there is no simple answer.

I apologize for any errors and confusion.

StrangeStranger, your scolding was more or less warranted... but you're still a bastage! :lol: 
October 2, 2006 10:14:39 PM

WOW. Very well put!