What happened to the HD 4890? When Tom's Hardware first tested it, it ran Crysis at 40+ fps with 4xAA at 1680x1050. But now, in their HD 5850 review, the HD 4890 has dropped to about 30 fps under the same conditions. I've always trusted Tom's Hardware, and I don't believe they'd deliberately bias anything. So what happened? Don't tell me it's the platform; the test platforms are likely the same. And don't tell me it's a matter of brand; a different board partner can't change a card's performance by more than 30%, and using different brands would be a kind of bias in itself. What I'm thinking is the damn manufacturer must have half-assed it. What do you think?
Very High = full DX10 shaders and High Textures + full shadows.
High = DX9 shaders and reduced features.
Seems pretty obvious: whatever the lighter AA setting saves is outweighed by the increase in workload.
"Very High" is a DX10-only option, and enabling DX10 even without raising the quality preset (e.g. DX9 High vs. DX10 High) costs at least a 10-20% performance drop in Crysis. Then add the extra cost of "Very High" on top of that. The lower AA setting might have compensated, but ATI cards have always been good at keeping up under AA, so the AA difference isn't that big. Finally, there is always the possibility of a different driver version.
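To put rough numbers on that reasoning: compounding a DX10 penalty and a Very High penalty easily turns 40 fps into about 30. The specific percentages below are assumptions for illustration (the 15% is the midpoint of the 10-20% range mentioned above; the 12% Very High cost is a made-up example), not measured figures:

```python
# Illustrative only: how two stacked performance costs compound.
base_fps = 40.0        # old review result: DX9 High, 4xAA (from the thread)
dx10_cost = 0.15       # assumed: midpoint of the 10-20% DX10 penalty cited above
very_high_cost = 0.12  # assumed: extra cost of the Very High preset (hypothetical)

# Costs multiply rather than add: each penalty applies to what's left.
new_fps = base_fps * (1 - dx10_cost) * (1 - very_high_cost)
print(round(new_fps, 1))  # roughly 30 fps, in line with the newer review
```

The point is only that two modest-sounding penalties multiply together, so a 40 fps result under the old settings and ~30 fps under the new ones are perfectly consistent without any cheating by the manufacturer.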
Don't forget that Crytek locked out options for even better graphics than the Very High setting. You need mods to unlock the "ultra high" settings that make the game look even better (they change the HDR and textures to make the scenery look incredible). =D