When playing Crysis Warhead on 32-bit XP in DX9, all settings on Enthusiast, I would get 35-50 fps on the snow level, but on Vista Home Premium 64-bit in DX10 I get 18-25 fps. Surely this can't be the expected loss from moving from DX9 to DX10? PS: in case it makes a difference, since changing OS I have updated the Nvidia drivers to the most recent, and I have the game on Steam. My specs:
Q6600 stock speed
4 GB RAM
2x 8800 GTS 512MB
The stock Q6600 is at fault, IMHO. DX10 puts a real hit on frame rates anyway, but Vista and SLI really appreciate more MHz. Try going from 2.4 to, say, 2.8 GHz and replay the snow levels to see if that makes a difference. Put object detail and textures down to 'Gamer' as well; the extra content streaming uses a lot of CPU cycles too. My E6300 at 3.0 GHz feels just about OK in Warhead with a 4870 under Windows XP, but as sure as eggs is eggs it's holding my GPU back some.
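If you want to test the 'Gamer' object/texture suggestion without clicking through the menus every time, Warhead reads CVars from an autoexec.cfg in the game folder. A rough sketch, assuming the usual sys_spec mapping (1 = Minimum up to 4 = Enthusiast) — I'm going from memory, so verify the names and values against your install:

```
-- autoexec.cfg in the Crysis Warhead game directory (untested example;
-- double-check the CVar names/values on your copy)
sys_spec_ObjectDetail = 3
sys_spec_Texture = 3
```

That way you can flip just those two settings down a notch while leaving everything else on Enthusiast.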
The difference between DX9 and DX10 is VERY noticeable in Crysis. I run Vista 64, and as for Razor512 saying there is poor driver support, that is VERY untrue. Driver support for Vista 64 has been far better than for XP 64, and so far I haven't had any hardware that I couldn't find a working driver for.
Crysis tends to be a resource hog no matter what system you are running. If you want to see the difference between DX9 and DX10, simply run Crysis in DX9 mode. Normally you would right-click the icon and choose to run in DX9 mode, but since you are running it through Steam I'm not sure of the exact steps. For certain parts of the game I did drop from DX10 to DX9 to help with the frame rate.
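On Steam you can usually force this with a launch option instead of a shortcut. I believe Crysis/Warhead accepts a -dx9 command-line switch, but I haven't tried it on the Steam build, so treat this as a guess to verify:

```
# In Steam: right-click Crysis Warhead -> Properties -> Set Launch Options,
# then enter the switch below (from memory; confirm it works on your copy)
-dx9
```

If it works, you can compare DX9 and DX10 frame rates on the same Vista install, which would tell you how much of the drop is the API and how much is the OS change.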