I was wondering how modern games would perform with the latest drivers under DirectX 9/10/11, using the last four generations of high-end single-GPU desktop cards from ATI and Nvidia (every GPU should have at least 1.5 GB of video memory).
From ATI, I had in mind:
(all of these GPUs should use the latest CAP (Catalyst Application Profiles), but I presume they already do that)
From nVidia: the GTS 250 (since it's the equivalent of the 9800 GTX+)
The resolutions used should be 1680x1050 and 1920x1080, with an optional 2560x1600. Please test with no AA and 4x AA, and if time isn't a problem, also FXAA or MLAA.
The games I was thinking of using: Crysis 2, Battlefield 3, Call of Duty: Modern Warfare 3, The Elder Scrolls V: Skyrim, Hard Reset, Batman: Arkham City, Metro 2033, DiRT 3 (or DiRT Showdown, because of the newer engine), Sleeping Dogs and Just Cause 2. You don't have to use all of them, just the ones that look more interesting.
And maybe one of these three: Cryostasis: Sleep of Reason, Splinter Cell: Conviction or S.T.A.L.K.E.R.: Call of Pripyat, because of their less-than-optimal optimisation.
The test should be conducted with the absolute latest beta drivers (for the most optimisations), an Intel Core i7-3770K at, say, 4.5 GHz, and 8 GB of DDR3-1600 RAM.
I don't want this to be an ATI vs. Nvidia article, just a generation-vs-generation article.
All games should be run at their highest possible settings under DX9/10/11, with 16x AF.
Metro 2033 should be run with Advanced PhysX enabled for both vendors, but only with software acceleration, because it has great PhysX optimisation for the CPU.
Call of Duty: MW3 on Nvidia hardware could use Ambient Occlusion forced from the Nvidia Control Panel.
Just Cause 2 on Nvidia hardware should definitely use CUDA for the Bokeh Depth of Field and the Water Simulation (it was used for something like that, right?).
I would be very happy if an interesting article like this were made.