Benchmarks: Comparing Detail Settings And Performance Impact
With separate DirectX 9 and DirectX 10 render paths, in addition to three levels of detail settings, motion blur, and AA, there are a lot of performance variables.
We decided to examine how various combinations of these variables affected the slowest card we had on hand, and used those results to choose the configurations we were most interested in benchmarking:
We can see there's relatively little performance difference between the low and medium settings, and since medium offers greatly increased texture detail in addition to shadows, we'll consider medium detail to be the minimum recommended setting for running Resident Evil 5.
It is also obvious that enabling DirectX 10 carries a notable performance cost, and since we now know it offers no visual advantage, we will avoid it, except to demonstrate the performance hit that owners of Nvidia's GeForce 3D Vision LCD glasses will experience.
In addition, we have seen that the highest detail settings offer some really impressive visuals. It also appears that enabling motion blur doesn't cause much of a performance hit with this hardware, so we'll leave that feature enabled for the highest detail benchmark runs.
With these observations in mind, we've decided to benchmark DirectX 9 at medium details as a baseline, at high details with motion blur enabled to demonstrate the maximum graphical fidelity, and with high details, motion blur, and 4x AA enabled to show how AA will affect performance.