Resident Evil 5: Demo Performance Analyzed
Image Quality: Radeon Versus GeForce + 3D Vision
We scrutinized both GeForce and Radeon screen captures, and from what we can tell, there's no difference of note.
This is our favorite conclusion when comparing graphics cards because it means that, regardless of the hardware, users will have a consistent experience.
Editor's Note: 3D Vision
The GeForce 3D Vision kit we have is in our Bakersfield, California lab, so Don didn't have the opportunity to check this title out with 3D Vision enabled. Resident Evil 5 is, however, considered one of the GeForce 3D Vision-Ready titles thanks to close cooperation between Nvidia and the game's developers.
We set the benchmark up on a Core i5-750-based machine with a GeForce GTX 260 and Nvidia's IR transceiver. As expected, the game looked great with GeForce 3D Vision enabled, though we didn't see any out-of-screen effects. That's something we've been waiting to see from 3D Vision-enabled titles and still haven't witnessed (though we haven't tested the full version of the game). There is a notable performance impact from turning 3D Vision on, and we'll get into that shortly.
gkay09: Does this imply that game developers in general didn't even utilize the full potential of DirectX 9 before jumping on the DirectX 10 bandwagon, and now DirectX 11?
renz496: I've tried the benchmark before. On my machine, DX 10 produces a slightly better frame rate than DX 9.
yellosnowman: Am I going blind, or is there no HD 4890? Or is this just a quick benchmark before the HD 5*** series?
mitch074: @yellosnowman: there is a 4890, but it's been downclocked to 4870 levels (read the article) to serve as a reference for Radeon performance (Radeon testing wasn't too extensive, as the benchmark is optimized for Nvidia hardware). And yes, with the HD 5xxx series almost here, doing complete benchmarks now is pretty much pointless: the game is playable with everything maxed out on a Radeon HD 4770 at up to Full HD resolution.
voltagetoe: gkay09, DirectX 10 was a failure because Vista was a failure. DX 9 has been thoroughly utilized; there has been no other choice.
juliom: On the variable benchmark at 1680x1050 with 2x AA, my Phenom II X4 955 and Radeon 4870 pull an average of 80 FPS in DirectX 9. I'm happy, and I have the game pre-ordered on Steam :)
amnotanoobie: @voltagetoe: Wasn't it more because the mainstream DX10 cards (8600 GT and 2600 XT) didn't really perform well, and some were even beaten by previous-generation cards? Pushing the detail level higher might have meant fewer sales, since fewer people had cards (8800 GTS 320MB/640MB or 2900 XT) that could play the games at decent settings.
HTDuro: DX10 isn't really a failure if programmers take the time to really work on DX10 optimization, especially SM 4.0. Remember Assassin's Creed? Ubisoft took the time to work on SM 4.0, and the game ran better in DX 10 than DX 9: higher frame rates with better shadows.