Frame rates different from VGA charts regarding ATI vs NVIDIA

radmanrugg

I have LCD monitors at 1600x1200, so that is the resolution I pay attention to most. I want to be able to play everything above 30 FPS; anything else is gravy. Min FPS is more important to me than max, because I hate stutter, and it usually happens at the most inopportune times during gameplay.
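To put numbers on the stutter point, here's a rough Python sketch that turns a frame-time log into average, minimum, and 1%-low FPS. The filename and the one-time-per-line format are just placeholders I made up; adapt it to whatever your capture tool actually writes out.

```python
# Minimal sketch: quantify "min FPS matters more than max" from a frame-time log.
# Assumes a hypothetical file "frametimes.txt" with one frame time in
# milliseconds per line; real capture tools use their own formats.

def fps_stats(path="frametimes.txt"):
    with open(path) as f:
        times_ms = [float(line) for line in f if line.strip()]

    fps = [1000.0 / t for t in times_ms]                 # instantaneous FPS per frame
    avg_fps = len(times_ms) * 1000.0 / sum(times_ms)     # overall average

    # The 99th-percentile frame time roughly captures the "1% low" stutter frames
    worst_1pct_ms = sorted(times_ms)[int(len(times_ms) * 0.99)]

    print(f"average FPS : {avg_fps:.1f}")
    print(f"minimum FPS : {min(fps):.1f}")
    print(f"1% low FPS  : {1000.0 / worst_1pct_ms:.1f}")

if __name__ == "__main__":
    fps_stats()
```

A card that averages 60 FPS but dips into the teens will look worse in that last line than one that averages 45 and never drops below 30, which is exactly the distinction I care about.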

Oblivion seems to be the game that makes or breaks setups. I noticed that the ABS Ultimate X9 with ATI cards kicked NVIDIA's butt at 1600x1200, among other resolutions. See Tom's review, Game On With the Ultimate X9 from ABS Computers. But if you look at Tom's VGA Charts, NVIDIA kicks ATI's butt. So what contributes to the difference? It must be the CPU power required to feed the graphics cards, or newer drivers. I am betting ATI requires more CPU overhead to feed its card, but once fed it does better in Oblivion's outdoor areas.

Any thoughts / confirmations on this assumption?
 
NVIDIA is traditionally better at OpenGL games than ATI, though ATI is no slouch in those titles either. In general, ATI is better at D3D games.

If Oblivion is the game you want to play, then go with ATI, since their cards perform better in it. The only real advantage I can say NVIDIA has over ATI is power consumption: the 7900GT 512MB uses about 49 W, while the Radeon X1900XT 512MB consumes about 109 W. That could make the difference between having to upgrade the power supply or not.
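To see how that 49 W vs 109 W gap plays out against a power supply, here's a back-of-the-envelope Python check. The 400 W rating, the 250 W rest-of-system draw, and the 80% loading rule of thumb are assumptions picked purely for illustration, not measured numbers.

```python
# Back-of-the-envelope PSU check for the power gap mentioned above
# (7900GT ~49 W vs X1900XT ~109 W). The PSU rating, rest-of-system figure,
# and 80% loading rule of thumb are assumptions, not measured values.

PSU_RATED_W = 400          # hypothetical power supply rating
REST_OF_SYSTEM_W = 250     # assumed CPU + board + drives under load
SAFE_LOAD_FRACTION = 0.8   # rule of thumb: don't run a PSU flat out

for card, card_w in (("GeForce 7900GT", 49), ("Radeon X1900XT", 109)):
    total = REST_OF_SYSTEM_W + card_w
    headroom = PSU_RATED_W * SAFE_LOAD_FRACTION - total
    verdict = "OK" if headroom >= 0 else "consider a PSU upgrade"
    print(f"{card}: ~{total} W total, headroom {headroom:+.0f} W -> {verdict}")
```

With those made-up numbers, the 7900GT squeaks by while the X1900XT pushes the supply past the comfortable range, which is the kind of borderline case I had in mind.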
 
The Chuck patch was not used for the tests, so CrossFire is still disabled by Bethesda (check the readme; it's right there, they specifically disable it), and therefore it provides no benefit in the test.

That's one of my big reservations about the VGA charts.

On a one-to-one basis at the top, the ATIs win every time, especially in the outdoor tests. Even lower cards like the X1600 do much better than they normally do in other games, thanks to their high pixel shader counts (relative to ROPs and texture units) and better dynamic branching.
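For a rough feel of what "pixel shader counts vs ROPs" means, here's a quick Python ratio check. The unit counts are from memory of the spec sheets of the era, so treat them as approximate rather than authoritative.

```python
# Rough ratio check behind the "pixel shader counts vs ROPs" point.
# Unit counts are approximate, quoted from memory of the era's spec sheets.

cards = {
    # name: (pixel shader units, ROPs)
    "Radeon X1900XT": (48, 16),
    "Radeon X1600":   (12, 4),
    "GeForce 7900GT": (24, 16),
}

for name, (shaders, rops) in cards.items():
    print(f"{name}: {shaders} shaders / {rops} ROPs = {shaders / rops:.1f}:1")

# Shader-heavy scenes like Oblivion's outdoor areas lean on that ratio,
# which is part of why the ATI parts punch above their usual weight there.
```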

In THG's tests, the CF vs. SLI numbers should be ignored. Darren should really enable the Chuck patch if he wants to compare the two.

Their test is not representative of actual Oblivion users.
 

radmanrugg


Excellent information. The reason I think Oblivion is the game to compare with is that in all the other games people use to compare cards, the frame rates are already well into playable territory.

Of course, now my big question is whether NVIDIA or ATI will release their next-gen cards soon enough that I can forget about this altogether. It would be nice to just buy one card that's future-capable (read: DirectX 10 / Vista support) rather than two limited-life cards. I think I will hold out a month. Announcements, or better yet releases, should be bouncing around for the holiday rush by then.
 
Yeah, if you're willing to wait, then hold out for the GF8xxx series to arrive with a GT-style card that gives you more than the GF7900/X1900 series does right now, for about the same price.

Right now, though, the X1900 supports more future features, but if you can wait, then wait for the G80's appearance.