Optimal Resolutions, wait what?

fractalfx

Distinguished
Feb 29, 2008
This has been on my mind for a while: can benchmarks actually show a higher resolution running faster than a lower one? At a glance the answer is "duh, of course lower is faster," but what if the results showed 1280x1024 running at higher frame rates than 1024x768 or even 800x600?

It has happened to me before. Back when I owned a Radeon 9700 Pro and played an aged MMO, Anarchy Online, I kept hitting terrible frame rates, averaging around 15 fps. I went online and found advice saying that if I increased my resolution from 1024x768 to a slightly higher one, I would actually get higher frames per second. Sceptically I thought "no way...", but I went with it and jumped from 1024x768 to 1280x1024, and strangely that advice proved to be true: I went from my previous average of 15 fps at 1024x768 to around 40 fps at 1280x1024, with both settings running on the same (then-latest) driver version.

But Anarchy Online is old (although it is going through an engine update/overhaul at the moment), so fair enough, let's compare benchmarks with better hardware and a more up-to-date game. Here are my results from Unreal Tournament 3, max settings across the board, on an XFX GeForce 9600 GT paired with a 2.2GHz AMD dual-core CPU, an (old-ish) 2GB of DDR400 RAM (4x 512MB), and a Seagate Barracuda 320GB SATA II HDD...

Resolution    Frames    Time (ms)    Min    Max    Avg
800x600       3560      76471        43     48     46.554
1024x768      4449      97231        38     47     45.757
1280x1024     4325      93885        43     47     46.067


(all frame times averaged by FRAPS)
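
For anyone checking the math, the average is just total frames divided by elapsed seconds. A quick Python sanity check on the numbers above (it reproduces the 46.554 / 45.757 / 46.067 averages):

# Average fps = total frames / elapsed seconds, using the FRAPS numbers above.
runs = {
    "800x600":   (3560, 76471),   # (frames, time in ms)
    "1024x768":  (4449, 97231),
    "1280x1024": (4325, 93885),
}
for res, (frames, ms) in runs.items():
    print(f"{res}: {frames / (ms / 1000):.3f} fps avg")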


What I don't understand is why I'm not seeing at least a roughly linear decrease in fps as I move from lower to higher resolutions. 800x600 and 1280x1024 are nearly identical in frame rate, while 1024x768 noticeably lags behind...
 

MayDay94

Distinguished
Aug 8, 2006
The average frame rate difference in UT3 is within ±1 fps. The only real difference is the minimum for the middle resolution, and I would chalk that up to a likely "less than ideal" test methodology. Once you go beyond 12x10 you will start to see some big drops as it becomes GPU bound. At those resolutions you are most likely CPU and system bound, and since you did not change the RAM or CPU between runs, you should see roughly the same results.
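
A toy way to picture it (the constants below are made up purely for illustration, not measured): the frame time is set by whichever of the CPU or GPU takes longer per frame, and the GPU side is assumed to scale with pixel count.

# Toy bottleneck model: fps is capped by the slower of the CPU and GPU per-frame
# cost. Both constants are hypothetical, chosen only to illustrate the idea.
CPU_MS = 21.5            # assumed fixed CPU/system time per frame (~46.5 fps cap)
GPU_MS_PER_MPIX = 14.0   # assumed GPU cost per million pixels rendered

for w, h in [(800, 600), (1024, 768), (1280, 1024), (1600, 1200), (1920, 1200)]:
    gpu_ms = GPU_MS_PER_MPIX * (w * h / 1e6)
    frame_ms = max(CPU_MS, gpu_ms)           # the slower stage sets the pace
    bound = "CPU" if CPU_MS >= gpu_ms else "GPU"
    print(f"{w}x{h}: ~{1000 / frame_ms:.0f} fps ({bound} bound)")

With numbers like these, the first three resolutions all land on the same ~46 fps CPU cap, and only past 1280x1024 does the GPU term take over and the frame rate start to fall.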
 

pcgamer12

Distinguished
May 1, 2008
Those resolutions are too low to mean anything. The frame rates would go down at higher resolutions, but since your CPU bottlenecks the FPS at under ~47, you can't see the linear decrease. Try running everything on low settings.
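
A quick way to check that against the measured numbers (the 2 fps threshold here is just an arbitrary rule of thumb): if the averages barely move as the resolution goes up, the cap is almost certainly the CPU rather than the GPU.

# Crude bottleneck check using the averages from the first post.
measured = {"800x600": 46.554, "1024x768": 45.757, "1280x1024": 46.067}
spread = max(measured.values()) - min(measured.values())
print(f"fps spread across resolutions: {spread:.2f}")
print("looks CPU bound" if spread < 2.0 else "looks GPU bound")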
 

fractalfx

Distinguished
Feb 29, 2008
Heh, I guess that means I should start getting ready for a new build soon, seeing that I'm already running into bottlenecks :'(. I figured I'd soon be hitting the ceiling of my Socket 939 platform and its old Athlon X2 4200+ : /.

But, relating back to the tests, yeah, if I were to go up to 1600x1200 the frame rate drop would be much more noticeable.
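
Just comparing pixel counts gives a rough sense of how much bigger that jump is for the GPU:

# Pixel counts per resolution, relative to 1280x1024.
for w, h in [(800, 600), (1024, 768), (1280, 1024), (1600, 1200)]:
    print(f"{w}x{h}: {w * h:,} pixels ({w * h / (1280 * 1024):.2f}x of 1280x1024)")

1600x1200 pushes roughly 1.46x the pixels of 1280x1024 and 4x what 800x600 does, so once the GPU becomes the limit, that extra work has to show up in the frame rate.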
 
