I posted this on the RIFT forums but wasn't having much luck with responses, so I figured I'd try here.
So I've been doing a lot of reading to figure out what my issue is: is it a driver problem, the GPU, the CPU, or is there no issue at all and I just need to suck it up?
Anyway, today I stood in Sanctum (the main city) in one spot, kept the same viewing angle, and stared at the same scenery with very few moving pieces around me (i.e. players running around). I wanted to see how fiddling with all the settings affected the frame rate, and I found the results pretty interesting. I'll put the numbers below; maybe someone with a better understanding of computers can give me some insight.
CPU: Core 2 Duo at 3.45 GHz; RAM: 4 GB of DDR2-800; GPU: ATI 5870 (drivers 11.2, CAP 3); max resolution: 1680 x 1050.
1680 x 1050 resolution, all sliders to maximum including AA/AF - 42 FPS
1680 x 1050 resolution, medium settings - 58 FPS
1680 x 1050 resolution, minimum settings - 73 FPS
- This all seems fairly normal; it's what's below that seems really weird to me.
800 x 600 resolution, all sliders to maximum including AA/AF - 43 FPS
800 x 600 resolution, medium settings - 57 FPS
800 x 600 resolution, minimum settings - 75 FPS
These resolution changes produce no FPS change; that isn't normal, is it? Is this an indication of something that might give me insight into my lower-than-expected frame rate? And before you say "well, you get 42 FPS maxed out, that's great": this is a low-intensity scenario I'm testing in. On high settings I drop to less than 10 FPS in large groups.
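For what it's worth, here's a quick sanity check one could script over these numbers. The usual rule of thumb is that if FPS barely moves when you drop the resolution, the GPU has headroom and something else (often the CPU) is the limit. The 5% threshold below is just an assumed noise margin for illustration, not an official figure:

```python
# Measured FPS at each settings preset, high vs. low resolution.
measurements = {
    "max settings":    {"1680x1050": 42, "800x600": 43},
    "medium settings": {"1680x1050": 58, "800x600": 57},
    "min settings":    {"1680x1050": 73, "800x600": 75},
}

for preset, fps in measurements.items():
    hi, lo = fps["1680x1050"], fps["800x600"]
    change_pct = (lo - hi) / hi * 100
    # If dropping ~63% of the pixels changes FPS by under ~5%,
    # the GPU almost certainly isn't the limiting factor.
    verdict = "likely CPU-bound" if abs(change_pct) < 5 else "likely GPU-bound"
    print(f"{preset}: {hi} -> {lo} FPS ({change_pct:+.1f}%), {verdict}")
```

Every row in my data comes out under the noise margin, which is why I suspect the bottleneck isn't the 5870.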