A 13 fps jump is quite large and worth the slightly higher temps and a possibly less stable system (it still runs FurMark for an hour without problems, though).
I'm using a 1080p monitor, so at first I thought the 1080p FurMark preset would be the most relevant. However, FurMark is more GPU-intensive than any game (I think?), so would gaming on a 1080p monitor be closer to the 1080p FurMark preset or the 720p one?
I was going to give an uneducated guess, but I realized I don't have a clue what I'm talking about. My guess would have been that the GPU core is sometimes the bottleneck and other times it's the memory speed, but I'm not sure which applies when. I'm fairly certain the amount of GPU RAM (2GB) shouldn't bottleneck the system at either resolution.
Anyway, hope that makes sense and that someone could explain it to me.
EDIT: My 3DMark 11 scores in the performance (P) tests are P5932 and P5553 with and without oc respectively.
In the extreme test, i get X1896 with oc and X1782 w/o.
So once again, the 720p (Performance) test shows the larger difference.
Asus P8P67 Pro
i5-2500K @ 4.2GHz with CM Hyper 212+
6950 2GB modded to a 6970 with +20% power
Corsair Vengeance 8GB @ 1600
Corsair HX750 (the silver modular one)
WD Caviar Black 1TB
It's a bit confusing what you're really trying to ask here.
Here's how I'm reading it: you've run some benchmarks with your card overclocked and without. You've looked at the frames per second and found that the synthetic benchmarks show a bigger difference at 720p and a smaller difference at 1080p.
So one could interpret this to mean that at higher resolutions, the results are less GPU dependent and more dependent on CPU performance. However, given your CPU overclock and i5 performance, a CPU bottleneck is unlikely.
The VRAM should not be an issue; you really shouldn't consider it a bottleneck at either resolution.
P5932 vs P5553 = ~6.8% difference
X1896 vs X1782 = ~6.4% difference
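Spelling out that percentage math with the 3DMark 11 scores from the original post (just the standard relative-gain formula, nothing benchmark-specific):

```python
def pct_gain(oc_score, stock_score):
    """Relative improvement of the overclocked score over stock, in percent."""
    return (oc_score - stock_score) / stock_score * 100

# Scores from the original post
print(f"Performance (720p): {pct_gain(5932, 5553):.1f}%")  # -> 6.8%
print(f"Extreme (1080p):    {pct_gain(1896, 1782):.1f}%")  # -> 6.4%
```

So the two 3DMark presets gain roughly the same amount from the overclock, while the FurMark presets below do not.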
If there is a significant difference in FPS at the lower resolution versus the higher one, a number of factors could be at play. Since these are synthetic benchmarks that don't really reflect gaming, I would run some further tests with actual games to see if the same pattern emerges. My guess is that the software is the variable, not some hardware bottleneck.
1080p FurMark = 7% difference
720p FurMark = 11% difference
Do you see what I mean about it being software dependent?