Solved

From 900p to 1080p: 20 FPS less?

Tags:
  • Performance
  • FPS
Last response: in Graphics & Displays
September 24, 2014 1:04:51 PM

Greetings folks,

I've been playing at 900p for a long time. However, I just upgraded my rig and wanted to try playing on a 32" full HD TV. The result, as the title says, is 10-20 fps less: where I used to get a constant 55-60 fps, I now get 45-50 fps, depending on the game... Is that much performance loss normal?

My rig:

♦i5-3330
♦R9 280 Windforce
♦8GB 1600MHz
♦GA-B75M-D3H
♦Corsair TX650

Thanks in advance.


September 24, 2014 1:08:54 PM

No, if you put it at 1080p the card should work 150% faster, so you actually gain performance. Hammer your GPU, because it's broken.
September 24, 2014 1:09:12 PM

1600x900 ≈ 1.44 million pixels
1920x1080 ≈ 2.07 million pixels

That is normal.
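Put in code form (just a quick arithmetic sketch of the pixel counts above):

```python
# Compare the pixel counts of 900p and 1080p to estimate the extra GPU work.
w900, h900 = 1600, 900
w1080, h1080 = 1920, 1080

pixels_900 = w900 * h900      # 1,440,000 pixels
pixels_1080 = w1080 * h1080   # 2,073,600 pixels

ratio = pixels_1080 / pixels_900
print(f"1080p renders {ratio:.2f}x the pixels of 900p")  # -> 1.44x
```

So when the GPU is the limiting factor, an fps drop in the 25-30% range is in the expected ballpark.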
September 24, 2014 1:17:38 PM

I imagined as much... but benchmarks around the web show my R9 performing better at 1080p than this. Maybe there's something wrong with the GPU.
September 24, 2014 1:20:36 PM

Which games, at what settings? And online benchmarks use insanely good CPUs to prevent bottlenecking, so you can't really go off them.
September 24, 2014 1:23:36 PM

^-- True, typically something like a 4770K/3930K/4960X, etc.
September 24, 2014 1:34:55 PM

firo40 said:
Which games, at what settings? And online benchmarks use insanely good CPUs to prevent bottlenecking, so you can't really go off them.


I got a 10 fps drop in Tomb Raider (maxed out): at 900p I get 50-60 fps, and at 1080p I get 35-50.
Watch Dogs with the Worse Mod gave me a good boost with most settings on Ultra: 30-60 fps at 900p, and 20-50 fps at 1080p.
In Bound by Flame I get a constant 60 fps at 900p, and at 1080p sometimes 40 fps.

These are all "old gen" games. I guess upcoming games will give the R9 a hard time.
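As a rough sanity check of numbers like these (purely back-of-the-envelope, assuming a fully GPU-bound game where fps scales inversely with pixel count):

```python
# If a game were purely pixel-bound, going from 900p to 1080p (1.44x the
# pixels) would cut fps to about 1/1.44 ~ 69% of the 900p figure.
def expected_fps_at_1080p(fps_at_900p):
    pixel_ratio = (1920 * 1080) / (1600 * 900)  # = 1.44
    return fps_at_900p / pixel_ratio

# e.g. a steady 60 fps at 900p would drop, at worst, to roughly:
print(round(expected_fps_at_1080p(60)))  # -> 42
```

A drop smaller than that just means the game wasn't entirely GPU-bound to begin with.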
October 8, 2014 6:11:18 AM

Sorry to resurrect this topic, but I'm getting pretty confused by this 900p -> 1080p thing...

Older games:
I tried playing The Witcher 2 at both resolutions, everything maxed... Strangely, I get an inconsistent 40-55 fps at 900p, and at 1080p... a constant 60 fps (???).
Dragon Age 2 is the same as The Witcher.

Newer games
In Crysis 3, though, I get a 10 fps drop from 900p to 1080p; I had to lower some settings to play.
Tomb Raider is the same as Crysis.
In Watch Dogs I get only a 5 fps drop.

To conclude... do older games perform better at higher resolutions, while newer games lose performance at higher resolutions (some more than others)?
So, is resolution performance really a matter of the GPU or of game optimization? Or is there something else I'm missing?

Thanks in advance
October 8, 2014 6:15:38 AM

It's quite simple: the higher the resolution, the harder the GPU has to work to render each frame (because of the extra pixels).

This is why even high-end cards struggle with 4K; it's just that many more pixels.


Some older games may be bottlenecked by CPU performance rather than GPU performance, hence there being little difference when you increase the resolution (resolution makes essentially no difference to the CPU workload; it stays the same).
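The idea can be sketched as a toy frame-time model (all numbers made up for illustration; real frame times vary game by game):

```python
# Toy model: a frame is done when BOTH the CPU and GPU have finished their
# work, so frame time ~ max(cpu_time, gpu_time). Resolution scales only the
# GPU's per-pixel work; the CPU's work per frame stays the same.
def fps(cpu_ms, gpu_ms_per_mpixel, megapixels):
    gpu_ms = gpu_ms_per_mpixel * megapixels
    frame_ms = max(cpu_ms, gpu_ms)  # the slower component sets the pace
    return 1000.0 / frame_ms

# GPU-bound game: heavy per-pixel work, so resolution matters a lot.
print(fps(cpu_ms=10.0, gpu_ms_per_mpixel=12.0, megapixels=1.44))  # 900p, ~58 fps
print(fps(cpu_ms=10.0, gpu_ms_per_mpixel=12.0, megapixels=2.07))  # 1080p, ~40 fps

# CPU-bound game: the CPU is the limit, so fps barely changes with resolution.
print(fps(cpu_ms=20.0, gpu_ms_per_mpixel=5.0, megapixels=1.44))   # 900p, 50 fps
print(fps(cpu_ms=20.0, gpu_ms_per_mpixel=5.0, megapixels=2.07))   # 1080p, 50 fps
```

In the CPU-bound case the GPU finishes early and sits waiting, which is why the older games in your list hold the same fps at both resolutions.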
October 8, 2014 6:35:34 AM

Got it. So if a game loses only a few fps at a higher resolution, that means it's more CPU-bound than GPU-bound, correct?

Best solution

October 8, 2014 6:42:28 AM

osorius said:
Got it. So if a game loses only a few fps at a higher resolution, that means it's more CPU-bound than GPU-bound, correct?


Yup, spot on. If you don't notice much difference in performance when going up a resolution, then it was not the GPU limiting performance. That's not to say your CPU is bad, but there always has to be something limiting performance.