Resolution Showdown

The_Blood_Raven

Distinguished
Jan 2, 2008
2,567
0
20,790
Hey everyone, I'm running a monitor that supports 1680x1050 resolution. I plan on investing in some serious graphics soon, but I can't decide whether I should upgrade to 1920x1200. To be honest, I have never played games at 1920x1200, so I don't know whether it looks nicer.

Here is where things get tricky: I have a 20" monitor at 1680x1050, so the pixel density (pixels per inch of screen) is nice and high. That gives me nicer graphics overall than the same resolution on a 22" monitor. Now, if I were to buy a 24" monitor, the pixel density would go down, and I'm not sure whether that would balance things out or just make them closer.

Also, is resolution so important that 1920x1200 @ med-high settings in a game like Crysis or another demanding game would look nicer than 1680x1050 @ high settings in the same game? I honestly don't believe there will be much of a difference, except that FPS would go down, and I have nothing against my current monitor. I also plan on going red with the 4xxx series from ATI, so boosting the AA should be easy, but will that even be necessary to even the comparison out?

I am open to all opinions and information. Thank you, and I'm sorry if this is in an inappropriate section, but I couldn't decide between Graphics Cards and Monitors, so I went with the more frequented one.
 

dagger

Splendid
Mar 23, 2008
5,624
0
25,780
Those resolutions are at a level where you can't see the jagged edges anyway. So no, it's not as important. Crysis at high in 1680x1050 will definitely look better than medium in 1920x1200. Unless FPS drops below around 40, it doesn't matter much, as human eyes can't really follow faster frame rates anyway.
 

The_Blood_Raven

Distinguished
Jan 2, 2008
2,567
0
20,790
Wow thanks for the fast and helpful reply dagger, as always it was a pleasure.

Edit: I have heard that Crysis is not very resolution dependent, so does this apply to other games as well? I most likely will have the graphics power to max out almost all games at 1920x1200 with perfectly fine FPS.
 
@ Raven
Here is a chart listing LCD screen sizes and DPI/pixel size etc.
To be honest, it's all really down to your FPS at this point. If you go up to a 24" screen and stay the same distance from it, it's likely you will notice the slightly larger pixels/lower density; however, you should really move away from a screen as it gets larger.
This will offset the difference in size/density, and it should look better to you. You wouldn't sit right in front of a big LCD TV and expect a pin-sharp picture; the same goes for a monitor.
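The density comparison in this thread can be sanity-checked with a quick back-of-the-envelope calculation. A rough Python sketch (the sizes and resolutions are the ones being discussed here):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch, from the native resolution and the diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'20" 1680x1050: {ppi(1680, 1050, 20):.1f} PPI')  # ~99.1
print(f'24" 1920x1200: {ppi(1920, 1200, 24):.1f} PPI')  # ~94.3
```

So the 20" panel is a little denser than a 24" 1920x1200 one, but only by about 5%, which is why viewing distance ends up mattering more than the raw numbers.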
Mactronix
 

3Ball

Distinguished
Mar 1, 2006
1,736
0
19,790


Above 60 FPS is where we lose our ability to see a difference in smoothness. Unless I am not human, that is. If you can't tell the difference between 60 and 40 FPS, then you may need to get your eyes checked lol

Best,

3Ball
 

Ketchup_rulez

Distinguished
Jun 2, 2008
51
0
18,630
^^^^^

lol, you think you can tell the difference between 90 and 100 FPS? Are you joking?

On topic: do you guys think there is much of a difference in graphics between 1680x1050 and 1920x1200? I think that a 24" monitor is too big, but I would be willing to buy one if the graphics look much better on it than on a 20" 1680x1050 monitor. So, do they?
 

The_Blood_Raven

Distinguished
Jan 2, 2008
2,567
0
20,790
Well, the difference also depends heavily on the monitor used. High FPS on a slower monitor can produce image artifacts that make it seem like the game is moving faster, when in reality it is your screen responding more slowly.

Same place I am at, Ketchup: 24" is too large. What Mactronix said is most likely very true. I know I will be close to my monitor, so I will see the difference, and the 20" should look better. If I were to move back, then the difference would disappear. Unless I am mistaken.
 

dagger

Splendid
Mar 23, 2008
5,624
0
25,780

Of course he can't tell the difference. Typical screen refresh rates are between 60 and 75 Hz anyway.

The limit for human eyes is 40-45 FPS. Modern movies and TV run at 23-28 FPS. That's fast enough because you can't control the movie and jerk the screen around, not to mention the built-in blur effect. Even if you jerk the screen, 50 FPS is the absolute limit human eyes can detect. That's 50 frames every second. Those who claim to see higher FPS are just seeing the Fraps display in the corner. It's the placebo effect.
 
@ Raven,
I don't follow the first bit about a slower monitor looking faster.
The second bit sounds like you got what I said, so if you can't move back, it may not be worth your time/money getting the bigger screen.
The best advice I can give you at this point is, if possible, go to a good retailer and actually see the screen in person; that way you will know if it looks any good at the distance you will be working at. There is always AA, of course, but at 24" you start needing some pretty serious graphics power to keep the FPS up.
As you say, it looks like it won't be such an issue with a 4-series card.

@ Dagger,
There is no scientific limit to what the human eye can see; everybody is different.
You would be very hard pressed to look at a screen and say "that's 60 FPS" and then look at another and say "that's 75."
You could, however, tell the difference if you were used to a CRT running at a high refresh rate and high FPS and then got put in front of an LCD running at a 60 Hz refresh and about 50 FPS.
We go through this at regular intervals on these forums and the same old reasoning comes up, but trust me: while some people do, as you say, claim to be able to tell the difference, what they really mean is they can notice a difference.
Mactronix [edit for spelling]
 

marvelous211

Distinguished
Aug 15, 2006
1,153
0
19,280


You can tell in games. Why? Because frame rates fluctuate. Now, if it were a constant 60 FPS, then you would be right.
 


OK, so even if that is correct, how fast is it travelling? What's the length of the optic nerve it's travelling down? Do they mean the impulses in the retina, or does it include the impulses along the cortex as well?
Too many variables. Too many questions.
It hasn't been solved scientifically because it can't be. As I said before, people are different, and while I agree that there is a point at which it really doesn't make much difference to most people, you can't give it a definite number.
Mactronix [edit for spelling]
 

3Ball

Distinguished
Mar 1, 2006
1,736
0
19,790


If it fluctuates between 60 and 80, then I am hard pressed to see the difference, but the difference between 60 and 40 is immediately apparent to me. Call of Duty 4 on my PC fluctuates anywhere from 45-80 or so, and I can tell you when it is below 60, but not when it is above it. I know for sure that if it is locked at 30 vs. locked at 60, I can tell the difference. Until now I had never seen a game that was locked at 40 or 50 FPS, so I ran Fraps and locked the recording frame rate to those speeds in a game that I know will never drop below them, which is Halo PC.

I ran 30 FPS locked in the game, 40 FPS locked in Fraps, 50 FPS locked in Fraps, then 60 FPS locked in the game, and I could see an incremental increase at each stage, as far as how blurry the screen is, which attests to the smoothness of the gameplay IMO (though you are right that the stuttering is caused by the fluctuation, which I never argued against, but did fail to point out). The game by far felt and looked the smoothest at 60 FPS, which is what I am above when I play on a normal basis without Fraps.
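One way to see why drops below 60 are easier to spot than gains above it: the time each frame sits on screen grows quickly as FPS falls. A quick sketch (illustrative numbers only):

```python
def frame_time_ms(fps):
    """How long each frame is displayed, in milliseconds."""
    return 1000 / fps

for fps in (30, 40, 50, 60, 80):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")

# the drop from 60 to 40 fps costs roughly twice as much time per
# frame as the climb from 60 to 80 fps saves
print(frame_time_ms(40) - frame_time_ms(60))  # ~8.3 ms
print(frame_time_ms(60) - frame_time_ms(80))  # ~4.2 ms
```

So the same 20 FPS gap means much bigger per-frame jumps at the low end, which fits the "I notice below 60, not above" experience.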

I will say this, though: I can tell the difference when I am above 60 FPS in certain cases, but this has nothing to do with what we were talking about. If the game starts to tear badly from going over the 60 Hz refresh rate, then I can see that, though in most cases it doesn't bother me if it isn't excessive. One game that doesn't do this is Counter-Strike: Source, which, if you can, please explain why, as I have always wanted to know. In CS I average around 250+ FPS, but get absolutely zero tearing, and the game is by far the smoothest I have ever played (just good programming in that regard). I have just never understood how in CS I can get well over 200 FPS with no tearing, but in almost every other game I play, if I go 5 FPS over my refresh rate, it starts tearing.
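On the tearing question, none of us knows the CS:S internals for sure, but the mechanics of a tear can be sketched: without vsync, a buffer swap that lands mid-scan-out leaves a seam at whatever line the monitor has reached. A toy model (assumed 60 Hz refresh and 250 FPS, the numbers from this thread; this is an illustration, not Source engine code):

```python
REFRESH_HZ = 60   # typical LCD refresh rate
FPS = 250         # roughly the CS:S frame rate mentioned above

def tear_fraction(n, fps=FPS, refresh_hz=REFRESH_HZ):
    """Fraction of the way down the scan-out at which the n-th buffer swap lands."""
    refresh_period = 1 / refresh_hz
    swap_time = n / fps
    return (swap_time % refresh_period) / refresh_period

# without vsync, the seam drifts down the screen on successive swaps
for n in range(1, 6):
    print(f"swap {n}: tear at {tear_fraction(n):.0%} of screen height")
```

If a game (or the driver) synchronizes swaps to the refresh, the seam never lands mid-screen, which is one possible explanation for CS:S looking tear-free at 250+ FPS. That is a guess, though, not a known answer.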

If only all games were developed by Valve, right? lol

Best,

3Ball