
Why bother playing in ultra high resolutions?

Last response: in Graphics & Displays
January 3, 2008 1:56:54 PM

So here is the deal:
I have an HP 30” LCD and just bought an NVIDIA 8800GT 512MB.
I have tried playing some games at different resolutions and did not see much difference.
Why does everyone want to play their games at high resolutions? What are the advantages?
The only disadvantage I see is that you have to keep buying high-end video cards just to keep up with new games at high resolutions.
My understanding is this (please correct me if I am wrong): if I buy a new game in, say, 6 months and my card cannot play it on high, I will just lower the resolution a bit and the problem is gone, so I can keep my card much longer.
Obviously I would not want to play any lower than 1024 x 720, but above that I do not see a difference. My new video card scales very well at all resolutions, so what am I missing?
Please enlighten me!
Thank you very much all!
January 3, 2008 2:11:53 PM

The image degrades when you're not playing at your monitor's native resolution. I'm OK with the image being stretched and blurred in some games, but as much as possible, I'd like to game with the proper image.
January 3, 2008 2:15:54 PM

Native resolution. Full stop.
January 3, 2008 2:17:33 PM

If you lower the res and turn AA on to, say, 2x, you might get better FPS with the same look and feel, so go for it. What it comes down to in the end is: 1. What do your eyes notice (after all, you are the end user)? 2. Is the screen ratio the same at the lower resolution (it should be for the game to look right)?
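The aspect-ratio check above can be done with a couple of lines of code: reduce width:height by the greatest common divisor and compare. A quick sketch (the resolutions here are just examples from this thread):

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its simplest width:height ratio."""
    d = gcd(width, height)
    return (width // d, height // d)

# 2560x1600 and 1920x1200 both reduce to 8:5 (i.e. 16:10), so dropping
# from one to the other keeps the picture undistorted; 1280x1024 is 5:4
# and would be squeezed or letterboxed on a 16:10 panel.
print(aspect_ratio(2560, 1600))  # (8, 5)
print(aspect_ratio(1920, 1200))  # (8, 5)
print(aspect_ratio(1280, 1024))  # (5, 4)
```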

Good observation nonetheless so thank you.
January 3, 2008 2:17:38 PM

Quote:
Why does everyone want to play their games at high resolutions?

Because you want to play at the native resolution of the monitor.
Quote:
What are the advantages?
If you play at lower than your monitor's native resolution, you start to degrade the picture quality, since you are stretching a 1024x768 picture to 1920x1200. You'll start to notice more flaws. Also, some people prefer higher resolutions because they say they no longer need AA. I, on the other hand, prefer a moderate resolution with some moderate AA.

Quote:
My understanding is this (please correct me if I am wrong): if I buy a new game in, say, 6 months and my card cannot play it on high, I will just lower the resolution a bit and the problem is gone, so I can keep my card much longer.
You can do this if you'd like; the underlying cause is the game becoming more graphically demanding. Your card has to render more pixels as you increase the resolution, and with that larger rendering area you will see a drop in frame rates.
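To put numbers on "more pixels to render": pixel count scales with both dimensions, so the jump from 1024x768 to 2560x1600 is bigger than it sounds. A rough sketch (pure arithmetic, no GPU specifics):

```python
# Pixels the card must shade each frame, relative to 1024x768.
resolutions = [("1024x768", 1024, 768),
               ("1680x1050", 1680, 1050),
               ("1920x1200", 1920, 1200),
               ("2560x1600", 2560, 1600)]

base = 1024 * 768
for name, w, h in resolutions:
    pixels = w * h
    print(f"{name}: {pixels:,} pixels, {pixels / base:.1f}x the fill work")
# 2560x1600 is over 5x the pixels of 1024x768, which is roughly why
# frame rates fall so hard at a 30" panel's native resolution.
```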

If you're happy with 1024x768 or thereabouts on a 30" monitor, great. I think some individuals will scoff at you for it, though. :p  Kinda like buying a hi-def TV to watch only standard-def channels.
January 3, 2008 2:23:38 PM

At the end of the day, if you have found a solution that works for you and gives you a longer useful life out of your graphics cards, then carry on as you are.
In the same way that some people are all about FPS, some people are mad about video/picture quality.
Mactronix
January 3, 2008 2:26:14 PM

yvn, what is the native resolution of the 30" LCD? In other words, what is the highest resolution the monitor will display?

I ask because I play all my games on a 40" LCD...but the highest resolution it will play is 1680 x 720

Edit: Or is it 1380 x 720... I can't remember. Whatever a 720p HDTV resolution is. :pt1cable: 
January 3, 2008 2:28:04 PM

Native res is the only way to go. I play at 1920x1080 all the time; at this resolution AA is not necessary at all, as I find AA removes some of the crispness and sharpness of games.
January 3, 2008 2:38:04 PM

rallyimprezive said:
yvn, what is the native resolution of the 30" LCD? In other words, what is the highest resolution the monitor will display?

I ask because I play all my games on a 40" LCD...but the highest resolution it will play is 1680 x 720

Edit: Or is it 1380 x 720... I can't remember. Whatever a 720p HDTV resolution is. :pt1cable: 



Native resolution on most 30" monitor LCDs is 2560x1600.
January 3, 2008 3:16:34 PM

Malatar said:
Native resolution on most 30" monitor LCDs is 2560x1600.


Sounds like he's playing on a TV and not a computer monitor.
January 3, 2008 3:28:52 PM

What is the resolution of your 30"?
January 3, 2008 3:40:01 PM

TV LCDs are going to have a lower native res. I've got a 37" Olevia HDTV (1080i) and it's something like 1920x1080. I'm pretty sure a 30" monitor is going to be much higher, as mentioned above.
January 3, 2008 4:01:25 PM

The thing I noticed with upgrading your settings is that you see no difference, but when you revert to your old settings you do see a difference. I used to play all games with AA off; when I turned it on I noticed no difference...but when I turned it off again, I did see the difference.
January 3, 2008 4:03:07 PM

yvn said:

I have tried playing some games at different resolutions and did not see much difference.

Obviously I would not want to play any lower than 1024 x 720, but above that I do not see a difference. My new video card scales very well at all resolutions, so what am I missing?
Please enlighten me!



You're one of the lucky folks who can't tell the difference in high visual fidelity, so playing at high resolutions doesn't offer much of a benefit for you.

As for me - and other unfortunate souls like me - running at anything other than the native resolution of our monitors looks like hell.

So rock on and thank god you can't see what we can. :) 
January 3, 2008 4:08:32 PM

Quote:
As for me - and other unfortunate souls like me - running at anything other than the native resolution of our monitors looks like hell.
I'm the same way. If the resolution is even slightly off (the picture is squeezed or stretched a little), I notice it right away and it bugs the crap out of me. That's why I had to edit the .exe instructions for BF2142 to force it into 1280x1024... they only support 1280x900 or 1280x700 or something. It looks absolutely TERRIBLE on my native 1280x1024 screen, but all is well now.
January 3, 2008 4:25:58 PM

I could tell the difference, but it's really minimal between AA and no AA. But once you get used to having AA, turning it off makes a more dramatic difference.

But what are we really playing games for? AA? AF? It all boils down to the gameplay and the story that entertain us.
January 3, 2008 4:52:22 PM

Quote:
If you're playing on 30 inches at a res of 2560 x 1600, I think the 8800 GT isn't enough for that res. You should have at minimum a GTX or Ultra for that res.


Yeah, for 2560x1600 he'll definitely want more than 512MB of VRAM to run most games at high settings. :p 
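For a rough sense of why VRAM matters at 2560x1600, here is a back-of-envelope sketch of render-target memory alone. The helper and its buffer counts are illustrative assumptions; real usage also includes textures, geometry, and driver overhead:

```python
def render_targets_mb(width, height, bytes_per_pixel=4, msaa=1, buffers=3):
    """Rough memory for colour + depth + back buffer; with MSAA the
    colour and depth surfaces hold `msaa` samples per pixel."""
    return width * height * bytes_per_pixel * msaa * buffers / (1024 ** 2)

print(round(render_targets_mb(2560, 1600)))          # ~47 MB, no AA
print(round(render_targets_mb(2560, 1600, msaa=4)))  # ~188 MB with 4x MSAA
# At 4x MSAA the render targets alone eat over a third of a 512MB card
# before a single texture is loaded.
```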
January 3, 2008 5:22:56 PM

I play on a 17-inch LCD monitor at a standard 1280x1024, but I'm perfectly happy :) 
January 3, 2008 6:35:46 PM

This may sound like a newb question, but I'm not a big-time gamer.

How can you tell what the native resolution of your monitor is?

I have an old 19" Viewsonic CRT.

Thanks
January 3, 2008 6:44:32 PM

CRT monitors really don't have a native resolution, but they kind of do (confusing, huh? Read on). They can display just about anything you throw at them. Once you get the resolution too high, you'll notice that the monitor can't fit every line of resolution onto the screen and you'll lose quality (and probably your eyesight, trying to squint and read everything... lol). If you have a CRT TV with a VGA input or something, it does have a native resolution, which is usually something ridiculously low like 680x420. I just recommend not using a standard-def TV for graphics. >.>

On the other hand, LCDs do have a native resolution, which can be found on a label or in the documentation for the device. (You can probably just google the model # as well.)
January 3, 2008 7:00:57 PM

Cool... thanks. I am so behind the times. I really want to get a new monitor when I build my next PC.

BTW, I can only run games at low res anyway... my PC is too slow!
Anonymous
January 3, 2008 7:15:26 PM

I run an ATI HD3870 with 512MB of RAM on my 22" at its full native 1680x1050 with full effects in all games. CoD4 is sooo amazing that it almost looks real in some areas. It does 24x AA and 16x AF, which does make a difference. But in my opinion you don't need to go nearly that high, given the performance loss you suffer on most systems.

Like everyone else says, it all comes down to the user and what you find to your liking. I also find it true that the closer you are to your native resolution, the less AA and AF you need or notice.

PS: I'm also itching to get a 30" LCD. It's the next toy on my list. So in short, I'm jealous!
January 3, 2008 7:34:21 PM

rgeist554 said:
TV LCDs are going to have a lower native res. I've got a 37" Olevia HDTV (1080i) and it's something like 1920x1080. I'm pretty sure a 30" monitor is going to be much higher, as mentioned above.


Your 37" Olevia is 720p (1280x720, possibly 1366x768).

"1080" is all marketing. Before 1080p models were mainstream, you would actually see 720p advertised. Now you see 1080i on everything. Sure, 720p models will accept a 1080i signal and scale it, but they only have 1280x720 pixels.
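The pixel arithmetic behind that point, as a quick sketch:

```python
# Common panel resolutions vs. the "1080" signal they advertise.
panels = [("720p panel", 1280, 720),
          ("768 panel", 1366, 768),
          ("true 1080", 1920, 1080)]

for name, w, h in panels:
    print(f"{name}: {w * h:,} pixels")

# 1920x1080 has 2.25x the pixels of 1280x720: a 720p panel fed a 1080i
# signal must throw most of that extra detail away when it scales down.
print(1920 * 1080 / (1280 * 720))  # 2.25
```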
January 3, 2008 7:34:27 PM

You bastards, I'm on a 15" at 1024 x 768... and it's 48ms gray-to-gray.


=[
January 3, 2008 7:35:10 PM

yvn said:
So here is the deal:
I have an HP 30” LCD and just bought an NVIDIA 8800GT 512MB.
I have tried playing some games at different resolutions and did not see much difference.
Why does everyone want to play their games at high resolutions? What are the advantages?
The only disadvantage I see is that you have to keep buying high-end video cards just to keep up with new games at high resolutions.
My understanding is this (please correct me if I am wrong): if I buy a new game in, say, 6 months and my card cannot play it on high, I will just lower the resolution a bit and the problem is gone, so I can keep my card much longer.
Obviously I would not want to play any lower than 1024 x 720, but above that I do not see a difference. My new video card scales very well at all resolutions, so what am I missing?
Please enlighten me!
Thank you very much all!


Why bother playing at higher resolutions? Hmmm......

Because the marketers who sell all this crap told you so! That's why! BUY BUY BUY!!!!!!

I agree, though. I don't bother with anything more than 1280x1024, and 1024x768 still does me fine.
If the game looks good at a lower res, then that is all that matters to me.

January 3, 2008 8:01:38 PM

Why higher resolutions? It's simple: if you want a larger image (a bigger monitor), you need a higher resolution to maintain image quality. It's not marketing or big evil corporations trying to swindle you. Need proof? Right-click on somebody's avatar and save it. Open it and zoom in to 150%. If you don't see image degradation, you should get your eyes checked. My 22" monitor has a 1680x1050 native resolution. Anything lower becomes very obviously degraded. Anything larger becomes distorted in order to fit what is essentially a large image onto a screen that's too small for it.
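The "bigger screen needs more pixels" point is really about pixel density. A quick sketch, using monitor sizes mentioned in this thread:

```python
from math import hypot

def ppi(width, height, diagonal_inches):
    """Pixels per inch: diagonal pixel count over diagonal size."""
    return hypot(width, height) / diagonal_inches

print(round(ppi(1680, 1050, 22)))  # ~90 PPI on a 22" at native res
print(round(ppi(1680, 1050, 30)))  # ~66 PPI: same pixels stretched to 30"
print(round(ppi(2560, 1600, 30)))  # ~101 PPI: why 30" panels go to 2560x1600
```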
January 3, 2008 8:04:51 PM

Quote:
Your 37" Olevia is 720p (1280x720, possibly 1366x768).

"1080" is all marketing. Before 1080p models were mainstream, you would actually see 720p advertised. Now you see 1080i on everything. Sure, 720p models will accept a 1080i signal and scale it, but they only have 1280x720 pixels.
Are you sure? Even when going through the options it displays 1920x1080 - 1080i, etc. Also, I can scale through different definitions: standard def (480i - I forget the res here), 720p at 1280x720, and of course 1080i at 1920x1080. Each one displays the actual resolution as listed, and each looks progressively better, with 1080i looking the absolute best.
January 3, 2008 9:59:57 PM

Am I the only one losing my marbles reading this? Why are people quoting 1080i as a good thing? 1080i is a waste of time. If your screen will not accept 1080p, which is 1920x1080 progressive, then why bother trying to force 1080 onto it? It will only display 1366x768 pixels (or something like that), so you will lose image quality.

I would recommend that everyone get a high-end card at least once in their life. Play a game at full specification and you will never go back; the quality is generally amazing. Imagine, at low res, a person in the distance is a black square because the pixel is massive, having been stretched. Try getting a headshot. Now imagine, at high res, the person is detailed: you can see what is a head, a body, and legs... you will find it easier to shoot them in the head, because you will be able to see it. I realise it's not always that bad, but that's the basics of it.
January 3, 2008 10:25:01 PM

Yeah, I have an X600, so in CoD4 I play at 800x600 and turn on 4x AA. Looks OK then.

But native res always looks the best.
January 3, 2008 11:30:57 PM

gow87 said:
Am I the only one losing my marbles reading this? Why are people quoting 1080i as a good thing? 1080i is a waste of time. If your screen will not accept 1080p, which is 1920x1080 progressive, then why bother trying to force 1080 onto it? It will only display 1366x768 pixels (or something like that), so you will lose image quality.

I would recommend that everyone get a high-end card at least once in their life. Play a game at full specification and you will never go back; the quality is generally amazing. Imagine, at low res, a person in the distance is a black square because the pixel is massive, having been stretched. Try getting a headshot. Now imagine, at high res, the person is detailed: you can see what is a head, a body, and legs... you will find it easier to shoot them in the head, because you will be able to see it. I realise it's not always that bad, but that's the basics of it.

I agree, interlaced video is crap. Spend the extra money and get 1080 progressive.
January 4, 2008 6:15:45 AM

Hey guys, I'm planning to get a 20" or 22" LCD. Which do you recommend, widescreen or standard? Oh, and the same number of pixels on a smaller screen is better quality, right? They both have the same resolutions.
January 4, 2008 7:45:56 AM

I game at ultra high resolutions because not only do they look better, they also allow more viewable content to be displayed on the screen while playing. The reason framerates suffer so badly at higher resolutions is that the number of pixels the card has to process grows quickly (doubling both dimensions quadruples the pixel count)...the high-end cards are generally up to the task, but high resolutions in today's latest games generally require a multi-card setup.

So, they look better and you can see more...but I can only speak for myself.
January 4, 2008 7:50:16 AM

@neisonator...

It depends on the resolution you're talking about. If the highest resolution supported by both monitors is, say, 1680x1050, it'll look best on the 22"; the 20-inch standard-ratio monitor will either show black bars or stretch the image, depending on your display adapter settings...

I personally have a 22" with a native resolution of 1680x1050 and it looks fantastic... also very cost-effective at around 200...
January 4, 2008 7:56:48 AM

Cheers mate, will get the 22" widescreen. Might have something that might put up a challenge to my GTS lol
January 4, 2008 1:55:01 PM

cah027 said:
This may sound like a newb question, but I'm not a big-time gamer.

How can you tell what the native resolution of your monitor is?

I have an old 19" Viewsonic CRT.

Thanks


CRTs are much more flexible than LCDs; they'll look decent at nearly every input resolution, unless of course you go beyond the maximum and shrink the image.
LCDs, however, scale very poorly. If the max (i.e. native) resolution is 1600x1200, for example, and you send, say, 1024x768 at it, it looks really bad. You want to use the native resolution for all applications on an LCD; otherwise, you're probably better off using a CRT.
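Why a non-native signal looks bad on a fixed-pixel panel can be sketched in a few lines. The simplest scaler, nearest-neighbour, just fattens each source pixel into a block, so fine detail turns chunky (real scalers interpolate instead, which produces the blur complained about earlier in this thread). A toy example:

```python
def upscale_nearest(image, factor):
    """Nearest-neighbour upscale: every source pixel becomes a
    factor x factor block in the output."""
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in image for _ in range(factor)]

# A 2x2 checker "image" stretched 2x: single-pixel detail becomes 2x2 blocks.
src = [[1, 0],
       [0, 1]]
for row in upscale_nearest(src, 2):
    print(row)
# [1, 1, 0, 0]
# [1, 1, 0, 0]
# [0, 0, 1, 1]
# [0, 0, 1, 1]
```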
January 4, 2008 2:34:06 PM

It'd be nice if the OP read his own thread and answered a few questions.

Or maybe I'm just too sensitive.
January 4, 2008 6:39:21 PM

cleeve said:

You're one of the lucky folks who can't tell the difference in high visual fidelity, so playing at high resolutions doesn't offer much of a benefit for you.

As for me - and other unfortunate souls like me - running at anything other than the native resolution of our monitors looks like hell.

So rock on and thank god you can't see what we can. :) 



I couldn't agree more, cleeve. And as for the current 30" LCD computer monitors, the native resolution is 2560x1600, as some of you already mentioned above. Gaming on my Sony 52" XBR4 LCD HDTV with a PS3 @ 1080p is no comparison to my PC @ 2560x1600...at least to me.
Happy gaming!
______
System:
* Intel Core 2 Q6600
* Tt Orb II Blue Cooler
* BFG GeForce 8800Ultra 768MB DDR3
* ASUS P5B-VM DO with Tt LANBox microATX case
* 2x1GB, PC2-6400, CL=4-4-4-12
* Ultra X2 750-Watt with UV & SLI ready
* 1.5TB Seagate RAID 0
* 150GB WD Raptor X
* Lite-On 20x SATA/DVD/CD Writer with LightScribe
* TrackIR 4 Pro
* Cougar HOTAS
* Dell UltraSharp 3007WFP LCD (2560x1600 native resolution)
One can never have enough!
January 4, 2008 8:00:13 PM

Quote:
What are the advantages?


If you are playing any FPS games, then quality makes a HUGE difference, especially when you are sniping people.
January 4, 2008 8:07:03 PM

It's because you don't know how to shoot. :lol: 

Sitting there sniping is no fun. I'd rather run around and blast everyone. :bounce: 
January 4, 2008 8:57:05 PM

1920x1200 & 4xAA is enough for me in games. I can use more res in Windows apps, but for games I don't notice any difference.
As another poster pointed out, interlacing sucks. Don't bother.
Also, forget about CRTs; this is the 21st century already!

Also, get a 28" or larger widescreen @ 1920x1200+ if you can afford it. I see a HUGE difference over a 20" 1600x1200. Night and day! Those extra few inches make all the difference in games and movies!
Maybe it helps that the new monitor is like 2 generations newer and brighter, but WOW.
January 4, 2008 9:09:43 PM

I don't find higher/lower resolution a problem in itself, since what matters is how many pixels are on screen; if you use the same res at a bigger screen size, however, then it matters.

I just run on a 720p HDTV with 4x AA. Funnily enough, I get a practically perfect picture and much better FPS than a screen running 2560x1600 without any AA at all. A few insignificant details are missing, but it doesn't really matter when I can push the image quality up further. Also, I don't need dual GPUs to power the screen.

That's just me, though.
January 4, 2008 9:41:32 PM

rallyimprezive said:
yvn, what is the native resolution of the 30" LCD? In other words, what is the highest resolution the monitor will display?

I ask because I play all my games on a 40" LCD...but the highest resolution it will play is 1680 x 720

Edit: Or is it 1380 x 720... I can't remember. Whatever a 720p HDTV resolution is. :pt1cable: 


If you hook up to your monitor using a high-def cable such as HDMI or component, and your video card supports HD output, you can get better resolutions. I am currently hooked up to a 50" at 1080i; btw, that means you still run at 1920 x 1080, but you are capped at an effective 30 Hz frame rate, whereas 1080p is 60 Hz.
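The interlacing arithmetic behind that: 1080i delivers 60 fields per second, each carrying half the scan lines, so complete frames arrive at only 30 per second, while 1080p delivers 60 full frames. A sketch:

```python
def full_frames_per_second(field_rate_hz, interlaced):
    """Complete frames per second: an interlaced signal needs two
    fields (odd lines, then even lines) to build one full frame."""
    return field_rate_hz // 2 if interlaced else field_rate_hz

print(full_frames_per_second(60, interlaced=True))   # 30 -> 1080i
print(full_frames_per_second(60, interlaced=False))  # 60 -> 1080p
```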
January 4, 2008 10:02:19 PM

cleeve said:
You're one of the lucky folks who can't tell the difference in high visual fidelity, so playing at high resolutions doesn't offer much of a benefit for you.

As for me - and other unfortunate souls like me - running at anything other than the native resolution of our monitors looks like hell.

So rock on and thank god you can't see what we can. :) 


Either that, or his TV/connection are crap.
January 4, 2008 10:12:49 PM

aznstriker92 said:
Yeah, I have an X600, so in CoD4 I play at 800x600 and turn on 4x AA. Looks OK then.

But native res always looks the best.

Ahh!! Low res + AA = one giant blur! Drop the AA and go for 1024x768. :lol: 
January 5, 2008 7:17:21 AM

cleeve said:
So rock on and thank god you can't see what we can. :) 


rofl
January 5, 2008 8:22:13 AM

enewmen said:
1920x1200 & 4xAA is enough for me in games. I can use more res in Windows apps, but for games I don't notice any difference.
As another poster pointed out, interlacing sucks. Don't bother.
Also, forget about CRTs; this is the 21st century already!

Also, get a 28" or larger widescreen @ 1920x1200+ if you can afford it. I see a HUGE difference over a 20" 1600x1200. Night and day! Those extra few inches make all the difference in games and movies!
Maybe it helps that the new monitor is like 2 generations newer and brighter, but WOW.


I still run a couple of old high-end 19" CRTs that go up to 2048x1536. The nice thing is that at higher resolutions the pixels themselves shrink, unlike on LCDs, where an increase in resolution just means a bigger panel, not a pixel shrink. Shrinking pixels automatically reduces jaggies significantly, as well as making the image look sharper in general. The way CRTs draw the image can also lead to far smoother performance in games.

If you are using an LCD as your display for everything, then I can understand wanting a big monitor. I use a projector for movies, and a 19" CRT is plenty big enough for gaming; when you crank up the resolution you can fit a huge amount on the screen and not have to divert your eyes as far to monitor stats etc. (high resolution is a godsend for games with lots of menus and sub-displays). For gaming purposes, more of the screen is in your central field of vision on a smaller screen with higher resolution, as opposed to a larger screen with the same number of pixels.

A really nice feature of good old CRTs is flexibility. I find an ideal resolution for Windows and play games at totally different resolutions without any negative side effects. My monitor, which lets me play at 2048x1536 in some games, can turn down to 1280x1024 to improve performance in Crysis without suffering from any native-resolution issues.

The only issues with old high-end CRTs are availability and ageing. Sadly, they aren't getting any younger...

Sure, it's the 21st century, but compare a 200-year-old oak table to something you get flat-packed from MFI: old doesn't mean lower quality or less practicality. It's probably heavier and bulkier, though :lol: 

CRTs may be old, and they may be nearly out of production, but they are still the display of choice for many serious imaging professionals, and for some of the gamers who were in a position to own high-end CRTs rather than cheapies before cheap LCDs flooded the market.

If your CRT cost £400+ five years ago, chances are there's little in the way of LCDs now that you would swap it for.
January 5, 2008 8:23:19 AM

MisfitSELF said:
Sounds like he's playing on a TV and not a computer monitor.



No wonder he can't tell the difference in resolution.
January 5, 2008 11:50:43 PM

I just hate using anything over 1280x960, because my mouse's DPI is not high enough to cover the extra pixels without me throwing my hand all over the desk :lol: 
January 6, 2008 12:14:54 AM

It would be nice to get some screenshot comparisons of lower res + AA vs. higher res without AA, at roughly the same performance.

Say, 1280x1024 w/ 4x AA compared to 1920x1200 with no AA.
January 6, 2008 12:20:16 AM

I'll try to do some later. I'll even throw that (ugh) 800x600 w/ 4xAA in.