
Gaming at 1680x1050 vs. 1920x1200 with 8800 GTX

May 3, 2007 2:54:24 AM

Hello everybody!

I will purchase an 8800 GTX on the 15th (or an R600 if they're finally available!). I am still debating whether to spend the additional $450-$500 for a 24" monitor, or whether to save the money and get a 22".

I have seen the benchmarks, and the 8800 GTX will be able to handle both resolutions beautifully with everything maxed out on just about every single game, some taking AA better than others.

My big question is: will there be a noticeable difference in image quality between those two resolutions?

Keep in mind that I will attempt to run 16xAA whenever possible no matter which res I choose, because I hate jaggies with a passion.

Please keep the discussion to image quality only. Frames per second can be fixed with SLI, which I'll definitely do if needed.

Thanks for any help. I would love to see some screenshots if anyone knows of any links. I sadly could not find anything.
May 3, 2007 3:57:00 AM

I play all my games at 1920x1200 and couldn't go back to less than that.
May 3, 2007 4:43:01 AM

I'm not sure what you mean by a difference in image quality between resolutions. A 1920 x 1200 display fills its screen with 2,304,000 pixels, while a 1680 x 1050 display draws 1,764,000. The only difference between the screens is that the larger screen contains more pixels, so the image will be larger.
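
If you want to sanity-check that arithmetic, here's a quick Python sketch (nothing in it beyond the resolutions discussed above):

resolutions = {"1680x1050": (1680, 1050), "1920x1200": (1920, 1200)}
for name, (w, h) in resolutions.items():
    # total pixels the card has to render per frame
    print(f"{name}: {w * h:,} pixels")
# 1680x1050: 1,764,000 pixels
# 1920x1200: 2,304,000 pixels (about 31% more work per frame)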

There's really not much to it. If I had the cash for it, I'd certainly plunk down the money for a 24" screen and an 8800GTX, no question. I play at 1680 x 1050 on a Radeon X1900XT, and that's pretty much heaven for me considering I'm a college student with limited desk space and a budget.

Why not go to a store that carries the two screens you're interested in buying, get friendly with them, and ask if you can compare them both on a high end system? You might be able to finagle that at your local Best Buy or something.

If you're going to spend that much dough on something you'll be staring at for the next few years, make sure it's exactly what you want.
May 3, 2007 12:10:48 PM

Sounds reasonable, but no such luck. The only store in the area is a CompUSA, and they don't allow it. Already tried.

Anyway, I can understand your point because I am a graphic artist. I understand the DPI concept. However, the reason I ask is this: I realise that one of the main pros of more pixels is much better image smoothness, i.e. fewer jaggies. But since the 8800 GTX is capable of such ridiculous AA (16xAA), I am questioning whether the added pixels are really necessary. Two more inches of screen space are definitely not worth an extra $450. However, if the added pixels can provide noticeably better image quality and definition, then I'd go for it.

For instance, is there a noticeable difference in image smoothness between the two resolutions without AA turned on? Do textures and the like look more detailed with the added pixels?

I'm just trying to justify the extra $450 I would need to shell out. Also, playing at 1680x1050 would obviously help my 8800 GTX achieve a longer service life.

@BadDad

Would it be a hassle if I asked you for screenshots of any game at both resolutions? Could you play around with both settings and give your honest opinion? Is there a noticeable difference in image quality? Obviously, if you have an LCD, the image quality might deteriorate by lowering the res, but see what you can do. Thanks.
May 3, 2007 12:30:19 PM

Curious about this myself. Hope you get an answer.
May 3, 2007 12:47:46 PM

Don't count on constant high frame rates at 1920x1200 with all the options maxed.

I run two PCs with 8800 GTXs on CRT screens at those sorts of resolutions and a little higher, and Oblivion with texture mods can still stutter at times in intense areas. X3 unmodified will have the occasional stutter, and LOTRO also causes issues (although that could be coding related, as the client appears to have some memory leak issues). I don't doubt there are other games out there that can cause slowdown even with the 8800 GTX when you get to high resolutions with everything turned up.

You also need to make sure that the actual pixel size is smaller when you go to the larger screen in order to see a significant improvement in image quality from the resolution; otherwise, if the pixels are the same size but there are just more of them, you are simply seeing more of the game at the same quality. It won't take away any jagged edges unless the pixels are smaller. I use LCD screens at work but CRT screens for play. I'm going to be gutted when one of these CRTs dies, because there's little that can match them for image quality, refresh rate, or resolution with high DPI. :(

I'm planning on going SLI as soon as I have the money.
May 3, 2007 1:05:46 PM

Quote:
The pixel pitch of a 24" is better than that of a 22".

http://www.newegg.com/Product/Product.aspx?Item=N82E168...
http://www.newegg.com/Product/Product.aspx?Item=N82E168...

Not an earth-shattering difference, but it's there.
And the price difference certainly reflects that.

Anyways, I don't think that performance should be your concern when choosing between the 22" and the 24". Playing with 8xAA instead of 16xAA isn't going to kill you, so choose the monitor you want, not the one you think will perform better.
May 3, 2007 1:14:27 PM

In my experience, I can really tell the difference between 1920x1200 and 1680x1050 on my 24". Especially in Oblivion, I cannot play at 1680x1050 without cringing. Although 1680x1050 might be adequate on a 22", it's definitely not on a 24" (at least for me).

Also, don't expect to play newer games maxed out at 1920x1200 with a single 8800 GTX at 16xAA. It's a good card, but not invincible. IMO, 16xAA is overkill at that resolution; 8xAA (even 4xAA) is more than sufficient for me at that res.
May 3, 2007 1:19:43 PM

Quote:
Don't count on constant high frame rates at 1920x1200 with all the options maxed.

I run two PCs with 8800 GTXs on CRT screens at those sorts of resolutions and a little higher, and Oblivion with texture mods can still stutter at times in intense areas. X3 unmodified will have the occasional stutter, and LOTRO also causes issues (although that could be coding related, as the client appears to have some memory leak issues). I don't doubt there are other games out there that can cause slowdown even with the 8800 GTX when you get to high resolutions with everything turned up.

You also need to make sure that the actual pixel size is smaller when you go to the larger screen in order to see a significant improvement in image quality from the resolution; otherwise, if the pixels are the same size but there are just more of them, you are simply seeing more of the game at the same quality. It won't take away any jagged edges unless the pixels are smaller. I use LCD screens at work but CRT screens for play. I'm going to be gutted when one of these CRTs dies, because there's little that can match them for image quality, refresh rate, or resolution with high DPI. :(

I'm planning on going SLI as soon as I have the money.


Ditto on the CRTs for gaming. I've often thought about trading my LCD in to go back to a CRT. Can't be beat for IQ. I have dreams about this:

http://www.amazon.com/Sony-GDM-FW900-Widescreen-Trinitr...
May 3, 2007 1:21:54 PM

That looks nice, but I don't have the desk space; I'd definitely need a new piece of furniture to go with the PC.
May 3, 2007 1:27:30 PM

Go for the 24"; you won't regret it.
May 3, 2007 1:48:14 PM

Quote:
That looks nice, but I don't have the desk space; I'd definitely need a new piece of furniture to go with the PC.


*It is a piece of furniture :lol: 

But yea, not the most space-efficient monitor.

My Dell 2405 is great. Would definitely recommend.
May 3, 2007 1:49:57 PM

I have a 24-inch Acer and two X1900s in Crossfire, and I don't think I could ever go back to playing at anything less than 1920x1200. Well, at least not happily, anyway.
May 3, 2007 1:56:26 PM

While I do appreciate your comments and the comments of other posters, I would appreciate it greatly, and I'm sure everybody else would as well, if one of you gamers with a 1920x1200 screen could do us all the favor of taking some screenshots.

The ideal setup would be taking 4-5 screens at 1920x1200 and 4-5 screens at 1680x1050, each screen with a different level of AA. Obviously, if someone has a better setup, please include more screens depending on the AA capabilities of the setup. Because if you can go all the way up to 16xAA, you will be a godsend. Please do not resize the images or reduce the quality. Sorry if I'm being too demanding, but honestly, anyone who goes through all the trouble will have the appreciation of everyone participating in this thread. Thanks!!

Edit: A scene with power lines, trees, and other things that can take advantage of AA would also be a perfect candidate for this. Thanks!!
May 3, 2007 3:30:37 PM

I gave this a lot of thought when I bought my current monitor. I settled for a 22" 1680x1050 Samsung. I wanted my 8800 GTX to have some headroom to push this monitor on future titles. I really don't like the idea of multi-GPU setups and am hoping that as DX10 titles come out I can still do as I do now, which is run anything at native res with good-to-max settings.

I have no doubt, though, that anyone sitting in front of a 24" can't go back. I know when I get behind my 19" it feels piddly, though when it was new I thought it was huge.

Now I could really warm up to a 30", but that definitely requires a multi-GPU setup.
May 3, 2007 3:35:33 PM

That has been my reasoning. Getting a 22" monitor will extend the service life of a $550 card.

So far, I haven't been able to justify the purchase of the 24". It is much better quality and allows more connections, but I plan on purchasing a projector anyway, so I would use the PJ as my primary TV and game console display. The monitor is just for gaming, internet, web design, and programming.

The lower pixel count will most likely allow me to use 16xAA with smoother fps. And in the end, image quality is what I'm looking for, because if I want size, I'll just hook the PC up to my PJ and get 100" of gaming on my wall. Now that would be a sweet setup.
May 3, 2007 4:58:48 PM

Well, your uses for the monitor change things somewhat. If you want to program and do web design, then the extra screen real estate and higher resolution are just what the doctor ordered.

I used a 30" at my job when I coded, and it made life soooo much easier.
May 3, 2007 5:30:30 PM

I think you're slightly confused.

There is no inherent difference in image quality between 16x10 and 19x12 resolutions. Like some have already mentioned, the difference lies in the DPI; this is where your comment about fewer "jaggies" and such comes in. The more DPI, the better. However, you're never going to (well, you shouldn't) run your 24" native 19x12 monitor at 16x10, and your 22" 16x10 LCD won't run at 19x12. Since pixel pitches are all about the same, more or less, there will not be a difference in image quality.

That said, I cannot go back to a 22" for several reasons.

1) Real estate: the difference two inches makes at that size is huge, and trust me, it's worth every penny. If you have the pennies to begin with, that is.

2) TN panel (all 22" panels are TN at the moment, I believe) vs. a real panel. I don't often look at my monitor from the side, but there are enough times when you turn your head or move around, and if you have a TN panel you immediately start to see the dreaded darkening/yellowing of the screen. Heck, with most 22" TN panels I can see it just from the angle between where I'm sitting in front of it and the sides of it (if that makes sense). So yeah, TN panel = bad; if you ever see one side by side with a real panel, you will never use one again.

Really it comes down to your budget, since a 24" screen will still run you double what a 22" will cost. But since you have a GTX already and are even thinking SLI, I would say go for it. You certainly won't regret it; I don't think I know any 24" LCD users who regretted the purchase.
May 3, 2007 5:51:00 PM

Sorry to be a bit of a noob about this, but how do I work around the Oblivion screenshot restrictions? I can't take screenshots with AA switched on. I've manually edited the .ini file, but it just won't work with AA enabled. :(

If you could see what Oblivion looks like at 2048x1536 with just 8xAA on a monitor with a 0.22 dot pitch, you would definitely not be wanting 16xAA.

Even if I could take the screenshots with AA turned on, they wouldn't look the same on a monitor with a higher dot pitch. :(
May 3, 2007 5:54:07 PM

24" have many pros over the 22"

The biggest one is avoiding the TN panel, as you stated. The BenQ FP241W sports a P-MVA panel, so of course it is much better quality in comparison to a TN panel. Also, the connectivity is great with the BenQ, with its HDMI and composite connections.

I have the money, don't get me wrong, but the whole thing is justifying the purchase, as it will run me $450 extra. However, your argument so far is pretty strong, so right now I'm leaning towards the BenQ.

@DTQ

Thanks for the attempt, man. Sadly I can't help you, since I would not know how to do it. Good luck.
May 3, 2007 8:01:12 PM

Will grab some pictures from LOTRO at 2048x1536 with 8xAA on as soon as it gets to daytime (jaggies probably won't show so well against the night sky). Will have to wait to post them here until I get into my work's FTP server; I don't think Photobucket allows pictures that big anymore. :(
May 3, 2007 9:22:04 PM

Is it going to matter what screen rez the screenshot is taken at if you are going to be examining it at whatever rez you currently have? :?

I know you could just look at part of it at a time, but that's not going to give you an idea of what 19x12 looks like; it's just going to give you an idea of what part of that image looks like at 1280x1024 (or whatever you have now). If you really want to see how high the image quality is at higher resolutions, you've got to get in front of one somewhere. It seems like if you told the guys over at the computer store that you were going to buy one, they would let you test drive it, no?

I do CAD work and some of the newer laptops they are buying us are 19x12, and brother they are really nice! (of course they haven't got ME one yet...)
May 3, 2007 10:33:23 PM

[Screenshot at 1680x1050]

[Screenshot at 1920x1200]

I don't know if this helps or not.
May 3, 2007 10:33:47 PM

Your post has two ideas behind it. One is the concept of DPI, which is a printing term, though I do not know the monitor jargon for this effect. The other is pixel pitch, which is a similar concept. The laptops you are mentioning, with 1920x1200 17" screens, have an amazing pixel pitch in comparison to most LCDs. At 1680x1050, the 20" monitor offers the best pixel pitch, and the difference between the 22" and the 24" in pixel pitch is almost negligible.
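
To put rough numbers on the pixel pitch idea, here's a quick Python sketch; the panel sizes are the ones mentioned in this thread, and the formula is just diagonal pixels divided by diagonal inches:

import math

def ppi(width_px, height_px, diagonal_in):
    # pixels per inch along the diagonal; pixel pitch is its inverse in mm
    return math.hypot(width_px, height_px) / diagonal_in

for name, w, h, diag in [('22" 1680x1050', 1680, 1050, 22),
                         ('24" 1920x1200', 1920, 1200, 24),
                         ('20" 1680x1050', 1680, 1050, 20),
                         ('17" 1920x1200', 1920, 1200, 17)]:
    p = ppi(w, h, diag)
    print(f'{name}: {p:.1f} PPI, ~{25.4 / p:.3f} mm pixel pitch')

The 24" comes out around 94 PPI versus roughly 90 PPI for the 22", which is the "slightly better, almost negligible" difference; the 20" and the 17" laptop screens score higher still.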

However, where you are mistaken is that at 1024x768, you say you only see a part of a 1920x1200 screenshot. That is false, because when you raise the resolution in a game, you are not increasing your viewing area; you are increasing the amount of DPI (not really DPI, but it's the closest thing I can think of that resembles the effect of raising the res). With higher DPI, there are more pixels available per inch of screen space, meaning more definition and fewer jaggies.
May 3, 2007 10:38:05 PM

Great job there, BadDad.

Is it possible to get these images in HQ?

You can send them over to my email if it's too much of a hassle.

PM me and I'll give you my email.
May 3, 2007 10:45:54 PM

I think your basic misconception is that you are going to be resizing your gaming resolution on your LCD. This worked in the CRT era: you could up your res basically to whatever your video card would support and get better image quality, because essentially your DPI was increasing. But LCDs are slightly different in that they have a NATIVE resolution (the number of physical pixels), and ANY other resolution needs to be interpolated and simply WILL NOT look nearly as good (i.e., 16x10 will look SIGNIFICANTLY worse on a 19x12 monitor than on a native 16x10 monitor, even if you don't stretch the resulting picture).
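
To make the interpolation point concrete, here is a minimal Pillow (Python) sketch; the file name is hypothetical, and this is only an approximation of what a 19x12 panel's scaler does when fed 16x10:

from PIL import Image

frame = Image.open("shot_1680x1050.png")            # non-native input (hypothetical file)
shown = frame.resize((1920, 1200), Image.BILINEAR)  # interpolated up to the panel's grid
shown.save("interpolated_on_1920x1200_panel.png")   # noticeably softer than a native render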

Thus your image quality on all current LCD monitors, as determined by the pixel pitch, is about equal. You just get a BIGGER screen at the same DPI (= pixel pitch) with a 19x12 monitor vs. a 16x10 monitor.
May 3, 2007 11:02:10 PM

You are correct, sir. However, the pixel pitch of a 24" is slightly better than that of a 22".

My point to PETEvsDRM was that changing resolutions does not change the viewing area. This is what I understood from his post. If that is not what he meant then I apologize.
May 3, 2007 11:16:06 PM

Indeed, you're right, but the whole point is that you ain't gonna be doing any resizing unless you want your IQ to go down the drain. If you buy a 22-inch monitor, you WILL be gaming at 16x10, and if you buy a 24-inch, you WILL be using 19x12, and nothing else.
May 3, 2007 11:48:47 PM

That I perfectly understand.

What are you gaming with?
May 4, 2007 12:43:45 AM

Amen, same here. Just say no to TN panels!
May 4, 2007 12:56:14 AM

Quote:

My point to PETEvsDRM was that changing resolutions does not change the viewing area. This is what I understood from his post. If that is not what he meant then I apologize.


Actually, that's incorrect. If the game supports widescreen, then it will change your view angles. Oblivion and a bunch of others do this (I know Oblivion personally, so it's my example).

In your reply to PvD you used the 1024x768 resolution (4:3) and 1920x1200 (16:10), and that will show an effect. However, keep them both WS resolutions and the game should scale the scene equally: you should see on a 1680x1050 screen essentially what you see on the 1920x1200 screen, though they'll both be different from the 1280x1024 screen.

Attach an external monitor or use a CRT and change the resolution from WS to 4:3, and you will see the exact same area rendered differently, since the game automatically adjusts for WS, as it should if supported. If the game didn't adjust for WS resolutions, your image would be distorted or you'd have black bars.
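
For games with proper "Hor+" widescreen support, the adjustment is plain trig: the vertical FOV stays fixed and the horizontal FOV widens with the aspect ratio. A quick Python sketch, with a 60-degree vertical FOV assumed purely as an example (individual games vary; see the list below):

import math

def horizontal_fov(vfov_deg, aspect):
    # standard Hor+ relation: tan(h/2) = tan(v/2) * aspect
    v = math.radians(vfov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

for name, aspect in [("4:3", 4 / 3), ("16:10", 16 / 10)]:
    print(f"{name}: {horizontal_fov(60, aspect):.1f} deg horizontal")
# 4:3 -> ~75.2 deg, 16:10 -> ~85.5 deg: the WS player simply sees more scene.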

Here's a WS support list;
http://www.widescreengamingforum.com/wiki/index.php/Mas...

Look at the difference in Oblivion between 4:3 and 16:10 support;
http://www.widescreengamingforum.com/screenshots/oblivi...
May 4, 2007 1:14:44 AM

Personally, I think that 1920x1200 is a resolution where things start to get pretty demanding on the GPU. Am I the only one who thinks even an 8800 GTX will struggle to play titles at 1920 pretty soon (if not already now; Oblivion, etc.)? I think if you want to game at 1920 now AND 1.5-2 years from now, get 8800 GTXs SLIed to be safe.

The bottom line is that a single 8800 GTX can already be brought to its knees at 1920 today by certain titles! Imagine titles a year from now (Crysis, etc.). 8800 GTXs SLIed!
May 4, 2007 1:34:59 AM

I see the mistake I made. I should've pointed out that it was 1440x900 in comparison to 1920x1200. That was the point I was trying to make. I didn't realise I was forgetting about the aspect ratios. You understood me right?
May 4, 2007 2:21:42 AM

@vpsaline

That is why I'm getting a P5N-E SLI. Future-proofing my investment. :wink:

Also, I must admit that when my two most awaited games come out (Brothers in Arms: Hell's Highway and Assassin's Creed), if the performance is not at the point where I'd like it to be, I'm upgrading instantly, whether it be SLI or another GPU. I could always sell my 8800 GTX to a friend to recoup some of the money. But currently there is no game title on the market an 8800 GTX can't run.

http://techreport.com/reviews/2007q1/geforce-8800-sli/i...

Even at 2560x1600 in one of the most demanding games out, a single 8800 GTX can still get you by at a reasonable level.
May 4, 2007 2:23:12 AM

Here are the links to the images provided by BadDad in HQ!


Thanks a lot!


May 4, 2007 2:31:50 AM

Hmm, actually, I think I've been talking out of my @ss... it would make sense that having more pixels will improve the IQ even if the pixel pitch stays the same.

Please ignore my previous dumbass comments... :x
May 4, 2007 2:33:37 AM

Don't come to that conclusion just yet.

The reason being that this image obviously has some AA applied to it.

That kind of takes away from the ability to accurately judge the real level of IQ between them.
May 4, 2007 2:48:07 AM

There is no AA applied to those pictures.
May 4, 2007 2:53:57 AM

Upon closer examination... I can't really find any differences?
May 4, 2007 2:55:57 AM

Are you asking or telling me?

I personally do not see any difference in image quality.

Obviously, the pixel pitch would be better with the 1920, so that is a plus for IQ right off. But I'm just amazed at how good the quality of the image is without AA.

I mean... that's just stupid good.
May 4, 2007 7:13:15 AM

No AA? The 1920 I could just about understand that on, but the 1600? With no AA at all I would expect to see some jaggies there, even on my 0.22 monitor; even at 1920 I get some. Are you sure the Nvidia drivers aren't overriding the game settings? My single 8800 GTX should give similar image quality to the 8800 GTX in SLI at the same settings. Maybe it's just that scene; I don't know which game that is, so I don't know if it's an intro or in-game scene, but certainly none of the games I have here show those sorts of edges without AA enabled either in game or in the drivers.
May 4, 2007 7:55:25 AM

Note on screen captures: if you want to give really accurate pictures and still apply compression, use PNG, not JPEG. JPEG will modify the image (even in HQ), making the AA level, for example, irrelevant.
May 4, 2007 10:18:10 AM

They were screenshots of gameplay in Quake 4, uncompressed JPEG.
May 4, 2007 10:21:21 AM

JPEG doesn't do uncompressed; even 100% quality results in image degradation.
Use PNG for non-destructive image compression.
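
A minimal Pillow (Python) sketch of the difference; the file names are hypothetical:

from PIL import Image

shot = Image.open("quake4_shot.bmp")    # e.g. a raw capture
shot.save("shot.png")                   # PNG: lossless, AA edges survive intact
shot.save("shot.jpg", quality=100)      # JPEG: still lossy, even at quality=100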
May 4, 2007 12:17:50 PM

The 8800 GTX does have some default AA settings applied to it. I read it here, I think:

http://www.rage3d.com/reviews/video/nvidia8800/index.ph...

It is not an in-game scene. I saw the non-AA version and the AA version, and there is a difference with the fence in the back. I'll try and post it later.

@Mitch074

I think you're being a little too paranoid here.
May 4, 2007 4:35:36 PM

Quote:
Here are the links to the images provided by BadDad in HQ!

Thanks a lot!

All I was saying is that when you open up these pictures at full resolution, you will have to scroll one way or the other to see the whole thing.

The entire image cannot be displayed at full resolution unless your monitor's resolution is the same as the resolution the screenshot was taken at, or greater.

That's why looking at a screenshot will not give you an idea of what the higher rez monitor looks like.



All you are ever seeing (unless you resize the image, which defeats the point) is a fragment of the screenshot at your current monitor's native resolution. :wink:

Which gives you no idea how the image would look on the actual high-rez monitor.
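
That scrolling point is easy to quantify; a tiny Python sketch, with 1280x1024 assumed purely as an example desktop resolution:

shot_w, shot_h = 1920, 1200   # screenshot resolution
mon_w, mon_h = 1280, 1024     # example monitor resolution
visible = (min(shot_w, mon_w) * min(shot_h, mon_h)) / (shot_w * shot_h)
print(f"Visible at 1:1 zoom: {visible:.0%}")   # ~57%; the rest needs scrolling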
May 4, 2007 6:10:27 PM

Now I understand what you mean. You are right my friend.

Thanks to DTQ, I have acquired some very good screenshots of the difference between 1600x1200 and 2048x1536; I shall be posting them soon. Since he in all likelihood has a CRT, he does achieve better definition. However, even if this weren't the case, and the pixels were simply expanded so that all images have the same DPI, there is still a larger number of pixels covering the same area. So if I resize the higher-res image to 1600x1200, the image quality is slightly better. The images were not taken with AA turned on.
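
As a sketch of that resize comparison (Pillow in Python, hypothetical file names): shrinking the high-res shot with a good filter is effectively supersampling, which is why it comes out slightly cleaner than the native shot.

from PIL import Image

hi = Image.open("dtq_2048x1536.png")            # hypothetical high-res capture
down = hi.resize((1600, 1200), Image.LANCZOS)   # high-quality downsample
down.save("dtq_down_1600x1200.png")             # compare against the native 1600x1200 shot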

This helps me come to the conclusion that the difference between 1680x1050 and 1920x1200 on the 22" and 24" screens will probably be very slight.

However, in light of some of the arguments for the 24" provided by other posters, I am leaning heavily towards getting a 24". The higher-quality panel and HD capabilities with the extra inputs really make it a better investment; the two extra inches are a little bonus thrown in.

I am in all likelihood getting a BenQ FP241W.