Single 5770, going to 1920 x 1200, would appreciate some advice

September 13, 2010 4:43:31 PM

Hello everyone,

I just recently upgraded my system with an i5-760, an MSI P55-GD80, and 4 GB of Corsair XMS3. I kept my single ATI 5770 from before the upgrades, which I've been happy with at my current resolution of 1440x900, and with the i5 upgrade the fps have been that much better.

However, I'd like to push this new system a little harder, and I'd like to start with a bigger monitor; the popular resolution lately seems to be 1920x1200. If I were to get a monitor at this resolution, I know it would strain the 5770 on higher settings. I'm not looking for the best possible settings in games, and I don't want to spend $600 on a GPU; been there, done that. The value play, to me, seems to be getting another 5770 and running CrossFire. In many benchmarks at 1920x1200, however, I see fps dipping below 40 and even 30 (honestly I'm not that picky; I can't even remember if I can notice the difference between 40 and 60 fps).

My question is, what do you all suggest? The value play, spending only about $130 more for 5870-level performance? My mobo supports both SLI and CrossFire; it actually has three PCIe slots, but the third runs at x4. I suppose I could sell the 5770 (I've never sold my past parts, so that would be new for me) and end up spending around the same out of pocket for a 5870, which I could CrossFire later if needed, or go with Nvidia instead.

Man, it seems the market is more complicated now than it was 4 years ago; the performance and value gaps are pretty tight. Thanks for the suggestions.
jyjjy
September 13, 2010 5:00:27 PM

1920x1080 (AKA 1080p) is actually far more common. You'll need to spend at least $100 more for a 1920x1200 monitor.
At 1080p the HD5770 is passable but not great. It will have issues maxing out the more intensive current games. I would just go ahead and get the monitor and try it for yourself before deciding whether you need a new card (or a second HD5770 for CrossFire).
drums101
September 13, 2010 6:47:32 PM

The great thing about monitors is that you don't have to play at the native resolution. If you end up buying a 1080p monitor, which is indeed much more common, and you determine that your 5770 is not up to snuff at that res, you can always lower the resolution until you decide what to do, which is either get an all-new card or just add a second 5770. I would suggest getting a second 5770 and putting it in the x4 slot... I don't see the x4 really bottlenecking a 5770 too badly, and if it does, it won't be by much. I wouldn't try to sell your 5770 and upgrade to a 5870. Whenever you try to sell old parts you always take a big hit; you never get a comparable amount back.
jyjjy
September 13, 2010 6:56:42 PM

Running an LCD monitor at a non-native resolution is a bad idea and will result in poor image quality, unless you don't scale the image up to the screen size. In that case you won't be using the whole monitor and will have black bars on the top/bottom/sides, but the image will be fine.
September 13, 2010 7:22:09 PM

drums101 said:
The great thing about monitors is that you don't have to play at the native resolution. If you end up buying a 1080p monitor, which is indeed much more common, and you determine that your 5770 is not up to snuff at that res, you can always lower the resolution until you decide what to do, which is either get an all-new card or just add a second 5770. I would suggest getting a second 5770 and putting it in the x4 slot... I don't see the x4 really bottlenecking a 5770 too badly, and if it does, it won't be by much. I wouldn't try to sell your 5770 and upgrade to a 5870. Whenever you try to sell old parts you always take a big hit; you never get a comparable amount back.



This mobo actually has two x16 slots that run at x8/x8 in CrossFire and a third PCIe 2.0 slot that runs at x4 when the other two are in use.
September 13, 2010 7:23:56 PM

Does the extra resolution of 1920x1200 offer much more fidelity, or is it an aspect ratio thing?
jaguarskx
September 13, 2010 7:34:18 PM

Playing a game at less than native resolution will not result in poor image quality. It will result in slightly less sharp images. Generally speaking, the lower the resolution, the lower the image quality will be.

Image quality will generally be determined by the LCD panel technology and the internal electronics of the LCD monitor.

Below is an example of the game Anno 1404 on an Acer S243HLbmii monitor:


[Screenshots: the native resolution of 1920 x 1080, then 1680 x 1050, 1440 x 900, 1280 x 720, and finally 1280 x 1024]

Here is the review for the graphics...
http://www.prad.de/en/monitore/review/2010/review-acer-...

Text on the other hand is a bit more obvious....

[Screenshots of text: the native resolution of 1920 x 1080, then 1680 x 1050, 1440 x 900, and 1280 x 1024]

Here's the review for the text...
http://www.prad.de/en/monitore/review/2010/review-acer-...

Other monitors will have varying results. Some will look better, others will look worse. The Acer is simply an example.
jyjjy
September 13, 2010 7:37:14 PM

It is an aspect ratio thing, 16:9 vs 16:10. Basically it adds 11% more screen to the top compared to 1080p. This is a nice thing for most purposes outside of watching widescreen video.
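
For anyone who wants to check that 11% figure, here is a quick back-of-the-envelope calculation in Python (purely illustrative arithmetic, not tied to any particular card or game):

# Compare 1920x1200 (16:10) with 1920x1080 (16:9).
width, rows_1200, rows_1080 = 1920, 1200, 1080

extra_screen = (rows_1200 - rows_1080) / rows_1080             # 120 extra rows -> ~11% more vertical screen
extra_pixels = (width * rows_1200) / (width * rows_1080) - 1   # ~11% more pixels to render per frame

print(f"{extra_screen:.1%} more vertical screen, {extra_pixels:.1%} more pixels per frame")
# -> 11.1% more vertical screen, 11.1% more pixels per frame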
jaguarskx
September 13, 2010 7:44:29 PM

I prefer 1920 x 1200 over 1920 x 1080 because I like having the extra 120 rows so I can see more data on the screen. Also, at the time I bought my NEC and Planar monitors, the high-end monitors only came in a 16:10 aspect ratio.

1920 x 1080 is a subset of the 1920 x 1200 resolution. However, in some games 1920 x 1080 mode gives you a wider field of view than 1920 x 1200 because the graphics are a little zoomed out. For example, in StarCraft 2 (I don't know for sure since I don't play the game), a resolution of 1920 x 1080 might let you see more of the map than 1920 x 1200 because the graphics are zoomed out, meaning everything you see is smaller.

jaguarskx
September 13, 2010 7:48:43 PM

Watching widescreen movies on a 16:9 monitor / HDTV can still result in black borders at the top and bottom of the screen. There are at least 7 different aspect ratios used in the film industry. The three most popular are 16:9 (or 1.78:1), 2.35:1 and 2.40:1.

Obviously, anything narrower than 16:9 can result in vertical black borders on the left and right sides, and anything wider than 16:9 will result in horizontal borders at the top and bottom of the screen.
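
To put rough numbers on those borders, here is a small Python sketch of the letterbox arithmetic for a 1920x1080 screen (illustrative only; real players round to even pixel counts and may crop slightly):

# Black-bar height when showing wider-than-16:9 film ratios on a 1920x1080 screen.
screen_w, screen_h = 1920, 1080

for name, ratio in [("16:9", 16 / 9), ("2.35:1", 2.35), ("2.40:1", 2.40)]:
    picture_h = round(screen_w / ratio)      # rows of the screen the picture actually uses
    bar = (screen_h - picture_h) // 2        # black border at the top and again at the bottom
    print(f"{name}: picture {picture_h} px tall, about {bar} px of black bar top and bottom")
# -> 16:9: 1080 px, no bars; 2.35:1: ~817 px, ~131 px bars; 2.40:1: 800 px, 140 px bars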
jyjjy
September 13, 2010 7:58:04 PM

jaguarskx said:
Playing a game at less than native resolution will not result in poor image quality. It will result in slightly less sharp images. Generally speaking, the lower the resolution, the lower the image quality will be.

No, it goes beyond this. What you are saying applied to CRT monitors, as they did not really have a native resolution. For LCD monitors the native resolution is the actual number of display elements that make up the screen. Using a lower resolution and scaling it up to the screen means there simply will not be the appropriate number of screen elements to display the image properly without pixel distortion (some pixels being larger than others in ways that aren't appropriate). When you run, let's say, 1600x900 on a 1920x1080 screen, then 320 pixel columns must be twice as wide as they should be and 180 rows must be doubled up vertically. This is actually clearly noticeable in some of the images you linked above. As I said, you need to either use the native res or not scale the image to the screen to avoid this.
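
A rough sketch of the arithmetic behind that pixel-doubling claim, using simple nearest-neighbour scaling (an assumption for illustration; real monitors usually interpolate, which blurs rather than hard-doubles pixels):

# Map 1600 source columns onto 1920 screen columns with nearest-neighbour scaling
# and count how many source columns end up drawn wider than the others.
from collections import Counter

src_w, dst_w = 1600, 1920
source_for_screen_col = [x * src_w // dst_w for x in range(dst_w)]  # which source column each screen column shows
widths = Counter(source_for_screen_col)                             # screen columns used per source column

doubled = sum(1 for w in widths.values() if w == 2)
normal = sum(1 for w in widths.values() if w == 1)
print(f"{doubled} source columns are drawn 2 px wide, {normal} are drawn 1 px wide")
# -> 320 columns drawn twice as wide as the rest (the same idea gives 180 doubled rows out of 900)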
jaguarskx
September 13, 2010 8:06:59 PM

jyjjy said:
No, it goes beyond this. What you are saying applied to CRT monitors, as they did not really have a native resolution. For LCD monitors the native resolution is the actual number of display elements that make up the screen.


Take a look at the screenshots I provided in my post above. While there is a decrease in sharpness due to interpolation resulting from using less than native resolution, it is not as terrible as you describe.
September 13, 2010 8:08:46 PM

Why can't we all get along and come up with one standard ratio? Really, do we need 10 different options when 1 or even 2 would work? When you have to deal with the black bars on the sides or top it's too much. Just think of how much we paid extra to get that widescreen TV or monitor, only to have someone waste it with a mis-sized video.
jyjjy
September 13, 2010 8:09:59 PM

That's not a "decrease in sharpness." A decrease in sharpness is naturally to be expected from a lower resolution, while I'm referring to an actual distortion of the presented image, which is a different thing entirely.
jaguarskx
September 13, 2010 8:13:24 PM

Additionally, PS3 and Xbox 360 games are not rendered at 1920 x 1080 (to the best of my knowledge) on an HDTV or PC monitor. Those games are generally rendered at 720p and then scaled up to fit 1920 x 1080.

I don't really hear too many complaints about poor graphics quality from console gamers, even though every one of them is looking at stretched (interpolated) graphics.
jyjjy
September 13, 2010 8:18:21 PM

So why do people even bother with high-end video cards? Should we all just be using an HD5750, which is fantastic for 720p, and stretching the image? The visual fidelity simply is not the same; the entire video card industry is basically based on this. Also, I'm not sure what you think screenshots shrunk down to 500x334 prove, but they are entirely inadequate.
jaguarskx
September 13, 2010 8:19:22 PM

jyjjy said:
That's not a "decrease in sharpness." A decrease in sharpness is naturally to be expected from a lower resolution, while I'm referring to an actual distortion of the presented image, which is a different thing entirely.


Yes, agreed.

I used "decrease in sharpness" to avoid getting into more technical aspects of image quality, such as artifacts resulting from the distortion of pixel placement on the screen.

Again, look at the screenshots of the various resolutions I posted from the Acer LCD monitor in my post above. Are the images so terrible that you cannot bear to look at them?
jyjjy
September 13, 2010 8:22:02 PM

500x334 images... ?
September 13, 2010 8:30:00 PM

You can definitely go down a notch in resolution without losing much quality. I started playing NFS Shift at 1680x1050 instead of 1080p by mistake, and I liked it better. I play in cockpit view, and that view felt more immersive at that resolution. Sure, you are going to lose some sharpness, but not much. The extreme would be using 800x600 on a 1920x1080 monitor.
But it's definitely a workable, viable option for the gamer.
Twoboxer
September 13, 2010 8:34:50 PM

1920x1200 requires the CPU/GPU to pump 11% more pixels per unit of time than 1920x1080.

All else being equal, the images have the potential for sharper resolution vertically. The horizontal resolution, however, remains unchanged.

Your own eyes may not notice the difference in any PC game.

In all uses, text will appear (11%) smaller at 1920x1200 than at 1920x1080, which may or may not be an issue for you. Windows and games (sometimes, via UI adjustments) can compensate for this. In Windows this sometimes causes display issues. Finally, your eyes may still be good enough to find the smaller text desirable (more info on screen) . . . or they may not.

1920x1200 does give you the option of running 1920x1080, subject perhaps to the issues discussed thoroughly above.
jyjjy
September 13, 2010 8:36:55 PM

If you accidentally used 1680x1050 on a 1080p monitor, you were also using a different aspect ratio (16:10), which was then squished down to 16:9, which is likely what made it feel different.
jaguarskx
September 13, 2010 8:37:27 PM

jyjjy said:
So why do people even bother with high-end video cards? Should we all just be using an HD5750, which is fantastic for 720p, and stretching the image? The visual fidelity simply is not the same; the entire video card industry is basically based on this. Also, I'm not sure what you think screenshots shrunk down to 500x334 prove, but they are entirely inadequate.



Because there are those people out there who prefer to play at native resolution rather than having to stretch a lower resolution to fill the screen (like you and like me).

However, not everyone can afford to spend a lot of money on a video card. That leaves them the option of lowering graphics quality and/or playing at a lower resolution.

Simply put, using less than the native resolution and stretching the graphics to fill the screen does not automatically result in poor graphics quality. Just less than optimal. However, as stated before, the greater the difference between the native resolution and the resolution used, the worse the image quality will become.

You need to think beyond what is best for you and consider what is best for others based on whatever their limitations are, such as the $130 budget limitation that the OP stated.
September 13, 2010 8:41:08 PM

Also, I've read posts from more than one lucky guy who owns one of those 30" or 27" Apple panels, and they don't always play their games at the full 25xx resolution.
It's a matter of a higher resolution looking better, but never at the expense of watching a slide show.
jyjjy
September 13, 2010 8:42:15 PM

Twoboxer said:
1920x1200 does give you the option of running 1920x1080, subject perhaps to the issues discussed thoroughly above.

Actually it can run 1920x1080 just fine, with small bars on the top and bottom. If you stretch it to fill the screen, then you get more than the pixel distortion discussed above; the whole image is changed, with everything slightly skinnier/taller than it should be. I'm sure everyone has seen this at some point, back when 4:3 TVs were the norm; frequently, older movies would be stretched to fill out the screen.
jyjjy
September 13, 2010 9:03:33 PM

jaguarskx said:
You need to think beyond what is best for you and consider what is best for others based on whatever their limitations are, such as the $130 budget limitation that the OP stated.

He is talking about CrossFiring his HD5770 with that money... I don't see why the question of lowering the resolution should ever come into play, given the issues associated with it.
Twoboxer
September 13, 2010 9:33:32 PM

Yeah, j, I personally agree. I just didn't want to get embroiled in all that other discussion lol . . . that's why I said "subject perhaps".
September 13, 2010 9:42:49 PM

Well, he is also debating getting a newer, bigger monitor, using its native resolution as a factor for needing more power.
Anyway, getting a nicer monitor is always a rewarding experience, even if you have to compromise settings to game on it. Then, unless you're broke, you will have more reasons to upgrade your GPU power. That's how it played out for me :)
I had one 4770 with a Gigabyte SLI/CrossFire x8/x8 board.
I went with the GTX 460.
Newegg stopping sales on it and my wanting DX11 were the deciding factors.
September 13, 2010 10:01:29 PM

This is becoming a very informative thread :) .

I'm just wondering what the best play is for the money if I were to get a bigger, nicer monitor at 1080p-1200 resolution. It still seems getting another 5770 might be best; perhaps they will drop in price when the 6xxx series comes out. Anyone think otherwise?
jyjjy
September 13, 2010 10:33:10 PM

Yeah, I would get the monitor when you see a deal you like, but for a possible card upgrade you might as well wait a bit. Now really isn't the best time to buy a card. Over the next few months the HD6000 series, as well as some more Fermi cards, should be released. A single HD5770 will be passable even at 1920x1200. It won't be ideal, but it should be serviceable in the meantime while the new cards are released and the prices on others respond. After that, come back and ask again; probably November or so would be a good time to upgrade.
September 13, 2010 10:43:40 PM

PowerColor released a new 1 GB 5770 that's single-slot. I believe the cooler must have a nice chunk of copper; it's effective. The point is, Guru3D did some 2-, 3-, and 4-way CrossFire testing with it. The results showed some good numbers even with 3-way; 4-way almost never showed gains except in synthetics. Good read: http://www.guru3d.com/article/powercolor-radeon-5770-si...
spentshells
September 13, 2010 11:08:35 PM

Yeah, get a 460.

A 5770 isn't really enough for 1080p. I have this issue now; I wish I had gone bigger.


Oops, I should have read everything. Play at the native resolution, which should be fine with two 5770s.
September 14, 2010 1:48:47 AM

spentshells said:
Yeah, get a 460.

A 5770 isn't really enough for 1080p. I have this issue now; I wish I had gone bigger.


Oops, I should have read everything. Play at the native resolution, which should be fine with two 5770s.


Are you saying you wish you had gone with the 460 as opposed to two 5770s, or did you just think I was talking about one 5770?
spentshells
September 14, 2010 3:14:43 AM

That is absolutely right... but I just couldn't wait the 7 months, is all.

But for the record, two 5770s in CrossFire will fill that screen nicely.

I have a single 5770 at 1080p. It does the trick, still better than consoles, but you know, I wish I had the extra grunt for AA.

Best solution

September 14, 2010 4:24:45 AM

The answer to the initial question should be: yes, get another 5770 to run in CrossFire! That will give you the best bang for your buck, and you won't have to worry about stuffing around selling your old card; you won't get your money back from it anyway. (From what I've seen, most people selling second-hand 5770s are asking around the same price a new one costs; if I'm paying more than 3/4 of what a new one is worth, I'd just get a new one.) It will run that screen resolution fine. And by the time the two 5770s get outdated, there will be much better cards available.
September 14, 2010 3:02:15 PM

Best answer selected by pensar.
Mousemonkey
September 14, 2010 7:09:31 PM

This topic has been closed by Mousemonkey