1680x1050 vs 1920x1080
I have had a dual 20" 1680x1050 and 1600x900 monitor setup for quite a few years now, and I was wondering if I should upgrade to a single 24 inch 1080p monitor and then use my 1680x1050 monitor as a secondary display. Would the extra image quality (if any) be worth the price, or should I wait till the next new HD standard comes out since 1080p has been around for a while now? I use my PC for gaming and video editing, so would there be a noticeable quality/performance difference?
There is UHD in the works: Ultra HD at 7680x4320 and 3840x2160 resolutions. It will probably take a very long time for that to migrate over to HDTVs. Even today there is not really enough bandwidth to broadcast programs at 1920x1080 without a lot of video compression, and the more compression you use, the worse the video looks. If you have ever streamed a movie from Netflix and compared the quality to the same movie played back on Blu-Ray, you should notice a difference. Plus some people complain about lag, because the movie stops playing when it is not being streamed fast enough.
Internet speeds have come a long way from the old 56.6kbps standard of ye ol' phone modems, but they still have a very long way to go for even the lower end of UHD, because the higher the resolution, the more bandwidth you need. A 3840x2160 image has 4 times the number of pixels of a 1920x1080 image: about 8.3 million pixels vs. about 2.1 million.
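If you want to double-check that pixel math, it's just arithmetic:

```python
# Pixel counts for the two resolutions discussed above.
uhd = 3840 * 2160    # lower-end UHD
hd = 1920 * 1080     # full HD

print(uhd)           # 8294400 pixels, about 8.3 million
print(hd)            # 2073600 pixels, about 2.1 million
print(uhd / hd)      # exactly 4.0
```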
DVD movies basically have a resolution of 720x480 and are stored on 8.5GB discs. Blu-Ray movies are basically 1920x1080 and are stored on 50GB discs. Blu-Ray resolution is 6 times greater than DVD resolution, and the storage capacity of a Blu-Ray disc is almost 6 times greater than a DVD's (about 5.9 times). The lower end of UHD, 3840x2160, is 4 times greater than HD as stated above, which would mean movies would need to be stored on "Ultra Blu-Ray" discs with a capacity of around 200GB.
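The same kind of quick check works for the DVD vs. Blu-Ray ratios:

```python
# Resolution and capacity ratios cited above (pure arithmetic).
dvd = 720 * 480        # 345,600 pixels
bluray = 1920 * 1080   # 2,073,600 pixels

print(bluray / dvd)    # 6.0, so Blu-Ray has exactly 6x the pixels
print(50 / 8.5)        # about 5.88, so roughly 5.9x the disc capacity
```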
I'm not exactly sure, but I would guess that Netflix compresses a 2-hour movie down to around 1GB. A Blu-Ray movie would probably be compressed to around 6GB, so a 2-hour "Ultra Blu-Ray" movie would probably be around 24GB. I'm not sure many people can download 24GB of data in two hours. It's going to be a very, very long time before that kind of bandwidth is common, if ever.
Right now it is technically possible to stream an "Ultra Blu-Ray" movie, but you would need something like a T3 line, which costs more or less $4,000 per month. A T3 line can transfer 44,736Kbps. That's kilobits per second, which works out to about 5.6MBps (megabytes per second). At maximum speed it would take around 1.2 hours to download 24GB of data, so it should be fast enough to stream an "Ultra Blu-Ray" movie in real time. But since Netflix would also need to pay for T3 lines on their end, don't expect your movie streaming subscription to stay at $8 per month; it could easily jump to over $100 per month. Also add the $4,000 monthly charge for your own personal T3 line.
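Here's the back-of-the-envelope math, assuming the ~24GB movie size guessed above and decimal units throughout:

```python
# Rough streaming math for the hypothetical "Ultra Blu-Ray" numbers above.
t3_kbps = 44_736                       # T3 line rate in kilobits per second
t3_bytes_per_sec = t3_kbps * 1000 / 8  # about 5.59 million bytes/s (5.6MBps)

movie_bytes = 24 * 1e9                 # guessed 24GB movie, decimal gigabytes

hours = movie_bytes / t3_bytes_per_sec / 3600
print(round(hours, 2))                 # about 1.19 hours, under the 2-hour runtime
```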
Will UHD resolution trickle down to PC monitors and consumer TVs? The likely answer is, "Yes, eventually". But you may be talking 10 years from now before the above-average consumer can afford one; I'm guessing that's when the price of a UHD monitor or TV can be brought down to around $6,000 (excluding inflation).
Also, consumer-level internet bandwidth will need to be expanded. I believe typical cable service download speeds are limited to around 5Mbps, or 10Mbps if you choose a higher monthly subscription plan. That's 5 megabits per second for the average cable internet subscriber, which translates to roughly 0.6MBps (megabytes per second). That's a far cry from a T3 line at about 5.6MBps.
Generally speaking, the higher the resolution, the more detail there will be, so increasing the resolution can make a game look a little better. The downside is that you get less performance out of your current video card because it is pushing more pixels. For example, if your current video card gives you 50FPS @ 1680x1050, it may drop down to around 40FPS @ 1920x1200.
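That 50FPS-to-40FPS figure is just a rough estimate assuming frame rate scales inversely with pixel count; real games rarely scale that cleanly, but the estimate works out like this:

```python
# Back-of-the-envelope FPS estimate, assuming frame rate is inversely
# proportional to the number of pixels pushed (a simplification).
old_pixels = 1680 * 1050   # 1,764,000
new_pixels = 1920 * 1200   # 2,304,000

old_fps = 50
est_fps = old_fps * old_pixels / new_pixels
print(round(est_fps, 1))   # about 38.3, in line with the ~40FPS guess above
```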
Also, you may find that when switching from a 16:10 aspect ratio monitor (1680x1050 or 1920x1200) to a 16:9 aspect ratio monitor (1920x1080), the field of view in a game may change. It depends on the game; each one handles it differently. I can't find the review with my StarCraft 2 example, which had screenshots of how the field of view changes between aspect ratios, so I'll try to describe it.
On a 16:9, 1920x1080 monitor you get a wide, zoomed-out view of the battlefield, as expected. While a 16:10, 1920x1200 monitor actually has a larger viewing area (because of the extra 120 rows of pixels), the field of view is actually a bit smaller, because the view is not zoomed out as much as on a 16:9 monitor. It's kind of ironic that on the monitor with the higher resolution you see less of the battlefield, since you are zoomed in closer.
For gaming, maybe... like I said, the higher-resolution monitor will display a little more detail. Does it profoundly improve graphics? No. It's hard for me to say, 'cause I went from a 1280x1024 monitor to a 1920x1200 monitor, which is a vast difference. Maybe you have a friend with a 1920x1080 or 1920x1200 monitor, so you can see if there is any noticeable difference. Load games on your PC and take screenshots. Load the same games on your friend's PC and take screenshots of the same scenes. Compare the screenshots.
For productivity, it will be an improvement, because the higher the resolution, the more desktop space you have. So for video editing it can come in handy in the right situation. For example, say you encoded two videos at 900x675 with different settings and you want to compare their quality at the same time. On a 1680x1050 monitor you can't really do that. If you place the two videos side by side (excluding the borders around the video players) they need 1800x675, which is too wide to fit on the screen. If you place one video above the other, it still won't fit, because the combined resolution would be 900x1350. A 1920x1080 or 1920x1200 monitor will let you display both videos side by side on a single screen.
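You can sketch that fit check as a little function (player borders ignored, as in the example above):

```python
# Do two copies of a video fit on a monitor, side by side or stacked?
def fits(monitor_w, monitor_h, video_w=900, video_h=675):
    side_by_side = 2 * video_w <= monitor_w and video_h <= monitor_h
    stacked = video_w <= monitor_w and 2 * video_h <= monitor_h
    return side_by_side or stacked

print(fits(1680, 1050))   # False: 1800 is too wide, 1350 is too tall
print(fits(1920, 1080))   # True: 1800x675 fits side by side
print(fits(1920, 1200))   # True
```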
I assume you do not use spreadsheets like Excel, but if you did, the higher resolution would also let you see more columns and rows on screen at the same time. Less scrolling around a large spreadsheet means more productivity.
The thing is that I have TWO 20 inch monitors, one at 1680x1050 and one at 1600x900. If I wanted to compare two videos I could use one screen for each which would be leaps and bounds better than two images on a single 1920x1080 monitor. So I am asking not from a productivity standpoint, but purely from an image quality perspective.
I recently added a 1920x1200 24" to my 1680x1050 22". The 22" is in portrait mode for documents and reading - it's amazing to have so much vertical room for vertical content. Not so good for proofing videos, so maybe that's irrelevant.
I would not wait for the next standard.
Is 3 monitors/eyefinity an option?
I don't know what you mean by "extra image quality".
I've never set up 3 monitors so I'm not the one to ask, but it might require either a GPU upgrade or a second GPU.
you could browse this thread for ideas: http://hardforum.com/showthread.php?t=862341&highlight=1920x1200&page=1009
and read up on eyefinity for info on adding a third monitor.