Is there any company, or even any researchers, working on pushing beyond the "standard" resolution-to-size ratios? For the past 10 or so years, monitors have all had about the same resolution for a given size: 1280x1024 (5:4) at 17 or 19 inches, 1440x900 or 1680x1050 at 19-22 inches, 1920x1200 at 24, and 2560x1600 at 30. While there's been a shift in aspect ratio, first from 5:4 to 16:10 and now to 16:9, there hasn't really been any increase in pixel density or sharpness as a result. Even as far back as CRTs, the average monitor had more or less the same pixel pitch. So why hasn't anyone been making higher-pixel-density monitors? Sure, it takes more graphics power to drive higher resolutions and they'd be more expensive, but a higher resolution is a flat increase in image quality, assuming you do things like scale up the fonts so they're still readable. I would think there would be at least a very limited market for super-high-resolution monitors, both among extremely dedicated gamers and among those using a PC for professional or artistic work, but for some reason there just don't seem to be any monitors like this.
It can't be purely a limitation of the technology: a 17-inch 1920x1200 laptop screen has the pixel pitch needed to form a 3840x2400 34-inch screen, and a higher pixel density than would be required for 2560x1600 or 2560x1440 at 24 inches. Admittedly, I'm fairly certain most laptop screens are TN panels, so for those dedicated to a better panel type an increase in resolution may not be worth the drop to TN, but I'd imagine it's at least feasible to increase pixel density on a -VA or IPS panel.
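The pixel-pitch comparison above is easy to sanity-check with a quick calculation (PPI from resolution and diagonal size, using the sizes mentioned above):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

laptop = ppi(1920, 1200, 17)   # ~133 PPI on a 17" 1920x1200 laptop panel
big    = ppi(3840, 2400, 34)   # same pitch: 3840x2400 is exactly 2x per axis
mid    = ppi(2560, 1600, 24)   # ~126 PPI, slightly less dense than the laptop

print(f"laptop={laptop:.1f} 34in={big:.1f} 24in={mid:.1f}")
```

The 17" laptop panel and the hypothetical 34" 3840x2400 panel come out to exactly the same PPI, and both are denser than 2560x1600 at 24".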
Now, I realize that even dual-link DVI can only handle up to 1920x1200 at 120 Hz or 2560x1600 at 60 Hz, so until DisplayPort is more prevalent we aren't going beyond 2560x1600 at all, but even that would be a fairly strong improvement at a screen size smaller than 30 inches.
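Those DVI limits can be roughly estimated from the link bandwidth. Each TMDS link in DVI tops out at a 165 MHz pixel clock, so dual-link gives about 330 Mpixels/s; the ~8% blanking overhead below is an approximation for reduced-blanking timings, not an exact spec figure:

```python
# Rough pixel-clock estimate: active pixels x refresh rate x blanking overhead.
DUAL_LINK_DVI_MPX = 2 * 165e6   # two TMDS links at 165 MHz each

def pixel_clock(w, h, hz, blanking=1.08):
    """Approximate pixel clock in Hz, assuming ~8% reduced-blanking overhead."""
    return w * h * hz * blanking

for w, h, hz in [(1920, 1200, 120), (2560, 1600, 60), (3840, 2400, 60)]:
    clk = pixel_clock(w, h, hz)
    verdict = "fits" if clk <= DUAL_LINK_DVI_MPX else "exceeds dual-link DVI"
    print(f"{w}x{h} @ {hz} Hz ~ {clk / 1e6:.0f} MHz -> {verdict}")
```

By this estimate the two modes mentioned above just fit, while 3840x2400 at 60 Hz needs roughly double what dual-link DVI can carry.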
Does anyone know whether there are technical limitations to creating higher-pixel-density screens, or whether it's really just that companies don't think anyone would buy them?
Edit: I've thought of one disadvantage to a pixel pitch that tight: programs that run in fixed-size windows (although if somebody knows of a per-application magnifier, like I asked about in this thread: http://www.tomshardware.com/forum/241504-49-application..., it might not be such a big deal).
Higher pixel density on a smaller screen generally makes text harder to read, and such monitors would probably fall into an even smaller niche than 30" 2560x1600 monitors.
Sure, it can be done, since high-resolution screens already exist for laptops. However, given the potentially niche market, production output would be small, which usually means higher prices. For example, a 24" 2560x1600 monitor could be created, but it might cost as much to manufacture as a 30" 2560x1600 monitor because of fixed overhead.
As a very simplistic example, say it costs $10 million per year to run an LCD panel production line. If demand for 30" 2560x1600 LCD panels is 25,000 per year, that works out to $400 to manufacture each panel. If demand for 24" 2560x1600 LCD panels is only 10,000 per year, that works out to $1,000 per panel.
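That amortization argument is just the fixed line cost divided by annual volume. A minimal sketch, using the hypothetical figures above:

```python
def cost_per_panel(line_cost, annual_demand):
    """Fixed production-line cost spread over the units made per year."""
    return line_cost / annual_demand

LINE_COST = 10_000_000  # hypothetical $10M/year to run the panel line

print(cost_per_panel(LINE_COST, 25_000))  # 30" panel at 25k units/yr -> 400.0
print(cost_per_panel(LINE_COST, 10_000))  # 24" panel at 10k units/yr -> 1000.0
```

So even though the smaller panel uses less material, the lower volume alone can make it 2.5x as expensive per unit.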
I remember reading in a PC magazine about five years ago about a 19- or 20-inch monitor with a resolution of 10,000 x 10,000 pixels. The monitor had such a tight pixel density that with a magnifying glass you could see details the unaided human eye could not. The technology is out there; however, it is not as easy to implement as you might think.
I'll list a few reasons:
-additional cost; users wouldn't want to pay a higher price
-most text and some programs have fixed sizes, so you'd have to use magnification, which defeats the purpose
-for any fullscreen gaming or video you would need much more powerful internal hardware
-consumer video maxes out at 1080p; only the original pre-production footage is higher resolution. What media would you store such video on? I don't feel like buying a six-disc Blu-ray collection for one movie
-quite a few people's eyes couldn't fully take advantage of such a massive gain in pixel density. Even on my old 20" 1600x1200, I had to get right up to the screen to see the pixels
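The storage point above can be put into rough numbers. This sketch assumes encoded bitrate scales linearly with pixel count, a ~30 Mbit/s 1080p Blu-ray stream, and 50 GB dual-layer discs; all three figures are back-of-envelope assumptions, not spec values:

```python
import math

def discs_needed(w, h, hours=2, base_mbps=30, disc_gb=50):
    """Estimate discs for a movie, scaling a 1080p bitrate by pixel count."""
    scale = (w * h) / (1920 * 1080)                  # pixels relative to 1080p
    gbytes = base_mbps * scale * hours * 3600 / 8 / 1000
    return math.ceil(gbytes / disc_gb)

print(discs_needed(1920, 1080))   # a 2-hour 1080p movie fits on 1 disc
print(discs_needed(3840, 2400))   # the same movie at 3840x2400 -> 3 discs
```

Not quite a six-disc set under these assumptions, but still several discs per movie, which supports the point about storage.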
Basically, the consensus seems to be that it's too expensive and that applications just aren't developed for it.
I wouldn't think it would be that much more expensive to produce, say, a 24-inch panel at 2560x1600, considering the cost of laptop panels, but I suppose I don't know for sure. It certainly wouldn't cost much more than the $2000+ that some high-end color-critical monitors of that size go for. I guess there's already a dedicated but very small market for color-critical work, while higher resolution is just a useful but nonspecific improvement; it isn't required by anyone, so the market is too small for even a single model.
The popularity of 16:9 over 16:10 alone shows that in many cases a generally more useful feature is trumped by a more popular one. The same tendency to stick with what's expected probably applies here, at least in part.
I know from experience that there's a visible difference between 1920x1200 at 17 and 24 inches (in particular, I never needed antialiasing on the 17-inch screen), but while the larger screen isn't as crisp, it isn't noticeably bad either, and that's probably the main reason higher pixel density isn't very popular. It's much like why TN panels are the most popular: they're cheaper than IPS/VA, and the benefit of the alternative isn't all that noticeable to a casual observer.
On the other hand, video formats alone have gone from analog to ~1280x720 to 1920x1080 in recent years, and while 800x600 wasn't bad at first, hardly anyone uses it as a primary resolution anymore. Maybe once video formats, hardware capabilities, and program design start to catch up to the resolutions we have now, it will be more commercially feasible for companies to produce higher-PPI, lower-pixel-pitch monitors.
In the meantime, if someone were really desperate for higher pixel density, they could just mount four 10-to-15-inch laptop panels in an array and handle the requisite electronics somehow.
I've had an idea along these lines: if a monitor had a really high internal resolution, like 10240x6400, it could scale most lower resolutions to exact multiples (I figure as long as the image takes up at least 90% of the screen in one direction, it won't feel too small). Scaling to an exact multiple of the input resolution can be done quickly because it doesn't require pixel interpolation, which I think is the main reason non-native resolutions look bad on LCD monitors, though it would negate the effects of subpixel font smoothing. However, if the monitor had a special input (probably USB) that allowed the computer to send text and font data separately, it could do its own text scaling, which would look a lot better than the video signal normally allows (at least for vector-based fonts; raster fonts might not look any better).
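The integer-scaling part of this idea can be sketched quickly: for each input resolution, find the largest whole-number multiple that fits the hypothetical 10240x6400 panel, then check how much of the screen it covers along its fuller axis:

```python
# Hypothetical native panel resolution from the idea above.
NATIVE_W, NATIVE_H = 10240, 6400

def integer_scale(w, h):
    """Largest integer scale factor that fits, and the resulting coverage
    along whichever axis is filled more (the 90% rule checks this value)."""
    factor = min(NATIVE_W // w, NATIVE_H // h)
    coverage = max(factor * w / NATIVE_W, factor * h / NATIVE_H)
    return factor, coverage

for res in [(1920, 1200), (2560, 1600), (1280, 800), (1024, 768)]:
    f, cov = integer_scale(*res)
    print(f"{res[0]}x{res[1]}: {f}x scaling covers {cov:.0%} of the panel")
```

For instance, 2560x1600 scales 4x to fill the panel exactly, and 1920x1200 scales 5x to about 94% coverage, clearing the 90% threshold, so no interpolation would be needed for either.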
Edit: For that matter, it should already be possible to write software that makes programs think the resolution is lower (e.g., 1920x1200 on my 2560x1600 monitor) for purposes of raster operations, while still drawing vector-based fonts at the full resolution.
By material costs alone, you are correct that a smaller screen would cost less. However, factor in the special equipment required to create such screens, plus development and marketing costs, and you're looking at a pretty large number. I cannot remember offhand, but five years ago the price was at least $5,000.
One cannot fault major manufacturers for pursuing current trends; businesses are out to make money first and foremost, and they have to in order to stay alive. That said, once Blu-ray (1080p) is about as common in homes as the DVD player, there will be a push toward the next step in media. Once a new format (and a way to store it) is created, we will see such products in every store.