That doesn't make any sense....I'm typing from a 17" 1920 x 1200.
How can a monitor be "too small for the rez"? All it means is that the picture is more accurate: they've jammed more pixels in per inch to produce a higher-quality image with less "graininess". There's a reason you don't see many manufacturers jamming more pixels in per inch... because it costs more money to produce a better product.
Most printers are 300 x 300 print resolution. So I guess my 600 x 600 printer is too big for 8.5 x 11 paper? No, all it means is that the 600 x 600 printer puts down 4 times as many dots as a 300 x 300 on the same square inch of paper, producing a more accurate image. Look at two 22" monitors side by side; all things being equal, the one with the higher resolution will have the more accurate image.
1920 x 1200 resolution is available on screens from 15" to 27". At 27" you can easily count the pixels, as they are readily visible to the naked eye. As the screen gets smaller, the pixels get smaller and smaller. In this respect, 96 dpi can be considered the functional equivalent of 30 fps: below 30 fps most people can see flicker; below 96 dpi most people can easily see the individual pixels.
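To see how pixel density changes with screen size, here's a quick back-of-the-envelope sketch (my own illustration, not from any monitor spec sheet) that computes pixels per inch for a 1920 x 1200 panel at various diagonal sizes:

```python
import math

def ppi(diagonal_inches, h_pixels=1920, v_pixels=1200):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    diagonal_pixels = math.hypot(h_pixels, v_pixels)  # ~2264 px diagonally
    return diagonal_pixels / diagonal_inches

for size in (15, 17, 22, 24, 27):
    print(f'{size}" -> {ppi(size):.0f} PPI')
```

Running this, the 24" comes out around 94 PPI and the 27" drops to about 84 PPI, which is why the pixels become visible on the bigger screen.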
Let's compare how a face covering 1" x 1" of screen real estate is defined on a 24" TN screen versus a 22" S-PVA model:
24" screen: 94 dots x 94 dots x 6 bits x 3 colors = 159,048 bits
22" screen: 106 dots x 106 dots x 8 bits x 3 colors = 269,664 bits
The 22" screen places about 70% more data within that 1" x 1" space than the 24". In other words, the accuracy of the 24" is only about 59% of the 22" model's.
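The arithmetic above can be sketched in a few lines (the 94 and 106 PPI figures, and the 6-bit vs. 8-bit panel depths, are the same assumptions used in the comparison):

```python
def bits_per_square_inch(ppi, bits_per_channel, channels=3):
    """Data contained in a 1" x 1" patch of screen."""
    return ppi * ppi * bits_per_channel * channels

tn_24 = bits_per_square_inch(94, 6)    # 6-bit TN panel  -> 159,048 bits
pva_22 = bits_per_square_inch(106, 8)  # 8-bit S-PVA panel -> 269,664 bits

print(f"{pva_22 / tn_24 - 1:.0%} more data on the 22\"")   # ~70% more
print(f"24\" carries {tn_24 / pva_22:.0%} of the 22\"'s")  # ~59%
```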
A computer screen is "in your face"; it's not hanging on a wall 10 feet away. Walk up to a 60" 1920 x 1200 screen and see how close you can get... at about 7 feet, the image gets fuzzy and grainy. Now start walking up to smaller and smaller screens and you will note that you can get closer and closer before this occurs. With a computer monitor, we are talking 18" away; most desk returns are only 18" - 24" deep.
Here's what DisplayMate, the makers of the standard benchmark monitor testing software, say about "Dot Pitch":
"The smaller the value the finer and sharper the image can be"
A 22" has a dot pitch of 0.240 mm
A 24" has a dot pitch of 0.270 mm
So all that means is that the 22" model will have a finer and sharper image. The finer the dot pitch, the more accurately text appears on screen, the fewer "jaggies" you see on curved lines, and the more accurate colors are, simply because you are squeezing more dots into a smaller space for more sharpness.
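Dot pitch and pixels per inch are just two ways of stating the same thing; converting one to the other is a one-liner, and it's where the 94 and 106 figures in the earlier comparison come from:

```python
MM_PER_INCH = 25.4

def pitch_to_ppi(dot_pitch_mm):
    """Convert dot pitch (mm between pixels) to pixels per inch."""
    return MM_PER_INCH / dot_pitch_mm

print(f'0.240 mm -> {pitch_to_ppi(0.240):.0f} PPI (22")')  # ~106
print(f'0.270 mm -> {pitch_to_ppi(0.270):.0f} PPI (24")')  # ~94
```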
Look here for more:
http://www.littlepc.com/faq_lcd_technology.htm#dotpitch
"The dot pitch specification for a display monitor tells you how sharp the displayed image can be. The dot pitch is measured in millimeters (mm) and a smaller number means a sharper image. In desk top monitors, common dot pitches are .31mm, .28mm, .27mm, .26mm, and .25mm. Personal computer users will usually want a .28mm or finer"
As you can see, 0.27 mm isn't bad, as it's just under the recommended limit of 0.28 mm... but 0.24 mm is simply better and comes on a higher-quality monitor.