Retina display query?

firstrig

Honorable
Dec 17, 2013
140
0
10,710
Aside from the marketing jargon about high pixel density and fitting more pixels into a relatively smaller area, doesn't a Retina display violate the fundamental definition of a pixel?

A pixel is a fundamental unit which has its own dimensions. A 1920 x 1080 image is basically digital data which represents, say, a 20 inch x 14 inch image in terms of absolute dimensions, notwithstanding the actual size of your monitor. This means that if we had a hypothetical monitor with dimensions of 20 x 14 inches, it would display a 1920 x 1080 image such that each pixel in the image has a physical pixel on the monitor representing it. When we get a bigger monitor, this density is diluted, and I can fathom that concept, but not the reverse. How can we concentrate an image beyond the bare minimum size it needs to occupy to represent an image of the given resolution, as Retina claims to do?

Now, taking that into consideration, how does one fit 'more pixels' into a given area, when each individual pixel represents a unique piece of data and needs to occupy a certain amount of space for it to be called a pixel?

Taking my above example, fitting more pixels means keeping the resolution of the image intact but displaying it on a smaller screen, say 10 x 7 inches. Or using 1440p on a 20 x 14 inch screen to squeeze more resolution into the same area. So how do these extra pixels find a place, as opposed to the former case where we had a 1:1 representation for each pixel? Does it mean that the Retina display uses a smaller fundamental unit than what we conventionally know as a pixel?
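For concreteness, pixel density (PPI, pixels per inch) is just pixel count divided by physical size, so the same 1920 x 1080 frame can land on panels of very different densities. A quick sketch of the arithmetic in Python (the 13.3 inch, 2560 x 1600 'Retina'-class panel is an assumed example for illustration, not a figure from this thread):

    import math

    def ppi(width_px, height_px, diagonal_in):
        # diagonal resolution in pixels divided by diagonal size in inches
        return math.hypot(width_px, height_px) / diagonal_in

    # The hypothetical 20 x 14 inch panel from the question
    print(round(ppi(1920, 1080, math.hypot(20, 14))))   # ~90 PPI

    # An assumed 13.3 inch 'Retina'-class panel at 2560 x 1600
    print(round(ppi(2560, 1600, 13.3)))                 # ~227 PPI

The hypothetical 20 x 14 inch panel works out to roughly 90 PPI, while the assumed Retina-class panel is around 227 PPI: the same kind of pixel, just physically smaller.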
 

Interocitor2

Reputable
Apr 29, 2014
68
0
4,660
I don't really get what you are on about. There is no "reference pixel" with a specific physical size. Pixels can be as big as a fist or as small as a fraction of a millimetre.
A 1080p signal will be displayed on a 1440p monitor by interpolating the 'missing' pixels. A 1080p signal generally won't work on a 720p device unless the device can downscale it. A picture taken at 5000x2500 will be rescaled on the fly to be displayed on a 1080p screen. On computers the output signal can be changed: 1024x768, 1440x900, 1920x1080, etc.
The GPU of a Mac with a Retina display (I don't know the exact resolution) will output exactly the number of pixels the monitor can display.
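To make 'interpolating the missing pixels' concrete, here is a minimal sketch of the simplest possible scaler, nearest-neighbour upscaling, in Python. Real monitor and GPU scalers use bilinear or better filtering, but the principle is the same: the extra physical pixels are filled in from the source data, no physical pixel size changes.

    # Nearest-neighbour upscale: each target pixel copies the nearest
    # source pixel, so e.g. a 1920x1080 frame can fill a 2560x1440 panel.
    def upscale(src, src_w, src_h, dst_w, dst_h):
        """src is a flat list of src_w * src_h pixel values."""
        dst = []
        for y in range(dst_h):
            sy = y * src_h // dst_h          # nearest source row
            for x in range(dst_w):
                sx = x * src_w // dst_w      # nearest source column
                dst.append(src[sy * src_w + sx])
        return dst

    # Toy example: a 2x2 'image' stretched to 4x4.
    tiny = [1, 2,
            3, 4]
    print(upscale(tiny, 2, 2, 4, 4))
    # -> [1, 1, 2, 2, 1, 1, 2, 2, 3, 3, 4, 4, 3, 3, 4, 4]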
 
Solution