Why is high resolution so desirable?

Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

I keep on seeing posts with comments that imply that such-and-such
(monitor, card, whatever) is better because it can display at a higher
resolution, etc.

I can accept that there are situations where the higher resolutions,
which result in smaller content on the screen, are an advantage in that
one is then able to fit more on the screen, and this is definitely
useful at times (but this has nothing to do with quality). But other
than that, what is the big deal about higher resolutions?

This leads me to wonder about the following: is there any difference
between viewing an image/DVD at a resolution of a x b, and viewing the
same image at a higher resolution and magnifying it using the
application's zoom software so that the size is now the same as that
under a x b? I have been unable to see any obvious difference, but then
again I haven't been able to do side-by-side tests (single monitor).

Thanks for any response.
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

Wow, that is a lot of information, Bob, and, to tell you the truth, I
will have to re-read it slowly if I'm going to digest all the details.
I will do that, at my pace.

Let me say here that I almost included the word "misnomer" in my
original post, with reference to the term "resolution", but I refrained
from doing so because in a way we ARE talking about resolution: we
are changing the number of pixels within fixed screen dimensions,
hence we are changing resolution. I am questioning the benefits of doing
so.

You make reference to what the eye can resolve, under some conditions.
What I fail to see (sorry) is this: taking a specific image, say a 5cm
square on a given monitor at a specific resolution setting, what is
the benefit of displaying that image at a higher resolution if the
result is going to be smaller? Are we going to be able to discern more
detail? This is not the same as a photographic lens being capable of
greater resolution, because the benefits of this higher resolution
would show up in larger prints, otherwise there is no point. If you
are going to make small prints you don't need a lens that can resolve
minute detail. Yes, the hardware is providing us with more pixels per
cm/inch, but the IMAGE is not being displayed using more pixels. Not
only that, but the image is now smaller, possibly to the point of being
too small to view comfortably.

I can't help but suspect that everybody is chasing "higher resolution"
without knowing why they are doing so.

Thanks for your response.

Bill
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

Many gamers usually strive to
run at 1600x1200 because it creates a cleaner edge around objects
without resorting to anti-aliasing.

Not being a gamer, I'd have no appreciation for this, but fair enough.
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

I think the more mainstream way you're seeing it involves higher
resolutions that give a cleaner edge. Many gamers usually strive to
run at 1600x1200 because it creates a cleaner edge around objects
without resorting to anti-aliasing.

--
Cory "Shinnokxz" Hansen - http://www.coryhansen.com
Life is journey, not a destination. So stop running.
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

"bxf" <bill@topman.net> wrote in message
news:1118848795.170276.266490@f14g2000cwb.googlegroups.com...
> I keep on seeing posts with comments that imply that such-and-such
> (monitor, card, whatever) is better because it can display at a higher
> resolution, etc.
>
> I can accept that there are situations where the higher resolutions,
> which result in smaller content on the screen, are an advantage in that
> one is then able to fit more on the screen, and this is definitely
> useful at times (but this has nothing to do with quality). But other
> than that, what is the big deal about higher resolutions?

You've hit on a very good point, but to cover it adequately I'm
first going to have to (once again) clarify exactly what we mean
by the often-misused word "resolution."

In the proper usage of the word (and, by the way, how you
most often see it used with respect to such things as printers
and scanners), "resolution" is that spec which tells you how
much detail you can resolve per unit distance - in other
words, if we're really talking about "resolution," you should
be seeing numbers like "dots per inch" or "pixels per visual
degree" or some such. Simply having more pixels is not always
a good thing - you have to first be able to actually resolve them
on the display in question (not generally a problem for fixed-
format displays such as LCDs, if run in their native mode) AND
you need to be able to resolve them visually. That last bit
means that the number of pixels you really need depends on
how big the display (or more correctly, the image itself) will
be, and how far away you'll be when you're viewing it.

The human eye can resolve up to about 50 or 60 cycles
per visual degree - meaning for each degree of angular
distance as measured from the viewing point, you can't
distinguish more than about 100-120 pixels (assuming
those pixels are being used to present parallel black-and
-white lines, which would make for 50-60 line pairs or
"cycles"). Actually, human vision isn't quite this good
under many circumstances (and is definitely not this good
in terms of color, as opposed to just black-and-white
details), but assuming that you can see details down to
a level of about one cycle per minute of angle is often used
as a rule-of-thumb limit.

This says that to see how much resolution you need, and
therefore how many pixels in the image, you figure the
display size, what visual angle that appears to be within
the visual field at the desired distance, and apply this
limit. Let's say you have a 27" TV that you're watching
from 8 feet away. A 27" TV presents an image that's
about 15.5" tall, and if you're 8 feet (96 inches) away,
then the visual angle this represents is:

2 x arctan(7.75/96) = 9.2 degrees

At the 60 cycles/degree limit, you can therefore visually
resolve not more than about 550 line pairs, or roughly 1100
pixels. Anything more than this would be wasted, and
even this, again, should be viewed as an upper limit -
your "color resolution" (the spatial acuity of the eye in
terms of color differences) is nowhere near this good.
In terms of pixel formats, then, an image using
the standard 1280 x 1024 format would be just about as
good as you'd ever need to be at this size and distance.
Note that a 15.5" image height is also what you get from
roughly a 32" 16:9 screen, so the HDTV standard
1920 x 1080 format is just about ideal for that size and
distance (and an 8' distance may be a little close for
a lot of TV viewing).
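
As a rough sketch of the arithmetic above (Python used purely for
illustration; the 15.5" image height, 8-foot distance and 60
cycles/degree limit are the figures from the post):

import math

def resolvable_pixels(image_height_in, viewing_distance_in, cycles_per_degree=60):
    """Upper bound on vertically resolvable pixels for an image of the given
    height viewed from the given distance, per the acuity rule of thumb."""
    # visual angle subtended by the image height, in degrees
    angle_deg = 2 * math.degrees(math.atan((image_height_in / 2) / viewing_distance_in))
    line_pairs = angle_deg * cycles_per_degree    # resolvable cycles (line pairs)
    return angle_deg, line_pairs, 2 * line_pairs  # two pixels per line pair

# 27" TV with about 15.5" of visible image height, viewed from 8 feet
angle, pairs, pixels = resolvable_pixels(15.5, 8 * 12)
print(round(angle, 1), round(pairs), round(pixels))   # -> 9.2 554 1108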

However, this again is the absolute upper limit imposed by
vision. A more reasonable, practical goal, in terms of
creating an image that appears to be "high resolution" (and
beyond which we start to see diminishing returns in terms of
added pixels) is about half the 60 cycles/degree figure, or
somewhere around 30. This means that for the above-mentioned
27" TV at 8', the standard 480- or 576-line TV formats,
IF fully resolved (which many TV sets do not do), are actually
pretty good matches to the "practical" goal, and the higher-
resolution HDTV formats probably don't make a lot of
sense until you're dealing with larger screens.

At typical desktop monitor sizes and distances, of course,
you can resolve a much greater number of pixels; from perhaps
2' or so from the screen, you might want up to about 300
pixels per inch before you'd say that you really couldn't use
any more. That's comfortably beyond the capability of most
current displays (which are right around 100-120 ppi), but
again, this is the absolute upper limit. Shooting for around
150-200 ppi is probably a very reasonable goal in terms of
how much resolution we could actually use in practice on
most desktop displays. More than this, and it simply won't
be worth the cost and complexity of adding the extra pixels.
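
A companion sketch for the desktop case (same rule-of-thumb limit; the
24-inch distance stands in for the "perhaps 2 feet or so" mentioned
above):

import math

def max_useful_ppi(viewing_distance_in, cycles_per_degree=60):
    """Approximate upper limit on useful pixels per inch at a given distance."""
    inches_per_degree = 2 * viewing_distance_in * math.tan(math.radians(0.5))
    return 2 * cycles_per_degree / inches_per_degree  # two pixels per cycle

print(round(max_useful_ppi(24)))   # -> 286, i.e. roughly the "about 300 ppi" figure
print(round(max_useful_ppi(96)))   # -> 72; far less is needed from across the room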


> This leads me to wonder about the following: is there any difference
> between viewing an image/DVD at a resolution of a x b, and viewing the
> same image at a higher resolution and magnifying it using the
> application's zoom software so that the size is now the same as that
> under a x b?

No, no difference. In terms of resolution (in the proper sense
per the above, pixels per inch) the two are absolutely identical.

Bob M.
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

bxf wrote:

> Wow, that is a lot of information, Bob, and, to tell you the truth, I
> will have to re-read it slowly if I'm going to digest all the details.
> I will do that, at my pace.
>
> Let me say here that I almost included the word "misnomer" in my
> original post, with reference to the term "resolution", but I refrained
> from doing so because in a way we ARE talking about resolution: we
> are changing the number of pixels within fixed screen dimensions,
> hence we are changing resolution. I am questioning the benefits of doing
> so.
>
> You make reference to what the eye can resolve, under some conditions.
> What I fail to see (sorry) is this: taking a specific image, say a 5cm
> square on a given monitor at a specific resolution setting, what is
> the benefit of displaying that image at a higher resolution if the
> result is going to be smaller?

One adjusts other settings so that feature size is the same. This can cause
other difficulties however.

> Are we going to be able to discern more
> detail? This is not the same as a photographic lens being capable of
> greater resolution, because the benefits of this higher resolution
> would show up in larger prints, otherwise there is no point. If you
> are going to make small prints you don't need a lens that can resolve
> minute detail. Yes, the hardware is providing us with more pixels per
> cm/inch, but the IMAGE is not being displayed using more pixels. Not
> only that, but the image is now smaller, possibly to the point of being
> too small to view comfortably.
>
> I can't help but suspect that everybody is chasing "higher resolution"
> without knowing why they are doing so.
>
> Thanks for your response.
>
> Bill

--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

J. Clarke wrote:
> bxf wrote:
>
> >
> > You make reference to what the eye can resolve, under some conditions.
> > What I fail to see (sorry) is this: taking a specific image, say a 5cm
> > square on a given monitor at a specific resolution setting, what is
> > the benefit of displaying that image at a higher resolution if the
> > result is going to be smaller?
>
> One adjusts other settings so that feature size is the same. This can cause
> other difficulties however.

Hence my last question in the original post: what is the difference
between an image of a certain viewing size (dictated by the monitor
resolution), and the same image, viewed under higher resolution
settings and therefore a smaller image on the screen, all other things
being equal), but magnified by the application (or "other settings", as
you put it)?

Simplistically, this is how I see the situation: we have an image of A
x B pixels. If we view it under monitor resolution settings of say, 800
x 600, we will see an image of a certain size, which depends on the
monitor in use. If we change the resolution to 1600 x 1200, we are
halving the size of each monitor pixel, and the image will be half the
size that it was at 800 x 600. If we now tell the application to double
the size of the image, the application must interpolate, so that each
pixel in the original image will now be represented by four monitor
pixels. This would not result in increased image quality, and it
requires that the application do some CPU work which it didn't have
to do when the monitor was at the lower resolution setting.
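
A toy illustration of the pixel replication just described (a sketch in
plain Python; nearest-neighbour doubling is assumed as the simplest
case, and real applications typically use bilinear or bicubic filters
instead):

def upscale_2x_nearest(image):
    """Double a 2-D list of pixel values by simple pixel replication."""
    out = []
    for row in image:
        doubled = [p for p in row for _ in range(2)]  # repeat each pixel horizontally
        out.append(doubled)
        out.append(list(doubled))                     # repeat the row vertically
    return out

src = [[10, 20],
       [30, 40]]
for row in upscale_2x_nearest(src):
    print(row)
# each source pixel now occupies a 2x2 block of monitor pixels:
# [10, 10, 20, 20]
# [10, 10, 20, 20]
# [30, 30, 40, 40]
# [30, 30, 40, 40]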

So the question becomes one of comparing the quality obtained with
large monitor pixels vs the quality when using smaller pixels plus
interpolation. And, we can throw in the fact that, by having it
interpolate, we are forcing the CPU to do more work.

Any thoughts on this? Am I failing to take something into
consideration?
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

J. Clarke wrote:
> bxf wrote:
>

> Not magnified. Font size, icon size, etc adjusted at the system level, so
> things are the same size but sharper.

How do these apply if I'm viewing an image with a graphics program or
watching a DVD?
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

Bob Myers wrote:
> "bxf" <bill@topman.net> wrote in message
> news:1118925982.351383.175100@g49g2000cwa.googlegroups.com...
> > Hence my last question in the original post: what is the difference
> > between an image of a certain viewing size (dictated by the monitor
> > resolution), and the same image, viewed under higher resolution
> > settings and therefore a smaller image on the screen, all other things
> > being equal), but magnified by the application (or "other settings", as
> > you put it)?
>
> Here we again run into confusion problems between "resolution"
> as is commonly used here and the term in its technically proper
> sense - but the bottom line is that a given object rendered at
> a specific resolution (in terms of PPI) looks the same no matter
> how many pixels are in the complete image (i.e., the full screen)
> or how large that screen is. In other words, if you have an image
> of, say, an apple appearing on your display, and that apple appears
> 3" tall and at 100 ppi resolution (meaning the the apple itself is
> about 300 pixels tall), nothing else matters.

It's funny. Although I understand what you say and can clearly see its
obvious validity, I still find myself failing to understand how it
relates to the following:

If I have an image obtained from a digital camera, for example, that
image consists of a fixed number of pixels. If I want to see that image
on my screen at some convenient size, I can accomplish that in two
ways: I can set my monitor's "resolution" to values which more-or-less
yield that convenient size, or I can tell the application to manipulate
the image so that it is now displayed at this size.

If I use the latter technique, the application must discard pixels if
it is to make the image smaller, or, using some form of interpolation,
add pixels in order to make it larger. In either case there is image
degradation (never mind whether or not we can discern that
degradation), and hence my attempt to understand how this degradation
compares to the effect of viewing an unmanipulated image using larger
monitor pixels.
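
The degradation described above can be illustrated with a deliberately
crude round trip (a sketch in plain Python, no imaging library assumed;
real editors use better resampling filters, but they still cannot
recover discarded pixels):

def shrink_half(row):
    """Downscale a row of pixels by discarding every other sample."""
    return row[::2]

def enlarge_double(row):
    """Upscale a row of pixels by replicating each sample."""
    return [p for p in row for _ in range(2)]

original = [10, 200, 12, 198, 11, 201, 13, 199]     # fine alternating detail
round_trip = enlarge_double(shrink_half(original))
print(round_trip)  # -> [10, 10, 12, 12, 11, 11, 13, 13]: the alternation is gone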

To tell you the truth, I can't help but feel that I'm confusing issues
here, but I don't know what they are. For example, I cannot imagine any
video player interpolating stuff on the fly in order to obey a given
zoom request.

Where is my thinking going off track?
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

bxf wrote:

>
> J. Clarke wrote:
>> bxf wrote:
>>
>> >
>> > You make reference to what the eye can resolve, under some conditions.
>> > What I fail to see (sorry) is this: taking a specific image, say a 5cm
>> > square on a given monitor at a specific resolution setting, what is
>> > the benefit of displaying that image at a higher resolution if the
>> > result is going to be smaller?
>>
>> One adjusts other settings so that feature size is the same. This can
>> cause other difficulties however.
>
> Hence my last question in the original post: what is the difference
> between an image of a certain viewing size (dictated by the monitor
> resolution), and the same image, viewed under higher resolution
> settings and therefore a smaller image on the screen, all other things
> being equal), but magnified by the application (or "other settings", as
> you put it)?

Not magnified. Font size, icon size, etc adjusted at the system level, so
things are the same size but sharper.

> Simplistically, this is how I see the situation: we have an image of A
> x B pixels. If we view it under monitor resolution settings of say, 800
> x 600, we will see an image of a certain size, which depends on the
> monitor in use. If we change the resolution to 1600 x 1200, we are
> halving the size of each monitor pixel, and the image will be half the
> size that it was at 800 x 600. If we now tell the application to double
> the size of the image, the application must interpolate, so that each
> pixel in the original image will now be represented by four monitor
> pixels. This would not result in increased image quality, and it
> requires that the application do some CPU work which it didn't have
> to do when the monitor was at the lower resolution setting.
>
> So the question becomes one of comparing the quality obtained with
> large monitor pixels vs the quality when using smaller pixels plus
> interpolation. And, we can throw in the fact that, by having it
> interpolate, we are forcing the CPU to do more work.
>
> Any thoughts on this? Am I failing to take something into
> consideration?

--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

As far as monitors are concerned - as long as the display quality is
there - then for viewing digital photos, I would say the higher the
resolution the better. You are displaying more pixels per inch when the
desktop resolution is increased, which your eyes/brain can easily
resolve.

Just consider that a monitor might display an average of 72 dpi,
whereas the guidelines for printing a 4"x6" digital photo recommend a
dpi value of around 300.
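
Terence's two figures, worked through as a quick sketch (the 300 dpi
print guideline and the roughly 72 dpi screen are the numbers from the
post above):

def pixels_needed(width_in, height_in, dpi):
    """Pixel dimensions required to cover a given physical size at a given dpi."""
    return round(width_in * dpi), round(height_in * dpi)

print(pixels_needed(6, 4, 300))  # -> (1800, 1200): about 2.2 MP for a 4"x6" print
print(pixels_needed(6, 4, 72))   # -> (432, 288): what ~72 dpi of screen covers at that size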
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

Terence wrote:
> As far as monitors are concerned - as long as the display quality is
> there - then for viewing digital photos, I would say the higher the
> resolution the better. You are displaying more pixels per inch when the
> desktop resolution is increased, which your eyes/brain can easily
> resolve.

You are confusing issues, Terence. The more pixels you have in an
image, the better quality you can obtain when printing the image. But
this has nothing to do with monitor "resolution", where the image gets
smaller as you increase the "resolution" value.
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

The point I was trying to make was that you can view a larger portion
of an image (viewed at full size) when increasing monitor resolution -
which is usually more desirable when performing image-editing work. I
do understand that whatever the desktop resolution may be set at has
nothing to do with the original quality of the image.
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

chrisv wrote:
> bxf wrote:
>
> >Many gamers usually strive to
> >run at 1600x1200 because it creates a cleaner edge around objects
> >without resorting to anti-aliasing.
> >
> >Not being a gamer, I'd have no appreciation for this, but fair enough.
>
> No offense, but you really need to learn how to quote properly. Your
> last post should have looked something like the below:
>
>
>
> Shinnokxz wrote:
>
> >I think the more mainstream way you're seeing it involves higher
> >resolutions that give a cleaner edge. Many gamers usually strive to
> >run at 1600x1200 because it creates a cleaner edge around objects
> >without resorting to anti-aliasing.
>
> Not being a gamer, I'd have no appreciation for this, but fair enough.

So how's this? In fact, I was just told in another post (after
enquiring) how this is done.

Cheers.
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

Terence wrote:
> The point I was trying to make was that you can view a larger portion
> of an image (viewed at full size) when increasing monitor resolution -
> which is usually more desirable when performing image-editing work. I
> do understand that whatever the desktop resolution may be set at has
> nothing to do with the original quality of the image.

OK, I understand you better now. In fact I make this point in my
original post. Of course, this does not preclude the need to magnify
the image sometimes, simply so that small areas don't become impossible
to edit with any precision.
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

bxf wrote:

>Many gamers usually strive to
>run at 1600x1200 because it creates a cleaner edge around objects
>without resorting to anti-aliasing.
>
>Not being a gamer, I'd have no appreciation for this, but fair enough.

No offense, but you really need to learn how to quote properly. Your
last post should have looked something like the below:



Shinnokxz wrote:

>I think the more mainstream way you're seeing it involves higher
>resolutions that give a cleaner edge. Many gamers usually strive to
>run at 1600x1200 because it creates a cleaner edge around objects
>without resorting to anti-aliasing.

Not being a gamer, I'd have no appreciation for this, but fair enough.
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

bxf wrote:

>chrisv wrote:
>>
>> No offense, but you really need to learn how to quote properly. Your
>> last post should have looked something like the below:
>
>So How's this? In fact, I was just told (after enquiring) how this is
>done in another post.
>
>Cheers.

Much better. 8)
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

....
> How do these apply if I'm viewing an image with a graphics program or
> watching a DVD?
>

As long as the number of pixels in the source image is less than the number
being displayed, increased resolution doesn't buy you anything when viewing
the image. If the number of pixels in the source image is greater than the
current display setting, then a higher display "resolution" will improve the
picture because more of the source pixels can be represented.

For example:
Your screen is set at 800x600 = 480,000 pixels = 0.48MegaPixels.
You have a digital camera that takes a 2MegaPixel picture = 1600x1200.
You will only be able to see about 1/4 of the pixels in the picture if you
display the picture full screen. However, your buddy has a "high resolution"
monitor capable of 1600x1200 pixels. When he views the picture full
screen, he will see it in all its glory. }:) Now given that 4, 5, 6, and even
8 MP cameras are common today, you can see why higher resolutions
can be convenient for displaying and working with digital images.

In the case of a DVD, the picture is something like 852x480 (16:9 widescreen).
Your 800x600 display will display nearly all the information in every
frame of the DVD. On your buddy's system, either the picture will be smaller,
or interpolated to a larger size (likely causing a small amount of degradation).
You might argue that a screen setting just large enough to display a complete
852x480 window gives the best results for watching a movie.

That's fine for DVD, but what if you want to watch your HD antenna/dish/cable
feed? Then you might want 1280x720, or even 1920x1080 to see all the detail
in the picture.

And this doesn't even begin to consider the issue of viewing multiple
images/windows/etc. at a time.

In summary, you are probably correct that "high resolution" isn't necessary for
DVD watching, but it certainly is useful for a lot of other things.
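
A rough sketch of the comparison above, reduced to pixel counting (the
display settings and source formats are the ones mentioned in the
post):

def fit_report(source, display):
    """How much of a source image's pixel data a given screen can show at once."""
    sw, sh = source
    dw, dh = display
    fraction = min(1.0, (dw * dh) / (sw * sh))
    print(f"{sw}x{sh} source on {dw}x{dh} screen: "
          f"at most {fraction:.0%} of the source pixels visible at once")

fit_report((1600, 1200), (800, 600))    # 2 MP photo, 800x600 desktop: 25%
fit_report((1600, 1200), (1600, 1200))  # same photo, 1600x1200 desktop: 100%
fit_report((720, 480), (800, 600))      # a DVD frame fits entirely at 800x600
fit_report((1920, 1080), (800, 600))    # an HD frame needs a higher setting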


--
Dan (Woj...) [dmaster](no space)[at](no space)[lucent](no space)[dot](no
space)[com]
===============================
"I play the piano no more running honey
This time to the sky I'll sing if clouds don't hear me
To the sun I'll cry and even if I'm blinded
I'll try moon gazer because with you I'm stronger"
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

"bxf" <bill@topman.net> wrote in message
news:1118925982.351383.175100@g49g2000cwa.googlegroups.com...
> Hence my last question in the original post: what is the difference
> between an image of a certain viewing size (dictated by the monitor
> resolution), and the same image, viewed under higher resolution
> settings and therefore a smaller image on the screen, all other things
> being equal), but magnified by the application (or "other settings", as
> you put it)?

Here we again run into confusion problems between "resolution"
as is commonly used here and the term in its technically proper
sense - but the bottom line is that a given object rendered at
a specific resolution (in terms of PPI) looks the same no matter
how many pixels are in the complete image (i.e., the full screen)
or how large that screen is. In other words, if you have an image
of, say, an apple appearing on your display, and that apple appears
3" tall and at 100 ppi resolution (meaning the the apple itself is
about 300 pixels tall), nothing else matters.

In the example you are talking about, though, the apple's image is NOT
necessarily "at a higher resolution" in a perceptual sense of the term.
You have more pixels in the entire display, but the same number in
the apple - making the apple smaller. Whether or not this LOOKS
better depends on just where the two cases were in terms of the spatial
acuity curve of the eye. If the smaller-but-same-number-of-pixels
version now has the pixels sufficiently smaller that you're past the
acuity limit, all the detail might still be there but it's useless -
your eye can't resolve it, and so you do not perceive it as being at
the same level of "detail" or "quality". This is why, for instance, it
would be pretty silly to be talking about something like a
2048 x 1536 display on a PDA - you can't possibly, in normal use, be
seeing such a thing from a small enough distance to really make use of
all those pixels.

> Simplistically, this is how I see the situation: we have an image of A
> x B pixels. If we view it under monitor resolution settings of say, 800
> x 600, we will see an image of a certain size, which depends on the
> monitor in use. If we change the resolution to 1600 x 1200, we are
> halving the size of each monitor pixel, and the image will be half the
> size that it was at 800 x 600. If we now tell the application to double
> the size of the image, the application must interpolate, so that each
> pixel in the original image will now be represented by four monitor
> pixels. This would not result in increased image quality, and it
> requires that the application do some CPU work which it didn't have
> to do when the monitor was at the lower resolution setting.

And this is the problem with rendering images in terms of a fixed
number of pixels, rather than adapting to the available display
resolution (in terms of pixels per inch) and holding the image
physical size constant. Systems which do this are just fine as long
as all displays tend to have the same resolution (again in ppi, which
has been true for computer monitors for some time - right around
100 ppi has been the norm), but as we see more different display
technologies and sizes in the market, offering a much wider
range of resolutions (50 - 300 ppi is certainly possible already),
this model breaks.
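
A small sketch of the resolution-independent approach described above
(the 3-inch apple and the 50-300 ppi range come from these posts; the
function itself is hypothetical, for illustration only):

def pixels_for(physical_inches, display_ppi):
    """Pixel count a ppi-aware renderer uses to keep an object the same physical size."""
    return round(physical_inches * display_ppi)

apple_height_in = 3.0
for ppi in (50, 100, 200, 300):
    print(ppi, "ppi:", pixels_for(apple_height_in, ppi), "px tall")
# a fixed-pixel renderer would draw 300 px everywhere instead, so the apple
# would appear 6" tall on a 50 ppi screen and only 1" tall at 300 ppi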


Bob M.
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

bxf wrote:

>
> J. Clarke wrote:
>> bxf wrote:
>>
>
>> Not magnified. Font size, icon size, etc adjusted at the system level,
>> so things are the same size but sharper.
>
> How do these apply if I'm viewing an image with a graphics program or
> watching a DVD?

Depends on the image. If it's 100x100 then you don't gain anything, if it's
3000x3000 then you can see more of it at full size or have to reduce it
less to see the entire image.

For DVD there's not any practical benefit if the display resolution is
higher than the DVD standard, which is 720x480.

--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

Dan Wojciechowski wrote:
> ...
> > How do these apply if I'm viewing an image with a graphics program or
> > watching a DVD?
> >
>
> As long as the number of pixels in the source image is less than the number
> being displayed, increased resolution doesn't buy you anything when viewing
> the image. If the number of pixels in the source image is greater than the
> current display setting, then a higher display "resolution" will improve the
> picture because more of the source pixels can be represented.
>
> For example:
> Your screen is set at 800x600 = 480,000 pixels = 0.48MegaPixels.
> You have a digital camera that takes a 2MegaPixel picture = 1600x1200.
> You will only be able to see about 1/4 of the pixels in the picture if you
> display the picture full screen. However, your buddy has a "high resolution"
> monitor capable of 1600x1200 pixels. When he views the picture full
> screen, he will see it in all its glory. }:) Now given that 4, 5, 6, and even
> 8 MP cameras are common today, you can see why higher resolutions
> can be convenient for displaying and working with digital images.

Sorry Dan, the above is incorrect.

If you view a large image on a screen set to 800x600, you will see only
a portion of the image. If you view the same image with a 1600x1200
setting, the image will be smaller and you will see a larger portion of
it. That's all. There's nothing here that implies better detail. The
image may appear SHARPER at 1600x1200, but that is simply because the
detail is smaller, just like small TV screens look sharper than large
ones.

> In the case of a DVD, the picture is something like 852x480 (16:9 widescreen).
> Your 800x600 display will display nearly all the information in every
> frame of the DVD. On your buddy's system, either the picture will be smaller,
> or interpolated to a larger size (likely causing a small amount of degradation).
> You might argue that a screen setting just large enough to display a complete
> 852x480 window gives the best results for watching a movie.

Well, this makes sense to me, and I'm trying to confirm that I'm
understanding things correctly. In addition to less degradation, there
should also be less CPU overhead, due to the absence of interpolation.

> That's fine for DVD, but what if you want to watch your HD antenna/dish/cable
> feed? Then you might want 1280x720, or even 1920x1080 to see all the detail
> in the picture.

Once again, the monitor setting does not improve the detail you can
see. If your IMAGE is larger (e.g. 1920x1080 vs 1280x720), THEN you are
able to see more detail. But this is not related to your MONITOR
setting, which is only going to determine the size of the image and
hence what portion of it you can see.
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

J. Clarke wrote:
> bxf wrote:
>
> >
> > J. Clarke wrote:
> >> bxf wrote:
> >>
> >
> >> Not magnified. Font size, icon size, etc adjusted at the system level,
> >> so things are the same size but sharper.
> >
> > How do these apply if I'm viewing an image with a graphics program or
> > watching a DVD?
>
> Depends on the image. If it's 100x100 then you don't gain anything, if it's
> 3000x3000 then you can see more of it at full size or have to reduce it
> less to see the entire image.

OK, but this is not a quality issue. You view the image at a size that
is appropriate for your purposes.

If I'm photoediting an image, I need to see a certain level of detail
in order to work. That means that, on any given monitor, I must have
the image presented to me at a size that is convenient for my intended
editing function. Does it matter whether this convenient size is
achieved by adjusting monitor "resolution" or by interpolation (either
by the video system or by the application)? If the "resolution" setting
is low, then I would ask the application to magnify the image, say,
20x, whereas at a higher "resolution" setting I may find it appropriate
to have the application magnify the image 40x (my numbers are
arbitrary). Is there a difference in the end result?
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

In addition to the above we have the question of the larger pixels, but
I don't know how to fit that into the equation.
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

J. Clarke wrote:

> If, at 40x, the feature size in
> the image is still smaller than the pixel size, then you gain from
> the higher res. If not then you don't.

I believe this statement is relevant, but I need to know what you mean
by "feature size". Also, by "pixel size" do you mean the physical size
of the pixel on the monitor?

> If you're used to low resolution and you change to high resolution then you
> may not notice much difference. But when you go back to low-res you almost
> certainly will.

While I'm writing as if I believe that "resolution" setting makes no
difference to the image we see, I am in fact aware that this is not the
case. I know that at low settings the image looks coarse.
 
Guest
Archived from groups: comp.sys.ibm.pc.hardware.video

bxf wrote:

>
> J. Clarke wrote:
>> bxf wrote:
>>
>> >
>> > J. Clarke wrote:
>> >> bxf wrote:
>> >>
>> >
>> >> Not magnified. Font size, icon size, etc adjusted at the system
>> >> level, so things are the same size but sharper.
>> >
>> > How do these apply if I'm viewing an image with a graphics program or
>> > watching a DVD?
>>
>> Depends on the image. If it's 100x100 then you don't gain anything, if
>> it's 3000x3000 then you can see more of it at full size or have to reduce
>> it less to see the entire image.
>
> OK, but this is not a quality issue. You view the image at a size that
> is appropriate for your purposes.
>
> If I'm photoediting an image, I need to see a certain level of detail
> in order to work. That means that, on any given monitor, I must have
> the image presented to me at a size that is convenient for my intended
> editing function. Does it matter whether this convenient size is
> achieved by adjusting monitor "resolution" or by interpolation (either
> by the video system or by the application)? If the "resolution" setting
> is low, then I would ask the application to magnify the image, say,
> 20x, whereas at a higher "resolution" setting I may find it appropriate
> to have the application magnify the image 40x (my numbers are
> arbitrary). Is there a difference in the end result?

Suppose your monitor could display only one pixel. How much more useful to you
would a monitor that can display two pixels be? How about four? See where
I'm going?

If the application magnifies 40x, whether you get a benefit from higher
resolution or not depends again on the image size. If, at 40x, the feature
size in the image is still smaller than the pixel size, then you gain from
the higher res. If not then you don't. One thing you do gain if you use
the default settings for font size and whatnot is that there is more
available screen area to display your image and less of it taken up by
menus and the like.
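
One way to read the "feature size vs. pixel size" test above, as a
sketch under the assumption that the image is held at a fixed physical
size on screen (the 100x100 and 3000x3000 examples are from earlier in
the thread; the 4-inch size and the ppi values are made up for
illustration):

def higher_res_helps(source_px, on_screen_inches, low_ppi=100, high_ppi=200):
    """With the image held at a fixed physical size, does a finer display show more?"""
    low_display_px = on_screen_inches * low_ppi    # display pixels across the image
    high_display_px = on_screen_inches * high_ppi
    if source_px <= low_display_px:
        return "no gain: the coarser display already shows every source pixel"
    if source_px <= high_display_px:
        return "gain: only the finer display shows every source pixel"
    return "partial gain: even the finer display cannot show every source pixel"

print(higher_res_helps(100, 4))    # small 100x100 image: no gain
print(higher_res_helps(600, 4))    # gain from the finer display
print(higher_res_helps(3000, 4))   # big 3000x3000 image: still pixel-limited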

If you're used to low resolution and you change to high resolution then you
may not notice much difference. But when you go back to low-res you almost
certainly will.

--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)