
16 vs 32 bit color on LCD Monitors

Last response: in Graphics & Displays
July 20, 2005 5:27:16 PM

Archived from groups: alt.comp.periphs.videocards.nvidia

Hey everyone, I've got a question.

Since LCD monitors are only capable of 16 bit color, is there any
advantage to running your video card in 32 bit mode?

I know there used to be a considerable difference in 16 vs 32 bit
performance.

So the bottom-line question is: I'm using a GeForce FX 5900 XT and I'm about
to get a 19-inch LCD monitor. Should I switch the card to 16-bit to
improve performance? And will that cause any change in what I see on the
screen?

Pat



Anonymous
July 20, 2005 9:44:16 PM


There is a difference. 16 bit color doesn't look as good on an LCD.
Anonymous
July 21, 2005 11:58:42 PM


Pat wrote:
> Hey everyone, I've got a question.
>
> Since LCD monitors are only capable of 16 bit color, is there any
> advantage to running your video card in 32 bit mode?

LCDs are capable of more than 16-bit color. 8-bit LCDs can do true
32-bit (really 24-bit) color. 6-bit LCDs effectively do 18-bit color and
interpolate it up to 24 bits.
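
The arithmetic behind those figures can be sketched in a few lines of Python. The 5:6:5 split shown for 16-bit mode is the usual desktop layout, though drivers also offered 5:5:5 variants:

```python
# Distinct colors at the bit depths discussed above.
# Desktop "16-bit" color is typically RGB565: 5 bits red, 6 green, 5 blue.

def color_count(r_bits, g_bits, b_bits):
    """Total distinct colors for the given bits per channel."""
    return 2 ** (r_bits + g_bits + b_bits)

def pack_rgb565(r8, g8, b8):
    """Pack an 8-bit-per-channel color into one 16-bit RGB565 word,
    discarding the low bits of each channel (that lost precision is
    why 16-bit gradients band visibly)."""
    return ((r8 >> 3) << 11) | ((g8 >> 2) << 5) | (b8 >> 3)

print(color_count(5, 6, 5))  # 16-bit mode: 65536 colors
print(color_count(8, 8, 8))  # "32-bit" mode: 16777216 (24 bits of color)
print(color_count(6, 6, 6))  # native 6-bit panel: 262144 colors
```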

--
Robert Hancock Saskatoon, SK, Canada
To email, remove "nospam" from hancockr@nospamshaw.ca
Home Page: http://www.roberthancock.com/
Anonymous
July 21, 2005 11:58:43 PM


Interesting how few people complain about 6-bit LCDs now. I remember the
Voodoo3 getting a lot of flak for not being able to display 32-bit color.

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."


"Robert Hancock" <hancockr@nospamshaw.ca> wrote in message
news:S7TDe.8328$5V4.5596@pd7tw3no...
> LCDs are capable of more than 16-bit color. 8-bit LCDs can do true 32-bit
> (really 24-bit) color. 6-bit LCDs effectively do 18-bit color and
> interpolate it up to 24 bits.
>
Anonymous
July 22, 2005 3:48:15 AM


Robert Hancock <hancockr@nospamshaw.ca> wrote:
> Pat wrote:
>> Hey everyone, I've got a question.
>>
>> Since LCD monitors are only capable of 16 bit color, is there any
>> advantage to running your video card in 32 bit mode?
>
> LCDs are capable of more than 16-bit color. 8-bit LCDs can do true
> 32-bit (really 24-bit) color. 6-bit LCDs effectively do 18-bit color
> and interpolate it up to 24 bits.

Also, the number of colours it can handle doesn't equate with the number
of *visible* colours. LCD displays still have a way to go before they
can display as many nuances as a good CRT, or with the same fidelity.

There are also video cards that can do 10+10+10 bits instead of 8+8+8, in
which case a CRT is currently the only choice.
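
One common 10+10+10 arrangement packs the three channels plus two spare bits into a single 32-bit word; the exact layout varies by card and API, so this Python sketch is only illustrative:

```python
# Illustrative 2:10:10:10 packing: 10 bits per color channel plus
# 2 leftover bits in one 32-bit word. Real layouts vary by card and API.

def pack_rgb10(r10, g10, b10, extra2=0):
    """Pack three 10-bit channels (0..1023) into a 32-bit word."""
    assert all(0 <= v <= 1023 for v in (r10, g10, b10))
    return (extra2 << 30) | (r10 << 20) | (g10 << 10) | b10

def unpack_rgb10(word):
    """Recover the three 10-bit channels from a packed word."""
    return ((word >> 20) & 0x3FF, (word >> 10) & 0x3FF, word & 0x3FF)

word = pack_rgb10(1023, 512, 0)
print(hex(word))             # 0x3ff80000
print(unpack_rgb10(word))    # (1023, 512, 0)
```

That gives 2^30 (about 1.07 billion) colors, versus 16.7 million for 8+8+8.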

Regards,
--
*Art
July 22, 2005 5:16:33 AM


First of One wrote:
> Interesting how few people complain about 6-bit LCDs now. I remember
> the Voodoo3 getting a lot of flak for not being able to display
> 32-bit color.

Indeed interesting. But most people simply don't know!

And manufacturers are not very keen to say which panels are used in
the different monitors.

(Fake) response times are all that matter to the average user.

It's in the same box as "the megahertz myth" and "the megapixel myth"... ;) 

Zulu
Anonymous
July 22, 2005 7:41:11 AM


First of One wrote:
> Interesting how few people complain about 6-bit LCDs now. I remember the
> Voodoo3 getting a lot of flak for not being able to display 32-bit color.

Well, it's not quite as bad as that - I believe the displays flicker the
pixels back and forth between the nearest values to approximate up to
the full 8 bits per color. I believe the color rendition is still not as
good though.
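
The flickering trick (frame rate control) can be sketched like this; the pattern below is a naive illustration, not what any particular panel actually does:

```python
# Naive frame-rate-control (FRC) sketch: approximate an 8-bit level on a
# 6-bit panel by alternating between the two nearest 6-bit levels so the
# average over 4 frames matches the target.

def frc_frames(target8, n_frames=4):
    """6-bit levels to show over n_frames for an 8-bit target (0..255)."""
    lo = target8 >> 2               # nearest 6-bit level at or below
    frac = target8 & 0b11           # leftover 2 bits: quarters toward lo+1
    hi = min(lo + 1, 63)            # clamp at the panel's top level
    return [hi if i < frac else lo for i in range(n_frames)]

def perceived8(frames):
    """Time-averaged level, rescaled to the 8-bit range."""
    return sum(f * 4 for f in frames) / len(frames)

frames = frc_frames(130)            # 130 -> 6-bit level 32, remainder 2/4
print(frames)                       # [33, 33, 32, 32]
print(perceived8(frames))           # 130.0
```

The averaging works for mid-range levels; near the top of the range the clamp makes the approximation fall short, so it is not a perfect substitute for a true 8-bit panel.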

--
Robert Hancock Saskatoon, SK, Canada
To email, remove "nospam" from hancockr@nospamshaw.ca
Home Page: http://www.roberthancock.com/
July 23, 2005 12:48:00 AM


On Thu, 21 Jul 2005 23:48:15 -0400, "Arthur Hagen" <art@broomstick.com> wrote:

>Robert Hancock <hancockr@nospamshaw.ca> wrote:
>> Pat wrote:
>>> Hey everyone, I've got a question.
>>>
>>> Since LCD monitors are only capable of 16 bit color, is there any
>>> advantage to running your video card in 32 bit mode?
>>
>> LCDs are capable of more than 16-bit color. 8-bit LCDs can do true
>> 32-bit (really 24-bit) color. 6-bit LCDs effectively do 18-bit color
>> and interpolate it up to 24 bits.
>
>Also, the number of colours it can handle doesn't equate with the number
>of *visible* colours. LCD displays still have a way to go before they
>can display as many nuances as a good CRT, or with the same fidelity.

Bollocks. SONY has a TV that has better colors than a CRT.

>There's also video cards that can do 10+10+10 bits instead of 8+8+8, in
>which case a CRT is currently the only choice.

Is it true SONY stopped making 17" to 19" CRTs?


>Regards,


Time to get the brain implant.
Anonymous
a b C Monitor
July 23, 2005 1:35:34 AM


The good ol' Parhelia... Always wondered under what circumstances 30-bit
color can be useful. Common image formats like BMP, PNG, JPEG, etc. only
store 24 bits. Does TIFF or RAW allow 30-bit?



"Arthur Hagen" <art@broomstick.com> wrote in message
news:Dbpq9v$iim$1@cauldron.broomstick.com...
> There's also video cards that can do 10+10+10 bits instead of 8+8+8, in
> which case a CRT is currently the only choice.
>
> Regards,
> --
> *Art
>
Anonymous
July 23, 2005 3:50:15 AM


First of One <daxinfx@yahoo.com> wrote:
> The good ol. Parhelia... Always wondered under what circumstances
> 30-bit color can be useful. Common image formats like BMP, PNG, JPEG,
> etc. only store 24-bits. Does TIFF or RAW allow 30-bit?

JPEG doesn't store 24 bits as such -- of course, if the source is
24-bit, it can't be better than that, but if the source is higher
quality (like from a scanner) and your JPEG decompressor allows it, you
can surely use higher than 8+8+8. PNG is supposed to be expandable, so
I would be surprised if it can't handle more than 8+8+8.
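
For what it's worth, the PNG specification does allow 16 bits per channel (48-bit truecolor); the bit depth sits in the IHDR chunk and can be read directly. A minimal Python sketch, assuming well-formed data:

```python
# Read the bits-per-channel field from a PNG's IHDR chunk.
# PNG layout: 8-byte signature, then the IHDR chunk
# (4-byte length, b"IHDR", width, height, bit depth, color type, ...).
import struct

def png_bit_depth(data):
    """Bits per channel (1, 2, 4, 8, or 16) from raw PNG bytes."""
    if data[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError("not a PNG")
    _width, _height, depth = struct.unpack(">IIB", data[16:25])
    return depth

# Hand-built header for a 1x1 16-bit truecolor PNG (CRC omitted; enough
# for this check, not a complete valid file):
header = (b"\x89PNG\r\n\x1a\n" + struct.pack(">I", 13) + b"IHDR"
          + struct.pack(">IIBBBBB", 1, 1, 16, 2, 0, 0, 0))
print(png_bit_depth(header))    # 16
```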

Regards,
--
*Art
Anonymous
July 23, 2005 6:28:57 AM


Some scanners use 36 bit (12,12,12) or even 48 bit color (16,16,16). For
these devices image quality is much more important than speed, and
supersampling color allows better downsampling.
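
One way to see the benefit: do an edit (here a simple 2x contrast stretch) at high precision and quantize once at the end, versus working in 8 bits throughout. A rough Python sketch with made-up but representative numbers:

```python
# Banding demo: a 2x contrast stretch done in 16-bit precision and then
# downsampled to 8 bits keeps all 256 output levels; the same stretch
# applied directly to 8-bit data can only land on every other level.

def to8(v16):
    """Downsample a 16-bit sample to 8 bits with rounding."""
    return min(255, (v16 + 128) >> 8)

# Stretch the middle half of the range to the full range (factor of 2).
levels_16bit_path = {to8((v - 16384) * 2) for v in range(16384, 49152)}
levels_8bit_path = {min(255, (v - 64) * 2) for v in range(64, 192)}

print(len(levels_16bit_path))   # 256 - smooth gradient survives
print(len(levels_8bit_path))    # 128 - every other level missing (banding)
```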

Phil Weldon


"First of One" <daxinfx@yahoo.com> wrote in message
news:V_udnYNGx5scAnzfRVn-3Q@rogers.com...
> The good ol. Parhelia... Always wondered under what circumstances 30-bit
> color can be useful. Common image formats like BMP, PNG, JPEG, etc. only
> store 24-bits. Does TIFF or RAW allow 30-bit?
>
> --
> "War is the continuation of politics by other means.
> It can therefore be said that politics is war without
> bloodshed while war is politics with bloodshed."
>
>
> "Arthur Hagen" <art@broomstick.com> wrote in message
> news:Dbpq9v$iim$1@cauldron.broomstick.com...
>> There's also video cards that can do 10+10+10 bits instead of 8+8+8, in
>> which case a CRT is currently the only choice.
>>
>> Regards,
>> --
>> *Art
>>
>
>
Anonymous
July 23, 2005 6:28:58 AM


I suppose such precision is beneficial for printing and downsampling. Keep
in mind even good large CRTs top out at 2048x1536. For PC display, there's
no real benefit in displaying over 2048 shades, or 11 bits, per channel.
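
The 11-bit figure comes from the width of the screen: a full-width gradient can show at most one distinct shade per pixel column, so for that test pattern anything beyond log2(width) bits per channel is invisible. A quick check of the arithmetic:

```python
# Shades visible in a one-shade-per-column gradient across the screen.
import math

def gradient_bits(screen_width):
    """Bits per channel before a full-width gradient runs out of columns."""
    return math.ceil(math.log2(screen_width))

print(gradient_bits(2048))   # 11 bits -> 2048 shades
print(2 ** 11)               # 2048
```

Extra bits can still pay off before display, of course, as the scanner example shows.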



"Phil Weldon" <notdiscosed@example.com> wrote in message
news:JXhEe.3153$6f.1818@newsread3.news.atl.earthlink.net...
> Some scanners use 36 bit (12,12,12) or even 48 bit color (16,16,16). For
> these devices image quality is much more important than speed, and
> supersampling color allows better downsampling.
>
> Phil Weldon