if 32-bit color is, in fact, 24-bit then why does w2k have..

Anonymous
July 26, 2005 11:38:30 PM

Archived from groups: alt.comp.periphs.videocards.nvidia

My Mom's system has an old Matrox Mystique 4MB video card. At 1280x1024 she
can't get 32-bit color only 24-bit color (the color quality selection in
Display Properties). But if 32-bit color is actually 24-bit color then there
should be no difference?

--
there is no .sig

July 27, 2005 12:16:39 AM

"Doug" <pigdos@nospam.com> wrote in message
news:WiwFe.7173$_%4.2071@newssvr14.news.prodigy.com...
> My Mom's system has an old Matrox Mystique 4MB video card. At 1280x1024
> she can't get 32-bit color only 24-bit color (the color quality selection
> in Display Properties). But if 32-bit color is actually 24-bit color then
> there should be no difference?
>
> --
> there is no .sig
>

There will be no difference in the colour rendition, but 24-bit is slower
than 32-bit. This is because a memory 'fetch' is done in word (i.e. 32-bit)
chunks. In 32-bit mode, once you've fetched a 32-bit word, that's that; the
required 24-bit value is right there in the word, with the remaining 8 bits
unused. Subsequent word-fetches bring in the next values, and so on. In
24-bit colour the fetch is still 32-bit, but the pixels are packed tightly,
so the first word holds the 24 bits you want plus the first 8 bits of the
NEXT pixel. The logic has to shift the value you want into place and stash
those leftover 8 bits somewhere. Then you bring in the next 32-bit word:
its first 16 bits complete the second pixel (together with the 8 bits you
just stored), and its remaining 16 bits are the first 16 bits of the third
pixel, and so on. That's why 32-bit is far preferable to 24-bit.
Regards,
Steve.
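
The gist of that, as a rough C sketch (hypothetical code, not from any
actual driver; it assumes a little-endian machine and an X8R8G8B8 layout
for the 32-bit case):

#include <stdint.h>
#include <stddef.h>

/* 32 bpp: every pixel sits in its own aligned 32-bit word (top byte
   unused), so one fetch per pixel is enough. */
static uint32_t get_pixel_32(const uint32_t *fb, size_t i)
{
    return fb[i] & 0x00FFFFFFu;
}

/* 24 bpp: pixels are packed back to back, so a pixel is generally not
   word-aligned and can straddle two 32-bit words; it has to be
   reassembled byte by byte. */
static uint32_t get_pixel_24(const uint8_t *fb, size_t i)
{
    const uint8_t *p = fb + i * 3;
    return (uint32_t)p[0] | (uint32_t)p[1] << 8 | (uint32_t)p[2] << 16;
}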
Anonymous
July 27, 2005 2:23:45 AM

Often it's just a matter of semantics with the drivers. With my Radeon
9800Pro driver, the 24-bit option does not exist, only 16- and 32-bit.

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."


"Doug" <pigdos@nospam.com> wrote in message
news:WiwFe.7173$_%4.2071@newssvr14.news.prodigy.com...
> My Mom's system has an old Matrox Mystique 4MB video card. At 1280x1024
> she can't get 32-bit color only 24-bit color (the color quality selection
> in Display Properties). But if 32-bit color is actually 24-bit color then
> there should be no difference?
Anonymous
July 27, 2005 11:21:57 PM

If it's just a matter of semantics, why does the documentation for the Matrox
Mystique state certain high resolutions are only available in 24-bit color
while all lower resolutions are available at 32-bit color? If there's no
difference whatsoever then why would they bother stating this? I've got
another PCI card that has the same limitations in its technical
documentation (i.e. that it can't get 32-bit color in high resolutions).

--
there is no .sig
"First of One" <daxinfx@yahoo.com> wrote in message
news:4_CdnRULY7KDbHvfRVn-1w@rogers.com...
> Often it's just a matter of semantics with the drivers. With my Radeon
> 9800Pro driver, the 24-bit option does not exist, only 16- and 32-bit.
>
> --
> "War is the continuation of politics by other means.
> It can therefore be said that politics is war without
> bloodshed while war is politics with bloodshed."
>
>
> "Doug" <pigdos@nospam.com> wrote in message
> news:WiwFe.7173$_%4.2071@newssvr14.news.prodigy.com...
>> My Mom's system has an old Matrox Mystique 4MB video card. At 1280x1024
>> she can't get 32-bit color only 24-bit color (the color quality selection
>> in Display Properties). But if 32-bit color is actually 24-bit color then
>> there should be no difference?
>
>
Anonymous
July 27, 2005 11:21:58 PM

Doug thought about it a bit, then said...
> If it's just a matter of semantics why does the documentation for the Matrox
> Mystique state certain high resolutions are only available in 24-bit color
> while all lower resolutions are available at 32-bit color? If there's no
> difference whatsoever then why would they bother stating this? I've got
> another PCI card that has the same limitations in its technical
> documentation (i.e. that it can't get 32-bit color in high resolutions).

There is a difference, but it's not normally used: the extra 8 bits in
32-bit color are used for transparency effects (or gamma, I can't recall
exactly).

In other words, 32-bit color is basically 24-bit color with the extra 8
bits used for special effects. At least in that regard it is different
from 24-bit color.

--
Kevin Steele
RetroBlast! Retrogaming News and Reviews
www.retroblast.com
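
Concretely, the packing difference is just that one extra byte per pixel
(a hypothetical C sketch assuming the common A8R8G8B8 layout; the actual
channel order varies by card and API):

#include <stdint.h>

/* The same 8-bit R, G and B go in either way; 32-bit mode just adds an
   alpha byte on top (assumed A8R8G8B8 packing). */
static uint32_t pack_argb(uint8_t a, uint8_t r, uint8_t g, uint8_t b)
{
    return (uint32_t)a << 24 | (uint32_t)r << 16 | (uint32_t)g << 8 | b;
}

/* Masking off the alpha byte gives back exactly the 24-bit colour. */
static uint32_t to_rgb24(uint32_t argb)
{
    return argb & 0x00FFFFFFu;
}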
Anonymous
July 28, 2005 8:23:10 AM

Doug wrote:
> My Mom's system has an old Matrox Mystique 4MB video card. At 1280x1024 she
> can't get 32-bit color only 24-bit color (the color quality selection in
> Display Properties). But if 32-bit color is actually 24-bit color then there
> should be no difference?

24-bit color does use less RAM than 32-bit; there may not be enough
video RAM for the card to do 1280x1024 in 32-bit color.

--
Robert Hancock Saskatoon, SK, Canada
To email, remove "nospam" from hancockr@nospamshaw.ca
Home Page: http://www.roberthancock.com/
Anonymous
July 28, 2005 11:55:52 PM

That's what I assumed, but I've been reading on this very newsgroup that
32-bit color is 24-bit color and there's no difference between them. But
there MUST be one; otherwise, why would these video card manufacturers
bother stating that certain resolutions can't be had in 32-bit color?

--
there is no .sig
"Robert Hancock" <hancockr@nospamshaw.ca> wrote in message
news:o4ZFe.58948$%K2.55475@pd7tw1no...
> Doug wrote:
>> My Mom's system has an old Matrox Mystique 4MB video card. At 1280x1024
>> she can't get 32-bit color only 24-bit color (the color quality selection
>> in Display Properties). But if 32-bit color is actually 24-bit color then
>> there should be no difference?
>
> 24-bit color does use less RAM than 32-bit, there may not be enough video
> RAM for the card to do 1280x1024 in 32-bit color.
>
> --
> Robert Hancock Saskatoon, SK, Canada
> To email, remove "nospam" from hancockr@nospamshaw.ca
> Home Page: http://www.roberthancock.com/
July 29, 2005 12:07:17 AM

"Doug" <pigdos@nospam.com> wrote in message
news:cLaGe.7791$_%4.1473@newssvr14.news.prodigy.com...
> That's what I assumed, but I've been reading on this very newsgroup that
> 32-bit color is 24-bit color and there's no difference between them but
> there MUST be otherwise why would these video card manufacturers bother
> stating that certain resolutions can't be had in 32-bit color.
>
> --
> there is no .sig
> "Robert Hancock" <hancockr@nospamshaw.ca> wrote in message
> news:o4ZFe.58948$%K2.55475@pd7tw1no...
>> Doug wrote:
>>> My Mom's system has an old Matrox Mystique 4MB video card. At 1280x1024
>>> she can't get 32-bit color only 24-bit color (the color quality
>>> selection in Display Properties). But if 32-bit color is actually 24-bit
>>> color then there should be no difference?
>>
>> 24-bit color does use less RAM than 32-bit, there may not be enough video
>> RAM for the card to do 1280x1024 in 32-bit color.
>>
>> --
>> Robert Hancock Saskatoon, SK, Canada
>> To email, remove "nospam" from hancockr@nospamshaw.ca
>> Home Page: http://www.roberthancock.com/
>
>
Doug,
As I said before, there is no difference in what is displayed on the screen,
but 1280x1024x3 (24 bits = 3 bytes) = 3,932,160 bytes, which fits in the
card's 4MB (4,194,304 bytes). 1280x1024x4 (32 bits = 4 bytes) = 5,242,880
bytes, which will NOT fit; that's why the card doesn't offer 32-bit at this
resolution.
Regards,
Steve.
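
That arithmetic as a quick check (hypothetical C):

#include <stdio.h>

int main(void)
{
    const long w = 1280, h = 1024;

    printf("24-bit: %ld bytes\n", w * h * 3); /* 3,932,160 - fits in 4,194,304 */
    printf("32-bit: %ld bytes\n", w * h * 4); /* 5,242,880 - does not fit      */
    return 0;
}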
Anonymous
July 30, 2005 11:02:02 AM

Doug wrote:
> That's what I assumed, but I've been reading on this very newsgroup that
> 32-bit color is 24-bit color and there's no difference between them but
> there MUST be otherwise why would these video card manufacturers bother
> stating that certain resolutions can't be had in 32-bit color.

The difference is that 32-bit trades off more video RAM usage for more
performance. The display quality is identical.

--
Robert Hancock Saskatoon, SK, Canada
To email, remove "nospam" from hancockr@nospamshaw.ca
Home Page: http://www.roberthancock.com/
Anonymous
July 31, 2005 6:45:54 AM

I remember color palettes for Photoshop allowed for a larger selection
w/32-bit color. I'm beginning to NOT believe everyone who says 24-bit color
is the same as 32-bit color. If that were the case, then what would be the
POINT of having a 24-bit color mode for various video cards? Unless someone
here has the technical background/experience or proof to back up their
statement that 24-bit color is the same as 32-bit color, I'm just going to
assume it's unsubstantiated bullshit.

--
there is no .sig
"Robert Hancock" <hancockr@nospamshaw.ca> wrote in message
news:KBFGe.69202$s54.5394@pd7tw2no...
> Doug wrote:
>> That's what I assumed, but I've been reading on this very newsgroup that
>> 32-bit color is 24-bit color and there's no difference between them but
>> there MUST be otherwise why would these video card manufacturers bother
>> stating that certain resolutions can't be had in 32-bit color.
>
> The difference is that 32-bit trades off more video RAM usage for more
> performance. The display quality is identical.
>
> --
> Robert Hancock Saskatoon, SK, Canada
> To email, remove "nospam" from hancockr@nospamshaw.ca
> Home Page: http://www.roberthancock.com/
Anonymous
July 31, 2005 6:45:55 AM

Doug <pigdos@nospam.com> wrote:
> I remember color palettes for Photoshop allowed for a larger
> selection w/32-bit color. I'm beginning to NOT believe everyone who
> says 24-bit color is the same as 32-bit color. If that were the case
> then what would be the POINT of having a 24-bit color mode for
> various video cards? Unless someone here has the technical
> background/experience or proof to back up their statement that 24-bit
> color is the same as 32-bit color I'm just going to assume it's
> unsubstantiated bullshit.

The point of having 24-bit colour is when you have a CPU or GPU that
doesn't pay big penalties for accessing memory at a byte boundary, or
where swab and shift operations are fast and video memory is cache
mapped. In that case, you use 25% less video memory. Remember that not
all video cards were originally made for the latest Pentium-class PCs.

Also, some video solutions didn't store pixels, but bitplanes, in which
case it doesn't make sense to add more bitplanes than you need. This
has the advantage of super-fast blits (moving objects), but the
disadvantage of not being able to write in just one place to set a
single pixel.

32-bit is /almost/ always padded 24-bit, but the extra bits /can/ be
used for other purposes, like an alpha channel (which is a "dimmer":
instead of dividing all pixel values by four to get part of an image at
25% intensity, you can set the "alpha" byte to 63 (25% of 255) to
achieve the same), or special modes like the "Gigacolor" of the Matrox
Parhelia, which uses 10 bits per colour channel instead of 8.

However, in most cases, 32-bit is really just padded 24-bit due to it
being faster for a modern CPU/GPU to read a 32-bit longword than three
bytes.

--
*Art
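
The "dimmer" use of the alpha byte, as a hypothetical C sketch (assumes
a straight multiplicative alpha, which is only one way hardware can
interpret that byte):

#include <stdint.h>

/* Scale an 8-bit colour channel by an 8-bit alpha (255 = full intensity,
   63 = roughly 25%), instead of rewriting the colour values themselves. */
static uint8_t apply_alpha(uint8_t channel, uint8_t alpha)
{
    return (uint8_t)(((unsigned)channel * alpha + 127) / 255);
}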
August 1, 2005 1:27:34 AM

Doug wrote:
> I remember color palettes for Photoshop allowed for a larger selection
> w/32-bit color. I'm beginning to NOT believe everyone who says 24-bit color
> is the same as 32-bit color. If that were the case then what would be the
> POINT of having a 24-bit color mode for various video cards? Unless someone
> here has the technical background/experience or proof to back up their
> statement that 24-bit color is the same as 32-bit color I'm just going to
> assume it's unsubstantiated bullshit.
>

32-bit colour is made up of 8 bits Red, 8 bits Green, 8 bits Blue, and 8
bits Alpha (transparency). For 2D, generally speaking, you don't need the
alpha bits, so you can get away with using only 24 bits.

Image editing programmes may allow for more than 8 bits per colour
channel (8 bits gives 256 shades per channel), but at the end of the day
they will only be displayed on the monitor at 8 bits per channel.

Scanners also scan at higher precision per colour channel (e.g. my
scanner is capable of something like 48-bit, or 16 bits per colour
channel), but again, only 8 bits per channel are displayed on the
monitor.
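
The step down to the monitor's 8 bits per channel is just a loss of the
extra precision, roughly like this (hypothetical C, assuming 16-bit
source channels such as a 48-bit scan):

#include <stdint.h>

/* A 16-bit-per-channel sample still ends up as 8 bits per channel on
   screen; only the top 8 bits of each channel survive. */
static uint8_t to_display_8bit(uint16_t channel16)
{
    return (uint8_t)(channel16 >> 8);
}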
Anonymous
August 16, 2005 3:39:44 AM

Doug wrote:

> If it's just a matter of semantics why does the documentation for the
> Matrox Mystique state certain high resolutions are only available in
> 24-bit color while all lower resolutions are available at 32-bit color? If
> there's no difference whatsoever then why would they bother stating this?
> I've got another PCI card that has the same limitations in its technical
> documentation (i.e. that it can't get 32-bit color in high resolutions).

What you see on the screen is usually not different. Note _usually_. Not
the same as _always_. The amount of memory used to store the frame,
however, _is_ different. The limitation is usually lack of RAM.

--
--John
to email, dial "usenet" and validate
(was jclarke at eye bee em dot net)