32-bit Colors

com

Distinguished
Mar 20, 2001
For example, if I play Quake with a 32-bit color configuration, will it have better video quality than if I play Quake with a 16-bit color configuration?
 
Guest

Guest
For some reason I don't see much difference in 32-bit. Can someone show me the differences?

Visit me site mate
www.technoxtreme.cjb.net
 

ejsmith2

Distinguished
Feb 9, 2001
Yes, I can tell the difference between 16-bit and 32-bit graphics. Deus Ex and Unreal Tournament both look noticeably better with 32-bit color and 32-bit textures.

But no matter how good a game looks, there's no substitute for performance. Running 16-bit will usually help if your system isn't overclocked like hell.
 

Arrow

Splendid
Dec 31, 2007
It really depends on the game and the user combined. You might notice it in one game, less in another, and not at all in a third.
The sensitivity of every person's eyes is different.

Rob
Please visit http://www.ncix.com/shop/index.cfm?affiliateid=319048
 

HolyGrenade

Distinguished
Feb 8, 2001
But in some games you can definitely see the difference:
Q3, UT, Deus Ex, and most new 3D games. Some use different texture sets for different colour depths and resolutions.

"2 is not equal to 3, not even for large values of 2"
 
Guest

Guest
The human eye is capable of differentiating about 16 million colors. When there are too few colors available in a picture it shows up as banding. Areas that are all one primary color but have a lot of subtle shades will show this the worst. For example, many downloaded nature pictures show some banding in the sky if they show a clear blue day. The more compressed the picture is, the more noticeable this will be as compression lowers precision.

Now then, 16 bit color gives 65,536 shades. 32 bit color is really 24 bits of color, 16,777,216 shades, plus 8 bits of alpha, which is right around the 16 million the eye can distinguish. So you should see some banding at 16 bit but practically none at 32 bit? Well, if dealing with bitmap photos at sufficient resolution, then yes. But games are all different.
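To put numbers on that banding: a 16-bit (5-6-5) framebuffer keeps only 5 bits for the blue channel, so a smooth sky gradient collapses to 32 visible steps. Here's a minimal C sketch of that (my own illustration, not code from any game):

```c
#include <stdio.h>

/* Quantize an 8-bit channel value down to n bits and expand it back,
 * the way a 16-bit (5-6-5) framebuffer stores a color channel. */
unsigned char quantize(unsigned char v, int bits)
{
    unsigned char q = v >> (8 - bits);        /* drop low-order bits   */
    return (unsigned char)(q << (8 - bits));  /* expand back to 8 bits */
}

int main(void)
{
    int steps5 = 0, steps8 = 0;
    unsigned char last5 = 1, last8 = 1; /* force the first sample to count */

    /* Sweep a smooth blue ramp, like a clear-sky gradient. */
    for (int v = 0; v < 256; v++) {
        unsigned char b5 = quantize((unsigned char)v, 5); /* 16-bit blue */
        unsigned char b8 = (unsigned char)v;              /* 24/32-bit   */
        if (b5 != last5) { steps5++; last5 = b5; }
        if (b8 != last8) { steps8++; last8 = b8; }
    }
    printf("distinct blue levels at 5 bits: %d\n", steps5);  /* 32  */
    printf("distinct blue levels at 8 bits: %d\n", steps8);  /* 256 */
    return 0;
}
```

Thirty-two bands stretched across a whole sky is exactly the staircase effect you see in those downloaded nature pictures.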

In games the bit rating refers to the precision at which colors are defined BEFORE any calculations are made. Every time a calculation is made on a pixel's color, some of that precision is lost, meaning that the final output has fewer colors. So every time another texture is added, another special effect is blended in, another semi-transparent object is looked through, etc, color precision is lost.
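Here's a rough C sketch of that precision loss (a toy example of my own; the overlay color, 25% opacity, and ten passes are made-up numbers, not any real engine's pipeline). The same stack of blends is run once at full precision and once with every intermediate result snapped back to a 5-bit channel, the way a 16-bit framebuffer stores it:

```c
#include <stdio.h>

/* Round-trip an 8-bit value through a 5-bit channel (16-bit mode). */
static unsigned char to5bit(unsigned char v) { return (v >> 3) << 3; }

int main(void)
{
    /* Start with a mid-gray pixel and blend a dim overlay onto it
     * ten times at 25% opacity -- like stacked smoke/glass effects. */
    double exact = 128.0;               /* full-precision reference */
    unsigned char c16 = to5bit(128);    /* 16-bit pipeline          */
    unsigned char overlay = 40;

    for (int pass = 0; pass < 10; pass++) {
        exact = 0.75 * exact + 0.25 * overlay;
        /* In 16-bit mode every intermediate result is snapped back
         * to 5 bits, so each pass throws away a little precision. */
        c16 = to5bit((unsigned char)(0.75 * c16 + 0.25 * overlay));
    }
    printf("full precision result : %.1f\n", exact);
    printf("16-bit pipeline result: %d\n", c16);
    return 0;
}
```

In my trace the full-precision result settles near 45 while the 16-bit path drifts down to 40, an error produced by nothing but repeated rounding.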

This means several things. One, newer games that use more textures and effects will show more of a difference between 16 bit and 32 bit. Two, the way an engine handles colors and calculations, regardless of the number of textures and effects, will affect final precision. Three, I think the different graphics cards even make some difference in color precision. So depending on the age and sophistication of the game engine, and perhaps on the graphics card, 32 bit might be very important. Four, as more and more stuff is added to game graphics, high color precision will only get more important. John Carmack of id Software called, quite some time ago, for the development of a 64 bit color standard. That's 18,446,744,073,709,551,616 colors, about 18.4 quintillion. Ouch! Imagine the memory bandwidth problems with THAT!
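For a back-of-envelope feel for that bandwidth (my own numbers, assuming a typical 1024x768 display at 60 Hz, and ignoring textures, z-buffer, and overdraw entirely):

```c
#include <stdio.h>

int main(void)
{
    const long w = 1024, h = 768;   /* assumed period-typical mode */
    const long fps = 60;

    /* Compare 4 bytes/pixel (32-bit) against 8 bytes/pixel (64-bit). */
    for (int bytes = 4; bytes <= 8; bytes += 4) {
        long frame = w * h * bytes;                       /* one framebuffer */
        double mbs = (double)frame * fps / (1024.0 * 1024.0);
        printf("%d-bit color: %ld bytes/frame, %.0f MB/s just to scan out\n",
               bytes * 8, frame, mbs);
    }
    return 0;
}
```

That's 180 MB/s versus 360 MB/s before the card has drawn a single texture.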

Of course, there is also variation between people's eyesight and tastes, so in the end just set it on what makes you happy. :=)

Regards,
Warden