My dumb off-subject question of the month...

Jarrell Conley

Forum Pro
In personal computers, what's the difference between the 16-bit and 32-bit settings for the display? Where would I most likely actually see the difference?
Thanks,
Jarrell
--
D100, 5700 and 990
I child proofed my house, but they keep getting in!
FCAS Member, dslr division
 
It concerns the number of different colors the display can be asked to generate.

You will find it difficult to tell the difference between 16-bit ("high color") and 24-bit ("true color") because most human eyes are not that discriminating.

At lower color depths the differences are easier to see: try setting your display to 256 colors (that would be 8-bit) and the difference is obvious.

In a real-life situation, the easiest test picture for judging the number of available colors is a gradually changing color hue across the screen: look for banding as one shade of the color changes to the next. If there are so many possible shades that each band is narrower than the pixel spacing on the screen, you will be unable to detect them.
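
If you want a ready-made test image, here is a minimal sketch (assuming Python with the Pillow imaging library installed; the size and file name are just placeholders) that draws such a gradient for you:

from PIL import Image

WIDTH, HEIGHT = 1024, 256                 # example size, not critical
ramp = Image.new("RGB", (WIDTH, HEIGHT))
for x in range(WIDTH):
    shade = round(x / (WIDTH - 1) * 255)  # 0..255 from left to right
    for y in range(HEIGHT):
        ramp.putpixel((x, y), (shade, shade, shade))
ramp.save("ramp_test.png")                # view full screen at 16-bit, then 32-bit

View the saved file full screen at each color setting and look for the vertical bands.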

Does that make sense?
--
Jeff
 
In personal computers, what's the difference between the 16-bit and 32-bit
settings for the display? Where would I most likely actually see the difference?
Maybe this web page will help (haven't verified info, just found while googling).

http://hankfiles.pcvsconsole.com/answer.php?file=459

I don't know a lot about 32-bit vs. 16-bit, just that I'd rather use the greater color depth of 32-bit, although as I understand it there may be some compromise in processing performance at the greater depth. Also, I read on another site that 32-bit is the default color depth for Windows XP. Having a video card and a monitor that support at least 1024 x 768 at 32-bit should enable this color rendering.

--
Dawn
FCAS Member #89

With camera in hand, I see what I might otherwise overlook

CP5700; SB-50DX; WC-E80; TC-E15ED
pbase supporter
http://www.pbase.com/dlane/
 
Jarrell, my understanding (very limited, and probably not accurate) is as follows:

When you process an image in Photoshop in the normal 8-bit mode (2 to the 8th power), each color channel (R, G and B) has 256 possible values (this is, confusingly, sometimes also called 24-bit = 3x8). So in 8-bit mode there are only 256 values available per channel. Processing (levels and curves) tweaks the colors, and the exact results often fall between those 256 values; since only 256 are available, PS assigns each one the nearest available value. This causes gaps in the tones, which you can see as comb-like gaps in the histogram.

If you process in 16-bit mode (also, confusingly, sometimes called 48-bit = 3x16) you have roughly 65,000 values available per channel. The tweaked colors are therefore more accurate, with far fewer gaps (practically none, considering the human eye's limited color discrimination). You will in this way get a more accurate and smoother result than with 8-bit processing.
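
If you want to see this numerically, here is a rough sketch (assuming Python with NumPy; the gamma value is just a made-up stand-in for a levels/curves tweak) that counts how many of the 256 output levels survive the same adjustment at each depth:

import numpy as np

def tweak(values, maximum, gamma=0.8):
    # a made-up curves adjustment, purely for illustration
    return np.round((values / maximum) ** gamma * maximum)

levels8 = tweak(np.arange(256.0), 255.0)                # edit done in 8-bit
levels16 = tweak(np.arange(65536.0), 65535.0) / 257.0   # edit done in 16-bit...
levels16 = np.round(levels16)                           # ...then reduced to 8 bits for display

print("8-bit edit uses ", np.unique(levels8).size, "of 256 levels")
print("16-bit edit uses", np.unique(levels16).size, "of 256 levels")

The 8-bit version skips some levels (those are the gaps in the histogram); the 16-bit version still fills all 256.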

To take advantage of 16-bit processing your computer's video card needs to allow at least this amount of color at the selected display size, which is not a problem on modern computers. Your monitor also needs to be well calibrated if you want to take full advantage of the added color depth.

Whether an average person can see a difference between 16-bit and 32-bit colors is, in my opinion, questionable. The most important difference would show up when you closely examine the border between two adjacent colors on the screen: with fewer colors, banding can be visible (the colors do not change smoothly but in bands, because the intermediate colors are not available; combing in the histograms can make this very noticeable).

Also, for web use fewer colors are generally sufficient because of the differences between computers (most are not well calibrated).

In my opinion the jump from 8 to 16 bits in post-production is a real advantage (even if it makes the files much larger and poses more of a challenge for computer resources). It gives more accurate colors and, in particular, less (practically no) banding in the final product.

Just my personal, fairly uninformed opinion. I would be thankful for corrections.

PS: I use 32-bit colors on my display, "just to be sure".

Kind regards
Kaj
CP5700
http://www.pbase.com/kaj_e
WSSA member
In personal computers, what's the difference between the 16-bit and 32-bit
settings for the display? Where would I most likely actually see the difference?
Thanks,
Jarrell
--
D100, 5700 and 990
I child proofed my house, but they keep getting in!
FCAS Member, dslr division
 
If I remember correctly, 16-bit is about 65,000 colors (some older modes give about 32,000) and 32-bit is about 16.7 million. I use 32-bit for color balance. If you have a good video card with plenty of memory, keep it at 32-bit always. If screens redraw slowly, set the color depth lower (16-bit) or even use a lower resolution, like 1024x768 or 800x600.

Another mistake many people make is "wallpaper". Fancy photos on your desktop use memory. Even with 4 MB of video memory on my laptop I do NOT use any wallpaper, just plain black or dark blue. Try it next time your PC draws slowly...
Michael
In personal computers, what's the difference between the 16-bit and 32-bit
settings for the display? Where would I most likely actually see the difference?
Thanks,
Jarrell
--
D100, 5700 and 990
I child proofed my house, but they keep getting in!
FCAS Member, dslr division
--
Michael,
Fuji S2Pro, Nikon N65 & Nikon F80 w/Nikkor [email protected] AF S, Sigma [email protected], Sigma [email protected]/APO, Sigma [email protected], Sigma [email protected], Sigma [email protected]/f4-ASP.
 
I just said "but" to get your attention. What I wanted to say is that I have seen the difference. Rarely, but I'll tell you where. When you have an image with very subtle colours on a really good monitor, the video card may dither OR just step the colours, making what look like artifacts. If you go on up to 32-bit colour these "steps" go away. I have seen this many times, but NOT always.

Nor have I seen any combing in histograms caused by a video card. Photoshop (to the best of my knowledge) does NOT use the video card to process the working image, only what you see on the screen. I know this because I talked to Adobe support about this very issue. The manager of a printing prepress business claimed that he needed very expensive video cards, arguing that the better the card, the better the final image. The owner wanted proof that new cards were needed. Adobe's techs explained to me exactly what I thought: the video card processes the colors, resolution and refresh rate of the image ON SCREEN ONLY.

A better card displays more colors (i.e. 32-bit) at a higher resolution (e.g. 1600x1200 or 1280x1024) and at a faster refresh rate, which is how many times per second the image is drawn on the screen. Anything more than 70 Hz is unnecessary. Many older monitors used 60 Hz, or 60 frames each second. Most people cannot see any difference; I can. The reason is that the fluorescent lights in offices flicker at 60 Hz. As the bulbs get older the phosphor weakens and the flickering worsens. Even though OSHA requires them to be changed after so many thousand hours, most places don't do it until the bulbs burn out.

If your computer monitor refreshes at 60 Hz you will suffer eye fatigue even if you cannot see the flickering. Setting your monitor to 70-75 Hz reduces the flicker, and the eye fatigue, tremendously. Refresh rates above 75 Hz are not needed, even though 80-85 Hz and up are available. Those make some games look better but are not needed for working with still images in Photoshop.

NONE of this processing affects the working image. You could work on a 386 PC with 16 colors (although that's nearly impossible) and your working JPEG, BMP or Photoshop image would look just as good as one done on a high-end machine; working on the image would simply be MUCH slower. Photoshop does the work on the actual image, not your video card. It can get really tough to work on an image with a junk video card, that's for sure!

The more screen resolution and colour your monitor and video card produce, the easier subtle colour balancing and changes become. Some changes, such as cloning or airbrushing, can look fine on your monitor but horrible in the final print, simply because the monitor or card cannot display everything.
When you process an image in Photoshop...
--
Michael,
Fuji S2Pro, Nikon N65 & Nikon F80 w/Nikkor [email protected] AF S, Sigma [email protected], Sigma [email protected]/APO, Sigma [email protected], Sigma [email protected], Sigma [email protected]/f4-ASP.
 
More colors, less banding and combing, and more accurate color.

Here's how it works

A bit is a single piece of digital data, coded as either a 0 or a 1.

A byte is made up of 8 bits: a string of eight 0s and/or 1s run together to create a single number with 8 places.

To get the number of unique combinations of 0s and 1s for a given number of places, use an exponent with a base of 2 (representing the two possible values, 0 or 1) and the number of places as the exponent. For an 8-bit number, each unique combination of eight 0s and 1s represents one unique color, so an 8-bit image can display up to 256 different colors. The list below expands on this idea.

8-bit = 2 to the power of 8 (e.g. 01010100)
16-bit = 2 to the power of 16 (e.g. 0001010001001000)
24-bit = 2 to the power of 24 (e.g. 010010001000100001001000)
32-bit = 2 to the power of 32 (e.g. 00001000000010010000100000000001)

Thus these "bit depths" give increasing numbers of colors:

8-bit = 256 colors
16-bit = 65,536 colors
24-bit = 16,777,216 colors (true color, roughly the limit of human vision)

32-bit = about 4.29 billion colors (beyond human vision, or our ability to distinguish different colors)
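
If you want to check these counts yourself, a couple of lines of plain Python will do it:

for bits in (8, 16, 24, 32):
    print(f"{bits}-bit: {2 ** bits:,} possible values")

That prints 256; 65,536; 16,777,216; and 4,294,967,296 (about 4.29 billion).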

It is not just banding you avoid by using higher bit depths; you also get more accurate color. With a 256-color (8-bit) image a nice skin tone might be shifted toward orange. With 32-bit it is more likely that a given skin tone (or whatever) will be displayed correctly - at least one of those 4.29 billion colors will be close ;) Thus for color accuracy (assuming everything else is calibrated properly) go for the higher bit depths.

Cheers,
Brent
In personal computers, what's the difference between the 16-bit and 32-bit
settings for the display? Where would I most likely actually see the difference?
Thanks,
Jarrell
--
D100, 5700 and 990
I child proofed my house, but they keep getting in!
FCAS Member, dslr division
--
Brent Haydamack
995, Sunpak PZ-4000AF
 
The bit depth determines the total number of colors available for each pixel.

For better rendering of DOF and the most accurate colors, you want to go as high as your video card allows.

As a gamer it becomes obvious: you get better color separation and accuracy at the higher bit depth, in conjunction with the number of pixels your screen shows you.

To get the most accurate picture available, you should always aim for the highest bit depth and the highest number of pixels your video card and monitor can show. You are limited by the ability of your equipment to render this information on your screen.

The higher the bit depth, the more accurate the colors; and the more pixels you render, the more accurate the DOF and the lines will be (showing more pixels lessens the "jaggies").

Myself, I keep my monitor/video card at 1024x768, 32-bit, refreshing at 75 Hz. (Note: keep your monitor refresh rate above 60 Hz to reduce screen flicker in rooms that use fluorescent bulbs, as fluorescent bulbs run at 60 Hz.)

Without getting technical, that's how it works.

--
'No matter where you go, there you are.'
FCAS Member, pbase supporter
http://www.pbase.com/sssnakesss
D70, CP5700, N70, PS7
 
In personal computers, what's the difference between the 16-bit and 32-bit
settings for the display? Where would I most likely actually see the difference?
Thanks,
Jarrell
--
D100, 5700 and 990
I child proofed my house, but they keep getting in!
FCAS Member, dslr division
 
All of the previous posters were correct, but the most important point is that even when you have millions of colors at your disposal, gradients between two colors are where the difference between 32-bit and 16-bit is most obvious. So, let's answer your second question - where would you see a difference? There is a very simple experiment. On a clear day, get out and take a shot of the sky. Try a nice wide angle covering all the shades of blue in the morning or evening, when the sun is making one end of the sky yellow, white or red, and the other is black or blue. Keep it in RAW format, go to your computer and display it both ways. You may want to close the picture viewer between display-property changes, so it can adjust properly.

What will you look for? Banding in the 16-bit display. The bands should be much less obvious in the 32-bit world.

So, even though this won't be obvious in most pictures, a 16-bit display will show some color separation on skies, baby faces and other images where shades of one color change gradually from one end of the range to the other.
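
If you don't want to wait for a clear day, here is a small sketch (assuming Python with NumPy and Pillow; the colors and sizes are made up) that builds a smooth sky-like gradient plus a 16-bit style (5-6-5 bits per channel) version of it, so you can compare the banding side by side:

import numpy as np
from PIL import Image

w, h = 1024, 256
t = np.linspace(0.0, 1.0, w)
# a made-up evening-sky ramp: warm at one edge, deep blue at the other
sky = np.stack([
    np.tile(230 * (1 - t) + 40 * t, (h, 1)),    # red
    np.tile(200 * (1 - t) + 60 * t, (h, 1)),    # green
    np.tile(120 * (1 - t) + 200 * t, (h, 1)),   # blue
], axis=-1).astype(np.uint8)

def to_565(img):
    # throw away the low bits that a 5-6-5 16-bit mode cannot store
    r = (img[..., 0] >> 3) << 3
    g = (img[..., 1] >> 2) << 2
    b = (img[..., 2] >> 3) << 3
    return np.stack([r, g, b], axis=-1)

Image.fromarray(sky).save("sky_24bit.png")
Image.fromarray(to_565(sky)).save("sky_16bit.png")

Open the two files next to each other: the 16-bit style version shows the bands, while the 24-bit one stays smooth.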
In personal computers, what's the difference between the 16-bit and 32-bit
settings for the display? Where would I most likely actually see the difference?
Thanks,
Jarrell
--
D100, 5700 and 990
I child proofed my house, but they keep getting in!
FCAS Member, dslr division
--
Sasha
 
