Bit depth = levels of gradient?

Started 2 months ago | Discussions thread
Tom_N Forum Pro • Posts: 15,758
Re: Bit depth = levels of gradient?

filmrescue wrote:

John Sheehy wrote:

J A C S wrote:

The levels are as many as I said. How many of them are distinguishable in a typical photo with whatever criterion, is a different question.

The OP isn't factoring in noise, though, so the OP is basically applying math to myth.

Explain please.

((2 ^ Bit_Depth) ^ Number_Of_Channels) is one limiting factor.

You can't do any better than that, but depending on other factors — the range of tonal values in the photo, the calibration of your scanner or display system, or the limits of human perception — you could do worse. Is 0 the complete absence of light? Is (2^Bit_Depth - 1) the brightest light you could ever handle? Are the levels in between evenly spaced?
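The arithmetic above can be sketched in a few lines of Python (the bit depths and channel count below are illustrative, not tied to any particular camera):

```python
# Levels representable per channel, and combined across channels.
def levels_per_channel(bit_depth):
    return 2 ** bit_depth

def total_combinations(bit_depth, num_channels):
    return (2 ** bit_depth) ** num_channels

print(levels_per_channel(8))        # 256 levels per 8-bit channel
print(total_combinations(8, 3))     # 16777216 RGB combinations
print(total_combinations(16, 3))    # 281474976710656 for 16-bit RGB
```

Note these are upper bounds on distinct codes, not a count of visually distinguishable tones — that's exactly the noise point made earlier in the thread.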

Another way to see this is to consider 80-bit-per-channel representation. 2^80 is said to be roughly the number of elementary particles (protons, neutrons, neutrinos, electrons, and photons) in the observable universe. While your gear might be able to store 80-bit-per-channel numbers, there's no way that DACs, ADCs, displays, scanners, and printers could handle 2^80 gradations accurately. Nor could your eyes see all of them.

ERROR: Not able to count anywhere near 2^240 photons at this photo-site. There are not that many photons in the entire Universe. Would you like to move to an imaginary Universe?
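Taking the post's own figure of roughly 2^80 particles in the observable universe (an assumption quoted from the text, used here only for scale), the mismatch in the joke ERROR message checks out:

```python
# 80 bits per channel, three channels -> (2**80)**3 = 2**240 RGB codes.
particles = 2 ** 80            # rough figure quoted in the post above
codes_per_channel = 2 ** 80    # one 80-bit channel
rgb_codes = codes_per_channel ** 3

print(rgb_codes == 2 ** 240)       # True
print(rgb_codes // particles)      # 2**160: codes outnumber particles vastly
```

So even granting the storage, there could never be enough physical photons per photosite to make the extra codes mean anything.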
