D600 High ISO in DX

Started Nov 23, 2012 | Questions thread
Leo360 Senior Member • Posts: 1,141
Re: Roger's optimum pixel pitch of 5um

bobn2 wrote:

Leo360 wrote:

A "true" camera independent image is the one constructed from those lambda functions for Red, Green and Blue channels.

No, that is not the "true" image. The true image is actually composed of all those individual discrete photon strikes. The lambda function is a mathematical model that allows us to map a continuous function onto what is a discrete one.

My interest in this discussion is from a machine learning perspective. I am after an objective description of the features of the scene. A "true" image is the description of the scene. Light is the medium which conveys this information and provides a way to estimate shapes, distances, etc. via the interaction of the EM field with matter. Therefore, for my purposes, image analysis becomes an estimation problem, and this is why I treat it as such.

Now back to the real world. We do not know light distribution (lambdas) across the image plane.

Because it does not exist. We can discover it by observation of what happened in the real world, which was a number of discrete photon events.

The fact that some quantity is not directly observable does not mean it does not exist. Otherwise you are in danger of outlawing Quantum Mechanics in its entirety.

We use a pixel based sensor to record a realization of this random process to estimate the lambda functions.

It does not 'record a realization'. The lambda function, which it estimates by averaging the count of photon events over an area, is an abstraction. The reality was the photon events.

I think Eric answered this question in his post within this thread.

If we wished we could use an alternative mechanism to do the same only better. Imagine we had a sensor so advanced that it could record the exact position (to within the limits of the uncertainty principle) of each photon event in time and space, then transmit that with the photon energy to the camera's processor. That processor could reconstruct the output of any sensor of any pixel count and CFA characteristic.

Yes, if your idealized sensor is bound only by fundamental Heisenberg uncertainty, it will be the best possible sensor in the sense that no other sensor can convey more information about the scene while observing the same light. But it cannot reconstruct the exact pixel readings of other sensors, or even of its own twin brother, because quantum mechanical events are probabilistic in nature. The probabilities of quantum events can be calculated but not the individual event outcomes. What it can reconstruct are the lambdas obtained by other sensors.

At each pixel we have obtained a sampled photon count which serves as an estimator for the lambda corresponding to this tiny pixel area. The accuracy of the estimation depends on the photon count observed: the higher the photon count, the smaller the error in using it in place of (lambda*t). Everyone notices this fact while observing that there is more photon shot noise in shadows than in highlights. Expose-to-the-right is the empirical way to say the same thing -- you expose so as to obtain as many photons in every pixel as possible without falling off the upper end of the DR in highlights.
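To make the shot-noise scaling behind expose-to-the-right concrete, here is a minimal sketch (Python with numpy; the per-pixel counts and trial numbers are just assumed illustrative values) showing that the relative fluctuation around lambda falls off as 1/sqrt(lambda):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-pixel expected photon counts (lambda * t),
# from deep shadow to near-saturation highlight.
expected_counts = [10, 100, 1_000, 10_000]
n_trials = 100_000  # repeated exposures of the same pixel

for lam in expected_counts:
    samples = rng.poisson(lam, n_trials)        # shot-noise-limited photon counts
    rel_error = samples.std() / samples.mean()  # relative fluctuation around lambda
    print(f"lambda = {lam:6d}  relative shot noise ~ {rel_error:.3f} "
          f"(theory 1/sqrt(lambda) = {lam ** -0.5:.3f})")
```

The brighter the exposure (the larger lambda*t), the smaller the relative error of the photon count as an estimator of lambda, which is the statistical content of ETTR.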

The lambda is not the reality, it is an abstraction. And if you sample the reality with a greater precision, you can always estimate this function more precisely. This is simply a matter of information, the more information you have, the more precise a model you can make.

Lambda is the reality. For example, for black-body radiation it is the temperature (see below).

To summarize -- saying that there is no "mean photon count" (aka expected value) is wrong.

No-one said that there is no "mean photon count", there is a mean of any set of observations. But 'mean photon count' says nothing about 'expected value', because 'expected' is before the observation, the mean is after.

In Probability Theory, "expected value" and "mean value" are synonyms. And so far we have not been talking about prior or posterior probabilities (maybe we should).

These quantities are deterministic parameters of the photon count probability distribution. The camera is a device that tries to estimate these parameters (on a pixel-by-pixel basis) from a given random process realization.

Which quantities are you now saying are the 'deterministic parameters of the photon count probability distribution'? What is actually happening is that photons are generated randomly by various quantum processes; they travel, reflect off things and end up hitting the sensor. If you had a model of their distribution and trajectories, and knew something about the scene, you could generate a model of the light energy distribution over the surface of the sensor, but any continuous function describing this is an abstraction from the reality.

This is easy. In black-body radiation you have such a deterministic parameter. It is called temperature, and it determines the photon flux as well as its spectral distribution. A photo detector captures a finite number of photons and allows you to estimate the temperature. The Poisson distribution of the photons hitting a pixel is a simple one-parameter (lambda) distribution. All the physics (and geometry) of EM radiation, scattering, reflection, propagation, and transformation by the lens is hidden inside the lambdas, because there is simply no other parameter in the Poisson distribution. For lack of a better word, think of it as the Poynting vector of the classical EM field, while Poisson shot noise describes the quantum fluctuations around it.
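As a toy illustration of "all the physics is hidden inside lambda", here is a Python sketch; the calibration constant K_CAL and the T^3 photon-flux scaling are assumptions made only for illustration, not a real radiometric model:

```python
import numpy as np

rng = np.random.default_rng(1)

K_CAL = 1e-4     # hypothetical constant tying temperature to the expected count
T_TRUE = 5800.0  # kelvin: the single deterministic scene parameter

# Assumed model: expected photon count = K_CAL * T^3
# (black-body photon number flux grows roughly as T^3).
lam = K_CAL * T_TRUE ** 3

count = rng.poisson(lam)                 # one realization recorded by the detector
T_hat = (count / K_CAL) ** (1.0 / 3.0)   # invert the model to estimate temperature

print(f"lambda = {lam:.0f} photons, observed = {count}, "
      f"estimated T = {T_hat:.1f} K (true T = {T_TRUE:.0f} K)")
```

The only thing the Poisson counts can tell you about the scene is lambda; everything else has to be inferred by inverting whatever physical model links the scene parameters to lambda.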

The number of photons registered by it should correspond to the number of photons which hit it. Shot noise is not because pixels are incorrectly counting the number of photons (that is read noise)

You are ascribing me the words I never said!

I didn't ascribe any words to you.

Where did I say that photon shot noise originates inside pixels?

Photon shot noise is a property of the photon flux and has nothing to do with photon-to-electron conversion in the pixel. This is why noise coming out of a pixel is nominally broken into shot-noise, read-noise, and dark-current components.
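A small sketch of that nominal decomposition (Python; the electron counts and read-noise sigma are made-up illustrative values), with the independent components adding in quadrature:

```python
import numpy as np

rng = np.random.default_rng(2)

LAMBDA = 400       # expected photo-electrons per pixel (shot-noise source)
DARK = 5           # expected dark-current electrons over the exposure
READ_SIGMA = 3.0   # read-noise standard deviation in electrons
N = 500_000        # simulated pixels / repeated readouts

shot = rng.poisson(LAMBDA, N)          # photon shot noise
dark = rng.poisson(DARK, N)            # dark-current shot noise
read = rng.normal(0.0, READ_SIGMA, N)  # read noise added during conversion

signal = shot + dark + read

expected_sigma = np.sqrt(LAMBDA + DARK + READ_SIGMA ** 2)  # quadrature sum
print(f"measured sigma = {signal.std():.2f} e-, "
      f"quadrature prediction = {expected_sigma:.2f} e-")
```

Only the first term is set by the light itself; the other two live in the pixel and its readout chain.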

Good, we agree. The sensor does not affect the amount of shot noise in the image projected onto it.

Yes, this has been my understanding all along.

On top of that, it is contaminated by read noise, which makes the photon-to-electron conversion noisy. So the value of this pixel reading will not be exactly right.

Read noise is not 'on top of it'. Read noise is why the value of the pixel reading will not be exactly right.

There is also Quantum Efficiency (QE), which makes photons go missing, and dark current, which makes spurious counts appear from nowhere. And a QE of 53% does not mean that exactly 47% of photons go missing on each reading. Photon absorption is in itself a random process. I guess it should be factored into read-noise, but I am not sure.

No, it gets catered for simply by multiplying the photon count by the QE.

Sorry, the act of photon absorption is not a deterministic event. Since the depth of absorption varies, there is a probability that a photon can escape. And since it is a probabilistic event, you cannot really tell the exact number of photons which were missed.
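Here is a minimal simulation of that point (Python; the lambda and QE values are just assumed for illustration): each photon is absorbed independently with probability QE, so the mean electron count is QE*lambda, yet the fraction lost in any single exposure is itself random rather than a fixed 47%:

```python
import numpy as np

rng = np.random.default_rng(3)

LAMBDA = 1_000   # expected photons hitting the pixel during the exposure
QE = 0.53        # quantum efficiency: probability a given photon is absorbed
N = 200_000      # repeated exposures

photons = rng.poisson(LAMBDA, N)
# Each photon is absorbed independently with probability QE ("binomial thinning").
electrons = rng.binomial(photons, QE)

print(f"mean electrons = {electrons.mean():.1f}  (QE * lambda = {QE * LAMBDA:.1f})")
print(f"var  electrons = {electrons.var():.1f}  (a Poisson with mean QE*lambda "
      f"would give {QE * LAMBDA:.1f})")
```

The thinned counts are again Poisson with mean QE*lambda, which is why multiplying by QE captures the average perfectly even though each individual absorption is probabilistic.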

In your experiment with a single pixel filling the whole screen you will have a solid color (no spatial variation) but with the intensity slightly off. I think that what you are trying to say is that one cannot observe spatial noise variability (from one pixel to another) when measuring only a single pixel.

Temporal noise variability (i.e. from one frame to the next) is of no interest to the still photographer. It is the spatial variability (or at least the integration of spatial and temporal variability over the exposure time) that we are interested in.

I guess that people that combine consecutive shots in HDR photography may disagree. There are practical instances when you combine several shots of the same scene to boost photon count.

This is practically increasing the integration time.

By effectively combining pixel photon counts and reducing photon shot noise along the way.
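A quick sketch of that effect (Python; the frame count and photon level are assumed illustrative values), showing that stacking N frames cuts the relative shot noise by roughly a factor of sqrt(N):

```python
import numpy as np

rng = np.random.default_rng(4)

LAMBDA = 50        # expected photons per pixel per frame (a shadow region)
N_FRAMES = 16      # number of identical frames stacked
N_PIXELS = 100_000

single = rng.poisson(LAMBDA, N_PIXELS)
stack = rng.poisson(LAMBDA, (N_FRAMES, N_PIXELS)).sum(axis=0)  # summed counts

print(f"single frame:   relative noise = {single.std() / single.mean():.3f}")
print(f"{N_FRAMES}-frame stack: relative noise = {stack.std() / stack.mean():.3f} "
      f"(about 1/sqrt({N_FRAMES}) of the single frame)")
```

Summing the frames is statistically the same as one exposure N times as long, which is why the two views (longer integration vs. stacking) agree.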

And it is not only about temporal variability. You can have several identical cameras (an idealized situation) taking identical images in a controlled environment (think of a Nikon factory testing facility). Photon counts in the corresponding pixels will be different because they sample different realizations of the same photon random process and because thermal noise and read noise are uncorrelated. This is true even for sensors consisting of a single pixel.

They will be different because the actual pattern of photon events is, and always will be, different. That's the reality, rather simpler than 'sampling different realizations of the same photon random process'.

You call it "pattern of photon events" while I call it "samples from the random photon process". We are talking about the same thing using different words.

Here I was talking about ways to describe read-noise, and the random process governing the read-noise happens in the pixel.

OK, go away and think about it. I have already thought about it.


Bob

Bob, this is what I call mildly insulting language (I refer to "OK, go away..."). If my postings irritate you, you are not obliged to respond.

Leo
