D600 High ISO in DX

Started Nov 23, 2012 | Questions thread
bobn2 Forum Pro • Posts: 62,050
Re: Roger's optimum pixel pitch of 5um

Leo360 wrote:

bobn2 wrote:

Leo360 wrote:

A "true" camera independent image is the one constructed from those lambda functions for Red, Green and Blue channels.

No, that is not the "true" image. The true image is actually composed of all those individual discrete photon strikes. The lambda function is a mathematical model to allow us to map a continuous function onto what is a discrete one.

My interest in this discussion is from machine learning perspective. I am after objective description of the features of the scene. A "true" image is the description of the scene. Light is the media which conveys this information and provides a way to estimate the shapes, distances, etc via the interaction of EM field with the matter. Therefore, for my purposes image analysis becomes an estimation problem and this is why I treat it as such.

This is where we depart, at the start. The 'image' is what is projected onto the sensor by the lens. The perfect sensor captures that image in every detail (including the photon shot noise), and what it captures is the only image it has available. The 'scene', if by that you mean the set of objects arrayed in front of the lens, illuminated by whichever light sources, bears a complex relation to that image projected on the sensor; in particular, it is a many-to-one relation. Many different sets of objects may result in the same image projected on the sensor, so the 'true' image that you wish to find is not discoverable from the one projected onto the sensor.

Now back to the real world. We do not know the light distribution (lambdas) across the image plane.

Because it does not exist. We can only discover, by observation, what happened in the real world, which was a number of discrete photon events.

The fact that some quantity is not directly observable does not mean it does not exist. Otherwise you are in danger of outlawing Quantum Mechanics in its entirety.

Not at all; we are about to get into some abstract discussion about the nature of reality. The 'lambdas' across the image plane do not exist except as an a posteriori averaging of the photon events. Or they are unobservable, if you prefer. But I think that they do not exist. You think that there is a latent function there that gets progressively realised as more and more photons strike. I think not. For instance, what if the scene illumination changed during the exposure? Suddenly your lambdas collapse and are replaced by new ones, and what you record is some average of the two; and in reality, of course, at some level the illumination is always changing during an exposure.
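
(As an aside, here is a minimal Python sketch of that last point, with made-up numbers: an exposure whose illumination changes halfway through produces counts that are statistically indistinguishable from an exposure at the constant time-averaged rate, so the recorded counts alone cannot tell you which 'lambdas' were really there.)

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100_000

# Hypothetical rates (photons per half-exposure), chosen only for illustration.
lam1, lam2 = 200.0, 400.0

# Illumination changes halfway through the exposure...
counts_changing = rng.poisson(lam1, n_trials) + rng.poisson(lam2, n_trials)

# ...versus constant illumination at the time-averaged rate for the whole exposure.
counts_constant = rng.poisson(lam1 + lam2, n_trials)

# Both are Poisson with mean 600: the counts cannot reveal which history produced them.
print(counts_changing.mean(), counts_changing.var())
print(counts_constant.mean(), counts_constant.var())
```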

We use a pixel based sensor to record a realization of this random process to estimate the lambda functions.

It does not 'record a realization'. The lambda function, which it estimates by averaging the count of photon events over an area, is an abstraction. The reality was the photon events.

I think Eric answered this question in his post within this thread.

No he did not, he just enumerated all the different things that put randomness into the photon events. In fact, just one thing is enough.

If we wished we could use an alternative mechanism to do the same only better. Imagine we had a sensor so advanced that it could record the exact position (to within the limits of the uncertainty principle) of each photon event in time and space, then transmit that with the photon energy to the camera's processor. That processor could reconstruct the output of any sensor of any pixel count and CFA characteristic.
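
(A minimal Python sketch of that idea, with an invented event list and ignoring wavelength and CFA colour for brevity: once the individual photon events have been recorded, they can be re-binned after the fact into the counts that any pixel grid would have seen.)

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented list of photon events from the idealized sensor: (x, y) positions
# on a unit image plane. Photon energy/wavelength is omitted for brevity.
events = rng.random((1_000_000, 2))

def bin_events(events, nx, ny):
    """Reconstruct the photon counts an nx-by-ny pixel grid would have recorded."""
    ix = np.clip((events[:, 0] * nx).astype(int), 0, nx - 1)
    iy = np.clip((events[:, 1] * ny).astype(int), 0, ny - 1)
    counts = np.zeros((ny, nx), dtype=np.int64)
    np.add.at(counts, (iy, ix), 1)
    return counts

# The same event list can be re-binned into any pixel count after the exposure.
print(bin_events(events, 6, 4).shape, bin_events(events, 60, 40).shape)
```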

Yes, if your idealized sensor is bound only by fundamental Heisenberg uncertainty it will be the best possible sensor, in the sense that no other sensor can convey more information about the scene observing the same light.

That is not actually the case. There is more information to be had, see for instance http://en.wikipedia.org/wiki/Angle-sensitive_pixel

But it cannot reconstruct exact pixel readings of other sensors or even of its own twin brother because the quantum mechanical events are probabilistic in nature.

That is a case of every image projected onto the sensor being unique, due to the random nature of the light projected onto it. This near-perfect sensor cannot produce the same image twice, nor can any sensor.

The probabilities of quantum events can be calculated but not the individual event outcomes. Actually it can reconstruct the lambdas obtained by other sensors.

No, it can't, because those calculated lambdas will always differ at some level of precision, due to the difference in the data from which they are calculated. Your statement is exactly the same as mine, 'that processor could reconstruct the output of any sensor of any pixel count and CFA characteristic', except that you have substituted 'lambdas' for 'output'. Just a change of name, and equally correct and incorrect depending on the level of precision at which you want to see it. The point is this: there is no way of determining what the 'lambdas' should have been; they are unobservable or, if you want, non-existent.

At each pixel we have obtained a sampled photon count which serves as an estimator for the lambda corresponding to this tiny pixel area. The accuracy of the estimation depends on the photon count observed: the higher the photon count, the smaller the error in using it in place of (lambda*t). Everyone notices this fact while observing that there is more photon shot noise in shadows than in highlights. Expose-to-the-right is the empirical way to say the same thing -- you expose so as to obtain as many photons in every pixel as possible without falling off the upper end of the DR in highlights.
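
(A minimal Python sketch of that scaling, with arbitrary mean counts: the relative error of the photon count as an estimate of lambda*t falls as roughly 1/sqrt(N), which is exactly why shadows look noisier than highlights and why exposing to the right helps.)

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials = 200_000

# Arbitrary mean photon counts standing in for a "shadow" and a "highlight" pixel.
for label, mean_count in [("shadow", 25), ("highlight", 2500)]:
    counts = rng.poisson(mean_count, n_trials)
    rel_error = counts.std() / counts.mean()   # shot-noise-limited relative error
    print(f"{label:9s} mean={counts.mean():7.1f} "
          f"rel. error={rel_error:.3f} (1/sqrt(N)={1/np.sqrt(mean_count):.3f})")
```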

The lambda is not the reality, it is an abstraction. And if you sample the reality with a greater precision, you can always estimate this function more precisely. This is simply a matter of information, the more information you have, the more precise a model you can make.

Lambda is the reality. For example, for black-body radiation it is the temperature (see below).

So you are suggesting that the temperature of the image plane is the same as that of the warm black body from which the light illuminating the scene in front of the lens originates, or what? Let us get this straight. If one had information on the nature of every light source lighting the scene, and the geometry and material characteristics of every object within the scene, we could construct a mathematical model of the light intensities at the image plane. That is exactly what we do with ray tracing in 3D modelling. But that is not the reality of any individual exposure. The reality is as Eric said: photons get emitted randomly, they bounce off whatever and whenever, and some of them hit the lens, get directed to the sensor and cause an individual, one-off, never-to-be-repeated pattern of photon strikes. If you can gather enough of them you can reconstruct a convincing model of the scene in front of the lens. But that convincing model is not the reality.

To summarize -- saying that there is no "mean photon count" (aka expected value) is wrong.

No-one said that there is no "mean photon count", there is a mean of any set of observations. But 'mean photon count' says nothing about 'expected value', because 'expected' is before the observation, the mean is after.

In Probability Theory "expected value" and "mean value" are synonyms. And so far we have not been talking about prior or posterior probabilities (maybe we should).

So they are in probability theory, but with a wider meaning more is revealed. As I said, what is 'expected' is what you expect before the observation (the clue is in the word). The mean is after the wave functions have collapsed, the observation has been made and the reality has been determined.
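
(A tiny Python sketch of the distinction, with an arbitrary lambda: the expected value is a fixed parameter of the distribution, available 'before' any observation, while the mean is computed from the observations afterwards and comes out slightly different every time.)

```python
import numpy as np

rng = np.random.default_rng(3)

lam = 100.0                          # the expected value: fixed before observing
for run in range(3):
    sample = rng.poisson(lam, 50)    # 50 observations of the same pixel
    print(run, sample.mean())        # the mean: known only after the observations
```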

These quantities are deterministic parameters of the photon count probability distribution. The camera is a device that tries to estimate these parameters (on pixel by pixel basis) from a given random process realization.

Which quantities are you now saying are the 'deterministic parameters of the photon count probability distribution'? What is actually happening is that photons are generated randomly by various quantum processes; they travel, reflect off things and end up hitting the sensor. If you had a model of their distribution and trajectories, and knew something about the scene, you could generate a model of the light energy distribution over the surface of the sensor, but any continuous function describing this is an abstraction from the reality.

You are ascribing to me words I never said!

I didn't ascribe any words to you.

Where did I say that photon shot noise originates inside pixels?

I never said that you said that.

Photon shot noise is a property of the photon flux and has nothing to do with the photon-to-electron conversion in the pixel. This is why the noise coming out of a pixel is nominally broken into shot-noise, read-noise, and dark-current components.

Good, we agree. The sensor does not affect the amount of shot noise in the image projected onto it.

Yes, this has been my understanding all along.

No, it gets catered for simply by multiplying the photon count by the QE.

Sorry, the act of photon absorption is not a deterministic event. Since the depth of absorption varies, there is a probability that a photon can escape. And since it is a probabilistic event, you cannot really tell the exact number of photons that were missed.

As Eric pointed out, for the purposes of calculation it gets catered for simply by multiplying the photon count by the QE. As Joofa pointed out, that is an approximation.
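
(A minimal Python sketch of that approximation, with an assumed QE and mean count: thinning a Poisson photon stream by the QE yields another Poisson stream with mean QE*lambda, which is why 'multiply by the QE' works on average, even though the incident count of any single exposure cannot be recovered exactly.)

```python
import numpy as np

rng = np.random.default_rng(4)
n_trials = 200_000
qe = 0.5          # assumed quantum efficiency
lam = 1000.0      # assumed mean incident photons per pixel per exposure

incident = rng.poisson(lam, n_trials)
detected = rng.binomial(incident, qe)   # each photon absorbed with probability QE

# The detected counts are again Poisson, with mean (and variance) qe*lam ...
print(detected.mean(), detected.var())
# ... but dividing back by the QE only estimates the incident count per exposure.
print(np.mean(np.abs(detected / qe - incident)))
```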

By effectively combining pixel photon counts and reducing photon shot noise along the way.

Yes, by collecting a greater number of photon events.
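
(A minimal Python sketch, with an arbitrary per-pixel mean: summing four pixels' counts, as in 2x2 binning, quadruples the collected photon events and halves the relative shot noise.)

```python
import numpy as np

rng = np.random.default_rng(5)
n_trials = 100_000
mean_per_pixel = 100.0   # arbitrary mean photon count per pixel

single = rng.poisson(mean_per_pixel, n_trials)
binned = rng.poisson(mean_per_pixel, (n_trials, 4)).sum(axis=1)   # 2x2 binning

# Relative shot noise (std/mean) halves when four pixels are combined.
print(single.std() / single.mean())   # ~ 1/sqrt(100) = 0.10
print(binned.std() / binned.mean())   # ~ 1/sqrt(400) = 0.05
```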

They will be different because the actual pattern of photon events is, and always will be, different. That's the reality, rather simpler than 'sampling different realizations of the same photon random process'.

You call it "pattern of photon events" while I call it "samples from the random photon process". We are talking about the same thing using different words.

No we are not. A 'pattern of photon events' is the observation that occurs, the sampling is the nature of the counting that you do. A discrete photon event counter would not be a sampled system. How you sample does not affect the pattern of photon events.

Here I was talking about ways to describe read noise, and the random process governing read noise happens in the pixel.

OK, go away and think about it. I have already thought about it.

Bob, this is what I call mildly insulting language (I refer to "OK, go away..."). If my postings irritate you, you are not obliged to respond.

Again, where is it 'mildly insulting'? I was simply repeating your own words (which you have conveniently clipped from the above): 'I have to think about it more'. Sorry if you think 'go away' is mildly insulting.


Bob
