Unlimited Dynamic Range

Seems to me that, rather than focusing on the sensor and on-board processing to control overexposure, wouldn't it be simpler to place the burden on the lens? How about a coating that responds to overexposed areas and reduces the transmission of that part of the picture?
--
Don't Forget to Remove the Lens Cap
 
Seems to me that, rather than focusing on the sensor and on-board
processing to control overexposure, wouldn't it be simpler to place
the burden on the lens? How about a coating that responds to
overexposed areas and reduces the transmission of that part of the
picture?
--
Don't Forget to Remove the Lens Cap
Optical theory rules that out: all of the lens is used for each part of the image, not just one portion of the lens per pixel.

Now, if it could be made to work quickly enough (very doubtful), the micro-lenses of the sensor could have 'Reactolite' properties and truly increase DR, or rather the ability to handle highlights.
 
I just doubt seriously - with my very own very limited knowledge of
the technology - that such a triggering / reporting mechanism could
work at all. If I understand the basic technology correctly, there is
no chance of detecting that a photosite is "full": first you collect
photons; then you convert them to a very weak electric current using
a photodiode; then you have to amplify this weak current to be able
to read it (we're all analog up to this point - no possibility to
"store" or "copy" any info for further use!). That means you can't do
a check every millisecond on each photodiode to see if the maximum
number of photons has been collected - the act of reading the values
itself empties the "buckets", and they have to be filled up from the
start for the next reading. And for a single photosite to report to
the ADC, it must be read somehow ...

Sure hope I make myself clear, and that somebody with more insight
into this technology can verify whether my line of thinking is
correct.
I think I follow. I was thinking more along the lines of creating a sensor where the photosites initiate the "I'm full" message. Based on the time this message was received by the ADC (relative to the start of the exposure), the camera could perform the calculations I used in my example above. Note that I have no idea how to design such a photosite, or how to write an algorithm to manage the log-jam of overexposed-photosite messages that would flood the ADC.
 
Interesting. You could have a file format that goes something like
R, G, B, and then R Clip, G Clip, B Clip, assuming the clip time can
be recorded... However, would it also then be possible to have an
exposure mode that waits till all pixels have been clipped, then
re-aligns them in firmware? The problem there is that the exposure
time is too long for anything but still objects.
True, but that assumes that you wait for all the pixels to overexpose. What I'm proposing is that the overexposed pixels "jump the queue" and report their status to the ADC, and that the remaining pixels (e.g. shadow areas) send a signal to the ADC at the end of the exposure (as with current sensors). Sorry if I wasn't explicit enough in my earlier posts.
 
for what could easily be obtained simply by doing two quick exposures
with an electronic shutter and processing the HDR automatically
in-camera. But the files would be massive and post-processing very
necessary.
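That two-exposure merge can be sketched in a few lines (a toy Python mock-up, not any camera's actual pipeline; the exposure ratio, clip level, and function name are all illustrative assumptions):

```python
# Toy mock-up of an in-camera two-exposure HDR merge. Assumes two
# linear raw pixel values: `short_px` from a frame shot at 1/4 the
# exposure time of `long_px`, both on a 0-255 scale. All names and
# numbers are illustrative.

def merge_hdr(long_px, short_px, ratio=4, clip=255):
    """Combine one pixel from each frame into a linear radiance value."""
    if long_px < clip:
        return long_px          # long frame not clipped: use it directly
    return short_px * ratio     # clipped: scale the short exposure up

print(merge_hdr(255, 200))  # blown in the long frame: recovered as 800
print(merge_hdr(128, 32))   # midtone: long frame used as-is -> 128
```

A real implementation would blend smoothly near the clip point rather than switch abruptly, and would need the two frames aligned.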

Although unlimited dynamic range sounds great (we'd never have to set
exposure again!), sometimes it's nice to have all your pixels in a
bunch — monitors, prints, and human eyes do not exhibit unlimited
dynamic range.
Agreed. This idea would not necessarily be for everyone, but I do think there would be a lot of avid photographers (and professionals), who would value the enhanced control over how the actual dynamic range of a scene is represented (compressed, really) into the dynamic range of a computer screen or a print.
 
The main difficulty with any timed-overflow sort of design is that,
while effective at preserving highlights, it is terrible at capturing
dark shadow detail. You either need a hybrid scheme where the
captured charge in the photocell is read along with the overflow time
(if any), or you need to expose long enough for enough photons to hit
(practically) every photosite to cause it to overflow. It also means
that your effective shutter speed varies for different portions of
the image, which could cause interesting artefacts.
Exactly - for each photosite, you would need two pieces of information:

1) Exposure value (0-255 in my example above)

2) Time to Full (this variable would be either the actual time for overexposed photosites or equal to some maximum value for photosites that were properly exposed).

I've got no idea what potential artifacts might look like (if any) - but I'm not sure they can be predicted without a working prototype.
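The two-variable readout above could be mocked up like this (a hypothetical Python sketch using the 8-bit numbers from the earlier example; the names, sentinel, and units are invented, not a real sensor interface):

```python
# Minimal sketch of the two-variable readout: each photosite reports
# an 8-bit value plus a "time to full". Names and units are
# illustrative only.

EXPOSURE = 1 / 125        # programmed shutter time in seconds
MAX_VALUE = 255           # 8-bit saturation level
TIME_NOT_FULL = EXPOSURE  # sentinel: photosite never saturated

def linear_value(value, time_to_full):
    """Estimate true scene brightness from the pair of readings."""
    if time_to_full >= TIME_NOT_FULL:
        return value                   # properly exposed: value is correct
    # Saturated early: extrapolate assuming a constant photon rate.
    return MAX_VALUE * (EXPOSURE / time_to_full)

print(linear_value(180, EXPOSURE))      # 180: normal pixel
print(linear_value(255, EXPOSURE / 2))  # 510.0: saturated at half time
```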
 
Seems to me that, rather than focusing on the sensor and on-board
processing to control overexposure, wouldn't it be simpler to place
the burden on the lens? How about a coating that responds to
overexposed areas and reduces the transmission of that part of the
picture?
Interesting. I'm not sure how the coating on the lens would know what aperture the lens was set at (let alone what ISO or shutter speed the camera was using).
 
Seems to me that, rather than focusing on the sensor and on-board
processing to control overexposure, wouldn't it be simpler to place
the burden on the lens? How about a coating that responds to
overexposed areas and reduces the transmission of that part of the
picture?
Interesting. I'm not sure how the coating on the lens would know
what aperture the lens was set at (let alone what ISO or shutter
speed the camera was using).
No, the light reaching a particular part of the sensor has passed through lots of different parts of the lens - and correspondingly, the light passing through a particular part of the lens is aimed at lots of different parts of the sensor.

You'd need an antialiasing filter material (right on the sensor surface, where the light is focused) that darkened predictably and locally when illuminated, filtering each photosite optimally like light-reactive sunglasses.

Then you'd need a means of recording the different darkenings immediately after the exposure, before they faded, so that this filtration could be re-applied to the data in reverse. Maybe close the mirror and strobe an internal light?

Then you'd need a means of quickly making all these light-reactive filters clear again, ready for the next shot. So this could probably not work with Live View.
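For what it's worth, the "re-apply in reverse" step would amount to a per-photosite division, assuming the darkening could be recorded at all (a toy sketch; the transmission values are invented):

```python
# Toy version of reversing the light-reactive filtration: if each
# photosite's darkening could be recorded as a transmission factor
# T (0..1), the true light is recovered by dividing it back out.
# All values are illustrative.

def undo_filter(measured, transmission):
    """Invert the light-reactive filtration for one photosite."""
    return measured / transmission

print(undo_filter(200, 0.25))  # 800.0: a highlight the filter tamed
print(undo_filter(100, 1.0))   # 100.0: shadow light passed unfiltered
```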

RP
 
Some really good discussion on this topic and lots of different ideas/perspectives coming to the table. Hopefully someone with the resources to make this happen is listening!

Thanks again everyone!
 
for what could easily be obtained simply by doing two quick exposures
with an electronic shutter and processing the HDR automatically
in-camera.
I think this will be the way it initially goes. If the exposures were quick enough, it would work as SR as well.
But the files would be massive and post-processing very
necessary.
No more than raw already is.
Although unlimited dynamic range sounds great (we'd never have to set
exposure again!), sometimes it's nice to have all your pixels in a
bunch — monitors, prints, and human eyes do not exhibit unlimited
dynamic range.

-Matt
No, but it allows for ultra clean and rich shadow detail, which would put compacts on an even level with DSLRs for noise.

--

Through the window in the wall
Come streaming in on sunlight wings
A million bright ambassadors of morning
 
Thinking a bit more about your basic concept – there may be a problem:

Measuring the time of the overflow to calculate the additional exposure is fine as long as the RATE of photons is constant, but this is not always the case. One obvious example is flash photography: the shutter may be open for, say, 1/180 sec, but the length of the exposure from the flash may be only, say, 1/10000 sec. I think this would really screw things up, and there are many situations where the light from just parts of the image lasts much less time than the shutter time: panning images, long exposures, etc.
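The flash objection can be put into numbers (a toy Python calculation using the 1/180 s shutter and 1/10000 s flash above; the saturation point is invented):

```python
# Numeric sketch of the flash problem: the time-to-full extrapolation
# assumes a constant photon rate, but a flash delivers nearly all its
# light in a brief burst. Numbers are illustrative.

SHUTTER = 1 / 180       # shutter open time (s)
FLASH = 1 / 10000       # flash burst duration (s)
MAX_VALUE = 255

# Suppose a pixel saturates halfway through the flash burst.
time_to_full = FLASH / 2

# Constant-rate extrapolation (the proposed scheme):
estimate = MAX_VALUE * (SHUTTER / time_to_full)
# True brightness: only twice the saturation level actually arrived.
actual = MAX_VALUE * (FLASH / time_to_full)

print(round(estimate))  # 28333: wildly overestimated
print(round(actual))    # 510
```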

I think this could be a major problem for your idea.

Dave.
 
Your plan would work in theory, although the Samsung method is easier (flush the photosite as it fills and just count how many times it fills up). In the latter case all you need is a counter, not a clock (easier to do and less real estate used up).

The other problem is storage. In a 12-bit RAW file you can specify 2^12 "brightness" levels (4096). In a very high DR image your eye could probably pick out 20X as many levels, so at what point do you decide that a highlight is more important than a midtone or a shadow when you compress it? Where do you let the information go?

OK, so why not have 24-bit RAW files? Because you'd have huge files and a very slow camera; plus, if you are printing the output, you still have to compress this into a far lower range at some point because prints themselves have limited DR (though of course you could choose yourself which contrast curve to apply).

So yes, it is possible to extend the range of a sensor, but if you expand more than a couple of stops you soon run into storage issues. However, even 2 stops of improved highlight sensitivity would be great and would make digital exposure far more tolerant.
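The flush-and-count reconstruction would amount to something like this (a minimal Python sketch; the full-well value and the readings are illustrative):

```python
# Sketch of the flush-and-count scheme: the photosite resets itself
# each time it fills and a per-pixel counter records how many times.
# The capacity and readings are illustrative.

FULL_WELL = 255   # charge at which the photosite flushes itself

def reconstruct(flush_count, residual):
    """Total light = complete wells flushed + charge left at readout."""
    return flush_count * FULL_WELL + residual

print(reconstruct(0, 200))  # 200: never filled, behaves like today
print(reconstruct(3, 120))  # 885: a bright pixel, no clock needed
```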
--
Steve
When I can master technique I'll be a photographer.
When I can realise a vision I'll be an artist.
When I get paid I'll be a professional.
 
For a start, the lens-based scheme mentioned above is just not possible without some amazing optical discoveries.

The CMOS (or CCD) sensors are analog. There is no digital value until the readout is performed and the values are passed to the ADC. AFAIK the pixels are clocked out to the ADC sequentially, and there is at most one ADC per column/row. Hence individual pixels cannot be read with any current technology.

Note that 14-bit ADCs are relatively large and power-hungry (plus there are latency issues to deal with). You might be able to use lower-resolution ADCs and oversample in the time domain, but how many photons do you lose whilst resetting the detector and converter? This might differ between adjacent pixels, effectively resulting in increased noise. The shot noise due to the statistical variation in the time of arrival of photons would also place a limit on the maximum SNR that would be achievable (I think).
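The shot-noise point can be checked with back-of-envelope arithmetic (Python; the photon counts are illustrative, and sqrt(N) is the standard Poisson limit):

```python
# Back-of-envelope check of the shot-noise limit: photon arrival is
# Poisson, so a photosite collecting N photons has at best an SNR of
# sqrt(N), however clever the readout. Photon counts are illustrative.

import math

for n in (100, 10_000, 40_000):  # 40k ~ a small-pixel full well
    snr = math.sqrt(n)           # Poisson: signal N, noise sqrt(N)
    print(n, round(snr), round(20 * math.log10(snr), 1), "dB")
```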

I'm not saying that individual readout is impossible, just that it is currently impractical and has a number of theoretical hurdles to overcome.

If it were feasible, then surely someone at Dalsa, Sony, Matsushita, Nikon, Canon, Samsung, Fujifilm or Micron would have thought of it :)

However, quantum photodetectors that detect the energy (and hence frequency, i.e. colour) of individual photons - now that would be exciting (and I think they do already exist).

Laurens
 
Your plan would work in theory, although the Samsung method is easier
(flush the photosite as it fills and just count how many times it
fills up). In the latter case all you need is a counter not a clock
(easier to do and less real estate used up).
But then the whole thing needs to work asynchronously.
The other problem is storage. In a 12-bit RAW file you can specify
2^12 "brightness" levels (4096). In a very high DR image your eye
could probably pick out 20X as many levels, so at what point do you
decide that a highlight is more important than a midtone or a shadow
when you compress it? Where do you let the information go?

OK, so why not have 24-bit RAW files? Because you'd have huge files
and a very slow camera; plus, if you are printing the output, you
still have to compress this into a far lower range at some point
because prints themselves have limited DR (though of course you could
choose yourself which contrast curve to apply).
Each time you flush the buffer you add (up to) 1 stop (= 1 bit), so unless you're in the habit of taking pictures of nuclear explosions there is no need for 24-bit data :)
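A quick check of the stops-vs-bits arithmetic (Python; assumes a 12-bit base readout and that the flush counter simply multiplies the representable charge, so each doubling of capacity adds one stop):

```python
# Stops-vs-bits arithmetic for a flush counter: the counter multiplies
# the charge a photosite can represent, so highlight headroom grows
# with the log of the flush count. Assumes a 12-bit base readout.

import math

BASE_BITS = 12
for flushes in (0, 1, 3, 7):
    max_signal = (flushes + 1) * (2 ** BASE_BITS - 1)
    stops = math.log2(max_signal + 1)
    print(flushes, round(stops, 1))  # each doubling adds ~1 stop
```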

Laurens
 
Here's an idea for a sensor/firmware-based solution that would allow
a camera to capture unlimited dynamic range.
[snip]
Here's my idea: If the sensor and firmware were able to capture and
interpret both the amount of light captured during the exposure AND
the time it took to capture that amount of light, the full dynamic
range in the scene could be estimated by the firmware when producing
the image. Here's an example to show how I envision this working
(I'll use 8-bit values to represent the amount of light captured just
to illustrate the concept):

Current Technology - A group of photosites reaches the maximum value
of 255 during a 1/125 second exposure. The firmware translates this
into pure white in the final file, resulting in blown highlights.

Proposed Technology - The same group of photosites and light exposure
as above, but the sensor/firmware combination also records the fact
that the maximum value of 255 was reached in 1/250th of a second.
This would allow the firmware to estimate that the actual amount of
light at that point is approximately 510 (since the maximum value was
reached in half the exposure time). This information could then be
recorded in the RAW file. Tonal curves could be applied to
manipulate the final output either during post-processing, or
immediately by the firmware to produce JPEGs.

The result of this technology should be greatly-enhanced detail in
the high-key areas of a scene, resulting in much greater dynamic
range. It should also allow for much greater control over how bright
areas are represented in the final image. Overexposure could become
a relic of the past!
[snip]

What you have described is, approximately, a CID detector (CID = Charge Injection Device). These have been around since the 1970s, almost as long as CCDs, and share the same photovoltaic principle as CCD and CMOS detectors. However, they differ fundamentally from both CCDs and CMOS in their on-chip electrical circuitry. The idea is that the charge is nondestructively read out, allowing continued accumulation of photoelectrons. Thus, by repeated readouts during an exposure, the time to saturation for bright pixels can be used to determine their brightness, and the exposure can be continued to allow even fairly dark pixels to have a good number of photoelectrons. Repeated read-out has the added advantage that read-out noise can be attenuated, but even so, CIDs tend to have higher overall noise levels than CMOS or CCD.

CIDs are indeed used to record images with very high contrast and with detail at greatly different brightness levels. They have been the "next thing coming" more than once, but prices remain stubbornly high due to manufacturing issues (multiply the price of a CCD camera with a similar-sized detector by about 10 to 20). Their use is mostly in scientific and forensic imaging, often with multi-stage cooling or coupled to an image intensifier (Microchannel Plate).
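The repeated nondestructive readout can be caricatured in a few lines (Python; the rates, readout interval, and saturation level are all invented for illustration):

```python
# Caricature of the CID readout strategy: charge is read
# nondestructively every millisecond, so a bright pixel's brightness
# comes from its time to saturation while a dark pixel just keeps
# integrating. Rates, interval, and saturation level are invented.

SATURATION = 255
READ_INTERVAL = 0.001  # nondestructive readout period (s)

def expose(photon_rate, total_time):
    """Return (value, time): time is when saturation was first seen."""
    charge = 0.0
    for i in range(1, round(total_time / READ_INTERVAL) + 1):
        t = i * READ_INTERVAL
        charge = min(photon_rate * t, SATURATION)
        if charge >= SATURATION:
            return SATURATION, t  # bright pixel: record when it filled
    return charge, total_time     # dark pixel: kept integrating

print(expose(50_000, 0.1))  # saturates at ~0.006 s
print(expose(500, 0.1))     # ends the exposure at ~50 of 255
```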

http://www.cis.rit.edu/research/CID/a_cid_is.htm

--

Horses are good subjects for photography, but terrible platforms for a photographer.
 
Sorry, I should have changed the title in my post above (that this is replying to).
--

Horses are good subjects for photography, but terrible platforms for a photographer.
 
Thinking a bit more about your basic concept – there may be a problem:

Measuring the time of the overflow to calculate the additional
exposure is fine as long as the RATE of photons is constant, but this
is not always the case. One obvious example is flash photography: the
shutter may be open for, say, 1/180 sec, but the length of the
exposure from the flash may be only, say, 1/10000 sec. I think this
would really screw things up, and there are many situations where the
light from just parts of the image lasts much less time than the
shutter time: panning images, long exposures, etc.

I think this could be a major problem for your idea.
Great point! I didn't think about that one. However, the information in the RAW file could still be interpreted by a RAW converter the way it is today (i.e. ignore the "time to full" variable), such that the blown highlights were left blown. Also, this problem would only apply to photosites that were overexposed by the flash (e.g. pointing your camera-mounted flash directly at a mirror and firing). Properly exposed photosites would be fine.
 
