When sensors hit quantum limits

  • Thread starter: kamerakiri
Could you explain why the wavelength of light prevents a photon from being detected by photosites on the sensor and how that relates to diffraction in terms of f-number of the lens, size of the micro lens etc.

In other words, when does the diffraction limit on the lens dominate, so that increasing sensor resolution has no impact on system MTF.

Wave-particle duality does not mean that a photon is the size of its wavelength. It’s not like an inkjet printer.

Thanks

Andrew

--
Infinite are the arguments of mages. Truth is a jewel with many facets. Ursula K. Le Guin
Please feel free to edit any images that I post
 
Could you explain why the wavelength of light prevents a photon from being detected by photosites on the sensor and how that relates to diffraction in terms of f-number of the lens, size of the micro lens etc.
I'd say a pixel is an engineering abstraction over something physical on the sensor (the photosite); it is not a one-to-one measure. The pixels in an image are constructed after the data goes through the processor, so a display pixel and the pixel pitch are different things.

Photosites detect light through a variant of the photoelectric process, in which electrons are promoted to higher energy levels (bands) when light of sufficient energy hits them. In principle, that means wavelength as a physical size should have little to do with how a photosite functions; the process is discrete, with one photon exciting one electron. Obviously, if you shrink the photodiode too much you start affecting the nature of the lattice and its energy bands.
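To put the discrete-excitation point in rough numbers, here is a minimal sketch (plain Python; the 1.12 eV figure is an assumed typical bandgap for a silicon photodiode) comparing single-photon energies with the energy needed to lift an electron across that gap:

```python
# Rough sketch: photon energy vs. the ~1.12 eV bandgap of silicon
# (assumed typical value for a CMOS photodiode). A photon is absorbed
# and excites one electron only if its energy exceeds the bandgap;
# the photon's physical "size" never enters the calculation.
H = 6.626e-34        # Planck constant, J*s
C = 2.998e8          # speed of light, m/s
EV = 1.602e-19       # joules per electron-volt
SI_BANDGAP_EV = 1.12 # silicon bandgap (assumed), eV

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy of a single photon of the given wavelength, in eV."""
    return H * C / (wavelength_nm * 1e-9) / EV

for wl in (450, 550, 650, 950):  # blue, green, red, near-IR
    e = photon_energy_ev(wl)
    print(f"{wl} nm -> {e:.2f} eV "
          f"({'detected' if e > SI_BANDGAP_EV else 'too weak'})")
```

Any visible photon comfortably clears the gap; only its energy matters, not any notion of its physical size.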

And it is not as if the semiconductor industry can't shrink the photosite; it very well can.

However, as I said, it does not really matter. Our aim is to obtain an image, not to build a crazy-spec sensor, and the image is limited by the optical path. Optical resolution is limited by the wavelength of light (no real doubt there).

This is the same reason optical microscopes don't, in general, resolve much beyond the wavelength of visible light. I won't get into the specialized illumination and computational tricks used by super-resolution microscopes. A microscope using ordinary white light won't resolve detail finer than about 1 micron in practice.
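For the microscope comparison, the usual back-of-the-envelope figure is the Abbe limit, d ≈ λ / (2·NA). A quick sketch, with assumed numerical apertures for typical objectives:

```python
# Abbe diffraction limit d = wavelength / (2 * NA) for white-light
# microscopy. The NA values below are assumptions for typical objectives.
def abbe_limit_um(wavelength_nm: float, numerical_aperture: float) -> float:
    return (wavelength_nm / (2 * numerical_aperture)) / 1000.0

for name, na in (("10x dry, NA 0.25", 0.25), ("40x dry, NA 0.65", 0.65),
                 ("100x oil, NA 1.40", 1.40)):
    print(f"{name}: ~{abbe_limit_um(550, na):.2f} um at 550 nm")
```

Ordinary dry objectives land around the micron scale quoted above; only high-NA immersion objectives approach ~0.2 µm, and then only under ideal conditions.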

Thus, I go back to the original argument: the limit here is the optical path, from the lens to the UV-IR filter to the microlenses. It is very difficult to build an optical path that faithfully delivers detail onto an area of the sensor smaller than about a micron, no matter how good the glass is; even small specks of dust on the sensor do significant damage to resolution at that scale. In other words, do we really gain anything by chasing an ever-shrinking photosite? Not really, at least not as photographers. Physicists and scientists might want one for their high-energy experiments or whatever, but consumers don't.
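To tie this back to the f-number part of the question: for a diffraction-limited lens, the Airy disk (first-minimum) diameter on the sensor is roughly 2.44·λ·N. A minimal sketch, assuming green light and a hypothetical 1 µm photosite for comparison:

```python
# Airy disk diameter (first minimum) ~ 2.44 * wavelength * f-number.
# The 1.0 um photosite below is a hypothetical value for comparison.
WAVELENGTH_UM = 0.55   # green light
PHOTOSITE_UM = 1.0     # hypothetical tiny photosite

for f_number in (1.4, 2.8, 5.6, 8, 11):
    airy_um = 2.44 * WAVELENGTH_UM * f_number
    print(f"f/{f_number}: Airy disk ~{airy_um:.1f} um "
          f"(~{airy_um / PHOTOSITE_UM:.1f}x the photosite)")
```

Even wide open, the diffraction blur is already about twice the size of such a photosite, which is the "optical path is the limit" point in numbers.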
In other words, when does the diffraction limit on the lens dominate, so that increasing sensor resolution has no impact on system MTF.
That's a good point. With several lenses, high-resolution sensors (even 45MP!) already reveal smearing in the corners. There is no point in increasing resolution further when we don't get any more useful data.
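One way to see where the lens starts to dominate is to multiply a diffraction-limited lens MTF by the pixel-aperture MTF and read off the value at the sensor's Nyquist frequency. A sketch under simplifying assumptions (ideal diffraction-limited lens, 100% fill-factor square pixels, no AA filter or demosaicing, and a 4.35 µm pitch as a stand-in for a 45MP full-frame sensor):

```python
import math

# System MTF sketch: diffraction-limited lens MTF x square-pixel
# aperture MTF, evaluated at the sensor's Nyquist frequency.
# Assumptions: ideal lens, 100% fill factor, green light, 4.35 um
# pitch (roughly a 45MP full-frame sensor), no AA filter or demosaic.
WAVELENGTH_MM = 550e-6   # 550 nm in mm
PITCH_MM = 4.35e-3       # 4.35 um in mm

def lens_mtf(freq_lpmm: float, f_number: float) -> float:
    cutoff = 1.0 / (WAVELENGTH_MM * f_number)     # diffraction cutoff, lp/mm
    if freq_lpmm >= cutoff:
        return 0.0
    x = freq_lpmm / cutoff
    return (2 / math.pi) * (math.acos(x) - x * math.sqrt(1 - x * x))

def pixel_mtf(freq_lpmm: float) -> float:
    x = math.pi * freq_lpmm * PITCH_MM
    return 1.0 if x == 0 else abs(math.sin(x) / x)

nyquist = 1.0 / (2 * PITCH_MM)                    # ~115 lp/mm
for n in (2.8, 5.6, 8, 11, 16):
    system = lens_mtf(nyquist, n) * pixel_mtf(nyquist)
    print(f"f/{n}: lens MTF at Nyquist {lens_mtf(nyquist, n):.2f}, "
          f"system {system:.2f}")
```

Once the lens contribution at Nyquist falls to zero (around f/16 in this sketch), adding more pixels no longer adds anything to system MTF.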

Nikon has a nice article here:

Wave-particle duality does not mean that a photon is the size of its wavelength. It’s not like an inkjet printer.
True. I like to think of it this way: when light interacts with matter to give or take energy, the particle model is the right one; otherwise the wave model works fine and there is rarely, if ever, any need for the photon picture. It is a fair rule of thumb.
 
The human eye has the equivalent of 576MP and a dynamic range of 100 million to one. Are you trying to say that our prized, advanced silicon technology still isn't close to matching our eye and the processing power of our brain, made mostly of water, carbon and nitrogen?

https://clarkvision.com/imagedetail/eye-resolution.html

:)
You have cherry-picked a number; that's not actually what our eyes can resolve in terms of fine detail. I bet you can't read a newspaper out of the corner of your eye :-) Even a car number plate is impossible :-)

Ds
DxOMark only tested the P-3's 12MP sensor at 10 stops of dynamic range. These scenes didn't pop as much in person as they do in the photos I took.

[four image attachments]

All those were shot in Normal compression JPEG.

Newer 16MP sensors have almost 13 stops of dynamic range, and the 20MP sensors a little more.
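For anyone who wants those stops translated into a plain contrast ratio (each stop is just a doubling; the figures below restate the numbers quoted above):

```python
# Dynamic range in stops -> scene contrast ratio (each stop is a doubling).
for label, stops in (("older 12MP sensor", 10), ("newer 16MP sensor", 13)):
    print(f"{label}: {stops} stops ~ {2 ** stops:,}:1")
```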

This old building isn’t as pretty in person as in this photo.

[image attachment]

I’m all for better and I’m rooting for the sweet, bright glorious tomorrow.

But until then we aren't without good gear, you know?
True. Our displays are mostly 8 bits per channel and they work fine; we don't even fully use the 12- or 14-bit depth in our images. Our tools are already good enough, so we no longer need to fear that what we have will become outdated. We have to learn to love photography itself.
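As a quick illustration of the 8-bit versus 12/14-bit point (this only counts quantization levels per channel, not dynamic range):

```python
# Tonal levels per channel at common bit depths. Extra raw bits mostly
# buy editing headroom; an 8-bit display still shows only 256 levels.
for bits in (8, 10, 12, 14):
    print(f"{bits}-bit: {2 ** bits:,} levels per channel")
```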

A good analogy: at one time in history, before the dawn of modern chemistry, making paint was a difficult process. A painter had to make his own paints, which is why many older paintings use a limited palette. Today most $20 paint sets exceed the old paints in quality, tones, color fidelity and archival properties (exotic mineral pigments aside). So now we don't need to fight over brushes, tools or color; we can relax, paint and just love the process. Saturation in technology is a good thing at times.
 
The resolution of good lenses is not much more than 100 line pairs per mm, which corresponds to features of about 5 microns. And that is for really good, expensive glass.

A 50MP resolution on MFT is already overkill for most lenses in the system.
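A quick sanity check of that, assuming a 17.3 × 13 mm Four Thirds sensor and a hypothetical 50MP, 4:3 pixel grid:

```python
import math

# Hypothetical 50MP Micro Four Thirds sensor (17.3 x 13 mm, 4:3):
# pixel pitch and Nyquist frequency versus a ~100 lp/mm lens.
SENSOR_WIDTH_MM = 17.3
PIXELS = 50e6
width_px = math.sqrt(PIXELS * 4 / 3)          # ~8165 px across
pitch_mm = SENSOR_WIDTH_MM / width_px         # ~0.0021 mm (~2.1 um)
nyquist_lpmm = 1 / (2 * pitch_mm)             # ~236 lp/mm

print(f"pitch ~{pitch_mm * 1000:.1f} um, Nyquist ~{nyquist_lpmm:.0f} lp/mm "
      f"vs ~100 lp/mm from a very good lens")
```

The sensor's Nyquist frequency would sit well above what even excellent glass delivers, which is the overkill being described.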
 
