Could you explain why the wavelength of light prevents a photon from being detected by photosites on the sensor, and how that relates to diffraction in terms of the f-number of the lens, the size of the micro lens, etc.?
I guess the pixel itself is an engineering measure of something that exists on the sensor (the photosite). I mean, it is not a one-to-one measure. Pixels in the image are, after all, constructed after the data goes through the processor. So a display pixel and the pixel pitch are different things.
Photosites detect light by a variant of the photoelectric process, which involves electrons populating higher energy levels (bands) when light of a suitable kind hits them. In theory, this means that wavelength as a physical length should have little to do with the functionality of a photosite: the process is discrete, with a photon exciting an electron. Obviously, if you shrink the photodiode too much, it will affect the nature of the lattice and its energy bands.
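To put a rough number on the "discrete excitation" picture: what matters per photon is its energy hc/λ compared with the band gap of the detector material. A minimal sketch, assuming the commonly quoted ~1.12 eV band gap for silicon at room temperature (illustrative values only):

```python
# Photon energy hc/lambda versus an assumed silicon band gap of ~1.12 eV.
# Wavelengths longer than roughly 1100 nm fall below the gap and are not
# detected by an ordinary silicon photosite.

h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
eV = 1.602e-19  # joules per electron-volt

silicon_band_gap_eV = 1.12  # assumed room-temperature value

for name, wavelength_nm in [("blue", 450), ("green", 550), ("red", 650), ("near-IR", 1100)]:
    energy_eV = h * c / (wavelength_nm * 1e-9) / eV
    print(f"{name:8s} {wavelength_nm:5d} nm -> {energy_eV:.2f} eV, "
          f"above band gap: {energy_eV > silicon_band_gap_eV}")
```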
And it is not as if the semiconductor industry can't shrink the photosite further. It very well can.
However, as I said, it does not really matter. Our aim is to obtain an image, not to make a crazy-spec sensor, and the image is limited by the optical path. Optical resolution is limited by the wavelength of light (no doubt here, really).
This is the same reason why optical microscopes, in general, don't resolve beyond the wavelength of visible light. I won't discuss the specialized illumination and computation used to increase resolution in super-resolution microscopes. A microscope using natural white light won't resolve detail much finer than about 1 micron in practice.
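The usual back-of-the-envelope number for the microscope case comes from the Abbe limit, d ≈ λ / (2·NA). A quick sketch, assuming green light and a few illustrative numerical apertures (just assumed values, not any particular objective):

```python
# Abbe diffraction limit for an optical microscope: d ~= lambda / (2 * NA).
# NA values are assumed examples; higher NA (e.g. oil immersion) resolves finer.

wavelength_nm = 550  # green light, middle of the visible band

for na in (0.25, 0.65, 0.95, 1.4):
    d_nm = wavelength_nm / (2 * na)
    print(f"NA = {na:4.2f} -> smallest resolvable separation ~ {d_nm:.0f} nm")
```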
Thus, I go back to the original argument. The limit here is the optical path, from the lens to the UV-IR filter to the micro lenses. It is very difficult to create an optical path that faithfully reproduces detail on an area of the sensor smaller than 1 micron square, no matter how good the glass is. Even small specks of dust on the sensor significantly damage resolution at this level. In other words, do we really gain anything by chasing an ever-shrinking photosite? Not at all. At least we as photographers don't. Physicists and scientists might want one for their high-energy experiments or whatever, but consumers don't.
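For scale: the first-zero diameter of the Airy pattern at the sensor is roughly 2.44·λ·N for f-number N, so even at moderate apertures the diffraction spot is already several times the size of a hypothetical 1 µm photosite. A rough sketch, assuming green light and that 1 µm pitch:

```python
# Airy disk diameter at the sensor: d ~= 2.44 * lambda * N (to the first dark ring).
# The 1 um photosite is the hypothetical size from the discussion above.

wavelength_um = 0.55  # green light, micrometres
pitch_um = 1.0        # hypothetical tiny photosite

for f_number in (1.4, 2.8, 5.6, 11):
    airy_um = 2.44 * wavelength_um * f_number
    print(f"f/{f_number:<4} Airy diameter ~ {airy_um:4.1f} um "
          f"(~{airy_um / pitch_um:.1f} x a {pitch_um:.0f} um photosite)")
```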
In other words, when does the diffraction limit of the lens dominate, so that increasing sensor resolution has no impact on system MTF?
That's a good point. For several lenses, high-resolution sensors (even 45 MP!) reveal smearing in the corners. There is no point in increasing resolution further, as we don't get any more useful data.
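To put numbers on the question above: compare the sensor's Nyquist frequency with the cutoff frequency 1/(λ·N) at which an ideal, aberration-free lens delivers zero contrast. A sketch with assumed figures (an 8256-pixel-wide, roughly 45 MP full-frame layout and green light); real lenses with smeared corners give up well before this ideal limit:

```python
# Pixel pitch and Nyquist frequency for an assumed 45 MP full-frame layout,
# compared with the diffraction MTF cutoff 1 / (lambda * N) of an ideal lens.

sensor_width_mm = 36.0
pixels_across = 8256       # assumed ~45 MP layout (8256 x 5504)
wavelength_mm = 550e-6     # green light, in millimetres

pitch_mm = sensor_width_mm / pixels_across
nyquist_lp_mm = 1 / (2 * pitch_mm)
print(f"pixel pitch ~ {pitch_mm * 1000:.2f} um, Nyquist ~ {nyquist_lp_mm:.0f} lp/mm")

for f_number in (4, 8, 11, 16):
    cutoff_lp_mm = 1 / (wavelength_mm * f_number)  # contrast reaches zero here
    print(f"f/{f_number:<3} ideal diffraction cutoff ~ {cutoff_lp_mm:.0f} lp/mm")
```

By around f/16 the ideal diffraction cutoff drops to roughly the Nyquist frequency of this assumed 45 MP layout, which is the regime where adding sensor resolution stops buying anything.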
Nikon has a nice article here:
The resolution limitations in microscopy are often referred to as the diffraction barrier, which restricts the ability of optical instruments to distinguish between two objects separated by a lateral distance less than approximately half the wavelength of light used to image the specimen.
www.microscopyu.com
Wave-particle duality does not mean that a photon is the size of its wavelength. It’s not like an inkjet printer.
True. I like to think of it this way: when light interacts with matter to give or take energy, the particle model is good. Otherwise, the wave model works fine and there is little, if any, need for the photon model. It is a fair rule of thumb.