Nearing the physical limits of sensor sensitivity

zerozeronine · Contributing Member · Posts: 595
Feb 2, 2011

Some of you guys know a lot about sensor engineering, so I was hoping you could shed some light on this (Doug, John...?).

I saw somewhere that quantum efficiency is around 25-50% for most camera sensors. That means that, at best, we can only double or quadruple our photon capture. Given that Sony's new sensors seem to be significantly more sensitive than Canon's best (just a rough eyeballing of someone's dark-shot comparison in this forum between the Nikon D7000 and the 5D II), they may be getting pretty close to maximum efficiency.
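
To put a rough number on how much that headroom is worth, here's a minimal sketch; the photon count and QE values are assumptions for illustration, not measurements of any camera. Shot-noise-limited SNR scales with the square root of the captured signal, so going from 25% QE to the physical ceiling of 100% only gains about one stop of noise performance.

```python
# Minimal sketch: headroom left if QE is the only thing that improves.
# Numbers are illustrative assumptions, not measured values for any camera.
import math

photons_at_sensor = 10_000          # photons reaching one pixel during the exposure (assumed)

for qe in (0.25, 0.50, 1.00):       # 25%, 50%, and the physical ceiling of 100%
    electrons = photons_at_sensor * qe
    snr = electrons / math.sqrt(electrons)        # shot-noise-limited SNR = sqrt(signal)
    stops_vs_25 = math.log2(math.sqrt(qe / 0.25)) # SNR gain in stops relative to 25% QE
    print(f"QE {qe:4.0%}: SNR {snr:6.1f}  (+{stops_vs_25:.1f} stops vs. 25% QE)")
```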

Once we reach that limit, read-out noise will be the limiting factor, which is an area where Sony also excels. If so, it's neat to think that we're at the point where the limitations of form factors are dictated by nature. That is, we'll be able to make statements like: if you want a certain level of quality at a given shutter speed, you have to go with full frame or bigger. People can stop hoping for technology to deliver something like a camera phone that takes great photos in near darkness.
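
As a back-of-the-envelope illustration of the form-factor point, here's a toy SNR model with shot noise and read noise added in quadrature. The photon density, QE, read noise, and pixel pitches below are all assumed numbers, not specs for any real camera; the point is just that at a fixed shutter speed and f-stop, a bigger photosite collects more photons and wins regardless of the electronics.

```python
# Toy model: why sensor/pixel area sets a floor once QE and read noise are maxed out.
# All numbers are assumptions; "photon_density" is a made-up figure for photons per
# square micron reaching the sensor at a fixed shutter speed and f-stop (dim scene).
import math

def snr(signal_e, read_noise_e):
    """Simple SNR model: shot noise plus read noise added in quadrature."""
    return signal_e / math.sqrt(signal_e + read_noise_e ** 2)

photon_density = 2.0        # photons per um^2 during the exposure (assumed)
qe = 0.5                    # quantum efficiency (assumed)
read_noise = 3.0            # electrons RMS (assumed)

for name, pitch_um in [("phone-ish", 1.4), ("APS-C-ish", 4.8), ("full-frame-ish", 6.4)]:
    signal = photon_density * pitch_um ** 2 * qe   # electrons captured by one photosite
    print(f"{name:15s} pitch {pitch_um} um: signal {signal:6.1f} e-, "
          f"SNR {snr(signal, read_noise):.2f}")
```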

Maybe that's why Sony recently filed a Foveon-like patent: they're running out of places to improve. Maybe it's also why Canon is so focused on increasing MP.

In some sense, this is great for us. DSLRs are already so good, and, as we go further, we'll feel even less need to upgrade in the future. OTOH, what a nightmare this must be for the camera manufacturers!

Nikon and Sony currently have the sensor advantage, but I'm still happy to be in the Canon camp, because I feel it still has the best lenses. Let's hope Canon does some catching up. Ultimately, as all the companies move up in MP, lenses will matter most.

Some other questions I had that I was hoping you guys could answer:

There's chroma noise and luminance noise, but I was thinking it's strange that there are two kinds when it should be just one, given the stochastic nature of counting photons. I'm guessing that the difference between luminance and chroma noise is just the overall count level, and that we have separate luminance and chroma sliders in Lightroom only because different algorithms are optimized for each case. That is, when the counts get close to one, zero counts on a red sensel means no red info, whereas a single count registers as fully red, which makes for ridiculous color "information" and thus a splotchy appearance. Is that what's going on?
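
Here's a toy simulation of that intuition (the mean photon count is an arbitrary assumption): with only a couple of photons per sensel, the summed signal (roughly the luminance) is comparatively stable, while the per-channel ratios that determine hue swing wildly, which is what reads as chroma blotches.

```python
# Toy illustration of the "one photon = fully red" intuition: at very low counts,
# Poisson noise on separate R/G/B sensels produces wild hue swings (chroma noise),
# even though every channel sees the same kind of counting noise. Values are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
mean_photons = 2.0                                 # avg photons per sensel for a gray patch (assumed)
counts = rng.poisson(mean_photons, size=(5, 3))    # 5 pixels x (R, G, B)

for r, g, b in counts:
    total = r + g + b
    luminance = total / 3
    # Normalized channel shares: identical for a noiseless gray, all over the place here.
    shares = (r / total, g / total, b / total) if total else (0, 0, 0)
    print(f"R={r} G={g} B={b}  luminance~{luminance:.1f}  "
          f"RGB share={tuple(round(s, 2) for s in shares)}")
```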

Also, how difficult is it to increase DR by enlarging the electron well with a different design? (I think I read somewhere that current designs only allow the capacitance to be proportional to the surface area of the sensel.) Having truly lower ISOs would be nice for things like landscapes, but we haven't been progressing at all in that direction.
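
For reference, a rough way to see where the well depth enters: engineering dynamic range in stops is approximately log2(full well / read noise), so a deeper well or a lower noise floor buys DR directly. The numbers below are illustrative assumptions, not specs.

```python
# Rough DR estimate in stops from full-well capacity and read noise (assumed values).
import math

def dr_stops(full_well_e, read_noise_e):
    return math.log2(full_well_e / read_noise_e)

print(f"{dr_stops(60_000, 3):.1f} stops")    # ~14.3 stops: assumed 60k e- well, 3 e- read noise
print(f"{dr_stops(120_000, 3):.1f} stops")   # ~15.3 stops: doubling the well adds one stop at base ISO
```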

Thanks!
Kaz
