saltydogstudios
Veteran Member
It's always weird to me when camera reviewers say "we can't report on the sensor because Adobe hasn't included this camera yet" and then say "but it's the same sensor that's in x camera".
But all they end up measuring is dynamic range and high ISO noise performance.
Specifically, they don't measure how each frequency of light gets translated into the values in the RAW file.
I know the equipment needed to do this well is expensive - a light source that can emit a single frequency of light, so you can measure the response at the sensel level (each red, green, or blue pixel on the sensor).
But there are other ways - a diffraction grating can spread light into a spectrum, or colored filters with known transmission characteristics can pass only certain frequencies of light. Wratten "color separation" filters in particular.
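To give a concrete idea of what I mean, here's a rough Python sketch of the per-channel measurement, assuming a defocused flat-field frame shot through one narrow-band filter. The filename is made up and the use of the rawpy library is just my choice - any raw decoder that exposes the undemosaicked sensel values would do:

```python
# Rough sketch: average raw response per CFA channel from a flat-field frame
# shot through a known narrow-band filter. Repeat per filter/wavelength to
# trace out each channel's response curve. Filename is hypothetical.
import numpy as np
import rawpy

with rawpy.imread("wratten25_flatfield.NEF") as raw:
    img = raw.raw_image_visible.astype(np.float64)   # undemosaicked sensel values
    colors = raw.raw_colors_visible                  # CFA channel index per sensel
    names = raw.color_desc.decode()                  # e.g. "RGBG"
    black = raw.black_level_per_channel

    for idx, name in enumerate(names):
        vals = img[colors == idx] - black[idx]       # black-subtracted values for this channel
        print(f"{name}[{idx}]: mean {vals.mean():.1f}  p99 {np.percentile(vals, 99):.1f}")
```

Plot those per-channel means against each filter's passband and you'd get a crude version of the spectral response curves the manufacturers never publish.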
I know the prevailing consensus is that RAW files are relatively unbiased, that camera mismatches at the CFA -> RAW conversion stage can be reduced via post processing, and that Adobe and other RAW conversion software indeed attempt to do this by measuring each camera and creating various color profiles.
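As I understand it, the core of such a profile is, to a first approximation, just a 3x3 matrix mapping white-balanced camera RGB into a standard space (real profiles layer LUTs and hue tweaks on top). A minimal sketch, with made-up matrix values, of the kind of correction that gets baked in:

```python
# Minimal sketch of the per-camera correction a color profile encodes:
# a 3x3 matrix from white-balanced camera RGB to a standard space (linear
# sRGB here). The numbers are invented; real values come from profiling.
import numpy as np

cam_to_srgb = np.array([
    [ 1.72, -0.58, -0.14],
    [-0.19,  1.46, -0.27],
    [ 0.03, -0.51,  1.48],
])  # rows sum to 1.0 so a neutral gray stays neutral

cam_rgb = np.array([0.42, 0.55, 0.31])   # one demosaicked, white-balanced pixel
print(cam_to_srgb @ cam_rgb)             # the same pixel in linear sRGB
```

That matrix is where most of the camera-to-camera difference gets normalized away, which is why people say the differences wash out in post.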
But I would be genuinely interested in seeing how each camera handles color separation on the CFA -> RAW level.
My limited understanding here is that - say - Nikon and Leica cameras allow the "red" sensels to pick up "blue" frequency light, while Canon cameras have cleaner separation of red and blue. I'm sure each camera manufacturer has a reason for this - taking the light that exists in the world and turning it into an RGB image (usually a JPG) that we can view is a complex process.
And that camera manufacturers purposefully weaken the CFA because it'll give a better high ISO noise performance - more photons getting through the CFA means less noise - but at the expense (to some extent) of color separation.
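Here's a toy numerical illustration of that trade-off - every number is invented, so treat it as a sketch of the principle rather than data about any real camera:

```python
# Toy illustration (all numbers made up): a "broad" CFA passes more light per
# channel but mixes the channels, so the matrix that un-mixes them back to
# clean RGB has bigger off-diagonal terms and amplifies noise more.
import numpy as np

# Rows: how much of the scene's R, G, B each sensel channel actually picks up.
narrow_cfa = np.array([[0.9, 0.1, 0.0],
                       [0.1, 0.8, 0.1],
                       [0.0, 0.1, 0.9]])
broad_cfa  = np.array([[1.0, 0.5, 0.1],
                       [0.4, 1.0, 0.4],
                       [0.1, 0.5, 1.0]])

for name, cfa in [("narrow", narrow_cfa), ("broad", broad_cfa)]:
    signal = cfa.sum(axis=1)                  # relative photons collected per channel
    correction = np.linalg.inv(cfa)           # matrix that restores the separation
    noise_gain = np.sqrt((correction ** 2).sum(axis=1))  # noise amplification per output channel
    print(name, "signal:", signal.round(2), "noise gain:", noise_gain.round(2))
```

The broad CFA collects more photons per channel (less shot noise going in), but the correction needed to restore the color separation amplifies noise on the way out - that's the trade I mean.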
And camera manufacturers "cook" the RAW files - either to achieve the colors they want or to reduce unit-to-unit variation - since creating the CFA is a chemical process, there is some potential for variation between batches.
I know that in the real world very few people care about this, and that there isn't much we as end users can do with this information, but exploring it could lead more people to care, and we could start drawing correlations between the pigments in the CFA and the resulting images we get. Again - even if the prevailing wisdom is that this particular thing doesn't matter much.
Maybe with the rising popularity of monochrome-only cameras we can open up the discussion of how the CFA -> RAW process happens. Until we get these measurements the whole idea is a bit theoretical, but I suspect that if a thorough review of cameras were made, we'd find real world implications - for, say, ETTR exposure, where you're trying to maximize the amount of information gathered at the sensor level - and we'd see differences between cameras and the philosophy behind color that each manufacturer imbues at the sensor level.
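On the ETTR point specifically - which raw channel saturates first depends on the CFA response, so it can genuinely differ between bodies. A little sketch (again using rawpy, filename hypothetical) of checking per-channel headroom in a test shot:

```python
# Sketch: check which raw channel is closest to clipping, since that is what
# actually limits an ETTR exposure. Filename is hypothetical.
import numpy as np
import rawpy

with rawpy.imread("ettr_test_shot.NEF") as raw:
    img = raw.raw_image_visible.astype(np.float64)
    colors = raw.raw_colors_visible
    names = raw.color_desc.decode()            # e.g. "RGBG"
    white = raw.white_level                    # saturation point
    black = raw.black_level_per_channel

    for idx, name in enumerate(names):
        vals = img[colors == idx]
        peak = np.percentile(vals, 99.9)
        headroom = np.log2((white - black[idx]) / max(peak - black[idx], 1))
        print(f"{name}[{idx}]: ~{headroom:.2f} stops of headroom")
```

Run that on the same scene shot with two different cameras and I'd expect the "which channel clips first" behavior to diverge.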
I'm sure I'll get a lot of responses that these differences are minimal and that anyone who shoots RAW can get nearly identical results from different cameras with "proper" post processing, but that's the exact sentiment I suspect could be unwound with lots of testing and measurement.
If anyone does know of someone who does this sort of measurement, I'd love to hear about it.