Do optical WB filters improve RAW color latitude?

Started Aug 26, 2012 | Discussions
ZOIP
Junior Member • Posts: 35
Re: Good answer
In reply to Mako2011, Aug 29, 2012

Hello Mako
Thank you for your kind comments.

The issue of RAW processing is indeed crucial; I suspect that deficiencies here account for why many folk don't see changes or improvements in the techniques they test. For example, unless you can remove the effects of all noise reduction, it is unlikely you can ever see the subtle things different lenses have to offer in terms of local contrast, clarity and so on, and many RAW converters do not allow this. The same applies to colour, as you say. Likewise, the actual inherent noise in the file cannot be ascertained if the NR cannot be switched off, and oddly enough the NR can actually make the native noise look worse due to clumping, but that is a whole separate issue.

For my tests I have used Adobe Camera Raw (basically I don't use it for any real work), Raw Developer, RPP and Aperture. They all have their benefits and deficits, but for my most serious work I use Raw Developer and RPP: RPP when I am particularly after colour accuracy and aiming for a filmic look, and Raw Developer when I intend to use the files for mono extractions or want the greatest amount of textural information. Aperture does an excellent job of recovering colour accuracy from tricky files and can produce stunning skin tones, but its ability to extract detail is average at best. Adobe is just average, master of nothing in particular... a general-purpose tool, but fine, I am sure, for most folk.

I don't have Lightroom, but I have used and taught it; it's good, but it would not be my prime choice for the way I work.

It is worth noting, for those not familiar with the various options, that neither RPP nor Raw Developer is particularly easy to use, and they don't have all the bells and whistles like CA correction; they are the specialist tools of the raw world, designed to do a few things really, really well. I would describe Raw Developer as a digital scalpel and RPP as the digital equivalent of the human eye.

By way of comparison, the Sony raw converter that came with my Sony cameras is... how shall I put this... completely hopeless. BTW, any Sony user who has only ever used that application, or shot in JPEG, has no idea whatsoever of how good their gear can be.

It is also worth noting that most images posted to the web to demonstrate effects are utterly useless: JPEG compression, colour-space issues, downsizing and monitor calibration all negate the real differences. This pointy-end stuff is for the world of high-end prints (the solid gold standard); for web images there is almost nothing to be gained from the pointy-end approaches.

And then there are sharpening algorithms and how they affect noise and micro detail...

Ah yes, raw converters: certainly a very under-rated aspect of the whole imaging chain. Anyhow, I am rambling and I don't want to confuse the issues too much, so I will leave it at that.
--
Trying to make the complex simple

Mako2011
Forum Pro • Posts: 15,594
Thanks
In reply to ZOIP, Aug 29, 2012

ZOIP wrote:

Hello Mako
Thank you for your kind comments.

The issue of RAW processing is indeed crucial; I suspect that deficiencies here account for why many folk don't see changes or improvements in the techniques they test. For example, unless you can remove the effects of all noise reduction, it is unlikely you can ever see the subtle things different lenses have to offer in terms of local contrast, clarity and so on, and many RAW converters do not allow this. The same applies to colour, as you say. Likewise, the actual inherent noise in the file cannot be ascertained if the NR cannot be switched off, and oddly enough the NR can actually make the native noise look worse due to clumping, but that is a whole separate issue.

Covered very well. Yes, NR is, IMO, the gorilla in the room that is often difficult to deal with in an optimal way, especially when B&W is the final goal.

For my tests I have used Adobe Camera Raw (basically I don't use it for any real work), Raw Developer, RPP and Aperture. They all have their benefits and deficits, but for my most serious work I use Raw Developer and RPP: RPP when I am particularly after colour accuracy and aiming for a filmic look, and Raw Developer when I intend to use the files for mono extractions or want the greatest amount of textural information. Aperture does an excellent job of recovering colour accuracy from tricky files and can produce stunning skin tones, but its ability to extract detail is average at best. Adobe is just average, master of nothing in particular... a general-purpose tool, but fine, I am sure, for most folk.

As I make use of Nikon's ADL, I have ViewNX2/CaptureNX2 as the start of my workflow, with ACR as an alternative.

I don't have Lightroom, but I have used and taught it; it's good, but it would not be my prime choice for the way I work.

ViewNX2/CaptureNX2 => 16-bit TIFF => CS6, and a plugin or two from there. Looking at Tiffen Dfx v3 Photo Plug-In as my digital filter package. Using true optical color-effects filters optimally may be beyond my ability/patience level.

It is worth noting, for those not familiar with the various options, that neither RPP nor Raw Developer is particularly easy to use, and they don't have all the bells and whistles like CA correction; they are the specialist tools of the raw world, designed to do a few things really, really well. I would describe Raw Developer as a digital scalpel and RPP as the digital equivalent of the human eye.

Thank you for that insight. Well received and appreciated.

It is also worth noting that most images posted to the web to demonstrate effects are utterly useless: JPEG compression, colour-space issues, downsizing and monitor calibration all negate the real differences. This pointy-end stuff is for the world of high-end prints (the solid gold standard); for web images there is almost nothing to be gained from the pointy-end approaches.

That we certainly agree on. In my case, print is always the goal my workflow is being tailored for; any other display medium is really just a temporary step towards print, in my mind. But I'm really just an amateur with dreams here, not a learned professional.

And then there are sharpening algorithms and how they affect noise and micro detail...

What I don't know on that can fill volumes.

Ah yes, raw converters: certainly a very under-rated aspect of the whole imaging chain. Anyhow, I am rambling and I don't want to confuse the issues too much, so I will leave it at that.

Thank you for taking the time. I learned something.

hjulenissen
Senior Member • Posts: 1,614
Re: Do optical WB filters improve RAW color latitude?
In reply to Vitruvius, Aug 29, 2012

Vitruvius wrote:

Yes, yes, and yes. I understand all this... Filters reduce incoming light in specific color channels. Makes total sense, since that is the point of a color filter.

This question really is about getting the exposure as close as possible PER COLOR CHANNEL during exposure.

Your reply suggests to me that you did not read or comprehend my post. I discussed the total amount of light and the relative amount of light hitting each color channel.

Repeating myself:

1. Having different exposure in different color channels can be a good thing and it can be a bad thing. It depends on whether you want a (narrower) DR window of good color precision, or a wider DR window of lesser color precision. The former suggests color-filter balancing, while the latter suggests color-filter de-balancing (!).

2. Digital matrixing of sensor channels is simple, inexpensive and gives fast feedback. Physical filters allow spectral modifications that cannot be done in Photoshop (see the small sketch after this list).

3. Capture exposure cannot be completely decoupled from rendering exposure. If the blue channel is 6 stops lower than the green one, and you want it rendered like that as well, you might not be able to see that the effective DR of the blue channel is very low. Capturing the most possible information from a scene seems like a good thing, but there are limits to what I am willing to do to capture information that I am fairly certain that I will never utilize.

4. In practical terms, I'd suggest that if you have to ask, chances are that you will improve your images more by spending energy on other things.
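To illustrate point 2 with a minimal sketch (Python; the gain and matrix values below are invented for illustration, not taken from any real camera profile): digital white balance and matrixing amount to per-channel multipliers followed by a 3x3 matrix, which is why they are cheap and give instant feedback, whereas an optical filter changes the light before it ever reaches the CFA.

import numpy as np

# Hypothetical demosaiced raw triplet for one pixel (linear, 0..1)
raw_rgb = np.array([0.28, 0.45, 0.19])

# White-balance multipliers for the shot illuminant (illustrative values);
# applied digitally, after capture
wb_gains = np.array([1.9, 1.0, 1.6])

# Illustrative camera-RGB -> sRGB matrix (each row sums to 1.0)
cam_to_srgb = np.array([
    [ 1.6, -0.5, -0.1],
    [-0.2,  1.5, -0.3],
    [ 0.0, -0.4,  1.4],
])

balanced = raw_rgb * wb_gains          # per-channel gain: digital "WB"
srgb_linear = cam_to_srgb @ balanced   # 3x3 matrixing: digital color correction
print(srgb_linear)

# The same per-channel gains could instead be achieved optically (a color
# filter on the lens), in which case the weak channels receive more real
# photons; the spectral shape of the CFA itself is fixed either way.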

-h

ZOIP
Junior Member • Posts: 35
Re: Do optical WB filters improve RAW color latitude?
In reply to hjulenissen, Aug 29, 2012

I may well be wrong about this, and am happy to be proven so, but as far as I know colour accuracy is not a product of differing degrees of exposure across the three channels, but rather of the cut-off of colours brought about by having filters that better reject wavelengths outside their own colour band. In other words, if the filters are significantly different we get better separation and improved colour. One of the reasons phone cameras, for example, have such poor colour is the high degree of cross-talk between the colour channels, which is seen as a fair compromise in order to keep noise levels under control for such small sensors and allow practical use under lowish light levels.

To get this improved degree of separation requires one or more of the filters to have greater density than the others, and this, along with a couple of other factors, means the channels have differing sensitivities as a by-product.

Thousands of fully balanced captures that I have made have consistently produced better, not worse, colour, and this especially applies to subtle natural shades in skin, foliage and so on. Having said that, one does need to do some profiling to get optimal results; it is not a trivial matter, but in my experience the ultimate result is better, not worse, colour.

As said, happy to be shown an alternative and proven wrong.
--
Trying to make the complex simple

Mako2011
Forum Pro • Posts: 15,594
Re: Do optical WB filters improve RAW color latitude?
In reply to ZOIP, Aug 29, 2012

ZOIP wrote:

I may well be wrong about this, and am happy to be proven so, but as far as I know colour accuracy is not a product of differing degrees of exposure across the three channels, but rather of the cut-off of colours brought about by having filters that better reject wavelengths outside their own colour band. In other words, if the filters are significantly different we get better separation and improved colour. One of the reasons phone cameras, for example, have such poor colour is the high degree of cross-talk between the colour channels, which is seen as a fair compromise in order to keep noise levels under control for such small sensors and allow practical use under lowish light levels.

To get this improved degree of separation requires one or more of the filters to have greater density than the others, and this, along with a couple of other factors, means the channels have differing sensitivities as a by-product.

Thousands of fully balanced captures that I have made have consistently produced better, not worse, colour, and this especially applies to subtle natural shades in skin, foliage and so on. Having said that, one does need to do some profiling to get optimal results; it is not a trivial matter, but in my experience the ultimate result is better, not worse, colour.

Another thought-provoking post. Given today's sensor state of the art, a really accurate WB card and in-camera custom WB capability, I wonder if we are on the same page when it comes to "better color" vs "accurate color". Just thinking out loud. ColorChecker, anyone?
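For what it's worth, the grey-card/custom-WB part boils down to a tiny calculation (a sketch in Python; the patch values are hypothetical, not from a real file): the custom WB is essentially the per-channel gains that make the grey patch neutral.

# Hypothetical linear raw averages measured over a grey-card patch
patch_r, patch_g, patch_b = 0.21, 0.44, 0.30

# Per-channel multipliers that make the patch neutral (green as reference);
# this is, in essence, what a custom WB or a ColorChecker grey patch provides
gain_r = patch_g / patch_r
gain_b = patch_g / patch_b
print(f"WB gains  R: {gain_r:.2f}  G: 1.00  B: {gain_b:.2f}")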

alanr0
Senior Member • Posts: 1,246
Sensor filters and colour accuracy
In reply to ZOIP, Aug 29, 2012

ZOIP wrote:

I may well be wrong about this, and am happy to be proven so, but as far as I know colour accuracy is not a product of differing degrees of exposure across the three channels, but rather of the cut-off of colours brought about by having filters that better reject wavelengths outside their own colour band. In other words, if the filters are significantly different we get better separation and improved colour.

As I understand it, colour accuracy would not improve if crosstalk between the red, green and blue channels were greatly reduced. Reproduction of subtle variations in hue relies on partial overlap between the colour channels.

Consider a rainbow, or preferably the spectrum of light dispersed from sunlight by a prism or diffraction grating in a darkened room. The eye perceives a continuous transition through red/orange/yellow/green/blue/violet. Suppose your camera sensor has filters with sharp transitions between red/green and green/blue. If there is no overlap between filters, then each monochromatic hue can only appear to the camera as saturated red, green or blue, with more subtle distinctions impossible.
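To put rough numbers on that (a small Python sketch; the filter curves are invented Gaussians and sharp-cutoff boxes, not measured CFA data): two different monochromatic yellows give identical responses from non-overlapping filters, whereas with overlapping filters the red:green ratio tracks the wavelength and the hue difference survives.

import numpy as np

def gauss(wl, centre, width):
    # made-up overlapping passband, loosely eye-like
    return np.exp(-0.5 * ((wl - centre) / width) ** 2)

def box(wl, lo, hi):
    # idealised sharp-cutoff passband with no overlap
    return 1.0 if lo <= wl < hi else 0.0

for wl in (575, 595):  # two monochromatic "yellows", wavelengths in nm
    r, g = gauss(wl, 600, 50), gauss(wl, 540, 50)
    sharp = (box(wl, 570, 700), box(wl, 490, 570), box(wl, 400, 490))
    print(wl, "nm  overlapping R:G ratio =", round(r / g, 2),
          "  sharp-cutoff RGB =", sharp)

# The overlapping ratio changes with wavelength (about 1.13 vs 1.82), while
# the sharp-cutoff filters report (1.0, 0.0, 0.0) for both wavelengths.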

For accurate colour reproduction, the filters should respond in a similar way to the pigments in the human retina, which show considerable spectral overlap.
http://en.wikipedia.org/wiki/Color_vision

In general, the match is not exact, so it can be difficult to simultaneously get good reproduction of, for example, skin tones, reds and greens, all in the same image.

Thousands of fully balanced captures that I have made have consistently produced better, not worse, colour, and this especially applies to subtle natural shades in skin, foliage and so on. Having said that, one does need to do some profiling to get optimal results; it is not a trivial matter, but in my experience the ultimate result is better, not worse, colour.

Theory can point you in a good direction to experiment, but the final result - as perceived by a human eye - is what matters.

Cheers.
--
Alan Robinson

ZOIP
Junior Member • Posts: 35
Re: Sensor filters and colour accuracy
In reply to alanr0, Aug 29, 2012

Hello Alan

Thank you for your response; it makes perfect sense, draws attention to the fact that I sometimes need to be a little more precise in my posts, and also helps clarify a couple of issues.

There needs to be a limited amount of cross-talk for good neutrals, otherwise they would end up somewhat strongly coloured, but the cross-talk should not be so great as to lower the colour separation between what should be clearly different colours of the spectrum.

The problem with many small sensors, and in my experience even some larger ones, is too much cross-talk in an effort to improve high-ISO performance. Perfect colour from the filtering end is a product of walking a pretty fine line with cross-channel filter densities/strengths.

The point I was trying to get across (poorly, I am afraid) in response to one of the posts was that it is not the differing levels of exposure in the channels that gives us good colour but the spectral separation enabled by the three colour filters; the exposure variance is a by-product of the process which we would avoid if we could.

Your response is so helpful because it draws attention to why some cameras have sensors that do not need to be fully zeroed out to get optimal colour whilst others do, and it also explains why it is not possible to simply have some standard filter pack that works for lots of cameras.

By way of example, the NEX-5N does not fully benefit from total sensor balancing, and attempting to reconstruct correct colour if you do is extremely difficult; it needs a somewhat modified approach. On the other hand, the Sony A900 gives stunning files, with extraordinary colour stability under wild white-balance variations, when perfectly balanced.

Which, as you say, all goes to show that theory can point you in a direction, but practical application is needed for surety and clarity.

Of course, all of this colour stuff is rather hard for any of us; after all, we are not privy to the inner workings of the sensor makers and how they have configured things, both hardware-wise and electronically. We can only make a best guess, which is often wrong of course.

Hope we chat again
Zip

--

Trying to make the complex simple

hjulenissen
Senior Member • Posts: 1,614
Re: Sensor filters and colour accuracy
In reply to ZOIP, Aug 30, 2012

ZOIP wrote:

The point I was trying to get across (poorly, I am afraid) in response to one of the posts was that it is not the differing levels of exposure in the channels that gives us good colour but the spectral separation enabled by the three colour filters; the exposure variance is a by-product of the process which we would avoid if we could.

"good color" is a subjective term. I assume that color filters that closely mimic those of the standardised human observer is a necessary prerequisite. If you want to improve on that, I think that you need to sweep a set of narrow bandpass filters in front of a (best-case) un-filtered sensor. That is what is done in hyper-spectral imaging, and I believe that color-critical applications (such as reproduction work) use such methods.

My understanding of the thread-starter is that he is generally curious as to how color filters in front of his camera can "improve" his pictures (another subjective term).

threadstarter wrote:

I am thinking that digital RAW files would have more color latitude if they were pre-white-balanced during exposure by a color correction filter.

I have already presented arguments for why "inter-channel exposure variance" can be both a bad thing and a good thing. If you disagree, I would appreciate it if you addressed those.

-h

hjulenissen
Senior Member • Posts: 1,614
Re: Sensor filters and colour accuracy
In reply to hjulenissen, Aug 30, 2012

In a highly idealized (and unlikely) sense, involving a truly colorless scene, a highly frequency-selective filter in front of the lens, and orthogonal CFA filtering, you could use a Bayer-sensor camera with N stops of native DR to capture 3xN stops by shifting the different color channels with respect to each other.

In a more practical sense, "inter-channel exposure variation" lets you recover some luminance detail in the extreme highlights and shadows that might be impossible to recover if the channels had been perfectly equalised.
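A back-of-the-envelope sketch of that (Python, purely illustrative numbers): a channel sitting two stops below the others keeps responding after the others have clipped, so some highlight luminance information survives in it, at the cost of that channel being relatively noisier lower down.

# Illustrative only: a ramp of scene luminances and two channels with a
# two-stop sensitivity offset; values clip at 1.0 (sensor full scale).
CLIP = 1.0
for scene in (0.5, 1.0, 2.0, 4.0):
    green = min(scene, CLIP)          # normally exposed channel
    blue = min(scene / 4.0, CLIP)     # channel exposed two stops lower
    print(f"scene {scene:4.1f}:  green {green:.2f}  blue {blue:.2f}")
# Above scene = 1.0 the green channel is clipped, but the blue channel keeps
# recording up to scene = 4.0, so a converter can estimate highlight
# luminance from it (with poorer color and more relative noise).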

-h

Vitruvius
Junior Member • Posts: 37
Re: Do optical WB filters improve RAW color latitude?
In reply to ZOIP, Aug 31, 2012

Thank you. This is exactly what I suspected. I often shoot RAW thinking that I can always adjust WB later; then I often have difficulty getting the WB in the RAW file corrected without something getting grainy or banding. I use PS CS 5.5, but I am not versed in all the other software like Lightroom or DxO Labs; I would assume that these would help drastically, though. But I also thought that it would be best to start with a RAW file with each color channel as close as possible to what the sensor is 'tuned' for.

Vitruvius's gear list:
Canon PowerShot Pro1 Sony SLT-A77 Sony a77 II Sigma 70-200mm F2.8 EX DG Macro HSM II Sony DT 16-50mm F2.8 SSM +1 more