Color Science and Post Processing Order

Status: Not open for further replies.

dbmcclain

Member
I just bought a Hasselblad X2D and it arrived one week ago. I am now learning about photographic post-processing.

I arrive by way of astrophysics and PixInsight processing of both filter-wheel multi-color exposure stacks and CFA image stacks from Bayer-matrix sensors.

There is no ISO setting on an astronomical camera, and I have full control over which linear and nonlinear processing steps I apply to raw images once they have been corrected for flat field, read noise, and bias (and possibly deBayered).

So, now in the photon-rich environment of photography, there is ISO -- which appears to be simply a way of scaling the significant portion of the 16-bit ADC output to the upper 8 bits shown on histograms in programs like Phocus, Capture One, and others.
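To make that intuition concrete, here is a toy sketch of digitally scaling linear raw values into the 8-bit range a histogram displays. This is my own illustration, not how any real camera implements ISO -- in practice much of the gain is applied in the analog domain, before the ADC:

```python
def to_histogram_8bit(dn, gain=1.0, bit_depth=16):
    """Map a linear raw digital number (DN) to the 0-255 range shown
    on a histogram, after applying a simple digital gain.
    Illustrative only: real ISO implementations differ per camera."""
    max_dn = (1 << bit_depth) - 1
    v = dn * gain / max_dn * 255.0
    return max(0, min(255, int(v)))

# A mid-gray patch exposed near the bottom of the 16-bit range:
# with unity gain it sits in the lowest histogram bins, while a 16x
# digital gain lifts it toward the middle of the 8-bit display range.
print(to_histogram_8bit(1024))             # near black
print(to_histogram_8bit(1024, gain=16.0))  # lifted into view
```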

I am confused that there is so little discussion of deBayering of CFAs as an integral part of "color science" in the popular posts. And there appears to be no available information from any of the camera manufacturers about how they deBayer their raw data. Likewise, there are no explicit mentions of the ordering of exposure correction, histogram adjustments, saturation changes, etc.

My impression is that Phocus reads in a Bayer'ed image from the camera and applies lens distortion and vignetting corrections (akin to my flat field corrections), and deBayering, in addition to a possible nonlinear Gamma correction, on the way to the in-memory image shown on screen, and used for export conversion.
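For what it's worth, that presumed ordering can be sketched in a few lines. This is only a guess at the general shape of such a pipeline -- manufacturers do not publish their demosaicing algorithms, and the 2x2 quad collapse below is a deliberately naive stand-in for a real deBayering method:

```python
def process_raw(cfa, flat, gamma=2.2):
    """Plausible raw-pipeline ordering (an assumption, not Hasselblad's
    documented behavior):
      1. linear corrections (flat field / vignetting) on the mosaic
      2. demosaic (here: naively collapse each RGGB quad to one pixel)
      3. nonlinear gamma encoding, applied last
    `cfa` and `flat` are 2D lists of linear values in [0, 1], RGGB layout."""
    h, w = len(cfa), len(cfa[0])
    # 1. flat-field / vignetting correction stays in the linear domain
    lin = [[cfa[y][x] / flat[y][x] for x in range(w)] for y in range(h)]
    # 2. crude demosaic: one RGB pixel per 2x2 RGGB quad (halves resolution)
    rgb = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            r = lin[y][x]
            g = (lin[y][x + 1] + lin[y + 1][x]) / 2.0
            b = lin[y + 1][x + 1]
            row.append((r, g, b))
        rgb.append(row)
    # 3. gamma encoding is the final, nonlinear step
    def encode(v):
        return min(1.0, max(0.0, v)) ** (1.0 / gamma)
    return [[tuple(encode(c) for c in px) for px in row] for row in rgb]
```

The point of the ordering: flat-field division assumes linear data, so the nonlinear gamma step must come last, after all linear corrections and demosaicing.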

However, the .FFF files remain as RGGB Bayer'ed images, and are possibly stored without lens distortion and vignetting corrections. The .FFF files also do not appear to have the Gamma correction folded in.

All these corrections appear to be performed by Phocus, not by the camera body processor. I cannot say exactly what happens for in-camera JPEG images -- certainly much of this pipeline must run in camera for those -- but that is of little concern to me. I am mainly interested in RAW processing.

Do any of you veterans out there know any of these details of processing? Thanks in advance!
 
nor is it colour data.
Because the data is the result obtained from measurements taken thru color filters, for the purpose of providing the necessary data an application requires to create a color image, I personally find referring to that as "color data" an acceptable use and reasonable expression.
Are you maintain that the triples (or quadruples, if you prefer) of the planes of a raw file are encoding colors?
It's simpler than that, I'm suggesting nothing more complicated than the notion that when a color filter is placed in front of a sensor to collect from a scene the data needed to produce a color image, it's reasonable to refer to the resulting image data recorded as "color data". Others may disagree and find that nomenclature unacceptable, but I personally find it a logical and useful expression when discussing color image formation.

The color information collected from the scene by the sensor and stored as raw data is converted and encoded into color values by processing algorithms in the camera or a raw converter application. Those color values can then be displayed or printed by an output device for us to sense and perceive as color with that color experience being encoded by our brain in our own memory system. That's my layman's explanation for whatever it's worth to anyone.
I note that you didn't answer the question. I further note that in the context of this thread, which relates to color management, the work color should be used with rigor.
I note that some of my replies to you are deleted before anyone else can see them. But in this case, I answered your question to my satisfaction.
You didn't answer it at all. Do you think that the triples (or quadruples, if you prefer) of the planes of a raw file are encoding colors?
I think that you meant to say the word color should be used with rigor, not the work color.
You are correct.
 
the raw data is the result obtained from measurements recorded thru color filters
Not necessarily. The method of channel separation, even if there is such a separation (and sometimes there isn't), is irrelevant.
I have always found every link in the imaging chain to have some degree of relevance in color reproduction.
three things: a light source, an object, and an observer
You should try asking questions, not making statements.
Thanks for your advice. Which of the three things you cited from my post are unnecessary in the production of a color viewing event, if any? My original post specified that the object observed may be "perceived to reflect, transmit, or emit color". If the object perceived to have color is reflecting or transmitting light to the eyes of the observer to stimulate a color response, it seems that a source of light is required.

If the observer perceives color being emitted from the object observed, then the object, in that instance, is both the light source and the object being observed and perceived to have color. Are all three things necessary for a color perceptual event, or can you eliminate the observer, the source of light, or the object perceived to have color?
My best wishes to you.
 
So, your position is that you need three things to perceive color, two of which can be the same thing. How is that last case different from needing two things to perceive color?

--
https://blog.kasson.com
 
Your question, which followed my explanation of why I think "color data" is a reasonable expression, was "Are you maintain [sic] that the triples (or quadruples, if you prefer) of the planes of a raw file are encoding colors?" I explained what I was maintaining in the post to which you replied. I also included an additional paragraph about what I maintain encodes color values in the digital process, and about where color itself is encoded, which happens in our brain.

That satisfied me as a reply to what I was maintaining in both the post which prompted your question and the encoding of color values in processing digital image data as distinct from the encoding of color which occurs in the brain.

Anything further which you maintain and would like to add to the discussion, feel free to express it. I'm moving on to other matters.
 
Just to interject some typical naivety into this discussion...

Colours don't exist as a physical property, right?...
A color perception event requires three things: a light source, an object, and an observer.
You don't need an object. Color perception occurs when a light source is directly observed.
As I mentioned (see below), the object observed can be "perceived to reflect, transmit, or emit color". If you look at a light source, in that instance, it is both the object perceived to emit color and the light source.
The light source and the object observed (and perceived to reflect, transmit, or emit color) each have individual measurable properties which contribute to the production of color stimuli perceived by the observer.
As I said earlier, color is not a physical phenomenon. The color is established by the wetware in the observer or by the Standard Observer.
I described them as "essential components of a perceptual color event." (see below)
They are all essential components of a perceptual color event.
 
It was a question that could be answered with a yes or a no. Instead, you are dancing around it.

 
This business of the object is unnecessary. All you need to perceive or measure color is a sufficient quantity of electromagnetic radiation in the range between roughly 380 and 740 nm impinging on the observer, whether the observer is a person or a device. It doesn't matter how the electromagnetic radiation is generated. If black is a color -- and I think it is -- you can drop the sufficient quantity part.

 
So, your position is that you need three things to perceive color, two of which can be the same thing.
Sure. An object could have a dual capacity to function in more than one role — as is the case when the light source, necessary to stimulate visual color perception, is also the object which is being viewed and perceived to emit color.

I specified that "the object observed" could be "perceived to reflect, transmit, or emit color". If the object is perceived to reflect or transmit color, a light source is needed for the object perceived to be visible to the observer.

If an object emits light and that object is additionally viewed by an observer and perceived to have color; the object perceived and the light source are simply a single source providing two of the three components for what I described as "essential components of a perceptual color event".
How is that last case different from needing two things to perceive color?
In that case, two of the three "essential components" I described as being required for a color perception event are being derived from a single source.

We still have 1) an observer, 2) a light source, and 3) an object being observed and perceived to have color — the last two components being derived from the same source, but playing two discrete roles in the event simultaneously.

If someone wants to explain why two component parts of an event cannot be derived from a single source, I'm willing to listen.
 
You are making this more complicated than it has to be.

 
An observer, a light source, and an object observed and perceived to have color seems simple enough to me. That works for me as an explanation for a color perception event.

But, others are free to disagree and offer simpler explanations as they choose.
 
I explained why you don't need three things, but I'll say it again in shorthand: all you need is a spectrum and an observer.

In a discrete formulation, the color is the sum of the wavelength by wavelength products of the spectrum and each of the three observer color matching functions.

How the spectrum is created is irrelevant to the determination of the absolute color.
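Stated as code, the discrete formulation above is just three weighted sums. In the sketch below, the two-sample "color matching functions" used in the test are invented numbers for illustration; the real CIE 1931 CMFs are tabulated, e.g., at 5 nm steps across the visible range:

```python
def spectrum_to_xyz(spectrum, xbar, ybar, zbar, step_nm=5.0):
    """Discrete tristimulus integration: the wavelength-by-wavelength
    product of the spectrum with each observer color matching
    function, summed over the sampled range."""
    X = sum(s * x for s, x in zip(spectrum, xbar)) * step_nm
    Y = sum(s * y for s, y in zip(spectrum, ybar)) * step_nm
    Z = sum(s * z for s, z in zip(spectrum, zbar)) * step_nm
    return X, Y, Z
```

Note that the function takes only the spectrum and the observer functions; nothing about how the spectrum was produced (reflection, transmission, or emission) enters the computation, which is the point being made above.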

 
It was a question that could be answered with a yes or a no. Instead, you are dancing around it.
It could be answered yes or no—which I would answer as no. It could also be answered in a more detailed fashion as I did... "The color information collected from the scene by the sensor and stored as raw data is converted and encoded into color values by processing algorithms in the camera or a raw converter application."

I chose the latter, more detailed way of answering your question regarding the encoding of color, and made a semantic distinction between the encoding of numerical color values and the perceptual encoding of color which, as you've pointed out, occurs in the mind of the observer.

I'm now truly moving on to other matters of interest.
 
It could be answered yes or no—which I would answer as no.
If your answer is no, then we agree. No colors there.
It could also be answered in a more detailed fashion as I did... "The color information collected from the scene by the sensor and stored as raw data is converted and encoded into color values by processing algorithms in the camera or a raw converter application."
Irrelevant to the topic under discussion.
I chose the latter more detailed way of answering your question regarding the encoding of color and made a semantic distinction between encoding of numerical color values from the perception and encoding of color which, as you've pointed out, occurs in the mind of the observer.
The SO has no mind.
 
The CIE Standard Observer (SO) models (there's more than one and they don't match exactly) don't perceive color. They only describe the measured effects of light stimulating the vision of a group of human observers who did perceive color in their mind.
 
An argument without relevance to the point of this subthread. This is about whether there are colors in raw files.

 
Irrelevant to the topic under discussion.
I found it relevant, perhaps some others did too. You did not, and that may also apply to others as well.

That happens when people have discussions and I find that tolerable. But again, I am only speaking for myself.
 