The ultimate Linearization: my take
7 months ago
Hello to all the friends here,
I bought an Epson R3000 this winter, and a ColorMunki Photo in January.
After some test prints in B&W on several top-quality fine art papers, I realized that even using the special ABW mode the printed output was not perfectly linear, that there was no easy way to correct this, and that there was no way to get a perfectly neutral (or sepia/split) tone either.
Using ICC profiles and printing in color was no better, even if potentially more flexible.
I tried the QTR linearization tool (following Keith Cooper's very useful article on the Northlight Images site), but I found that solution a bit cumbersome and the results were not very good or consistent. In addition, I wanted something able to neutralize the gray tone (and/or produce sepia/split tones) in an automated, consistent, repeatable way.
So I started to think about how to find a "final" solution for this issue, learned a LOT of things, and began developing a tool for the purpose: testing, printing, measuring, and learning again and again...
Just to give you an idea of the insane amount of work involved, here is what I needed to develop/do to reach the full goal:
- 4 B&W test strip sets of 18, 34, 52 and 68 patches (1, 2, 3 and 4 strips of 18 patches each, with redundant black and white patches), optimized for the ColorMunki, in 16-bit TIFF.
- printing the first strip, measuring it with X-Rite ColorPicker, and exporting the measured values as a proprietary X-Rite ".cxf" file.
- 6x averaging of the measurements, in a 3x forward-backward sequence, to minimize measurement errors.
- developing/coding an ".xml" filter to import the ".cxf" file into OpenOffice Calc for data manipulation/graphing and ".csv" conversion (ColorPicker for Windows does not export directly to ".csv").
- developing/coding some ".m" scripts in Octave (an open-source Matlab equivalent) implementing a complex adaptive multi-step data-processing algorithm that produces the required linearization curves (including Adobe RGB and sRGB to/from CIELab conversions, thanks to Bruce Lindbloom's wonderful site).
- learning to create a DeviceLink and/or Abstract ICC profile (Adobe RGB and sRGB supported) based on the linearization curves calculated above (interpolated up to 4096 points). I developed some custom batch files using ICCxmltool for this task.
- applying the linearization ICC profile and printing the second strip. A DeviceLink ICC profile is like a Curve on steroids, with each RGB channel calculated automatically from the strip measurements. No more empirical trial and error.
- repeating to improve the linearization if needed (yes, this is a multi-step system). Typically the first pass (18 patches) corrects roughly 80%-90% of the non-linearity. A second step (34 or 52 patches) reaches near-perfect results. For reference/paranoid results you can add a third step of 68 patches. At each step it is possible, if desired, to include gray-tone neutralization for ICC color profile prints.
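For anyone curious about the heart of the process, it can be sketched in a few lines. This is not my actual Octave code, just a minimal Python sketch with made-up patch values: given the L* measured for each requested gray level, the correction curve is simply the inverse of the measured transfer curve.

```python
import numpy as np

# Hypothetical 5-patch measurement (my real strips use 18+ patches):
requested = np.array([0.0, 25.0, 50.0, 75.0, 100.0])  # L* asked of the printer
measured  = np.array([2.1, 18.0, 41.5, 68.0, 95.0])   # L* read off the print

# Rescale so the paper's own black/white endpoints map to 0/100:
scaled = (measured - measured[0]) / (measured[-1] - measured[0]) * 100.0

# The correction answers: "which request actually yields the L* I want?"
# That is the inverse transfer curve, i.e. interpolation with the axes swapped.
def correction(target):
    return np.interp(target, scaled, requested)

# Dense LUT, analogous to the 4096-point curves baked into the DeviceLink:
lut = correction(np.linspace(0.0, 100.0, 4096))
```

On a dark-printing paper (measured midtones below the requested values) the curve lifts the midtones, so correction(50.0) comes out above 50; applying this curve and re-measuring is exactly the multi-step refinement described above.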
The great thing about this system is that all the calculations are made in the CIELab colorspace, allowing paper linearization to be compensated separately for Lightness and color, for every kind of print: ABW B&W and ICC color.
- For B&W prints using ABW and Gray Gamma 2.2 grayscale, you can fully linearize the Lightness (color toning is not under external control in ABW mode).
- For B&W prints using RGB ICC profiles (custom or canned), you can choose to linearize the Lightness AND, optionally, neutralize the gray tone (or produce a custom/split-toned curve) while still maintaining the Lightness linearization.
- For color prints using RGB ICC profiles (custom or canned), you can choose to linearize the Lightness AND, optionally, neutralize the gray axis too (compensating for paper/profile color casts) while still maintaining the Lightness linearization.
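Working in CIELab is what makes this separation possible, because L* carries only lightness while a*/b* carry only color. As an illustration, here is a plain Python sketch of the standard sRGB (D65) to CIELAB conversion, following the formulas published on Bruce Lindbloom's site (not my actual Octave code):

```python
def srgb_to_lab(r, g, b):
    """Convert sRGB components in [0, 1] to CIELAB (D65 white point)."""
    # 1) Undo the sRGB gamma encoding.
    def lin(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = lin(r), lin(g), lin(b)

    # 2) Linear RGB -> XYZ (sRGB/D65 matrix).
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b

    # 3) XYZ -> Lab, normalized to the D65 reference white.
    def f(t):
        return t ** (1.0 / 3.0) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)
```

Once a patch is expressed as (L*, a*, b*), linearization only touches L*, while neutralization drives a* and b* toward 0; neither correction disturbs the other.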
As you can easily imagine, I have spent more than 4 full months learning, coding, testing, re-learning, thinking, recoding and retesting, but today I can share with you some early results of this project.
Attached you can find a jpg with 4 graphs. The paper used in this example is Epson Hot Press Bright 330 Signature Worthy (matte black).
The first two graphs relate to an ABW B&W print (neutral dark). The two graphs in the second row relate to a B&W print using ICC (the Epson canned profile).
The Lightness (L*) is shown in black and the a* and b* values in green/blue. The X axis is gray expressed in L* (from black to white). The Y axis is the measured Lightness (0-100, left scale) and -5/+5 for the measured a* and b* (right scale).
- As you can see in "ABW - BEFORE" (first graph, 18 patches), you get a typical dark print, with more than 8 L* units of error in the midtones compared to the theoretical straight line from black to white (dotted line). The gray tone is not perfect, but not too bad. Note the yellowish tone of the paper too (blue line).
- "ABW - AFTER linearization" shows the result after 2 steps of the linearization process (reaching 52 measured patches). The Lightness is perfectly straight now. The gray tone is obviously not affected here and remains nearly the same (neutral dark).
- The "ICC (canned) - BEFORE" print (second row) uses the same 18-patch strip as the ABW one, but printed through the canned ICC profile (relative colorimetric, black point compensation on). The black is slightly weaker and the gray tone deviates a bit more, but the overall Lightness non-linearity is similar to the ABW print.
- In the "ICC (canned) - AFTER linearization AND tone neutralization" graph you can see the effect, after 3 steps (up to 68 patches), of the combined Lightness linearization AND gray-tone compensation. Here the black point is comparable to the ABW one, the Lightness is perfectly straight and the gray tone is perfectly neutral. The roll-off toward the paper's yellow color is allowed, and specifically designed, to prevent an abrupt transition from fully neutral gray to the yellow paper tone (we cannot change the paper white).
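The error figures quoted above come from comparing the measured curve against the dotted ideal line. Here is a hypothetical Python sketch, with a fabricated "dark print" that sags about 8 L* in the midtones, roughly like the BEFORE graph:

```python
import numpy as np

# 18 patches evenly spaced in L*, as in the first test strip:
target_L = np.linspace(0.0, 100.0, 18)            # the dotted ideal line

# Fabricated measurement of an uncorrected dark print (not real data):
measured_L = target_L - 8.0 * np.sin(np.pi * target_L / 100.0)

# Normalize the endpoints (paper white and maximum black define the range):
norm = (measured_L - measured_L[0]) / (measured_L[-1] - measured_L[0]) * 100.0

# Worst-case deviation from the ideal line, in L* units:
max_err = float(np.max(np.abs(norm - target_L)))
```

The same statistic, recomputed after each pass, is how you can verify the 80%-90% correction claimed for the first 18-patch step.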
Keep in mind that tagging this whole thing as "alpha" is fair: using this set of tools is very complex at the moment, because it involves a lot of steps across different applications and custom scripts, plus human supervision. I don't know if it could ever become a single straightforward application; that would certainly require a lot of time, and I don't know whether there is any real-world interest in something like this, apart from some crazy people like me.
Let me know what you think; every comment/suggestion/question is welcome.
Thanks for the patience.