On Nikons, the tone curve of the target JPEG is set by the picture control. There is always a picture control in effect; I think the camera comes with 20 or so (Vivid, Neutral, etc.). There is demosaicing going on as well, and it is the combination of the two (plus saturation, clarity, sharpness, noise reduction, etc. in the picture control) that maps the raw data to the pixels that go to the in-camera JPEG compression to create the SOOC JPEG and raw preview.

BTW, here is a comparison of RawDigger to a different picture control I made. The picture controls are calculated to 16-bit precision and then scaled out to 8-bit, so it is possible to find and highlight (roughly) the raw clipping point.
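To make the 16-bit-to-8-bit idea concrete, here is a minimal sketch in plain Python/NumPy. It is purely illustrative: the threshold, the stand-in curve, and the LUT layout are assumptions, and this is not the NP3 file format or Nikon's actual pipeline.

```python
import numpy as np

# Illustrative only: build a tone-curve lookup table at 16-bit precision,
# force inputs at or above an assumed "clipping" threshold to black so they
# stand out in the rendered preview, then scale the table out to 8 bits.

CLIP_THRESHOLD = 0.98                      # fraction of full scale treated as clipped (assumed)

x = np.linspace(0.0, 1.0, 65536)           # 16-bit input positions
curve = x ** (1 / 2.2)                     # stand-in tone curve, not Nikon's Flat profile
curve[x >= CLIP_THRESHOLD] = 0.0           # near-clipped input renders as black

lut_8bit = np.round(curve * 255).astype(np.uint8)[::256]   # 256-entry 8-bit LUT
print(lut_8bit[:4], lut_8bit[-4:])         # rising curve at the bottom, forced black at the top
```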
I've not done the Rawdigger check to see exactly how close to (or over) clipping the exposure is when the picture control changes to black. I do know that in extreme cases I have needed to back off 1/3 of a stop beyond what the picture control indicates in order to avoid highlight clipping on pixels near the limit. An example of such a case is a moonrise over the ocean, where I wanted to avoid blowing out the moon while maximizing the DR in the dim reflections on the waves.
Todd
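For what it's worth, a back-of-the-envelope sketch of the 1/3-stop margin mentioned above (just arithmetic, nothing taken from the picture control files):

```python
# Reducing exposure by 1/3 EV scales the recorded signal by 2**(-1/3),
# which leaves roughly a fifth of a channel's range as extra headroom for
# pixels the picture-control indicator has not quite flagged yet.
headroom = 1 - 2 ** (-1 / 3)
print(f"extra headroom from backing off 1/3 stop ≈ {headroom:.1%}")   # ≈ 20.6%
```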
Both the OP's approach and yours can be very useful, but they also have limitations: they may indicate that an area is clipped when in reality it is not. This happens when the red channel, the blue channel, or both clip only after the white balance coefficients are applied, and those coefficients, depending on the light, can be larger than 1.
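A quick numeric sketch of that false positive; the clip point, multipliers, and pixel values below are all invented for illustration, not measured from any camera:

```python
# None of these raw values clips, but after the (made-up) white-balance
# multipliers are applied, the red and blue channels exceed full scale,
# so a JPEG-based indicator would flag them as blown.

RAW_CLIP = 16383                                   # assumed 14-bit full scale
wb_gain = {"R": 1.9, "G": 1.0, "B": 1.4}           # example multipliers; R and B exceed 1
raw_pixel = {"R": 9800, "G": 15200, "B": 12100}    # all below the raw clipping point

for ch, value in raw_pixel.items():
    rendered = value * wb_gain[ch]
    raw_state = "clipped" if value >= RAW_CLIP else "ok"
    jpeg_state = "looks clipped" if rendered >= RAW_CLIP else "ok"
    print(f"{ch}: raw {value} ({raw_state}), after WB {rendered:.0f} ({jpeg_state})")
```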
I can't think of a reason that this technique couldn't be used in conjunction with UniWB.
Good point, solves the problem.
https://blog.kasson.com/using-in-caera-histograms-for-ettr/
Too bad we can't make this thread sticky.

You can put a link in your blog with the keywords... you are searchable.
Good idea!
I agree, and I do that sometimes myself, though very rarely. UniWB will scale the red, green, and blue channels appropriately, and then the picture control will approximate the limits of raw clipping. Usually I don't expose to full raw anyway (I dial it back 1/3 stop or more), and I rarely find red or blue clipping individually.
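A small continuation of the numeric sketch above, showing why UniWB helps here (same made-up values; the near-unity multipliers are the whole point of UniWB):

```python
# With UniWB the white-balance multipliers are all approximately 1, so the
# rendered channel values track the raw values and the picture-control warning
# lines up (roughly) with actual raw clipping.

RAW_CLIP = 16383                                   # assumed 14-bit full scale
raw_pixel = {"R": 9800, "G": 15200, "B": 12100}    # same made-up pixel as before

for ch, value in raw_pixel.items():
    rendered = value * 1.0                         # UniWB: per-channel multiplier ~1
    assert (rendered >= RAW_CLIP) == (value >= RAW_CLIP)   # same clipping verdict
```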
Thank you very much, I just downloaded the file and it's amazing on my Z7. I wonder if there is a way to also show the under-exposed pixels, let's say with a blue color? Anyway, thank you again for your solution!

I tried something like that, where, in addition to mapping 255-in to 0-out on the high end, I mapped 0-in to 255-out on the low end. I found that it didn't help much. Perhaps pushing it up a little, so that the clipping point wasn't exactly 0, might have made it more useful. I put the picture control file in the same folder, alongside the one that only shows the clipped highlights:
https://drive.google.com/drive/folders/11JDOGdWH8VtN8W466i_knVwh0v9eiWOB?usp=sharing
It's unimaginatively called Flat-Clip2.NP3. Feel free to give it a try.

I tried it and I think it works. It gives you real-time indications for both the under- and over-exposed areas. The only real issue is that while you can apply exposure compensation live for the over-exposed areas, you cannot do this reliably for the under-exposed ones. But the whole idea in the first place was to add functionality to your camera in real time. Anyway, thank you again for bothering to build these picture controls for us.
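To make that two-ended curve concrete, here is a minimal NumPy sketch (purely illustrative: the thresholds are placeholders and this is not the NP3 file format):

```python
import numpy as np

# Illustrative two-ended indicator: the top of the range is forced to black to
# flag blown highlights, and the very bottom is forced to white to flag crushed
# shadows. As noted above, raising the low threshold a little above 0 may make
# the shadow warning more useful. Both thresholds are assumptions.

HIGH_CLIP = 250                          # inputs at/above this render as black (placeholder)
LOW_CLIP = 4                             # inputs at/below this render as white (placeholder)

lut = np.arange(256, dtype=np.uint8)     # identity "flat" curve as the starting point
lut[HIGH_CLIP:] = 0                      # 255-in -> 0-out on the high end
lut[:LOW_CLIP + 1] = 255                 # 0-in -> 255-out on the low end
```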
Thanks for this, looks of interest in hunting blinkies.
Just one question: how do I take the downloaded file and get it onto my Z6 to use? I'm not familiar with how this works.
Cheers
Copy the file to NIKON/CUSTOMPC on your memory chip
Put the chip in the camera & turn the camera on
Press the MENU button on the camera
Under the PHOTO SHOOTING MENU, tap on Manage Picture Control
Tap on Load/Save
Tap on Copy to Camera
From there, it should prompt you to select the picture control, which should be called Flat-Clip, and the custom picture control slot to put it in (which will be C1, if you've never done this before).
To activate it, under the PHOTO SHOOTING MENU, tap on Set Picture Control
Scroll to "C1 Flat-Clip" and select OK
There are 28 pre-installed picture controls on my Z7 and the custom ones are at the bottom of the list. Note that the list "wraps", which means that you can scroll up from the top of the list and get to the bottom of the list with just one button press.

Thanks for the instructions. Worked perfectly; I'll give it a go and see how it works out.
Thanks again for this workaround for getting the zebra effect. At first, I also used this image profile very actively, but then I wanted to transfer this original curve, "baked" on the basis of the Flat profile, to the Standard profile. For my purposes, the Standard profile is better. I'm posting a link to this image profile file on my Google Drive - maybe it will be useful to someone. Once again, I express my gratitude to the author of the original image profile for the original idea.
https://drive.google.com/file/d/10fDpcyiu0FHzdM7NZCoMSo33HSyGQyLK/view?usp=sharing

For those of us not fully versed in all this, can you explain why this file is better for your purposes? I've used the OP's two files and really liked them, but if there is an added advantage to this one I'd love to know. Thanks!
Noting that RawDigger defaults its Under Exposure stats to 8 EV below saturation, I created an updated picture control that roughly matched that. While it "works", such that pixels with values lower than a particular threshold are highlighted, I'm not sure that there's much benefit to be had.
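For reference, the rough arithmetic behind that 8 EV threshold, assuming a 14-bit raw scale (the bit depth is an assumption, not something stated in the thread):

```python
# 8 stops below a 14-bit saturation point corresponds to only a few dozen raw DN,
# i.e. well under half a percent of linear full scale.
RAW_SATURATION = 16383                      # assumed 14-bit full scale
threshold = RAW_SATURATION / 2 ** 8         # 8 EV below saturation
print(f"under-exposure threshold ≈ {threshold:.0f} DN "
      f"({threshold / RAW_SATURATION:.3%} of full scale)")   # ≈ 64 DN, ≈ 0.391%
```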
I never understood the reliance on the histogram with ML. A histo will show that SOMETHING is blown or crushed, but you don't know what part of the shot necessarily. There are shots where a person might not care if one part is blown but would care about another part; it would be nice to know exactly where the overexposure is happening.

Agree completely. That's why I like this custom picture control so much!
Visual aids do just that.
Most of the time, you can tell with a glance at the subject.
Can't argue with that.
The reason I like zebras so much is that they are harder to ignore than the histogram.
But there are times when you can't, such as when a bird has a bit of white on it but there are also a few spots of light coming through the trees or leaves in the background. I have found that very challenging with the histogram in places like Costa Rica; I always end up taking a shot and checking.