Olympus: are the orange and blue warnings based on JPEG?

Started 2 months ago | Discussions thread
Anders W • Forum Pro • Posts: 22,027
Re: Olympus: are the orange and blue warnings based on JPEG?

JakeJY wrote:

Anders W wrote:

JakeJY wrote:

Anders W wrote:

Mark Ransom wrote:

Anders W wrote:

The raw data must be read and stored as part of the live-view feed since they are the basis for the converted data ultimately displayed. Hence, I can’t imagine it’d be much of a problem using the data prior to, rather than after, conversion for histograms and highlight/shadows warnings.

To support the extreme data rates required for real-time display update, the processing pipeline has to be highly optimized. The raw data may not be as easily accessible as you imagine.

I don’t imagine anything. I know it. The raw data are the very first stage of the pipeline you are talking about. Hence, they are by definition as easily accessible as they could be.

But the point is it may not be easily accessible to the part of the processor that draws the histogram (which is likely higher-level, given that nearly all cameras compute it from the processed JPEG or viewfinder data, not the actual RAW values).

They may already have a highly optimized pipeline in the chip that writes out to the RAW file, and another that does demosaicing and all the other processing necessary to generate the JPEG, video, or viewfinder view.

Well, for a mirrorless camera, pretty much everything you do when shooting depends on the raw data from the sensor: live-view, metering, AF, post-capture JPEG creation, and so on. It follows that the raw data must be stored in a place that is as quickly accessible to all these processes as possible.

In fact, calculating histograms and highlight/shadows warnings from the raw data rather than the tone-mapped data used for live-view should reduce live-view latency a bit, since the calculation can then be done in parallel with, rather than after, tone-mapping.

If the tone-mapped live-view images are demosaiced, working from the raw data saves some additional calculation time since there is less data to deal with: 12 bits per pixel rather than 3x8 bits. But it's possible that the live-view images rely on pixel-binning rather than demosaicing.
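To make the contrast concrete, here is a minimal Python sketch of how a warning computed on the raw values can differ from one computed after processing. Everything in it is an assumption for illustration: the 12-bit range, the white-balance gain, and the gamma curve stand in for whatever a real pipeline actually does.

```python
import random

# Hypothetical 12-bit raw samples straight off the sensor (0..4095).
random.seed(0)
raw = [random.randrange(0, 4096) for _ in range(10_000)]
raw[:100] = [4095] * 100  # simulate a blown highlight patch

SAT = 4095

# Raw-based warning: a photosite is clipped iff it sits at sensor saturation.
raw_clipped = sum(1 for v in raw if v >= SAT)

# JPEG-style warning: apply a hypothetical white-balance gain and an
# 8-bit gamma curve first. The gain pushes some unclipped raw values
# to the 8-bit ceiling, so this warning fires on more photosites.
WB_GAIN = 1.8

def to_8bit(v):
    x = min(v * WB_GAIN / SAT, 1.0)
    return int(255 * x ** (1 / 2.2))

jpeg_clipped = sum(1 for v in raw if to_8bit(v) >= 255)

print(raw_clipped, jpeg_clipped)
```

Because the gain and curve saturate the 8-bit output well before the sensor itself clips, the JPEG-style count comes out much larger, which is the gist of why a raw-based warning describes the recorded file more faithfully.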

And as to the reasons why camera makers do what they do, there are plenty of other reasons for that than the technical ones you suggest. See here:

https://www.dpreview.com/forums/post/65295442

I'm looking at the Magic Lantern wiki mentioned there, and it seems that for photos the ExpSim function must be on for the RAW-based exposure feedback functions to work, so that must be the part of the pipeline that exposes the raw data in that case. Other systems might not have an equivalent.

https://wiki.magiclantern.fm/raw_based_exposure_feedback

Well, what Magic Lantern has proved is that there aren't any technical impediments to calculating live-view histograms and highlight/shadows warnings based on the raw data rather than the tone-mapped data used for EVF/LCD display.

Well, I'm not arguing that it's not technically possible (in fact, I started this thread to say it was possible). I'm just saying it's not necessarily as straightforward as you say it is for existing MFT cameras.

The part of the chip that generates the histogram may not have access to the raw data, or may not have the processing budget to use it even if it did. Because generating the live view, JPEGs, videos, etc. in real time is critical to the function of the camera, the chip may already have dedicated pipelines for those functions. The more general-purpose part of the chip, which may be handling the histogram and highlight/shadow warnings, might not even have access to the RAW data (as opposed to the already processed data).

1. See what I already said about access.

2. Few things are quite as simple as creating a histogram, so even the most narrow-minded of chips should be able to do it, even though (for reasons already explained) they don’t have to.
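As a rough illustration of point 2: a histogram is a single pass of integer divisions and increments over the frame. The bin count and the toy sample values below are arbitrary, chosen only to show how little per-pixel work is involved.

```python
# Toy 12-bit raw readout values standing in for a sensor frame.
samples = [0, 512, 512, 4095, 2048, 2048, 2048]

BINS = 64
BIN_WIDTH = 4096 // BINS  # fold the 12-bit range into 64 bins

hist = [0] * BINS
for v in samples:
    hist[v // BIN_WIDTH] += 1  # one divide, one increment per pixel

print(hist)
```

The per-pixel cost is one shift-like divide and one memory increment, which is why even a modest coprocessor can keep up with a live-view frame rate.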

There are also nuances, like the fact that the viewfinder, AF, metering, etc. often operate on a reduced-resolution image, so those functions never need to access the RAW data directly anyway. There's also the interpolation needed to fill in the blue pixels on sensors with PDAF arrays.
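A toy sketch of the kind of gap-filling that last point refers to, assuming (purely for illustration) that the affected photosites are simply averaged from their horizontal neighbours. Real firmware uses calibrated, sensor-specific interpolation; the row values and PDAF-site positions here are made up.

```python
# A single sensor row; None marks a photosite occupied by a PDAF element.
row = [100, 104, 98, None, 102, 96, None, 101]

filled = row[:]
for i, v in enumerate(row):
    if v is None:
        # Average the nearest valid neighbours on either side
        # (fall back to the other side at the row edges).
        left = row[i - 1] if i > 0 else row[i + 1]
        right = row[i + 1] if i + 1 < len(row) else row[i - 1]
        filled[i] = (left + right) // 2

print(filled)
```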

For reasons already explained, don’t confuse the fact that the things you mention operate at reduced resolution with the question of whether they need access to raw data. The point is that the raw data are the sine qua non of all digital camera functions, especially in live-view (i.e., mirrorless) cameras.
