Does HEIC have wider dynamic range than JPEG?

CAcreeks

This issue was raised on other newsgroups, but without a definitive answer. I tried this scene with the "High Efficiency" vs. "More Compatible" camera settings. The exposure was not quite the same, but close enough. I did not see brighter highlights in the HEIC, nor better shadows, but greens and especially reds are more saturated. However, when converting the HEIC to JPEG (using Affinity Photo), the resulting JPEG gets the same high saturation.

I have no idea how the iPhone works. The HEIF had motion from the spill-over fountain. The "compatible" result was actually an MP4 video, but after AirDrop it became a JPEG. The HEIC showed motion in Preview, but not after AirDrop. Shrug. I must read up on the iPhone!

HEIC converted to JPEG
 
Apple is currently using an 8-bit encoder for their HEIC files, but higher bit depths are allowed by the standard. I believe Fujifilm is using 10-bit.
 
Hmm... my iPhone makes 10-bit HEIC files. Could it depend on the dynamic range of the scene, or perhaps on the software version or iPhone model? Below is from an iPhone 16 picture with iOS 18. It seems to show a depth of 10 bits.

From iPhone 16.
 
Hmm... my iPhone makes 10-bit HEIC files. Could it depend on the dynamic range of the scene, or perhaps on the software version or iPhone model? Below is from an iPhone 16 picture with iOS 18. It seems to show a depth of 10 bits.
Hmmm. That’s new. Or at least news to me. Images from my 16 Pro show 8-bit depth, as do images from the 13 Pro that preceded it.

SOOC HEIC image from iPhone 16 Pro running the latest release version of iOS 18. EXIF reflects an 8-Bit Depth

All the documentation I’ve seen from Apple and Apple-adjacent sources indicates that Apple is using HEIC for efficiency. The goal is a smaller file that “unpacks” to the same quality as a JPEG. More relevant is that I haven't heard anything about this changing in any of my tech or photo conversations — and I'd be paying attention. I wish the files were 10 (or 12) bit.
According to official specifications, the Apple HEIC registered brand uses the Main or Main Still Picture profile of HEVC with an 8-bit-per-channel color depth and chroma subsampling of 4:2:0, while the Sony and Canon HEIX registered brands may additionally increase the bit depth to 10 bits per channel and chroma subsampling to 4:2:2 or 4:4:4.
High Efficiency Image File Format - Wikipedia

Bolding mine. Sorry that I can’t find a source closer to the metal at this moment. A (fast) search online shows lots of people wanting 10-bit from the iPhone. So far, you’re the only person I can find who has seemingly achieved it.

Update:

I just looked at that same photo in Photos on Mac OS. I can't read the bit depth from within Photos, but after an Export Unmodified Original operation, I opened the exported file in Preview. That file does reflect a 10-Bit depth. But (so far) only in Preview. The same file loaded into Fujifilm's RAW Studio or EXIFTool (using JEXIFToolGUI) shows 8-bit.

I suspect it's a bug in Preview, but I don't know this for certain. Another research project…possibly for another day.
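
For anyone who wants to cross-check the same file from the command line, something like this should work (IMG_1234.heic is just a placeholder name, and I'm going from memory on the sips property name; sips ships with macOS, exiftool does not):

# Query the HEVC bit-depth tags, then ask macOS's own sips tool what it thinks
% exiftool -ImagePixelDepth -BitDepthLuma -BitDepthChroma IMG_1234.heic
% sips --getProperty bitsPerSample IMG_1234.heic

If Preview really is mis-reporting, both of these should come back with 8.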
 
Maybe it depends on the export path of the HEIC? Mine (at top of thread) was sent from iPhone to MacBook over AirDrop. Here's what its EXIF says:

% exiftool IMG_0019.heic | grep Depth
Image Pixel Depth : 8 8 8
Bit Depth Luma : 8
Bit Depth Chroma : 8
I just looked at that photo in Photos on Mac OS. I can't read the bit depth from within Photos, but after an Export Unmodified Original operation, I opened the exported file in Preview. That file does reflect a 10-Bit depth. But (so far) only in Preview. The same file loaded into Fujifilm's RAW Studio or EXIFTool (using JEXIFToolGUI) shows 8-bit.
Is Export Unmodified Original an option in Apple Photos?

Do you remember the HEIC photos of Catalina Island, the Big Sur coastline, and other scenes that changed with time of day? It makes sense that HEIC can show water motion, or panorama panning, being just a short video. But I've never found out how the picture can adjust to changing sunlight and be close to correct at all times of the year.

Sonoma and Sequoia HEIC photos show zoom-in and pan, which is easier to implement.
 
Is Export Unmodified Original an option in Apple Photos?
Has been for as long as I can remember — at least in the Mac OS version.



The menu will export "originals" for all of the images in this album because I haven't made any selections.
 

I do believe that it depends somewhat on the dynamic range of the photo. For high-dynamic-range photos there are actually two exposures made, one for the highlights and one for the shadows, and these different images are combined into one displayed image. So the EXIF data for the individual images can be 8-bit, but combining the different exposures extends the effective bit depth. If you have a lower-contrast scene that can be handled by 8 bits, then only one image is made. In the Preview app you can see that high-dynamic-range images have two components, while a lower-dynamic-range image has only one.

If you have exiftool, then you can see the QuickTime tags in the HDR image for the different light levels:

[QuickTime] MaxContentLightLevel : 203
[QuickTime] MaxPicAverageLightLevel : 64

https://www.quora.com/Why-does-the-iPhone-take-2-pictures-when-the-HDR-option-is-selected

So some images may be recorded with eight bits, while other HDR images need two exposures, which are then combined for more bits.
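
If you have the command-line tools handy, something along these lines should show both the light-level tags and what is actually stored in the container (IMG_1234.heic is a placeholder name; heif-info comes from libheif, e.g. via Homebrew, and is not part of macOS, and its output varies a bit by version):

# Dump the QuickTime light-level tags, then list what the HEIF container holds
% exiftool -G -MaxContentLightLevel -MaxPicAverageLightLevel IMG_1234.heic
% heif-info IMG_1234.heic

If the two-exposure idea above holds, an HDR capture should show more stored image data than a low-contrast one.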
 
I do believe that it depends somewhat on the dynamic range of the photo.
I’m looking at the same photo — literally the same image file — using multiple utilities. Only one, Preview, reports — erroneously (?) — that the file is 10-bit.

The final output of the camera image-processing pipeline you describe, Deep Fusion and Smart HDR, is a single image file. That file can be an Apple ProRAW (DNG), JPG, or HEIC; the latter is the topic of this discussion. Apple is on record that their HEIC files — a HEIF container holding a single image encoded as an HEVC payload — are 8-bit. (A file containing a sequence is designated HEICS.) HEIC provides the quality of the compressed JPG format it replaces, deemed good enough for consumers (and more than a few journalist/event shooters), at approximately half the file size. The assumption is that enthusiasts and professionals will use ProRAW if they require greater bit depth.
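
If anyone wants to check which registered brand their own files use, exiftool can print it (IMG_1234.heic is a placeholder name). Going by the brand table in the Wikipedia article quoted earlier — assuming I'm reading it right — an Apple still should report heic, a 10-bit HEVC still would normally use heix, and an image sequence would use one of the sequence brands:

# Print the HEIF container brands recorded in the file header
% exiftool -MajorBrand -CompatibleBrands IMG_1234.heic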

I am unable to locate any source indicating Apple has updated their HEIC standard — and I’ve been revisiting the developer documentation. I would think that a move to 10-bit would be documented and well publicized, deserving a call-out at the introduction of the iPhone 16 or iOS 18. I’m open to evidence to the contrary, but until I see it from a definitive source I’m calling this misinformation, placing the blame on a Preview bug.

As an aside, for anyone wanting a deep dive into HEIF/HEIC from Apple’s perspective:

High Efficiency Image File Format - WWDC17 - Videos - Apple Developer
 
A Bit about “HEIF”
Apple has recently popularized an image format called HEIF (High Efficiency Image Format). Pronounced “heef,” this is a standardized and general-purpose format that has many improvements over JPEG, such as support for 12 and 14 bit images, and more efficient compression, which results in smaller files for the same quality. While HEIF has the ability to address a number of the weaknesses of JPEG, images shot on iPhones with HEIF are still 8-bit, and the images have been significantly processed by the camera. In practice, the advantages I list for RAW are still valid for iPhone-generated HEIFs. Apple uses the .heic file extension — those are HEIF files using a specific compression technology. HEIC-based files are roughly half the file size of equivalent JPEGs, which is a big savings, both in cloud storage and on your device.

Should I Shoot RAW or JPEG? What’s HEIF?
The author of this article is Nik Bhatt, coder behind Raw Power and Nitro.
[…] an 18 year veteran of Apple. His roles in the Photo Apps group included Senior Director of Engineering and Chief Technical Officer. Among other roles, he led the Aperture, iPhoto, RAW Camera and Core Image engineering teams, as well as the imaging team for the Mac version of Photos.
Mr. Bhatt holds over 55 patents in a wide range of disciplines including image processing, audio processing, geotagging, wireless networking, and user interface design.
Again, it’s not impossible that Apple changed the file standard in the interim, but if they did nobody is talking about it.
 
I am unable to locate any source indicating Apple has updated their HEIC standard — and I’ve been revisiting the developer documentation. I would think that a move to 10-bit would be documented and well publicized, deserving a call-out at the introduction of the iPhone 16 or iOS 18. I’m open to evidence to the contrary, but until I see it from a definitive source I’m calling this misinformation, placing the blame on a Preview bug.

As an aside, for anyone wanting a deep dive into HEIF/HEIC from Apple’s perspective:

High Efficiency Image File Format - WWDC17 - Videos - Apple Developer
Thanks, that answers lots of questions, though not all. I guess the time-lapse Catalina Island and time-lapse Big Sur coastline photos were done with "time metadata," as the video says.

The trouble with HEIC is that Nokia claims a patent on "images and image sequences encoded in a particular format (HEVC or AVC)" [WikiP], so industry-wide acceptance is slow.

My iPhone SE 3rd generation does not offer in-phone HDR (the 2nd generation did). According to this article, "older iPhones" without Smart HDR can produce subject motion blur, as below.


[Attached image: example of subject motion blur]
 
Yes, HEIC images (can) have a wider dynamic range.

To see the wider dynamic range, you need a display capable of showing it. Inside the image file you have a primary base image that is designed to display a standard-dynamic-range image. Yes, the base image is a 24-bit RGB image (8:8:8) intended for SDR, but there is more inside the file than just that. Most displays are 8-bit (or 8-bit + A-FRC) SDR, so the base image maps very well to that kind of display. Many of the newer Apple displays have a wider dynamic range:

The dynamic range of HDR on Apple displays varies by device and model. HDR displays can have a dynamic range of up to 14 stops or more.

HDR dynamic range on Apple devices

• iPhones with XDR display: Up to 8x SDR EDR headroom

• iPad Pro's Liquid Retina XDR display: Up to 16x SDR

• Pro Display XDR: Up to 400x SDR in the default XDR preset

• Apple XDR Display (P3-1600 nits): Extreme Dynamic Range (XDR) support up to 1600 nits

HDR dynamic range on Mac models

• MacBook Pro models introduced in 2018 or later: Support HDR video on their built-in display

• MacBook Air models introduced in 2018 or later: Support HDR video on their built-in display, and on external HDR10-compatible displays

• iMac models introduced in 2020 or later: Support HDR video on their built-in display

These HDR displays are accommodated by the HEIC file also containing HDR GainMap data. This data is added onto the base image data and passed through the display's image-processing pipeline before appearing on the screen.
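
A quick way to confirm whether a particular file actually carries that gain map data is to dump every tag with exiftool and filter for it (IMG_1234.heic is a placeholder name, and the exact tag names can vary between exiftool versions):

# List all tags with their group names and keep anything mentioning a gain map
% exiftool -a -G1 IMG_1234.heic | grep -i gain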

HEIC and Image Depth

• Standard HEIC: The HEIC format is a container format and typically stores images with 8-bit per channel depth for standard photos on iPhones, as this is sufficient for most uses and keeps file sizes manageable.

• HDR HEIC: When an image is captured in HDR, it may use Apple’s proprietary enhancements, including computational photography techniques like Smart HDR, which merge multiple exposures to simulate a broader dynamic range. Even a single exposure can have gain map data, as the sensors have a wide dynamic range.

HDR Display and Dynamic Range

• On HDR-capable displays (like those found on modern iPhones and Macs), HDR images take advantage of the display’s ability to render a higher bit depth and brightness range. This doesn’t mean the stored image in the HEIC file is necessarily more than 8 bits per channel—it might still be 8 bits but mapped to a broader perceptual range.

• The rendering process on the display can interpolate or expand the dynamic range for HDR playback, making it look as if the image has more than 8-bit depth in practice.

Two Separate Exposures and Depth

• In HDR mode, Apple’s camera system typically captures multiple exposures (e.g., one for highlights and another for shadows) and combines them using tone mapping. But even images made from one exposure may have the gain map data to expand the dynamic range on a proper monitor.

• While the final image may be stored as an 8-bit HEIC file, the intermediate data during processing does use a higher bit depth (e.g., 10-bit or 12-bit) to ensure minimal loss of detail during merging.

• If exported in formats like ProRAW or other high-bit-depth options, these images can retain higher bit depths (e.g., 12-bit or 14-bit), suitable for professional editing.

In Summary

• Stored Image Depth: The HEIC file might still be 8-bit, but HDR enhancements rely on the merging of exposures and the capabilities of the display to simulate a higher dynamic range visually.

• Intermediate Processing: Even if the final file is 8-bit, the processing behind HEIC HDR images uses higher bit-depth data to ensure a broad dynamic range.

You can also tell by just looking at the images on a good display. So while the base image is 24-bit RGB, there is usually more data added, merged, and interpolated for each pixel, and thus on a good display the HEIC file appears with a wider dynamic range and smooth gradations -- effectively 10-bit HDR.
 
My iPhone SE 3rd generation does not offer in-phone HDR (the 2nd generation did). According to this article, "older iPhones" without Smart HDR can produce subject motion blur, as below.
According to Apple your phone has Smart HDR…
My iPhone SE has no interface for activating and de-activating HDR, but perhaps Smart HDR "automatically uses HDR when it’s most effective."

https://support.apple.com/guide/iphone/adjust-hdr-camera-settings-iph2cafe2ebc/ios

I have an XDR P3-1600 MacBook and an Acer XV265K set to HDR (1700 nits). Both should be able to display HDR, if present.
 
I wonder if "Smart HDR 4" is smart enough to avoid this effect when combining images?
Except when using Live Photo or a special app to produce the effect, I haven’t seen motion blur from an iPhone (under normal lighting conditions) in recent memory. 🤷🏻‍♂️
 
This issue was raised on other newsgroups, but without a definitive answer. I tried this scene with the "High Efficiency" vs. "More Compatible" camera settings. The exposure was not quite the same, but close enough. I did not see brighter highlights in the HEIC, nor better shadows, but greens and especially reds are more saturated. However, when converting the HEIC to JPEG (using Affinity Photo), the resulting JPEG gets the same high saturation.

I have no idea how the iPhone works. The HEIF had motion from the spill-over fountain. The "compatible" result was actually an MP4 video, but after AirDrop it became a JPEG. The HEIC showed motion in Preview, but not after AirDrop. Shrug. I must read up on the iPhone!
It looks like you've taken these shots with the camera set to "Live". In this mode a short video is actually taken and one frame extracted as a still image. If you look in Photos you only see the still image and not the video file but it's there "hidden" in the library. You can change the frame you want extracted in the Edit mode of Photos.

In this case the only difference between HEIC and JPG is the container format and the codec used (HEVC and AVC respectively). The HEVC codec gives a smaller file size. There is no difference in dynamic range.
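
If you have ffmpeg installed, ffprobe should confirm which codec the paired Live Photo video actually uses (IMG_1234.mov is a placeholder name):

# Print just the codec of the first video stream
% ffprobe -v error -select_streams v:0 -show_entries stream=codec_name -of default=noprint_wrappers=1 IMG_1234.mov

With the camera set to High Efficiency this should report hevc, and with More Compatible it should report h264.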

Gain maps have been mentioned in this thread. It is worth noting that iPhone photos automatically have a gain map inserted for both JPGs and HEICs.

Dave
 
This issue was raised on other newsgroups, but without a definitive answer. I tried this scene with the "High Efficiency" vs. "More Compatible" camera settings. The exposure was not quite the same, but close enough. I did not see brighter highlights in the HEIC, nor better shadows, but greens and especially reds are more saturated. However, when converting the HEIC to JPEG (using Affinity Photo), the resulting JPEG gets the same high saturation.

I have no idea how the iPhone works. The HEIF had motion from the spill-over fountain. The "compatible" result was actually an MP4 video, but after AirDrop it became a JPEG. The HEIC showed motion in Preview, but not after AirDrop. Shrug. I must read up on the iPhone!
It looks like you've taken these shots with the camera set to "Live". In this mode a short video is actually taken and one frame extracted as a still image. If you look in Photos you only see the still image and not the video file but it's there "hidden" in the library. You can change the frame you want extracted in the Edit mode of Photos.
Yes! I don't know what I'm doing with the iPhone. I guess Live is the default under normal conditions. Apple help says that "with Live Photos, your iPhone records what happens 1.5 seconds before and after you take a picture." Presumably that means before and after the selected frame.

I tried posting the Live image, set to Loop, on Facebook. It didn't move. However, I have seen 3D and pan-scan images posted by friends on Facebook.
In this case the only difference between HEIC and JPG is the container format and the codec used (HEVC and AVC respectively). The HEVC codec gives a smaller file size. There is no difference in dynamic range.
The Unix file(1) command says the "JPEG" is actually a MOV container with QuickTime encoding. Maybe QT is the same as AVC?
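
(For what it's worth, QuickTime is the container and AVC/H.264 is one of the codecs that can live inside it, so exiftool should be able to settle which codec is in there; the compressor tags on the video track show the codec identifier, e.g. avc1 for AVC/H.264 or hvc1 for HEVC. The file name below is just a placeholder for the AirDropped file.)

# Show the video codec recorded inside the QuickTime/MOV container
% exiftool -CompressorID -CompressorName the-airdropped-file.jpg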
Gain maps have been mentioned in this thread. It is worth noting that iPhone photos automatically have a gain map inserted for both JPGs and HEICs.
Looks like I'll have to trick my phone into shooting HDR instead of Live, if I want to take a look at the effect of gain maps.
 