Why some camera manufacturers still haven't implemented HEIF

HEIF isn't a competitor on ¾ of the laptops and desktops currently in use.
I suspect the vast majority of those computers are being used for very mundane purposes and not for processing or viewing photos.

For photography I have nothing against JPEG for delivering photos - that's the utility of a lowest-common-denominator format - but HEIF is superior to JPEG as an initial recording format.
 
but HEIF is superior to JPEG as an initial recording format.
Is it superior to the vendor's raw format or is the intent of HEIF to replace all of the various raw formats to become a universal standard for high-quality recording?

I don't know the answer, that's why I'm asking.

I shoot both raw and jpg on my cameras. I scan my film to raw. Sometimes I'll just use the jpg but usually I edit the raw to jpg. I don't see any reason for me to want yet another standard.
 
but HEIF is superior to JPEG as an initial recording format.
Is it superior to the vendor's raw format
I don't think it is, but that's for the type of photographs I like to make and the non-portrait range of subjects I photograph. In all likelihood, what you do will probably be different from what I do.
or is the intent of HEIF to replace all of the various raw formats to become a universal standard for high-quality recording?
I think that is a pretty broad and, frankly, unsustainable supposition.
I shoot both raw and jpg on my cameras. I scan my film to raw.
If you are using a scanner, the files that are produced aren't really a raw format, no matter what your scanning software is telling you. At least that was true the last time I checked, which was probably ten years ago, but things might have changed. If you are using a camera to digitize prints or film, that is a different matter.
Sometimes I'll just use the jpg but usually I edit the raw to jpg. I don't see any reason for me to want yet another standard.
If what you are doing works for you, don't change. But is it worth doing critical testing? I think so.
 
Good camera!
 
I'm quite certain that none of my current cameras will get a firmware update (if that's all that would be required?) to enable them to save as HEIF. And I'm not going to buy a newer camera just because it does save as HEIF. Actually, I don't intend to buy a newer camera at all.

If my post software supported it, is there any reason to save cooked raw images to HEIF?

And what kind/res' monitor is needed to view these HEIF files?
 
A monitor which supports 10bit color.
 
A monitor which supports 10bit color.
From my monitor specs...



[screenshot of monitor specs]
 
The best-selling camera in the world is the iPhone and it shoots HEIF, not JPEG.
That's a misleading statement since probably many to most people didn't buy the iPhone specifically for the camera. Very few people who use the iPhone camera care at all what image format the phone uses.
iPhone users don’t care what image format the camera uses, they care that the photos look good… and they do. People post on DPReview all the time that they get better SooC shots out of their phone than out of their dedicated cameras, and a 10-bit lossy file format is part of that system… not all of it, but part of it.

Apple has been shipping ultra high-quality screens in their laptops and phones for a long time, and a 10-bit image file is part of the quality. As a photographer, you know perfectly well that we seek perfection from glass to glass.

HEIF is a superior image format to JPEG. It just plain is, and that’s why Apple is using it. 4:2:2 10-bit is better than 4:2:0 8-bit, because of today’s screens.
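If you want to sanity-check that claim, here's a quick back-of-the-envelope sketch in Python (illustrative arithmetic only; real file sizes depend on the codec):

# Rough pre-compression bits per pixel for each scheme.
# Chroma subsampling keeps a fraction of the luma resolution
# in each of the two chroma planes.
def bits_per_pixel(bit_depth, chroma):
    chroma_fraction = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}
    return bit_depth * (1 + 2 * chroma_fraction[chroma])

print(bits_per_pixel(8, "4:2:0"))   # JPEG-style: 12.0 bits/pixel
print(bits_per_pixel(10, "4:2:2"))  # 10-bit HEIF-style: 20.0 bits/pixel

So the 10-bit 4:2:2 pipeline carries roughly two-thirds more sample data per pixel before compression even starts.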
 
Sittatunga wrote:
It's a horrible format, so let's hope not.

In one of my many silly Samsung upgrades it reset my default from jpeg to heif. It was awful and it took me quite a while to find the problem and reset it.
“Oh my Lord, it’s something new! Quick, get it away from me!!”

John, this is the nature of new things.

LCD panels now support 10-bit color, but JPEG cannot. That is why HEIF is so much better. We just need time for Microsoft and the others to come up to speed on it.
I'm sure they will the instant they don't have to pay licensing fees. The situation between the formats is a bit like the Betamax versus VHS struggle except that JPEG has had a 13 year start on HEIF and has already seen off JPEG 2000.
13 years? No, JPEG came out around 1990.
My apologies for the typo, it's nearly a 23 year start. According to Wikipedia, JPEG came out on 18 September, 1992 and HEIF was finalised in the middle of 2015.
JPEG2000 went nowhere. It’s not a competitor. If you want 10-bit color lossy, it’s HEIF.
HEIF isn't a competitor on ¾ of the laptops and desktops currently in use. I'm not saying that HEIF will go the way of JPEG 2000, but it's not immediately noticeably better than JPEG on a 6" / 150mm phone screen, so it's only really relevant if you want to reach the 1 person in 7 who uses macOS. By all means add HEIF output to cameras, though it's not a format most people can use. Look at HEIF images in the privacy of your own home by all means, but it isn't a useful publishing format.
The best-selling camera in the world is the iPhone and it shoots HEIF, not JPEG.
It's had between 20% and 24% (currently 16%) of global shipments since 2009, and its shipments exceeded 15% of global shipments in 32 of those 96 quarters, see www.statistica.com/statistics/316459/global-market-share-of-apple-iphone/ . So, like computers, well under a quarter of smartphones produced since HEIF was a thing use HEIF. (I have twice been issued with an office iPhone in the days when they produced JPEGs, and was expected to take site inspection photos with it. It was unbelievably complicated to get photos from them into a Word document, particularly when IT forbade us to plug phones into computers for security reasons. I'm well shut of them now.)
It’s just silly that you’re thinking an inferior format is better. It’s not better and if you want better color and HDR in snapshots, you must use HEIF because JPEG cannot do it.
Actually 16 bit TIFF is a much better format for better colour and dynamic range, and just about any computer and display can use them. They are huge, though very useful for extreme colour and contrast stretches. But the human eye can distinguish between about a million and about 10 million colours, so 8 bits per channel (16 777 216 colours) is comfortably more than the human eye can see anyway. So the difference between JPEG and HEIF as a display or printing format is the difference between 'more than adequate' and 'much more than adequate'. I'd rather edit lossless 14 bit RAW format files than any 8 or 10 bit format.
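For anyone who wants to check the arithmetic, the colour counts per bit depth are simple powers of two (a quick Python sketch):

for bits in (8, 10, 14, 16):
    print(f"{bits}-bit per channel: {2 ** (3 * bits):,} colours")

# 8-bit : 16,777,216
# 10-bit: 1,073,741,824
# 14-bit: 4,398,046,511,104
# 16-bit: 281,474,976,710,656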
Thank you for confirming the fact that the iPhone is far and away the best-selling camera in the world.

As I correctly explained, there are many photographers who do not wish to photograph in RAW, or who cannot. You always shoot RAW, good for you, here’s your medal. For many reasons, that’s not possible for everyone. If you’re shooting for a newspaper, you need photographs ASAP.

HEIF delivers more dynamic range with the same file sizes as JPEG. So you can still move the Highlights/Shadows/Saturation sliders around and get a decent image, without the steps of RAW editing. I do that on my iPhone all the time. I once tried using the 48 megapixel mode on my iPhone and it’s virtually useless: it created 60 megabyte DNGs, which took forever to copy off the phone, and it disabled all of the phone’s image processing as well, so you spend several minutes downloading one image which you can then edit in Lightroom or wherever.
 
It's a horrible format, so let's hope not.

In one of my many silly Samsung upgrades it reset my default from jpeg to heif. It was awful and it took me quite a while to find the problem and reset it.
“Oh my Lord, it’s something new! Quick, get it away from me!!”

John, this is the nature of new things.

LCD panels now support 10-bit color, but JPEG cannot. That is why HEIF is so much better. We just need time for Microsoft and the others to come up to speed on it.
I'm sure they will the instant they don't have to pay licensing fees. The situation between the formats is a bit like the Betamax versus VHS struggle except that JPEG has had a 13 year start on HEIF and has already seen off JPEG 2000.
Betamax had better IQ, but VHS was cheaper and more open.

HEIF has better IQ than JPEG, and is a free and open format. It will dominate in a few years (when most consumers upgrade their hardware) if nothing else shows up.
 
It's a horrible format, so let's hope not.

In one of my many silly Samsung upgrades it reset my default from jpeg to heif. It was awful and it took me quite a while to find the problem and reset it.
“Oh my Lord, it’s something new! Quick, get it away from me!!”

John, this is the nature of new things.

LCD panels now support 10-bit color, but JPEG cannot. That is why HEIF is so much better. We just need time for Microsoft and the others to come up to speed on it.
I'm sure they will the instant they don't have to pay licensing fees. The situation between the formats is a bit like the Betamax versus VHS struggle except that JPEG has had a 13 year start on HEIF and has already seen off JPEG 2000.
Betamax had better IQ, but VHS was cheaper and more open.

HEIF has better IQ than JPEG, and is a free and open format. It will dominate in a few years (when most consumers upgrade their hardware) if nothing else shows up.
I don’t think HEIF is free, but its patents are all part of the HEVC patent pool, and most cameras use those patents anyway. I have no idea whether Via Licensing charges more when the patents are used for HEIF inside a camera.
 
AFAIK, the HEIF container format is royalty-free for everybody. HEIC (i.e. the HEVC compression) is not.
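If anyone wants to see the container/codec split for themselves, here is a minimal Python sketch that reads the brand declared in the 'ftyp' box at the start of the file ("photo.heic" is just a placeholder path):

import struct

# Every HEIF file opens with an ISO-BMFF 'ftyp' box. The container
# layout is the same regardless; the declared brand hints at the
# codec inside: 'heic'/'heix' = HEVC (royalty-bearing),
# 'avif' = AV1, 'mif1' = generic HEIF image.
with open("photo.heic", "rb") as f:          # placeholder path
    size, box_type = struct.unpack(">I4s", f.read(8))
    assert box_type == b"ftyp", "no ftyp box at the start of the file"
    major_brand = f.read(4).decode("ascii")
    print("major brand:", major_brand)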
 
And what kind/res' monitor is needed to view these HEIF files?
See if it's on the list here:

https://displayhdr.org

If it is, then physically it can display HDR (according to the VESA standard), but you also need it to be hooked up to a computer whose hardware, OS and stills-photography application software can display HDR.

To simplify things, Macs with displays since 2018 (2019 or 2020 depending on the model) have the HW and SW to do it for stills. Video, on the other hand, has been figuring this out since 2015 and is much farther along. Practically all TVs larger than 32" have been HDR capable for video since around 2020; I have a 2016 TV that is HDR. You can hook up most TVs in people's homes right now to your computer as a display and get HDR stills with the proper computer HW/SW. That doesn't mean the colors will be accurate, but that's another discussion.
 
I'm quite certain that none of my current cameras will get a firmware update (if that's all that would be required?) to enable them to save as HEIF. And I'm not going to buy a newer camera just because it does save as HEIF. Actually, I don't intend to buy a newer camera at all.

If my post software supported it, is there any reason to save cooked raw images to HEIF?

And what kind/res' monitor is needed to view these HEIF files?
Any monitor that can display JPEG can also display HEIF files. It's not a monitor requirement but a software one. Of course, if your image includes HDR data, both the monitor and the software, e.g., the browser, need to support it.

I started a thread in another forum for such images. If you have a supporting monitor, you should be able to see them in Chrome and some other browsers, like Brave. This forum does not display the originally uploaded photo, but a derivative of it in the main post, so to test whether your setup supports HDR display, follow the link to the original. These are not HEIF files but HDR JPEG files exported from Lightroom.
https://www.dpreview.com/forums/post/67881805

If your setup supports viewing these files, the originals will look much different from the ones shown in the message (mostly brighter brights, richer yellows, etc.)
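For what it's worth, if you want to handle HEIF on the software side yourself, here is a minimal sketch assuming the third-party pillow-heif package (pip install pillow-heif); the filenames are placeholders:

from PIL import Image
from pillow_heif import register_heif_opener

register_heif_opener()                # teaches Pillow to open .heic/.heif
img = Image.open("IMG_0001.heic")     # placeholder filename
print(img.size, img.mode)
img.save("IMG_0001.jpg", quality=90)  # fall back to JPEG for wider reach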

--
Victor Engel
 
“Buy one get one free” still means it isn’t free, right?
 
HEIF image quality is excellent (4:2:2 10-bit) with a compact file size. Nowadays most smartphones and Macs support it, and more TVs and computer monitors support 10-bit HDR.

Why have some camera manufacturers still not implemented it?
Raw is superior. And I think most dedicated camera users who actually go through the trouble of even buying a camera don't want an inferior format to Raw. I'd never shoot in HEIF. That doesn't mean I wouldn't export in it, but there's no point in cameras supporting it.
Your point about RAW is correct, of course.
But HEIF is far better than RAW,
HEIF doesn't contain as much data as a RAW file, and is not as malleable. That makes HEIF 'worse' for some people.
and journalists who must shoot in JPEG can now shoot in a 10-bit color format. For instance, credentialed Olympic photographers cannot edit their photos in any way or else they lose their credential.
Yes, less malleable (and smaller file size) is 'better' for some people.
Really? Is this truly so difficult for you to understand, or are you just trying to be obtuse?

I’ve repeated this over and over. Not all photographers can use RAW files. Not all applications permit it. RAW is lossless or visually lossless compressed and JPEG/HEIF are not. RAW files cannot be used straight out of the camera and JPEG/HEIF can.

HEIF is a better JPEG, it is not a better RAW.
 
HEIF image quality is excellent (4:2:2 10-bit) with a compact file size. Nowadays most smartphones and Macs support it, and more TVs and computer monitors support 10-bit HDR.

Why have some camera manufacturers still not implemented it?
Raw is superior. And I think most dedicated camera users who actually go through the trouble of even buying a camera don't want an inferior format to Raw. I'd never shoot in HEIF. That doesn't mean I wouldn't export in it, but there's no point in cameras supporting it.
Your point about RAW is correct, of course.
But HEIF is far better than RAW, and journalists who must shoot in JPEG can now shoot in a 10-bit color format. For instance, credentialed Olympic photographers cannot edit their photos in any way or else they lose their credential.
HEIF, like JPEG, is a finished image file type. 8-bit colour is more than enough for printing from or viewing, but 10-bit is better for the sort of editing that news journalists are not allowed to do anyway. The HEIF compression may well be better, but the final result is a format that ¾ of the people you want to reach can't open. The advertisers won't like that.

It's the old videocassette thing again; Betamax was better but VHS was good enough and cheaper.
It’s not Betamax vs VHS, because I couldn’t update the firmware in my VHS player to play Betamax tapes.

My Canon R5 can capture HEIFs. My iPhone captures HEIFs. It’s not that big a deal. The only question is whether more browsers and software products support HEIF or not, and if they do, we’ll use HEIF more than JPEG and we won’t even think about it.
 
