Why Do We Still Use 8-Bit JPEG When Screens and Phones Are Capable of 10-Bit?

Started 1 month ago | Discussions thread
FluidKnowledge Regular Member • Posts: 448

This has been puzzling me for a while, and I'm not sure how to solve it. JPEG is limited to 8 bits per color channel and is an ancient format from the dawn of the internet, yet photographers still use it as the de facto standard for final images, even though modern cameras capture 12-14 bits per channel.
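To put numbers on it: each extra bit doubles the tonal levels per channel, so an 8-bit file keeps only a small fraction of the levels a 14-bit sensor records. A quick Python sketch of the arithmetic:

```python
# Tonal levels per channel grow as 2**bits.
for bits in (8, 10, 12, 14):
    print(f"{bits}-bit: {2 ** bits:>6} levels per channel")

# 8-bit:    256 levels per channel
# 10-bit:   1024 levels per channel
# 12-bit:   4096 levels per channel
# 14-bit:  16384 levels per channel
```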

And 10-bit laptop, TV, and smartphone screens have been widely available for years now, driven by HDR video. So why are we still using 8-bit JPEG? By discarding those extra bits, our photos will never look as good as they could on modern screens.
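Here's a minimal sketch of what that discarding does to a smooth gradient: round-tripping 10-bit values through an 8-bit file throws away three of every four tonal steps, which is exactly what shows up as banding in skies.

```python
import numpy as np

# A smooth gradient at 10-bit precision (0..1023).
grad10 = np.linspace(0, 1023, 4096).astype(np.uint16)

# Round-trip through 8-bit, as a JPEG pipeline would do.
grad8  = (grad10 >> 2).astype(np.uint8)   # 10-bit -> 8-bit
back10 = grad8.astype(np.uint16) << 2     # shown again on a 10-bit panel

print(np.unique(grad10).size)   # 1024 distinct tonal steps
print(np.unique(back10).size)   # 256 -- three of every four steps are gone
```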

I read the other day that iPhones now default to the HEIF photo format, which stores 10 or more bits of color information per channel. Since the iPhone display is already 10-bit capable, the photos look stunning simply because the files keep more color information than our traditional JPEGs. Viewed on Apple laptops and desktops, those HEIF photos should show noticeably more color than an equivalent JPEG.
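If anyone wants to check their own files, here's a sketch using the third-party pillow-heif package; the convert_hdr_to_8bit flag and bit_depth attribute are my reading of its docs, so treat the exact names (and the filename) as assumptions:

```python
# Sketch: inspect the stored bit depth of an iPhone HEIC file.
# Assumes the third-party pillow-heif package; the convert_hdr_to_8bit
# flag and bit_depth attribute may vary between versions.
import pillow_heif

heif_file = pillow_heif.open_heif("IMG_0001.heic",  # hypothetical filename
                                  convert_hdr_to_8bit=False)
for image in heif_file:
    print(image.bit_depth)  # expect 10 on recent iPhones, 8 on older ones
```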

As far as I can tell, Google doesn't fully support HEIF photos, so Android photos don't currently get the same color benefit, even though virtually all modern Android phone displays are capable of showing 10-bit color.

So what is a traditional photographer supposed to do to avoid being left behind with 8-bit JPEG in a 10-bit display world? Use PNG or TIFF as the final format for photos?
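For what it's worth, both formats can carry 16 bits per channel today. A minimal sketch, assuming OpenCV and a made-up 12-bit image array (cv2.imwrite writes 16-bit PNG and TIFF directly from a uint16 array):

```python
import numpy as np
import cv2

# Hypothetical 12-bit camera data (values 0..4095), BGR order for OpenCV.
raw12 = np.random.randint(0, 4096, (480, 640, 3), dtype=np.uint16)

# Shift 12-bit values up so they fill the 16-bit range evenly.
img16 = (raw12 << 4).astype(np.uint16)

cv2.imwrite("final.png", img16)  # 16-bit-per-channel PNG
cv2.imwrite("final.tif", img16)  # 16-bit-per-channel TIFF
```

The obvious tradeoff is file size: a 16-bit PNG or TIFF of the same scene is many times larger than the JPEG it replaces, which is presumably part of why 8-bit JPEG has stuck around.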
