Why does Apple Lie to Photographers?

  • Thread starter: Peter Sills (Guest)
Alright, so Apple charges a premium for their systems. The "Apple Tax" so to speak. They are supposed to be THE computer for the working photographer, graphic artist, videographer, etc.

So how come they lie to us???

Apple monitors so far DO NOT DISPLAY 10-BIT COLOR!!!

Apple monitors display 8-bit color and simulate the remaining 2 bits by dithering from an 8-bit palette. In other words, they take in a 10-bit signal, but display it at 8 bits. The only Apple display that truly shows 10-bit is the 5k display under El Capitan. Huh?
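To put numbers on that: a 10-bit channel has 1,024 levels and an 8-bit channel has 256, so a panel that drops the low 2 bits collapses four adjacent input levels into one. A minimal sketch of the arithmetic (my own illustration, not anything from Apple):

```python
# Illustration only: what an 8-bit panel does with a 10-bit value if it
# simply drops the low 2 bits instead of dithering them.

def truncate_to_8bit(value_10bit: int) -> int:
    """A 10-bit channel has 1024 levels (0-1023), an 8-bit channel 256 (0-255).
    Shifting right by 2 collapses four adjacent 10-bit levels into one."""
    return value_10bit >> 2

for v in (512, 513, 514, 515):           # four distinct 10-bit gray levels
    print(v, "->", truncate_to_8bit(v))  # all map to 128: the distinction is gone
```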

Read here: https://macperformanceguide.com/blog/2015/20151105_2300-OSX_ElCapitan-10bit-dualNEC.html

Apple has been touting 10-bits since 2015 and it is all BULL! I assume that the true 10-bit displays from NEC, EIZO and BenQ will work properly with El Capitan as well, but am unsure until someone tests it.

My PCs running Windows 10 run 10-bit flawlessly with those displays and an Nvidia card.

I smell another lawsuit!
 
Alright, so Apple charges a premium for their systems. The "Apple Tax" so to speak. They are supposed to be THE computer for the working photographer, graphic artist, videographer, etc.
In my professional experience, there are far more Windows machines in use by photographers, videographers, and most certainly, graphic designers. We have a couple, but our shop could not run on only Macs, as the raster image processors we use for proofing and output are often incompatible and can't be relied upon for long periods of use. I don't think Apple claims outright that they are THE computer for this type of work. It's mostly FUD spread through internet message boards by people from outside the industry pretending to know more than they actually do.
 
One of the reasons we use Windows is that there are some FREE programs for Windows that we rely on that don't even exist on Macs. Though the Macs are great systems, I think they are more geared toward home users than professionals.
 
You do realize that Chambers wrote that piece back in October 2015,

and that there have been at least two updates of the OS X operating system since El Capitan: Sierra and High Sierra.
 
Yes, but El Capitan is the first OS to officially support 10-bit output and so far only one Apple monitor (5k) supports it.
 
You do realize that Chambers wrote that piece back in October 2015,

and that there have been at least two updates of the OS X operating system since El Capitan: Sierra and High Sierra.
I am not aware of Sierra or High Sierra addressing the issue.
 
Yes, but El Capitan is the first OS to officially support 10-bit output and so far only one Apple monitor (5k) supports it.

--
Peter Sills
Digital Focus
www.digitalfocus.net
www.focusstudios.com
Is your hardware (video and graphics processor as well as the display) recent enough to support 10-bit?

By the way, my 27" Retina iMac has a 10-bit display and a 10-bit-capable video pipeline.

--
Ellis Vener
To see my work please visit http://www.ellisvener.com
And follow me on instagram at therealellisv
 
Actually, it's not. It's an 8-bit panel, and while it will take in a 10-bit signal, the remaining 2 bits are dithered. Call Apple. They will confirm this. The ONLY true 10-bit Apple display is the standalone 5k.
 
Apple monitors display 8-bit color and simulate the remaining 2 bits by dithering from an 8-bit palette. In other words, they take in a 10-bit signal, but display it at 8 bits. The only Apple display that truly shows 10-bit is the 5k display under El Capitan. Huh?
Because it is the only display Apple sells that offers 10-bit colour.
Read here: https://macperformanceguide.com/blog/2015/20151105_2300-OSX_ElCapitan-10bit-dualNEC.html

Apple has been touting 10-bits since 2015 and it is all BULL! I assume that the true 10-bit displays from NEC, EIZO and BenQ will work properly with El Capitan as well, but am unsure until someone tests it.
You have an interesting definition of 'touting'. You need to search deep in Apple's documentation to find any mention of 10-bit colour. And what you call lying is, at worst, a failure to distinguish between hardware 10-bit and temporal-dithering 10-bit in the System Information app.
 
Actually, it's not. It's an 8-bit panel, and while it will take in a 10-bit signal, the remaining 2 bits are dithered. Call Apple. They will confirm this. The ONLY true 10-bit Apple display is the standalone 5k.
There is no standalone 5K Apple display. Their only 5K display is the one built into the 5K iMac, and it is exactly that display which has the 10-bit hardware. Lloyd is rambling a bit in the linked article, which might have contributed to you getting this wrong.
 
Sorry, I didn't realize the 5k was only available built into an iMac. However, that is the only display from Apple which currently works.
 
Yeah, I checked and you are correct. Though they announced plans for a standalone 5k, the only current one is in the very expensive iMac.
 
Apple has been talking about 10-bit since 2015 and just delivered it in the middle of 2017.
 
Apple has been talking about 10-bit since 2015 and just delivered it in the middle of 2017.

--
Peter Sills
Digital Focus
www.digitalfocus.net
www.focusstudios.com
No they didn't - it was already there in their iMac hardware, but El Capitan was needed to provide the OS support. So 2014 iMacs running El Cap or later have been delivering 10-bit colo(u)r.

And, like mentioned by others before, it was done 'quietly':
https://www.cultofmac.com/395028/apple-quietly-added-10-bit-color-support-for-new-5k-imac/

Next time you accuse a company of lying, do your homework first or save up for a good lawyer.
 
These are decent monitors, but for TRUE 10-bit you will need a pro monitor from EIZO, NEC or BenQ, as true 10-bit support is RARE.

In fact, even Apple is folding a bit on their own 10-bit claims.

Read the section on the new iMac Pro when it comes to color: https://www.apple.com/imac-pro/

Note that they say "10-bit spatial and temporal dithering" in 10-bit mode. Here is the definition:

There are two kinds of dithering, both of which can simulate colors a panel cannot natively reproduce:
  • Spatial dithering. It is done by sticking two different colors next to each other. At a normal viewing distance, those two colors will appear to mix, making us see the desired target color. This technique is often used both in print and in movies.
  • Temporal dithering. Also called Frame Rate Control (FRC). Instead of sticking two similar colors next to each other, a pixel will quickly flash between two different colors, thus making it look to observers like it is displaying the averaged color.
With good dithering, the result can look very much like higher bit-depth, and many monitors use this process to smooth out gradients onscreen. 8-bit monitors can use dithering to generate a picture that looks very much like it has 10-bit color depth.

This is not true 10-bit! (I don't know if the current 5k iMac is doing the same, but would assume so.)
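To make the FRC idea concrete, here is a rough sketch (my own illustration under simplified assumptions, not Apple's or any panel vendor's actual implementation) of how flashing between the two nearest 8-bit levels can average out to a 10-bit target:

```python
# Simplified FRC / temporal-dithering sketch: approximate a 10-bit level
# on an 8-bit panel by alternating between the two neighbouring 8-bit levels
# so the time-average over 4 frames matches the target.

def frc_frames(value_10bit: int) -> list[int]:
    """Return 4 consecutive 8-bit frame values whose average equals the
    10-bit target. The low 2 bits (0-3) decide how many of the 4 frames
    show the higher of the two neighbouring 8-bit levels."""
    low = value_10bit >> 2       # nearest 8-bit level at or below the target
    frac = value_10bit & 0b11    # remainder, in quarters of an 8-bit step
    high = min(low + 1, 255)     # next 8-bit level up, clamped at white
    return [high if i < frac else low for i in range(4)]

frames = frc_frames(514)          # 10-bit target 514 = 128.5 in 8-bit terms
print(frames)                     # [129, 129, 128, 128]
print(sum(frames) / len(frames))  # 128.5 -> the time-average hits the target
```

The average is right, but each individual frame is still an 8-bit value, which is exactly the distinction being argued here.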
 
Checked (I just did). 8-bit panels with 10-bit processing.

No true 10-bit.
 