Why are sensors designed to mimic the eye’s spectral response?

Started 2 months ago | Discussions thread
NateW Forum Member • Posts: 85

First of all, let me say that I know I’m punching above my weight class here. I’m sure there’s something I’m misunderstanding or missing, so I want to be clear that I’m offering these questions out of curiosity, not necessarily contention with the way things are done.

Anyway, suppose you used a theoretical digital camera that perfectly satisfies the Luther-Ives condition to photograph the full color spectrum, and then displayed that image on a theoretical monitor whose primaries perfectly match the color response of the sensor. Wouldn’t you end up with an image that distorts color by effectively multiplying the color response boundaries of the human eye? For instance, in the true red part of the spectrum (say, 620-680nm) the response of the human eye falls off quickly; if the camera response also falls off just as quickly, wouldn’t you be doubling the visual falloff?
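To make the "doubled falloff" worry concrete, here is a toy numerical sketch. The Gaussian response curve below is invented purely for illustration (it is not real cone or sensor data), but it shows what happens arithmetically if a camera's deep-red rolloff were applied on top of the eye's own rolloff:

```python
import math

# Hypothetical long-wavelength ("red") response modeled as a Gaussian
# peaking at 600 nm with a 40 nm width -- illustrative numbers only,
# not measured cone or sensor sensitivities.
def response(wavelength_nm, peak=600.0, width=40.0):
    return math.exp(-((wavelength_nm - peak) / width) ** 2)

eye = response     # the eye's falloff in the deep reds
camera = response  # a camera that mimics that falloff exactly

# Relative falloff from 620 nm to 660 nm for one response alone...
single = eye(660) / eye(620)

# ...versus a chain where the camera's falloff multiplies the eye's:
chained = (camera(660) * eye(660)) / (camera(620) * eye(620))

# With identical curves, chained == single ** 2: the deep-red
# rolloff is applied twice, so the combined falloff is steeper.
print(single, chained)
```

Of course, whether a real color-managed pipeline actually compounds the two falloffs like this is exactly the question; the sketch only shows the arithmetic of the scenario described above.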

In the analog realm, film stocks are typically much more sensitive to these deep reds than the human eye, and images are printed on paper whose dyes reflect light beyond the deep reds into the near IR. Thus, when viewed by human eyes (under light that contains these near-IR wavelengths), the red response is rolled off naturally.

I understand that displays aren’t capable of reproducing the same full spectrum that print-paper dyes pass, but wouldn’t it make more sense, then, to design sensors to match physically achievable display/print color gamuts more closely?
