Thom Hogan hints 50mp for upcoming D820/D850

Started Mar 19, 2017 | Discussions thread
havoc315 Veteran Member • Posts: 4,552
Re: Simulation vs. WYSIWYG

Canadianguy wrote:

The example you made yourself can't. The sensor is making 30 or 60 very dark images per second (they are made, even if they are not stored) while you are being shown a bright one in the EVF.

You're playing with semantics with absolutely no point. When you view the image in the EVF, if your exposure is going to be a bright exposure, you see a bright image. If your exposure is going to be a dark image, you see a dark image.

Just like the final image, it is coming from the imaging sensor. Are you under some belief that there are extra sensors in mirrorless cameras? You're confusing me greatly. If the data isn't coming from the imaging sensor, where do you think it is coming from?!

No - not at all.

What you see in the EVF is not information coming directly from the sensor. It's an image that has been heavily manipulated and interpolated by the camera's software.

But the processor is indeed getting that information -- wait for it -- from the sensor.

Again, you're playing with semantics for no reason. If I say "I flew directly from New York to Paris," you would object and say, "No, you took a plane!"

Yes, the data goes through a processor, because you can't see raw data. And people can't flap their arms and fly from New York to Paris.

The data originates directly from sensor information. If I point my camera at a flower, the light gets directed through the lens to the SENSOR, the SENSOR collects data, that data gets processed, and the result is sent to the viewfinder.

For some reason, you seem to be objecting to the word "direct" because there is a processor in between. But that's a ridiculous semantic distinction. If you look at that flower with your eyes, no camera at all, is the image going directly to your brain? By your strict definition, nothing is ever direct. After all, the flower isn't actually transposed onto your brain: light travels through the air, the receptors in your eyes convert that light into nervous-system signals, and those signals travel to the brain, where they are processed into an image. So by your strict semantics, nothing is ever direct.
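To make the sensor-to-EVF idea concrete: here is a minimal, purely illustrative Python sketch of exposure simulation, the technique both posters are circling. It is not any camera's actual firmware; the function name, the gain value, and the 8-bit value range are all assumptions for illustration. The point is only that a dark sensor readout can be digitally brightened before it reaches the eyepiece, which is why the EVF can show a bright preview while the sensor itself is producing dark frames.

```python
import numpy as np

def simulate_evf_preview(sensor_frame, exposure_gain):
    """Apply digital gain to a raw sensor readout and clip to the
    8-bit display range, roughly the way an EVF preview pipeline
    might brighten a dark readout frame for display. Illustrative
    sketch only, not a real camera pipeline."""
    preview = sensor_frame.astype(np.float64) * exposure_gain
    return np.clip(preview, 0, 255).astype(np.uint8)

# A deliberately underexposed "sensor readout" (hypothetical values):
# every photosite reads a dim value of 10 out of 255.
dark_frame = np.full((4, 4), 10, dtype=np.uint8)

# The same data, brightened for the viewfinder.
bright_preview = simulate_evf_preview(dark_frame, exposure_gain=8.0)
```

The data shown in `bright_preview` still originates entirely from the sensor readout; the processor only rescales it, which is the crux of the "direct vs. processed" argument above.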

That was my point: the EVF is nothing like WYSIWYG. The EVF is not getting any information directly from the sensor, none whatsoever.

So where is the data coming from? Is the camera just randomly guessing what the lens is pointed at? The next time I point my lens at a flower, is the EVF going to show me a clown instead?!

Or... hold on -- the EVF is getting the image information from -- guess what -- the sensor!

If the EVF suits you, then that is good. My intent was not to denigrate or disparage the EVF, only to counter the oft-cited "advantage" of the EVF being "WYSIWYG."


If you wish to call it a simulation instead of WYSIWYG, then sure, call it a simulation. It still gives the user a better preview of what is captured in the file than an OVF does, and that's the advantage some want in their cameras.

Precisely.  It is a simulation of what-you-see-is-what-you-get.   And when used properly, it can be an extremely accurate simulation.

Me, I want that simulation on my LCD to work better than what Nikon has managed until now, and leave my OVF alone. That way I can pick how I want to shoot.

I think Nikon should go both routes: an improved LCD live view on their OVF cameras -- I do think they should continue making OVF DSLRs --

But also a good EVF on a mirrorless lineup.

havoc315's gear list:
Sony a6300, Sony a7R III, Sony E 50mm F1.8 OSS, Nikon AF-S Nikkor 18-35mm f/3.5-4.5G ED, Sony FE 35mm F2.8, +10 more