So much for the EVF being insufficient for action.....

Started 11 months ago | Discussions
MJW1
Contributing Member • Posts: 853
Re: Trust manufacturer's claim?
In reply to DVT80111, 10 months ago

DVT80111 wrote:

My X-E1 is a dog in low light. I can't even shoot my kids blowing out the candles.

I had that situation the other night. We were in a restaurant and the staff brought out the cake before I was ready. I only had time to grab the camera, turn it on, lift it to my eye and trust it to handle exposure and fast autofocus. My DSLR would have handled it, but it was at home. Still, I ended up with a set of well focused, well exposed shots. Good thing I had my Nikon 1 V1 with me!

 MJW1's gear list:
Nikon D7000 Fujifilm FinePix S1 Pro Fujifilm FinePix S3 Pro Nikon D50 Panasonic Lumix DMC-G1 +15 more
Mike CH
Veteran Member • Posts: 3,543
Re: Why not...
In reply to MJW1, 10 months ago

MJW1 wrote:

Mike CH wrote:

Leonard Migliore wrote:

Mike CH wrote:

YRUNVS wrote:

Were you abused by a SLR as a child?

And thus can't use a camera with a mirror for those all-important selfies...

I can't see how you can take a selfie with an SLR unless you cheat and use live view with a swiveling screen.

... stick a wide angle lens on the camera, set the AF to area, turn the camera around and hit the shutter?

The result can't be much worse than the majority of other selfies.

Alternatively you could just get one of those new-fangled cameras with one of those fancy "self timers" (first introduced by Contax in 1936, I believe) and one of those funky "tripod" thingies.

I am pretty sure that a 'proper' selfie requires handholding the camera. And making a face.

Regards, Mike
--
Wait and see...

 Mike CH's gear list:
Canon EOS 5D Mark III Canon EF 16-35mm f/2.8L II USM Canon EF 100mm f/2.8L Macro IS USM Canon EF 180mm f/3.5L Macro USM Canon EF 70-200mm f/2.8L IS II USM +8 more
unknown member
(unknown member)
also doable
In reply to Mike CH, 10 months ago

Mike CH wrote:


Alternatively you could just get one of those new-fangled cameras with one of those fancy "self timers" (first introduced by Contax in 1936, I believe) and one of those funky "tripod" thingies.

I am pretty sure that a 'proper' selfie requires handholding the camera. And making a face.

Regards, Mike
--
Wait and see...

with a dslr! just use AF and a 35mm lens!

MJW1
Contributing Member • Posts: 853
Re: also doable
In reply to canonagain123, 10 months ago

canonagain123 wrote:

Mike CH wrote:

Alternatively you could just get one of those new-fangled cameras with one of those fancy "self timers" (first introduced by Contax in 1936, I believe) and one of those funky "tripod" thingies.

I am pretty sure that a 'proper' selfie requires handholding the camera. And making a face.

Regards, Mike
--
Wait and see...

with a dslr! just use AF and a 35mm lens!

I must admit, I did think the self-timer and tripod were cheating, but I couldn't resist.

 MJW1's gear list:
Nikon D7000 Fujifilm FinePix S1 Pro Fujifilm FinePix S3 Pro Nikon D50 Panasonic Lumix DMC-G1 +15 more
57LowRider
Senior Member • Posts: 2,698
Re: Trust manufacturer's claim?
In reply to DVT80111, 10 months ago

DVT80111 wrote:

My X-E1 is a dog in low light. I can't even shoot my kids blowing out the candles.

M mode and AF-L help in those circumstances.

 57LowRider's gear list:
Fujifilm X-E1 Fujifilm X-T1 Fujifilm XF 35mm F1.4 R Fujifilm XF 14mm F2.8 R Fujifilm XF 18-55mm F2.8-4 R LM OIS +8 more
enemjii
Senior Member • Posts: 1,602
Re: OVF vs EVF and Bokeh
In reply to enemjii, 10 months ago

enemjii wrote:

Greg A A wrote:

Many people don't want to look at the world they are photographing through a small TV set. Try replacing someone's glasses with an electronic viewing screen. Why doesn't Google just replace the entire glasses with an electronic screen? This is why many people prefer their OVF.

Theoretically, what you see in the mirror is not what is captured on the sensor, because the sensor is not emulating the mirror. They are two different technologies. You may think you like what you see in the mirror, but when it hits the sensor, it is different. A couple of days back, I was composing a bokeh shot on a Canon EOS 100D/SL1; I was excited by what I saw in the mirror, snapped the picture, and when I previewed it, it was completely different.

In other words, I like looking at the mirror, but I would rather the electronics companies quickly come up with an EVF that shows me what the sensor sees, and will capture.

Search and ye shall succeed. I did find a picture of what I was talking about: the bokeh in the OVF vs the EVF. See for yourself and shake your head in disbelief.

 enemjii's gear list:
Sony SLT-A65 Sony a77 II Sony DT 55-300mm F4.5-5.6 SAM Sony DT 35mm F1.8 SAM Sigma 85mm F1.4 EX DG HSM +6 more
enemjii
Senior Member • Posts: 1,602
it takes one-tenth of a second for the brain to process what it sees
In reply to Sante Patate, 10 months ago

Sante Patate wrote:

Ontario Gone wrote:

Well in case anybody else hasn't heard, Fuji just dropped a bomb [...] they also ratcheted down the lag. We are looking at a 0.005 sec lag time, not bad eh?

A bird flying at 60 km/h travels 8 cm in 5 milliseconds. With a 200mm lens on APS-C, giving an AoV of 8 degrees, a bird with a two-metre wingspan fills the frame if it is 11 metres away. Obviously, you want to centre it in the frame. That is easy if you have an OVF, where what you see as you move the camera tracks the bird's motion with no delay. But the 5 millisecond lag time corresponds to 0.4 degrees, one twentieth of the frame, so by the time you see it in the centre, it has actually moved a twentieth of the frame past the centre.

I am sure it is possible for the human brain to learn to compensate for the viewfinder lag, but in my experience that is a very steep learning curve even if the motion is predictable.
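The arithmetic in the quoted example can be sanity-checked in a few lines. This is only a sketch of the post's own round figures (the 11 m framing distance and the 8-degree AoV are the quoted assumptions, not measurements):

```python
import math

# Figures taken from the quoted post (round numbers, not measurements)
speed = 60 * 1000 / 3600        # 60 km/h in m/s -> ~16.7
lag = 0.005                     # claimed EVF lag: 5 ms
travel = speed * lag            # distance the bird covers during the lag
distance = 11.0                 # bird-to-camera distance in metres, per the post
aov = 8.0                       # assumed angle of view in degrees (200mm on APS-C)

lag_angle = math.degrees(travel / distance)  # small-angle approximation
print(f"travel during lag: {travel * 100:.1f} cm")    # ~8.3 cm
print(f"angular error:     {lag_angle:.2f} degrees")  # ~0.43 degrees
print(f"fraction of frame: {lag_angle / aov:.3f}")    # ~1/20 of the frame
```

The 8 cm and "one twentieth of the frame" figures in the quote both reproduce; how much that matters in practice is exactly what the rest of the thread argues about.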

When human eyes see an object, it takes one-tenth of a second for the brain to process that information, said Gerrit Maus, a postdoctoral fellow in psychology at UC Berkeley.

"The fundamental problem is that our brain doesn't work in real-time," Maus said. "The brain actually works rather slow, compared to some electronics or computers that we have today. Information that the brain receives from the eye is already out of date by the time it gets to the visual cortex."

"The brain does not think the object is in the position where the eye tells us it is," Maus told LiveScience. "The object is shifted forward in the direction that it's moving, so we're actually predicting where things are going to be."

This means the brain perceives moving objects to be farther along in their trajectory than what a person actually sees with their eyes, he explained.

 enemjii's gear list:
Sony SLT-A65 Sony a77 II Sony DT 55-300mm F4.5-5.6 SAM Sony DT 35mm F1.8 SAM Sigma 85mm F1.4 EX DG HSM +6 more
TrojMacReady
Senior Member • Posts: 8,534
You forgot something. Or should I say 3 things.
In reply to Sante Patate, 10 months ago

Sante Patate wrote:

Ontario Gone wrote:

Well in case anybody else hasn't heard, Fuji just dropped a bomb [...] they also ratcheted down the lag. We are looking at a 0.005 sec lag time, not bad eh?

A bird flying at 60 km/h travels 8 cm in 5 milliseconds. With a 200mm lens on APS-C, giving an AoV of 8 degrees, a bird with a two-metre wingspan fills the frame if it is 11 metres away. Obviously, you want to centre it in the frame. That is easy if you have an OVF, where what you see as you move the camera tracks the bird's motion with no delay. But the 5 millisecond lag time corresponds to 0.4 degrees, one twentieth of the frame, so by the time you see it in the centre, it has actually moved a twentieth of the frame past the centre.

I am sure it is possible for the human brain to learn to compensate for the viewfinder lag, but in my experience that is a very steep learning curve even if the motion is predictable.

You already have to compensate for far more than 5 milliseconds. Even prefocused with the fastest DSLRs, the shutter lag is already 40-50 milliseconds, meaning that after you decide to hit the shutter button, the bird keeps moving for another 40-50 milliseconds. Entering AF into that equation adds another 90-250 milliseconds. And let's not overlook the largest delay of all: human reaction time, usually the longest in the chain.

That puts a whopping 5-20 milliseconds into perspective.
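Summing the rough figures above makes the point concrete. This sketch uses the midpoints of the quoted ranges plus an assumed ~200 ms visual reaction time, so the numbers are illustrative only:

```python
# Rough delay budget in milliseconds, sketched from the post above.
# The shutter/AF figures are the post's own estimates; the 200 ms
# reaction time is an assumed typical value, not from the thread.
budget = {
    "EVF lag (claimed)":       5,
    "shutter lag (fast DSLR)": 45,   # midpoint of "40-50 ms"
    "AF acquisition":          170,  # midpoint of "90-250 ms"
    "human reaction time":     200,  # assumed typical visual reaction
}

total = sum(budget.values())
evf_share = budget["EVF lag (claimed)"] / total
print(f"total chain: {total} ms; EVF share: {evf_share:.1%}")
```

On these assumptions the EVF contributes only about one percent of the total shot-to-shot delay, which is the "perspective" the post is pointing at.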

Erik Magnuson
Forum Pro • Posts: 12,103
Not the mirror but the focus screen
In reply to enemjii, 10 months ago

enemjii wrote:

I was composing a bokeh shot on a Canon EOS 100D/SL1, and I was excited with what I saw in the mirror, snapped the picture, and when I previewed it, it was completely different.

It's not the mirror that's the issue, but the "ground glass" viewing screen. Modern laser-cut screens are optimized for brightness; for a more accurate preview of DOF or bokeh you want a fine ground glass.

-- hide signature --

Erik

 Erik Magnuson's gear list:
Canon EOS 5D Mark II Canon EOS 450D Sigma SD10 Sony Alpha NEX-5 Nikon D3200 +28 more
enemjii
Senior Member • Posts: 1,602
Re: 5 milliseconds is still too much. - BS
In reply to Sante Patate, 10 months ago

Sante Patate wrote:

Ontario Gone wrote:

Well in case anybody else hasn't heard, Fuji just dropped a bomb [...] they also ratcheted down the lag. We are looking at a 0.005 sec lag time, not bad eh?

A bird flying at 60 km/h travels 8 cm in 5 milliseconds. With a 200mm lens on APS-C, giving an AoV of 8 degrees, a bird with a two-metre wingspan fills the frame if it is 11 metres away. Obviously, you want to centre it in the frame. That is easy if you have an OVF, where what you see as you move the camera tracks the bird's motion with no delay. But the 5 millisecond lag time corresponds to 0.4 degrees, one twentieth of the frame, so by the time you see it in the centre, it has actually moved a twentieth of the frame past the centre.

I am sure it is possible for the human brain to learn to compensate for the viewfinder lag, but in my experience that is a very steep learning curve even if the motion is predictable.

So I took my Sony A65, put one eye to the EVF, kept the other eye open. Guess what?

I had perfect stereo vision. The EVF did not contribute any perceptible lag or make me seasick, as would be the case if my two eyes were seeing the action at noticeably different times.

So I call BS on your assertions.

 enemjii's gear list:
Sony SLT-A65 Sony a77 II Sony DT 55-300mm F4.5-5.6 SAM Sony DT 35mm F1.8 SAM Sigma 85mm F1.4 EX DG HSM +6 more
Erik Magnuson
Forum Pro • Posts: 12,103
Exactly why you don't want an EVF adding extra lag
In reply to enemjii, 10 months ago

enemjii wrote:

When human eyes see an object, it takes one-tenth of a second for the brain to process that information, said Gerrit Maus, a postdoctoral fellow in psychology at UC Berkeley.

And this is exactly why you don't want an EVF adding extra lag. Your brain has 20-70 years of experience estimating motion based on direct observation.  Add extra lag (or even worse the stutter of static images that happens with high FPS shooting) and you are fighting all of that experience.

-- hide signature --

Erik

 Erik Magnuson's gear list:
Canon EOS 5D Mark II Canon EOS 450D Sigma SD10 Sony Alpha NEX-5 Nikon D3200 +28 more
enemjii
Senior Member • Posts: 1,602
Re: Not the mirror but the focus screen
In reply to Erik Magnuson, 10 months ago

Erik Magnuson wrote:

enemjii wrote:

I was composing a bokeh shot on a Canon EOS 100D/SL1, and I was excited with what I saw in the mirror, snapped the picture, and when I previewed it, it was completely different.

It's not the mirror that's the issue, but the "ground glass" viewing screen. Modern laser-cut screens are optimized for brightness; for a more accurate preview of DOF or bokeh you want a fine ground glass.

-- hide signature --

Erik

The simpler answer is to have a better EVF. WYSIWYG.

 enemjii's gear list:
Sony SLT-A65 Sony a77 II Sony DT 55-300mm F4.5-5.6 SAM Sony DT 35mm F1.8 SAM Sigma 85mm F1.4 EX DG HSM +6 more
unknown member
(unknown member)
Evolution complication: double compensation, double complication
In reply to TrojMacReady, 10 months ago

TrojMacReady wrote:

You already have to compensate for far more than 5 milliseconds. Even prefocused with the fastest DSLR's, the shutterlag is already 40-50 milliseconds. Meaning that by the time you decide to hit the shutterbutton, the bird still moves for 40-50 milliseconds. Entering AF into that equation, adds another 90-250 milliseconds. And let's not overlook the largest delay, the human reaction time, usually being the longest.

That puts a whopping 5-20 milliseconds into perspective.

People have evolved to negate their internal lag when observing moving objects. This accommodation comes from the learned discrepancy between sight (what is seen) and reality (what is). The brain does not easily distinguish between movement seen in real time, through an OVF or with the bare eyes, and movement seen on an EVF with a lag, no matter how small that lag is. All other factors being equal (once you use a camera for fast-action shooting long enough, your brain also learns to accommodate the specific lag times of that camera, but it's a relatively slow process), adding another layer of lag complicates things. Now let's see why it complicates things very much:

Learning to add to your prediction due to shutter lag is a learned response directly related to your own reaction time. It uses the tactile sense and relation to past experience in the same way you learn to predict where a baseball will be in order to catch it. Hand predicting baseball, finger predicting shutter, muscle memory. One-layer visual processing required.

Having to learn that the image you're seeing is not the image your brain is used to analyzing and basing its calculations on is an order of magnitude more complicated. Either your brain has to learn two different lag-time averages for the same action with the same muscle memory, or do parallel visual processing, where it has to process visual data it's not used to and factor the extra lag into the visual calculations, retraining the muscle memory to predict a higher lag, effectively erasing the previous acclimatization to any other camera you've used, and creating new processing patterns and a new order of processing from the visual cortex to the movement centres of the brain. Either way the brain works it out, it is going to be a whole lot more complicated than simply getting used to a new camera, because the fundamentals are different: your brain can no longer process visual information as it has been used to from the get-go (since you were born). It has to tread unknown territory and find a new way to cope with the known-but-not-perceived visual lag on top of the already-compensated-for, known visual reactionary lag. Compensation on top of compensation, as a new exercise that cannot be built on old experience. Two-layer visual processing required.

unknown member
(unknown member)
Exactly
In reply to Erik Magnuson, 10 months ago

Erik Magnuson wrote:

enemjii wrote:

When human eyes see an object, it takes one-tenth of a second for the brain to process that information, said Gerrit Maus, a postdoctoral fellow in psychology at UC Berkeley.

And this is exactly why you don't want an EVF adding extra lag. Your brain has 20-70 years of experience estimating motion based on direct observation. Add extra lag (or even worse the stutter of static images that happens with high FPS shooting) and you are fighting all of that experience.

-- hide signature --

Erik

I just stated that in my previous post! I used a few more words.

Erik Magnuson
Forum Pro • Posts: 12,103
Different part of the cycle
In reply to TrojMacReady, 10 months ago

TrojMacReady wrote:

You already have to compensate for far more than 5 milliseconds.

Read the post above about how the brain estimates motion. Extra delay when viewing throws this off.

Even prefocused with the fastest DSLR's, the shutterlag is already 40-50 milliseconds. Meaning that by the time you decide to hit the shutterbutton, the bird still moves for 40-50 milliseconds.

Exactly - but humans have been tracking objects and predicting their future position from direct observation for a very long time; this is a problem we are well equipped to solve.

Entering AF into that equation, adds another 90-250 milliseconds. And let's not overlook the largest delay, the human reaction time, usually being the longest.

I take it you don't do this type of shooting very often. It's not point and shoot: you acquire and then follow the target. Like your brain, a good AF system will begin to do predictive tracking. You can then start shooting single or burst as you continue to track the target in the viewfinder. Yes, there is blackout, but again that's a type of condition we are very well adapted to. (You don't lose the ability to track a moving object if it's behind a picket fence, do you?)

-- hide signature --

Erik

 Erik Magnuson's gear list:
Canon EOS 5D Mark II Canon EOS 450D Sigma SD10 Sony Alpha NEX-5 Nikon D3200 +28 more
enemjii
Senior Member • Posts: 1,602
Re: Exactly
In reply to canonagain123, 10 months ago

canonagain123 wrote:

Erik Magnuson wrote:

enemjii wrote:

When human eyes see an object, it takes one-tenth of a second for the brain to process that information, said Gerrit Maus, a postdoctoral fellow in psychology at UC Berkeley.

And this is exactly why you don't want an EVF adding extra lag. Your brain has 20-70 years of experience estimating motion based on direct observation. Add extra lag (or even worse the stutter of static images that happens with high FPS shooting) and you are fighting all of that experience.

-- hide signature --

Erik

I just stated that in my previous post! I used a few more words.

5 milliseconds of lag added on top of 1/10 sec of processing time? I don't know what your eyes are trying to capture, folks.

 enemjii's gear list:
Sony SLT-A65 Sony a77 II Sony DT 55-300mm F4.5-5.6 SAM Sony DT 35mm F1.8 SAM Sigma 85mm F1.4 EX DG HSM +6 more
David Hull
Veteran Member • Posts: 5,555
The mirrorless (or mindless) troll is back -nt
In reply to Ontario Gone, 10 months ago

Ontario Gone wrote:

Well, in case anybody else hasn't heard, Fuji just dropped a bomb, and there will likely be a lot of broken glass as well as broken hearts. Not only is it a 0.77x magnification, the biggest of any digital camera, they also ratcheted down the lag. We are looking at a 0.005 sec lag time, not bad eh? The goodies are listed HERE. I guess now it's just a matter of getting that on-sensor predictive tracking a bit faster and then... well, we know what then. Happy shooting

-- hide signature --

"Run to the light, Carol Anne. Run as fast as you can!"

-- hide signature --
 David Hull's gear list:
Canon EOS 50D Canon EOS 5D Mark III
jtan163
Senior Member • Posts: 1,286
Re: No mention of refresh rate?
In reply to coudet, 10 months ago

coudet wrote:

I do wonder why they are hiding it.

I'm not sure they are hiding it; I'm pretty sure I've seen it mentioned somewhere as 55 Hz.

 jtan163's gear list:
Olympus C-740 UZ Nikon D7000 Olympus OM-D E-M5 Nikon D750 Nikon AF-S DX Nikkor 16-85mm f/3.5-5.6G ED VR +12 more
TrojMacReady
Senior Member • Posts: 8,534
Re: Evolution complication: double compensation, double complication
In reply to canonagain123, 10 months ago

Overly simplified, because both AF lag and human lag vary by a much larger amount than even the total (and much more consistent) EVF lag.

Having shot swallows in flight with a fast OLED EVF, I can also safely say that it didn't take me much time to compensate for the lag, compared to my DSLR.

TrojMacReady
Senior Member • Posts: 8,534
Re: Different part of the cycle
In reply to Erik Magnuson, 10 months ago

Erik Magnuson wrote:

TrojMacReady wrote:

You already have to compensate for far more than 5 milliseconds.

Read the post above about how the brain estimates motion. Extra delay when viewing throw this off.

No, as explained above: the AF and human delays vary, and their average variance is already a multiple of the rather consistent EVF delay. I have found compensating for EVF delay no harder to adjust to than the bare shutter delay, let alone AF. YMMV.

Even prefocused with the fastest DSLRs, the shutter lag is already 40-50 milliseconds, meaning that after you decide to hit the shutter button, the bird keeps moving for another 40-50 milliseconds.

Exactly - but as humans have been tracking objects and predicting their future position based on direct observation for a long time; this is a problem we are well equipped to solve.

We are equally equipped to compensate for an extra few milliseconds. The bird might have changed direction, but that could equally have been the case during the shutter-lag cycle. The difference in shutter lag between different cameras is already larger than the timeframe discussed here.

Entering AF into that equation adds another 90-250 milliseconds. And let's not overlook the largest delay of all: human reaction time, usually the longest in the chain.

I take it you don't do this type of shooting very often. It's not point and shoot: you acquire and then follow the target. Like your brain, a good AF system will begin to do predictive tracking. You can then start shooting single or burst as you continue to track the target in the viewfinder. Yes, there is blackout, but again that's a type of condition we are very well adapted to. (You don't lose the ability to track a moving object if it's behind a picket fence, do you?)

You fail to realize that every sequence starts with AF delay, and even during a session of predictive tracking there is delay which can cause predictive failures of the AF system. The EVF delay is a fraction of that. The only difference is that the EVF can already represent visually the loss of connection with the subject when our own compensation for the EVF delay fails, whereas any disconnection during shutter delay, AF delay and reaction time will only show up in the pictures. That means much more delayed feedback, or no direct feedback at all, which in turn hinders improvement of compensation. With the EVF delay continually represented, your brain is offered continuous feedback. Yes, it's great if you are able to keep that bird framed in your OVF without failure for a longer period, but in the end it's about the pictures, and then you can't ignore the rest of the delay cycle, for the reasons mentioned.

I've seen, for example, two users in their 70s and 80s successfully track and frame remote-controlled jets and helicopters doing speeds in excess of 100 mph and pulling unpredictable stunts, all with an EVF that has about 10 ms delay. I'd say most of us shouldn't have many issues learning to adapt then, in an age where millions of gamers are successfully dealing with much greater input lag, display lag and internet lag in fast-paced 3D shooters (the latter two often have effects similar to EVF lag, though internet lag is usually less consistent). Proving this isn't exactly magic.
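As a quick illustration of the magnitudes involved (the 100 mph speed and the 10 ms EVF figure are the post's own anecdotal numbers; the 45 ms shutter-lag comparison reuses the DSLR estimate from earlier in the thread):

```python
# Distance a subject travels during a given viewfinder or shutter lag.
def travel_during_lag(speed_mph: float, lag_ms: float) -> float:
    """Return metres covered during the lag."""
    speed_ms = speed_mph * 1609.344 / 3600  # mph -> m/s
    return speed_ms * lag_ms / 1000

print(f"{travel_during_lag(100, 10):.2f} m")  # ~0.45 m during a 10 ms EVF lag
print(f"{travel_during_lag(100, 45):.2f} m")  # ~2 m during ~45 ms of shutter lag
```

The EVF-lag displacement is a few times smaller than the displacement during shutter lag alone, which is the comparison the post is making.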
