EVF "lag" tests REQUIRED in (ML) REVIEWS ???

It's not easy to measure EVF lag.
Of course it is. Just look at the camera recording the screen. This is so basic. LOOK at both screens at the same time.
Right...and you're capable of determining millisecond level timing by eye.
So you seem to think the subject is going to disappear from the EVF in that split second ;-)
Yeah, because that's what happens in the real world.

Lag

Lag free

Everything else is the same (same camera, same lens, same focal length,
I'm confused ...

Isn't the point that the video with lag was shot through an EVF, while the lag-free sequence of photos was shot through an OVF?
Yeah,
So how can it be "same camera, same lens"?
Because I have a dSLR, which has an OVF and the ability to attach an EVF.
Please explain. What is the EVF you attach to your 7D?
Hoodman Custom Finder Kit.
And you say this is directly comparable to the EVFs built into current model mirrorless interchangeable lens cameras? Really?

I may be wrong, but wouldn't you be adding lag from your LCD in live view to lag from the Hoodman? No wonder it's unusable.
 
And you say this is directly comparable to the EVFs built into current model mirrorless interchangeable lens cameras?
It's a little better, since it's a 3" display.
Really?

I may be wrong, but wouldn't you be adding lag from your LCD in live view to lag from the Hoodman? No wonder it's unusable.
The Hoodman has no lag. It's just optics.
 
https://www.zoul.cz/shutter-lag/ My finger-to-image-capture time was 0.17 on my em12. By the way, there is virtually no EVF lag. Let's see how good you DSLR users are :-)

Don
"Please note that this measures the shutter lag in the most general sense, including the viewfinder and (most importantly) you".....
Sorry, I forgot you own sloooow cameras. :-)

Don
23daa8ba43f24c7ab8927a7867d72807.jpg

:D
I have one at 1.90 😀 I was a bit trigger-happy. Tomorrow I'm going to shoot a 60-frame video with my em52, with my em12 shooting the screen, and see what the actual EVF lag is.
I assume you realize the only way to derive EVF lag from this test is to compare the averages of two series: one looking only at the monitor, and another looking at the EVF/LCD.
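The method described here, differencing the averages of two reaction-time series so the human component cancels out, can be sketched in a few lines of Python. The sample numbers below are invented purely for illustration, not anyone's measurements:

```python
# Sketch of the two-series comparison: estimate EVF/LCD lag as the difference
# between the mean of reaction-time readings taken looking directly at the
# monitor and readings taken looking through the EVF/LCD.
# All data below is hypothetical, for illustration only.

def mean(xs):
    return sum(xs) / len(xs)

# Reaction-time readings in seconds (made-up sample data).
direct_monitor = [0.17, 0.19, 0.18, 0.20, 0.16]   # eye on the monitor
through_evf    = [0.19, 0.21, 0.20, 0.22, 0.18]   # eye on the EVF/LCD

# The human reaction time is present in both series, so it cancels in the
# difference of the means, leaving an estimate of the extra display latency.
lag_estimate = mean(through_evf) - mean(direct_monitor)
print(f"estimated EVF/LCD lag: {lag_estimate * 1000:.0f} ms")
```

With enough samples in each series, the noise of individual reactions averages out and only the systematic display delay remains in the difference.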
Yes, I'm quite aware how to do the test. What do you think it will be?
I expect it will be about how fast you are :-) I wish I didn't have to work so much; I'm missing out on all the fun and experiments. A mate shoots birds with a v15 pf300; he loves it and his images are first class. He prefers it over his d500 with a 300 vr2 2.8.

Don
I just did a test run where the 1st camera watched the monitor, and a 2nd camera behind the first watched both the monitor and the LCD on the 1st camera.

1st camera = Sony A6300

2nd camera = Sony A7iii

I did not try various shutter speeds and ISO values. I shot the monitor at 1/100 sec and f/9 with a 35mm lens at ISO 2500 on the 1st camera. But I did try both the regular shutter and the silent electronic shutter, and there was no difference.

(times in seconds; the monitor display updates in 10 ms steps)

1st Camera   2nd Camera
1.54         1.54
0.85         0.85
1.24         1.24
0.66         0.66
1.95         1.95
1.01         1.01
1.44         1.44

With 10 ms resolution in this test, I found no evidence of any shutter lag.
 
Every camera has shutter lag. EVF lag is something completely different: it can't keep up with quick movement, panning the camera especially.
 
Every camera has shutter lag. EVF lag is something completely different: it can't keep up with quick movement, panning the camera especially.
Sorry, I called it by the wrong name. This was LCD lag, since I was looking at the LCD and the monitor. LCD lag (or EVF lag) does not need panning to show its fault; the changing numbers on the monitor accomplish the same task, changing every 10 ms. If the actual lag were half of that 10 ms increment, I would expect about 50% of the readings to vary by one digit. But I didn't see that. I did see one in the process of changing.

Here is one example of what I was capturing. I ended up focusing between the monitor and the LCD screen since I couldn't get both in focus at the same time.

d351ab7d400146cca37faa484404cb80.jpg

Considering the consistency of this test, it is safe to say that if LCD lag were exactly 10 ms, the two numbers would usually differ by 10 ms.

Let's suppose it was 5 ms. Consider this (the left number is the internal time on the monitor; the monitor displays it to two decimal places, changing only every 10 ms, but internally there is another digit we don't see):

0.000 --> monitor displays 0.00
0.001
0.002
0.003
0.004
0.005 --> 0.00 will be displayed on camera 1
0.006
0.007
0.008
0.009
0.010 --> monitor changes to 0.01
0.011
0.012
0.013
0.014
0.015 --> 0.01 will be displayed on camera 1

If we had this, I would expect the two numbers to differ by 1 digit 50% of the time, since I can't control where in the cycle camera 2 takes its image.
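This reasoning can be checked with a quick Monte-Carlo sketch (my own illustration, not the poster's code): a timer display that truncates to 10 ms steps, photographed alongside a second display showing the same timer delayed by `lag`:

```python
# Monte-Carlo check of the quantized-display argument: if the LCD shows the
# monitor's timer delayed by half the 10 ms display step, the two displayed
# values should disagree on about 50% of random capture instants.
import random

def displayed(t_ms):
    """Value shown on a display that updates (truncates) every 10 ms."""
    return int(t_ms // 10)

def mismatch_rate(lag_ms, trials=100_000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        t = rng.uniform(0, 1000)              # random capture instant, in ms
        if displayed(t) != displayed(t - lag_ms):
            hits += 1
    return hits / trials

print(mismatch_rate(5))   # lag = half the step: ~50% of readings differ
print(mismatch_rate(0))   # no lag: the two displays always agree
```

So the absence of differing digits in the photos is consistent with a lag much smaller than 5 ms, exactly as the post argues.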
 
Human response lag is in addition to the machine response lag. I described machine response lag.
Human response lag and EVF lag are apples and oranges in some contexts.

This is not just about "WHEN", but "WHERE"; as in, where is the subject?

With EVF delay, you lose track of the subject more often. Pre-burst will not capture anything useful from 0.3 seconds ago if the subject was not in front of the lens 0.3 seconds ago, because you were frantically searching for it, or hadn't even noticed it was missing!

Mechanical delay, human response, and EVF delay only add in like units if the subject being in the frame is a given fact.
your figure is way off.
What figure? It was clearly a number pulled out of a hat. I didn't measure anything.

The point is that pre-burst doesn't help if the subject is not in the frame at the time that the saved frames were exposed, and EVF delay can make it more likely that you do lose the subject.
What, 10-50 milliseconds?

Don
 
 
The Hoodman has no lag. It's just optics.
So it's not an EVF.
The LCD is an EVF. The Hoodman just converts it to eye level.
What was the laggy EVF you keep talking about across several threads?
All the EVFs I've ever used or tried.
 
 
All the EVFs I've ever used or tried.
Your entire argument is disingenuous unless you can show that the LCD on a DSLR in live view performs the same as the EVF of a current mid-to-high-level MILC. At the least you have been seeking to misrepresent the situation.
Don't waste any more of your time. Everything is about what Lee has done.
 
Lee ... I am agreeing with you so this is not an argument ...

But I could ask how new/modern an attachable EVF is ... and would it compare favorably with the latest technology (especially the A9) ???

UPDATE ... have you been arguing "LV" all this time ??? Not at all the same.
 
My findings as well, around 10/30 ms. I think all the OVF users are having kittens now.

Don
 
Yes, the whole time.

I think basically everyone across multiple threads has been under the impression he was talking about an EVF on a MILC. No wonder he has been so evasive (but then he tends to be at the best of times).
 
I am a very strong ML/EVF supporter, but I have to admit EVF "lag" is its single biggest problem, and much different from (worse than) the old SLR/dSLR "blackout".

With the old SLR/dSLR "blackout", the (fast-moving) subject being panned can still be in the same place after the blackout (with only basic panning technique).

But the inherent delay between the light striking the (imaging) sensor and its display on the eye-EVF/rear-LCD means the camera is always pointed BEHIND a fast-moving, panned subject.

This is not a major issue until after the (first) shot, when the subject reappears "ahead" after the blackout, often completely OUT OF THE FRAME (and the camera then needs to be "jumped" ahead to get back on the subject).

This is problematic for even the best panner (shooting a continuous series/sequence), but much worse for an erratically moving subject, and has been discussed recently in 150+ threads.

As a staunch supporter of ML/EVF, I also have to admit it remains a major problem I can only hope will be reduced in the future.

I assume that with "no" blackout (unless done by tricks), the A9 may be better in this regard, but I have no personal experience with it yet.

But I also must assume there is still an (electronic) delay between the light striking the sensor and its being "displayed" (on either the eye-EVF or rear-LCD).

I have seen tests where a (separate) camera records a (single) shot of BOTH a timer and the LCD of a camera pointed at that timer. The timer is of course always slightly ahead of the (delayed) image being displayed.

Another question I have is whether the actual (recorded) image is of the exact time/instant the shutter button is pushed ... OR ... an EARLIER time (as captured before being delayed and eventually displayed) ???
No. Never exactly at the instant the shutter is pushed. There are shutter and possibly autofocus lags. Also, if you use a viewfinder or LCD to decide when to press the shutter, the target may have moved out of the frame by the time you decide to press it, because of viewfinder or LCD lag.

Imaging Resource reviews of cameras give precisely measured (to the nearest millisecond) figures for shutter and any autofocus lag. EVF lag is limited by the sensor readout rate, typically around 3 milliseconds (300 frames per second) for recent cameras, in addition to the refresh rate of the EVF or LCD. I am not aware of any reviews that measure EVF or LCD lag precisely.

Recording an earlier time (before the shutter is pressed) happens in pre-burst shooting. In a typical pre-burst mode, the camera saves image frames for 2 seconds: 1 second before and 1 second after the shutter is pressed. From those you can select the most appropriate.
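As a rough illustration of how these components stack up, here is a back-of-the-envelope latency budget in Python. The readout, reaction, and shutter-lag figures are the ones quoted in this thread; the 8 ms EVF refresh (roughly one frame at 120 Hz) is my own assumption, and none of these are measurements:

```python
# Back-of-the-envelope budget: delay between a real-world event and the
# recorded frame, using figures quoted in the thread plus one assumption.

latency_ms = {
    "sensor_readout": 3,      # ~300 fps readout on recent cameras (quoted)
    "evf_refresh": 8,         # one refresh at ~120 Hz (assumed figure)
    "human_reaction": 300,    # typical anticipatory reaction time (quoted)
    "shutter_lag": 65,        # midpoint of the quoted 50-80 ms range
}

total = sum(latency_ms.values())
print(f"event-to-capture: ~{total} ms")

# Pre-burst sidesteps the whole chain: a 1 s pre-trigger buffer keeps frames
# from before the shutter press, covering the entire budget above.
pre_buffer_ms = 1000
print("covered by pre-burst:", pre_buffer_ms >= total)
```

The point the sketch makes is the same one the post makes: the display-side delays are tiny next to the human term, and pre-burst buffering absorbs even that.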
I think these should be REQUIRED TESTS in all (ML/EVF) REVIEWS, (at different EVF refresh rates).
Get an OMD EM1X and not only have no lag but actually shoot with NEGATIVE lag, like time travel photography.

I think the most lag happens in very high-speed scenarios or, in many cases, in very low light where an OVF would already be too dim. I sometimes curse the "lag", but it's actually me shooting at ISO 6400 in near darkness.
 
Your entire argument is disingenuous unless you can show that the LCD on a DSLR in live view performs the same as the EVF of a current mid-to-high-level MILC. At the least you have been seeking to misrepresent the situation.
What I've been demonstrating is that lag reduces tracking performance. All EVFs have lag (they have to), as does the LCD on my SLR. This test used a relatively easy-to-track subject, and yet a bit of lag greatly reduced tracking performance.

Unless you can find a current mirrorless camera to which you can add a TTL OVF, this is the only way to do this test, since a DSLR can act as a mirrorless camera by flipping up the mirror.

This is as honest a test as can be done. Everything was the same between the two tests except for viewfinder lag - same photographer, camera, lens, focal length, subject speed, subject direction and these tests were only minutes apart. I'd challenge you or anyone to do a better test of the tracking performance reduction caused by viewfinder lag.
 
Using a 600mm lens, don't people use red-dot sights? :-) Isn't that what they were invented for?

Don

Do you think that is more skillful than a fighter pilot flying at night in combat using instruments? With lag ;-)

--
Olympus EM5mk2 ,EM1mk2
http://www.dpreview.com/galleries/9412035244
past toys. k100d, k10d,k7,fz5,fz150,500uz,canon G9, Olympus xz1 em5mk1
 
Using a 600mm lens, don't people use red-dot sights? :-)
Some people do, to work around EVF lag. No one would do that if lag weren't a problem.

But such a device doesn't show you framing, which is a huge problem if you're using a zoom lens and zooming and shooting at the same time.
Isn't that what they were invented for?
As far as I know they were used before that as finders for telescopes. I have a fancy one on mine.
 
As a staunch supporter of ML/EVF, I also have to admit it remains a major problem I can only hope will be reduced in the future.
Already addressed a long time ago.
I assume that with "no" blackout (unless done by tricks), the A9 may be better in this regard, but I have no personal experience with it yet.
And what have you personally experienced?

Many of us mirrorless users experienced problem-free EVF operation years ago, even in the most erratic situations, with better results than any DSLR offers.

Years ago there was already no way back to DSLRs, which have longer blackouts anyway and no way to compensate for the real lag: the reaction time of the user.

Already 7 years ago, EVF lag was negligible compared to the reaction-time lag photographers have:
  1. The EVF vs OVF difference in lag was about 16 ms vs 0 ms.
  2. The reaction time of an average photographer who is expecting something to happen is about 250-350 ms.
  3. The shutter release lag is on average 50-80 ms depending on the mode.
So the argument that an EVF always shows the action too late to react to it is plainly false, and has been for almost a decade.
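Taking the three figures in the list above at face value, a short calculation shows how small the EVF's share of the total delay chain is (a sketch using the post's own numbers, with midpoints of the quoted ranges):

```python
# Compare the post's three delay figures directly: the EVF's contribution is
# a small fraction of the total chain. Midpoints of quoted ranges are used.
evf_lag_ms = 16                      # quoted EVF vs OVF difference
reaction_ms = (250 + 350) / 2        # average anticipating photographer
shutter_lag_ms = (50 + 80) / 2       # typical shutter release lag

total_ms = evf_lag_ms + reaction_ms + shutter_lag_ms
share = evf_lag_ms / total_ms
print(f"EVF lag is {share:.1%} of the total delay chain")
```

By these numbers the EVF accounts for only a few percent of the end-to-end delay, which is the post's point.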

The only thing that makes or breaks the chance of the photo has been the reaction time of the photographer AND knowledge of the subject/situation.

For example, an experienced sports shooter who has followed a specific team for years knows its players and their usual strategies and tactics. A dedicated sports fan knows the tactics used by the players in different moments; they know what comes next once the cue appears, and they know when a player is about to make the move they want to photograph.

A seasoned wildlife amateur knows the animals and their behavior. With dedication, the person learns to identify specific individuals (specific wolves, bears, foxes, elephants, etc.) if they frequent the area. The person learns their habits, the terrain, and the usual locations and times. It becomes easier to find the animals, and over time many of them even learn to accept such people around them.

So what is the real limitation on getting the shot? Timing. Reaction time is the main limiting factor: a bird taking off or catching prey is the moment of action many want to capture, and the question is the pose, the attitude, the moment.

There are those who will manage it with a single frame if the action is well predictable, for example a football player taking a penalty kick, an American football player throwing the ball, or a bird preparing to dive on prey; then it is just a question of timing the shot to when the foot moves or the bird dives.

So how have photographers increased their chance of getting the moment? High-speed sequential/burst/motor-drive shooting. In other words, a pure-luck method, but one that statistically improves on the hit-or-miss odds of a single shot.

That is why, for a long time, a camera's FPS capability was critical: you had a chance to get the single great frame, or 2-3 acceptable ones, from a series of 5-7, depending on how many frames per second the camera could capture.

On film bodies you had at most 5 frames per second since 1972 (Olympus was way ahead of the others: Minolta, Pentax, Canon, Nikon, etc.), limited by the film length and your focusing speed and accuracy.

For a very long time on digital cameras it was around 4-5 frames per second in full frame, and higher if you cropped: the Nikon D2x had 5 FPS in full frame and 8 FPS cropped, and the Canon 1D II had 8.5 frames per second in 4 Mpix mode.

Then we started to see higher rates: 5.5, 6.5, and then 7-8 frames per second regularly on lower-end cameras as well, usually with AF disabled.

Not so long ago we reached laughable numbers: 10 FPS!

And then you had people saying that they don't need such ridiculous rates, that 3-4 or 4-5 FPS is enough, as otherwise you fill the buffer or have to go through a huge number of wasted frames.

And what do we have now?
  • 18-20 FPS with AF, or 60 FPS without.
  • Buffers that are effectively infinite, or hold 50-80 frames.
We have a Pro-Capture mode in professional high-end models that lets you capture 25 frames from before you fully press the shutter button! And you get that 60 FPS with focus locked, or 18 FPS with continuous focus.

And what is the point of all this?

Features like Pro-Capture mode eliminate the reaction lag of the photographer!

A DSLR shooter today has at most 14 FPS with AF locked; mirrorless offers 60 FPS with AF locked.

A DSLR shooter has 12 FPS with AF; mirrorless has 18/20 FPS with AF.

A moment is about to come. You half-press the shutter button and Pro-Capture starts buffering 25 frames at the 18 FPS rate. When you see the moment start, or after it has already happened, you fully press the shutter button: the previous 25 frames are written to the card, and the buffer keeps filling with frames until you release the button once the action is over.

You are far less required to predict the moment. A sudden move or an unpredicted action that happens while you have the shutter button half-pressed is very likely captured at the 18/60 FPS maximum rate (nothing stops you from using even lower rates).

Your 250-350 ms reaction-time limitation has just been eliminated!
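The Pro-Capture arithmetic above is easy to verify. A sketch using the post's own numbers (25 pre-buffered frames at 18 FPS; `reaction_s` is the slow end of the reaction-time range quoted earlier):

```python
# Pro-Capture timing sketch: how far back does the pre-trigger buffer reach,
# and does it cover a typical human reaction time? Figures are the post's.
fps = 18
pre_frames = 25

pre_roll_s = pre_frames / fps      # buffer depth in seconds before full press
reaction_s = 0.350                 # slow end of the quoted 250-350 ms range

print(f"pre-roll window: {pre_roll_s:.2f} s")
# The buffer reaches further back than even a slow reaction time, so the
# frame of the actual moment is already buffered when you fully press.
print("reaction time covered:", pre_roll_s >= reaction_s)
```

At 18 FPS, 25 frames is roughly 1.4 seconds of pre-roll, several times the quoted reaction time, which is why the post says the reaction-time limitation is eliminated.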

And you will save a lot of time going through frames where nothing happened, as you can simply release the half-pressed shutter button if nothing occurs.

So who is still whining about a 5-6 ms EVF lag? Do people even acknowledge that it is 5-6 milliseconds? Even a 32/16 ms delay is negligible compared to any user reaction time. We are talking about speeds no photographer actually has problems with. Why? Because not even gamers, who likely have far superior reaction times and hand-eye coordination than any photographer, have problems with such lag or even longer!



Camera users whining about 16 or even 32 milliseconds of lag is just pathetic.

The unwillingness to upgrade to far more capable gear is just stubbornness: staying with old DSLR technology that is holding them back, or comparing things to obsolete models.

10 years ago I would personally have agreed, but things change. Technology evolves. What you knew 5 or even 3 years ago no longer matters. It is old information, and claims based on old information are invalid.

It doesn't matter whether the argument is about a camera from 1972 or 2018: when it is old information, it is called history. We can learn from it, but we shouldn't be stuck in it and believe that nothing has changed.

Arguing about EVF lag is like arguing whether 4.5 FPS is so much worse than 5 FPS.
 
Right...and you're capable of determining millisecond level timing by eye.
People are incapable of determining millisecond-level timing by eye, yet people have a problem with millisecond-level lag?

Sure.... Sounds logical...
 
