But for all the time you spend here arguing that you have the best camera and the biggest yacht, and that DSLRs can't take pictures... there is no way you have any.
It was Lee Jay claiming mirrorless cameras can't take photos of moving objects because of EVF lag.
And taking photos of a timer on a computer monitor sure put that to rest.....
Regardless, your comment has absolutely nothing to do with what I said.
Nowhere in this thread did the OP claim DSLRs can't take pictures. That is just something you made up.
This thread is about Lee Jay's claim that mirrorless cameras can't take photos of moving objects because of EVF lag...
The OP never mentioned Lee Jay.
That is just something you made up.
Lee Jay's claim sparked a number of threads, including this one. If you follow the sequence, it is clear, even though he wasn't mentioned by name in the OP.
Lee Jay's claim was also based on entirely false premises, but that is another story.
What you and Kiwi have said about "Lee Jay's claim" is deliberately disingenuous, and I think you probably know it. Lee Jay and others have addressed the issue of EVF lag causing problems in keeping
fast-moving subjects appropriately in the image frame. That is especially true when the fast-moving subjects also change direction rapidly and unpredictably. No one denies the ability of EVF cameras to pan and follow moderately moving subjects with predictable trajectories.
I can see why, as it'd be the same case as with fast-paced online games. Frame timing, dropped frames, etc. can completely confuse our own tracking (for good reason). This may also be the case in sports. I think it may represent 0.01% of cases (that level of unpredictable speed while holding the button down in AF-C), and different cameras will behave differently, so the problem is when there is a broad generalization.
Creating a disingenuous strawman argument does no good to anyone in this discussion--not even the ML fanboys. It also makes you seem like you only want to engage in argumentation instead of addressing the actual issue of lag. That's also how I interpret your continual conflation of EVF lag with the very different effects of shutter lag, mirror blackout, and other issues unrelated to tracking.
We lack the benchmarks. The problem is when he says he needs 9x the pixels, or when he is generalizing from one camera to the entire world of EVF cameras.
If one has a camera with an EVF, a full-frame sensor and a moderate focal length lens help one keep the subject within the image field. But that is simply overpowering the tracking problem by giving you a smaller image in a wider field. What we're really addressing here is tracking with otherwise similar framing, not using different-sized frames to compensate for viewfinder lag.
I think that without a deeper understanding of lag, this just creates flame wars. The ~5 ms lag of many modern EVF systems is too short, by itself, to cause any of this. For lag to be a non-issue:
1) the lag needs to be small;
2) the frame updates need to be consistent (no dropped frames);
3) the timing needs to be regular (no frames arriving out of time).
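To see why all three conditions matter together, here's a toy simulation. All the numbers are assumptions for illustration (5 ms base lag, 60 fps refresh, 5% dropped frames, up to 3 ms jitter), not measurements from any real camera:

```python
import random

random.seed(0)

BASE_LAG_MS = 5.0              # assumed average EVF pipeline lag
FRAME_INTERVAL_MS = 1000 / 60  # assumed 60 fps refresh
DROP_RATE = 0.05               # assumed fraction of dropped frames
JITTER_MS = 3.0                # assumed worst-case timing jitter

def perceived_delay():
    """Delay of one displayed frame: base lag, plus random jitter,
    plus a full extra frame interval if the new frame was dropped
    (the stale frame stays on screen one interval longer)."""
    delay = BASE_LAG_MS + random.uniform(0, JITTER_MS)
    if random.random() < DROP_RATE:
        delay += FRAME_INTERVAL_MS
    return delay

delays = [perceived_delay() for _ in range(10_000)]
print(f"mean perceived delay: {sum(delays) / len(delays):.1f} ms")
print(f"worst case:           {max(delays):.1f} ms")
```

The point of the sketch: even with a tiny 5 ms base lag, dropped frames and jitter push the worst-case delay toward a full frame interval or more, which is what your tracking reflexes actually experience.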
Reading a sensor 18 times per second at full resolution can quickly bottleneck the pipeline, introduce blackout times, and add delays. DSLRs have related problems: while the photo is being taken, the mirror is in the way and the viewfinder blacks out. But if you take 18 shots in a second on a DSLR, even though the sensor still needs to be read, the viewfinder is alleviated because the light is routed through the mirror and prism even when the camera's core processing is saturated. Ultimately, the EVF will WIN, because it will eventually be able to keep showing the image with incredibly low delay 100% of the time. But that is not easy when the readouts create locking, delays, or unresponsiveness. For some people, the mirror may be a big issue too. For now, though, I can understand that if the system is not designed for these cases, the processing load may be affecting the EVF. And I think anyone not shooting full-res AF-C bursts at 15 or 30 shots per second may never notice.
The other aspect is that people who trained their targeting on an OVF do have to adjust a little. They will need to cope with a 15 ms, or 4 ms, or X ms delay. All the tracking you do is not conscious; it has been automated. So it's normal if at the beginning you are off by some 50 pixels or so. When I played games, I adjusted to the lag: my brain would, for example, aim 4 pixels to the right or left, knowing there is some slight delay. Literally, if you are always looking 15 ms into the past in highly dynamic shooting, you will be off by a certain number of pixels IF you have trained yourself only on 0 ms delay. But anyone shooting with a 15 ms EVF will still get it mostly right, except when the subject takes an unexpected turn. If you are used to OVF and are now trying EVF, here is what happens:
The subject travels fast to the right; you track it. You are 15 ms behind without knowing it.
The subject abruptly changes direction. For the first ~7 ms it is still well framed. Now you depend on the refresh to know it turned, and that may not show until 15 ms later. Now you are ~23 ms apart, and only now do you notice it... you still haven't even started processing, and may need a few more frames. Plus, add reaction time, 20 ms? You keep panning to the right for 20 ms more. The math may be off and not perfect, it's just an example, but when things compound in this way, you may well have a subject that traveled left for 30 ms while you traveled right for 30 ms. You will still get all the shots, but the framing may be affected a lot more than with an OVF. Especially if the frame rate changes, the camera is not very good at AF-C, there is any timing issue, or the baseline delay is higher. The first thing that HELPS is trying different cameras. The second is that, yes, you need to adjust to the delay. If you know it's 20 ms, you want to always aim 20 ms AHEAD. And when you see the subject change direction, you need to OVERCOMPENSATE at first. This is how it works when you chase something with a little lag. At these focal lengths and in this kind of photography, once you factor in our own reaction time and everything else, it's easy to blame the EVF when in reality it was already hard to get good framing in these cases. But if you do this for a living, you can see that maybe a lot more shots are a little more off in framing. And then there's the temptation to always blame the camera as well.
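The direction-change scenario above can be put into rough numbers. This is a back-of-envelope sketch with made-up values (15 ms display lag, 20 ms reaction time, arbitrary pixel speeds), just to show how the delays stack into a framing error:

```python
# All values assumed for illustration; none come from a real camera.
DISPLAY_LAG_MS = 15.0  # you always see the subject 15 ms in the past
REACTION_MS = 20.0     # time to notice the turn and start correcting
SUBJECT_SPEED = 2.0    # px/ms, moving left after the turn
PAN_SPEED = 2.0        # px/ms, you are still panning right

# From the instant the subject reverses until you begin correcting,
# you pan one way while the subject moves the other way.
wrong_way_ms = DISPLAY_LAG_MS + REACTION_MS
framing_error_px = (SUBJECT_SPEED + PAN_SPEED) * wrong_way_ms

print(f"panning the wrong way for ~{wrong_way_ms:.0f} ms")
print(f"framing error when correction starts: ~{framing_error_px:.0f} px")
```

With these toy numbers the subject and the frame diverge for ~35 ms, and the error grows linearly with both the delay and the relative speed, which is why the same lag that's invisible with slow subjects can wreck framing on fast, erratic ones.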
But the problem is that this experience is not widely shared... This is an extreme case, and those who feel they aren't being listened to then feel the need to generalize from one camera to all of them, or make grossly overstated claims that just annoy the entire universe.