Have you ever looked at your smartphone or GoPro and said, "I wish my camera could stabilize an image like that?!" Chris explains the limits of electronic image stabilization, and why your camera probably can't stabilize like that.
I think that you have this one wrong. The way you describe EIS is where GoPro was a couple of years ago (and it was giving rather mediocre results). The new Hypersmooth stabilization, the one everyone is "raving about", uses built-in high-speed gyroscopes/accelerometers to stabilize the video on a per-line basis, which also removes most of the rolling shutter effect even though the sensor readout is still as slow as before.
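For anyone wondering what "per-line" means in practice, here is a rough sketch of the idea (my own illustration, not GoPro's actual pipeline; the function and variable names are hypothetical): every scanline gets its own correction, interpolated from the gyro trace at the moment that line was read out, which is why it also cancels most rolling-shutter wobble.

```python
# A sketch of per-line EIS, assuming a gyro trace sampled much faster than the
# frame rate and a known per-line readout time. Real firmware does this on the
# ISP; sign conventions depend on how the gyro is calibrated to the sensor.
import numpy as np
import cv2

def stabilize_frame(frame, gyro_times, gyro_angles, frame_start, line_time, K):
    """frame: HxWx3 image read out top to bottom (rolling shutter).
    gyro_times: gyro sample timestamps (s); gyro_angles: Nx3 integrated camera
    angles (rad) at those timestamps; K: 3x3 camera intrinsic matrix."""
    h, w = frame.shape[:2]
    map_x = np.zeros((h, w), np.float32)
    map_y = np.zeros((h, w), np.float32)
    xs = np.arange(w, dtype=np.float32)
    for row in range(h):
        t = frame_start + row * line_time
        # Camera orientation at the moment this scanline was read out
        angles = np.array([np.interp(t, gyro_times, gyro_angles[:, i]) for i in range(3)])
        R, _ = cv2.Rodrigues(angles)
        # Homography mapping the desired (steady) view to the view this line saw
        H_line = K @ R @ np.linalg.inv(K)
        pts = np.stack([xs, np.full(w, row, np.float32), np.ones(w, np.float32)])
        src = H_line @ pts
        map_x[row] = src[0] / src[2]
        map_y[row] = src[1] / src[2]
    # Sample the captured frame at the per-line source coordinates
    return cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR)
```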
BTW, why would smaller sensors be read out faster? The readout speed depends on video resolution, sure, but why on sensor size?
If I were to answer why the GoPro is better stabilized:
- cameras do not have the gyroscopes; that will hopefully change in future
- it is easier to stabilize wide lenses than narrow lenses
- it is easier to specialize the stabilization algorithm for one specific fixed lens than for an interchangeable-lens system
- camera makers are less motivated to implement EIS properly
1st point - I assume the Olympus featured in the video does have a 6-axis gyro sensor, and that is why it is so good! The action cameras reached that level at the point they got their 6-axis gyro sensors (movement and rotation axes); GoPro got left behind until recently because they didn't implement enough axes.
3rd point - It is not just that the lens is fixed, it is mainly that it is a prime lens. Stabilisation corrections require taking into account all lens distortions, so all that fisheye on a GoPro lens has to be accounted for while moving the image about and calculating the correct position of the pixels; otherwise it looks a real mess, with the corners getting warped in apparently random directions. That is much easier to do with a fixed prime lens, but of course still perfectly possible on any camera as long as you have a lens calibration with sufficient data for it. In fact, the Gitup F1 action camera has two lens options, and accompanying calibrations for the EIS stabiliser.
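A rough illustration of that calibration point, using OpenCV's fisheye model (the intrinsics K and distortion D below are placeholder numbers, not a real GoPro calibration): the stabilizing rotation has to be folded through the lens model, which is straightforward when the lens never changes.

```python
# Sketch: apply a stabilizing rotation through a fisheye lens model.
import numpy as np
import cv2

K = np.array([[900.0, 0.0, 960.0],
              [0.0, 900.0, 540.0],
              [0.0, 0.0, 1.0]])          # placeholder intrinsics
D = np.array([0.05, -0.01, 0.002, 0.0])  # placeholder fisheye coefficients k1..k4

def stabilized_remap(image, R_correction):
    """Undistort, apply the stabilization rotation, and project to a rectilinear output."""
    h, w = image.shape[:2]
    # initUndistortRectifyMap lets the correction rotation be folded into the
    # undistortion itself, so the whole warp is a single remap.
    map1, map2 = cv2.fisheye.initUndistortRectifyMap(
        K, D, R_correction, K, (w, h), cv2.CV_16SC2)
    return cv2.remap(image, map1, map2, cv2.INTER_LINEAR)
```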
I've always said I would always want the best IBIS and low rolling shutter, even if it meant a little more noise and less DR during video. It's not even close. I don't want people getting headaches and motion sickness watching a video, and even some jerkiness during video is a big distraction and a major turn-off. If you don't want to pay for a gimbal, a GoPro or an Olympus ILC is the best option (although the GH5 is pretty good too).
Unrelated, but if this was shot on the Panasonic S1H in 10-bit video, why am I seeing extreme colour banding throughout the video? Is that YouTube compression, or bad V-Log LUT recovery in editing?
How do you send it to YouTube, and would it be better to send it in a format quite close to what they will use, so they don't have issues interpreting all that colour resolution?
So if a manufacturer wanted to save on IBIS, they could use two internal processing chips. One reads the sensor out as fast as possible, line by line, coupled to a gyroscope, to create an alignment map, then sends that data to the recording chip at 24, 30 or 60 fps to offer stable images. I wonder if Qualcomm chips are cheap enough, and easier than a complicated floating-sensor system for stabilization.
For photography you could use the system I described for handheld high resolution: combine an alignment map with at least 4 rapidly taken images to gain added pixel information. I hope Panasonic offers a handheld high-resolution option in its 6K or 4K Photo modes.
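A toy version of that alignment-map idea for stills, assuming the burst is already captured and using a simple image-based shift estimate in place of gyro data (real pixel-shift and handheld high-res modes merge raw data far more carefully; the function name is hypothetical):

```python
# Toy handheld high-res: align a burst of frames with sub-pixel shifts and
# accumulate them onto a finer grid. This only illustrates align-and-average;
# it ignores demosaicing, local motion and outlier rejection.
import numpy as np
import cv2

def merge_burst(frames, scale=2):
    """frames: list of equally sized grayscale float32 images."""
    ref = frames[0]
    h, w = ref.shape
    accum = np.zeros((h * scale, w * scale), np.float32)
    count = np.zeros_like(accum)
    for f in frames:
        # Global sub-pixel shift of this frame relative to the reference
        (dx, dy), _ = cv2.phaseCorrelate(ref, f)
        # Place the frame on the upscaled grid, shifted back into alignment
        M = np.float32([[scale, 0, -dx * scale],
                        [0, scale, -dy * scale]])
        warped = cv2.warpAffine(f, M, (w * scale, h * scale))
        mask = cv2.warpAffine(np.ones_like(f), M, (w * scale, h * scale))
        accum += warped
        count += mask
    return accum / np.maximum(count, 1e-6)
```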
Higher framerates make you look like an octopus with superimposed images. I too am not 24 fps obsessed, but I feel like framerate is a lot like shutter on a stills camera (in fact, if you only use, say, a 180-degree shutter angle, they're all but equivalent). Why would you ever not shoot at a 1/4000s shutter? Because water doesn't look still in your brain. Because motion blur makes a car look like it's moving fast rather than parked while going by at 200 mph. Because shutter is typically as SLOW as you can get away with in stills photography. Why do you think video quality would be any different?
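For reference, the shutter-angle analogy reduces to a single formula; a quick sketch (not tied to any particular camera):

```python
# Shutter angle vs. shutter speed: exposure time = (angle / 360) / fps.
def shutter_time(fps, shutter_angle_deg=180.0):
    """Exposure time in seconds for a given frame rate and shutter angle."""
    return (shutter_angle_deg / 360.0) / fps

# 180 degrees at 24 fps -> 1/48 s; at 60 fps -> 1/120 s
print(1 / shutter_time(24))  # ~48
print(1 / shutter_time(60))  # ~120
```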
I wish someone would build an APS-C camera in the format of the DJI Osmo Pocket: a gimbal camera with no viewfinder but a small, high-res OLED display on the handle, and much less than the weight of a full ILC on a gimbal.
Guys, I’m sorry to say it, but this is THE FIRST fail I have observed in your videos! I do not think you managed to get your point(s) across. (For the other videos, I enjoyed very much what you do, and I enjoy it “professionally”: my job is ALSO about presenting very heavy-weight topics in “deceivingly” simple ways.)
If I understand it correctly, you wanted to: • List particular shake-related defects of the resulting video. • Specify which of them can be fixed by EIS and which by OIS/IBIS.
With the first item, I feel you mostly succeeded; however, the differences between the types of defects could have been stressed a bit more.
I think the second was more or less a complete fail. IIUC, NONE of the issues you listed can be fully fixed by EIS, but the explanation of which of them CANNOT be fixed by OIS/IBIS was completely muffled!
At times it sounded as if ALL of them could be fixed by OIS/IBIS, at other times not. Sorry again to bear bad news!
One more point: I do not think you listed one type of defect clearly visible in the better-stabilized snow scenes you show: • the perspective distortions. Indeed, EIS/OIS/IBIS have a chance to fix “the rotational” shakes of the camera (those where the entrance pupil does not shift). With a 2D subject, “the other 2 of 5” axes can also be fixed. However (obviously!), with a 3D subject, shakes resulting in a shift of the point of view cannot be compensated.
(Probably this is most noticeable at wider angles of view, when closer objects on the ground are visible together with distant ones.)
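To put that in symbols (standard pinhole-camera reasoning, added here purely as an illustration): the pixel motion caused by a pure camera rotation does not depend on scene depth, so a global warp or sensor shift can undo it, while the motion caused by a translation scales with 1/Z, which is exactly the parallax that no stabilizer can remove.

```latex
% A point X at depth Z projects to x ~ K X. After the camera rotates by R and
% translates by t, the same point projects to x' ~ K (R X + t), which up to
% scale is
\[
  x' \;\sim\; K R K^{-1}\, x \;+\; \tfrac{1}{Z}\, K t .
\]
% The first term is a depth-independent homography (fixable by EIS/OIS/IBIS);
% the second term depends on Z, so near and far objects move differently.
```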
I don't know where this "24fps is best" obsession started, but that frame rate is terrible for action and panning. Playing video at 60fps gives you an experience closer to the real thing when it comes to motion quality. I'd use 24fps mostly in low light, but the higher the frame rate the better when looking for realism, aka fidelity.
Look at how people are raving about the Apple ProMotion display featured in high-end iPads, where the screen can go up to a 120 fps refresh rate, for instance.
By the way, you should have mentioned 360 cameras, which are doing amazing stuff with EIS, much better than GoPros.
" don't know where this "24fps is best" obsession started, but that frame rate is terrible for action and panning."
People don't even realize that the issues with this low frame rate are so bad that cinematography manuals have charts for how fast you can safely pan at various focal lengths and shutter angles.
24 fps is a *minimum* speed for creating the illusion of continuous motion, and it produces artifacts that people have learned to associate with "cinema". It was chosen in part to minimize the amount of expensive film stock needed to shoot and distribute films, not as an ideal frame rate for the best possible quality projection. :-/
Thank you, sir. I hate 24p with a burning passion, even in cinema. I’m not sure who watches movies and thinks “oh boy, I’m so glad that I can’t really tell what’s going on in this scene. It would really ruin the movie if it was comprehensible.” But especially for personal media and YouTube, 60 and even 90 to 120fps is the future. It’s real, personable, crisp and clear. “Team crispy” talks about their high res 4K and 8K red raw, but then chop and blur it all up with the terrible 24fps. One day the world will come around. One day...
Oh boy. So 24p gives a natural look. 30p looks less so, hence "premium" telly programmes were shot on film before video had the capability to shoot 24p. Action sequences were shot for decades at 24p and managed to not be horrible. The reality is that there is no "best" frame rate. The video Chris and Jordan did comparing frame rates speaks to the advantages of different frame rates for different purposes.
ANOTHER related issue is the current obsession with 180° shutter angle. UNLESS the aim is a choppy motion, I think one should maximize the shutter angle to be as close to 360° as possible…
@lilBuddha: > “The video Chris and Jordan did comparing frame rates speaks to the advantages of different frame rates for different purposes.”
It DOES speak — but it was in no way convincing. The way I see it: • You want choppy — you shoot choppy. • You do not want choppy — you should get as close to 120fps with 360° shutter angle as possible. I do not think they provided any reason to have anything in between…
Depends what you're shooting but anyone who says they prefer viewing movies with the awful soap opera effect must be taking something. It instantly makes cinema look cheap and artificial.
The reason people often go wrong shooting 24fps is that they aren't observing the 180 or 360 degree "rule". If you use a high shutter speed with 24fps it will look awful.
It's not just that we're used to 24FPS. It's because it more closely resembles what we perceive as natural movement.
That's why 24fps movies that have been interpolated to 300fps by a tv look so unnatural (also why the TV refresh rate peeing contest is so ridiculous)
For action videos though, higher framerates often work way better, and that same look that hinders movies will actually complement things. Sometimes you want to be able to pan fast and keep details sharp, etc.
I believe it started with the Panasonic DVX100 which was the first really good digital video camera. It shot at 24fps and I’ve read that people struck negs off of its output which were then printed to film.
fatdeeman: Why do you think television has been using 50i/60i for many decades? Because it's much more fluid and thus poses fewer restrictions on camera and subject movement. It's not artificial, it's realistic. It's not just used for soap operas, that's a byproduct; it's used for all news footage, sports and games, where 24fps is unwatchable.
Reality is fluid, not blurred or chopped. A higher frame rate might stand out to your brain as less natural, but not to mine. Choppy and blurry low-framerate footage is not realistic to me in the slightest.
The 24 fps rate was settled on with the introduction of optical soundtracks and the film speed needed to get acceptable audio frequency response from the optical readers of the time. It had little to do with 24 fps being the minimum frame rate necessary to create the illusion of fluid motion, and it's certainly not able to deliver the illusion of smoothness with some pans and camera movement. Before optical sound, frame rates were as low as 12fps and as high as 40, depending on the studio.
I had the opportunity to see a documentary produced by Douglas Trumbull (one of the special effects designers for the film 2001: A Space Odyssey) at the 1986 world's fair. It was displayed at 65fps as I recall. It looked amazing, and one benefit of high frame rate film is that dirt and miscellaneous scratches remain on the screen for a shorter period of time: the high frame rate acts as a filter to make them less visible.
No frame rate is “better.” They're all products of their industries and the vagaries of progress. NTSC television is done at 30 and 60fps because that matches the 60Hz frequency of the electricity in NTSC countries.
If such is the case, then it would mean that footage at 100fps would look most natural by merging two successive 1/100s input frames for every output frame (so, effectively a “720°” shutter angle) so that each frame represents the last 1/50s, to maintain the correct amount of motion blur. I wonder how that would look? I’m curious now…
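That merge is easy to prototype; a toy sketch in Python (hypothetical function name) that simply averages each 1/100 s frame with the previous one so every output frame covers the last 1/50 s. Strictly, the averaging should be done on linear-light values rather than gamma-encoded ones:

```python
import numpy as np

def blend_overlapping(frames_100fps):
    """frames_100fps: list of HxWx3 float arrays shot at 100 fps, 360-degree shutter.
    Output stays at 100 fps; each frame averages the current and previous input
    frame, i.e. an effective "720-degree" shutter covering the last 1/50 s."""
    out = [frames_100fps[0]]
    for prev, cur in zip(frames_100fps[:-1], frames_100fps[1:]):
        out.append((np.asarray(prev) + np.asarray(cur)) / 2.0)
    return out
```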
“It's not just that we're used to 24FPS. It's because it more closely resembles what we perceive as natural movement.”
The link I posted implies that it was only true when we were physically limited by a 180° shutter. But now we are not, and we can maintain the necessary motion blur with many more frames. So why should we restrict ourselves to 24fps? Surely our eyes are faster than that, otherwise we wouldn’t see a difference to higher framerates to begin with.
Thinking about it a little bit more, a “720°” shutter might not be necessary (but it probably wouldn’t hurt for compatibility with lower-framerate displays). As long as the motion is 100% continuous (which a 360° shutter should ensure), I suppose that the blurring by the eyes should compensate for the “lack” of it in the video. Not sure why that didn’t immediately occur to me…
Interesting article, but I have consistently picked the high framerate + high SS combinations as the most natural. And there is a good reason for that - the theory that you need to artificially blur the footage in order to portray motion realistically makes no sense.
In all cases, it's the brain itself that makes you perceive quick motion, which you are unable to follow with the eye, as blurred. It does not matter if it's on screen or the real thing. A quickly rotating wheel will look blurred no matter what (unless you are able to follow the spokes, in which case a sharp view is expected). There is zero need to artificially blur the footage in addition to that. The only result is that it prevents you from seeing the footage as sharp when you want to track the motion with your eye, like a moving object or a pan.
Just to avoid confusion - what I say is that there is no need to use motion blur at high enough framerates. At low framerates, blur helps in creating the illusion of fluid motion.
I think you were wrong on some aspects. Like Nigel says, the reason action cameras do a better job is mostly their much wider angle. Also, post stabilization can show better results, but I think it has two disadvantages that you did not mention: you will lose quality (recompression and/or pixel re-arranging to keep the same size) and will also lose rows and columns at the edges of the frame in order to stabilize the image. The more stabilization required, the more rows and columns you lose (with good stabilization); otherwise the row and column count is fixed and not optimized (which could be worse). I think the video is very incomplete, and I'm not sure you explained the reasons so much as presented facts based on experience. I was expecting something a little more accurate and scientific.
The stabilization algorithm has a great effect. The software behind electronic stabilisation alone can make a huge difference (GoPro's could simply be better).
About rolling shutter, I can't see the relation with stabilization...
The only real reason that action cameras can do a much better job is that they are generally much wider angle and so there is less stabilisation required.
Rolling shutter is easily corrected in recent action cameras. Stick one on a model aeroplane and compare on with off, and it is obvious that it is not stabilising frame by frame: it does it on a line-by-line or maybe pixel-by-pixel basis based on the movement data from the gyro sensor (accelerometer), so a short vibration halfway down the frame gets cancelled out by the EIS.
Software stabilisation normally can't get close to an action camera's internal stabilisation because it doesn't have the gyro sensor data, so it can only guess at movement on a frame-by-frame basis by analysing the image, which can easily go wrong and make things worse if there is a lot of vibration.
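To contrast with the gyro approach, here is roughly what image-only stabilisation has to do, guessing the motion from the pictures themselves (a minimal sketch with OpenCV, not any particular editor's algorithm; real stabilisers also smooth the estimated camera path instead of locking everything to the first frame):

```python
# Minimal image-only stabilization: track features between consecutive frames,
# fit a similarity transform, and warp each frame back toward the first one.
# With heavy vibration the tracking itself fails, which is why this lags behind
# gyro-driven EIS.
import numpy as np
import cv2

def stabilize(frames):
    prev_gray = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    cumulative = np.eye(2, 3, dtype=np.float32)   # accumulated correction (2x3 affine)
    out = [frames[0]]
    for frame in frames[1:]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300,
                                           qualityLevel=0.01, minDistance=20)
        pts_cur, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts_prev, None)
        good_prev = pts_prev[status.flatten() == 1]
        good_cur = pts_cur[status.flatten() == 1]
        # Estimate frame-to-frame motion (current -> previous) and accumulate it
        M, _ = cv2.estimateAffinePartial2D(good_cur, good_prev)
        if M is None:
            M = np.eye(2, 3)
        cumulative = (np.vstack([cumulative, [0, 0, 1]]) @
                      np.vstack([M, [0, 0, 1]]))[:2]
        h, w = frame.shape[:2]
        out.append(cv2.warpAffine(frame, cumulative, (w, h)))
        prev_gray = gray
    return out
```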
I have to say the stabilization on an EM1 Mark II is better than on my GoPro. I've seen extremely sharp 1 and 2 second exposures on the Olympus too, something I've never seen on a GoPro or smartphone. Both the Oly and GoPros are excellent though.
Of course the Olympus has very little rolling shutter too.
I wasn't using lens IS, but that is possible with the right lens. Conversely, my Sony can also use the full combination, yet between the rolling shutter and poor IS (even with all three combined), video can be horrid. For some reason, with certain cameras electronic stabilization adds a weird wobble to the corners; I noticed this on my A77 II and other cameras.
Sony A-mount cameras from the original A77 onward used electronic stabilization for recording video. They dropped it for the A99 II. I think it works well. It uses the same sensors as the sensor-shift system for stills.
@QuietOC It was a real shame when Sony crippled video with the A99 II. It could have been a good camera. And sadly, all A-mount lenses are crippled on the A7-series cameras too. It's weird that Sony has the tech to combine electronic IS and 5-axis IBIS like other companies, but does not. But then look what they did to A-mount.
As mentioned in the video, EIS does nothing for stills shooting, which is why your Olympus clearly outperforms the GoPro there. It's only when capturing video that the GoPro's EIS system makes it comparable to the Olympus.
Chris... Thanks for the quick class on image stabilization! My only complaint is that I cannot take notes as fast as you can talk... however, with it being on video, I can go over areas my brain misses as often as needed :D Again, THANKS!!
Just a bit of a nitpick, you say around the 3:18 mark that "electronic stabilization cannot stabilize rolling shutter". This isn't entirely the case. Many phones and small format sensor cameras apply warping based on accelerometer data in order to remove the leaning effect you get from rolling shutter. Many video editing suites also offer this sort of correction. It's a bit more destructive than other forms of electronic correction as you have to warp the entire frame removing a certain amount of resolution from the final image, but it's definitely something you can correct for.
While rolling shutter caused by camera panning can be corrected that way (the camera can also just crop during recording and skew the readout lines at capture time to compensate), rolling shutter can also be caused by a moving subject when the camera is stationary, and that kind cannot easily be corrected in post with accelerometer data.
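A crude illustration of the first case: for a constant-speed pan, the rolling-shutter "lean" is just a per-row horizontal offset, so a simple shear undoes it (a sketch only, with hypothetical parameter names; it does nothing for a moving subject in front of a static camera):

```python
# Undo rolling-shutter lean from a constant-speed pan: each row is read out a
# little later, so it is shifted horizontally by (pan speed) x (row readout
# time). A shear in the opposite direction straightens verticals again.
import numpy as np
import cv2

def unskew(frame, pan_px_per_s, line_time_s):
    h, w = frame.shape[:2]
    skew = pan_px_per_s * line_time_s          # horizontal shift per row, in pixels
    M = np.float32([[1, -skew, 0],             # x' = x - skew * y
                    [0, 1, 0]])
    return cv2.warpAffine(frame, M, (w, h))
```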
Actually, this went a little fast for me. In many cases I could not see the effect you were trying to show in the examples. Please slow down. I do, however, appreciate the fact that you are trying to explain more video technology. We all have it now on our cameras, but do not know how to use it well. I think that video will dominate image making in the very near future.
Quick tip - You can slow the playback speed on YouTube by hovering over the video and clicking on the gear icon that appears at the bottom. This comes in handy to see some of the effects discussed in this video.