ZStacker - My automated focus shifting proof of concept

Started 2 months ago | Discussions thread
Horshack Veteran Member • Posts: 9,787

The in-body implementation of focus shifting on Z bodies lets you visually specify the starting focus, which is the point nearest the camera you want in focus for your stack. The camera doesn't let you visually set the ending focus, the point furthest from the camera you want in the stack. That capability is available on Fuji cameras, as described for example on the GFX 100S focus stacking manual page. On Nikon you instead have to use trial and error to arrive at a "number of shots" value that will include the ending point of focus, and Nikon doesn't provide much help in this regard:

"It can be tricky to choose the minimum number of shots needed based on the lens focal length or focus step width, but you can take some of the pressure off during shooting by taking a large number of photos with the idea that you can choose the ones you want during focus stacking." (source).

I recently discovered that Nikon added an EXIF field on their Z system that records the lens's absolute focus position. This is an integral value, presumably corresponding to the physical encoded position of the focus element. Infinity is generally encoded with a value near zero, whereas MFD (minimum focus distance) is encoded at whatever unique number of focus positions the lens supports. This varies from lens to lens, and across focal lengths of the same lens (zooms). For example, the 50mm f/1.8 Z has infinity @ value 7 and MFD @ 1876. For the 24-70mm f/4 Z, 24mm has infinity @ 18 and MFD @ 1324, while at 70mm infinity is @ 11 and MFD @ 3985.
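To make those numbers concrete, here's a trivial sketch (in Python, though the POC itself is a bash script) tabulating the quoted endpoints and the resulting focus travel per lens. The dictionary values are just the figures from this post; other lens samples or firmware may report different ranges:

```python
# Focus-position endpoints quoted above, as (infinity, MFD) pairs.
# These are the values I observed; treat them as illustrative, not canonical.
lens_range = {
    "50mm f/1.8 Z": (7, 1876),
    "24-70mm f/4 Z @ 24mm": (18, 1324),
    "24-70mm f/4 Z @ 70mm": (11, 3985),
}

for lens, (inf_pos, mfd_pos) in lens_range.items():
    travel = mfd_pos - inf_pos  # encoder positions between the two limits
    print(f"{lens}: {travel} focus positions from infinity to MFD")
```

Note how the 24-70mm f/4 Z has roughly three times the encoded focus travel at 70mm as at 24mm, which is why any step-count chosen by trial and error for one focal length won't transfer to another.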

This encoded absolute lens position value has all kinds of potential uses, including automated focus-precision testing. One can measure the standard deviation of post-focus lens position across a variety of shooting parameters and light levels. I've already done some of this but don't have results in a form ready to share.
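As a sketch of what that precision test looks like, the snippet below computes the spread of a set of made-up post-AF focus positions; in practice the values would be sampled from the EXIF of repeated shots of the same target:

```python
from statistics import mean, stdev

# Hypothetical absolute focus positions read from EXIF after eight
# consecutive AF acquisitions of the same target (illustrative values only).
positions = [1042, 1045, 1041, 1044, 1043, 1046, 1042, 1044]

print(f"mean position: {mean(positions):.1f}")
print(f"stdev: {stdev(positions):.2f} focus positions")
```

A stdev of a couple of focus positions out of a ~2000-position range would indicate very repeatable AF; comparing that spread across apertures, AF modes, and light levels is the test I have in mind.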

A more practical use that occurred to me is the automation of focus bracketing. Like most cameras, Nikon's remote camera API via MTP provides the ability to set the focus position programmatically, but only in relative steps. This is why Helicon Remote lets you use the camera's AF to set the starting or ending focus point but requires you to set the other end manually in Helicon: the camera's API doesn't let Helicon direct focus to absolute locations, or even read the absolute lens position. So instead Helicon has to count the number of relative focus movements you command in its GUI to know the other end of the focus range.

Although Nikon added an EXIF field with the absolute focus position, they don't appear to have published a method to read that position remotely from the camera programmatically, or to set an absolute focus position. However, since the position is recorded in the EXIF of images, we can achieve the same result by taking photos and sampling the focus position from the EXIF. This is what my POC software does: it racks focus to both infinity and MFD, sampling the focus position at each. It then lets you set the starting and ending focus points of your stack using the camera's normal focusing tools, and takes a photo at each to sample its focus position from the EXIF. From those values it calculates the number of relative focus increments needed to command the camera to your starting and ending stack positions relative to MFD/infinity, then performs the full focus stack by iteratively taking photos at each focus position, separated by a user-specified focus increment. While stacking, it transfers the images directly to the computer without requiring them to be saved to the media card; afterwards you can import the images into your tool of choice to composite the stack. In my demo below I use Helicon Focus.
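The step-planning arithmetic described above can be sketched as follows. This is an illustrative Python reimplementation, not the actual bash POC; `plan_stack` and all the numbers are hypothetical, with the positions standing in for the EXIF-sampled values:

```python
def plan_stack(start_pos, end_pos, step):
    """Return the absolute focus positions to shoot, inclusive of both the
    starting and ending focus points, stepped by a user-specified increment.

    start_pos/end_pos are the EXIF-sampled focus positions of the near and
    far ends of the stack; step is the focus increment between shots.
    """
    lo, hi = sorted((start_pos, end_pos))
    positions = list(range(lo, hi + 1, step))
    if positions[-1] != hi:
        positions.append(hi)  # always include the ending focus point
    return positions

# e.g. starting focus sampled at position 1500, ending at 300,
# with an increment of 100 focus positions between shots
shots = plan_stack(1500, 300, 100)
print(len(shots), "shots:", shots)
```

The driver loop would then translate each planned position into a count of relative focus-drive steps from the current position, since the MTP API only moves focus relatively.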

Right now this is bare-bones and not ready for general use. It's implemented as a bash script, which isn't ideal but was the fastest way to throw together a POC. It uses gPhoto2 to control the camera over USB (WiFi should work too). If I ever develop this into a full open-source project I'd likely code it in Python, as I did for airnef. This demo uses JPEG, but the tool supports any format the camera can shoot, including raw.
