This is by no means in its final form, but Lafodis has been progressing. It's now much more usable as well as easier to build, so hopefully the full open source software and hardware design release will soon be in good form....
The new herringbone gear drive system, using a separate Arduino Pro Micro for stepper control, is working very well, and the parts cost is still under $50. Typical positioning accuracy over the complete range of motion appears to be within 30 microns, which is far better than earlier versions achieved (especially rotationally) and is, I think, quite good for 100%-3D-printed motion parts. The drive controls have also been greatly simplified: a command basically specifies a feedrate and a target absolute angle and radius, and a new home position can also be set.
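To give a feel for how small that control vocabulary is, here's a minimal host-side sketch of command formatting. The letter codes and field layout are entirely hypothetical (invented for illustration); the actual Lafodis protocol may look quite different:

```python
def move_command(feedrate_mm_min, angle_deg, radius_mm):
    """Format a hypothetical move command: feedrate, target absolute
    angle, and radius. Syntax is illustrative, not the real protocol."""
    return f"F{feedrate_mm_min:.0f} A{angle_deg:.3f} R{radius_mm:.3f}"

def home_command():
    """Declare the current position as the new home (hypothetical syntax)."""
    return "H"

# e.g. move at 300 mm/min to 45 degrees, 80 mm radius:
cmd = move_command(300, 45.0, 80.0)
```

The point is just that three numbers plus a home declaration cover the whole motion interface.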
The ESP32-CAM sensor used here now also has an NIR-blocking filter, so colors are much truer... although the filter is removable for NIR work. A typical single exposure (not even at the highest quality setting) is:
A single capture using ESP32-CAM with Wollensak 135mm f/4.5 @ f/11
Really quite reasonable for 2.2 micron pixels in a $2 OV2640 camera!
Aiming the camera and tweaking image parameters is now done via an HTML interface accessible from any browser over 802.11 WiFi. Commands and image capture downloads also now go over WiFi rather than Bluetooth, although the TF card slot is still available for local capture storage (up to about 20K 1600x1200 ESP32-CAM captures per 4GB card).
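That capacity figure works out to roughly 200 KB per capture. A quick back-of-the-envelope check (the 4 GB and 20K numbers are from above; the rest is arithmetic):

```python
CARD_BYTES = 4 * 10**9            # nominal 4 GB TF card
CAPTURES_PER_CARD = 20_000        # figure quoted above
PIXELS = 1600 * 1200              # ESP32-CAM capture size

bytes_per_capture = CARD_BYTES / CAPTURES_PER_CARD   # ~200 KB each
bits_per_pixel = bytes_per_capture * 8 / PIXELS      # ~0.8 bits/pixel
```

About 0.8 bits per pixel is ordinary JPEG territory, so the 20K-per-card figure is consistent with moderately compressed captures.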
It was a VERY windy day today. In fact, just after a 16x16mm capture (~52MP), the wind blew over the tripod Lafodis was sitting on, smashing the lens into the ground hard enough to shatter the lens mount plate -- I'm printing a replacement with an improved design. Nothing else was damaged. Anyway, the point is that the wind was definitely moving the tree branches around a lot, which makes this a really good test of dynamic capture/stitching.
Here's the crude first light stitch from that 16x16mm capture. Remember that Lafodis' full capture area is a 160mm diameter circle, so this is just 0.3% of a full-resolution Lafodis capture....
First light stitch, done in real time on an old laptop during Lafodis capture
The scan order used for this was a modified Hilbert walk, which improves the temporal adjacency of spatially nearby samples.
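For readers unfamiliar with Hilbert walks: a Hilbert curve visits every cell of a 2^n by 2^n grid while only ever stepping between adjacent cells, so samples taken close together in time are also close together in space. Below is the standard distance-to-coordinate conversion for a plain Hilbert curve -- not Lafodis' modified walk, which also has to handle grids that aren't a power of two:

```python
def hilbert_d2xy(order, d):
    """Map distance d along a Hilbert curve covering a 2**order x 2**order
    grid to (x, y) cell coordinates (textbook iterative algorithm)."""
    x = y = 0
    t = d
    s = 1
    while s < (1 << order):
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        # Rotate/reflect the quadrant as needed
        if ry == 0:
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

# Scan order for a 16x16 grid of capture positions:
scan_order = [hilbert_d2xy(4, d) for d in range(16 * 16)]
```

Every consecutive pair in `scan_order` differs by exactly one grid step, which is what keeps spatially adjacent captures temporally adjacent.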
The blending in this test is rather crude, and the scan parameters fell slightly short of covering the full area (hence the black lower-left corner), but it worked pretty well. For comparison, Hugin failed to stitch these 196 image captures automatically even after more than 40 CPU hours, whereas the new confidence-based stitching algorithm for Lafodis produced this incrementally, in real time, during capture. Fully automatic stitch quality should improve significantly over the next month or so as the blending, etc., get tweaked.
I'll be posting more updates as things develop. It's been really slow getting this tweaked to be more user-friendly for open source release during the pandemic... but everything is going slowly during the pandemic.