Seeing how many images modern stacked-sensor cameras process per second, I'm guessing their processors are more powerful at image processing than mainstream PC processors. Is that accurate?
The buffer is generally the only limitation; otherwise, the stacked-sensor mirrorless cameras capture, process, and save RAW + JPEG at an insane speed. Sony's A1, the latest Fuji X-H2S, Canon's R3, Nikon's Z9, the Olympus OM-1, etc., come to mind.
On top of that, they run the camera's OS (whatever that is) and support Wi-Fi, HDMI, Ethernet, Bluetooth, external recording, and a variety of video quality settings.
If a minimalist Linux machine were built on a fairly recent laptop and the fastest available image processing application were run on it, I don't think it would process anything close to 20 images per second.
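For what it's worth, here's the kind of quick sanity check I have in mind (a rough sketch in Python with Pillow and NumPy; the ~50 MP frame size is just my approximation of an A1-class image, and random data won't compress like a real photo):

    # Rough throughput test: how many full-size JPEG encodes per second
    # can a laptop CPU manage? Numbers are purely illustrative.
    import time
    from io import BytesIO

    import numpy as np
    from PIL import Image

    # Hypothetical ~50 MP frame (roughly A1-class resolution), random data
    frame = np.random.randint(0, 256, (6144, 8192, 3), dtype=np.uint8)
    img = Image.fromarray(frame)

    n = 20  # the per-second rate the cameras are claimed to sustain
    start = time.perf_counter()
    for _ in range(n):
        buf = BytesIO()
        img.save(buf, format="JPEG", quality=90)
    elapsed = time.perf_counter() - start
    print(f"{n / elapsed:.1f} JPEG encodes/sec")

And even that only measures JPEG encoding; the camera is also doing sensor readout, demosaicing, noise reduction, and RAW writing on top of it.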
Am I missing something?
Thanks
--
See my profile (About me) for gear and my posting policy.