daran
Senior Member
Playing with my shiny sharp 18mm lens on my A7R2, I was testing how much star trailing I get at different exposure times to find a good compromise for my next star gazing trip. What I expected, based on some previous research, was that the best results would be around 5s to 8s @ ISO 640. What I found instead was a huge discrepancy between shots >= 4 seconds and those <= 3.2s.
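For reference, here's the rough back-of-the-envelope trailing calculation I had in mind. It assumes an approximate 4.5 µm pixel pitch for the A7R2, the sidereal rate, and a star near the celestial equator (worst case); those numbers are my assumptions, not measured values:

```python
import math

# Assumed values (approximate, for illustration only)
PIXEL_PITCH_UM = 4.5          # A7R2 pixel pitch, approx.
FOCAL_LENGTH_MM = 18.0        # the lens in question
SIDEREAL_RATE_ARCSEC_S = 15.04

def plate_scale_arcsec_per_px(pixel_pitch_um, focal_length_mm):
    # 206265 arcseconds per radian
    return 206265.0 * (pixel_pitch_um * 1e-3) / focal_length_mm

def trail_px(exposure_s, declination_deg=0.0):
    # Star drift during the exposure, converted to pixels.
    # Trailing scales with cos(declination); equator is the worst case.
    scale = plate_scale_arcsec_per_px(PIXEL_PITCH_UM, FOCAL_LENGTH_MM)
    drift = SIDEREAL_RATE_ARCSEC_S * exposure_s * math.cos(math.radians(declination_deg))
    return drift / scale

for t in (3.2, 4.0, 5.0, 8.0):
    print(f"{t:>4}s -> ~{trail_px(t):.1f} px of trailing at the equator")
```

With these assumptions, 5s to 8s works out to roughly 1.5 to 2.3 px of trailing, which is why I considered that range an acceptable compromise.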
It looks suspiciously like the infamous "star eater" algorithm, but I expected that to happen only in bulb mode (12 bit) or with continuous (burst) shooting. Instead I seem to trigger the effect with normal shots as soon as I go beyond 3.2 seconds per shot. I tried ISO 100, ISO 640, ISO 6400, compressed and uncompressed raw, normal shot vs. timed shot vs. external trigger, and Capture One 9 vs. dcraw vs. RawDigger. All resulted in the same amount of "eaten" stars in the 4s shots compared to the 3.2s shots at otherwise identical settings. Firmware is 3.30.
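For anyone wondering what such a filter does to point sources, here is a toy sketch of the kind of neighbour-clipping idea people commonly suspect. To be clear, Sony has not published the actual algorithm; this is purely illustrative, and the threshold and neighbourhood are my own assumptions:

```python
import numpy as np

def clip_isolated_pixels(colour_plane, threshold=1.5):
    """Toy 'star eater'-style spatial filter (NOT Sony's actual code).

    Any pixel much brighter than the brightest of its same-colour
    neighbours gets pulled down to that neighbour level. A faint star
    that only lights up a single pixel looks just like a hot pixel to
    this kind of filter, so it gets suppressed.
    """
    out = colour_plane.astype(float).copy()
    padded = np.pad(out, 1, mode="edge")
    # Up/down/left/right neighbours within one colour plane
    # (i.e. two pixels apart on the full Bayer grid).
    neighbours = np.stack([
        padded[0:-2, 1:-1], padded[2:, 1:-1],
        padded[1:-1, 0:-2], padded[1:-1, 2:],
    ])
    neighbour_max = neighbours.max(axis=0)
    isolated = out > threshold * neighbour_max
    out[isolated] = neighbour_max[isolated]
    return out
```

Trailed stars that smear across several pixels survive this kind of clipping, while tight single-pixel stars do not, which matches what I see between the 3.2s and 4s frames.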
This sure does look like I need to live with the algorithm and just shoot long enough to reduce the effect. Or am I missing some magic combination of settings to avoid this rubbish filter?