Eating stars

J A C S

About the recent News article: do the Sony cameras do plain old NR for long exposures, or do they search for hot pixels and remove/average them?

Jim - what kind of images did you use to derive the frequency response for long exposures?
 
I saw NR on dark frames using "FFT" in ImageJ. By the symmetry of it, it doesn't look like it is selective.
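For anyone who wants to reproduce that check outside ImageJ, here is a minimal NumPy sketch; the `dark` array is only a stand-in for one colour plane of a real raw dark frame:

```python
import numpy as np

# Stand-in dark frame: pure white read noise around a bias level.
# Replace with one CFA colour plane of a real raw dark frame.
rng = np.random.default_rng(0)
dark = rng.normal(100.0, 2.0, size=(512, 512))

# 2D power spectrum, DC term centred.
spectrum = np.fft.fftshift(np.abs(np.fft.fft2(dark - dark.mean())))
log_spec = np.log1p(spectrum)

# Unfiltered read noise is white, so the log spectrum is flat apart
# from speckle; spatial filtering shows up as a symmetric roll-off
# toward the edges (the highest frequencies).
print(log_spec.shape, float(log_spec.mean()))
```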
 
So this should visibly affect "ordinary" images as well, not just stars (with longer exposures)?
 
I will check when I have the camera ;) I doubt it will be very visible, but resolution tests may show the nature of the filtering. I think they analyze a small cluster surrounding each pixel, and if the value of the pixel in question is an outlier, it is assigned the minimum or maximum cluster value.
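In code, my guess would look something like the sketch below; this is only my reading of a plausible filter, not Sony's actual firmware:

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def clamp_outliers(img, size=3):
    """Hypothetical filter: clamp each pixel to the range spanned
    by its surrounding cluster (centre pixel excluded)."""
    footprint = np.ones((size, size), dtype=bool)
    footprint[size // 2, size // 2] = False   # exclude the centre pixel
    nb_min = minimum_filter(img, footprint=footprint)
    nb_max = maximum_filter(img, footprint=footprint)
    # Pixels inside [nb_min, nb_max] pass through unchanged; isolated
    # spikes (hot pixels, single-pixel stars) are clamped to a neighbour value.
    return np.clip(img, nb_min, nb_max)
```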
 
Right you are, as usual.

Details of the probable algorithm here:

http://blog.kasson.com/the-last-wor...e-sony-a7rii-long-exposure-spatial-filtering/
Jim
 
I do not find this evidence conclusive. Such an algorithm makes sense, and it is probably more or less what LR does (well, with different parameters; LR kills hot pixels very well, though I am not sure about stars).
 
You are welcome to propose your own algorithm.
 
An algorithm which would provide the same power spectrum? A certain convolution filter would do it.
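For what it's worth, a genuine convolution is easy to test for: being linear and shift-invariant, it multiplies the power spectrum by a fixed |H(f)|² regardless of image content, so the spectral ratio between filtered and unfiltered frames would be the same for every input. A quick sketch with an arbitrary 3x3 box kernel:

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.normal(size=(256, 256))   # stand-in for a dark frame
h = np.ones((3, 3)) / 9.0           # arbitrary example kernel

# Circular convolution via the FFT.
H = np.fft.fft2(h, s=img.shape)
out = np.fft.ifft2(np.fft.fft2(img) * H).real

P_in = np.abs(np.fft.fft2(img)) ** 2
P_out = np.abs(np.fft.fft2(out)) ** 2

# For a convolution, P_out == |H|^2 * P_in exactly, whatever the input;
# a nonlinear outlier clamp has no such input-independent ratio.
print(float(np.abs(P_out - np.abs(H) ** 2 * P_in).max()))  # ~0 up to rounding
```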
 
Feel free to propose a specific algorithm that produces the same result. Extra points if it is good at hot pixel suppression and similar to other hot pixel suppression algorithms that other manufacturers have used, like the algorithm in the post.

Jim
 
I have not seen evidence that this algorithm produces the same result. Besides, averaging over neighboring pixels for a Bayer sensor is a bit problematic.

I do not like the argument "what else can it be". Where I am coming from, I can be as skeptical as I like, and you have to prove that this is (1) the only algorithm (2) which produces the same result, and you have done neither. Matching the power spectrum on some test image is very weak evidence. Also, pixel-peeping samples on the web do not square well with the proposed algorithm.
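To make the Bayer objection concrete: on a CFA, the nearest same-colour neighbours sit two photosites away, so any plausible version of the clamp would have to run on each colour plane separately, meaning the "cluster" actually spans 5x5 photosites on the sensor. A sketch, assuming an RGGB layout:

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def clamp_outliers_bayer(raw):
    """Hypothetical per-plane clamp for an undemosaiced RGGB frame.
    A naive 3x3 window on the mosaic would mix colour channels."""
    out = raw.astype(float)
    footprint = np.ones((3, 3), dtype=bool)
    footprint[1, 1] = False
    for dy in (0, 1):
        for dx in (0, 1):
            plane = out[dy::2, dx::2]   # one colour plane, stride 2
            nb_min = minimum_filter(plane, footprint=footprint)
            nb_max = maximum_filter(plane, footprint=footprint)
            out[dy::2, dx::2] = np.clip(plane, nb_min, nb_max)
    return out
```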
 
I will check when I have the camera ;) I doubt it will be very visible, but resolution tests may show the nature of the filtering. I think they analyze a small cluster surrounding each pixel, and if the value of the pixel in question is an outlier, it is assigned the minimum or maximum cluster value.
On that specific point, I would opine that the slanted-edge method would not be ideal for testing the impact of the star-eating algorithm on resolution.

Given my understanding of the star-eating algorithm, I would expect it to only have a significant impact on the value of a pixel if it differs significantly from its neighbours, meaning that its impact on non-outliers will be comparatively small.

Most SE implementations go to some lengths to reduce the impact of outliers on the SFR measurement, meaning that they would be fairly insensitive to the effect of the star-eater if the number of outliers is indeed small.

But nothing beats testing this idea, just to make sure. Gah --- now I have tempted myself :)
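Roughly this, for instance: a synthetic slanted edge run through the hypothetical clamp filter sketched earlier in the thread:

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

rng = np.random.default_rng(1)
y, x = np.mgrid[0:128, 0:128]
edge = (x + 0.1 * y > 64).astype(float)    # synthetic slanted edge
edge += rng.normal(0.0, 0.01, edge.shape)  # mild read noise

footprint = np.ones((3, 3), dtype=bool)
footprint[1, 1] = False                    # exclude the centre pixel
filtered = np.clip(edge,
                   minimum_filter(edge, footprint=footprint),
                   maximum_filter(edge, footprint=footprint))

# Local noise extrema do get clamped, but only by about the noise
# amplitude, so the edge profile (and hence the SFR) barely moves.
print("fraction of pixels altered:", float(np.mean(filtered != edge)))
print("max |change|:", float(np.abs(filtered - edge).max()))
```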

-F
 
So this should visibly affect "ordinary" images as well, not just stars (with longer exposures)?
That's another problem with sensors without AA filters: it is impossible to tell whether a bright pixel is a point of light or noise, because the two may record identically if the point of light falls almost completely inside a single pixel. With an AA filter, a point of light creates a multi-pixel structure, as long as it is not from a narrow-band light source that only registers well in the red or blue raw channel of a CFA.
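A toy illustration of that difference, reusing the hypothetical clamp filter from earlier in the thread (the Gaussian is only a crude stand-in for an AA filter):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter, minimum_filter

def clamp(img):
    footprint = np.ones((3, 3), dtype=bool)
    footprint[1, 1] = False
    return np.clip(img,
                   minimum_filter(img, footprint=footprint),
                   maximum_filter(img, footprint=footprint))

star = np.zeros((9, 9))
star[4, 4] = 1.0                            # point source inside one photosite
blurred = gaussian_filter(star, sigma=0.7)  # crude stand-in for an AA filter

# Without the blur, the "star" looks exactly like a hot pixel and is
# removed entirely; with it, the peak is clamped but most of the flux
# survives in the multi-pixel structure.
print("no AA, flux after clamp:", float(clamp(star).sum()))    # 0.0
print("AA,    flux after clamp:", float(clamp(blurred).sum())) # ~0.8
```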
 
Given my understanding of the star-eating algorithm, I would expect it to only have a significant impact on the value of a pixel if it differs significantly from its neighbours, meaning that its impact on non-outliers will be comparatively small.
However, it is the outliers that you want to preserve: not only faint stars but also fine textures on petals, etc.
 
I have not seen evidence that this algorithm produces the same result. Besides, averaging over neighboring pixels for a Bayer sensor is a bit problematic.
Jim provided the evidence in his post.
I do not like the argument "what else can it be". Where I am coming from, I can be as skeptical as I like, and you have to prove that this is (1) the only algorithm (2) which produces the same result, and you have done neither. Matching the power spectrum on some test image is very weak evidence. Also, pixel-peeping samples on the web do not square well with the proposed algorithm.
Jim already addressed this to my satisfaction - "It is certainly possible that it is just a coincidence that the proposed “star-eater” algorithm produces the same spectra as whatever Sony is doing in-camera, but I don’t think so. This is probably the real deal, or close enough not to matter."
 
I should note that the person who proposed the algorithm did not pick an algorithm that matched my test results. They proposed the algorithm independently, I tested it, and the results matched.

Jim
 
Jim already addressed this to my satisfaction - "It is certainly possible that it is just a coincidence that the proposed “star-eater” algorithm produces the same spectra as whatever Sony is doing in-camera, but I don’t think so. This is probably the real deal, or close enough not to matter."
Except that you cannot do this on a Bayer sensor (without getting absurd results).
 
I guess that this passes for a proof in certain circles.
 
You [or anyone] could try to falsify what is proposed here.
 
???

You are missing my point. In science, you are wrong until proven right. I do not have to "falsify" anything; whoever makes the claim must prove it. I have yet to see how this works on a Bayer sensor, and what visual effect it has compared with what can be found on the web about this effect. So far, nobody has bothered to do it. Even then, that would not be proof at all, because there might be many local adjustments looking for outliers that would produce the same or a similar FT and similar visual effects. Actually, Sony has three of them already (two firmware versions of the A7R2 and the A7R3). They all produce visually different outputs, so they cannot be the same algorithm. At least two of them are not what was proposed.

Let us say I claim that the moon is made of cheese, and as proof I present photos which make the moon look like that. You say: "that does not prove anything; it might be just a coincidence". Me: "Prove me wrong then". You: "Alright, I must admit that you are right".
 
