bclaff

I was recently reminded that DxOMark detects noise reduction during their testing and labels such ISO settings as "smoothed".
Here's a complete list of what they detected:

Canon EOS 1DX: 102400,204800
Leica M9: 320,640,1280,2560
Leica M9-P: 200,400,800,1600,2500
Leica M-E Typ 220: 400,800,1600,2500
Nikon 1 J1: 800,1600,3200
Nikon 1 J2: 800,1600,3200,6400
Nikon 1 V1: 800,1600,3200,6400
Nokia Lumia 1020: 800,1600
Nokia Lumia 1520: 800,1600,3200,4000
Olympus SP-565 UZ: 800,1600
Pentax K-01: 3200,6400,12800,25600
Pentax K10D: 800,1600
Pentax K200D: 800,1600
Pentax K20D: 3200,6400
Pentax K-30: 3200,6400,12800,25600
Pentax K-5 II: 3200,6400,12800,25600,51200
Pentax K-5 IIs: 3200,6400,12800,25600,51200
Pentax K-5: 3200,6400,12800,25600,51200
Pentax K-50: 3200,6400,12800,25600,51200
Pentax K-500: 3200,6400,12800,25600,51200
Pentax K-7: 3200,6400
Pentax K-m: 800,1600,3200
Pentax K-r: 3200,6400,12800,25600
Pentax K-S1: 12800,25600,51200
Pentax K-x: 3200,6400,12800
Pentax MX-1: 1600,3200,6400,12800
Pentax Q: 125,200,400,800,1600,3200,6400
Pentax Q10: 100,200,400,800,1600,3200,6400
Samsung EX2F: 800,1600,3200
Samsung Galaxy NX: 25600
Samsung GX-20: 1600,3200,6400
Samsung NX20: 12800
Samsung NX200: 6400,12800
Samsung NX2000: 25600
Samsung NX30: 25600
Sony DSC-RX10: 6400,12800
Sony DSC-RX100M2: 6400,12800
Sony DSC-RX100M3: 6400,12800
Sony DSC-RX100M4: 6400,12800
Sony DSC-RX100M5: 6400,12800
Sony DSC-RX10M2: 6400,12800
Sony DSC-RX10M3: 6400,12800

It's clear that some brands, or product lines within brands, are more likely to do this.

As an example, from my 2D FTs I can confirm the findings for the Nikon 1 J1:

Top row: ISO 100, 200, 400; bottom row: ISO 800, 1600, 3200, and 6400
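The spectral signature behind these 2D FTs is easy to reproduce on synthetic data. A white-noise "black frame" has an essentially flat 2D spectrum; any spatial smoothing depresses the high frequencies. A NumPy sketch (the 3x3 box blur here is just an invented stand-in for whatever NR a camera might apply, not any particular camera's algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256

# A synthetic "black frame": pure white read noise, and the same frame
# after a 3x3 box blur (a crude stand-in for in-camera spatial NR).
frame = rng.normal(0.0, 1.0, (n, n))
kernel = np.ones((3, 3)) / 9.0
smoothed = np.real(np.fft.ifft2(np.fft.fft2(frame) * np.fft.fft2(kernel, (n, n))))

def hf_lf_ratio(img):
    """Spectral power outside the central half-band over power inside it."""
    p = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    c = slice(n // 4, 3 * n // 4)        # central (low-frequency) square
    lf = p[c, c].sum()
    return (p.sum() - lf) / lf

print(hf_lf_ratio(frame))     # ~3: white noise spreads power evenly
print(hf_lf_ratio(smoothed))  # well below 1: the blur kills high frequencies
```

Unfiltered noise puts about three quarters of its power outside the central half-band (ratio ~3); a smoothed frame shows a pronounced deficit there, which is exactly the dark border visible in the FT images.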

--
Bill ( Your trusted source for independent sensor data at http://www.photonstophotos.net )
 
FYI, at Electronic Imaging 2017 I published "Refining raw pixel values using a value error model to drive texture synthesis". I don't consider this noise reduction per se because there is no smoothing, but it does decrease noise. You can improve raw data quite substantially by this method, although no pixel is ever assigned an improved value outside the error bounds of its original value. The slides and a full paper preprint are available.
 
Just to point out that the DxOMark list is by no means exhaustive.
I believe their test is not sensitive enough, so they miss some signal processing.
For example, here's the Nikon D3200 at all ISO settings:

2D FTs from black frames. Top: 100, 200, 400, 800; bottom: 1600, 3200, 6400, 12800

Although ISO 6400 and 12800 are documented as "extended", one might not expect ISO 1600 and ISO 3200 to have noise reduction (this also shows in the read noise measurements).

I hope to automate this process so that my measurements will indicate, better than DxOMark's, where signal processing is evident.
(Admitting that some stealth signal processing is technically possible :-) )
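The kind of automation described above can be sketched on synthetic data. In this toy model the camera behavior (spatial NR kicking in at ISO 1600 and above), the 3x3 blur, the ISO list, and the 0.6 threshold are all invented for illustration; a real implementation would work on actual black frames per ISO:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 128

def hf_fraction(img):
    # Fraction of spectral power outside the central half-band.
    p = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    c = slice(n // 4, 3 * n // 4)        # central (low-frequency) square
    return 1.0 - p[c, c].sum() / p.sum()

def fake_black_frame(iso):
    # Hypothetical camera: spatial NR kicks in at ISO 1600 and above.
    f = rng.normal(0.0, iso / 100.0, (n, n))
    if iso >= 1600:
        k = np.ones((3, 3)) / 9.0
        f = np.real(np.fft.ifft2(np.fft.fft2(f) * np.fft.fft2(k, (n, n))))
    return f

# White noise puts ~75% of its power in the high-frequency region;
# flag any ISO setting with a large deficit as "smoothed".
flagged = [iso for iso in (100, 200, 400, 800, 1600, 3200, 6400)
           if hf_fraction(fake_black_frame(iso)) < 0.6]
print(flagged)  # → [1600, 3200, 6400]
```

The same loop applied to real raw black frames would produce a per-camera list of "smoothed" ISO settings in the style of the DxOMark table above.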

--
Bill ( Your trusted source for independent sensor data at http://www.photonstophotos.net )
 
(Admitting that some stealth signal processing is technically possible :-) )
I saw the smiley face... but data filtering always reduces information content. Maybe it would be better to write "always affects information content". Anyway, the effect can be small. The benefit may greatly outweigh the detriment. But filtered data is always a compromise.

E. T. Jaynes wrote

"Old data, when seen in the light of new ideas, can give us an entirely new insight into a phenomenon...When a data set is mutilated (or, to use the common euphemism, ‘filtered’) by processing according to false assumptions, important information in it may be destroyed irreversibly. ... However, old data sets, if preserved unmutilated by old assumptions, may have a new lease on life when our prior information advances."

"Electrical engineers would think instead in terms of Fourier analysis and resort to ‘high-pass filters’ and ‘band-rejection filters’ to deal with trend and seasonality. Again, the philosophy is to produce a new time series (the output of the filter) which represents in some sense an estimate of what the real series would be if the contaminating effect were absent. Then choice of the ‘best’ physically realizable filter is a difficult and basically indeterminate problem; fortunately, intuition has been able to invent filters good enough to be usable if one knows in advance what kind of contamination will occur."

E. T. Jaynes, Probability Theory: The Logic of Science, Preface p. xxvii and p. 536

As far as I know, a proper sufficient statistic will always detect mathematical filtering (assuming an appropriate SNR). Solving this problem involves model selection. The DFT is a useful model. But that doesn't mean it's the only model. I don't have specific suggestions as to which models may be superior, but that doesn't mean there aren't any.
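As one concrete alternative to the DFT, the lag-1 autocorrelation of a black frame is a cheap statistic that detects spatial filtering directly: white noise has essentially zero neighbor correlation, while a moving-average filter leaves a predictable positive correlation. A sketch (the 3-tap horizontal box filter is a hypothetical NR, chosen because its lag-1 correlation is exactly 2/3 in theory):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=(256, 256))          # white-noise "black frame"

def lag1_corr(img):
    """Correlation between horizontally adjacent pixels."""
    a, b = img[:, :-1].ravel(), img[:, 1:].ravel()
    return np.corrcoef(a, b)[0, 1]

# Apply a horizontal 3-tap moving average via circular convolution.
k = np.ones((1, 3)) / 3.0
y = np.real(np.fft.ifft2(np.fft.fft2(x) * np.fft.fft2(k, x.shape)))

print(round(lag1_corr(x), 3))  # ~0: unfiltered white noise
print(round(lag1_corr(y), 3))  # ~0.667: the filter's fingerprint
```

For a known filter class, the observed lag structure even identifies which kernel was applied, not merely that filtering happened.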
 
DxO's tests are not strong enough.

The proper test would be to shoot multiple (pre-generated) noise-like frames, which should give a constant result if the camera applies no NR.
 
DxO's tests are not strong enough.

The proper test would be to shoot multiple (pre-generated) noise-like frames, which should give a constant result if the camera applies no NR.
That sounds sort-of reasonable. However, in this context, what is noise and what is noise reduction? All you'd be testing is the camera's ability to distinguish your noise-like scene from its own actual noise sources.

I believe nearly all high-end cameras do analog noise reduction. My research, referenced above, apparently reduces noise, but what it really does is to enhance values within their error bounds by texture synthesis. I don't think either of those has significant undesirable properties.

I believe that you're really just looking for frequency-domain anomalies. It is quite possible to make a noise reduction algorithm that actually repairs such anomalies. In fact, that is fairly common in the form of deliberately adding noise at the ADC. In sum, I'm not sure I get the point of this...
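Deliberately adding noise at the ADC is the classic dither technique. A toy illustration (the LSB size and the ramp input are arbitrary choices, not any camera's ADC): quantizing a slowly varying signal without dither produces a highly structured sawtooth error, while non-subtractive uniform dither of ±0.5 LSB decorrelates the error:

```python
import numpy as np

rng = np.random.default_rng(3)
signal = np.linspace(0.0, 4.0, 10000)   # slow ramp spanning 4 LSB (LSB = 1.0)

plain_err = np.round(signal) - signal   # bare quantizer: sawtooth error
dith = rng.uniform(-0.5, 0.5, signal.size)
dith_err = np.round(signal + dith) - signal   # non-subtractive uniform dither

def lag1(e):
    # Sample-to-sample correlation of the quantization error.
    return np.corrcoef(e[:-1], e[1:])[0, 1]

print(round(lag1(plain_err), 3))  # ~1: error tracks the signal (structured)
print(round(lag1(dith_err), 3))   # ~0: dither whitens the error
```

The dithered error looks like ordinary white noise in the frequency domain, which is exactly why this sort of processing is hard to flag as an anomaly.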
 
Depends which NR. All do correlated double sampling (which is "legal"). However, some also do spatial or temporal filtering.
 
What is wrong with the removal of errors?

Best regards

Holger
 
What is wrong with the removal of errors?
I said nothing about any issue with "removal of errors"; this is widespread noise reduction.
My idea about noise is that it is an error of the signal. And it is a good idea to remove errors as soon as possible.

I know that the list is without judging - but many people regard changes at the RAW level as cheating.

I would be interested, if there are different ways to remove the noise.

I remember times long ago when we had the Panasonic FZ30, which was a great bridge camera. I had the Olympus C-5050 in those days and thought: let's wait for the follow-up model of the FZ30 and that will be your camera.

The next Panasonic model was the FZ50 - and there were big advertisements about how effectively they managed to get rid of the noise - and when I saw the first photos on the internet I found that they removed noise simply by smearing the signal. At 100% view the photos looked like paintings. I was disappointed. Later I found something similar in photos from the Canon 5D Mark II, the well-known 21MP FF camera. For a special technical task I needed a camera with maximum resolution and found that the 16MP Pentax APS-C - the Pentax K-5 - did a better job for me than the camera with much more noise but a destructive way of noise reduction.

The Pentax K-5 was known for the clean signal of its RAW files - and we know they did noise reduction at the RAW level - thus, it would be of interest not just whether there is some noise reduction, but also how destructive it is with regard to the information in the signal.

Best regards

Holger
Regards,

--
Bill ( Your trusted source for independent sensor data at http://www.photonstophotos.net )
 
Holger Bargen wrote: The Pentax K-5 was known for the clean signal of its RAW files - and we know they did noise reduction at the RAW level ...
Hi Holger,

I don't own a K5, but this is the first time I've heard that it routinely performs noise reduction before writing data to the raw file at base/low ISO with all relevant in-camera settings off.

Jack
 
Holger Bargen wrote: The Pentax K-5 was known for the clean signal of its RAW files - and we know they did noise reduction at the RAW level ...
Hi Holger,

I don't own a K5, but this is the first time I've heard that it routinely performs noise reduction before writing data to the raw file at base/low ISO with all relevant in-camera settings off.
It is only for high ISO - and I liked it as it made post-processing very easy.
 
What is wrong with the removal of errors?
I said nothing about any issue with "removal of errors"; this is widespread noise reduction.
My idea about noise is that it is an error of the signal. And it is a good idea to remove errors as soon as possible.

I know that the list is without judging - but many people regard changes at the RAW level as cheating.
You are right, statistically. :-) Roughly speaking, noise kills information above certain frequencies (this can be made precise), and removing that component of the noise removes no information. So there is nothing wrong with doing it at the RAW level, as long as the demosaicing algorithm "knows" that the RAW is altered and processes it accordingly. This is not a precise statement, but it is good enough for all practical purposes like ... taking photos.
 
With a proper statistic, removal of errors is completely unnecessary.

If there really are errors, then one must know a great deal about the errors. This prior information can be used to compute a marginal likelihood function. This is in no way related to discarding outliers. The unexpected data are included, but their contributions are marginalized in a statistically rigorous and reproducible manner. Known error sources do not increase parameter-estimate uncertainties.
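A minimal numerical sketch of this kind of marginalization, on an invented data set: each datum gets a mixture likelihood - a narrow "good" component plus a broad "error source" component - and the posterior for the location parameter is computed on a grid. The mixture weight p_out and error width sigma_out are assumptions for illustration, not anything from the thread:

```python
import numpy as np

rng = np.random.default_rng(4)
data = np.concatenate([rng.normal(5.0, 1.0, 50), [40.0]])  # one gross outlier

mu = np.linspace(0.0, 50.0, 5001)   # flat prior on a wide grid
sigma = 1.0                         # "good" measurement noise
p_out, sigma_out = 0.05, 20.0       # assumed error-source model

def norm_pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

# Marginal likelihood per datum: good component + broad error component.
like = ((1.0 - p_out) * norm_pdf(data[:, None], mu[None, :], sigma)
        + p_out * norm_pdf(data[:, None], mu[None, :], sigma_out))
log_post = np.log(like).sum(axis=0)
post = np.exp(log_post - log_post.max())
post /= post.sum()
post_mean = (mu * post).sum()

print(data.mean())   # pulled well above 5 by the outlier
print(post_mean)     # stays near 5: the outlier is down-weighted, not discarded
```

No datum is removed; the broad component simply absorbs the outlier's influence, so the estimate of mu is barely perturbed.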

If there is any chance they are not from error sources, then those data contain information. By information I mean authentic empirical results. I prefer not to remove information... especially if the information is unexpected. Unexpected information is not necessarily erroneous.

A proper statistic minimizes the risk that actual information is mis-attributed to error.
 
...My idea about noise is that it is an error of the signal. And it is a good idea to remove errors as soon as possible.
This is certainly an accepted definition of noise.

Another view is that the signal's purpose is to compute parameter estimates from the model for the data. Assuming the model maps onto the data in a one-to-one mathematical relationship, the parameter estimates' uncertainty is determined by the noise.

Often the model for the data contains one or more models for the noise. Including noise in the model decreases the parameter estimates' uncertainties.

The noise is not necessarily erroneous. It is empirical.
 
...My idea about noise is that it is an error of the signal. And it is a good idea to remove errors as soon as possible.
This is certainly an accepted definition of noise.

Another view is that the signal's purpose is to compute parameter estimates from the model for the data. Assuming the model maps onto the data in a one-to-one mathematical relationship, the parameter estimates' uncertainty is determined by the noise.
I know there is something like a "never remove outliers" rule. But sometimes outliers may disturb the model so much that the signal can hardly be detected, and removing the outliers gives you a much clearer view of the data.

Another aspect could be the quality of the noise. If the noise consists of single white pixels, they can easily be detected - and easily removed. All other post-processing steps will work much better without outliers. Just think of sharpening routines, which can produce horrible patterns if they try to sharpen the noise (I have some experience with a deconvolution program that needs a smooth signal).

If I have coloured noise, and noise reduction itself is an error-prone process, it would be better not to touch the noise in the RAW and to wait for more powerful post-processing tools to get rid of it.

Best regards

Holger
 
Holger Bargen wrote: The Pentax K-5 was known for the clean signal of its RAW files - and we know they did noise reduction at the RAW level ...
Hi Holger,

I don't own a K5, but this is the first time I've heard that it routinely performs noise reduction before writing data to the raw file at base/low ISO with all relevant in-camera settings off.
It is only for high ISO - and I liked it as it made post-processing very easy.
Ah, OK; in that case, virtually every digital camera does that past a certain ISO, some lower, some higher.

Jack
 
Holger Bargen wrote: The Pentax K-5 was known for the clean signal of its RAW files - and we know they did noise reduction at the RAW level ...
Hi Holger,

I don't own a K5, but this is the first time I've heard that it routinely performs noise reduction before writing data to the raw file at base/low ISO with all relevant in-camera settings off.
It is only for high ISO - and I liked it as it made post-processing very easy.
Ah, ok, in that case virtually every digital camera does that past a certain ISO, some lower some higher.
I don't have numbers right in front of me; it's common, but I suspect that the majority of cameras do not do it even at their highest ISO settings.

Regards,
 
