Then his tailor would thrice clothe him fakely.

Perhaps the Emperor should consider wearing three layers of clothes instead ;-)
If one of the cameras was a 5D, the test is totally invalid. Putting a large-sensor camera in the mix is like mixing apples and oranges. The whole idea would be to keep the sensor size constant and vary the "sensel" size; using a 5D would vary the heck out of it, and I am sure Petteri would not make so gross a mistake. Aarif, after all, is a very sophisticated photographer who is using a Canon 5D, as well as a Minolta 7D and Sony A100 and, at least by reputation, the other 10MP cameras, and knows full well that the 5D's pixels are quite large and of a totally different class than the yucky 10MP sensors this thread is all about. He probably knows Aarif's reputation and would not insult him with such a ringer.

Well, I count 4 people who guessed at your test, so the sample size is a bit small. However, everyone agreed that the last image was from the 8MP sensor. The votes were split 50% on the 12MP sensor. If the 12MP camera was the 5D, the fact that it has a huge sensor means that its pixel density is about the same as the density on the 6MP, which would explain the confusion, or no?
I'm sure that Phil Askey on this site would be interested in using such a tool in his reviews, as it appears that NR is becoming more and more of a factor in the overall IQ rating, and the "standard" standard-deviation noise measurements no longer really apply once the test patches have been "smudged" by it. Since one generally has control over how much NR is applied when given raw image files, the tool mainly applies to JPEGs, but it can also be applied to the output of raw converters to determine how much detail is lost versus how much the standard deviation of noise in flat textured patches is reduced (a rough sketch of that comparison appears below).

As far as I can determine he is not interested in any such approach. I think most of Phil's bread must be buttered by JPEG people. I have suggested these same basic ideas to him (replying directly to his posts) and never gotten a reply. There are a few magazines that do do more serious tests, testing the basic sensing capabilities of cameras. But I think if you want to be really popular, as DPReview is succeeding at being, you have to appeal to the JPEG masses. Personally, I do not see why that should rule out a more serious, thoughtful look at actual performance. He just does not seem interested. Not enough butter? It is hard to get an industry to change course if you cannot even get the basic tests published in highly visible places. I just cannot understand what Phil has to gain by ignoring such basic, fundamental comparisons. Technically involved, yes, but fundamental, and not that much more so than his other tests.
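Purely to illustrate the kind of trade-off being described (detail lost versus flat-patch noise reduced), here is a minimal sketch. The Gaussian blur is only a stand-in for an unknown NR step, and the synthetic patches and all names are my own assumptions, not anything from the tool discussed in this thread.

```python
# Minimal sketch: how much flat-patch noise drops vs. how much detail is lost
# as a stand-in "NR" step (a Gaussian blur, purely illustrative) gets stronger.
import numpy as np
from scipy.ndimage import gaussian_filter

def detail_energy(patch):
    """Crude detail measure: mean gradient magnitude of the patch."""
    gy, gx = np.gradient(patch.astype(float))
    return float(np.mean(np.hypot(gx, gy)))

rng = np.random.default_rng(0)
noise = rng.normal(0.0, 4.0, (128, 128))

# A flat grey patch and a low-contrast detailed patch, with the same added noise.
flat = 128.0 + noise
yy, xx = np.mgrid[0:128, 0:128]
detailed = 128.0 + 6.0 * np.sin(1.3 * xx) * np.sin(1.1 * yy) + noise

for sigma in (0.0, 0.8, 1.6):  # increasing amounts of "NR"
    f = gaussian_filter(flat, sigma) if sigma else flat
    d = gaussian_filter(detailed, sigma) if sigma else detailed
    print(f"NR sigma={sigma:.1f}  flat-patch std={f.std():.2f}  "
          f"detail energy={detail_energy(d):.2f}")
```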
I did look at Dalibor's work, and although it is impressive as far as the user interface and such goes, and is no doubt tuned for speed of conversion, he only uses various algorithms commonly available in the public domain and doesn't really break any new ground.

If you want to see a program that does conversions using a lot of different Bayer methods, then look at Dalibor's converter for Minolta raws:
http://dalibor.cz/minolta/index.htm
Possibly my thinking on this is too fuzzy to argue, but it strikes me that the method needs to be able to distinguish signal from noise at the overlapping spatial frequencies. Some of the more visually annoying image noise, especially after some NR processes, can be quite low contrast and more than a few pixels in extent: the same spatial frequencies that convey image sharpness. Perhaps I'm missing something in your method, in which case, sorry.

Ken and chuxter, I've been working on a mathematical tool that would determine how much NR/loss of detail there is in a given image. I think that, in a similar way that noise measurements are made on a flat textured (hopefully zero natural variation) test patch, a determination could be made on a finely detailed, low contrast test patch. I have run images of such patches through the Discrete Cosine Transform algorithm (DCT, as used in JPEG lossy compression to apply different quantizations to different "frequencies" of detail). Given that the components that come out of the DCT algorithm correspond to different spatial frequencies, it seems that comparing the magnitudes of the higher-frequency components to the middle-"frequency" components does correlate with the amount of NR/loss of detail.
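The post above describes the idea but no implementation; below is a rough sketch of one way such a high-to-mid "frequency" ratio could be computed with a 2-D DCT. The band boundaries, patch size, and the blur used to mimic NR are arbitrary illustrative choices of mine, not the poster's actual tool.

```python
# Sketch of the high/mid "frequency" comparison: take the 2-D DCT of a patch
# and compare mean coefficient magnitudes in a high band vs. a middle band.
# The band edges (0.3-0.6 and 0.7-1.0 of the normalised radius) are arbitrary.
import numpy as np
from scipy.fft import dctn
from scipy.ndimage import gaussian_filter

def dct_band_ratio(patch, mid=(0.3, 0.6), high=(0.7, 1.0)):
    patch = patch.astype(float) - patch.mean()
    coeffs = np.abs(dctn(patch, norm='ortho'))
    h, w = coeffs.shape
    fy = np.arange(h)[:, None] / (h - 1)   # normalised vertical "frequency"
    fx = np.arange(w)[None, :] / (w - 1)   # normalised horizontal "frequency"
    r = np.hypot(fy, fx) / np.sqrt(2.0)    # 0 = DC, 1 = highest diagonal
    mid_mean = coeffs[(r >= mid[0]) & (r < mid[1])].mean()
    high_mean = coeffs[(r >= high[0]) & (r <= high[1])].mean()
    return high_mean / mid_mean

# Heavier NR (faked here with a blur) should push the ratio down.
rng = np.random.default_rng(1)
patch = 128.0 + rng.normal(0.0, 5.0, (64, 64))
print("no NR  :", round(dct_band_ratio(patch), 3))
print("blurred:", round(dct_band_ratio(gaussian_filter(patch, 1.0)), 3))
```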
I take it that you normalise the results to the lowest ISO performance and work from there. I can imagine that is good with SLRs, where the lowest ISO performance is already very good. I have a smaller-sensor (2/3") camera that has enough noise at the lowest ISO to be a poor choice for normalisation, so I would not know how to compare those cameras.

I have found this is an adequate tool for determining how much NR/loss of detail increases with increasing ISO sensitivity in the same camera, but unfortunately the numbers obtained aren't absolute...
But how do you account for anti-aliasing and the low-pass function of the lens, which will vary from model to model? That is why I wondered about using a known pattern containing many spatial frequencies at a range of contrasts, to give the best chance of finding a method that can separate out MTF (not exposure dependent) and frequency-dependent noise (exposure dependent) at the same time. I assume that noise after NR is no longer white, so it is hard to separate them by any simple method.

...in comparing different cameras, due to the variation in how much averaging the Bayer interpolation algorithms do across wider ranges of pixels, although I suppose that could be considered to be a form of NR with potential loss of fine low contrast detail (i.e. Fuji processing on "angled" sensors does quite a lot of this). I suppose that I should determine the ratio of frequencies for a very simple, maximum-detail Bayer interpolation algorithm and use that as a "standard" for rating all other camera models.
Why are they different crops? If you're trying to make the point that these things are indistinguishable at the pixel level, then showing us such varied scenes, well, hides all the differences in how things are rendered at the pixel level.

The following are all actual-pixels crops converted with default settings from RAW files. One is shot with a 6MP dSLR, one with an 8MP one, and a third with a 12MP one. Care to take a guess as to which is which?
I don't know if I'm alone, but from what I've seen in all the new dSLRs the images lack life and are flat.
With around a 1.5 crop factor, the 6MP sensors to me deliver the best images (and I'm not talking about noise), full of life, but when they pack more and more pixels into the same size sensors it just deteriorates.
I remember a while back looking at images from the Canon 10D and thinking to myself that they looked better than the 20D's.
I have both a 6MP and a 10MP dSLR and the 6MP wins easily for me. Does anyone else feel the same?
Regards
--
You're welcome to visit my favorite Gallery
http://www.pbase.com/aarif/favorites
I agree. The so-called "lifelessness" of an image has a lot more to do with the shooter than the sensor.

http://www.slack.co.uk
His photographs have been taken with many different cameras, including E10, D100, D1x, D2x, D200, D50, E1, E330, Kodak 14n and nx, and various compact digicams.
The remarkable thing about his shots is that they all look very similar in presentation. His particular 'look' and workflow survive the transition from camera to camera. His 10MP D200 shots are no different.
This tells me that a great deal of the supposed characteristics unique to a camera or sensor are NOT the fixed things some people suggest they are.
Well, the OP probably saw the so-called lifeless 10MP images on the web, which were no doubt post-processed and downsampled as well. I'm betting that Jon Slack's work looks great in print too, regardless of the camera and sensor used.

Post-processing and downsampling images to 800×600 hides most of the differences between cameras.
Gary,

Well, the OP probably saw the so-called lifeless 10MP images on the web, which were no doubt post-processed and downsampled as well.
My technique currently works best when the image has noise, which provides a high-frequency pattern that is not limited by AA filters or lens glass. I use as my standard a theoretical camera that has only Gaussian noise, and treat the Bayer pattern with a minimally smoothing Bayer interpolation to obtain RGB values. I then pass that through the DCT conversion to get "standard" ratios between the highest "frequencies" and lower "frequencies" (see the sketch below).

Possibly my thinking on this is too fuzzy to argue, but it strikes me that the method needs to be able to distinguish signal from noise at the overlapping spatial frequencies. Some of the more visually annoying image noise, especially after some NR processes, can be quite low contrast and more than a few pixels in extent: the same spatial frequencies that convey image sharpness. Perhaps I'm missing something in your method, in which case, sorry.

Ken and chuxter, I've been working on a mathematical tool that would determine how much NR/loss of detail there is in a given image. I think that, in a similar way that noise measurements are made on a flat textured (hopefully zero natural variation) test patch, a determination could be made on a finely detailed, low contrast test patch. I have run images of such patches through the Discrete Cosine Transform algorithm (DCT, as used in JPEG lossy compression to apply different quantizations to different "frequencies" of detail). Given that the components that come out of the DCT algorithm correspond to different spatial frequencies, it seems that comparing the magnitudes of the higher-frequency components to the middle-"frequency" components does correlate with the amount of NR/loss of detail.
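Here is a sketch of how the "standard" described above might be built, reusing the illustrative bilinear_demosaic_rggb() and dct_band_ratio() helpers from the earlier sketches: a theoretical camera whose output is nothing but Gaussian noise on a flat field, demosaiced minimally and then run through the DCT band ratio. This is my reading of the post, not the author's actual code.

```python
# Sketch of the proposed "standard": a theoretical camera producing a flat grey
# field plus Gaussian noise, demosaiced with the minimal bilinear interpolation
# and then measured with the DCT band ratio.  Reuses the illustrative helpers
# bilinear_demosaic_rggb() and dct_band_ratio() defined in the earlier sketches.
import numpy as np

def reference_ratio(size=256, noise_sigma=4.0, seed=0):
    rng = np.random.default_rng(seed)
    mosaic = 128.0 + rng.normal(0.0, noise_sigma, (size, size))  # flat field + noise
    rgb = bilinear_demosaic_rggb(mosaic)   # minimal smoothing only
    green = rgb[:, :, 1]                   # green channel carries most detail
    return dct_band_ratio(green)

# A camera's measured ratio divided by this value would give a rough,
# camera-relative index of how much high-frequency content survives.
print("reference high/mid ratio:", round(reference_ratio(), 3))
```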
As you say, many new high-MP compact cameras have a lot of noise, even at low ISO sensitivities, but that helps plot trends of reduced higher frequencies as sensitivity and NR go up. I think that ultimately I will need a very high frequency, low contrast test target in order to try to determine how much low contrast detail is lost when noise isn't very detectable. For instance, in all of the 10MP compact cameras for which I have seen reviews to date, I see a considerable loss of low contrast detail, even in the lowest ISO shots. Note that Simon now includes comparisons of hair shots at various ISO sensitivities for new small-sensor compact cameras; I would like to see Phil do this or something similar for DSLRs.

I take it that you normalise the results to the lowest ISO performance and work from there. I can imagine that is good with SLRs, where the lowest ISO performance is already very good. I have a smaller-sensor (2/3") camera that has enough noise at the lowest ISO to be a poor choice for normalisation, so I would not know how to compare those cameras.

I have found this is an adequate tool for determining how much NR/loss of detail increases with increasing ISO sensitivity in the same camera, but unfortunately the numbers obtained aren't absolute...
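If I follow the exchange, the per-ISO ratios are expressed relative to the camera's own lowest ISO setting. A trivial sketch of that normalisation step is below; the numbers are placeholders for illustration only, not measurements from any camera.

```python
# Sketch of normalising per-ISO high/mid ratios to the camera's lowest ISO.
# The values below are placeholders, not measurements from any real camera.
ratios = {100: 0.92, 200: 0.88, 400: 0.79, 800: 0.61, 1600: 0.43}

base_iso = min(ratios)
base = ratios[base_iso]
for iso in sorted(ratios):
    rel = ratios[iso] / base
    print(f"ISO {iso:5d}: {rel:.2f} of the ISO {base_iso} high-frequency content")
```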
But AA filters and the low-pass function of lenses will affect high contrast detail too, so by comparing high contrast detail to low contrast detail, one can see how much low contrast detail is being lost to NR. As to separating frequencies, the DCT responds to any loss of high frequency detail, so it is a valid tool (a sketch of this comparison appears below).

But how do you account for anti-aliasing and the low-pass function of the lens, which will vary from model to model? That is why I wondered about using a known pattern containing many spatial frequencies at a range of contrasts, to give the best chance of finding a method that can separate out MTF (not exposure dependent) and frequency-dependent noise (exposure dependent) at the same time. I assume that noise after NR is no longer white, so it is hard to separate them by any simple method.
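Here is a sketch of the comparison mentioned in the reply, under the assumption that the lens and AA filter attenuate high- and low-contrast detail in roughly the same proportion while NR hits low-contrast detail hardest. It reuses the illustrative dct_band_ratio() helper from the earlier sketch; the test pattern and contrast levels are arbitrary choices of mine.

```python
# Sketch: compare how much high-frequency content survives in a low-contrast
# patch vs. a high-contrast patch of the same pattern.  Optics (lens + AA
# filter) should attenuate both in roughly the same proportion, so a drop in
# this index relative to an unprocessed baseline points at NR rather than
# optics.  Reuses the illustrative dct_band_ratio() helper from earlier.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(3)
yy, xx = np.mgrid[0:64, 0:64]
pattern = np.sin(1.4 * xx) * np.sin(1.2 * yy)
noise = rng.normal(0.0, 2.0, pattern.shape)
high_c = 128.0 + 60.0 * pattern + noise    # high-contrast version
low_c  = 128.0 +  6.0 * pattern + noise    # low-contrast version

def nr_index(low_patch, high_patch):
    """Relative survival of high-frequency content in the low-contrast patch."""
    return dct_band_ratio(low_patch) / dct_band_ratio(high_patch)

# A uniform blur (a stand-in for optics acting on both patches alike) should
# move both ratios together; contrast-dependent NR would push the index down.
print("unprocessed :", round(nr_index(low_c, high_c), 3))
print("uniform blur:", round(nr_index(gaussian_filter(low_c, 1.0),
                                       gaussian_filter(high_c, 1.0)), 3))
```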
Yes, with a standard test chart one would have to have both high contrast and low contrast patches, in order to be able to plot the loss of low contrast detail as compared to the high contrast detail.

Another approach could be to calibrate the transfer functions of the lens and anti-aliasing filter in a very low-noise situation (strong exposure of a low contrast target at the lowest ISO), followed by a noise measurement in a weaker exposure. Is that really your method? Actually, perhaps that is not so hard: two different exposures of a low contrast test chart could suffice, unless the NR were found to be adaptive! Perhaps you could try that with your DCT method, if you did not do that already!
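Ken's two-exposure idea could be probed along these lines: if the camera's NR is not adaptive, the high/mid ratio measured on a strong and a weak exposure of the same low-contrast chart should roughly track each other. A sketch only, again leaning on the illustrative dct_band_ratio() helper; the 15% tolerance is an arbitrary number of mine.

```python
# Sketch of the two-exposure check suggested above: if NR is not adaptive, the
# DCT band ratio from a strong and a weak exposure of the same low-contrast
# chart should roughly agree; a markedly lower ratio on the weak exposure hints
# at adaptive NR.  Reuses the illustrative dct_band_ratio() helper; the
# tolerance of 15% is an arbitrary choice.
def adaptive_nr_check(strong_exposure_patch, weak_exposure_patch, tolerance=0.15):
    r_strong = dct_band_ratio(strong_exposure_patch)
    r_weak = dct_band_ratio(weak_exposure_patch)
    relative_drop = 1.0 - r_weak / r_strong
    return relative_drop, relative_drop > tolerance   # True => NR looks adaptive
```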
Well, there shouldn't be licensing problems with the DCT, which is pretty old math. And I don't plan to offer a commercial product to analyse the amount of NR; I just want to make a tool available that can help us assess whether we are really making any gains in low contrast resolution with these new higher-MP cameras.

I'm not very convinced that someone would use any of these methods seriously; there may even be licensing problems for "commercial" use if (as I suspect is likely) none of this is original!
I have no plan to do any real work on this, but good luck to you!
Ken