Canon G1 X gets DxOMarked. Total score of 60.

As I said, one would not notice a difference in image quality between any of these cameras when shooting at lower ISOs and processing the images with normal tone curves.
Regards, GordonBGood
I believe DxOMark's low ISO results are reliable.

But their high ISO results can be discarded, because their tests fail to uncover some in-camera RAW processing (the most recent example is the V1/J1 sensor; luckily DPReview picked up on this).
Sometimes they find noise reduction in the files, sometimes they miss it. However, one can also analyze the numbers oneself. For example, if the DR curve doesn't drop by about (actually, typically a bit less than) one stop for each doubling of ISO, there has to be a reason for that. With Canon's low ISO scores, for example, the likely cause is their analog-to-digital conversion. At higher ISOs, significant deviations imply noise reduction.
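That sanity check takes only a few lines to sketch. The per-ISO DR readings below are invented for illustration (not from any real DxOMark graph); the idea is just to flag ISO steps where DR falls by much less than the expected ~1 stop:

```python
# Hypothetical per-ISO dynamic-range readings in stops, as one might read
# them off a DR-vs-ISO graph.  These values are made up for illustration.
dr_by_iso = {100: 11.9, 200: 11.2, 400: 10.4, 800: 9.5, 1600: 9.2}

isos = sorted(dr_by_iso)
for lo, hi in zip(isos, isos[1:]):
    drop = dr_by_iso[lo] - dr_by_iso[hi]
    # An ideal noise-limited sensor loses roughly one stop of DR per ISO
    # doubling; a much smaller drop at high ISO hints at raw-level smoothing.
    verdict = "suspicious" if drop < 0.5 else "plausible"
    print(f"ISO {lo:>4} -> {hi:>4}: DR drops {drop:.1f} stops ({verdict})")
```

With these made-up numbers the ISO 800 to 1600 step is the one a reader would want to question.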

But discarding all their measurements from a certain ISO range because they've missed noise reduction in some cases is not, in my opinion, the best way of handling the information they give. Better to think about the data than to simply trust (or ignore) it blindly.
The question is: what is noise reduction, and why is it bad? DxO looks for noise reduction by looking for inter-pixel correlation. If there is no inter-pixel correlation, there is no smearing of detail between pixels, and it is not something to be worried about. If some clever manufacturers have found a way of suppressing noise without introducing smearing between pixels, then why is that anything but a good thing?
--
Bob
 
The question is: what is noise reduction, and why is it bad? DxO looks for noise reduction by looking for inter-pixel correlation. If there is no inter-pixel correlation, there is no smearing of detail between pixels, and it is not something to be worried about. If some clever manufacturers have found a way of suppressing noise without introducing smearing between pixels, then why is that anything but a good thing?
True, but the V1/J1 still got caught by DPReview:
http://www.dpreview.com/reviews/nikonv1j1/page13.asp
 
The silliness is in the presentation of a single score to appeal to the simpletons who can only assess things on the basis of a single number. The DxO data provides quite enough information for those who know what their own priorities are to pass their own judgement.
If their results are complex, then why reduce the information to a single score? There are many critics (not just of cameras, but of lenses, audio gear, etc.) out there who refuse to reduce their findings to a single number; they instead choose to lay bare all their findings. To simplify something that cannot be simplified is tantamount to deliberate distortion of information.
 
That's not true. They note it in almost every Pentax review, and they DID notice it in the V1/J1 results as well. Just check the SNR graph and notice the open instead of solid dots at higher ISOs, which indicates smoothing.
Indeed. But they paid NO heed to it. :)

They continued to score and rank the V1/J1 sensor when they should instead have discarded all the INVALID low light ISO results and not ranked the sensor at all. They jolly well know those are PROCESSED results which do not in any way reflect the sensor's true performance.

If that is not poor (I must say, highly biased) reporting, I do not know what is.
The silliness is in the presentation of a single score to appeal to the simpletons who can only assess things on the basis of a single number. The DxO data provides quite enough information for those who know what their own priorities are to pass their own judgement.
--
Bob
Actually Bob...it's DxO that ranks the cameras...not us. It's DxO that ranks the Nikon D3100 not only above the Canon 5D2 and Canon 7D...but higher than some Medium Format Digital Backs. Now, if you're claiming that "those who know better" can go through the numbers to pass judgement, I'm all ears as to why the Nikon D3100 is better than the 5D2, 7D or MFDBs.

In fact, I'd love to see this in the real world, in a photograph. Why don't we take an image from both the D3100 and 5D2, or the D3100 and the Hassy MFDB, produce 20x30 prints at base ISO and at max ISO, and have a group of people pick which print is better.

Are you suggesting that everyone will choose the D3100?

That is why DxO is a joke.
 
The question is: what is noise reduction, and why is it bad? DxO looks for noise reduction by looking for inter-pixel correlation. If there is no inter-pixel correlation, there is no smearing of detail between pixels, and it is not something to be worried about. If some clever manufacturers have found a way of suppressing noise without introducing smearing between pixels, then why is that anything but a good thing?
True, but the V1/J1 still got caught by DPReview:
http://www.dpreview.com/reviews/nikonv1j1/page13.asp
They got 'caught' by DxO too. Your point is?
--
Bob
 
The silliness is in the presentation of a single score to appeal to the simpletons who can only assess things on the basis of a single number. The DxO data provides quite enough information for those who know what their own priorities are to pass their own judgement.
If their results are complex, then why reduce the information into a single score?
To appeal to simpletons, as I said.
There are many critics (not just of cameras, but of lenses, audio gear, etc.) out there who refuse to reduce their findings to a single number; they instead choose to lay bare all their findings. To simplify something that cannot be simplified is tantamount to deliberate distortion of information.
It's an unwise choice. The simpletons still argue, and it discredits the rest of the data in the eyes of others.
--
Bob
 
True, but the V1/J1 still got caught by DPReview:
http://www.dpreview.com/reviews/nikonv1j1/page13.asp
They got 'caught' by DxO too. Your point is?
Then DxOMark, which supposedly tries to evaluate bare SENSOR performance (NOT post-processing abilities), should have discarded all these high ISO results. At the very least, no low light ISO score should be awarded in such instances. To award a score is to deliberately mislead their readers, especially since DxOMark knows it is no longer evaluating bare sensor performance but the manufacturers' post-processing abilities.
 
That's not true. They note it in almost every Pentax review, and they DID notice it in the V1/J1 results as well. Just check the SNR graph and notice the open instead of solid dots at higher ISOs, which indicates smoothing.
Indeed. But they paid NO heed to it. :)

They continued to score and rank the V1/J1 sensor when they should instead have discarded all the INVALID low light ISO results and not ranked the sensor at all. They jolly well know those are PROCESSED results which do not in any way reflect the sensor's true performance.

If that is not poor (I must say, highly biased) reporting, I do not know what is.
The silliness is in the presentation of a single score to appeal to the simpletons who can only assess things on the basis of a single number. The DxO data provides quite enough information for those who know what their own priorities are to pass their own judgement.
--
Bob
Actually Bob...it's DxO that ranks the cameras...not us. It's DxO that ranks the Nikon D3100 not only above the Canon 5D2 and Canon 7D...but higher than some Medium Format Digital Backs. Now, if you're claiming that "those who know better" can go through the numbers to pass judgement, I'm all ears as to why the Nikon D3100 is better than the 5D2, 7D or MFDBs.
I never said anything else. I think DxO is very silly to try to appeal to those incapable of understanding its measurements by condensing them into a single score. However, how that score is arrived at is made plain, so it is easy to see. As to why the D3100 scores higher (which isn't the same thing as 'is better'), that's straightforward:

Vs the 7D, the D3100 (67) 'beats' the 7D (66) by one mark. That one mark essentially comes from colour depth, and Canon's use of a yellow filter rather than red in its CFA, to gain a few percent efficiency, is well known. That is the cost. Whether you agree that DxO should place so much weight on colour depth is a different issue, but for many it's an important one.

Vs the 5DII (79), DxO rates the D3100 as 'worse', in the sense that it gives it a lower score. What were you expecting?

Vs the MFDBs, it depends which one you are talking about, but I couldn't find one that DxO scores lower than the D3100. Perhaps you could say which one you were thinking of.
In fact, I'd love to see this in the real world, in a photograph. Why don't we take an image from both the D3100 and 5D2, or the D3100 and the Hassy MFDB, produce 20x30 prints at base ISO and at max ISO, and have a group of people pick which print is better.
Probably the ones from the 5DII and the Hassy, as the DxO scores suggest they would be.
Are you suggesting that everyone will choose the D3100?

That is why DxO is a joke.
Except that they don't say what you say they say, so who's the joker, the site that gives the 5DII and MFDB a higher score than the D3100, or the person who says that they don't?
--
Bob
 
True, but the V1/J1 still got caught by DPReview:
http://www.dpreview.com/reviews/nikonv1j1/page13.asp
They got 'caught' by DxO too. Your point is?
Then, DXOMark which supposedly tries to evaluate bare SENSOR performance (NOT post-processing abilities) should have discarded all these high ISO results. At the very least, no low light ISO score should be awarded in such instances. To award a score is to deliberately mislead their readers especially since DXOMark is aware they are no longer evaluating bare sensor performance but manufacturers' post-processing abilities.
How would you evaluate 'bare sensor performance'?
--
Bob
 
The G1 X has Canon's latest sensor, an improvement over the previous one (last seen in the T3i), yet they give it worse numbers all around? The image sensor is 95% of the height of Canon's APS-C sensor. It is essentially their APS-C sensor with the sides cropped off to make it 4:3 instead of 3:2, so it's not really a smaller sensor, just a smaller form factor. It might be that DxO only tested JPEGs, or used a pre-production unit with settings not fully fine-tuned.
Its EV difference from FF is -1.72 stops, and Canon's APS-C is -1.38 stops, so in making high ISO comparisons, if they're on the same technology, there should be about 0.34 stops between them.
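Those stop figures follow from the sensor areas. Here is the arithmetic, using nominal published dimensions (approximate marketing figures, so the second decimal may differ slightly from Riley's numbers):

```python
import math

# Nominal sensor dimensions in mm (approximate marketing figures).
sensors = {
    "full frame":  (36.0, 24.0),
    "Canon APS-C": (22.3, 14.9),
    "Canon G1 X":  (18.7, 14.0),
}

ff_area = sensors["full frame"][0] * sensors["full frame"][1]
for name, (w, h) in sensors.items():
    # Light-gathering deficit versus full frame, in stops.
    stops = math.log2((w * h) / ff_area)
    print(f"{name:12s}: {stops:+.2f} stops vs full frame")
```

This puts APS-C at about -1.38 stops and the G1 X at about -1.72 stops, roughly a third of a stop apart, which is the gap being quoted.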

--
Riley

any similarity to persons living or dead is coincidental and unintended
support 1022 Sunday Scapes'
 
Anyone else seeing a disconnect with reality here? I know some people quote DxO like the bible around here....and if they do that, then I can claim that the Nikon D3100 is better than the 5D2 and MF Digital Backs.
Well, I've had my own issues with DxO and found it lacking. One of my own cameras, an E-5, is reported there to have a DR of 10.5 stops, yet in reviews it will pull 10.5 in JPEG. Under Imatest, similar 4/3 sensors bring 11.5 stops of DR. That's a stop over the so-called engineering limit.

Now I've been told that the differences lie in where one draws the line with noise, but even here there are conflicts: an Imatest review may have had NR applied in RAW conversion (even though presumably the same applies to other reviews under the same flag), but at a minimum, a stop seems like a mighty unhealthy gap to me. This in contrast to cameras that have NR in RAW.

I'm not saying which is good or bad, but if you were looking for a level playing field, it seems to me DxO isn't it. Wait for some hands-on reviews to cover the major aspects and IQ in a variety of conditions, and evaluate from there. After all, it's the end product, the image, that counts, and you need images to adequately judge that.

(I have no affiliation to any review site, evaluation material, evaluation products, or camera manufacturers)

--
Riley

any similarity to persons living or dead is coincidental and unintended
support 1022 Sunday Scapes'
 
I never said anything else. I think DxO is very silly to try to appeal to those incapable of understanding its measurements by condensing them into a single score.
Well, that website is there to make money. It is not silly to cash in on stupidity. Smarter people will understand the reasons and excuse DxO. Examples like this can be found everywhere around us.
 
Figured out that DOF doesn't decrease with diffraction yet?
 
DxOMark scores are always a bit of a skewed reality that doesn't translate well to real-world shooting.
The average is not relevant for all photographers; you have to know what is most important to you: colour depth, dynamic range, or low-light performance.
 
Under Imatest, similar 4/3 sensors bring 11.5 stops of DR.
One needs to know how to use tools.
OK, let's suppose they do not.

Yet they have a series of tests on various cameras of the same format with very close results. One position we could take: serial offenders?

OK, let's go a step further.

These same 'serial offenders' would apply the same method to all their reviews; that in itself is of some use, because it is at least consistent by method. It's the same at DPR: whether or not you see their review practice as right, fair, good, or bad, at least it will be consistent within some assessable criteria, because the same method is applied. And let's face it, we all do exactly that: we look at the review, make judgement calls suitable for ourselves, and perhaps share this material around the DPR forums. Or are you going to tell me you never did that? ;)

Across the scope of available material, at the end of the day, some things about DxO are not adding up. Yet there seems to be this acceptance that DxO is 'perfect' and others somehow less.

Bluntly, considering the mix of RAWs with NR, the uniqueness of DxO's criteria for ISO measurement (which depart from the manufacturers'), the application of their own 'mysterious' curves, not to mention the black-level offset serving Nikon cameras, I think that notion of 'perfect' is a highly challengeable proposition not adequately defended by DxO's advocates.

Anyone feel like stepping up?

Do not get me wrong: they present a lot of material that is of interest, they have brought to light things to consider as points of difference, and I have used them for just that purpose. But I'm getting more and more careful about what I use their material for, as I see parts of their data regime begin to melt under the blowtorch of dissatisfaction.

--
Riley

any similarity to persons living or dead is coincidental and unintended
support 1022 Sunday Scapes'
 
Under Imatest, similar 4/3 sensors bring 11.5 stops of DR.
One needs to know how to use tools.
OK, let's suppose they do not.
That says it all. In Imatest there is an option to use dcraw or LibRaw to get it to work correctly on raw data, without any noise reduction or other processing applied in conversion. That, of course, does not take care of lighting, focusing, and exposure skills.
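For context, the 'engineering' DR figure being argued about here is essentially the sensor's clip point over its read-noise floor, measured on linear raw data of the kind dcraw/LibRaw deliver. A toy sketch on synthetic numbers (the bit depth, bias level, and noise figure below are invented for illustration, not taken from any real camera):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for linear raw data: a dark frame for the noise
# floor, plus the sensor's known saturation point.
clip_level = 16383                                   # 14-bit saturation
bias = 512                                           # black-level offset
dark_frame = rng.normal(bias, 3.0, size=(100, 100))  # ~3 DN read noise

# Engineering DR: stops between the usable signal range and the noise floor.
read_noise = dark_frame.std()
dr_stops = np.log2((clip_level - bias) / read_noise)
print(f"engineering DR ~ {dr_stops:.1f} stops")
```

Where reviewers draw the line differently (e.g. requiring a minimum SNR rather than the bare noise floor), the reported DR shifts by a stop or more, which is one candidate explanation for the discrepancies being discussed.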

--
http://www.libraw.org/
 
Under Imatest, similar 4/3 sensors bring 11.5 stops of DR.
One needs to know how to use tools.
OK, let's suppose they do not.
That says it all. In Imatest there is an option to use dcraw or LibRaw to get it to work correctly on raw data, without any noise reduction or other processing applied in conversion. That, of course, does not take care of lighting, focusing, and exposure skills.
Yet I have to say it again:
they've tested a range of 4/3 sensors with quite similar results,
and still that one-stop discrepancy exists (and that's a minimum).

So it carries over to the other reviews they've done; similar disciplines are obviously in force.

--
Riley

any similarity to persons living or dead is coincidental and unintended
support 1022 Sunday Scapes'
 
