Why the religious belief in DxO ratings?

Gidday Moti

It's not such a bad analogy, in some ways. Many (most?) people steer cars from point to point, and that's fine. Others really enjoy driving and get a real buzz out of it. Regardless of how much we love driving, we continue to use cars to go to the shops and similar pedestrian purposes. Pun intended ... :-D
Sure, but my point was that I don't think anyone would buy a car he doesn't know, no matter what the purpose is, without test driving it. Was I wrong?
Actually the more I have personally investigated the camera performance equation in controlled conditions, and seen it correlate with what is achievable out in the real world, the more meaningful the test results produced by outfits like DxO have become to me. This is essentially the end result of a process of demystification. Once you get a handle on all the variables, and understand how they all feed into the final image, considering individual performance characteristics in isolation becomes far more useful.
Gidday Canin

While I appreciate scientific investigation, my own main objection to DxOMark is that they keep their methodology secret, and their results have never jelled with my own experience of my own cameras, or my experiences with other brands owned by friends and acquaintances.

Basically, that translates into my reliance on heuristics and the practical experience of myself and others.

Review sites of all kinds are only of some utility. The more practical their methods, the more weight I am inclined to give to their opinions. However, they remain their opinions, not some set-in-stone 'measure' of the kind that DxOMark purports to publish.

--
Regards, john from Melbourne, Australia.
.
Please do not embed images from my web site without prior permission
I consider this to be a breach of my copyright.
-- -- --
.
The Camera doth not make the Man (nor Woman) ...
Perhaps being kind to cats, dogs & children does ...
.
Galleries: http://canopuscomputing.com.au/gallery2/v/main-page/



[Image: C120644_small.jpg - Bird Control Officers on active service.]
 
Just to complain that a camera sensor testing site does not take into account the human element is pretty stupid.
Actually, a camera testing site (one that tests lenses and cameras) that doesn't take into account the human element is pretty stupid.
So, smarty pants, do you have any idea how to do it?

Or are you just all words and no substance?
 
There appears to be an insatiable urge to use DxOMark ratings as some irrefutable bible of camera/lens capability/quality, when in reality the results are little more than the equivalent of fuel economy ratings for cars worked out from the relevant country's EPA test requirements. As soon as the rubber hits the road, the results become meaningless.

Much, much better are actual results from users with lenses/cameras in real-world use, all the better if they are actual photography enthusiasts and not journos writing for some photography mag or the like. Again with the car analogy, motoring journos are not the ideal source for information on cars that might suit your needs.

Do people really base their lens/camera selections on DxO results?

--
Thoughts, Musings, Ideas and Images from South Gippsland
http://australianimage.com.au/wordpress/
The irony here is that your subjective take shows all the religious fervor in support of your erroneous view that you claim the 'believers' have for the actual technical data that DxO offers.

I don't think M43 users like the OP here would author a Book of DxO Denial if they had a camera ranking in the top fifty of DxO sensor rankings. My advice for the Bible-thumping crowd who share your belief: stop fretting over tech data that upsets your emotions, and enjoy the pictures you get from your camera.

--
Joe
 
Gidday Moti

It's not such a bad analogy, in some ways. Many (most?) people steer cars from point to point, and that's fine. Others really enjoy driving and get a real buzz out of it. Regardless of how much we love driving, we continue to use cars to go to the shops and similar pedestrian purposes. Pun intended ... :-D
Sure, but my point was that I don't think anyone would buy a car he doesn't know, no matter what the purpose is, without test driving it. Was I wrong?
Actually the more I have personally investigated the camera performance equation in controlled conditions, and seen it correlate with what is achievable out in the real world, the more meaningful the test results produced by outfits like DxO have become to me. This is essentially the end result of a process of demystification. Once you get a handle on all the variables, and understand how they all feed into the final image, considering individual performance characteristics in isolation becomes far more useful.
Gidday Canin

While I appreciate scientific investigation, my own main objection to DxOMark is that they keep their methodology secret, and their results have never jelled with my own experience of my own cameras, or my experiences with other brands owned by friends and acquaintances.
I'd be curious to know what DxO measurement you referenced and decided was not an accurate representation of that particular performance characteristic of your own sample. The signal-to-noise ratio of your camera's sensor when measuring RAW output at various exposures? Did they get the dynamic range measurements wrong? And how did you verify all this?

DxO is far from infallible when it comes to accurately rating performance characteristics, but they do seem to get things mostly right the vast majority of the time. In almost every case I've seen where someone dismisses their results as completely disconnected from the real world, it is because of some sort of misunderstanding about what a particular score or measurement actually means or relates to.
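For anyone wondering what those measurements actually involve, here is a minimal sketch of the basic idea, not DxO's actual protocol: estimate SNR from a raw frame of an evenly lit grey card, and a crude engineering dynamic range from a dark frame. The file names and the saturation value are made up, and it assumes the rawpy package for decoding:

import numpy as np
import rawpy

def central_patch(path, size=200):
    # Central patch of raw sensor values with the black level subtracted.
    with rawpy.imread(path) as raw:
        data = raw.raw_image_visible.astype(np.float64)
        black = float(np.mean(raw.black_level_per_channel))
    h, w = data.shape
    return data[h//2 - size//2 : h//2 + size//2,
                w//2 - size//2 : w//2 + size//2] - black

def snr_db(grey_card_path):
    # SNR of a flat patch in dB: mean signal over its standard deviation.
    p = central_patch(grey_card_path)
    return 20.0 * np.log10(p.mean() / p.std())

def dynamic_range_stops(dark_frame_path, saturation_level):
    # Engineering DR in stops: saturation level over read noise (std of a dark frame).
    read_noise = central_patch(dark_frame_path).std()
    return float(np.log2(saturation_level / read_noise))

# Hypothetical usage:
# print(snr_db("grey_card_iso200.raw"))
# print(dynamic_range_stops("lens_cap_iso200.raw", saturation_level=15500))

Repeat that at each ISO setting and you have the raw material for the sort of curves DxO publishes, which is also how other people have been able to cross-check their numbers.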
 
Gidday Moti

It's not such a bad analogy, in some ways. Many (most?) people steer cars from point to point, and that's fine. Others really enjoy driving and get a real buzz out of it. Regardless of how much we love driving, we continue to use cars to go to the shops and similar pedestrian purposes. Pun intended ... :-D
Sure, but my point was that I don't think anyone would buy a car he doesn't know, no matter what the purpose is, without test driving it. Was I wrong?
Actually the more I have personally investigated the camera performance equation in controlled conditions, and seen it correlate with what is achievable out in the real world, the more meaningful the test results produced by outfits like DxO have become to me. This is essentially the end result of a process of demystification. Once you get a handle on all the variables, and understand how they all feed into the final image, considering individual performance characteristics in isolation becomes far more useful.
Gidday Canin

While I appreciate scientific investigation, my own main objection to DxOMark is that they keep their methodology secret, and their results have never jelled with my own experience of my own cameras, or my experiences with other brands owned by friends and acquaintances.
I'd be curious to know what DxO measurement you referenced and decided was not an accurate representation of that particular performance characteristic of your own sample. The signal-to-noise ratio of your camera's sensor when measuring RAW output at various exposures? Did they get the dynamic range measurements wrong? And how did you verify all this?

DxO is far from infallible when it comes to accurately rating performance characteristics, but they do seem to get things mostly right the vast majority of the time. In almost every case I've seen where someone dismisses their results as completely disconnected from the real world, it is because of some sort of misunderstanding about what a particular score or measurement actually means or relates to.
I stopped even bothering to look at the DxOMark site several years ago. They lost all credibility with me then.

If they suddenly make their methodology public - the greatest stumbling block to any credibility, IMNSHO - I reckon someone will let us all know.

If one tried to publish scientific results in any peer-reviewed journal without publishing one's data, one would be laughed out of their office in disbelief! Results without supporting data and the methodology employed are no better than snake oil, IMO ...

--
Regards, john from Melbourne, Australia.
.
Please do not embed images from my web site without prior permission
I consider this to be a breach of my copyright.
-- -- --
.
The Camera doth not make the Man (nor Woman) ...
Perhaps being kind to cats, dogs & children does ...
.
Galleries: http://canopuscomputing.com.au/gallery2/v/main-page/



[Image: C120644_small.jpg - Bird Control Officers on active service.]
 
I could go on a very long rant. We've learned to put our faith in numerical comparisons, and we often over-rely on them. We're constantly bombarded by claims in the media like "the homicide rate doubled" or "under my watch the crime rate was reduced by 50%". What that means really depends on the base rate: it might mean going from 1 to 2 homicides, or going from 2 to 1. And year-to-year fluctuations don't represent a trend; they may simply be year-to-year fluctuations. I digress.

I'm a statistician and have done medical research for nearly 30 years. I think Western science and culture is very reductionist. By that I mean we want to break things into smaller and smaller components to find THE cause. In an experiment we try to control everything except one thing (sometimes more than one, but a small number) which we systematically vary. As a kind of absurd example: instead of being content with knowing that broccoli is good for us, we want to break broccoli down into smaller and smaller components so we can identify what in broccoli is good for us, bottle it and sell it. In reality it might be a combination of a whole bunch of things in broccoli that work together and reinforce each other. I don't mean to imply that this is necessarily bad; I just think it's often incomplete, and we get so focused on seeing the trees that we never see the forest.

So I think we're kind of trained to believe in quantitative comparisons. You know, 11 is better than 10. So I don't dismiss DxO tests as invalid. I think they measure some things related to IQ; I just think they don't offer a complete picture. Even they say their numbers are only accurate to within a certain range, and users put too much weight on the quantitative information. For example, a difference in ISO ratings between 1000 and 2000 looks huge, but it's 1 stop. Not trivial, but in practice it might mean that shooting camera A at ISO 3200 only gives results about as good as camera B at ISO 6400. And if we're comparing cameras with DxO ISO ratings of 900 and 1100, we're probably well within the range of measurement error.
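To put that last point in numbers, here is a tiny sketch (my own illustration, not anything DxO publishes) of how the ratio between two low-light ISO scores translates into stops:

import math

def stop_difference(iso_a, iso_b):
    # Difference between two ISO ratings expressed in stops (factors of two).
    return math.log2(iso_a / iso_b)

print(stop_difference(2000, 1000))  # 1.00 stop: real, but only one stop
print(stop_difference(1100, 900))   # ~0.29 stop: likely within measurement error

A full stop between ratings is worth noticing; a fraction of a stop is the sort of difference that gets lost in sample variation and measurement error.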
 
Gidday Moti

It's not such a bad analogy, in some ways. Many (most?) people steer cars from point to point, and that's fine. Others really enjoy driving and get a real buzz out of it. Regardless of how much we love driving, we continue to use cars to go to the shops and similar pedestrian purposes. Pun intended ... :-D
Sure, but my point was that I don't think anyone would buy a car he doesn't know, no matter what the purpose is, without test driving it. Was I wrong?
Actually the more I have personally investigated the camera performance equation in controlled conditions, and seen it correlate with what is achievable out in the real world, the more meaningful the test results produced by outfits like DxO have become to me. This is essentially the end result of a process of demystification. Once you get a handle on all the variables, and understand how they all feed into the final image, considering individual performance characteristics in isolation becomes far more useful.
Gidday Canin

While I appreciate scientific investigation, my own main objection to DxOMark is that they keep their methodology secret, and their results have never jelled with my own experience of my own cameras, or my experiences with other brands owned by friends and acquaintances.
I'd be curious to know what DxO measurement you referenced and decided was not an accurate representation of that particular performance characteristic of your own sample. The signal-to-noise ratio of your camera's sensor when measuring RAW output at various exposures? Did they get the dynamic range measurements wrong? And how did you verify all this?

DxO is far from infallible when it comes to accurately rating performance characteristics, but they do seem to get things mostly right the vast majority of the time. In almost every case I've seen where someone dismisses their results as completely disconnected from the real world, it is because of some sort of misunderstanding about what a particular score or measurement actually means or relates to.
I stopped even bothering to look at the DxOMark site several years ago. They lost all credibility with me then.

If they suddenly make their methodology public - the greatest stumbling block to any credibility, IMNSHO - I reckon someone will let us all know.

If one tried to publish scientific results in any peer-reviewed journal without publishing one's data, one would be laughed out of their office in disbelief! Results without supporting data and the methodology employed are no better than snake oil, IMO ...
While DxO Labs might not be 100% transparent, they do in fact publish quite a lot of data pertaining to how they arrive at their results (most of the criticism relates to how the overall score is calculated, but anyone who is interested in some particular performance characteristic should be looking at other data anyway). But even if they didn't, the measurements they publish would remain useful for as long as there was a strong correlation with what other people were able to determine by running the same sorts of tests. And there is.

In order to make your case that they are pushing nothing but "snake oil" (which is by definition something of no real value), you would still have to show that the bulk of the data they publish is erroneous.
 
There appears to be an insatiable urge to use DxOMark ratings as some irrefutable bible of camera/lens capability/quality, when in reality the results are little more than the equivalent of fuel economy ratings for cars worked out from the relevant country's EPA test requirements. As soon as the rubber hits the road, the results become meaningless.

Much, much better are actual results from users with lenses/cameras in real-world use, all the better if they are actual photography enthusiasts and not journos writing for some photography mag or the like. Again with the car analogy, motoring journos are not the ideal source for information on cars that might suit your needs.

Do people really base their lens/camera selections on DxO results?
DxO lens tests: no, not at all. Not because I consider tests meaningless, but because they have presented enough questionable results that I look for lens tests elsewhere.

Sensor tests: yes, because in my experience they have been more consistent with practical results.
 
I much prefer a real-life assessment of equipment, lenses and cameras, image examples etc. Figures and percentages I generally find dull, but not meaningless; they have their place, just don't take them as gospel.

I would hold, for example, a DPReview review of a camera or lens over the DxO results for judgement.

And as someone pointed out, there is no clear transparency over their testing, so they could be biased.

The DxO camera for the iPhone got very good marks?
 
To clarify:

Are you specifically referring to the final overall DxO scores only, or are you also stating that all the data from DxO is pretty much meaningless in the real world?
I'm simply talking about how people use DxO scores as if they were writ in stone and reflected what is good and what is not.
Can you post a few links where people did that? I don't recall seeing it. Thanks.
Well, all the time, from time to time, I suppose.

I would say the same as Ray does here -- but I certainly don't have links for such stuff and I wouldn't be bothered searching them out. I hope Ray isn't bothered either.

If you haven't noticed it then obviously it is not something that irritates you.
 
Gidday Moti

It's not such a bad analogy, in some ways. Many (most?) people steer cars from point to point, and that's fine. Others really enjoy driving and get a real buzz out of it. Regardless of how much we love driving, we continue to use cars to go to the shops and similar pedestrian purposes. Pun intended ... :-D
Sure, but my point was that I don't think anyone would buy a car he doesn't know, no matter what the purpose is, without test driving it. Was I wrong?
Actually the more I have personally investigated the camera performance equation in controlled conditions, and seen it correlate with what is achievable out in the real world, the more meaningful the test results produced by outfits like DxO have become to me. This is essentially the end result of a process of demystification. Once you get a handle on all the variables, and understand how they all feed into the final image, considering individual performance characteristics in isolation becomes far more useful.
Gidday Canin

While I appreciate scientific investigation, my own main objection to DxOMark is that they keep their methodology secret, and their results have never jelled with my own experience of my own cameras, or my experiences with other brands owned by friends and acquaintances.
I'd be curious to know what DxO measurement you referenced and decided was not an accurate representation of that particular performance characteristic of your own sample. The signal-to-noise ratio of your camera's sensor when measuring RAW output at various exposures? Did they get the dynamic range measurements wrong? And how did you verify all this?

DxO is far from infallible when it comes to accurately rating performance characteristics, but they do seem to get things mostly right the vast majority of the time. In almost every case I've seen where someone dismisses their results as completely disconnected from the real world, it is because of some sort of misunderstanding about what a particular score or measurement actually means or relates to.
I stopped even bothering to look at the DxOMark site several years ago. They lost all credibility with me then.

If they suddenly make their methodology public - the greatest stumbling block to any credibility, IMNSHO - I reckon someone will let us all know.

If one tried to publish scientific results in any peer-reviewed journal without publishing one's data, one would be laughed out of their office in disbelief! Results without supporting data and the methodology employed are no better than snake oil, IMO ...
While DxO Labs might not be 100% transparent, they do in fact publish quite a lot of data pertaining to how they arrive at their results. But even if they didn't, the measurements they publish would remain useful for as long as there was a strong correlation with what other people were able to determine by running the same sorts of tests. And there is.
Not so, the last time I looked at comparative information from a wide variety of review sites, which included DxOMark at the time. Perhaps this correlation you speak of exists for CanNikon and fails to exist as far as other brands are concerned. I place enormously greater faith in what (say) Roger Cicala at Lensrentals says than I do in anything I have ever read at DxOMark ...
In order to make your case that they are pushing nothing but "snake oil" (which is by definition something of no real value), you would still have to show that the bulk of the data they publish is erroneous.
That's not exactly what I said ... However, your interpreting it that way indicates that you patently understand that most of the information put forward by DxOMark is not substantiated by the release of either methodology or raw data ...

It is not up to me to disprove what they say.
It is up to them to substantiate their unsubstantiated claims and thereby establish their own credibility ...



--
Regards, john from Melbourne, Australia.
.
Please do not embed images from my web site without prior permission
I consider this to be a breach of my copyright.
-- -- --
.
The Camera doth not make the Man (nor Woman) ...
Perhaps being kind to cats, dogs & children does ...
.
Galleries: http://canopuscomputing.com.au/gallery2/v/main-page/



[Image: C120644_small.jpg - Bird Control Officers on active service.]
 
I assume they have a lens testing website, but I've never been there, LOL. I'd much rather see some photos posted by users and those users' comments than test scores. It isn't that I don't believe in measurable 'facts', so much as it is that in our modern digital world we are bombarded so heavily by 'irrefutable facts' that are instantly refuted by other 'irrefutable facts' that keeping score that way has become impossible. Give me a user's impression of some gear and some supporting photos any day!

I did try out their raw development software. I liked it well enough, I was particularly impressed with their CA controls, but overall, I didn't think it offered enough reason to switch from what I'm currently using.
 
Gidday Moti

It's not such a bad analogy, in some ways. Many (most?) people steer cars from point to point, and that's fine. Others really enjoy driving and get a real buzz out of it. Regardless of how much we love driving, we continue to use cars to go to the shops and similar pedestrian purposes. Pun intended ... :-D
Sure, but my point was that I don't think anyone would buy a car he doesn't know, no matter what the purpose is, without test driving it. Was I wrong?
Actually the more I have personally investigated the camera performance equation in controlled conditions, and seen it correlate with what is achievable out in the real world, the more meaningful the test results produced by outfits like DxO have become to me. This is essentially the end result of a process of demystification. Once you get a handle on all the variables, and understand how they all feed into the final image, considering individual performance characteristics in isolation becomes far more useful.
Gidday Canin

While I appreciate scientific investigation, my own main objection to DxOMark is that they keep their methodology secret, and their results have never jelled with my own experience of my own cameras, or my experiences with other brands owned by friends and acquaintances.
I'd be curious to know what DxO measurement you referenced and decided was not an accurate representation of that particular performance characteristic of your own sample. The signal-to-noise ratio of your camera's sensor when measuring RAW output at various exposures? Did they get the dynamic range measurements wrong? And how did you verify all this?

DxO is far from infallible when it comes to accurately rating performance characteristics, but they do seem to get things mostly right the vast majority of the time. In almost every case I've seen where someone dismisses their results as completely disconnected from the real world, it is because of some sort of misunderstanding about what a particular score or measurement actually means or relates to.
I stopped even bothering to look at the DxOMark site several years ago. They lost all credibility with me then.

If they suddenly make their methodology public - the greatest stumbling block to any credibility, IMNSHO - I reckon someone will let us all know.

If one tried to publish scientific results in any peer-reviewed journal without publishing one's data, one would be laughed out of their office in disbelief! Results without supporting data and the methodology employed are no better than snake oil, IMO ...
While DxO Labs might not be 100% transparent, they do in fact publish quite a lot of data pertaining to how they arrive at their results. But even if they didn't, the measurements they publish would remain useful for as long as there was a strong correlation with what other people were able to determine by running the same sorts of tests. And there is.
Not so, the last time I looked at comparative information from a wide variety of review sites, which included DxOMark at the time. Perhaps this correlation you speak of exists for CanNikon and fails to exist as far as other brands are concerned. I place enormously greater faith in what (say) Roger Cicala at Lensrentals says than I do in anything I have ever read at DxOMark ...
Once again I ask you: please provide evidence of DxO's systematic failure to produce accurate data.

For the record I don't own any Canon or Nikon cameras; only Panasonic and Olympus cameras. And DxO Labs' published data is consistent with what I have observed myself when conducting my own tests (or even when comparing multiple image samples for different cameras such as those provided by DPR, Imaging Resource and other camera review sites).
In order to make your case that they are pushing nothing but "snake oil" (which is by definition something of no real value), you would still have to show that the bulk of the data they publish is erroneous.
That's not exactly what I said ... However, your interpreting it that way indicates that you patently understand that most of the information put forward by DxOMark is not substantiated by the release of either methodology or raw data ...
Not really true at all: http://www.dxomark.com/About/In-depth-measurements/DxOMark-testing-protocols

Not only do they publish their testing methodology for all to see, they've even packaged it up so everyone else can use the exact same tools. So I ask you: what are you talking about?

As for the "raw data", what data are you speaking of?
It is not up to me to disprove what they say.
It is up to them to substantiate their unsubstantiated claims and thereby establish their own credibility ...
I see a hell of a lot more "substantiation" coming from DxO Labs than I do from you...
 
The irony here is that your subjective take shows all the religious fervor in support of your erroneous view that you claim the 'believers' have for the actual technical data that DxO offers.

I don't think M43 users like the OP here would author a Book of DxO Denial if they had a camera ranking in the top fifty of DxO sensor rankings.
Well, I think a LOT of people simply don't know how their gear stacks up in DXO's hierarchy of 'good' gear. I consider it irrelevant to my life and photography.

I understand Ray's concern, however. There's a lot of camera jingoism going on in these forums, and DXO scores are tossed around like parables at a church camp for teenagers. Through sheer use, these scores then take on a life and validity that is separate from their value as a test score. Like the parables I mentioned, they are confused with the actual message originally offered.
 
You can feel free to believe whatever you like, with no impediment whatsoever from me.

Perhaps you could pay myself and others the same courtesy?

I will have a look at your link, when and if I feel so disposed. However, life is too short to endlessly revisit those who have already demonstrated themselves to be untrustworthy or unreliable ...
 
There appears to be an insatiable urge to use DxOMark ratings as some irrefutable bible of camera/lens capability/quality, when in reality the results are little more than the equivalent of fuel economy ratings for cars worked out from the relevant country's EPA test requirements. As soon as the rubber hits the road, the results become meaningless.

Much, much better are actual results from users with lenses/cameras in real-world use, all the better if they are actual photography enthusiasts and not journos writing for some photography mag or the like. Again with the car analogy, motoring journos are not the ideal source for information on cars that might suit your needs.

Do people really base their lens/camera selections on DxO results?
 
You can feel free to believe whatever you like, with no impediment whatsoever from me.

Perhaps you could pay myself and others the same courtesy?
All I see "yourself and others" (if by "others" you mean those who have adopted a similar stance to your own with respect to the issue at hand) doing is denigrating what has proven to be a valuable resource to people who understand how to use it properly. I can't say that I see much "courtesy" in that.

Not that I am saying that DxO advocates are always courteous because they are probably not. I'm just saying that at this point I don't see you extending the courtesy that you would be required to extend yourself in order to legitimately demand that I do "the same".
I will have a look at your link, when and if I feel so disposed.
The link is not just for you but also for those who are reading. Anyone who follows it and has a bit of a look around should quickly realize that DxO Labs are not really obfuscating their testing methodology at all. In fact they have gone to great lengths to elucidate it.
However, life is too short to endlessly revisit those who have already demonstrated themselves to be untrustworthy or unreliable ...
You keep saying that. But merely saying it doesn't make it true.
 
Besides, with guys like Robin Wong who can make a Brownie take ridiculously good shots, any problems with my photos aren't with the equipment but mostly with the photographer, i.e. me.
And that's where DxOMark fails miserably.
If that is not a game they are playing, I do not know how they fail, let alone do it miserably.
It leads many people to believe that products that are more than capable of high quality results are incapable of such because of DxOMark scores.
What a user does is up to him. I am pretty sure you can buy the best camera according to them, shoot crap photos with it, and claim how ridiculously wrong DxO got it.
Sure, if your purpose is to show off, there is nothing like DXO results to give you a great ego trip.

But if you are a serious photographer and you care about your photographs, nothing comes close to real-world photo tests.
DxOMark does not claim that you cannot make good or great photos from lower-ranked camera sensors.

Just to complain that a camera sensor testing site does not take into account the human element is pretty stupid.

I still use the R1, and that camera was released in 2005; you would not get anyone who does what he does regardless of how DxOMark ranks sensors.
It is a real pity that Sony didn't develop the R1 further. I still have mine; imagine the possibilities with a modern sensor and processing power.

But it does not change the fact that they do not test what humans will do with that camera.

--
::> I make spelling mistakes. May Dog forgive me for this.
 
Isn't the modern version basically the RX10?

Or does the quirky design achieve something different/better?
 
It appears to me that you have extended not the slightest respect towards others with different views from yours in either this thread or any other. I hope that I have not been personally vindictive towards anyone in this thread. Perhaps you could ask yourself the same question?

Generally, you get what you give in life, but I, for one, try not to treat these things as a zero-sum game, infinitely preferring the outcomes attendant on non-zero-sum games. However, I see that DPR is still a combat zone.

BTW, I did read the entirety of that link. DxOMark state lots of 'Mom and apple pie' stuff, but there are a lot of important and contestable items that are either missing or glossed over. Just like any good snake oil salesman of times gone by ... The training I have had in statistics and other areas helps me to look for what's not said as well as to evaluate critically what is said.

If you can't see that, then I will never convince you otherwise.

Let's leave it at that, shall we? Vituperation and personal attacks merely make this place more unpleasant than it normally is.
 
