Can we trust DxOMark (after they rated their own DxO One, 1" sensor at 1657 ISO)?

Started Aug 10, 2015 | Discussions
walkaround Senior Member • Posts: 2,551
Re: They don't test raw "data"
2

cainn24 wrote:

walkaround wrote:

cainn24 wrote:

walkaround wrote:

cainn24 wrote:

rrccad wrote:

cainn24 wrote:

rrccad wrote:

cainn24 wrote:

Well, of course. But RAW files straight from the camera aren't demosaiced. That's something that most RAW converters automatically do when you open a RAW file. Most of the popular ones also automatically apply a colour profile, a tone curve, some lens corrections and sometimes a bunch of other tweaks as well. But it's perfectly possible to analyze the RAW data before all of these things take place, and that's what DxO Labs do.

which is why it's incorrect to do company-to-company comparisons.

I don't understand your point. Could you elaborate?

Various companies will leave a raw file from the camera at a particular state of "polish" or finish.

"Cooking" RAW data to any significant degree isn't really that common, and in most cases it has been detected pretty quickly and identified as such.

The underlying argument you seem to be making here is that the sensor measurements that DxO Labs publish can't possibly correlate with what people will see in the real world when comparing actual images. But that's simply not true. If you run a bunch of RAW files through a standardized development process you will discover a strong tendency for the relative differences identified by DxO Labs to manifest in your final images.

There is no such thing as a "standardized development process".

Sure there is. I take two RAW files, open them up in the same RAW converter, and develop them with the exact same settings. My standardized development process might be different from someone else's, but as long as it is the same for all the output in question, there will be a strong tendency for the relative differences that DxO Labs measure and report on to manifest through that development process to a similar degree. That's the point. There is in fact correlation, and that makes the data useful.

You snipped off what I said after that:

This or that website can choose to convert every camera using ACR, but then your results depend on ACR's ability to render every raw file with equal quality.

If I don't use the converter you use, the results of your "standardized process" are useless to me.

My point has never been that someone else's development process will be useful to you if you use a different one. Rather my point has always been that if you use your own standardized development process you will, to some significant extent, be able to observe the relative differences that DxO Labs report on. And the reason I am making this point is, again, to demonstrate that DxO Labs are telling you something real about comparative sensor performance.

Ok, well we have to agree to disagree on this. DXO does not, to give just one example, even take pattern noise into account. You might not have been happy about that if you had purchased a D7100, based on the dxo scores.

So we'll have to disagree about how useful their results are to the consumer.

(unknown member) Forum Pro • Posts: 11,521
Re: They don't test raw "data"
1

cainn24 wrote:

"Cooking" RAW data to any significant degree isn't really that common,

Actually, what I described most certainly is common.

and in most cases it has been detected pretty quickly and identified as such.

Actually, that's not the case at all. For instance, DxO never publishes or identifies that above around ISO 3200 Sony bakes NR into the raw files.

Actually I've seen DxO report on "smoothing" of RAW data for several Sony cameras at high ISO.

They don't catch nearly all of them, and certainly not any of the ILCs.

What DxO is stating is that:

A + B + ΔC = Z and A + B = Z, where ΔC ≠ 0

Since that is mathematically impossible, the numerical data is to some degree flawed, based upon flawed scientific assumptions.

Now if they decompress the RAW data (which they must), and then apply consistent DxO algorithms to demosaic and take all RAWs up to a status quo as defined by DxO, then you have the case where ΔC = 0.

No matter how much you wish to dismiss it, brand-to-brand comparison of the data is suspect (at times even historical comparisons are, if the raw format subtly changed).
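One rough way to probe the "cooking" / baked-in NR claim above, for anyone who wants to check it themselves (a sketch only, not DxO's method; it assumes the Python packages rawpy and numpy, and the file name is hypothetical): spatial noise reduction applied before the raw file is written shows up as correlation between neighbouring same-colour sensels in a uniform frame, which untouched shot/read noise does not have.

import numpy as np
import rawpy  # LibRaw wrapper, used here only to reach the undemosaiced data

# Hypothetical file: a defocused, uniformly lit (or lens-cap dark) frame at high ISO
with rawpy.imread("iso6400_flat.ARW") as raw:
    cfa = raw.raw_image_visible.astype(np.float64)

# Take one colour plane of the Bayer mosaic (every second row and column)
plane = cfa[0::2, 0::2]
plane -= plane.mean()

# Correlation between horizontally adjacent same-colour sensels.
# Pure photon/read noise sits near zero; raw-level smoothing pushes it clearly positive.
corr = np.mean(plane[:, :-1] * plane[:, 1:]) / np.mean(plane ** 2)
print("neighbour correlation:", corr)

A clearly positive value at high ISO but not at base ISO would be consistent with the ISO-dependent raw "cooking" being described here.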

cainn24 Veteran Member • Posts: 4,892
Re: They don't test raw "data"
6

walkaround wrote:

Ok, well we have to agree to disagree on this. DXO does not, to give just one example

Why not give 5 examples, or 10?  If you want to establish that the problem is so systemic that DxO Labs provide nothing of real value then that's the sort of direction you'll need to head in.

, even take pattern noise into account. You might not have been happy about that if you had purchased a D7100, based on the dxo scores.

Again this is just a recommendation to throw the baby out with the bathwater.  One other example I can think of is how the purple haze that afflicts certain high resolution Nikon sensors at very high ISOs makes it more difficult to produce a pleasing image than DxO's numbers would suggest.  There are some other similar examples of this same phenomenon to be found here and there as well.  But I've never claimed that DxO's data is flawless, only that it can be useful.  And these exceptions are specific-use cases anyway and the people who prioritize them will very likely use DxO Labs as one data point only, as they should.  As everyone should, always, no matter what resource we are discussing.

cainn24 Veteran Member • Posts: 4,892
Re: They don't test raw "data"
2

rrccad wrote:

They don't catch nearly all of them, and certainly not any of the ILCs.

In order to demonstrate that DxO's data is useless you need to demonstrate that the problem is systemic rather than merely existing in isolated pockets here and there.

What DxO is stating is that:

A + B + ΔC = Z and A + B = Z, where ΔC ≠ 0

Since that is mathematically impossible, the numerical data is to some degree flawed, based upon flawed scientific assumptions.

What numerical data are you referring to? Specifically?

Now if they decompress the RAW data (which they must), and then apply consistent DxO algorithms to demosaic and take all RAWs up to a status quo as defined by DxO, then you have the case where ΔC = 0.

No matter how much you wish to dismiss it, brand-to-brand comparison of the data is suspect (at times even historical comparisons are, if the raw format subtly changed).

Yet as I have said countless times now the data that DxO Labs publish correlates strongly with what I see myself when running RAW output from various cameras through any number of different workflows, and plenty of other people find the same to be true.

All you can possibly prove here is that DxO Labs is not perfect. But I never claimed they were, and no-one ever is. The bottom line is that unless you can demonstrate a widespread disconnect between their published results and what can be observed by any number of individuals who take (and have taken) it upon themselves to investigate the comparative differences that are observable when working with RAW output, then the only thing that is really going on here is a bit of DxO bashing.
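To make the "standardized development process" being argued about here concrete (a minimal sketch, not anyone's actual test protocol; it assumes the Python packages rawpy and numpy, and the file names and patch coordinates are hypothetical): develop every file with the exact same converter and settings, then measure the same grey patch, so that whatever differences remain are differences between the raw files rather than between workflows.

import numpy as np
import rawpy

# Hypothetical raw files of the same test scene from two different cameras
FILES = ["camera_a_iso3200.NEF", "camera_b_iso3200.CR2"]

# One fixed set of development parameters applied to every file:
# camera white balance, no auto-brightening, linear 16-bit output, no extra NR
DEV = dict(use_camera_wb=True, no_auto_bright=True, gamma=(1, 1), output_bps=16)

def patch_snr_db(path, y=1000, x=1500, size=200):
    """Mean/std-dev SNR (dB) over an assumed uniform grey patch of the developed image."""
    with rawpy.imread(path) as raw:
        rgb = raw.postprocess(**DEV).astype(np.float64)
    patch = rgb[y:y + size, x:x + size].mean(axis=2)  # rough luminance
    return 20 * np.log10(patch.mean() / patch.std())

for f in FILES:
    print(f, f"{patch_snr_db(f):.1f} dB")

The absolute numbers depend on the converter chosen, which is rrccad's point; the relative ordering between cameras is what tends to track DxO's measurements, which is cainn24's.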

(unknown member) Forum Pro • Posts: 11,521
Re: They don't test raw "data"
1

cainn24 wrote:

What numerical data are you referring to? Specifically?

The raw sensor pixel value itself.

cainn24 Veteran Member • Posts: 4,892
Re: They don't test raw "data"
2

rrccad wrote:

The raw sensor pixel value itself.

Please explain how this problem you are sort of vaguely pointing to prevents the data that DxO Labs provide from strongly correlating with the results that are actually achievable by anyone who shoots RAW. Keeping in mind, of course, that it actually doesn't.

bgbs Veteran Member • Posts: 3,195
Re: Can we trust DxOMark (after they rated their own DxO One, 1" sensor at 1657 ISO)?
2

DXO has been in hot water since the day it scored a Nikon sensor higher than Canon.

(unknown member) Forum Pro • Posts: 11,521
Re: Can we trust DxOMark (after they rated their own DxO One, 1" sensor at 1657 ISO)?
1

bgbs wrote:

DXO has been in hot water since the day it scored a Nikon sensor higher than Canon.

Actually, their credibility took a hike when they released their DxO One with a sensor score of 85.

(unknown member) Forum Pro • Posts: 11,521
Re: They don't test raw "data"
3

cainn24 wrote:

Please explain how this problem you are sort of vaguely pointing to prevents the data that DxO Labs provide from strongly correlating with the results that are actually achievable by anyone who shoots RAW. Keeping in mind, of course, that it actually doesn't.

Actually, I already stated it.

If DxO simply reads the numerical data post-decompression, then, for instance, that will not take into account floating black levels. A floating black level can be interpreted from maker notes or from row/column pairs of masked sensel values. Not accounting for a floating black level will introduce errors into calculations, including DR and SNR, and also introduce random variance into color accuracy calculations.

Therefore their scientific method introduces some random variance that DxO neither defines nor discloses in their comparisons.

How much? Probably small. However, you cannot empirically state that you gave a sensor a factual "generic" score with random variation not taken into account.
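A minimal numeric sketch of the effect being described (numpy only; all the numbers are made up for illustration): if the true black level drifts from frame to frame but a fixed nominal value is subtracted, the residual offset inflates the measured noise floor and therefore depresses any DR figure derived from it, whereas re-estimating the black level per frame from the masked (optically black) sensels removes most of that bias.

import numpy as np

rng = np.random.default_rng(0)

NOMINAL_BLACK = 512.0   # value a naive pipeline always subtracts (assumed)
READ_NOISE = 3.0        # DN, assumed
FULL_WELL_DN = 16000.0  # DN at clipping, assumed

def dr_stops(signal_max, noise_floor):
    # Engineering DR: maximum signal over noise floor, in stops (EV)
    return np.log2(signal_max / noise_floor)

for drift in (0.0, 2.0, 8.0):  # how far the real black level has wandered, in DN
    true_black = NOMINAL_BLACK + drift
    # Masked sensels carry only the black level plus read noise
    dark = true_black + rng.normal(0.0, READ_NOISE, size=100_000)

    naive_floor = np.sqrt(np.mean((dark - NOMINAL_BLACK) ** 2))  # fixed black level
    per_frame_floor = np.std(dark - dark.mean())                 # black level re-estimated per frame

    print(f"drift {drift:4.1f} DN: DR naive {dr_stops(FULL_WELL_DN, naive_floor):5.2f} EV, "
          f"DR per-frame {dr_stops(FULL_WELL_DN, per_frame_floor):5.2f} EV")

With an 8 DN drift the naive figure loses well over a stop, so whether an analysis tracks a floating black level really can matter to DR/SNR numbers; how large the drift actually is on any given camera is the open question.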

The Davinator Forum Pro • Posts: 24,504
Re: Can we trust DxOMark (after they rated their own DxO One, 1" sensor at 1657 ISO)?
4

bgbs wrote:

DXO has been in hot waters since the day it scored a Nikon sensor higher than Canon.

If they rated a Canon camera higher than a Nikon or Sony....all of a sudden the Canon users would chime in that DXO is now completely accurate.

 The Davinator's gear list:
Canon EOS D30 Canon EOS 10D Nikon D2X Fujifilm X-Pro1 Nikon Z7 +16 more
(unknown member) Forum Pro • Posts: 11,521
Re: Can we trust DxOMark (after they rated their own DxO One, 1" sensor at 1657 ISO)?
2

The Davinator wrote:

bgbs wrote:

DXO has been in hot water since the day it scored a Nikon sensor higher than Canon.

If they rated a Canon camera higher than a Nikon or Sony....all of a sudden the Canon users would chime in that DXO is now completely accurate.

Why? It's not even historically accurate for Canon.

You should get off your Canon trolling, though; you do a really bad job of it.

I doubt Canon users could be as bad as Nikon users were before the D3/D700, over DX versus full frame.

cainn24 Veteran Member • Posts: 4,892
Re: They don't test raw "data"
2

rrccad wrote:

If DxO simply reads the numerical data post-decompression, then, for instance, that will not take into account floating black levels. A floating black level can be interpreted from maker notes or from row/column pairs of masked sensel values. Not accounting for a floating black level will introduce errors into calculations, including DR and SNR, and also introduce random variance into color accuracy calculations.

Therefore their scientific method introduces some random variance that DxO neither defines nor discloses in their comparisons.

How much? Probably small. However, you cannot empirically state that you gave a sensor a factual "generic" score with random variation not taken into account.

In other words this problem you are pointing to is generally not significant enough to prevent the strong correlation that I have been speaking of. You know, because it's actually there.

Again no-one claimed perfection, only value.

(By the way, would you consider broaching this topic in the PST forum? I wouldn't mind hearing from the resident experts on the matter.)

J A C S Forum Pro • Posts: 17,846
Re: Can we trust DxOMark (after they rated their own DxO One, 1" sensor at 1657 ISO)?
2

The Davinator wrote:

bgbs wrote:

DXO has been in hot water since the day it scored a Nikon sensor higher than Canon.

If they rated a Canon camera higher than a Nikon or Sony....all of a sudden the Canon users would chime in that DXO is now completely accurate.

DXO cannot be "accurate" in their scores because those scores are meaningless and the formula is secret (not that I really want to see it). On the other hand, most Canon users do not dispute their actual measurements, which kinda makes them less of DXO fanboys.

(unknown member) Forum Pro • Posts: 11,521
Re: They don't test raw "data"
2

cainn24 wrote:

In other words this problem you are pointing to is generally not significant enough

Who knows? Do you? If you are assuming that a drift in values is noise when in fact it's caused by a floating black level, then you are misinterpreting the data.

to prevent the strong correlation that I have been speaking of. You know, because it's actually there.

Any random variance not taken into account is statistically relevant. It will most certainly affect the SNR/DR and color values that DxO assigns, and thus their "sensor score" metric. How much? Who knows.

This is fact: garbage in = garbage out.

Combine this with their inability to consistently discover "cooking", or to remark on it, and it raises even more eyebrows.

nuke12 Veteran Member • Posts: 3,432
Re: Can we trust DxOMark (after they rated their own DxO One, 1" sensor at 1657 ISO)?
2

Gee guys, can't you just quote the sections of the person's message you're replying to? It's not that hard to cut out unneeded text.

I know it seems trivial but some of the posts contain 15 comments from other people that the poster is not even replying to.

Sorry, I'll butt out now, but I have been trying to follow this thread.

-- hide signature --

I'm a photo hacker. I use my expensive equipment to destroy anything in front of my camera. This is a special skill that can never be realized by low life photographers. A nurtured skill since the 1970's.

ZJ24 Contributing Member • Posts: 600
Re: No need to throw the baby out with the bathwater
1

cainn24 wrote:

DxO Labs have an extremely useful database of sensor performance characteristics. That hasn't changed. They may be leveraging their position to help make their own product successful, and they may be going about it in an unforgivably inconsistent and borderline shameless manner, but for those who know what they are looking for, how to look for it, and how to interpret it, the site is just as useful as it has always been.

Agreed, the site is extremely useful, though certainly not definitive in terms of purchasing decisions. They consider some important variables and completely ignore others.

Unforgivable and shameless: using their SuperRaw 4-image stacked ISO score for the Sports high-ISO rating. They are getting trashed for it all over their own site; no-one's fooled.
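For context on why a stacked score reads so much higher (a back-of-envelope sketch under an idealised assumption of purely random noise; the single-frame figure below is hypothetical, not a published number): averaging N frames raises SNR by the square root of N, and for a score defined by an SNR threshold that is roughly equivalent to clearing the threshold at N times the ISO, i.e. log2(N) extra stops.

import math

def stacked_iso_score(single_frame_score, n_frames):
    # Averaging n_frames raises SNR by sqrt(n_frames); for shot-noise-limited capture,
    # the same SNR threshold is then cleared at roughly n_frames times the ISO.
    return single_frame_score * n_frames

single = 415  # hypothetical single-frame score for a 1"-type sensor
for n in (1, 2, 4):
    print(f"{n} frame(s): ~{stacked_iso_score(single, n):.0f} "
          f"({math.log2(n):.1f} stops gained)")

Four-frame stacking of a 400-500 class 1"-type result lands right around the advertised 1657, which is why the Sports number reads like a stacked-mode figure rather than a single-exposure one.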

antoineb Veteran Member • Posts: 6,653
DxO are not bad, but they're (1) commercial, and (2) black box
1

Hi Tian,

I don't think that DxO are "bad".

But of course they are a business, not a charity, so when the time comes they obviously try to take care of their business. Fair enough.

And then, their whole evaluation process may be systematic, but it is very, very black box. Back in the day, at one point they had started evaluating medium format cameras, and they came in far from the top! So all serious photographers said, "come on, this is ridiculous, we know the kind of great quality they deliver in the real world!" So DxO changed their black box and, bingo, the medium format cameras were back on top!

So that's that:  possibly not very meaningful and black box, but systematic - if that's of any value.  And commercial.

Tian6869 wrote:

So, I have always used "Sports (Low-Light ISO)" on DxOMark to get a basic idea of cameras' noise levels at high ISO.

I think they are very good.

However, DxOMark recently released the DxO One and advertised it heavily on their site (e.g. pop-ups, side ads and headline news), so I took a look and found out that

DxOMark rated their own camera sensor Sports (Low-Light ISO): 1657 ISO

1 inch sensor, 20.2 MP has Sports (Low-Light ISO): 1657 ISO!!!!

So, I was shocked.

I always thought: the bigger the sensor, the more light; the smaller the sensor, the less light.

The DxO One only has a 1-inch sensor.

Looking at other 1-inch sensors:

Canon G7X Sports (Low-Light ISO): 556 ISO,

Panasonic LX100 Sports (Low-Light ISO): 553 ISO,

Panasonic FZ1000 Sports (Low-Light ISO): 517 ISO,

Sony RX100M3 Sports (Low-Light ISO): 495 ISO,

Nikon 1 J5 Sports (Low-Light ISO): 479 ISO,

Nikon 1 J4 Sports (Low-Light ISO): 426 ISO.

Not sure who made the DxO One's sensor, but DxOMark rated all the other major manufacturers way lower than their own.

I mean, not by a little, but by almost 3 TIMES (e.g. 556 vs 1657).

Now, let's look at M43:

Olympus OM-D E-M5 Mark II Sports (Low-Light ISO): 896 ISO,

Olympus E-PL7 Sports (Low-Light ISO): 873 ISO,

Panasonic GX7 Sports (Low-Light ISO): 718 ISO,

Panasonic GM5 Sports (Low-Light ISO): 712 ISO,

Now, let's look at APS-C:

Nikon D5500 Sports (Low-Light ISO): 1438 ISO

Nikon D7200 Sports (Low-Light ISO): 1333 ISO

Sony A5100 Sports (Low-Light ISO): 1347 ISO

Canon EOS 7D Mark II Sports (Low-Light ISO): 1082 ISO

As you can see, IF the DxO One's 1-inch, 20.2 MP sensor really has a Sports (Low-Light ISO) score of 1657 ISO, then such a sensor would beat the Nikon D5500's APS-C sensor.

But can it be true that there is a 1-inch sensor that can beat an APS-C sensor, which collects more light?
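A rough sanity check of the area argument above (a sketch only; it assumes equal per-area sensor efficiency, treats an SNR-threshold score as scaling linearly with sensor area, and uses nominal format dimensions):

# Nominal sensor dimensions in mm (format conventions, not exact for every model)
FORMATS = {
    '1"-type': (13.2, 8.8),
    "Four Thirds": (17.3, 13.0),
    "APS-C (Nikon DX)": (23.5, 15.6),
}

def area(dims):
    w, h = dims
    return w * h

one_inch_area = area(FORMATS['1"-type'])
claimed_score = 1657  # the DxO One Sports score quoted above

for name, dims in FORMATS.items():
    ratio = area(dims) / one_inch_area
    # Under equal per-area efficiency, the score scales roughly with sensor area
    print(f"{name:18s} area x{ratio:4.2f} -> equal-tech score would be ~{claimed_score * ratio:5.0f}")

If 1657 were a genuine single-exposure figure for a 1-inch-type sensor, the same technology scaled up to APS-C would have to score around 5000, far beyond anything in the APS-C list above, which is exactly the point about sensor area and light collection.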

 antoineb's gear list:
Panasonic Lumix DMC-FZ8 Panasonic Lumix DMC-FZ18 Panasonic Lumix DMC-ZS7 Olympus TG-610 Nikon D7000 +5 more
IchiroCameraGuy Contributing Member • Posts: 888
About as trustworthy as a coin toss
2

DXO Sports/ISO for the Canon 5DS is 2381 and the 5DS R is 2308, vs the Sony A7R at 2746... put them in the studio comparison here at RAW ISO 3200, 6400, etc. and see if it makes any sense. Just one obvious example. It isn't just recently that their results and numbers became "strange", to say it very nicely. Their lens results are not much better, in my opinion. Being a little wrong now and then is OK, but not consistently, in specific directions and cases, if that makes sense.

 IchiroCameraGuy's gear list:
Canon EOS 500D Canon EOS M Samsung NX500 Canon EF 50mm F1.8 II Canon EF-S 55-250mm f/4-5.6 IS II +2 more