Sigma SD15 and Consumer Reports Rating

I have to wonder what their testing methods were. Did they shoot jpg only? The SD14 and SD15 didn't, IMO, always produce the best ooc jpgs. If they shot RAW, did they use ACR? For a time after the SD15 was introduced, ACR did not properly support SD15 X3F files even though they could be opened and processed in ACR. That would lead to less than optimal raw conversions. The other thing is that I have always been of the opinion that Consumers Union views products as appliances. Sigma cameras are not mere appliances; it takes some effort to get the right output out of them.
 
"The SD15 got the worst image-quality scores"

The above speaks volumes. The issue is trying to define a subjective issue with objective tools. Image quality is something which isn't directly amenable to numbers. Unfortunately there exists a sub-set of the human population who persist in believing that virtually everything can be described, sorted and ranked with numbers. To do this, numbers are arbitrarily assigned to various qualities, and then these numbers are used to determine the ranking. After all, it's very easy to rank numbers. So in the end, the "numbers" are correctly ranked but sadly may have absolutely no relationship to the quality under discussion.

Lin
"Sigma SD15 is an SLR you should avoid
"Consumer Reports News: March 29, 2013

"...The SD15 got the worst image-quality scores of any of the advanced cameras we've tested...."

I am very surprised as I personally have found the SD15 and its Foveon sensor to deliver very fine photos.
Consumer Reports developed a testing procedure and a set of test metrics. That is true for all items it tests. Their goal is to provide the customer with a consistent set of information from which they can make an informed decision. They also normally give an indication of how the camera fared against each of the metrics.

What this says is that, given the test metrics that are applied to all cameras tested, the SD15 did not fare well - no more, no less.

In reality the same is true for DxOMark. Its test procedures are based on a CFA sensor. That is why they do not test a Foveon or even a Leica monochrome camera.

--
Truman
www.pbase.com/tprevatt
 
"Sigma SD15 is an SLR you should avoid
"Consumer Reports News: March 29, 2013

"...The SD15 got the worst image-quality scores of any of the advanced cameras we've tested...."

I am very surprised as I personally have found the SD15 and its Foveon sensor to deliver very fine photos.
Consumer Reports developed a testing procedure and a set of test metrics. That is true for all items it tests. Their goal is to provide the customer with a consistent set of information from which they can make an informed decision. They also normally give an indication of how the camera fared against each of the metrics.

What this says is that, given the test metrics that are applied to all cameras tested, the SD15 did not fare well - no more, no less.

In reality the same is true for DxOMark. Its test procedures are based on a CFA sensor. That is why they do not test a Foveon or even a Leica monochrome camera.
There's a very simple way to answer that: If you don't understand how to use a particular device, don't test it. "All cameras are the same, so test them all the same way." What a load of malarkey.

DPR are guilty of the same attitude in regard to Foveon-based cameras. It's a bit like saying 'all mammals are the same' and then proceeding to rate all dogs against each other; when they are asked to judge a cat, they compare it to another dog.
 
...Tom has it right; ... Consumer Reports is fine for choosing lawnmowers, ...
Are you sure, Chuck?

at :-D m14.
I like it for cars - they do a pretty good job on tracking reality.

Oftentimes their metrics don't match what a person wants to know. I liken it to coming into a small town and asking a local where there is a good place to eat. They might be able to tell you whether the dishes and silverware are clean and what type of beer they serve, but how do you know that their taste in food matches yours?
 
I read pretty much just e-books these days. I always check out the 1-star reviews first. There will just about always be some one-stars - just as there are with the Sigma cameras. I look at why someone gives a book one star to see whether their taste matches mine or not.
 
Show me some proper double blind tests where ANYONE - even those blessed with so-called "golden ears" - can tell any difference between a £300 and a £3000 amplifier, CD player, turntable, etc.

Nick

disclaimer - there is a difference between loudspeakers but not between the other components, cables being a particularly nonsensical item to spend hundreds on. James Randi put out a challenge - prize $1,000,000 - to a cable-making company to prove their claims. The prize is still unclaimed ...
 
Show me some proper double blind tests where ANYONE - even those blessed with so-called "golden ears" - can tell any difference between a £300 and a £3000 amplifier, CD player, turntable, etc.

Nick

disclaimer - there is a difference between loudspeakers but not between the other components, cables being a particularly nonsensical item to spend hundreds on. James Randi put out a challenge - prize $1,000,000 - to a cable-making company to prove their claims. The prize is still unclaimed ...
I remember back in the 1960s AR ran a blind test of their loudspeakers, where they put a chamber music quartet behind one curtain and a set of AR loudspeakers behind the other. Listeners couldn't tell the difference. I think it must have been a very successful ad campaign for them. I aspired to build speakers as good, but was always a mere amateur.

I agree with you about speakers. The distortion in even the best ones swamps any gain that might be made by using a high dollar speaker cable. Any 14AWG pair (or larger) should be just fine under almost all circumstances.
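
For anyone who wants a number behind that claim, here's a rough back-of-the-envelope sketch in Python (the run length, per-metre resistance and 8-ohm load are assumed figures for illustration, not measurements of any particular setup):

```python
# Rough estimate of how much level a plain 14 AWG speaker cable costs you
# into an 8-ohm load. All figures are assumed, purely for illustration.
import math

RESISTANCE_PER_M = 0.0083   # ohms per metre of 14 AWG copper (approx.)
RUN_LENGTH_M = 3.0          # one-way cable run (assumed)
LOAD_OHMS = 8.0             # nominal speaker impedance (assumed)

# Current flows out and back, so the cable contributes twice its one-way resistance.
cable_ohms = 2 * RUN_LENGTH_M * RESISTANCE_PER_M

# Simple voltage divider: fraction of the amplifier's voltage reaching the speaker.
fraction = LOAD_OHMS / (LOAD_OHMS + cable_ohms)
loss_db = 20 * math.log10(fraction)

print(f"cable resistance: {cable_ohms:.3f} ohm")
print(f"level at the speaker: {loss_db:.2f} dB relative to the amp terminals")
# Roughly -0.05 dB -- orders of magnitude below the distortion of any speaker.
```

Even doubling the run length only gets you to about -0.1 dB, which is why exotic cables are so hard to justify.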

I read somewhere that there is actually a very small segment of the population that really does have golden ears. I also read somewhere else that some people have four different types of cones in their eyes and can see more colors than the rest of us. I used to think I saw pretty well, but these days my detail vision is diminishing and I'm only in my early 60s. At work, I need magnifiers and microscopes to see what I used to be able to see unaided. And I used to be able to hear the HV flybacks in TV sets (15.736 kHz). Not anymore (and TVs don't have them anymore either).

Another aspect of aging that I'm having trouble with is that craft lore is not going to be retained once I retire. I'll take it with me, and the younger workforce won't be interested in learning it. Turns out this happens to every technical generation. Oh, well, this is off-topic anyway so I had better stop.
 
Show me some proper double blind tests where ANYONE - even those blessed with so-called "golden ears" - can tell any difference between a £300 and a £3000 amplifier, CD player, turntable, etc.

Nick

disclaimer - there is a difference between loudspeakers but not between the other components, cables being a particularly nonsensical item to spend hundreds on. James Randi put out a challenge - prize $1,000,000 - to a cable-making company to prove their claims. The prize is still unclaimed ...
I don't have double blind tests but... I would not be surprised if some 300 ones were better than some 3000 ones. The most ridiculous amp I read about was a 10 W monoblock costing 50,000.00, with hand-made internal silver wiring and hand-made capacitors!

On the other hand, some Martin Logan speakers drop down to 1 ohm of impedance at high frequencies, which is an extremely hard load for most amps - 300 or 3000 does not matter.

P.S. A friend of mine subscribed me to "Stereophile" a while back; I never laughed so hard as when I read it.

$15,000 speaker wires made with Teflon insulation which needed burn-in. They sounded great for the first hour, but the next 100 hours were horrible before they sounded great again.

--
http://www.flickr.com/photos/victor_gvirtsman/
 
Show me some proper double blind tests where ANYONE - even those blessed with so-called "golden ears" - can tell any difference between a £300 and a £3000 amplifier, CD player, turntable, etc.

Nick

disclaimer - there is a difference between loudspeakers but not between the other components, cables being a particularly nonsensical item to spend hundreds on. James Randi put out a challenge - prize $1,000,000 - to a cable-making company to prove their claims. The prize is still unclaimed ...
I don't have double blind tests but... I would not be surprised if some 300 ones were better than some 3000 ones. The most ridiculous amp I read about was a 10 W monoblock costing 50,000.00, with hand-made internal silver wiring and hand-made capacitors!

On the other hand, some Martin Logan speakers drop down to 1 ohm of impedance at high frequencies, which is an extremely hard load for most amps - 300 or 3000 does not matter.

P.S. A friend of mine subscribed me to "Stereophile" a while back; I never laughed so hard as when I read it.

$15,000 speaker wires made with Teflon insulation which needed burn-in. They sounded great for the first hour, but the next 100 hours were horrible before they sounded great again.
Victor, I used to be into audio in the '70s, when large Goodman cabinets with 18" speakers and reflex openings gave perfectly adequate bass for the lows in Dark Side of the Moon in my 30 ft living room. A couple of tweeters up top with a suitable cross-over thingy completed my happy perception of one of the finest albums ever made.

So, I must say that everything you've written above sounds totally insane :-D
 
Ooo... good old times. "Dark Side of the Moon" helped me survive the military education in my college; I listened to it almost every day during my 5 years there. Though my favorite was "Wish You Were Here" :-)
 
"The SD15 got the worst image-quality scores"

The above speaks volumes. The issue is trying to define a subjective issue with objective tools. Image quality is something which isn't directly amenable to numbers.
I agree with the sentiment, Lin, in that 'image quality' is mentioned here so often but I've yet to read a satisfactory definition of it. Anybody got a link?
Unfortunately there exists a sub-set of the human population who persist in believing that virtually everything can be described, sorted and ranked with numbers.
I resemble that remark. Would prefer the term "super-set" though ;-)
To do this, numbers are arbitrarily assigned to various qualities, and then these numbers are used to determine the ranking. After all, it's very easy to rank numbers. So in the end, the "numbers" are correctly ranked but sadly may have absolutely no relationship to the quality under discussion.
I would suggest that it is impossible to rank numbers for "various qualities" because "subjective" implies an individual's preferences. In other words, the ranking or weighting itself depends on the individual. For example, I rank sharpness high on the list and I am far less bothered by poor color. Others might prefer good color and not give a toss about sharpness.

And then there is the big gap between what the sensor saw and someone's final image. In other words, individuals' skill comes into play, or even editor performance, e.g. ACR's treatment of SD15 X3Fs, as already mentioned, or DP2s/DP2x files in FastStone Viewer.

Like flower shots: people shoot flowers which often have chromaticities that are outside of human vision but are captured by our fine Foveon sensors. Then people process in ProPhoto because they always do. Even if it looks OK, the temptation to reach out and crank the saturation a bit to make the colors "nicer" can be irresistible. Then they publish to the web and, guess what, a horribly over-saturated image appears for our consideration, which many people seem to like, BTW. "Wow, look at those reds!". So such images are deemed to be of high "color quality" and pleasing to the eye (but best not run a color-picker over the petals to see the numbers, eh?).

[my standard flower shot rant, sorry].
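
To put a rough number on the clipping part of that rant, here's a quick sketch (the matrices are the standard published ones, approximated, but I've skipped chromatic adaptation and gamma for simplicity, so treat the values as indicative only): a fully saturated ProPhoto red has no legal sRGB counterpart, and a naive clip just slams it onto pure red.

```python
# Sketch: what happens to a fully saturated ProPhoto red when it is pushed
# into sRGB and naively clipped. Matrices are approximate; chromatic
# adaptation (D50 -> D65) and gamma are ignored, so this is indicative only.

PROPHOTO_TO_XYZ = [  # linear ProPhoto (ROMM) RGB -> XYZ, D50 white
    [0.7977, 0.1352, 0.0314],
    [0.2880, 0.7119, 0.0001],
    [0.0000, 0.0000, 0.8252],
]
XYZ_TO_SRGB = [      # XYZ -> linear sRGB, D65 white
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def mat_vec(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

prophoto_red = [1.0, 0.0, 0.0]                   # fully saturated ProPhoto red
xyz = mat_vec(PROPHOTO_TO_XYZ, prophoto_red)
srgb_linear = mat_vec(XYZ_TO_SRGB, xyz)          # roughly [2.14, -0.23, -0.01]
clipped = [min(1.0, max(0.0, c)) for c in srgb_linear]

print("linear sRGB before clipping:", srgb_linear)
print("after clipping:", clipped)                # [1.0, 0.0, 0.0] -- pure red
# The out-of-gamut color has no legal sRGB value, so clipping collapses it
# onto the gamut boundary: "wow, look at those reds!"
```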
 
"The SD15 got the worst image-quality scores"

The above speaks volumes. The issue is trying to define a subjective issue with objective tools. Image quality is something which isn't directly amenable to numbers.
I agree with the sentiment, Lin, in that 'image quality' is mentioned here so often but I've yet to read a satisfactory definition of it. Anybody got a link?
I doubt that a satisfactory definition will ever be possible because of subjectivity. It's that very issue that goes to the heart of the problem. What makes one oil painting superior to another? Recently a Paul Gauguin painting, "When Will You Marry?" sold for three hundred million dollars!

When Will You Marry? Sold for $300,000,000 This Year!....

In a careful examination of the above painting, what makes it worth $300,000,000 is beyond my ability to comprehend. Is it ultra realistic? No. Are the colors different from those in other paintings? No. Is the subject matter something unique? No. So what we are dealing with here, IMHO, is the fact that there is only one, and it was painted by someone whose work someone else - someone who had three hundred million dollars - wanted badly enough to part with that much money. I have a friend who is an artist and, in my subjective opinion, does far better work than this, yet feels lucky to get $1500 for one of her superb oils. But then I'm not an art critic, although I made my living photographing art for many years.
Unfortunately there exists a sub-set of the human population who persist in believing that virtually everything can be described, sorted and ranked with numbers.
I resemble that remark. Would prefer the term "super-set" though ;-)
Yes, I suppose which side of the Gaussian distribution one lies on - perhaps out at the second standard deviation - is relevant here - LOL. An interesting example of subjectivism ;-)
To do this, numbers are arbitrarily assigned to various qualities, and then these numbers are used to determine the ranking. After all, it's very easy to rank numbers. So in the end, the "numbers" are correctly ranked but sadly may have absolutely no relationship to the quality under discussion.
I would suggest that it is impossible to rank numbers for "various qualities" because "subjective" implies an individual's preferences. In other words, the ranking or weighting itself depends on the individual. For example, I rank sharpness high on the list and I am far less bothered by poor color. Others might prefer good color and not give a toss about sharpness.
You're making my point precisely... Subjectivity simply isn't amenable to assigning numbers because the ranking importance of the various qualities is itself subjective. This means, I suppose, that the qualitative rank of the Sigma was correct for the reviewer but relatively meaningless for many others.
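
Just to make the arithmetic concrete, here's a toy sketch (the cameras, scores and weights are all invented for illustration): identical per-quality scores produce opposite rankings under two different weightings, and each ranking of the numbers is perfectly "correct".

```python
# Toy illustration: the same quality scores, ranked under two different
# (equally arbitrary) weightings. All numbers are invented for the example.

scores = {
    # camera: {"sharpness": ..., "color": ...} -- hypothetical 1-10 scores
    "Camera A": {"sharpness": 9, "color": 5},
    "Camera B": {"sharpness": 6, "color": 9},
}

def rank(weights):
    """Return cameras ordered by weighted total (best first), plus the totals."""
    totals = {
        cam: sum(weights[q] * s for q, s in quals.items())
        for cam, quals in scores.items()
    }
    return sorted(totals, key=totals.get, reverse=True), totals

# A reviewer who values sharpness most...
print(rank({"sharpness": 0.8, "color": 0.2}))   # Camera A "wins"
# ...and one who values color most.
print(rank({"sharpness": 0.2, "color": 0.8}))   # Camera B "wins"
# Both rankings are arithmetically correct; they simply encode different
# subjective weightings, which is the whole point.
```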
And then there is the big gap between what the sensor saw and someone's final image. In other words, individuals' skill comes into play, or even editor performance, e.g. ACR's treatment of SD15 X3Fs, as already mentioned, or DP2s/DP2x files in FastStone Viewer.
Yes...
Like flower shots: people shoot flowers which often have chromaticities that are outside of human vision but are captured by our fine Foveon sensors. Then people process in ProPhoto because they always do. Even if it looks OK, the temptation to reach out and crank the saturation a bit to make the colors "nicer" can be irresistible. Then they publish to the web and, guess what, a horribly over-saturated image appears for our consideration, which many people seem to like, BTW. "Wow, look at those reds!". So such images are deemed to be of high "color quality" and pleasing to the eye (but best not run a color-picker over the petals to see the numbers, eh?).

[my standard flower shot rant, sorry].
Not a problem! It's always fun to discuss how the human mind and brain react to the visible world and how individuals differ in their perceptions and beliefs about what constitutes "quality." Attempting to rank subjective feelings such as love, hate, desire, etc., becomes a cacophony rather than a symphony. Are these emotions linear or logarithmic? Which are more important? It is certainly possible to take one hundred people drawn randomly, ask them which of four images they prefer, and then fairly say that perhaps sixty of the one hundred chose image three. In some way we have then assigned a proper number to a subjective term, "preference." However, we still haven't a clue which of a multitude of subjective feelings went into the determination by those sixty individuals. Probably even those who chose image three couldn't accurately assess why they liked one more than another. Oh, they could say it was brighter, had more "pop," was more realistic, etc., but the bottom line is that there was something which appealed to them, and pinning that down to an objective criterion and assigning a corresponding number is where the real issue lies...

Lin
 
