Sigma SD14 & Foveon X3... too late to the party?

  • Thread starter: Barry Fitzgerald
Remember you double the pixel count to get the equivalent Foveon vs
Bayer count, so 4 MP Foveon = 8 MP Bayer. Bayer sensors seem to have
better high-ISO performance, however.
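The equivalence claim above is simple arithmetic; a minimal sketch (the 2x factor is the rule of thumb being debated in this thread, not an established constant):

```python
def bayer_equivalent_mp(foveon_spatial_mp, factor=2.0):
    """Rule-of-thumb Bayer megapixel count claimed to match a Foveon
    sensor with `foveon_spatial_mp` million spatial locations.
    `factor` is the disputed ratio; 2.0 is this thread's claim."""
    return foveon_spatial_mp * factor

# The claim as stated: a 4 MP (spatial locations) Foveon ~ 8 MP Bayer.
print(bayer_equivalent_mp(4.0))  # 8.0
```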
That's a common subjective evaluation, but it isn't fact.
You're right, it's often more than that... In fact, the SD10 has been
shown to match the 10 MP D200 for resolution.
All kinds of things have been shown. Most of them are sleight-of-hand tricks.
I count resolution as resolution that is accurate, and Sigmas
without AA filters start showing false detail at about the same
resolution, measured in pixels, as the AA Bayer cameras start to
become inconsistent.
Then you need to count the actual pixels recorded by Bayer sensors,
not the photosite count!
They're the same thing. A pixel is a spatial presence in the 2D plane. In the case of a Bayer camera, they don't all measure exactly the same thing.
For the correct pixel count you must divide the number of
photosites by at least 2.
In that case, the 10D would outresolve the SD10 by 2x in my measurements below.
And actually, Bayer sensors fail to resolve detail long before
Nyquist, recording nothing but mush, but Foveon sensors resolve
detail right down to Nyquist, and only after Nyquist do they produce
"false detail" (a continuous parallel 5 lines)... However, let's not
forget that the false detail recorded by the Foveon sensor after
Nyquist appears to be real and sharp to the eye, while to the eye the
Bayer sensor simply records mush.
Those are very charged words, but as I look at those charts, I see less real detail and more trash in the SD10 test. Too bad there wasn't a way to get a hi-res, scanned copy of the test chart that mapped, with geometric corrections, to the test results. You could subtract it from the results and see how the sensor sandwich erred. Judging from the test results, what you would see in the SD10 image is very high-contrast detail, in the wrong places to have any relationship to the test chart; with the 10D image, what you would have is mainly a failure to achieve subject contrast at the higher frequencies, creating halos, but everything pretty much in THE RIGHT PLACE. As soft as Bayer/AA images are at Nyquist, the AA filtering allows sub-pixel positioning of uncrowded detail; something impossible without an AA filter.
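One way to see why detail ends up "in the wrong place" without an AA filter: any frequency above Nyquist folds back to a lower one. A minimal one-dimensional sketch, assuming unit pixel pitch (so Nyquist is 0.5 cycles/pixel); this is an idealization, not a model of either camera:

```python
def apparent_frequency(f_cycles_per_pixel):
    """Frequency recorded when a pure tone is point-sampled at one
    sample per pixel with no anti-alias filter: the input folds back
    around integer multiples of the sampling rate."""
    f = f_cycles_per_pixel
    return abs(f - round(f))

# Below Nyquist: recorded faithfully.
print(round(apparent_frequency(0.3), 3))  # 0.3
# Above Nyquist: 0.8 cycles/pixel comes back as 0.2 cycles/pixel,
# i.e. crisp-looking "detail" at a frequency the subject never had.
print(round(apparent_frequency(0.8), 3))  # 0.2
```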
In the converging lines just to the right of the center of the
resolution tests at this forum, the SD10 "falls apart" at about
10.8, while the 10D falls apart at about 14.5, almost exactly the
pixel ratio of the cameras. There is no resolution benefit to the
SD10, compared to Bayer, for luminance, if we're talking about
accurate resolution.
Of course there is... but you won't see it using B&W targets.
You won't see it in very many real-world situations at all. Most of the subjects in which the Sigma pixel would excel are things that are hard to look at, because they flash the retina. The cover of Uriah Heep's "The Magician's Birthday" comes to mind. For such subjects, I might choose to photograph with an SD10 over a 10D.

I've copied and pasted into PS many images that supposedly showed the great color resolution of the Sigmas, and when I convert to LAB mode, viewing at 100%, viewing all channels but selecting only a and b for editing, and pixellate a and b to 2x2 tiles, I can't see much of anything change. I have to go to 3x3 before I can see much of a difference. Ditto for gaussian blur; I have to get up around 1 to 1.2 pixels before I really start seeing any loss of definition at color edges.
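That Photoshop exercise is essentially chroma subsampling, the same idea JPEG's 4:2:0 mode exploits. A rough numpy sketch of the mechanics, using a crude luma/chroma split as a stand-in for real CIELAB (the transform and the 8x8 random data are illustrative assumptions only):

```python
import numpy as np

rng = np.random.default_rng(0)
rgb = rng.random((8, 8, 3))          # stand-in image data

# Crude opponent split: one luma plane, two chroma-difference planes.
# (Real CIELAB is a nonlinear transform; this is only illustrative.)
luma = rgb.mean(axis=2)
chroma = rgb[..., :2] - luma[..., None]

# "Pixellate to 2x2 tiles": average each 2x2 block of the chroma
# planes, then repeat each block average back out to full resolution.
tile_means = chroma.reshape(4, 2, 4, 2, 2).mean(axis=(1, 3))
chroma_2x2 = np.repeat(np.repeat(tile_means, 2, axis=0), 2, axis=1)

# Full-resolution luma survives untouched; only chroma is coarsened,
# which is the point of the experiment described above.
assert chroma_2x2.shape == chroma.shape
```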
There's no free lunch when you omit an AA
filter, just a bunch of broken bones to give the impression of
extra detail.
You need to go back to school!
I would, but I don't want to give a valedictory speech again. Once is enough, and besides, I get really annoyed when A+ grades don't raise my GPA above 4.0.

--
John

 
Interesting response between you two. Despite the fact that I'm in the engineering field (Civil), I have little knowledge of the engineering aspects of digital photography. I prefer to let the "artistic" side of me dominate. I don't look at charts and graphs. I just look at the image as a whole and see how it "hits" me. I couldn't care less how this is achieved. All images are compromises and far less than accurate representations of the original. That said, when all is said and done, which produces better images, Bayer or Foveon?
--
Tom

http://www.flickr.com/photos/25301400@N00/
 
Interesting response between you two. Despite the fact that I'm in
the engineering field (Civil), I have little knowledge of the
engineering aspects of digital photography. I prefer to let the
"artistic" side of me dominate. I don't look at charts and graphs.
I just look at the image as a whole and see how it "hits" me. I
couldn't care less how this is achieved. All images are compromises
and far less than accurate representations of the original. That
said, when all is said and done, which produces better images,
Bayer or Foveon?
--
Tom

http://www.flickr.com/photos/25301400@N00/
I'm afraid I have even fewer credentials than you, but I agree with your post.

Part of the problem is this measuring of pixels. Sigma caught such heat because they claimed 10 million, but personally I don't "blame" them.

You simply can't compare the two systems on a pixel-by-pixel basis. If ever the expression "apples and oranges" held value, this is it.

I LIKE the Sigma shots better. BUT I need more resolution than the old Sigmas provided. The rough approximation of doubling pixels seems fair when it comes to resolving detail.

So, if the new Sigma is 4, that would still result in too little detail compared to the high MP Bayers. On the other hand, 6 or better, 8, would have me running down to B&H to get one.

Dave
 
That said, when all is said and done, which produces better images,
Bayer or Foven?
Sensor technologies don't produce images, people do.

Even then we cannot compare generic technologies like CFA vs. X3 because:
  • these are implemented in physical devices with practical cost and other engineering compromises
  • which are put into cameras with yet other compromises
  • and the whole chain from scene to image also includes lenses, exposure, camera support, and most importantly, processing (conversion, scaling, upsizing, etc.)
  • then you have to choose the output media: screen viewing vs. print making (and at what size). Of course the processing chain for print making adds a whole host of other variables.
  • and there are different scene types and different styles of photography.
  • and finally, you have to define "better". The argument over the behavior beyond the normal Nyquist limit shows how subjective this is.
--
Erik
 
John Sheehy wrote:
snip
The Sigmas, by their very design, capture data like this.
Softening it (or failing to sharpen it) in software does not remove
the "snap edge to grid" effect, it only softens those edges.

I am inclined to believe that people who like this effect are
suffering from some kind of visual disorder. I can even see it in
downsampled Sigma images. They scream, PIXELS! PIXELS! PIXELS! ...
not "SUBJECT DETAIL!". I see the same thing in the Kodak 14x
series, but they are even worse, with color moire. Fortunately,
the Kodaks have so many pixels that the artifacts are small
compared to the image size.

Of course, Bayer/AA images downsized with the nearest-neighbor
algorithm can show a similar effect, but that is a PP option, not
part of the capture.

The Sigmas have approximately the same resolution for greyscale or
low-saturation subjects as a Bayer with an AA filter and the
same number of pixels. The 2x equivalence stuff is just pure
nonsense. The extra "detail" of the Sigmas (other than saturated
primary color resolution) is FALSE detail that is not a spatially
correct part of the subject, but an artifact of the aliased capture.
I think you don't really understand the definition of "resolution" as applied to optical physics. Maybe you should have another look at the resolution charts. Look at both Phil's and the charts done by Outback Photo for color.
Pay particular attention to the lines per image height numbers.

Just to save you some time, let me tell you that the Sigma SD9/SD10 measure 1550 lines horizontal and 1550 lines vertical per image height from Phil's own resolution charts. Now spend some time looking at figures for 3.4 megapixel Bayer cameras - take your choice, it doesn't matter. Then come back and revisit your comments above and justify them from an objective perspective.

Lin
 
We are merely speculating. Err, using logic of course...
Analyzing available data and forming informed opinions ...
When I wrote that "I was covering myself," I included such reliable scientific data as the conjunctions of the planets - and, taking this another step further, mentioned that Pluto was no longer a planet.

Man, I took care of every possibility. There is no excuse left uncovered. I have an out for every possible contingency - with the sole exception of Koni, who has anticipated all of us with a full-featured review, based on hands-on experience, not only with the SD14, but the SD15 through SD20.

Dave
 
Are you referring to "stair stepping" along the edges of objects
where those edges are not purely vertical or horizontal?... If so,
then I wouldn't worry; it is curable, as it's simply a result of
over-sharpening.
Not necessarily. The "stairstepping" is typically one of two things: aliasing of high-contrast lines/edges, or the inappropriate viewing of images at on-screen magnifications greater than 100% pixels using a nearest-neighbor uprezzing algorithm.

The first is a real image quality issue and is most visible with certain man-made objects with edges near vertical or horizontal. It is not a sharpening artifact, though it may be enhanced by additional sharpening. The second is not a real image quality issue at all. It is a viewing method problem.

--
Jay Turberville
http://www.jayandwanda.com
 
I think you don't really understand the definition of "resolution"
as applied to optical physics. Maybe you should have another look
at the resolution charts. Look at both Phil's and the charts done
by Outback Photo for color.
Pay particular attention to the lines per image height numbers.
I never argued that the Sigmas don't resolve saturated primary colors better per pixel than Bayer cameras.
Just to save you some time, let me tell you that the Sigma SD9/SD10
measure 1550 lines horizontal and 1550 lines vertical per image height
from Phil's own resolution charts.
And you don't notice something strange about that? The fact that a sensor with 1512 lines can resolve 1550? How about if there were 2/3 times as many lines in the chart as there are lines in the sensor? How would that resolve? What if there were 3 times as many lines as there are lines in the sensor? Light-grey, dark-grey, light-grey, dark-grey at 1/3 the actual frequency. What would it be resolving?
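The "3 times as many lines" thought experiment can be checked numerically. Assuming the 1512-row vertical count cited in this thread and ideal point sampling (frequencies fold around whole multiples of the sampling rate), a chart pattern at three lines per pixel comes back at exactly one third its true frequency:

```python
rows = 1512                          # sensor rows, as cited in-thread
chart_lines = 3 * rows               # thought experiment: 4536 chart lines
f_chart = (chart_lines / rows) / 2   # 1.5 cycles/pixel (2 lines per cycle)

# Point sampling at 1 sample/pixel folds f around the nearest integer.
f_seen = abs(f_chart - round(f_chart))   # 0.5 cycles/pixel
lines_seen = f_seen * 2 * rows           # 1512 lines: 1/3 of the chart
print(chart_lines, int(lines_seen))      # 4536 1512
```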

Your antiquated definition of resolution assumes randomly-sized and -placed film grain which doesn't alias, where any discernible line patterns are real. If discernible lines in the result are resolution in the digital world, then the powers that be are asleep at the wheel as the definition of resolution goes crashing down a flight of big, jagged stairs.
Now spend some time looking at
figures for 3.4 megapixel bayer cameras - take your choice, it
doesn't matter. Then come back and revisit your comments above and
justify them from an objective perspective.
I was looking at the very charts which I believe he derived those figures from, and I came to a completely different conclusion.

--
John

 
Interesting response between you two. Despite the fact that I'm in
the engineering field (Civil) I have little knowledge of the
engineering aspects of digital photography. I prefer to let the
"artistic" side of me dominate. I don't look at charts and graphs.
I just look at the image and as a whole and see how it "hits" me. I
could care less how this is achieved. All images are compromises
and far less than accurate representations of the original. That
said, when all is said and done, which produces better images,
Bayer or Foven?
Foveon with an anti-alias filter, and the color discrimination problem solved, with the same number of pixels, undoubtedly. We don't have that, though, so we have Sigma with no color moire but image aliasing, and Bayer with reduced high-saturation primary color resolution.

--
John

 
I think you don't really understand the definition of "resolution"
as applied to optical physics. Maybe you should have another look
at the resolution charts. Look at both Phil's and the charts done
by Outback Photo for color.
Pay particular attention to the lines per image height numbers.
I never argued that the Sigmas don't resolve saturated primary
colors better per pixel than Bayer cameras.
Just to save you some time, let me tell you that the Sigma SD9/SD10
measure 1550 horizontal and 1550 lines vertical per image height
from Phil's own resolution charts.
And you don't notice something strange about that? The fact that a
sensor with 1512 lines can resolve 1550? How about if there were
2/3 times as many lines in the chart as there are lines in the
sensor? How would that resolve? What if there were 3 times as
many lines as there are lines in the sensor? Light-grey dark-grey
light-grey - dark grey at 1/3 the actual frequency. What would it
be resolving?
"Sensors with 1512 lines"??? The last time I looked at Foveon and Bayer sensors, they had photosites, not lines. What in the name of heaven are you talking about?

You seriously need to take a refresher course on both optical physics and sensor technology.
Your antiquated definition of resolution assumes randomly-sized and
-placed film grain which doesn't alias, where any discernible line
patterns are real. If discernible lines in the result are
resolution in the digital world, then the powers that be are asleep
at the wheel as the definition of resolution goes crashing down a
flight of big, jagged stairs.
Ugh, it's not my definition of "resolution," and it assumes nothing; in the relevant sense it simply reveals how many black and how many white distinctly visible lines can be visually detected at a numerically referenced point on a photographed resolution chart by a human with ordinary eyesight. The test is the same for all the cameras tested and furnishes a reliable means of ranking them for greyscale resolution. The undeniable facts are that the Sigma SD9/SD10 distinctly reveal the 9 visible line pairs next to the referenced point, which corresponds to 1550 lines per image height. The undeniable facts are that Bayer-filtered sensors with virtually identical optics attached and having 3.4 million photosites (which are not lines but rather discrete luminance sensors) fall far short of being able to reveal distinct lines at anywhere near 1550 on the referenced scale.

You can obfuscate this with nonsensical terminology and spurious questions till the cows come home but the facts remain for any objective observer to witness.

Lin
Now spend some time looking at
figures for 3.4 megapixel bayer cameras - take your choice, it
doesn't matter. Then come back and revisit your comments above and
justify them from an objective perspective.
I was looking at the very charts which I believe he derived those
fiogures from, and I come to a completely different conclusion.

--
John

 
And you don't notice something strange about that? The fact that a
sensor with 1512 lines can resolve 1550?
No. It simply means that reading the resolution chart involves a small degree of error and subjectivity. I don't think a 2.5% error is much to complain about.
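For the record, the 2.5% figure checks out, assuming the 1512-row vertical count cited elsewhere in the thread:

```python
sensor_rows = 1512      # vertical photosite rows cited for the SD9/SD10
chart_reading = 1550    # lines per image height read off the chart
error = (chart_reading - sensor_rows) / sensor_rows
print(f"{error:.1%}")   # 2.5%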
How about if there were
2/3 times as many lines in the chart as there are lines in the
sensor? How would that resolve? What if there were 3 times as
many lines as there are lines in the sensor? Light-grey dark-grey
light-grey - dark grey at 1/3 the actual frequency. What would it
be resolving?
You can run Imatest MTF tests on the slant edges of the charts if you like. Yes, there are lots of ways to consider resolution. The nice thing about the DPReview tests is that the methodology is fairly consistent and you have access to the full-resolution JPEG originals.
Your antiquated definition of resolution assumes randomly-sized and
-placed film grain which doesn't alias, where any discernible line
patterns are real. If discernible lines in the result are
resolution in the digital world, then the powers that be are asleep
at the wheel as the definition of resolution goes crashing down a
flight of big, jagged stairs.
Not really. In the real world we seldom ask cameras to resolve fine, nearly parallel lines that are rendered near parallel to our sensor's photodiode rows at spatial frequencies approaching the sensor's Nyquist limit. High-contrast detail such as this is close to a worst-case scenario for the X3 sensor from the standpoint of aliasing. The bigger problem with the resolution tests is that they don't communicate MTF response and other factors that influence image quality and perception.

--
Jay Turberville
http://www.jayandwanda.com
 
"sensors with 1512 lines"??? The last time I looked at Foveon and
Bayer sensors, they had photosites, not lines. What in the name of
heaven are you talking about?
You didn't know that the photosites in the Foveon are in a grid; that they fall into rows and columns?
You seriously need to take a refresher course on both optical
physics and sensor technology.
Academia is usually several years behind real-world technology. There is no such course, and you need it more than I do.
Your antiquated definition of resolution assumes randomly-sized and
-placed film grain which doesn't alias, where any discernible line
patterns are real. If discernible lines in the result are
resolution in the digital world, then the powers that be are asleep
at the wheel as the definition of resolution goes crashing down a
flight of big, jagged stairs.
Ugh, it's not my definition of "resolution"
You subscribe to it.
and it assumes nothing
It assumes that the same interpretation that applies to film can apply to digital without AA filters.
but in the relevant sense simply reveals how many black and how
many white distinctly visible lines can be visually detected at a
numerically reference point on a photographed resolution chart by a
human with ordinary eyesight. The test is the same for all the
cameras tested and furnishes a reliable means of ranking them for
greyscale resolution. The undeniable facts are that the Sigma
SD9/SD10 distinctly reveal the 9 visible line pairs next to the
referenced point which corresponds to 1550 lines per image height.
And which point would that be? I see the SD10 starting to fall apart at about 10.8. After that, each line starts morphing into 2 ridges that separate further yet as it pops into 7 lines, and goes through the same process to pop into 5 lines at the end. The 10D goes from consistently-contrasted 9 lines to inconsistent at about 14.5. The 10D fades from 9 lines to 9 inconsistent lines to no lines at all, as it should be.
The undeniable facts are that Bayer filtered sensors with virtually
identical optics attached and having 3.4 million photosites (which
are not lines but rather discrete luminance sensors) fall far short
of being able to reveal distinct lines at anywhere near 1550 on the
referenced scale.
What good are distinct lines if they are not the lines from the original chart? They are GARBAGE; not resolution.
You can obfuscate this with nonsensical terminology and spurious
questions till the cows come home but the facts remain for any
objective observer to witness.
I don't care how you or Phil interpret the chart. The chart says to me that both cameras start to deteriorate at about the same resolution, at the pixel level. You can talk about "resolution" and your religious-cult mentality worshipping glitches in the terminology machine, and I will talk about "accurate resolution".

--
John

 
"sensors with 1512 lines"??? The last time I looked at Foveon and
Bayer sensors, they had photosites, not lines. What in the name of
heaven are you talking about?
You didn't know that the photosites in the Foveon are in a grid;
that they fall into rows and columns?
You seriously need to take a refresher course on both optical
physics and sensor technology.
Academia is usually several years behind real world technology.
There is no such course, and you need it more than I do.
Well it took me all of about 10 seconds to find one at MIT:

http://ocw.mit.edu/OcwWeb/Physics/8-422Spring-2005/CourseHome/index.htm

So much for being wrong about that....

snip
And which point would that be? I see the SD10 starting to fall
apart at about 10.8. After that, each line starts morphing into 2
ridges that separate further yet as it pops into 7 lines, and goes
through the same process to pop into 5 lines at the end. The 10D
goes from consistently-contrasted 9 lines to inconsistent at about
14.5. The 10D fades from 9 lines to 9 inconsistent lines to no
lines at all, as it should be.
When your opinion is out of touch with the rest of the testing world, maybe it's time for some introspection? Everyone else disagrees with your assessment, maybe you should apply to MIT to "teach" the course - LOL
The undeniable facts are that Bayer filtered sensors with virtually
identical optics attached and having 3.4 million photosites (which
are not lines but rather discrete luminance sensors) fall far short
of being able to reveal distinct lines at anywhere near 1550 on the
referenced scale.
What good are distinct lines if they are not the lines form the
original chart? They are GARBAGE ; not resolution.
In your considered opinion of one...
You can obfuscate this with nonsensical terminology and spurious
questions till the cows come home but the facts remain for any
objective observer to witness.
I don't care how you or Phil interpret the chart. The chart says
to me that both cameras start to deteriorate at about the same
resolution, at the pixel level. You can talk about "resolution"
and your religious-cult mentality worshipping glitches in the
terminology machine, and I will talk about "accurate resolution".
Yes, it's quite obvious that Phil and I and the rest of the reviewers are wrong and that you are right! Now it just so happens that I own the SD10, owned the SD9, own a 10D, a 1D, a 1DS, a 1D Mark II, a DCS-760 Kodak with removable AA filter and a Nikon D2X. What's your personal experience with the technology to warrant your assertions?

LOL

Lin
 
And you don't notice something strange about that? The fact that a
sensor with 1512 lines can resolve 1550?
No. It simply means that reading the resolution chart involves a
small degree of error and subjectivity. I don't think a 2.5% error
is much to complain about.
Nobody should go around saying a sensor with 1512 lines resolves 1550. It's very simple. If 1550 is the result, then the math needs to be redone.

Also, anyone who saw the chart with their own eyes, and saw it go from 9 lines down to 5 should know that anything that isn't 9 lines isn't RESOLUTION OF THE SUBJECT, but an artifact. It is absurd to call this resolution.
How about if there were
2/3 times as many lines in the chart as there are lines in the
sensor? How would that resolve? What if there were 3 times as
many lines as there are lines in the sensor? Light-grey dark-grey
light-grey - dark grey at 1/3 the actual frequency. What would it
be resolving?
You can run Imatest MTF tests on the slant edges of the charts if
you like. Yes, there are lots of ways to consider resloution. The
nice thing about the DPReview tests is that the methodology is
fairly consistent and you have access to the full resolution JPEG
originals.
I do, and that is why I can see that most people are interpreting this data in such a way as to support their optical naivety.
Your antiquated definition of resolution assumes randomly-sized and
-placed film grain which doesn't alias, where any discernible line
patterns are real. If discernible lines in the result are
resolution in the digital world, then the powers that be are asleep
at the wheel as the definition of resolution goes crashing down a
flight of big, jagged stairs.
Not really.
Yes, really. If 5 lines where 9 should be are considered successful resolution, all reason has gone to hell in a bucket. This is insane.
In the real world we seldom ask cameras to resolve
fine, nearly parallel lines that are rendered near parallel to our
sensor's photodiode rows at spatial frequencies approaching the
sensor's Nyquist limit. High contrast detail such as this is close
to a worst case scenario for the X3 sensor from the standpoint of
aliasing.
No. Every Sigma (this has absolutely NOTHING to do with Foveon, except that they make the sensors that Sigma refuses to put an AA filter on) has the same aliasing with very sharp optics, even in pictures of grains of sand, grass, rock walls, etc. It is simply more organized and obvious in line resolution tests. Make up your mind; is the resolution (1550/1512 lines) justified, or not? If it is, then this is not a worst case scenario for the Sigma; it is the Sigma showing off its power. If it is not, then why are you disagreeing with me?
The bigger problem with the resolution tests is that
they don't communicate MTF response and other factors that
influence image quality and perception.
If you're getting aliasing, then obviously MTF issues are not a major factor. A soft enough lens would have run the lines grey on the SD10 just like the 10D.

--
John

 
"sensors with 1512 lines"??? The last time I looked at Foveon and
Bayer sensors, they had photosites, not lines. What in the name of
heaven are you talking about?
You didn't know that the photosites in the Foveon are in a grid;
that they fall into rows and columns?
You seriously need to take a refresher course on both optical
physics and sensor technology.
Academia is usually several years behind real world technology.
There is no such course, and you need it more than I do.
Well it took me all of about 10 seconds to find one at MIT:

http://ocw.mit.edu/OcwWeb/Physics/8-422Spring-2005/CourseHome/index.htm

So much for being wrong about that....
He's wrong in multiple dimensions. As you pointed out, optical physics courses are easy to find. I had a lovely one at Oakland.

But sensor technology courses are also pretty easy to find. There's one of those at MIT, too. But if you change one letter, it's RIT that's the king. ;)

--
The Pistons led the NBA, and lost in the playoffs.
The Red Wings led the NHL, and lost in the playoffs.

It's up to the Tigers now...
Leading the league, and going all the way!

Ciao!

Joe

http://www.swissarmyfork.com
 
Hi Joe,

Yep, Rochester seems to be the "cat's meow" for photography-related technology these days. God knows they need something in that area since Kodak has been downsizing....

Best regards,

Lin
 
Nobody should go around saying a sensor with 1512 lines resolves
1550. It's very simple. If 1550 is the result, then the math
needs to be redone.
What math? It is a direct reading from a chart. A reading that has a limit to its precision. I wouldn't be surprised to find that there is some standard that goes along with the chart that defines how to read it and the expected degree of error. Like I said, the read value has an error of 2.5%. Big deal.
Also, anyone who saw the chart with their own eyes, and saw it go
from 9 lines down to 5 should know that anything that isn't 9 lines
isn't RESOLUTION OF THE SUBJECT, but an artifact. It is absurd to
call this resolution.
Did I miss something? Who is calling it resolution?
You can run Imatest MTF tests on the slant edges of the charts if
you like. Yes, there are lots of ways to consider resloution. The
nice thing about the DPReview tests is that the methodology is
fairly consistent and you have access to the full resolution JPEG
originals.
I do, and that is why I can see that most people are interpreting
this data in such a way as to support their optical naivety.
Well, the fact is that most people tend to interpret information so that it fits their existing view of the world. That's just human nature and we all need to be careful of it.
Your antiquated definition of resolution assumes randomly-sized and
-placed film grain which doesn't alias, where any discernible line
patterns are real. If discernible lines in the result are
resolution in the digital world, then the powers that be are asleep
at the wheel as the definition of resolution goes crashing down a
flight of big, jagged stairs.
Not really.
Yes, really. If 5 lines where 9 should be are considered
successful resolution, all reason has gone to hell in a bucket.
This is insane.
You keep arguing about the 5 lines. I've not mentioned the five lines. I don't think Lin has. I'm pretty sure we agree that the resolution limit is right around 1550 LPH. The thing that appears insane to me is your harping about five lines when nobody is arguing the point.

I agree that the five lines is obviously an incorrect rendering of the detail. So too is the gray mush of the CFA sensors with AA filters. The only real debate here is which form of artifact is more pleasing for a particular subject.
In the real world we seldom ask cameras to resolve
fine, nearly parallel lines that are rendered near parallel to our
sensor's photodiode rows at spatial frequencies approaching the
sensor's Nyquist limit. High contrast detail such as this is close
to a worst case scenario for the X3 sensor from the standpoint of
aliasing.
No. Every Sigma (this has absolutely NOTHING to do with Foveon,
except that they make the sensors that Sigma refuses to put an AA
filter on) has the same aliasing with very sharp optics, even in
pictures of grains of sand, grass, rock walls, etc.
And who said that it didn't? Not me. I said it was a worst-case scenario. I thought it would be understood that this was from the standpoint of the visibility of the artifacts. Errors that aren't visible are generally relatively unimportant to photographers.
It is simply
more organized and obvious in line resolution tests.
Exactly. And that is one reason why the SD10 would not be my first choice for subjects with lots of straight lines. Especially if they were nearly vertical or horizontal. In those cases, the errors would be more likely to be noticed. In other subjects, they would tend to go unnoticed and the better MTF response by virtue of not using an AA filter can be enjoyed.
Make up your
mind; is the resolution (1550/1512 lines) justified, or not?
Yes it is. Have I ever said otherwise?
If it
is, then this is not a worst case scenario for the Sigma; it is the
Sigma showing off its power. If it is not, then why are you
disagreeing with me?
Because I don't see how the presence of aliasing causes the definition of resolution to come "crashing down a flight of big, jagged stairs."

I see two issues (three actually). Aliasing, resolution and MTF response. They can be considered separately.
The bigger problem with the resolution tests is that
they don't communicate MTF response and other factors that
influence image quality and perception.
If you're getting aliasing, then obviously MTF issues are not a
major factor. A soft enough lens would have run the lines grey on
the SD10 just like the 10D.
The point is that the MTF response up to the Nyquist limit is almost certainly better for the SD10 because it has no AA filter. This contributes to sharpness, which is often confused with the absolute limits of resolution. The SD10 trades aliasing for MTF response. This is neither right nor wrong. It is a judgement call as to which is a better tradeoff. Which is better is subjective to some degree and also subject dependent.

These Imatest SFR plots show the SD10's much better MTF response up to its Nyquist limit. This plot is of the nearly vertical slant edge. The horizontal slant edge is much more evenly matched between the two.
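The tradeoff can be illustrated with idealized transfer functions: a square pixel aperture alone has MTF |sinc(f)|, and an AA filter multiplies in an additional low-pass term. A second sinc is used here as a crude stand-in for a real birefringent filter; this is a sketch, not Imatest's slanted-edge measurement:

```python
import numpy as np

def mtf_aperture(f):
    """Idealized MTF of a square pixel aperture, f in cycles/pixel.
    np.sinc(x) computes sin(pi*x)/(pi*x)."""
    return np.abs(np.sinc(f))

def mtf_with_aa(f, spread=1.0):
    """Same aperture with a crude one-pixel AA blur multiplied in."""
    return mtf_aperture(f) * np.abs(np.sinc(spread * f))

nyquist = 0.5
print(round(float(mtf_aperture(nyquist)), 3))  # ~0.637 without AA filter
print(round(float(mtf_with_aa(nyquist)), 3))   # ~0.405 with AA filter
```

The no-AA response stays higher all the way to Nyquist, which reads as extra "snap"; the cost is that everything above Nyquist folds back as aliasing instead of being attenuated.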



--
Jay Turberville
http://www.jayandwanda.com
 
Academia is usually several years behind real-world technology.
There is no such course, and you need it more than I do.
Well, it took me all of about 10 seconds to find one at MIT:
So much for being wrong about that....
I missed the part about issues of resolution of targets without AA filters.

What page is that on?
snip
And which point would that be? I see the SD10 starting to fall
apart at about 10.8. After that, each line starts morphing into 2
ridges that separate further yet as it pops into 7 lines, and goes
through the same process to pop into 5 lines at the end. The 10D
goes from 9 consistently-contrasted lines to inconsistent lines at
about 14.5, and then fades to no lines at all, as it should.
When your opinion is out of touch with the rest of the testing
world, maybe it's time for some introspection?
There are lots of people who agree with me. For whatever reasons, they are not in this particular thread and responding right now.

"The world", right now, is a thread dominated by Sigma users and fanatics, or people who happen to like the Sigmas' high-frequency distortion. Like it or not, it is distortion, not accuracy.

The problem with your position is that you can't even honestly say to yourself and to me that yes, there is a difference between accurate resolution and having a number of lines in the result; you want the resolution to be full, and genuine, so there is no room for any admission otherwise. You have no vision into a bigger picture that engulfs all aspects of the situation; just a mantra that keeps you and your dreams of AA-less photography safe.
Everyone else
disagrees with your assessment, maybe you should apply to MIT to
"teach" the course - LOL
Oh yeah, everything taught in school is correct - since what year? You know, the year when all the facts came out about everything, and all the professors had no choice but to adopt the same viewpoints? How lucky we are that this has happened in our lifetimes!

And frankly, you don't even know what they teach in that course. Most likely, they teach that all good sampling requires rejection of frequencies above the Nyquist limit, and a roll-off below it. Every academic or scientific reference to sampling I've seen says this.
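The textbook claim is easy to check numerically (a sketch of standard sampling theory, not of any particular sensor): once a signal above Nyquist has been sampled without prefiltering, its samples are literally identical to those of a lower-frequency signal, so nothing downstream can tell them apart.

```python
import numpy as np

# Standard sampling-theory sketch (not specific to any camera): samples
# of a 7 Hz cosine taken at 8 samples/s (Nyquist = 4 Hz) are identical
# to samples of a 1 Hz cosine, so the sampler cannot distinguish them.
fs = 8.0
t = np.arange(16) / fs               # 16 sample instants
high = np.cos(2 * np.pi * 7.0 * t)   # above Nyquist
low = np.cos(2 * np.pi * 1.0 * t)    # its alias: fs - 7 = 1 Hz
print(np.allclose(high, low))        # -> True
```

This is why the rejection has to happen before sampling: after the fact, the ambiguity is baked in.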
What good are distinct lines if they are not the lines from the
original chart? They are GARBAGE; not resolution.
In your considered opinion or one.....
They are not the subject. They are not in the right space, spatial-weight-wise. They are not consistent. They are a distortion of the subject, regardless of whether or not you like the look.
I don't care how you or Phil interpret the chart. The chart says
to me that both cameras start to deteriorate at about the same
resolution, at the pixel level. You can talk about "resolution"
and your religious-cult mentality worshipping glitches in the
terminology machine, and I will talk about "accurate resolution".
Yes, it's quite obvious that Phil and I and the rest of the
reviewers are wrong and that you are right! Now it just so happens
that I own the SD10, owned the SD9, own a 10D, a 1D, a 1DS, a 1D
Mark II, a DCS-760 Kodak with removable AA filter and a Nikon D2X.
What's your personal experience with the technology to warrant your
assertions?
I've seen the Sigma images. They have false high-res detail. I see lines and grids where lines and grids shouldn't be. I see an "E" character that has a vertical element in one instance, and is missing it in another.

Imagine a digital audio system that took the area under the curve from an analog input for 90% of discrete segments of time, and created a single level for it in the output, which had no filtering. Distortion is what you would get. It would sound brighter and louder, though, wouldn't it? Any analogies come to mind?
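That analogy can be made concrete (my own rough simulation, with invented numbers): an "area under the curve" sampler that averages over 90% of each sample period attenuates above-Nyquist input, but does not remove it, so a false low-frequency component survives in the output.

```python
import numpy as np

# Rough numerical version of the audio analogy (my construction, with
# invented numbers): a sampler that averages the input over 90% of each
# sample period, with no band-limiting filter, attenuates above-Nyquist
# content but does not reject it -- so it still folds down as distortion.
fs = 8.0                 # sample rate; Nyquist = 4 Hz
n = 16                   # number of output samples
oversample = 1000        # fine grid used to approximate the averaging
f_in = 7.0               # input tone, above Nyquist

t_fine = np.arange(n * oversample) / (fs * oversample)
signal = np.sin(2 * np.pi * f_in * t_fine)

# Average over 90% of each sample period ("area under the curve"):
aperture = int(0.9 * oversample)
samples = np.array([signal[i * oversample:i * oversample + aperture].mean()
                    for i in range(n)])

# The output spectrum peaks at the 1 Hz alias (fs - f_in), not at 7 Hz:
spectrum = np.abs(np.fft.rfft(samples))
print(spectrum.argmax())  # -> 2, i.e. 1 Hz (bin spacing fs/n = 0.5 Hz)
```

The averaging aperture acts as a weak sinc-shaped low-pass, so the alias is reduced in level, but it is still there, exactly like a pixel aperture without an AA filter.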

I'm not the only person that looks at Sigma images taken with sharp optics and sees distortion.

--
John

 
