235653929134718976 pixels

Roland Karlsson

The SD10 has 3x2^12 color values per site and 3.4 MSites,
therefore it really MUST be a ...
2268 x 1512 x 4096 x 4096 x 4096 = 235,653,929,134,718,976 pixel
... camera.

Sorry about that. I am getting tired of people trying to upgrade
this 3.4 Mpixel camera by counting each pixel 3 times. Why
not take the color depth into account? If more color samples
increase the resolution, then so must greater color depth.

Roland
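
For anyone who wants to check the tongue-in-cheek arithmetic, here is a minimal Python sketch (the 2268 x 1512 frame size is taken from the figures above; the last line is the count that actually describes spatial resolution):

    # Multiply the spatial sites by every possible 12-bit value in each
    # of the three color channels -- the deliberately absurd count above.
    width, height = 2268, 1512       # SD10 output dimensions
    levels = 2 ** 12                 # 4096 values per 12-bit channel
    channels = 3                     # three stacked color layers

    absurd = width * height * levels ** channels
    print(f"{absurd:,}")             # 235,653,929,134,718,976

    # The count that describes spatial resolution is just width x height.
    print(f"{width * height:,}")     # 3,429,216 -> ~3.4 Mpixels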
 
Good points. The problem obviously arose when the method for classifying digital image quality was first formulated. No one considered that the Foveon system would toss the one-dimensional "pixel formula" out the window, nor can anyone be blamed for it. It's like classifying an automobile's power by referring to the number of cylinders in the engine.

Anyone out there wanna develop a new set of parameters? How about one that factors total surface pixels against the percentage of interpolation used to correctly resolve an image?
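
Purely as an illustration of that idea (this is not an established metric, and the name effective_mpixels is hypothetical), one could weight the pixel count by the fraction of the three color samples per pixel that are measured rather than interpolated:

    # Hypothetical "effective megapixels": spatial pixels weighted by the
    # fraction of color samples per pixel that are measured, not interpolated.
    # Illustrative only -- not an established or agreed-upon metric.
    def effective_mpixels(spatial_mpixels: float, measured_samples_per_pixel: float) -> float:
        return spatial_mpixels * (measured_samples_per_pixel / 3.0)

    print(effective_mpixels(6.0, 1.0))   # 6 MP Bayer: 1 of 3 samples measured -> 2.0
    print(effective_mpixels(3.4, 3.0))   # 3.4 MP Foveon: all 3 measured -> 3.4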

--
'If they're not screaming at you to get out of the way, you're not close enough'
 
> Good points. The problem obviously arose when the method for
> classifying digital image quality was first formulated. No one
> considered that the Foveon system would toss the one-dimensional
> "pixel formula" out the window, nor can anyone be blamed for it.
> It's like classifying an automobile's power by referring to the
> number of cylinders in the engine.
This is not entirely correct. The first digital color still pictures
of any quality were scanned, and a scanned picture's size
follows the simple formula: Mpixels is the number of 3-color
sites. The same goes for color picture files.

Then came Bayer, with its problems in counting pixels.

Then came Foveon - nothing new regarding counting pixels -
just as with scanners and files, there are 3 colors per pixel.
> Anyone out there wanna develop a new set of parameters? How about
> one that factors total surface pixels against the percentage of
> interpolation used to correctly resolve an image?
How about just sticking with the old way of doing it?

Inventing new ways of counting to try to compete
with Bayer is just -- ehem -- silly.

Roland
 
The ONLY criterion I use - and used when selecting the camera for ME - is/was whether it could produce an A3+ print that I could happily hang next to prints from my 5x4 and 6x9 film cameras.

The SD9 was the only one that fulfilled that criterion, and I have seen nothing yet to change my mind.

Zone8

PS: If this message appears thrice - it's because the first appeared to disappear! (perhaps the server refused my O4AP subject heading?) However, it will not hurt to repeat the message :-)))
 
Chuck's right ... the "terminology" has been outpaced by the "technology."

The "Megapixel" measurement simply doesn't work when describing the Foveon setup.
--
eric goeres ~ goeres.com
 
Oh good, another thread about what pixels are and the resolution of the Foveon chip! Just what we needed now that the old one has been closed.

Arrrrghhhhhh..............
 
> Chuck's right ... the "terminology" has been outpaced by the
> "technology."
>
> The "Megapixel" measurement simply doesn't work when describing the
> Foveon setup.
But that is exactly what it does. It is Bayer that has problems.
Foveon just follows the old definition, used e.g. in color picture
files. Foveon color is nothing new; it is Bayer that is the invention
with regard to color representation. Foveon is just a clever way
of going back to how it used to be, e.g. when scanning.

Roland
 
> Oh good, another thread about what pixels are and the resolution of
> the Foveon chip! Just what we needed now that the old one has been
> closed.
Sorry for that. Unfortunately there is more to say.
At least as long as the peculiar company Foveon
succeeds in fooling some Foveon zealots.
> Arrrrghhhhhh..............
Just calm down. Think positive thoughts - about fluffy clouds
or something. Then - in the future - avoid threads that have this
effect on you :)

See ya :)
Roland
 
> Sorry about that. I am getting tired of people trying to upgrade
> this 3.4 Mpixel camera by counting each pixel 3 times. Why
> not take the color depth into account? If more color samples
> increase the resolution, then so must greater color depth.
There are some Bayer sensor users who just cannot think three-dimensionally. For some reason these users just can't understand the difference between a true RGB pixel and an R, G, or B pixel that must rely on Bayer magic, and they claim a 6 MP Bayer sensor truly has 6 MP worth of RGB color pixels.

--
jc
 
> Anyone out there wanna develop a new set of parameters?
That's already going on in a different thread. You're welcome to join...

http://forums.dpreview.com/forums/read.asp?forum=1027&message=6503328
> How about
> one that factors total surface pixels against the percentage of
> interpolation used to correctly resolve an image?
Would that include the colors (metamer pairs) that cannot be correctly resolved by a layered sensor?

--
Ciao!

Joe

http://www.swissarmyfork.com
 
> There are some Bayer sensor users who just cannot think
> three-dimensionally. For some reason these users just can't understand
> the difference between a true RGB pixel and an R, G, or B pixel that must
> rely on Bayer magic, and they claim a 6 MP Bayer sensor truly has 6 MP
> worth of RGB color pixels.
Some might be so naïve, but that is usually not how
it is presented. I think most understand that 6 million
RGB sensors are better than 6 million R, G or B sensors.

What is said is that a 6 Msensor Bayer has the potential
for the same resolution as 6 million RGB sensors. The same
resolution, but not the same color resolution.

This is not entirely true either. Because the interpolation
algorithm tries to get good color rendition, it must sacrifice
some resolution.

But ... if you have the RAW file and you know that the
picture is of very low color saturation, then you could
use the sensor as a "nearly" B&W sensor and get the same
resolution as a sensor without a color filter.
Roland
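
To put rough numbers on the "same resolution, but not the same color resolution" point, here is a quick sketch of how the raw samples are distributed, assuming a standard RGGB Bayer layout (half the sites green, a quarter each red and blue):

    # Raw color samples per channel: 6 M-site Bayer vs. 2268 x 1512 Foveon.
    bayer_sites = 6_000_000
    bayer_samples = {"G": bayer_sites // 2, "R": bayer_sites // 4, "B": bayer_sites // 4}

    foveon_sites = 2268 * 1512                            # 3,429,216
    foveon_samples = {c: foveon_sites for c in "RGB"}     # every site measures all three

    print(bayer_samples)     # {'G': 3000000, 'R': 1500000, 'B': 1500000}
    print(foveon_samples)    # {'R': 3429216, 'G': 3429216, 'B': 3429216}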
 
The "Megapixel" measurement simply doesn't work when describing the
Foveon setup.
But, that is exectly what it does. It is Bayer that has problems.
Foveon just follows the old definition, used e.g. in color picture
files. Foveon colors is nothing new, it is Bayer that is the invention
with regards to color representation. Foveon is just a clever way
of going back to how it used to be, e.g. when scanning.
Actually, Bayer sensors are normally described quite correctly by the "old definition" (which is actually the same as the "new definition"). They are spatial locations, pure and simple.

A Foveon sensor can also be described quite correctly by the old/new definition, it has 3.4 million pixels (spatial locations). Foveon marketing (and certain Foveon advocates) keep attempting to somehow redefine a pixel to be something non-spatial, so that they can claim 10.2 million of these "somethings" instead of 3.4 million "pixels".

The pixels are simply information "containers", a Foveon sensor puts more information into each container than a Bayer sensor, but the number of containers is not altered.

And yes, it is meaningful to count the pieces of information going into the "pixel containers", but there is currently no name for this quantity.

And I'm really getting tired of people saying things like "It is Bayer that has problems". You are insulting a person that you don't even know. Dr. Bryce Bayer, like many scientists, was quite capable of using technical terminology correctly. He made valuable contributions to several areas of image processing, in addition to creating a particularly useful pattern for color filter arrays.
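
As a back-of-the-envelope way to count both quantities (the containers and the pieces of information going into them), assuming a 6 M-site Bayer sensor and the SD10's 2268 x 1512 layered sensor:

    # Pixels as spatial "containers" vs. raw color samples captured into them.
    bayer_pixels, bayer_samples_per_pixel = 6_000_000, 1      # one measured color per site
    foveon_pixels, foveon_samples_per_pixel = 2268 * 1512, 3  # three measured colors per site

    print(bayer_pixels, bayer_pixels * bayer_samples_per_pixel)     # 6000000 6000000
    print(foveon_pixels, foveon_pixels * foveon_samples_per_pixel)  # 3429216 10287648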

--
Ciao!

Joe

http://www.swissarmyfork.com
 
> And I'm really getting tired of people saying things like "It is
> Bayer that has problems". You are insulting a person that you don't
> even know. Dr. Bryce Bayer, like many scientists, was quite capable
> of using technical terminology correctly. He made valuable
> contributions to several areas of image processing, in addition to
> creating a particularly useful pattern for color filter arrays.
Sorry about that. The Bayer pattern sensor is
a good design, better than other color filter designs.
The main advantage is that, no matter which pixel
you look at, the surrounding pixels are symmetrically colored.
Thus you can avoid really bad color artefacts by filtering.
I don't think that either I or anyone else here
means that Dr. Bayer himself has any problems.

I am only trying to get the Foveon people here to
understand that their SD9 and SD10 are 3.4 Mpixel cameras.

For the Bayer sensor it is slightly different, because you
must apply some filtering to avoid color artefacts. Therefore
you lose some resolution. Generally not much, but some.

Roland
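
As a concrete, deliberately simple illustration of the kind of filtering involved, here is a minimal bilinear fill-in of the green channel from an RGGB mosaic; real demosaicing algorithms are far more sophisticated, but they all estimate missing samples from neighbours, which is where the small resolution loss comes from:

    import numpy as np

    def interpolate_green(mosaic: np.ndarray) -> np.ndarray:
        # Fill in the missing green samples of an RGGB Bayer mosaic by
        # averaging the four nearest green neighbours (bilinear).
        # Edge handling is crude (replication) -- fine for a sketch.
        h, w = mosaic.shape
        green = mosaic.astype(float).copy()
        padded = np.pad(green, 1, mode="edge")
        for y in range(h):
            for x in range(w):
                if (y + x) % 2 == 0:   # red or blue site: no green measured here
                    green[y, x] = (padded[y, x + 1] + padded[y + 2, x + 1] +
                                   padded[y + 1, x] + padded[y + 1, x + 2]) / 4.0
        return green

Every filled-in value is an average of its neighbours - a mild low-pass filter, which is exactly the resolution cost described above.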
 
> But ... if you have the RAW file and you know that the
> picture is of very low color saturation, then you could
> use the sensor as a "nearly" B&W sensor and get the same
> resolution as a sensor without a color filter.
Roland,

That's an interesting conjecture; taking it to the limit, a Bayer sensor should be able to make a "full-res" black and white image.

If the output is zero color saturation, is it possible to get good luminance at every pixel? Could someone with a Bayer camera do some tests? Maybe they'll come out as sharp as the X3 B&Ws, but with more pixels.

j
 
What matters is the quality of the pixels. If 3.4 million Foveon spatial locations can capture (more or less) as much detail as 6 million Bayer spatial locations, then what we're really talking about is the 'quality' of the pixels. The number of pixels is really only important if you must upsample a great deal to get to your preferred print size, in which case the less upsampling you must do the better.

Two things matter most to photographers: the number of pixels (or dye clouds) and their quality. Today you can use 2 1/4 film to make enlargements that would have required a 4x5 ten or fifteen years ago (with emulsions available then). That's a clear case of the number of dye clouds being less in quantity than those available in a 4x5 chrome, but the increased quality of those dye clouds (smaller granularity factors, etc.) allows for better enlargements today from smaller film sizes. It's a bit like that with the current Foveon sensor. You have 3.4MP that produce images that aren't lacking in detail compared to 6MP Bayer cameras.

What's really needed is some type of qualitative measure that lays out what the quality and numerical standards would be for, for example, producing a highly detailed print at 20x30 at 300 dpi. I've found that the Kodak 14N (from what far field landscape samples I could find) quite easily can have its native file size doubled at 300dpi to produce a 20x30 inch print. Of course the 'quality' of those pixels can vary a great deal depending upon the light available to the camera. The SD9/10 and 6MP cameras, for me, simply don't have enough in terms of the number of pixels they output and/or the quality of those pixels for me to use them when any type of significant enlargement is required.

At some point the ability to capture detail is more important than the sheer number of pixels. If you go to the Luminous Landscape website you can see that the Kodak DCS Proback doesn't capture any more detail than the 1Ds, and that the 22MP digital back that was compared to the Kodak back doesn't capture any more detail than the 16MP Kodak back. All the extra MPs give you is the ability to crop more and to make a certain size enlargement with less interpolation needed.
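
For reference, the pixel counts implied by that 20x30 example are easy to work out (taking 300 dpi literally and the 14N's commonly quoted ~13.7 MP native size):

    # Pixels needed for a 20 x 30 inch print at a literal 300 pixels per inch.
    width_px, height_px = 30 * 300, 20 * 300          # 9000 x 6000
    needed_mp = width_px * height_px / 1e6
    print(needed_mp)                                   # 54.0

    # Linear upsampling factor needed from various native sizes.
    for name, mp in [("SD10 (3.4 MP)", 3.4), ("6 MP Bayer", 6.0), ("14N (~13.7 MP)", 13.7)]:
        print(name, f"{(needed_mp / mp) ** 0.5:.1f}x")  # ~4.0x, 3.0x, ~2.0x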
 
> That's an interesting conjecture; taking it to the limit, a
> Bayer sensor should be able to make a "full-res" black and white
> image.
They can, under the right conditions.
> If the output is zero color saturation, is it possible to get good
> luminance at every pixel?
Yes. I've done it on many occasions, under two different conditions.

The first is with a relatively monochrome scene. As long as all sensors are being stimulated, you can simply compute a local "color correction": average the intensity from several nearby red cells, several blue cells, and several green cells to get a set of "compensating gains", apply those to the sensor outputs, and you've got monochrome.

The second is for infrared, when the color filter array actually fails, and what you get from the sensor really is monochrome (although it does require a global gain correction).
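
Here is a minimal sketch of the first condition (locally computed compensating gains on an RGGB Bayer mosaic); the tile size and the simple per-tile averaging are illustrative choices, not a description of any particular camera's processing:

    import numpy as np

    def bayer_to_monochrome(mosaic: np.ndarray, box: int = 8) -> np.ndarray:
        # Convert an RGGB Bayer mosaic of a (nearly) monochrome scene into a
        # full-resolution grayscale image by applying locally computed
        # per-channel compensating gains. Assumes the mosaic dimensions are
        # multiples of `box`.
        h, w = mosaic.shape
        m = mosaic.astype(float)
        out = np.empty_like(m)
        yy, xx = np.mgrid[0:h, 0:w]
        masks = {                                   # RGGB layout
            "R": (yy % 2 == 0) & (xx % 2 == 0),
            "B": (yy % 2 == 1) & (xx % 2 == 1),
            "G": (yy + xx) % 2 == 1,
        }
        for y0 in range(0, h, box):
            for x0 in range(0, w, box):
                tile = np.s_[y0:y0 + box, x0:x0 + box]
                means = {c: m[tile][masks[c][tile]].mean() for c in masks}
                target = sum(means.values()) / 3.0
                for c in masks:
                    gain = target / max(means[c], 1e-6)   # local compensating gain
                    out[tile][masks[c][tile]] = m[tile][masks[c][tile]] * gain
        return out

With a low-saturation scene the gains mostly cancel the differences in color-filter transmission, so every photosite ends up contributing a luminance sample at its own location.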
> Could someone with a Bayer camera do
> some tests? Maybe they'll come out as sharp as the X3 B&Ws, but
> with more pixels.
Unfortunately, they don't. The anti-aliasing filter on most Bayer cameras removes some high-frequency content. But they do come out better than a "conventionally" processed image.

--
Ciao!

Joe

http://www.swissarmyfork.com
 
Why don't we forget about pixel and photosite counts altogether and measure resolution the old-fashioned way?

Er, have I just re-invented the resolution chart?

ps

According to Phil's res charts the SD9 is in the ball park of the D60/D10/300D/D100 and slightly outclassed by the S2pro.

Am I missing something here, or is the empirical evidence as clear as it looks, i.e. 3.4 MP of Foveon is roughly equivalent to 6 MP of Bayer? If this is true, all the endless Foveon vs. Bayer pixel count claims can be resolved by appealing to the humble res chart...

...or am I missing something obvious?

pps

Actually the topic isn't even that interesting, as what I want is an affordable DSLR that has enough resolution to produce 18 x 12 inch prints with the same very fine detail my D100 provides in a 7 x 5 inch print, or that I can get from my Fuji 6 x 9.

How many pixels (of any type!) are required for that??
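
A rough answer, taking the D100's 3008 x 2000 output and asking for the same pixel density on an 18 x 12 inch print as on a 7 x 5 inch one (back-of-the-envelope only - it ignores viewing distance, per-pixel quality, and the slight aspect-ratio mismatch):

    # Pixel density of a D100 frame on a 7 x 5 inch print, and the count
    # needed to hold that density on an 18 x 12 inch print.
    d100_w, d100_h = 3008, 2000
    ppi = d100_w / 7                                    # ~430 pixels per inch
    needed_mp = (18 * ppi) * (12 * ppi) / 1e6
    print(f"{ppi:.0f} ppi -> {needed_mp:.0f} MP")       # 430 ppi -> 40 MP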
 
> Why don't we forget about pixel and photosite counts altogether and
> measure resolution the old-fashioned way?
>
> Er, have I just re-invented the resolution chart?
>
> ps
>
> According to Phil's res charts the SD9 is in the ball park of the
> D60/D10/300D/D100 and slightly outclassed by the S2pro.
>
> Am I missing something here, or is the empirical evidence as clear
> as it looks, i.e. 3.4 MP of Foveon is roughly equivalent to 6 MP of Bayer?
> If this is true, all the endless Foveon vs. Bayer pixel count claims
> can be resolved by appealing to the humble res chart...
>
> ...or am I missing something obvious?
No, not really; the most interesting measure is the actual picture,
not how you get there.

There are some problems though.

When you measure film, the film usually has no preferred
direction. It is a random pattern. Digital cameras have a
regular grid pattern, which makes the resolution in 2D
look rather weird. Adding the Bayer pattern on top of that,
you get even weirder resolution. So --- a number might
not describe the resolution.

Roland
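
To put one number on the direction dependence: for a square sampling grid with pixel pitch p, the highest spatial frequency that can be represented along the rows or columns is 1/(2p), while along the diagonals it is sqrt(2)/(2p), about 41% higher (this ignores the Bayer pattern and any anti-alias filter, which reduce things further):

    import math

    pitch = 1.0                                        # pixel pitch, arbitrary units
    nyquist_axis = 1 / (2 * pitch)                     # along rows / columns
    nyquist_diag = math.sqrt(2) / (2 * pitch)          # along the grid diagonals
    print(nyquist_axis, nyquist_diag, nyquist_diag / nyquist_axis)   # 0.5 0.707... 1.414...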
 
>> Could someone with a Bayer camera do
>> some tests? Maybe they'll come out as sharp as the X3 B&Ws, but
>> with more pixels.
> Unfortunately, they don't. The anti-aliasing filter on most Bayer
> cameras removes some high-frequency content. But they do come out
> better than a "conventionally" processed image.
Then the Kodak 14N ought to make an awesome B&W, having no AA blur.

Any samples?

j
 
Naturally, I have no problem with either Bayer personally or his scientific contributions.
The "Megapixel" measurement simply doesn't work when describing the
Foveon setup.
But, that is exectly what it does. It is Bayer that has problems.
Foveon just follows the old definition, used e.g. in color picture
files. Foveon colors is nothing new, it is Bayer that is the invention
with regards to color representation. Foveon is just a clever way
of going back to how it used to be, e.g. when scanning.
Actually, Bayer sensors are normally described quite correctly by
the "old definition" (which is actually the same as the "new
definition"). They are spatial locations, pure and simple.

A Foveon sensor can also be described quite correctly by the
old/new definition, it has 3.4 million pixels (spatial locations).
Foveon marketing (and certain Foveon advocates) keep attempting to
somehow redefine a pixel to be something non-spatial, so that they
can claim 10.2 million of these "somethings" instead of 3.4 million
"pixels".

The pixels are simply information "containers", a Foveon sensor
puts more information into each container than a Bayer sensor, but
the number of containers is not altered.

And yes, it is meaningful to count the pieces of information going
into the "pixel containers", but there is currently no name for
this quantity.

> And I'm really getting tired of people saying things like "It is
> Bayer that has problems". You are insulting a person that you don't
> even know. Dr. Bryce Bayer, like many scientists, was quite capable
> of using technical terminology correctly. He made valuable
> contributions to several areas of image processing, in addition to
> creating a particularly useful pattern for color filter arrays.

--
eric goeres ~ goeres.com
 
