Sensor size, complexity, pixel size and cost?

Erik Kaffehr

Veteran Member
Messages
8,195
Solutions
7
Reaction score
5,118
Location
Nyköping, SE
Hi,

It would be interesting to hear from folks knowledgeable about the cost aspects of modern sensor designs.

My understanding is that increasing the sensor size is over-proportionally expensive.
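A rough sketch of why area cost is over-proportional: bigger dies mean both fewer candidates per wafer and lower yield. All numbers here are illustrative assumptions (300 mm wafer, a simple Poisson yield model, a made-up defect density of 0.1/cm²), not actual fab figures:

```python
import math

def dies_per_wafer(die_w_mm, die_h_mm, wafer_d_mm=300.0):
    # Crude gross-die estimate with a simple edge-loss correction.
    wafer_area = math.pi * (wafer_d_mm / 2.0) ** 2
    die_area = die_w_mm * die_h_mm
    edge_loss = math.pi * wafer_d_mm / math.sqrt(2.0 * die_area)
    return int(wafer_area / die_area - edge_loss)

def yield_fraction(die_w_mm, die_h_mm, defects_per_cm2=0.1):
    # Poisson yield model: Y = exp(-D * A). Defect density is an assumption.
    die_area_cm2 = die_w_mm * die_h_mm / 100.0
    return math.exp(-defects_per_cm2 * die_area_cm2)

# Relative cost per *good* die rises faster than sensor area.
for name, (w, h) in {"APS-C": (23.5, 15.6), "24x36": (36.0, 24.0), "44x33": (44.0, 33.0)}.items():
    n, y = dies_per_wafer(w, h), yield_fraction(w, h)
    print(f"{name}: {n} dies/wafer, yield {y:.2f}, relative cost/good die {1.0 / (n * y):.4f}")
```

With these assumed numbers, going from APS-C to 24x36 roughly doubles the area but more than quadruples the cost per good die; mapping out bad pixels is one way to claw back some of the yield loss.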

But, I would assume that increasing the number of pixels would not be very expensive.

Clearly, increasing the complexity will increase the probability of component failure, but I also think that bad pixels can be mapped out. The complexity of the non-pixel components should probably increase with the square root of the number of pixels: doubling the number of pixels increases the number of columns by sqrt(2).
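The column-count argument can be sanity-checked in a couple of lines (assuming a 3:2 aspect ratio and square pixels):

```python
import math

def columns(megapixels, aspect=(3, 2)):
    # pixels = cols * rows and cols / rows = w / h, so cols = sqrt(pixels * w / h).
    w, h = aspect
    return math.sqrt(megapixels * 1e6 * w / h)

print(columns(24))                # 6000.0 columns for a 24 MP 3:2 sensor
print(columns(48) / columns(24))  # ratio is sqrt(2), about 1.414
```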

It seems that APS-C is pretty much stuck at 24 MP and 24x36 mm is moving slowly upwards from 36 MP. There can be quite a few reasons for the slow development:
  • An optimum balance between noise and resolution may have been reached.
  • We may be getting into diminishing returns with regard to resolution.
  • Camera electronics may have difficulty handling larger amounts of data within a limited power budget.
  • Cost obviously is a factor that plays an important role.
On the other hand, we are now in a situation where medium format uses a smaller pixel size than 24x36 mm: something like 3.8 microns on the 100 MP 44x33 mm sensor.
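That pitch follows directly from the geometry, assuming square pixels tiling the whole sensor area:

```python
import math

def pixel_pitch_um(width_mm, height_mm, megapixels):
    # Side length of one square pixel if pixels tile the full sensor area.
    area_um2 = width_mm * height_mm * 1e6  # mm^2 -> um^2
    return math.sqrt(area_um2 / (megapixels * 1e6))

print(pixel_pitch_um(44, 33, 100))  # ~3.81 um on the 100 MP 44x33 mm sensor
print(pixel_pitch_um(36, 24, 36))   # ~4.90 um on a 36 MP 24x36 mm sensor
```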

Any comments from sensor experts?

Best regards

Erik
 
Another consideration is that the low-light sensitivity of on-sensor phase-detect AF used in mirrorless cameras depends on the size of the sensels.

Conceptually, Canon's dual-pixel OSPDAF could combine a number of sensels together vertically, thus increasing the PDAF capture area. However, that would require reading out all of the additional lines during focusing. I imagine we'll get there someday.
 
Pushing camera resolution higher requires pushing optical resolution higher for those "crisp" images consumers want. 24MP APS-C and 20MP M4/3 are at such a resolution that it takes relatively excellent optics to make use of the pixels, which drives lens costs into a range that most will not pay.

FF above 50 MP is the same, but the typical quality of a FF lens is higher than on APS-C. Given the quality of the GFX and X systems, I think it is also fair to say that the typical 44x33 lens is higher quality than the typical FF lens, and the typical 54x44 lens would be higher quality than the typical 44x33 lens if they were very modern. My understanding, though, is that the 54x44 lenses are not very modern and provide middling performance at best at 100MP.
 
At Canon Expo 2015, they displayed "crisp" room-size prints from the 120 MP APS-H sensor paired with the 24-70mm f/2.8L II. That pixel density corresponds to roughly 70 MP on APS-C.
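The equal-density conversion is simple area scaling. The dimensions below are assumed nominal values for APS-H and Canon APS-C, not quoted from the source:

```python
def equivalent_mp(mp, src_wh, dst_wh):
    # At equal pixel density, pixel count scales with sensor area.
    return mp * (dst_wh[0] * dst_wh[1]) / (src_wh[0] * src_wh[1])

# Assumed nominal dimensions: APS-H 28.7x19.1 mm, Canon APS-C 22.3x14.9 mm.
print(equivalent_mp(120, (28.7, 19.1), (22.3, 14.9)))  # ~72.7, i.e. roughly 70 MP
```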
 
Any comments from sensor experts?
That would not be me, but I dare comment anyway =)

* The cost of image sensor area has been beaten to death here; I have no comments on that.

* The sensor cost of increased sensel density could perhaps be expressed as one of:

a) a reduction in (some aspects of) image quality at a fixed manufacture pitch,

b) the cost of investing in basic research (or buying IP) for smarter sensor circuitry

c) the cost of having a larger number of sensors rejected due to errors (or, as you say, a larger number of sensels per sensor that have to be interpolated)

d) the cost of moving sensor manufacture from an (outdated) silicon process to a (slightly less outdated) one.

* Increasing the number of pixels means that, for a given frame rate, you need to increase internal bandwidth and processing power accordingly. For a given burst performance, you also need to increase temporary memory/buffers, and for a given number of images you need to increase memory card size (and hard-drive space). For a given performance in a raw developer, you need to bump PC/storage performance accordingly. A large number of users in this forum have expressed strongly negative opinions about that.
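The bandwidth point is easy to quantify. The bit depth and frame rate below are illustrative assumptions:

```python
def readout_gbps(megapixels, fps, bits_per_pixel=14):
    # Raw sensor readout rate in Gbit/s, ignoring overheads and compression.
    return megapixels * 1e6 * fps * bits_per_pixel / 1e9

# Doubling the pixel count at a fixed frame rate doubles the raw data rate:
print(readout_gbps(24, 10))  # 3.36 Gbit/s
print(readout_gbps(48, 10))  # 6.72 Gbit/s
```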

The momentum of the industry seems to be driven by mobile phones and (lately) the car industry. That is where the big money is, and accordingly where manufacturers will spend the big R&D sums. Thus we _could_ end up in a situation where your cellphone has 36 MP because whatever (minimal) IQ improvement is basically free, while your APS-C DSLR is stuck at 24 MP because sales volume cannot support the kind of features needed to bump pixel density.

We see something similar in the laughable UX/touch features of even expensive cameras as compared to current smartphones. $500 smartphones can be sold in numbers where the cost of hiring a team of 500 SW devs is easily absorbed by volume. A $5000 camera may not. Thus the software/UX of the expensive camera will either be crude in comparison to the cellphone, or a (possibly poorly fitting) copy-and-paste job from a largely unrelated product (i.e. cameras running Android).

-h
 
Are you saying that kit lenses are free from aliasing with those pixel counts?

A lens does obviously not perform worse as sensor resolution increases. That, of course, does not mean that better lenses cannot be designed.
 
Are you saying that kit lenses are free from aliasing with those pixel counts?
They are relatively free compared to the standard of excellence.
A lens does obviously not perform worse as sensor resolution increases. That, of course, does not mean that better lenses cannot be designed.
It obviously does perform worse when you highly magnify the image.

At the scene level, the performance can only increase, but people do not judge 10,000x6,500 px images by looking at a 1000x1000 preview on a monitor. They zoom all the way in to the gory details.
 
They are relatively free compared to the standard of excellence.
So, I take your relatively cryptic statement to mean that we agree that relatively poor optics will make good use of smaller pixels 😉

It obviously does perform worse when you highly magnify the image.
No, not if you magnify similarly from a sensor with less resolution.

 
So, I take your relatively cryptic statement to mean that we agree that relatively poor optics will make good use of smaller pixels 😉
Good use from a scientific perspective. From a consumer perspective, no.
No, not if you magnify similarly from a sensor with less resolution.
People do not do that.
 
Good use from a scientific perspective. From a consumer perspective, no.
Because consumers love Moiré and other aliasing artefacts.
People do not do that.
I do :)

5.93µm pixels:

[image: i-ZDGxHW5.jpg]

1.53µm pixels:

[image: i-xKspH4f-X3.jpg]


Of course, a lot of people don't, and they will arrive at silly conclusions about lens performance, noise performance, etc.

The natural consequence of what you say is that teleconverters are obsolete. I have yet to see evidence for this.
 
The natural consequence of what you say is that teleconverters are obsolete. I have yet to see evidence for this.
No, not at all...
Of course it does. If nothing substantial can be had by reducing the pixel size further, then it follows that a TC would offer nothing more than empty magnification at those pixel pitches.
 
Of course it does. If nothing substantial can be had by reducing the pixel size further, then it follows that a TC would offer nothing more than empty magnification at those pixel pitches.
I do not understand how you reach that conclusion. Let me oversimplify my argument.
  1. The average lens today is not ready for pixels smaller than about 3.5 microns if you wish to maintain the current level of aliasing and subjective sharpness.
  2. These lenses are in the < ~$1,000 range.
  3. Lenses that take TCs are generally considerably better than average and correspondingly more expensive. No one would call a 400/2.8 or 200/2 an average lens. Not a 100-400 either.
Further, a TC will always give you more pixels on target, even if not "sharp" pixels. This helps with SNR and "smoothness" of the image.
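The 3.5-micron figure can be related to spatial frequency: a sensor with pixel pitch p only aliases detail the lens still resolves at the Nyquist frequency 1/(2p). A quick sketch:

```python
def nyquist_lp_mm(pitch_um):
    # Sensor Nyquist frequency in line pairs per mm for a pixel pitch in microns.
    return 1000.0 / (2.0 * pitch_um)

print(nyquist_lp_mm(3.5))  # ~143 lp/mm: the lens must carry contrast here to alias
print(nyquist_lp_mm(5.9))  # ~85 lp/mm: a much easier target for an average lens
```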
 
I don't wish to maintain the current level of aliasing and subjective sharpness. I want no aliasing and all of what my lenses can give. All lenses, from the cheapest kit zooms to expensive professional lenses, can easily generate Moiré on 24 MP APS-C and 50 MP FF. I don't want it if I can be free from it.

The aperture will be constant regardless of the TC, so you will not get any improvement in SNR, and you will also not get improved smoothness if the sensor is already sampling the lens sufficiently.
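The bookkeeping behind this disagreement can be made explicit. Below is a sketch of the first-order geometric effects of a TC; the helper function is hypothetical and ignores lens aberrations and sensor noise sources:

```python
def with_teleconverter(f_number, pixels_on_target, tc=2.0):
    # A TC multiplies focal length by tc at a constant aperture diameter:
    # the f-number rises by tc, per-pixel illuminance falls by tc^2,
    # and the subject covers tc^2 as many pixels. Total light collected
    # from the subject is first-order unchanged; whether that helps SNR
    # depends on how the extra pixels are combined in post.
    return {
        "f_number": f_number * tc,
        "pixels_on_target": pixels_on_target * tc * tc,
        "relative_light_per_pixel": 1.0 / (tc * tc),
    }

print(with_teleconverter(2.8, 1_000_000))
```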
 
I can only provide some second hand answers.

The main real-estate limitation right now does not appear to be the upstream photosite architecture. Phone cameras already have very small pixels, but the addition of downstream components and phase-detection points takes up multiple rows and columns on BSI sensors, and future global-shutter implementations will put downstream components at all photosite locations.

The problem is partly the components themselves: CDS and ADC circuitry is more complex than the FETs mounted at the photosite.

Perhaps the increased throughput at each photosite also requires more connections and wider tracks as well, but I have not seen a schematic of a global shutter design so I don't know.

In terms of PDAF, I assume it depends on implementation. DPAF would require more components and wiring than masked photodiodes for instance.

In theory, more pixels is always better, but all these 'bolt-on accessories' seem to be more of a problem.

Having said all that, sensor resolution seems to align pretty well with noise performance, by which I mean that any increase in pixel resolution would be offset by increased noise to the point where the improvement would be largely irrelevant, due to the effect of noise on high-frequency MTF and its impact on sharpening.

If noise is more of a limitation than resolution, there is no point in chasing higher resolution for a given format. We are kind of back to where film was.
 
I don't wish to maintain the current level of aliasing and subjective sharpness. I want no aliasing and all of what my lenses can give. All lenses, from the cheapest kit zooms to expensive professional lenses, can easily generate Moiré on 24 MP APS-C and 50 MP FF. I don't want it if I can be free from it.
Cool. The masses don't think the same way.
The aperture will be constant regardless of the TC, so you will not get any improvement in SNR, and you will also not get improved smoothness if the sensor is already sampling the lens sufficiently.
Untrue with post-processing for SNR, and what I mean by smooth has nothing to do with the sharpness of the lens.
 
Cool. The masses don't think the same way.
The masses have been whining about the megapixel race since 6 MP APS-C chips were the hottest thing around, led by Phil Askey's completely misguided campaign against smaller pixels.

And I don't buy your premise: the ILC market is clearly moving towards the higher end, catering for the masses who want sharp, fast lenses that can separate their photography from the ubiquitous cellphone cam.
Untrue with post-processing for SNR, and what I mean by smooth has nothing to do with the sharpness of the lens.
Perhaps you mean creaminess? ;-)
 
Image sensors are not leading edge semiconductor products at a fundamental level (which is not to say that design teams are not doing innovative work). Most of the constraints under which they operate are commercial and depend very much on the market. So, here is the first thing to take into account:

The market for large-sensor still cameras (which includes all ILC cameras) is a very small part of the overall sensor market. The sensor companies will tend to prioritise their R&D expenditure where they see the biggest return. Right now, that's the automotive market: cars have recently become plastered with image sensors, and the image sensor companies are falling over themselves to bag a part of that new market. New markets are few and far between. Whilst they do that, they aren't putting so much into large-sensor still cameras, which is a moribund market.

So, the question a sensor company is asking is whether it can make money on each new sensor product, and this in turn depends on its customers. We could take the Nikon D850 as a case study. It uses a Sony IMX309 sensor with 46MP. The D800 and D810 both used an IMX094 sensor with 36MP, and that sensor has been a successful product for Sony, going into two of Sony imaging's cameras and one of Pentax's, and becoming popular in the scientific and industrial markets. If you want to build a camera around it, you can buy one for around $200. Sony imaging no longer uses that sensor, having commissioned a BSI alternative, the IMX193, but the sensor was still selling in other markets, so left to itself, Sony had no reason to replace it. However, Nikon clearly needed an upgrade for the D850; anything less would have been seen as not enough in market terms. Now Sony has a customer for an upgrade, and the customer will set the broad parameters of the new sensor. Panasonic also expressed interest in a FF sensor in the 46MP range. Now Sony has two customers, replacing the IMX094 becomes commercially viable, and Sony goes ahead and produces the part. There is nothing new in this sensor; it is an assembly of technology Sony already has. In fact, it's not close to the limits of what they can do.

So, in brief, the progress of the sensors that we see in our cameras has slowed because our cameras represent a shrinking market, and there are no longer strong drivers for investment in it.

As far as the relative expense of larger sensors versus more pixels goes, that is a complex question. The reason for the price drop of large sensors is reasonably simple: they can make use of fab plants which are no longer suitable for the small sensors, and thus extract value from what would otherwise be an idle asset. Given that the major cost of semiconductors is plant, not raw materials, this means that the relative cost of the large sensors for our small market has declined rapidly. In fact, that is precisely why all the action is at FF and larger: for the still camera market, they represent opportunities for new sales where there is little elsewhere, and the opportunity to extract additional value from old lines is also attractive.
 
The natural consequence of what you say is that teleconverters are obsolete. I have yet to see the evidence for this.
No, not at all...
Of course it does. If nothing substantial can be had by reducing the pixel size further, then it follows that a TC would offer nothing more than empty magnification at those pixel pitches.
I do not understand how you reach that conclusion. Let me oversimplify my argument.
  1. The average lens today is not ready for pixels smaller than about 3.5 microns if you wish to maintain the current level of aliasing and subjective sharpness.
  2. These lenses are in the < ~$1,000 range.
  3. Lenses that take TCs are generally considerably better than average and correspondingly more expensive. No one would call a 400/2.8 or 200/2 an average lens. Not a 100-400 either.
Further, a TC will always give you more pixels on target, even if not "sharp" pixels. This helps with SNR and "smoothness" of the image.
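The "more pixels on target" point is simple geometry: the subject's linear size on the sensor scales with the TC factor, so the pixel count covering it scales with the square of that factor. A minimal sketch (the 1 MP baseline is an arbitrary illustrative figure):

```python
def pixels_on_target(base_pixels, tc_factor):
    """Pixels covering the subject after adding a teleconverter:
    linear magnification scales by tc_factor, area by its square."""
    return base_pixels * tc_factor ** 2

print(round(pixels_on_target(1_000_000, 1.4)))  # 1960000: ~2x with a 1.4x TC
print(round(pixels_on_target(1_000_000, 2.0)))  # 4000000: 4x with a 2x TC
```

Whether those extra pixels carry new detail depends on the lens, but the sampling density on the subject always goes up.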
I don't wish to maintain the current level of aliasing and subjective sharpness. I want no aliasing and all of what my lenses can give. All lenses, from the cheapest kit zooms to expensive professional lenses, can easily generate moiré on 24MP APS-C and 50MP FF. I don't want it if I can be free from it.
Cool. The masses don't think the same way.
I'm not sure that it has very much to do with what 'the masses' think. How did you get your information? Was there some market research done?

Rather, I think it has to do with what would be a commercially sensible path, as a camera company would see it.

Take, for example, the 46MP sensor in the D850 and Z7 (and soon, no doubt, the Panasonic S1R). That has 4.35 micron pixels, which really aren't pushing any boundaries with respect to sensor technology. Sony's 1" sensors, which work very well, have 2.4 micron pixels. Had the sensor been made with those, it would have boasted 150MP. No problem whatsoever with the sensor; Sony could do it easily without breaking a sweat. However, so far as the camera spec goes, it produces some problems. Frame rates will go down: instead of a 7 FPS camera, Nikon would have had a 2.3 FPS camera, which they might have felt to be less marketable. Next, in terms of marketing: give the consumer something more and they will grab it with open arms; give them something so much more that they no longer understand it and you get sales resistance. So, whatever the masses think, they are likely to be offered what makes sense to the manufacturer, and that isn't a 150MP FF camera (yet).
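Those figures can be sanity-checked with a few lines of Python (a rough sketch; it assumes square pixels and that frame rate scales inversely with pixel count at a fixed readout bandwidth, which gives roughly 2.1 FPS rather than the quoted 2.3):

```python
def mp_at_pitch(width_mm, height_mm, pitch_um):
    """Megapixel count for a sensor of the given size at a given
    pixel pitch, assuming square pixels."""
    return (width_mm * 1e3 / pitch_um) * (height_mm * 1e3 / pitch_um) / 1e6

mp = mp_at_pitch(36, 24, 2.4)    # full frame at 1"-sensor pixel pitch
print(round(mp))                 # ~150 MP
print(round(7 * 46 / mp, 1))     # ~2.1 FPS at the same total readout rate
```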
 
Most of the investment in image sensors seems to be directed at readout speed and on-sensor PDAF (both for smart phones and regular cameras). IQ improvements at the hardware level are likely to be minimal.

The rest of the R&D money is being spent on image processing software - mainly by the big phone companies who are acquiring boutique software companies at a fair old rate.

And as you say, the real market for large sensors is small, but if they can be built on existing lines, is that a real issue cost-wise? AFAIK, they are built using legacy processes that are nowhere near leading edge in CMOS terms, so capital costs are not huge.
 
