Fast lenses, and High ISO

Started 5 months ago | Discussions
unknown member
Exactly!
In reply to Great Bustard, 5 months ago

Think about signal-to-noise ratio, not just the "ever increasing ISO".
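To make that point concrete, here is a minimal sketch (my own illustration, not part of the post) of shot-noise-limited SNR, assuming photon shot noise dominates:

```python
import math

def shot_noise_snr(photons):
    # Photon arrivals are Poisson-distributed, so the shot-noise-limited
    # SNR is signal / sqrt(signal) = sqrt(signal).
    return math.sqrt(photons)

# Collecting 4x the light (two stops more exposure, or 4x the sensor
# area at the same exposure) doubles the SNR, whatever ISO is set:
print(shot_noise_snr(10000))   # 100.0
print(shot_noise_snr(40000))   # 200.0
```

ISO only maps the collected signal to output brightness; the SNR is set by how many photons were captured in the first place.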

wchutt's gear list:
Fujifilm X-Pro1 Fujifilm X-T1 Fujifilm XF 18mm F2 R Fujifilm XF 35mm F1.4 R Fujifilm XF 14mm F2.8 R +20 more
Albert Silver
Senior Member • Posts: 1,541
Re: "fast" is relative
In reply to tko, 5 months ago

tko wrote:

Remember that F4.0 is considered kind of slow on FF, but is equal to F2.0 on M43rds, which is considered "fast."

That's not entirely accurate. You are describing the depth of field equivalence, from one sensor to the next, not the light. f/2 on a m43rds may have the depth of field of f/4 on a full-frame, but the light will still be f/2.

A F0.95 lens on M43rds is equal in performance to a F2.0 lens on FF, which (for a prime on FF) is pretty slow.

We won't even talk about how slow the superzooms are in FF terms.

Chris R-UK wrote:

Fast lenses are clearly less important on MF and FF cameras because the larger sensor allows for higher ISO settings and better DoF control.

However, as sensors get smaller, fast lenses as measured by the f-number become much more important, because high ISO performance is worse and depth of field control is more difficult.

That is why Voigtlander has a range of f/0.95 lenses for M4/3 and there are superzooms like the Panasonic FZ200 with f/2.8 over the entire 24x zoom focal length range.


Chris R


professional cynic and contrarian: don't take it personally

Albert Silver's gear list:
Canon EF 85mm f/1.8 USM
bobn2
Forum Pro • Posts: 32,301
Re: "fast" is relative
In reply to Albert Silver, 5 months ago

Albert Silver wrote:

tko wrote:

Remember that F4.0 is considered kind of slow on FF, but is equal to F2.0 on M43rds, which is considered "fast."

That's not entirely accurate. You are describing the depth of field equivalence, from one sensor to the next, not the light. f/2 on a m43rds may have the depth of field of f/4 on a full-frame, but the light will still be f/2.

The 'light' of a FF f/2 and a FT f/4 will be the same, which is the point he is making. In the end, given equally efficient sensors, you can achieve the same result at the same shutter speed using an f/4 on FF as you can with an f/2 on FT. The density of the light at f/4 is one quarter, but there is a sensor with four times the area to collect it, so it ends up the same.
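That arithmetic can be checked in a few lines (my own sketch, using an idealized exact 2x crop; the real Four Thirds sensor area is closer to 225 mm² than 216 mm²):

```python
def relative_total_light(sensor_area_mm2, f_number):
    # Illuminance on the sensor scales as 1/f_number^2 (same scene, same
    # shutter speed); total light collected is illuminance times area.
    return sensor_area_mm2 / f_number ** 2

FF_AREA = 36 * 24        # full frame: 864 mm^2
FT_AREA = FF_AREA / 4    # idealized Four Thirds (exact 2x crop): 216 mm^2

# f/4 on FF delivers 1/4 the light per unit area of f/2 on FT,
# but over 4x the area, so the totals come out equal:
print(relative_total_light(FF_AREA, 4))   # 54.0
print(relative_total_light(FT_AREA, 2))   # 54.0
```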


Bob

EinsteinsGhost
Forum Pro • Posts: 11,977
Re: "fast" is relative
In reply to bobn2, 5 months ago

bobn2 wrote:

Albert Silver wrote:

tko wrote:

Remember that F4.0 is considered kind of slow on FF, but is equal to F2.0 on M43rds, which is considered "fast."

That's not entirely accurate. You are describing the depth of field equivalence, from one sensor to the next, not the light. f/2 on a m43rds may have the depth of field of f/4 on a full-frame, but the light will still be f/2.

The 'light' of a FF f/2 and a FT f/4 will be the same, which is the point he is making. In the end, given equally efficient sensors, you can achieve the same result at the same shutter speed using an f/4 on FF as you can with an f/2 on FT. The density of the light at f/4 is one quarter, but there is a sensor with four times the area to collect it, so it ends up the same.


Bob

The point tko is making is wrong. DOF equivalence applies, exposure equivalence does not (for the reason you state above).

EinsteinsGhost's gear list:
Sony Cyber-shot DSC-F828 Sony SLT-A55 Sony Alpha NEX-6 Sigma 18-250mm F3.5-6.3 DC OS HSM Sony 135mm F2.8 (T4.5) STF +12 more
mosswings
Veteran Member • Posts: 6,075
Re: Understanding ISO
In reply to Austinian, 5 months ago

Austinian wrote:

Great Bustard wrote:

The most important factors about the sensor that manufacturers do not tell us are:

  • QE (Quantum Efficiency -- the proportion of light falling on the sensor that is recorded)
  • Read Noise (the additional electronic noise added by the sensor and supporting hardware)
  • CFA (Color Filter Array)
  • Microlens Efficiency

Which of these are likely to see significant improvements in the fairly near term (next few years)?

If by significant you mean something that might garner an extra stop or more of performance, it's hard to say. There has been recent work by Panasonic on a different type of CFA that eliminates the losses inherent in current implementations and that theoretically could yield a stop, perhaps more, in sensitivity:

http://www.imaging-resource.com/news/2013/02/05/bye-bye-bayer-panasonic-claims-new-sensor-tech-ends-color-filter-light-loss

It's been over a year since this announcement, but no updates.  One of the problems of this technique may lie in tailoring the color splitting response to produce acceptable color rendition. Given all the complaints about current CFAs, it will need to be no worse.

BSI (backside illumination) is the current go-to for increasing sensor QE.  Sony calls it EXMOR-R and it is found on no larger than 1" sensors as it involves thinning the sensor wafer down to a few 10s of microns in order to expose the pixel wells from the backside.  Doing this for a larger format sensor is still impractical as it dramatically weakens the chip.

Sony's trying to curve their sensor to eliminate corner vignetting and the need for tricky microlens tailoring.  It also has the side effect of improving sensor response by straining the sensor lattice, but again it's an expensive technique.

Read noise can be attacked in several ways, but another way of dealing with the problem may be to redefine the entire imaging process. This is what Eric Fossum has been working on with his Quanta Image Sensor. It basically trades the charge-integrating approach of today for a photon-counting approach using a combination of extremely dense binary-response pixel arrays, high frame-capture rates, and heavy postprocessing - 100+MP arrays and 1000 frames/sec capture rates using fairly conventional CMOS technology. In doing so one can trade off resolution for DR, tailor tonal response directly, possibly compensate directly for camera motion, and so on. The downside is that it requires pixel read noise to be about 4 stops better, but since it's not trying to do linear amplification, more options for doing so are open to the designer. Research chips are in development now, but we have a long road to go.
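A toy simulation of the photon-counting idea (my own sketch, not Fossum's actual design): each binary "jot" fires if at least one photon arrived during a short frame, and accumulating many jots over many frames recovers the light level:

```python
import math
import random

def qis_estimate(mean_photons_per_jot, jots, frames, seed=0):
    # Each jot reports 1 if >= 1 photon arrived during the frame.
    # For Poisson arrivals, P(hit) = 1 - exp(-mean).
    rng = random.Random(seed)
    p_hit = 1 - math.exp(-mean_photons_per_jot)
    hits = sum(1 for _ in range(jots * frames) if rng.random() < p_hit)
    # Invert the saturation nonlinearity to estimate photons/jot/frame.
    hit_fraction = hits / (jots * frames)
    return -math.log(1 - hit_fraction)

# 10,000 binary samples recover a flux of roughly 0.1 photons/jot/frame:
print(qis_estimate(0.1, jots=100, frames=100))
```

Because the estimate comes from pooling binary samples, resolution (jots), frame rate, and noise can be traded against each other in postprocessing, which is the flexibility described above.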

mosswings's gear list:
Olympus XZ-1 Nikon D90 Nikon D7100 Nikon AF-S DX Nikkor 18-105mm f/3.5-5.6G ED VR Nikon AF-S Nikkor 70-300mm f/4.5-5.6G VR +1 more
Albert Silver
Senior Member • Posts: 1,541
Re: "fast" is relative
In reply to EinsteinsGhost, 5 months ago

EinsteinsGhost wrote:

bobn2 wrote:

Albert Silver wrote:

tko wrote:

Remember that F4.0 is considered kind of slow on FF, but is equal to F2.0 on M43rds, which is considered "fast."

That's not entirely accurate. You are describing the depth of field equivalence, from one sensor to the next, not the light. f/2 on a m43rds may have the depth of field of f/4 on a full-frame, but the light will still be f/2.

The 'light' of a FF f/2 and a FT f/4 will be the same, which is the point he is making. In the end, given equally efficient sensors, you can achieve the same result at the same shutter speed using an f/4 on FF as you can with an f/2 on FT. The density of the light at f/4 is one quarter, but there is a sensor with four times the area to collect it, so it ends up the same.


Bob

The point tko is making is wrong. DOF equivalence applies, exposure equivalence does not (for the reason you state above).

That was my point, though I think Bob missed it.

Albert Silver's gear list:
Canon EF 85mm f/1.8 USM
Austinian
Senior Member • Posts: 1,748
Re: Understanding ISO
In reply to mosswings, 5 months ago

mosswings wrote:

Austinian wrote:

Great Bustard wrote:

The most important factors about the sensor that manufacturers do not tell us are:

  • QE (Quantum Efficiency -- the proportion of light falling on the sensor that is recorded)
  • Read Noise (the additional electronic noise added by the sensor and supporting hardware)
  • CFA (Color Filter Array)
  • Microlens Efficiency

Which of these are likely to see significant improvements in the fairly near term (next few years)?

If by significant you mean something that might garner an extra stop or more of performance, it's hard to say. There has been recent work by Panasonic on a different type of CFA that eliminates the losses inherent in current implementations and that theoretically could yield a stop, perhaps more, in sensitivity:

http://www.imaging-resource.com/news/2013/02/05/bye-bye-bayer-panasonic-claims-new-sensor-tech-ends-color-filter-light-loss

It's been over a year since this announcement, but no updates. One of the problems of this technique may lie in tailoring the color splitting response to produce acceptable color rendition. Given all the complaints about current CFAs, it will need to be no worse.

From what little I've read about various difficulties in formulating appropriate dyes for CFAs, adding more constraints to the CFA dyes' requirements sounds like it might be difficult.

BSI (backside illumination) is the current go-to for increasing sensor QE. Sony calls it EXMOR-R and it is found on no larger than 1" sensors as it involves thinning the sensor wafer down to a few 10s of microns in order to expose the pixel wells from the backside. Doing this for a larger format sensor is still impractical as it dramatically weakens the chip.

Sony's trying to curve their sensor to eliminate corner vignetting and the need for tricky microlens tailoring. It also has the side effect of improving sensor response by straining the sensor lattice, but again it's an expensive technique.

Will this work for a wide variety of DSLR lenses, or will optical issues limit it to fixed-lens cameras, since existing lenses 'expect' a flat sensor?

Read noise can be attacked in several ways, but another way of dealing with the problem may be to redefine the entire imaging process. This is what Eric Fossum has been working on with his Quanta Image Sensor. It basically trades the charge-integrating approach of today for a photon-counting approach using a combination of extremely dense binary-response pixel arrays, high frame-capture rates, and heavy postprocessing - 100+MP arrays and 1000 frames/sec capture rates using fairly conventional CMOS technology. In doing so one can trade off resolution for DR, tailor tonal response directly, possibly compensate directly for camera motion, and so on. The downside is that it requires pixel read noise to be about 4 stops better, but since it's not trying to do linear amplification, more options for doing so are open to the designer. Research chips are in development now, but we have a long road to go.

I've read some of Eric's posts in PS&T about this. My uninformed reaction is that such image processing sounds like it may need a pretty healthy PC/GPU setup (at least), but maybe dedicated silicon could do it in-camera.

Thank you, good info.

Austinian's gear list:
Sony a77 II Sigma 10-20mm F4-5.6 EX DC HSM Sony DT 55-300mm F4.5-5.6 SAM Sony DT 35mm F1.8 SAM Sony DT 16-50mm F2.8 SSM +3 more
mosswings
Veteran Member • Posts: 6,075
Re: Understanding ISO
In reply to Austinian, 5 months ago

Austinian wrote:

mosswings wrote:

Austinian wrote:

Great Bustard wrote:

The most important factors about the sensor that manufacturers do not tell us are:

  • QE (Quantum Efficiency -- the proportion of light falling on the sensor that is recorded)
  • Read Noise (the additional electronic noise added by the sensor and supporting hardware)
  • CFA (Color Filter Array)
  • Microlens Efficiency

Which of these are likely to see significant improvements in the fairly near term (next few years)?

If by significant you mean something that might garner an extra stop or more of performance, it's hard to say. There has been recent work by Panasonic on a different type of CFA that eliminates the losses inherent in current implementations and that theoretically could yield a stop, perhaps more, in sensitivity:

http://www.imaging-resource.com/news/2013/02/05/bye-bye-bayer-panasonic-claims-new-sensor-tech-ends-color-filter-light-loss

It's been over a year since this announcement, but no updates. One of the problems of this technique may lie in tailoring the color splitting response to produce acceptable color rendition. Given all the complaints about current CFAs, it will need to be no worse.

From what little I've read about various difficulties in formulating appropriate dyes for CFAs, adding more constraints to the CFA dyes' requirements sounds like it might be difficult.

Panasonic's technique uses beam splitters, not dye filtration.  No absorption losses, but a very different color response than dye filters, and a completely different sort of color matrix than Bayer.

BSI (backside illumination) is the current go-to for increasing sensor QE. Sony calls it EXMOR-R and it is found on no larger than 1" sensors as it involves thinning the sensor wafer down to a few 10s of microns in order to expose the pixel wells from the backside. Doing this for a larger format sensor is still impractical as it dramatically weakens the chip.

Sony's trying to curve their sensor to eliminate corner vignetting and the need for tricky microlens tailoring. It also has the side effect of improving sensor response by straining the sensor lattice, but again it's an expensive technique.

Will this work for a wide variety of DSLR lenses, or will optical issues limit it to fixed-lens cameras, since existing lenses 'expect' a flat sensor?

That's one concern. As good as the best lenses are, they still have some field curvature. If the sensor's curvature is limited to matching this, then a curved sensor might be usable with existing lenses. But the easiest implementation is with fixed-lens cameras.

Read noise can be attacked in several ways, but another way of dealing with the problem may be to redefine the entire imaging process. This is what Eric Fossum has been working on with his Quanta Image Sensor. It basically trades the charge-integrating approach of today for a photon-counting approach using a combination of extremely dense binary-response pixel arrays, high frame-capture rates, and heavy postprocessing - 100+MP arrays and 1000 frames/sec capture rates using fairly conventional CMOS technology. In doing so one can trade off resolution for DR, tailor tonal response directly, possibly compensate directly for camera motion, and so on. The downside is that it requires pixel read noise to be about 4 stops better, but since it's not trying to do linear amplification, more options for doing so are open to the designer. Research chips are in development now, but we have a long road to go.

I've read some of Eric's posts in PS&T about this. My uninformed reaction is that such image processing sounds like it may need a pretty healthy PC/GPU setup (at least), but maybe dedicated silicon could do it in-camera.

Thank you, good info.

GPU power and data rates are astronomical, but on-sensor precomputation and 20nm CMOS integration densities will help, and there is some tradeoff between resolution, data rate, and power required. A big issue is getting that pixel read amp with 0.1e- noise.

mosswings's gear list:
Olympus XZ-1 Nikon D90 Nikon D7100 Nikon AF-S DX Nikkor 18-105mm f/3.5-5.6G ED VR Nikon AF-S Nikkor 70-300mm f/4.5-5.6G VR +1 more
MoreorLess
Senior Member • Posts: 3,012
Re: Fast lenses, and High ISO
In reply to Chikoo, 5 months ago

Chikoo wrote:

Fast lenses, as they are called, allow more light to hit the sensor and in turn allow faster shutter speeds. The f-number provides a relative measure of this ability.

In this age of ever increasing ISO, are fast lenses needed anymore? The only ability I see the fast lenses provide was actually a disadvantage that happened to become a feature, and that is shallow DoF, allowing for separation of subject from the background.

That said, should they be called Fast Lenses or Shallow Lenses?

You could of course say that improved noise performance isn't merely at high ISO but at all ISO levels; indeed, in recent years it's actually base ISO that has improved most, with greatly increased dynamic range.

So using a fast lens not only to control DOF but also to gain access to this improved performance is a reason many use them.

Chikoo
Senior Member • Posts: 1,630
Re: Understanding ISO
In reply to mosswings, 5 months ago

Austinian wrote:

Great Bustard wrote:

The most important factors about the sensor that manufacturers do not tell us are:

  • QE (Quantum Efficiency -- the proportion of light falling on the sensor that is recorded)
  • Read Noise (the additional electronic noise added by the sensor and supporting hardware)
  • CFA (Color Filter Array)
  • Microlens Efficiency

Which of these are likely to see significant improvements in the fairly near term (next few years)?

If by significant you mean something that might garner an extra stop or more of performance, it's hard to say. There has been recent work by Panasonic on a different type of CFA that eliminates the losses inherent in current implementations and that theoretically could yield a stop, perhaps more, in sensitivity:

http://www.imaging-resource.com/news/2013/02/05/bye-bye-bayer-panasonic-claims-new-sensor-tech-ends-color-filter-light-loss

It's been over a year since this announcement, but no updates.  One of the problems of this technique may lie in tailoring the color splitting response to produce acceptable color rendition. Given all the complaints about current CFAs, it will need to be no worse.

BSI (backside illumination) is the current go-to for increasing sensor QE.  Sony calls it EXMOR-R and it is found on no larger than 1" sensors as it involves thinning the sensor wafer down to a few 10s of microns in order to expose the pixel wells from the backside.  Doing this for a larger format sensor is still impractical as it dramatically weakens the chip.

Sony's trying to curve their sensor to eliminate corner vignetting and the need for tricky microlens tailoring.  It also has the side effect of improving sensor response by straining the sensor lattice, but again it's an expensive technique.

Read noise can be attacked in several ways, but another way of dealing with the problem may be to redefine the entire imaging process. This is what Eric Fossum has been working on with his Quanta Image Sensor. It basically trades the charge-integrating approach of today for a photon-counting approach using a combination of extremely dense binary-response pixel arrays, high frame-capture rates, and heavy postprocessing - 100+MP arrays and 1000 frames/sec capture rates using fairly conventional CMOS technology. In doing so one can trade off resolution for DR, tailor tonal response directly, possibly compensate directly for camera motion, and so on. The downside is that it requires pixel read noise to be about 4 stops better, but since it's not trying to do linear amplification, more options for doing so are open to the designer. Research chips are in development now, but we have a long road to go.

Use a transparent wafer for BSI?

Chikoo
Senior Member • Posts: 1,630
Re: Fast lenses, and High ISO
In reply to ultimitsu, 5 months ago

Chikoo wrote:

Chikoo wrote:

ultimitsu wrote:

Chikoo wrote:

The only ability I see the fast lenses provide was actually a disadvantage that happened to become a feature, and that is shallow DoF, allowing for separation of subject from the background.

how is it a disadvantage when you can stop the lens down?

The purpose of designing larger apertures was to get more light in. The shallow DoF is a side effect or a by product of doing so.

You did not answer the question.

An f/1.4 lens can stop down to at least f/16, often f/32. That ability is no different from that of an f/5.6 lens. So I am asking you again: how is it a disadvantage when you can stop the lens down?

At best, you could argue one cannot always make use of the large aperture advantage, specifically, when deeper DOF is required. But that is not what you said.

I apologize for not answering your question directly. If you look at photographic history, a picture contained everything you saw through the viewfinder, especially with rangefinders. With large apertures, that is not the case: the foreground and background blur away. Stopping down takes care of that, but then it is no longer a fast lens.

You still did not answer the question. It has already been established that a large aperture isn't always an advantage, but where stopping down is needed, a large-aperture lens can do so. So the question is still left unanswered:

How is it a disadvantage?

With more light, you lose background detail.

Chikoo
Senior Member • Posts: 1,630
Re: Fast lenses, and High ISO
In reply to Chikoo, 5 months ago

Chikoo wrote:

Chikoo wrote:

ultimitsu wrote:

Chikoo wrote:

The only ability I see the fast lenses provide was actually a disadvantage that happened to become a feature, and that is shallow DoF, allowing for separation of subject from the background.

how is it a disadvantage when you can stop the lens down?

The purpose of designing larger apertures was to get more light in. The shallow DoF is a side effect or a by product of doing so.

You did not answer the question.

An f/1.4 lens can stop down to at least f/16, often f/32. That ability is no different from that of an f/5.6 lens. So I am asking you again: how is it a disadvantage when you can stop the lens down?

At best, you could argue one cannot always make use of the large aperture advantage, specifically, when deeper DOF is required. But that is not what you said.

I apologize for not answering your question directly. If you look at photographic history, a picture contained everything you saw through the viewfinder, especially with rangefinders. With large apertures, that is not the case: the foreground and background blur away. Stopping down takes care of that, but then it is no longer a fast lens.

You still did not answer the question. It has already been established that a large aperture isn't always an advantage, but where stopping down is needed, a large-aperture lens can do so. So the question is still left unanswered:

How is it a disadvantage?

With more light, you lose background detail.

And you have large(r) lenses, heavier lenses, and in turn more expensive lenses.

Austinian
Senior Member • Posts: 1,748
Re: Understanding ISO
In reply to mosswings, 5 months ago

mosswings wrote:

Panasonic's technique uses beam splitters, not dye filtration. No absorption losses, but a very different color response than dye filters, and a completely different sort of color matrix than Bayer.

That is indeed an interesting idea. I just looked at the link you provided, but apparently Nature wants money to let me dig deeper. So I'll await further developments, since this isn't ready for prime time yet.

Their mention of angle-of-incidence problems sounds like it has implications for lens design. Oh, well, nothing is simple.

If you hear more, please post it.

Austinian's gear list:
Sony a77 II Sigma 10-20mm F4-5.6 EX DC HSM Sony DT 55-300mm F4.5-5.6 SAM Sony DT 35mm F1.8 SAM Sony DT 16-50mm F2.8 SSM +3 more
bobn2
Forum Pro • Posts: 32,301
Re: "fast" is relative
In reply to EinsteinsGhost, 5 months ago

EinsteinsGhost wrote:

bobn2 wrote:

Albert Silver wrote:

tko wrote:

Remember that F4.0 is considered kind of slow on FF, but is equal to F2.0 on M43rds, which is considered "fast."

That's not entirely accurate. You are describing the depth of field equivalence, from one sensor to the next, not the light. f/2 on a m43rds may have the depth of field of f/4 on a full-frame, but the light will still be f/2.

The 'light' of a FF f/2 and a FT f/4 will be the same, which is the point he is making. In the end, given equally efficient sensors, you can achieve the same result at the same shutter speed using an f/4 on FF as you can with an f/2 on FT. The density of the light at f/4 is one quarter, but there is a sensor with four times the area to collect it, so it ends up the same.


Bob

The point tko is making is wrong. DOF equivalence applies, exposure equivalence does not (for the reason you state above).

Whether the point tko is making is wrong or not depends entirely on what you think is the definition of 'fast'. If you think that 'fast' is to do with exposure when comparing between formats, then he is wrong. However, that's not a very sensible point of view, so if you assume that he thinks that 'fast' means 'puts more light on the sensor' then he is right.


Bob

bobn2
Forum Pro • Posts: 32,301
Re: "fast" is relative
In reply to Albert Silver, 5 months ago

Albert Silver wrote:

EinsteinsGhost wrote:

bobn2 wrote:

Albert Silver wrote:

tko wrote:

Remember that F4.0 is considered kind of slow on FF, but is equal to F2.0 on M43rds, which is considered "fast."

That's not entirely accurate. You are describing the depth of field equivalence, from one sensor to the next, not the light. f/2 on a m43rds may have the depth of field of f/4 on a full-frame, but the light will still be f/2.

The 'light' of a FF f/2 and a FT f/4 will be the same, which is the point he is making. In the end, given equally efficient sensors, you can achieve the same result at the same shutter speed using an f/4 on FF as you can with an f/2 on FT. The density of the light at f/4 is one quarter, but there is a sensor with four times the area to collect it, so it ends up the same.


Bob

The point tko is making is wrong. DOF equivalence applies, exposure equivalence does not (for the reason you state above).

That was my point, though I think Bob missed it.

I didn't miss anything. 'Exposure' doesn't mean the same as 'the light'. It doesn't make much sense to base your assessment of what is 'fast' on exposure in cross-format comparisons.


Bob

mosswings
Veteran Member • Posts: 6,075
Re: Understanding ISO
In reply to Chikoo, 5 months ago

Chikoo wrote:

Austinian wrote:

Great Bustard wrote:

The most important factors about the sensor that manufacturers do not tell us are:

  • QE (Quantum Efficiency -- the proportion of light falling on the sensor that is recorded)
  • Read Noise (the additional electronic noise added by the sensor and supporting hardware)
  • CFA (Color Filter Array)
  • Microlens Efficiency

Which of these are likely to see significant improvements in the fairly near term (next few years)?

If by significant you mean something that might garner an extra stop or more of performance, it's hard to say. There has been recent work by Panasonic on a different type of CFA that eliminates the losses inherent in current implementations and that theoretically could yield a stop, perhaps more, in sensitivity:

http://www.imaging-resource.com/news/2013/02/05/bye-bye-bayer-panasonic-claims-new-sensor-tech-ends-color-filter-light-loss

It's been over a year since this announcement, but no updates. One of the problems of this technique may lie in tailoring the color splitting response to produce acceptable color rendition. Given all the complaints about current CFAs, it will need to be no worse.

BSI (backside illumination) is the current go-to for increasing sensor QE. Sony calls it EXMOR-R and it is found on no larger than 1" sensors as it involves thinning the sensor wafer down to a few 10s of microns in order to expose the pixel wells from the backside. Doing this for a larger format sensor is still impractical as it dramatically weakens the chip.

Sony's trying to curve their sensor to eliminate corner vignetting and the need for tricky microlens tailoring. It also has the side effect of improving sensor response by straining the sensor lattice, but again it's an expensive technique.

Read noise can be attacked in several ways, but another way of dealing with the problem may be to redefine the entire imaging process. This is what Eric Fossum has been working on with his Quanta Image Sensor. It basically trades the charge-integrating approach of today for a photon-counting approach using a combination of extremely dense binary-response pixel arrays, high frame-capture rates, and heavy postprocessing - 100+MP arrays and 1000 frames/sec capture rates using fairly conventional CMOS technology. In doing so one can trade off resolution for DR, tailor tonal response directly, possibly compensate directly for camera motion, and so on. The downside is that it requires pixel read noise to be about 4 stops better, but since it's not trying to do linear amplification, more options for doing so are open to the designer. Research chips are in development now, but we have a long road to go.

Use a transparent wafer for BSI?

For electronic and manufacturing reasons you have to start with silicon, which isn't transparent. But beyond that, you have to clear away all the wiring and other structures to gain QE, which is what BSI does - it flips the chip over so light enters from the side where there isn't any wiring. The flipped wafer can then be cemented to a carrier substrate - a ceramic carrier, or something like that - and the back of the wafer ground down to the required thickness. A lot of today's ICs include what amounts to a pass with a wet sander.

mosswings's gear list:
Olympus XZ-1 Nikon D90 Nikon D7100 Nikon AF-S DX Nikkor 18-105mm f/3.5-5.6G ED VR Nikon AF-S Nikkor 70-300mm f/4.5-5.6G VR +1 more
Lee Jay
Forum Pro • Posts: 45,309
Re: Fast lenses, and High ISO
In reply to Chikoo, 5 months ago

Chikoo wrote:

Chikoo wrote:

Chikoo wrote:

ultimitsu wrote:

Chikoo wrote:

The only ability I see the fast lenses provide was actually a disadvantage that happened to become a feature, and that is shallow DoF, allowing for separation of subject from the background.

how is it a disadvantage when you can stop the lens down?

The purpose of designing larger apertures was to get more light in. The shallow DoF is a side effect or a by product of doing so.

You did not answer the question.

A F/1.4 lens can stop down to at the least F/16, often F/32. That ability is not different to that of an F/5.6 lens. So I am asking you again, how is it a disadvantage when you can stop the lens down?

At best, you could argue one cannot always make use of the large aperture advantage, specifically, when deeper DOF is required. But that is not what you said.

I apologize for nor answering your question directly. If you look at photo history, a picture contains everything you see through the viewfinder esp rangefinders. With larger apertures, that is not the case. The foreground and background blurs away. Stepping down takes care of that, but then it is no longer a fast lens.

You still did not answer the question. It has already been established that large aperture isnt always an advantage, but where stopping down is needed, large aperture lens can do so. So the question is still left unanswered:

How is it a disadvantage?

With more light, you lose background detail.

And you have large(r) lenses, heavier lenses, and in turn more expensive lenses.

Size and cost are indeed disadvantages, but there are generally no photographic disadvantages of having the ability to go faster. I've shot deep DOF shots at f/1.4 on full frame. In many cases, especially with larger subjects, the DOF change from f/1.4 to f/5.6 is not material, and thus the advantages of having a brighter lens are all about better image quality with no photographic disadvantages.
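To put numbers on how DOF shrinks (or doesn't) with aperture, here is a minimal sketch using the standard thin-lens depth-of-field approximation. The 50 mm focal length, 5 m subject distance, and 0.030 mm circle of confusion are illustrative assumptions, not figures from this thread:

```python
def dof_limits(f_mm, N, s_mm, coc_mm=0.030):
    """Near/far depth-of-field limits from the thin-lens approximation.
    f_mm: focal length, N: f-number, s_mm: subject distance,
    coc_mm: circle of confusion (0.030 mm is a common full-frame value)."""
    H = f_mm ** 2 / (N * coc_mm) + f_mm          # hyperfocal distance
    near = s_mm * (H - f_mm) / (H + s_mm - 2 * f_mm)
    far = s_mm * (H - f_mm) / (H - s_mm) if s_mm < H else float("inf")
    return near, far

# 50 mm lens on full frame, subject at 5 m: compare f/1.4 and f/5.6.
for N in (1.4, 5.6):
    near, far = dof_limits(50, N, 5000)
    print(f"f/{N}: total DOF ~ {(far - near) / 1000:.2f} m")
```

Varying the subject distance shows the point being argued: the absolute DOF difference between apertures matters less as framing and distance change.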

-- hide signature --

Lee Jay

 Lee Jay's gear list:
Canon IXUS 310 HS Canon PowerShot SX260 HS Canon EOS 5D Canon EOS 20D Canon EOS 550D +23 more
Reply   Reply with quote   Complain
bobn2
Forum ProPosts: 32,301
Like?
Re: Understanding ISO
In reply to mosswings, 5 months ago

mosswings wrote:

Chikoo wrote:

Austinian wrote:

Great Bustard wrote:

The most important factors about the sensor that manufacturers do not tell us are:

  • QE (Quantum Efficiency -- the proportion of light falling on the sensor that is recorded)
  • Read Noise (the additional electronic noise added by the sensor and supporting hardware)
  • CFA (Color Filter Array)
  • Microlens Efficiency

Which of these are likely to see significant improvements in the fairly near term (next few years)?

If by significant you mean something that might garner an extra stop or more of performance, it's hard to say. There has been recent work by Panasonic on a different type of CFA that eliminates the losses inherent in current implementations and that theoretically could yield a stop, perhaps more, in sensitivity:

http://www.imaging-resource.com/news/2013/02/05/bye-bye-bayer-panasonic-claims-new-sensor-tech-ends-color-filter-light-loss

It's been over a year since this announcement, but no updates. One of the problems of this technique may lie in tailoring the color splitting response to produce acceptable color rendition. Given all the complaints about current CFAs, it will need to be no worse.

BSI (backside illumination) is the current go-to technique for increasing sensor QE. Sony calls it EXMOR-R, and it is found only on sensors no larger than 1 inch, as it involves thinning the sensor wafer down to a few tens of microns in order to expose the pixel wells from the backside. Doing this for a larger-format sensor is still impractical because it dramatically weakens the chip.

Sony's trying to curve their sensor to eliminate corner vignetting and the need for tricky microlens tailoring. It also has the side effect of improving sensor response by straining the sensor lattice, but again it's an expensive technique.

Read noise can be attacked in several ways, but another way of dealing with the problem may be to redefine the entire imaging process. This is what Eric Fossum has been working on with his Quanta Image Sensor. It basically trades today's charge-integrating approach for a photon-counting approach, using a combination of extremely dense binary-response pixel arrays (100+ MP), high frame-capture rates (1000 frames/sec), and heavy postprocessing on fairly conventional CMOS technology. In doing so one can trade off resolution for DR, tailor tonal response directly, possibly compensate directly for camera motion, and so on. The downside is that it requires pixel read noise to be about 4 stops better, but since it is not trying to do a linear amplification, more options for achieving that are open to the designer. Research chips are in development now, but there is a long road to go.
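The photon-counting idea can be illustrated with a toy model. This is a sketch under simplifying assumptions (ideal noise-free single-bit "jots", Poisson photon arrivals), not Fossum's actual reconstruction algorithm: each jot reads 1 if it catches at least one photon, and the light level is recovered from the fraction of fired jots alone.

```python
import math
import random

def qis_estimate(lam, n_jots=200_000, seed=1):
    """Toy quanta-image-sensor model. For Poisson arrivals with mean
    lam photons per jot, P(jot fires) = 1 - exp(-lam). Inverting the
    measured firing fraction recovers the light level from purely
    binary (1-bit) data - no analog charge integration involved."""
    rng = random.Random(seed)
    p_fire = 1 - math.exp(-lam)
    fired = sum(rng.random() < p_fire for _ in range(n_jots))
    frac = fired / n_jots
    return -math.log(1 - frac)   # estimated mean photons per jot

print(qis_estimate(0.5))  # close to the true rate of 0.5
```

The log inversion also hints at why tonal response can be tailored directly: the binary data is a nonlinear function of exposure to begin with.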

Use a transparent wafer for BSI?

For electronic and manufacturing reasons you have to start with silicon, which isn't. But beyond that, you have to clear away all the wiring and other structures to gain QE, which is what BSI does - it flips the chip over where there isn't any wiring. The flipped wafer can then be cemented to a carrier substrate - a ceramic carrier, or something like that - and the back of the wafer ground down to the required thickness. A lot of today's ICs include what amounts to a pass with a wet sander.

There are two processes for BSI. One grows silicon epitaxy on a transparent (sapphire) wafer and then builds the circuits in the silicon. The other is the process you describe. I suspect that wafer thinning by grinding, as you describe, would cause too much silicon damage for it to be used; more likely the thinning is done by etching, dry or wet.

-- hide signature --

Bob

Reply   Reply with quote   Complain
EinsteinsGhost
Forum ProPosts: 11,977Gear list
Like?
Re: "fast" is relative
In reply to bobn2, 5 months ago

bobn2 wrote:

EinsteinsGhost wrote:

bobn2 wrote:

Albert Silver wrote:

tko wrote:

Remember that F4.0 is considered kind of slow on FF, but is equal to F2.0 on M43rds, which is considered "fast."

That's not entirely accurate. You are describing the depth of field equivalence, from one sensor to the next, not the light. f/2 on a m43rds may have the depth of field of f/4 on a full-frame, but the light will still be f/2.

The 'light' of a FF f/2 and a FT f/4 will be the same, which is the point he is making. In the end, given equally efficient sensors, you can achieve the same result at the same shutter speed using an f/4 on FF as you can on FT. The density of the light of the f/4 is one quarter but there is a sensor four times the area to collect it, so it ends up the same.

-- hide signature --

Bob

The point tko is making is wrong. DOF equivalence applies, exposure equivalence does not (for the reason you state above).

Whether the point tko is making is wrong or not depends entirely on what you think is the definition of 'fast'. If you think that 'fast' is to do with exposure when comparing between formats, then he is wrong. However, that's not a very sensible point of view, so if you assume that he thinks that 'fast' means 'puts more light on the sensor' then he is right.

Fast has only one meaning: Shutter Speed, as in exposure time, the time value part of exposure. And that makes you just as much wrong as tko.

Photographic exposure is independent of media size.
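Both points can be reconciled numerically. This back-of-the-envelope sketch uses nominal sensor dimensions (36x24 mm for full frame, 17.3x13 mm for Four Thirds) and an arbitrary scene-luminance scale; exposure (light per unit area) depends only on the f-number, while total collected light also scales with sensor area:

```python
def total_light(f_number, sensor_area_mm2, exposure_s=1.0, scene_lum=1.0):
    """Relative total light collected in one shot. Illuminance on the
    sensor scales as 1/N^2 (format-independent exposure), but the sensor
    integrates over its whole area, so total light ~ area / N^2."""
    return scene_lum * exposure_s * sensor_area_mm2 / f_number ** 2

ff = total_light(4.0, 36 * 24)      # full frame at f/4
ft = total_light(2.0, 17.3 * 13.0)  # Four Thirds at f/2
print(round(ff / ft, 2))            # -> 0.96: nearly identical total light
```

So f/2 on Four Thirds and f/4 on full frame give the same exposure-time options and nearly the same total light, which is the sense in which the two are "equivalent".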

 EinsteinsGhost's gear list:
Sony Cyber-shot DSC-F828 Sony SLT-A55 Sony Alpha NEX-6 Sigma 18-250mm F3.5-6.3 DC OS HSM Sony 135mm F2.8 (T4.5) STF +12 more
Reply   Reply with quote   Complain
bobn2
Forum ProPosts: 32,301
Like?
Re: "fast" is relative
In reply to EinsteinsGhost, 5 months ago

EinsteinsGhost wrote:

bobn2 wrote:

EinsteinsGhost wrote:

bobn2 wrote:

Albert Silver wrote:

tko wrote:

Remember that F4.0 is considered kind of slow on FF, but is equal to F2.0 on M43rds, which is considered "fast."

That's not entirely accurate. You are describing the depth of field equivalence, from one sensor to the next, not the light. f/2 on a m43rds may have the depth of field of f/4 on a full-frame, but the light will still be f/2.

The 'light' of a FF f/2 and a FT f/4 will be the same, which is the point he is making. In the end, given equally efficient sensors, you can achieve the same result at the same shutter speed using an f/4 on FF as you can on FT. The density of the light of the f/4 is one quarter but there is a sensor four times the area to collect it, so it ends up the same.

-- hide signature --

Bob

The point tko is making is wrong. DOF equivalence applies, exposure equivalence does not (for the reason you state above).

Whether the point tko is making is wrong or not depends entirely on what you think is the definition of 'fast'. If you think that 'fast' is to do with exposure when comparing between formats, then he is wrong. However, that's not a very sensible point of view, so if you assume that he thinks that 'fast' means 'puts more light on the sensor' then he is right.

Fast has only one meaning:

Well that is obviously not true.

Shutter Speed, as in exposure time, the time value part of exposure.

Obviously it doesn't mean 'shutter speed', otherwise it would be 'shutter speed'. And 'fast' as used by people relates to aperture, not 'shutter speed'. So you are very obviously wrong. What it looks like is you have an agenda but haven't thought things through enough to successfully defend yourself, so you end up saying silly things like that. Let me help you out. A 'fast' lens is called a 'fast' lens because it allows you to set a 'fast' shutter speed. That begs the question, what would stop you setting any shutter speed that you want? My answer would be that you might not want to set too fast a shutter speed because it would result in a lower quality image than you would be satisfied with.

And that makes you just as much wrong as tko.

Or just as right, more likely in fact.

Photographic exposure is independent of media size.

Yes, I know that. The real question is how relevant is exposure when making cross format comparisons?

-- hide signature --

Bob

Reply   Reply with quote   Complain