Deep sub-electron read noise is coming

Eric Fossum

I guess this is more of a tease since these papers won't be easily available for a little while, but here are two Gigajot abstracts reporting on two separate sensors. Gigajot is the company spun out of Dartmouth with my two PhD-student co-founders.
  • Abstract—This paper reports a 16.7 Mpixel, 3D-stacked backside-illuminated Quanta Image Sensor (QIS) with 1.1 µm-pitch pixels which achieves 0.19 e- rms array read noise and 0.12 e- rms best single-pixel read noise under room-temperature operation. The accurate photon-counting capability enables superior imaging performance under ultra-low-light conditions. The sensor supports programmable analog-to-digital converter (ADC) resolution from 1 to 14 bits and video frame rates up to 40 fps at 4096 × 4096 resolution and 600 mW power consumption.
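To give forum members a feel for why deep sub-electron read noise is the headline number, here is a toy Monte Carlo sketch (my own illustration, not from the paper) comparing photon-counting error rates at different read-noise levels. It assumes an ideal conversion to electron units, a mean of two photoelectrons per read, and simply rounds each noisy read to the nearest integer count:

```python
import numpy as np

rng = np.random.default_rng(0)

def count_error_rate(read_noise_e, mean_photons=2.0, n=1_000_000):
    """Fraction of reads where rounding the noisy signal to the
    nearest integer misidentifies the true photoelectron count."""
    true_e = rng.poisson(mean_photons, n)               # photoelectrons per read
    signal = true_e + rng.normal(0.0, read_noise_e, n)  # add Gaussian read noise
    return np.mean(np.round(signal) != true_e)

for rn in (0.19, 0.35, 1.0):
    print(f"read noise {rn:4.2f} e- rms -> count error rate {count_error_rate(rn):.4f}")
```

At 0.19 e- rms the error rate is well under 1%, while at 1 e- rms counting is essentially hopeless; this is the usual argument for why roughly 0.3 e- rms is quoted as the threshold for true photon counting.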
This paper will be open-access from IEEE Electron Device Letters soon, but if you are an IEEE member you can access the preview version now here.

The second paper will be presented in a few months at the VLSI Symposia in Japan.
  • A Photon-Counting 4Mpixel Stacked BSI Quanta Image Sensor with 0.3e- Read Noise and 100dB Single-Exposure Dynamic Range,
    J. Ma, D. Zhang, O. Elgendy and S. Masoodian, Gigajot Technology Inc., USA
    This paper reports a 4 Mpixel, 3D-stacked backside-illuminated Quanta Image Sensor (QIS) with 2.2 µm pixels that can operate simultaneously in photon-counting mode with deep sub-electron read noise (0.3 e- rms) and in linear integration mode with large full-well capacity (30k e-). A single-exposure dynamic range of 100 dB is realized with this dual-mode readout at room temperature. This QIS device uses a cluster-parallel readout architecture to achieve frame rates up to 120 fps at 550 mW power consumption.
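As a quick sanity check on the 100 dB claim (my own back-of-the-envelope arithmetic, not from the paper), the single-exposure dynamic range follows directly from the quoted full-well capacity and photon-counting read noise:

```python
import math

full_well_e = 30_000   # linear-mode full-well capacity, e-
read_noise_e = 0.3     # photon-counting-mode read noise, e- rms

# DR (dB) = 20 * log10(max signal / noise floor)
dr_db = 20 * math.log10(full_well_e / read_noise_e)
print(f"single-exposure dynamic range = {dr_db:.0f} dB")  # -> 100 dB
```

The dual-mode readout is what lets one exposure use the 0.3 e- floor and the 30k e- ceiling at the same time.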
Since I am not a co-author on either paper, I can say I am very proud of the Gigajot team. BTW, both sensors are color (and/or monochrome).

Will update this when the actual papers are published, including lab camera images.
 
Thank you Eric. This will be the first time in a while that I've taken advantage of my IEEE membership.
 
This paper will be open-access from IEEE Electron Device Letters soon, but if you are an IEEE member you can access the preview version now here.
Seems that the PDF is open access: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9402860
 
This is well outside my comfort zone (antenna engineer), but the data in Figure 1 seem very impressive - no half photons!

S.
 
This paper will be open-access from IEEE Electron Device Letters soon, but if you are an IEEE member you can access the preview version now here.
Nice work; looks a lot further along now. Pretty interesting use of two substrates... I guess that's where all the high-end imaging sensors are headed, not just QIS. Seems like here all the noise-generating stuff is on the ISP substrate, rather than having a purely analog/digital split....
 
Thanks for the link. Very interesting.

Just for general interest of the forum members, here is a photo of a fun little promotional item being given out at Photonics West a couple of years ago. I hung this one on our Christmas tree last year.

Bonus points for those who make the connection between the logo and one of the figures in the paper.





[Photo: Gigajot promotional ornament]
 
Ha! I didn't get one of those!
 
Nice work; looks a lot further along now. Pretty interesting use of two substrates... I guess that's where all the high-end imaging sensors are headed, not just QIS. Seems like here all the noise-generating stuff is on the ISP substrate, rather than having a purely analog/digital split....
The split is in the process/technology node used to make the two layers. The pixel layer is an abbreviated process at the tighter node, and the mixed-signal layer is in the other, larger-dimension node.

While pickup noise (clocks, etc.) could be a problem, generally design techniques and timing ensure that it is not.

Read noise is dominated by noise, mostly 1/f, in the first transistor. A mobility-fluctuation model seems to be the best fit to the data.
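For those curious what 1/f-dominated read noise looks like, here is a small sketch (my own illustration with made-up numbers, not Gigajot data) that synthesizes a 1/f voltage-noise trace by shaping white noise in the frequency domain:

```python
import numpy as np

rng = np.random.default_rng(1)
n, fs = 1 << 16, 1e6            # samples and sample rate (Hz); illustrative values
white = rng.normal(size=n)

# Shape white noise to a 1/f power spectrum: amplitude scales as 1/sqrt(f)
spec = np.fft.rfft(white)
f = np.fft.rfftfreq(n, d=1.0 / fs)
f[0] = f[1]                     # avoid dividing by zero at DC
pink = np.fft.irfft(spec / np.sqrt(f), n)

pink *= 100e-6 / pink.std()     # scale to a made-up 100 uV rms at the first transistor
print(f"1/f trace rms = {pink.std() * 1e6:.1f} uV")
```

Dividing an output-referred noise voltage like this by the conversion gain (µV per e-) gives input-referred electrons; a big part of getting below one electron is the very high conversion gain of these small QIS pixels.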
 
Really cool stuff, Eric, and great progress.

I see that writing conference paper summaries is still as refined an art as always - concinnitizing until you can't any further. Reading summaries is a very slow process, because every word is significant and the technical shorthand is thick. It took me back to when our group presented a new circuit design... somehow we compressed tens of pages into just two. It was fun to "unzip" your summary and realize how many terms of art have come into use since I retired.
 
Gigajot did a press release, picked up by the Image Sensor World blog, which includes the Gigajot camera, or "camera design kit".

Blog post on Gigajot camera and QIS chips
Eric, why is the quantum efficiency increasing at about 310 nm and then continuing to increase below 250 nm?

Is this still simply due to ionization? Or is something else happening in the QIS detection electronics?

I typically see a bump in quantum efficiency at and around 250 nm due to ionization, like this QE plot for the GSENSE2020BSI sensor:

https://www.gpixel.com/products/area-scan-en/gsense-series/gsense2020bsi/

But the curves in the Gigajot press release don't look like what I typically see, with a drop-off into UVC.
 
Gigajot did a press release, picked up by the Image Sensor World blog, which includes the Gigajot camera, or "camera design kit".

Blog post on Gigajot camera and QIS chips
Very nice!

I just sent them a note about my group potentially working with them to implement our TDCI (Time Domain Continuous Imaging) processing on the QIS data as open source software. As a reminder, this is the scheme my group has been using that models each pixel's value as a waveform over time and combines that with an error model to synthesize virtual exposures after capture. The best single-point reference is TIK: a time domain continuous imaging testbed using conventional still images and video, and there are also slides from the 2017 EI presentation here.

Of course, what I'd really like would be to entirely lose the concept of frames in the data stream from the chip... just emit temporally-compressed TDCI waveforms... but one step at a time. ;-)
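For anyone wondering what "synthesizing virtual exposures after capture" means concretely, here is a minimal sketch (a toy of my own with made-up numbers, not the actual TIK code) that models one pixel as a piecewise-constant radiance waveform and integrates it over an exposure window chosen after the fact:

```python
import numpy as np

# Toy TDCI-style record for one pixel: a radiance estimate (arbitrary
# units) held constant between estimated change times.
change_times = np.array([0.00, 0.13, 0.40, 0.75, 1.00])  # seconds
radiance     = np.array([5.0, 12.0, 3.0, 8.0])           # value on each interval

def virtual_exposure(t0, t1):
    """Integrate the piecewise-constant waveform over [t0, t1]."""
    total = 0.0
    for i, r in enumerate(radiance):
        lo = max(t0, change_times[i])
        hi = min(t1, change_times[i + 1])
        if hi > lo:
            total += r * (hi - lo)
    return total

# Any exposure interval can be rendered after capture:
print(virtual_exposure(0.10, 0.50))  # a 0.4 s virtual exposure
print(virtual_exposure(0.00, 1.00))  # the full-interval exposure
```

The real scheme also carries an error model per waveform segment, but the after-the-fact integration above is the core idea.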
 
Eric, why is the quantum efficiency increasing at about 310 nm and then continuing to increase below 250 nm?
I don't have any first-hand knowledge of the measurements in this UV range at Gigajot, and aside from quantum-yield arguments (hand-waving at this point) I cannot explain the trend or bumps. Calibration in this range is also tricky. Right now, I would take the data in the UV range with a grain of salt.

(FYI, Dartmouth is about 3,000 miles away from Gigajot - almost the full diagonal of the USA.)
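To put one rough number on the quantum-yield hand-waving (my own crude estimate, not a Gigajot measurement): in silicon, a UV photon carrying well more than the ~3.65 eV electron-hole pair-creation energy can generate more than one carrier, so QE measured in electrons per photon can keep climbing into the deep UV. A naive model:

```python
# Crude quantum-yield estimate for silicon in the UV (illustration only).
# Real devices depend strongly on backside passivation and processing,
# and the onset of carrier multiplication is gradual, not a hard knee.
for wavelength_nm in (400, 310, 250, 200):
    e_photon_ev = 1239.84 / wavelength_nm      # photon energy, eV
    qy = max(1.0, e_photon_ev / 3.65)          # naive carrier-multiplication model
    print(f"{wavelength_nm} nm: {e_photon_ev:.2f} eV, quantum yield ~ {qy:.2f}")
```

Take this with the same grain of salt as the UV data itself.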
 
I just sent them a note about my group potentially working with them to implement our TDCI (Time Domain Continuous Imaging) processing on the QIS data as open source software.
Thanks Hank. I remember your approach from earlier conversations. (Well, mostly remember.) I don't know if Gigajot will be able to respond. They need all hands on deck focused on creating products for sale and moving from startup towards profitability. They barely have time to respond to small requests "from home" that require any real resources. Thanks for your understanding.
 
Right now, I would take the data in the UV range with a grain of salt.
Thank you Eric for the response. I figured this might be more complex and have opened a dedicated thread for the discussion. I can see this being a future research paper for someone.

There aren't many of us, but we UV photographers want every drop of UV from our sensors and this looks promising.
 
I don't know if Gigajot will be able to respond. They need all hands on deck focused on creating products for sale and moving from startup towards profitability.
I'm familiar with start-up pressures... but if I didn't ask, the probability is zero. ;-)

I've also been working with a group of physicists creating a high-resolution IFU spectrograph instrument, and this could be a better sensor for that than the QHY600M currently planned. The instrument could be a very high-profile application, but obviously it's never going to be a high-sales-volume thing (perhaps tens of copies).

On the other hand, the TDCI software could give Gigajot a marketing edge very quickly. We'll just have to see how their priorities fall. I assume at this stage they'll be looking more to partner with companies like Basler or FLIR than with academic open-source software researchers/developers. Then again, Basler has always been very willing to work with us academics... ;-)
 
