a7RII self-heating for long exposures

Jim sent me one of the A7II raws. I created 150 samples of 400x400 pixels across the entire image and found that only 23 of the 150 had a std. dev. higher than 90 in the red channel. Here's a RawDigger screen grab with those 23 squares highlighted:

A7II Infrared 400x400 Noise Plots
It's the hot pixels affecting the deviation measurement. The deviations go away if you set "Data Processing -> Selection/Sample stats: discard abnormal pixel values". That doesn't answer the question, though, of why the red channel uniquely has so many hot pixels. Btw, the std dev goes down for all channels with "discard abnormal pixel values" set, so there are plenty of "warm" pixels from the long exposure on the other channels too.
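For anyone curious what "discard abnormal pixel values" is doing to the statistics, here is a toy sketch. I'm assuming RawDigger does something like a robust sigma-clip; the exact algorithm is theirs, and all the numbers below are synthetic, not taken from Jim's raws.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate one 400x400 red-channel sample: Gaussian noise around a
# black level of 512 (illustrative values, not measured from the raws).
tile = rng.normal(loc=512.0, scale=30.0, size=(400, 400))

# Inject a few hot pixels, as seen in the long-exposure frames.
hot = rng.choice(tile.size, size=20, replace=False)
tile.flat[hot] = 16000.0

raw_std = tile.std()

# Mimic "discard abnormal pixel values": sigma-clip around the median
# before computing the deviation, using the MAD as a robust sigma.
med = np.median(tile)
mad_sigma = 1.4826 * np.median(np.abs(tile - med))
keep = np.abs(tile - med) < 5.0 * mad_sigma
clipped_std = tile[keep].std()

print(f"std dev with hot pixels:  {raw_std:.1f}")
print(f"std dev after discarding: {clipped_std:.1f}")
```

Twenty hot pixels out of 160,000 are enough to inflate the standard deviation several-fold, which is why the highlighted squares stood out.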
Nice work. Is it possible that the sensor was damaged during the IR conversion? I seem to vaguely remember someone complaining about that happening to them.

Jim
Look at the samples again: there are hot pixels on the other channels too. I only focused on the red ones at first since that's what we were chasing. It was just random chance that the center 400x400 samples had only red ones :) For example, here are the samples containing hot pixels on the G1 channel:

A7II Infrared 400x400 Green Hot Pixels
Yes, I noticed that, too. But there are so many more hot pixels than with the unmodified a7II that I'm wondering why. It doesn't look like IR, per se, anymore. I never did this kind of testing on the now-modified a7II before I sent it off to the IR modding shop. It could have been that way since day one, but I dunno.

Jim
 
IR filter removed from a BSI sensor? Ummm...
Iliah, the camera that was IR modified was an a7II, not an a7RII.

Jim
 
Yes, thank you, I realized that too late to withdraw the post.
 
Can you send me one of the non-IR-modified A7II raws, taken at about the same point in the sequence as the IR raw you sent me? (So that the temperatures will be comparable.)
Sure. It may be a few hours.

Jim
 
Jim, hey thanks for all your efforts and sharing your findings. Maybe Sony should enlist your help at some point and compensate you! On the flip side you haven't noticed any Sony vans or black sedans hanging out in your neighborhood have you?!?! ;-)

I have a question about your A7II and A7II IR tests... In theory, shouldn't the ramps due to heating be somewhat proportional in each camera, with the red channel being more pronounced? From your tests it looks like the A7II with the IR conversion shows much less overall heating than the unconverted A7II. I'm curious whether you have a thought on what's causing the difference, or if I'm missing something.

I haven't used my A7RII for anything at night yet, but for what I have used it for, I'm still loving everything about it. The images it produces are spectacular. Every time I pick it up I forget more and more about my years as a Canon shooter and wish I could go back and reshoot some of my images with the Sony.
 
wait, does this mean I shouldn't keep my spare battery in my back jeans pocket? :-O

I am not sure how one tells the difference between white-colored noise and stars, exactly. Any tips?
 
I don't know, but no noise reduction was applied to the prior example; I was a little heavy on the exposure slider. Here is another example with a little noise reduction applied. To the naked eye it really is this populated with stars; it's considered an area with sub-arcsecond seeing. I'm not that experienced with nightscapes, and I haven't seen many from the A7RII, so here is another effort that came out a little sharper.



 
So does the A7-II have the problem too? Here's what I understand from this hot topic:

Hot pixel problem for astrophoto:
  • A7R-II
Unknown:
  • A7-II ????
No hot pixel problem for astrophoto:
  • A7
  • A7S
  • A7R
  • A6000
 
This looks pretty good, I think; at least for my purposes at the moment, it would be. Most of this looks like stars rather than noise: you can tell from the slight movement in the stars. It does look sharp.

I am thinking that when it is colder at night, I may get some better results myself. Though I don't have a wide lens with an aperture wider than f/2.8, and my Rokinon 14mm is fine but not the sharpest in the arsenal wide open. I may have to go for a softer look...

Now if we would only get a good clear night! I am in a good place for starscapes with little light pollution, but we have had smoke and clouds for several days now.
 
But in that case, shouldn't all Sony Alpha cameras with IBIS have the problem?

Same thing for Pentax and Olympus cameras: they have IBIS too.
The a7RII sensor is BSI (the A7II's is not), is denser (more pixels per unit area than the A7II), runs at a faster pixel clock/readout rate than the A7II, and is larger than Pentax/Olympus sensors.
All true. But I can't figure out what consumes so much power when live view is off and the chip is only being read once every thirty seconds. It doesn't seem like it would take much power to back-bias 42 million photodiodes. The comparator clock doesn't even need to be running.
It brings to mind the Olympus E-M5's "Live Time" feature. In a nutshell, it is an exposure preview during a long exposure in bulb mode.

I wonder how Olympus achieves that. My initial thought: did Olympus find a magic trick that can sample the frame during the exposure without interrupting it? If so, what goes on behind the sensor during the exposure is much more complicated than I thought.

On second thought, maybe Olympus used an electronic shutter trick: expose for 5 seconds, interrupt with the electronic shutter, stack the exposures, display the preview, then expose for another 5 seconds. In that case the exposure would be interrupted for at least 1/20 s every 5 seconds, and we might find gaps in star trails in the final photo. However, the E-M5 doesn't offer an electronic shutter feature at all; why would Olympus hide a silent shutter function if the hardware were physically capable of it?

Third thought: maybe Olympus doesn't need to reset the whole frame during bulb. It could reset and sample just enough pixels for live view, say 1.44M of the 16M total.
All the photodiodes in a CMOS array are buffered by source followers. It's not like a CCD; there's no reason you can't read the charge as it builds up. That said, the a7x cameras don't work that way. The liveview screen is black during the exposure.

Jim

--
http://blog.kasson.com
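Jim's point about source followers is what would make a "Live Time"-style preview cheap: the accumulating charge can be sampled without resetting it. A toy sketch of that readout pattern (all numbers invented for illustration):

```python
# Toy sketch of a "Live Time"-style preview: the sensor keeps
# integrating, and every few seconds we take a nondestructive
# readout to update the on-screen preview.
photon_rate = 10.0      # electrons/sec at some pixel (made up)
read_interval = 5       # seconds between preview readouts
total_time = 30         # bulb exposure length in seconds

previews = []
charge = 0.0
for t in range(1, total_time + 1):
    charge += photon_rate           # charge keeps building up
    if t % read_interval == 0:
        previews.append(charge)     # nondestructive read: no reset

print(previews)  # → [50.0, 100.0, 150.0, 200.0, 250.0, 300.0]
```

Because nothing is reset between readouts, the final frame is simply the last accumulated value; the preview costs readout power but no exposure gaps.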
I just wonder: if a CMOS sensor can sample pixels without disrupting the electron buildup, why has no manufacturer come out with a super single-shot HDR mode?

You could stop sampling the highlight portions of the scene early, keep sampling the shadow areas until the desired exposure is reached, and then merge them into an HDR photo from a single exposure.
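That per-pixel scheme can be sketched numerically: freeze each pixel just before it clips, then normalize by its own integration time. This is a toy model of the idea, not any shipping sensor's logic; the rates, full-well value, and time step are all invented.

```python
import numpy as np

full_well = 1000.0
scene = np.array([5000.0, 50.0, 2.0])   # electrons/sec: highlight, mid, shadow
total_time = 10.0
dt = 0.01

charge = np.zeros_like(scene)
stop_time = np.full_like(scene, total_time)
frozen = np.zeros(scene.shape, dtype=bool)

t = 0.0
while t < total_time:
    t += dt
    charge[~frozen] += scene[~frozen] * dt       # keep integrating
    about_to_clip = (~frozen) & (charge > 0.9 * full_well)
    stop_time[about_to_clip] = t                 # freeze before clipping
    frozen |= about_to_clip

# Normalize by per-pixel integration time to recover radiance.
recovered = charge / stop_time
print(recovered)   # close to the original scene rates, highlight included
```

The highlight pixel stops after a fraction of a second yet still reports its true rate, while the shadow pixel gets the full exposure; that is the single-shot dynamic-range win being proposed.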
 
What will happen if you enable long-exposure-noise-reduction?

Will it bring the DR back to 'normal'?
 
I just wonder: if a CMOS sensor can sample pixels without disrupting the electron buildup, why has no manufacturer come out with a super single-shot HDR mode?

You could stop sampling the highlight portions of the scene early, keep sampling the shadow areas until the desired exposure is reached, and then merge them into an HDR photo from a single exposure.
Many CMOS design houses, including Aptina, Sony, and OmniVision, do indeed have this kind of sensor: one short exposure and one long, which are combined and tone-mapped into a single HDR output. Artifacts may appear in the combining step due to the lag between the short exposure and the long one, but even so they should be much less prominent than in a traditional multi-shot merge HDR.

Also, Sony has an HDR movie mode on their latest models; however, according to the marketing text, it does not work the way I described. Roughly, half of the scene (presumably half of the lines) is exposed at one fixed value and the other half at another, and the two pre-defined exposures are combined into one. It is not dynamically sampling pixels across the frame.

That's why it is called SME-HDR (Spatially Multiplexed Exposure).

http://www.sony.net/Products/SC-HP/new_pro/april_2014/imx214_e.html

"

The HDR imaging function is one of the effective methods to improve picture quality. The conventional software method to generate one HDR image combining several pictures is not appropriate to capture moving subjects since this method cannot fill the time gap between the respective pictures. Handling several frames is also difficult for high speed video recording which contains big signal processing data.
This time Sony released the IMX214 featuring SME-HDR technology, which is capable of outputting 13M-pixel HDR images. The technology sets two different exposure conditions during shooting and seamlessly performs appropriate image processing to generate optimal images with a wide dynamic range. Therefore, the IMX214 outputs 13M-pixel HDR at 30 frame/s, and brilliant colors are captured even when pictures are taken against bright light for both video imaging and still imaging. (See figure 1.)
As well, this image sensor is capable to handle 4K or 13M-pixel HDR shooting which was not possible in the existing IMX135 (featuring BME-HDR function). (See table 1.)"
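A crude sketch of the spatially-multiplexed idea: alternate rows at short and long exposure within one frame, then merge. This is my guess at the scheme from the description above, not Sony's actual IMX214 pipeline, and the "fall back to the neighboring short row" merge is deliberately naive (a real pipeline would interpolate far more carefully).

```python
import numpy as np

rng = np.random.default_rng(1)
scene = rng.uniform(1.0, 5000.0, size=(8, 8))   # radiance map (made up)

t_short, t_long = 1.0, 8.0
full_well = 10000.0

frame = np.empty_like(scene)
short_rows = (np.arange(8) % 2 == 0)            # even rows: short exposure
frame[short_rows] = np.minimum(scene[short_rows] * t_short, full_well)
frame[~short_rows] = np.minimum(scene[~short_rows] * t_long, full_well)

# Merge: normalize each row by its exposure time; where the long
# exposure clipped, fall back to the neighboring short row's value.
est = np.empty_like(frame)
est[short_rows] = frame[short_rows] / t_short
long_ok = frame[~short_rows] < full_well
est[~short_rows] = np.where(long_ok,
                            frame[~short_rows] / t_long,
                            frame[short_rows] / t_short)

print(f"clipped long-row pixels: {np.count_nonzero(~long_ok)}")
```

Both exposures come from the same frame interval, which is why SME-HDR avoids the motion gap between successive shots; the price is the spatial resolution lost to the fallback/interpolation.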
 
Jim,

I'm not sure if this helps your analysis but I shot several pics of the Milky Way last night. I used the 3 continuous shot option with a 5 second delay as my drive mode with the new 35/1.4 Zeiss FE (SEL35F14ZA) on the A7Rii.

Observations from the set using Exiftool 9.99:

Ambient temperature 27.0 (All temps in C) - Same for all three raw images : Outdoors temp was approx 15.5

Battery Temp 39.4 - Same for all three raw images

I then compared temperatures to a few other images that I have taken while outdoors over the past week with the SEL5518Z since I got the camera (Single Drive mode).

Outdoors temp approx: 21.1 - Amb Temp: 25 Batt temp: 38.3

Outdoors temp approx: 29.4 - Amb Temp: 28 Batt temp: 38.3

Outdoors temp approx: 40.5 - Amb Temp: 31 Batt temp: 55.6

____________

Someone mentioned that the ambient temperature may be coming from the lens rather than the sensor. That sounds about right based on the EXIF temperatures compared to the approximate outdoor temps I observed for each shot. I have included the last of the three-shot set as a 4000 px long edge, 300 px/in resolution export from Capture One (sRGB JPEG). The RAW was reset after import to Capture One, so no adjustments should be in play. The exposure for the three-shot astro sequence was 20 seconds at ISO 1,250 @ f/1.7 with no post processing. The color space used for image capture was Adobe RGB.

RAW import to Capture One. No adjustments, export sRGB JPEG. Manual focus was slightly off.

Here is the shot with some post processing. Note that there is a prescribed forest burn in our vicinity and smoke was evident last night viewing the sky.

After some post processing. Note the UA/Discovery Channel new telescope is about 20 miles west of here. It's a very dark place.
I use both an A7R and A7S for astro/nightscape photography. I could sure live nicely with your images off the A7R II!! Nice captures!

bwa
 
I immensely appreciate all the work going into diagnosing this problem.

I'm seriously considering buying this camera and wondering how this issue might affect my use of the camera.

The tests Jim ran were 50 to 60 minutes of almost continuous shooting (with 1 sec between 30 s exposures) at ISO 3200.

I've shot plays at up to ISO 3200, say 400 shots over 90 minutes, but those are very brief exposures with about 14 sec between shots.

I can't imagine this being any kind of problem for the typical still photographer.

For most video work, I can't see that it's a problem either. For one thing, you're not likely to use ISO 3200 in combination with 30 minutes of continuous shooting.

So, as I think Jim wrote earlier, unless one is a serious astrophotographer, it shouldn't be a problem; in that case, auxiliary cooling equipment should be considered.

Or am I mistaken?
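The duty-cycle arithmetic behind that conclusion is simple. The stage-play shutter speed of 1/250 s below is my assumption; the rest of the numbers come from the posts above.

```python
# Shutter-open time as a fraction of wall-clock time, comparing
# Jim's heating test to a typical stills session.
astro_exposure = 30.0       # seconds per frame
astro_gap = 1.0             # seconds between frames
astro_duty = astro_exposure / (astro_exposure + astro_gap)

stills_exposure = 1.0 / 250.0   # assumed shutter speed for play shooting
stills_gap = 14.0               # seconds between shots
stills_duty = stills_exposure / (stills_exposure + stills_gap)

print(f"astro test duty cycle:  {astro_duty:.1%}")
print(f"stage-play duty cycle:  {stills_duty:.4%}")
```

The sensor is integrating roughly 97% of the time in the astro test versus a tiny fraction of a percent for stills, so the thermal loads aren't remotely comparable.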
 
 
 
I just wonder: if a CMOS sensor can sample pixels without disrupting the electron buildup, why has no manufacturer come out with a super single-shot HDR mode?

You could stop sampling the highlight portions of the scene early, keep sampling the shadow areas until the desired exposure is reached, and then merge them into an HDR photo from a single exposure.
It takes a long time to read out a sensor. 1/14 second for the a7RII, for instance, and 1/30 second for the a7S.

Jim
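Those readout times imply roughly these pixel throughputs (using the nominal 42.4 MP and 12.2 MP sensor resolutions), which lines up with the earlier point that the a7RII runs a much faster pixel clock:

```python
# Implied pixel throughput from the quoted full-frame readout times.
a7rii_pixels = 42.4e6
a7s_pixels = 12.2e6

a7rii_rate = a7rii_pixels / (1 / 14)   # pixels per second
a7s_rate = a7s_pixels / (1 / 30)

print(f"a7RII: {a7rii_rate / 1e6:.0f} MP/s")
print(f"a7S:   {a7s_rate / 1e6:.0f} MP/s")
```

Sampling each pixel many times per exposure for a per-pixel HDR scheme would multiply that readout burden accordingly, which is one practical reason it isn't done.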
 
