I wasn't relying on anything I did in ACR and Photoshop for any "data analysis" I considered. Rather, I was using those tools to visualize the differences caused by different variables in the dark frame captures. Since all examples shown were processed with identical settings in ACR+PS, the visual differences between the displayed crops from my EM1iii are informative both for analytical purposes and, more importantly, from a photographic imaging perspective.
Bear in mind that even the correct visualization of dark frames is problematic with ACR and Photoshop. Nobody knows what happens inside this "black box".
We don't know the exact demosaicing algorithm, but otherwise we know a lot about the internals of ACR and PS. We can linearize output from ACR. We can also substitute our own profiles if preferred; but regardless of that option, the "standard" Adobe profiles (Adobe Standard and Adobe Color) are sufficiently consistent between cameras to make them adequate for the kind of visualization-based comparisons we're doing here. Remember, I'm not basing my observations on these visualizations alone. However, to humor you, I'm switching to Rawdigger for the visualizations below. That's in addition to continuing to rely on Rawdigger for the unadulterated raw data analysis, which I contend is less "black box" than the interpolated and white balanced data displayed and used for DR calculations in IWE.
I'm well aware of what Adobe does to manipulate raws, so when I do camera comparisons of raws processed in ACR, I usually ensure that I've reset ACR to eliminate the hidden baseline exposure compensation and hidden non-linear tone curve. I also often utilize linear profiles I've generated for the specific cameras being compared. That's what I did in this instance, and the discrepancy between your G9 dark frame and an EM1iii dark frame I produced with identical ISO, shutter speed and lens-on settings was baffling. The processing results pretty clearly favored the EM1iii.
If you give me the black frames from your EM1iii, then I can quickly compare DR with the G9.
No need for that. See below.
I need black frames at SS = 1/500 s and maximal F-number + a black cap (lens type is not important; you can close the viewfinder, but I think that matters only for DSLRs).
That observation about OVFs vs. EVFs is a fair point and it got me thinking that maybe I was missing something in my initial black frame testing with my EM1iii. I wondered whether the source of light leakage could be the EVF itself when turned on. I repeated the black frame tests and eventually stumbled on the real reason I was seeing radically different color casts with my EM1iii. The bottom line is that it was just a coincidence that the blue-tinted and magenta-tinted black frame shots from my initial test happened to occur when I blocked the viewfinder. In my subsequent testing done in a windowless pitch-black room with the EVF off, shutter speed set to 1/8000 (mechanical) and body cap on, about half the test black frames I shot were tinted green, about a quarter were tinted magenta and the other quarter were tinted blue when visualized in either Rawdigger or ACR! There was no correlation to any possible light leakage.
At first, I was really baffled and worried that there was something seriously wrong with my camera. Then I dug into the shots more deeply using Rawdigger to see what was really going on with these varying black frames. I think it will be easier for readers to conceptualize the issue if I include Rawdigger screen grabs from one of the "magenta" EM1iii black frames. [Sergeui, please note that I'm sure you fully understand the math and related details to follow. The explanation is aimed at other readers, and feel free to correct me if I stumble.] In all of the Rawdigger screen grabs below, I've set brightness to +3 to make things easier to visualize. Just bear in mind that the brightness setting does not affect any of the data reported in the header portion of the screen shots. Also, no raw profile is selected. The black level setting and white balance setting for each screen shot are specified just below the image. Let's start with this one:

EM1iii; black level applied as per image EXIF (253, 254, 253, 254); white balance set to "As Shot"
Looks pretty magenta, doesn't it? This is confirmed by the mean ("Avg") values shown at the top for the four channels. Clearly, the red and blue mean values are higher than the two green values. That alone would explain a magenta cast, but the story is more complicated than that. In fact, when no black level is applied to the file, the unadulterated mean values for all four channels are extremely close:

EM1iii; black level NOT applied; WB is still set to As Shot
As you can see, there is no longer a significant imbalance between the R and B channels vs the G channels in either mean (AVG) or standard deviation (SD) values. Clearly, then, the problem must have been introduced by the application of the black levels seen in the preceding screen grab. And if we look a bit closer at the specific black levels that are assigned, we can see that they are set to 254 for the G channels and 253 for the R and B channels. Those values are determined by the camera itself and written to the EXIF header. (Note: if you're wondering why the color has shifted to pink rather than gray despite all channel AVGs being virtually identical, please pin that question until I show and discuss the G9ii screen grabs below.)
Back to the EM1iii varying color cast issue, look what happens when I manually set the black level for all four channels to 254:

Same EM1iii black frame as above; black level applied uniformly to 254 for all 4 channels; WB set to As Shot
It turns out that all of the green-tinted EM1iii black frames that I shot had black levels set to 254 for all four channels. The blue-tinted ones had the B channel black level set to 253, and the magenta-tinted ones had both the R and B channel black levels set to 253. The green channels in all of my test black frames were always auto-set to 254.
Exactly why the camera sometimes switches away from the all-254 setting is unclear to me. I know that it's derived somehow from the readouts of the optical black pixels, but what camera-specific or environmental conditions cause the fluctuation in the very controlled settings present during my second round of testing is a mystery to me. Perhaps it's as simple as random variation in the readout of the optical black pixels, combined with the bit depth limitation of the camera, causing it to toggle between 254 and 253 when more precision (something "between" 253 and 254) is what's actually needed. If so, this is just a consequence of the EM1iii being a 12-bit camera. Probably 14-bit cameras, and certainly 16-bit cameras like the G9ii, wouldn't run into this issue (assuming quantization error is really a factor here). Bear in mind, though, that the impact of this problem only becomes visible when dealing with very dark images (or "black" frames like this one) that require significant shadow pushes.
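To illustrate the mechanism, here's a quick simulation sketch (illustrative numbers only, not measured from either camera): four channels with an identical true pedestal and identical read noise, where subtracting per-channel black levels of 253 vs. 254 alone tilts the R and B means above the greens.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Simulate a 12-bit black frame: every channel has the same true pedestal
# and the same read noise, so the raw data starts perfectly balanced.
true_pedestal = 253.6   # hypothetical level "between" 253 and 254
read_noise_dn = 1.2     # hypothetical read noise in DN

def channel():
    dn = true_pedestal + rng.normal(0.0, read_noise_dn, n)
    return np.clip(np.round(dn), 0, 4095)   # quantize to 12-bit DNs

r, g1, g2, b = channel(), channel(), channel(), channel()

# Apply per-channel black levels as the camera might set them:
# greens at 254, red and blue at 253 (the "magenta" case).
r_sub  = np.clip(r  - 253, 0, None)
g1_sub = np.clip(g1 - 254, 0, None)
g2_sub = np.clip(g2 - 254, 0, None)
b_sub  = np.clip(b  - 253, 0, None)

print(f"R mean {r_sub.mean():.2f}  G mean {g1_sub.mean():.2f}  "
      f"B mean {b_sub.mean():.2f}")
# The R and B means now sit clearly above the greens even though the
# underlying data was identical, so a big shadow push reads as magenta.
```

The point of the sketch is simply that a 1 DN black level mismatch, trivial in normal exposures, dominates the residual means of a black frame.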
[snip]
My study shows that the magenta indeed comes from higher mean values in the R and B channels. These higher non-zero mean values come from higher read noise. Of course, one can apply an additional offset to the R and B channels and subtract these mean levels, removing the magenta. You will get the visual effect, but this will not improve the true DR.
This is where we part ways once again. The magenta color cast is, of course, nominally due to the higher mean values in the R and B channels. That's not in dispute. The question is: WHY ARE THE R AND B DN VALUES HIGHER TO BEGIN WITH? I showed above one scenario that causes an imbalance among the four channels - namely, less-than-ideal black level settings. Let's return to the hypothetical question I "pinned" earlier in this post by looking at your G9ii black frame. First, the rendering with black levels applied:

G9ii black frame; black level applied as per image EXIF (2048); white balance set to "As Shot"
The channel AVGs are better balanced than the EM1iii with black levels applied, but they are tilted toward the blue and the rendering shows a pretty obvious purplish/magenta cast. Your own screen shot from IWE shows the same purplish/magenta cast. So what gives? Let's check what happens when we look at the unadulterated version with no black level adjustment applied:

Same G9ii black frame; black level OFF; white balance set to "As Shot"
What's interesting here is that - just like we saw with the EM1iii with no black level adjustment - all four channel AVG values are very closely balanced (as are the SDs, of course). Yet, despite nearly identical AVGs for all channels, the rendering appears pinkish (just like the EM1iii). Now, let's look at one more rendering of the same G9ii black frame:

Same G9ii; black level applied as per image EXIF (2048); white balance set to AUTO
The ONLY difference between this more neutral rendering and the purplish/magenta one shown above is that the white balance has been switched from "As Shot" to Rawdigger's "Auto" WB. So, obviously, WB has a critical impact on the presence of an apparent color cast, but what exactly is going on?
To begin with, unless some kind of masking is utilized, the R-specific and B-specific WB coefficients are applied to all R and B pixel values in the image, regardless of how light or dark the individual R and B pixels are. It makes sense to apply the coefficients to correct for the differences in color responsivity caused by the different wavelengths of light passing through the color filters on top of the pixels. Since the R and B pixels are less responsive to light than G pixels, they end up generating lower DNs in the raw files, so the WB operation multiplies these DNs by the WB coefficients in order to prevent the RGB image output by the raw converter from having an overly strong green color cast. However, what does NOT make sense is to apply this WB multiplying effect to pixels that received no light. Ideally, there would be a tapering off of the WB multiplier with respect to any very dark pixels in which read noise plays a significant role in setting the DN value of the pixel.
The failure to taper the WB effect on read noise-dominated pixels will cause the overall average of these very dark pixels to have an inappropriate magenta color cast. The actual saturation and hue of the cast will depend on several factors:
- How much of a role read noise plays in establishing the average lightness of these dark pixels.
- How strong the WB effect is.
- How well-optimized the black level setting is that gets applied to these darkest pixels (e.g., the black levels for my EM1iii aren't particularly well optimized, since the mean values immediately after black levels are applied are already unbalanced, tilted toward green, blue or magenta).*
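The WB mechanism described above can be sketched in a few lines (the WB multipliers here are hypothetical, illustrative values, not taken from either camera's metadata): three channels of pure, identical read noise, clipped at zero like real black-level-subtracted raw data, diverge as soon as the R and B coefficients are applied.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Black-frame residuals after black level subtraction: identical read
# noise in every channel, clipped at zero like real raw data.
def residual():
    return np.clip(rng.normal(0.0, 1.0, n), 0, None)

r, g, b = residual(), residual(), residual()

# Hypothetical daylight-ish WB multipliers (illustrative values only).
wb_r, wb_g, wb_b = 2.0, 1.0, 1.6

r_wb, g_wb, b_wb = r * wb_r, g * wb_g, b * wb_b

print(f"means before WB: R {r.mean():.2f}  G {g.mean():.2f}  B {b.mean():.2f}")
print(f"means after  WB: R {r_wb.mean():.2f}  G {g_wb.mean():.2f}  B {b_wb.mean():.2f}")
# The channels start balanced, but multiplying R and B by their WB
# coefficients lifts both their means and their standard deviations,
# which reads as a magenta cast once the shadows are pushed hard.
```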
The bottom line here is that the pinkish color cast seen in the non-black-leveled renderings is caused by the application of WB to the black frame raws, NOT because the R and B raw channels are inherently noisier than the G channels. Similarly, huge exposure pushes to very deep shadows/very underexposed shots will have the perverse effect of adding an inappropriate magenta cast to the image.
The corrective action to take is to back out the effect by a tone curve adjustment targeted at just these inappropriately affected deep shadows. ACR and LR include a special adjustment slider (in the Calibration tab) that may suffice to ameliorate the problem, but my experience is that it's often necessary to use that adjustment very modestly and to supplement it with curve adjustments in the red and blue channels to re-equalize them with the green channel (plus also address any other errors introduced by the black level subtraction step).
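A crude sketch of what such a shadow-targeted, per-channel pull-down might look like (the function name, taper shape, and parameter values are all hypothetical; real ACR/LR curves operate in their own tone space):

```python
import numpy as np

def shadow_rebalance(channel, strength, knee=0.05):
    """Subtract a small, tapered offset from a channel's deepest shadows.

    channel: values normalized to [0, 1]
    strength: maximum offset removed at pure black
    knee: tone value at which the correction has faded to zero
    """
    x = np.asarray(channel, dtype=float)
    w = np.clip(1.0 - x / knee, 0.0, 1.0)   # weight: 1 at black, 0 at knee
    return np.clip(x - strength * w, 0.0, 1.0)

# Pull down only the deep shadows of, say, the red and blue channels to
# re-equalize them with green; midtones and highlights are untouched.
shadows = np.array([0.00, 0.01, 0.03, 0.20, 0.80])
print(shadow_rebalance(shadows, strength=0.01))
```

Because the correction fades out by the knee point, normally exposed tones are left alone and only the read-noise-dominated shadows are re-equalized.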
The foregoing is what I've been hammering at now for months whenever the magenta cast issue and its real proximate cause is brought up. THERE REALLY IS NO MEANINGFUL DIFFERENCE IN READ NOISE LEVELS AT THE RAW LEVEL (with the possible exception of some PDAF pixel implementations) because there is no relevant difference in how CMOS pixels are designed and fabbed for each of the four Bayer raw color channels. The circuitry and silicon for one pixel should be identical to every other active pixel on the sensor, regardless of which color channel it is associated with. Remember: we're talking about read noise, which is noise added by the electronics, not any noise associated with light hitting the sensor. Since there is no light involved here (we're talking now just about black frames generated in pitch-black conditions), there are no complications or channel-specific variability generated by differing responsivity to specific wavelengths of light reaching the pixels through the color filters that sit atop them. I'm not aware of any reason to expect read noise behavior to be correlated with color channel at this level.
The correlation is introduced later in the processing chain as has been demonstrated with visualizations and corresponding raw data above.
Furthermore, by waiting until later in the processing chain to extract the standard deviation data needed to calculate DR, you're adding your own version of what you've called a "black box". IWE appears to perform some kind of white balance operation in addition to interpolating the four raw channels into three. This is bound to be more confounding than performing the measurements at the front end (as can be done with Rawdigger). For instance, any reasonable interpolation of the two green channels into one is bound to reduce the standard deviation of the single green channel relative to the red and blue channels. Of course, the red and blue channels will then seem at least slightly noisier as a result. And that's before we get to the undesirable WB effect on the red and blue channels of a black frame, which also increases their apparent noisiness relative to the green channel.
________________________
*Beyond the specific black level problem noted for the EM1iii, there's another way in which application of black levels can adversely affect the post-subtraction DN averages. Black level subtraction is performed on the raw digital numbers (DNs) using simple arithmetic: a whole number is subtracted from every DN in the raw file. Since some DNs will have starting values less than the black level value (e.g., less than 254 or 253 for the EM1iii, 142 for the G9, and 2048 for the G9ii), fully subtracting the black level from these smaller DNs would result in negative values (DNs less than 0), which isn't allowed. Instead, these smaller DNs are all set to 0, leaving them lighter relative to the other values than they were before the subtraction, which isn't ideal.
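The lightening effect of that clip-to-zero step shows up immediately in a simulation (illustrative numbers only): signed subtraction leaves a mean of ~0, while clipping drags the mean of a zero-light frame above zero.

```python
import numpy as np

rng = np.random.default_rng(3)

# Zero-light DNs centered on a black level of 254 (EM1iii-style), so
# about half of the values start below the black level.
dn = rng.normal(254.0, 1.5, 100_000)

# Ideal (signed) subtraction vs. what actually happens (clip to zero).
signed  = dn - 254.0
clipped = np.clip(dn - 254.0, 0, None)

print(f"mean, signed subtraction: {signed.mean():+.3f}")
print(f"mean, clipped at zero   : {clipped.mean():+.3f}")
# Clipping turns every would-be-negative DN into 0, so the post-
# subtraction mean of a truly black frame ends up above zero: the
# "black" frame is lighter than it should be.
```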
Higher DR gives more room for light when the "light-line" level is above the "noise-line" level. In the given case, you are simply modifying the black-line level and performing a specific color noise reduction procedure known as the discrimination method. But in the presence of light you will also shift down the "light-line" level, which can end up even lower than the mean level you push down.
The black-line level is measured by special masked pixels, and I have found that the black-line level measurements are reliable and should not be touched when you are measuring DR.
Based on my findings - at least, with respect to my personal EM1iii as described above - in-camera black level settings aren't always reliable. I have read posts by others complaining about inconsistent black level settings in their cameras, so I rather doubt that the problem is unique to me.
BTW, the better DR can be visualized with real-life images taken in low light for sensors with different pixel sizes, if the condition of equal exposure per pixel is satisfied.
Admittedly, based on my DR measurements derived from the raw data reported by Rawdigger, I really don't have that much of a dispute with your DR measurements. The difference between our respective calculations is only about 1/3 EV. I do hope, however, that we can get past the continued promotion of a confused and simplistic correlation of color cast with read noise, and the dismissiveness toward methods and tools that don't exactly match your preferred DR metric and tool. Unfortunately, this post is already way too long and detailed, so I'll stop here, catch my breath, and separately post a comparison of very low light G9 and G9ii test shots helpfully provided by jrsforums, which will hopefully shed light on which DR metric is of more practical use to photographers interested in comparing these cameras.