l_d_allan: Do I need my eyes checked? I'm not "seeing a real difference" between the magnified "glowing 30" numbers at about 1:40 into the video ... comparing uRAW to cRAW.
If anything, the slightly different magnification of the uRAW and cRAW may impact the visual impression created.
@Rishi, Emil got those values from me, and his photon noise explanation is not unlike the CIE Lightness explanation I give in my [NEF Compression](http://www.photonstophotos.net/NikonInfo/NEF_Compression.htm) article. Also, that excellent Open Forums post is only a portion of his larger article [Noise, Dynamic Range and Bit Depth in Digital SLRs](http://www.photonstophotos.net/Emil%20Martinec/noise.html) (see page 3, The Consequences of Noise, for "An aside on 'lossy' NEF compression"). Regards P.S. - What is the correct URL syntax in these posts?
bclaff: @Rishi, As far as I know, every place the article says 14-bit should read 13-bit. I'm not aware that Sony uses anything higher than a 13-bit ADC in any of their cameras; the raw data is stored at 14 bits for encode/decode convenience. Regards,
@Rishi, FWIW, a 13-bit ADC doesn't limit engineering dynamic range (EDR) to 13 EV.
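A quick back-of-the-envelope showing why: if EDR is taken as log2(clipping point / read noise), and the read noise measured in DN sits below 1 DN (quantization alone only adds about 1/sqrt(12) ≈ 0.29 DN RMS), the result exceeds the ADC's bit count. The 0.5 DN analog read noise below is a hypothetical figure for illustration, not a measured Sony value.

```python
import math

def edr_stops(full_scale_dn, read_noise_dn):
    """EDR as log2(clipping point / read noise), in stops (EV)."""
    return math.log2(full_scale_dn / read_noise_dn)

# Hypothetical 13-bit ADC: full scale = 2**13 - 1 = 8191 DN.
analog_noise_dn = 0.5               # assumed analog read noise (illustrative)
quant_noise_dn = 1 / math.sqrt(12)  # RMS quantization noise of a 1 DN step
total_noise_dn = math.hypot(analog_noise_dn, quant_noise_dn)

print(round(edr_stops(8191, total_noise_dn), 2))  # ~13.79 EV, i.e. > 13
```

The key point is simply that nothing pins the total read noise to one least-significant bit, so the ratio of clipping point to noise floor can span more stops than the ADC has bits.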
cr2shooter: Raw should be just that - raw and unmolested. I wouldn't consider buying any Sony camera when they implement goofy stuff like this. You can rely on companies like Canon to provide a thorough, reliable implementation.
@VirtualMirage, I'm obviously not entirely conversant with raw software for Sony files :-) But as I see it, the point stands: the solution is not just a firmware change.
@Rishi, My main point is that it is not simply a firmware fix from Sony's point of view. I think they could do the firmware quite easily, but unless they also modify their software (Image Data Converter, Sony RAW Driver, etc.) it's only a partial solution. To do one without the other might be viewed as a bad business decision, driving Sony camera owners to other software (even though Sony's software is free).
@Rishi, Aren't there multiple moving parts here?
1) A 13-bit ADC might not be enough bit depth for the newer, better sensors. Probably not a big deal (and this is hardware).
2) The tone curve might not be optimal. Nikon's lossy curve, with more and smoother steps, is better. Again, maybe not a big deal.
3) The 11+7 compression. This has bad worst-case behavior and is probably the biggest issue.
Bottom line: Sony doesn't appear to have the will to make the firmware and software changes needed to address these concerns. It's simple, but not enough, to adjust the camera firmware; the hard part is the Sony software that would then have to read the new and improved raw files.
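To see why the 11+7 scheme has bad worst-case behavior, here is a toy model of block delta coding in that spirit: each block of 16 same-color, tone-curved 11-bit values is stored as a base value plus 7-bit deltas quantized by a power-of-two step. This is a simplified sketch of my reading of the rawdigger.com ARW2 description (the real format also stores the block max exactly and the positions of min/max), not Sony's actual code.

```python
# Toy 11+7-style block coder: 16 pixels -> 11-bit min + 7-bit deltas.
# Simplification/assumption: everything except the min is quantized;
# the real ARW2 layout differs in detail.

def encode_block(pixels):                # 16 tone-curved 11-bit values
    lo, hi = min(pixels), max(pixels)
    shift = 0
    while (hi - lo) >> shift > 127:      # grow the step until deltas fit in 7 bits
        shift += 1
    return lo, shift, [(p - lo) >> shift for p in pixels]

def decode_block(lo, shift, deltas):
    return [lo + (d << shift) for d in deltas]

# Worst case: one bright pixel in an otherwise dark block forces a
# coarse step for every pixel in the block -> visible posterization.
block = [100] * 15 + [2047]
lo, shift, deltas = encode_block(block)
print(2 ** shift)  # step size of 16 counts, even for the dark pixels
```

A flat block round-trips losslessly (step of 1 count), which is why the artifacts only show up along high-contrast edges, exactly the worst case the rawdigger article demonstrates.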
bernardf12: The A99 produces 14-bit RAW images from the Bionz X, so Sony has no excuse. If the focus points were spread evenly across the frame, the A99 would have been an interesting option for me.
Sony is rumored to release the a7000 and the a99ii soon and it will be very interesting to see if they have the same limited RAW capability built in.
I've never seen an A99 file with true 14-bit data; only a 14-bit raw container holding 13-bit data. If you think you have one, contact me by email or Private Message (PM); I'd like to get that file.
True, but why do you think that slope of 2 is there? Surely the Sony engineers aren't complete idiots. Note that even Figure 12 of the article you cite (http://www.rawdigger.com/howtouse/sony-craw-arw2-posterization-detection) has a y-axis labeled "Linear 13-bit data in 14-bit space".
It's quite obvious from the gaps in the histograms. This is also how we (Jim Kasson, myself, and others) detect when cameras drop into even lower bit depths. Try a tool like RawDigger on some files.
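The histogram-gap test above can be sketched in a few lines: 13-bit data stored in a 14-bit space only ever lands on even values, so every other histogram bin is empty, and the spacing of occupied values reveals the effective bit depth. This uses synthetic data for illustration; real raw values would come from a tool like RawDigger.

```python
import numpy as np

def effective_bits(raw_values, container_bits=14):
    """Infer effective bit depth from the spacing of occupied raw values."""
    occupied = np.unique(raw_values)                       # sorted distinct values
    if occupied.size < 2:
        return container_bits
    stride = int(np.gcd.reduce(np.diff(occupied)))         # gap between occupied bins
    return container_bits - int(np.log2(stride))

rng = np.random.default_rng(0)
true_13bit = rng.integers(0, 8192, 10_000) * 2             # 13-bit data, slope of 2
print(effective_bits(true_13bit))                          # -> 13
```

True 14-bit data (stride of 1) would report 14; a camera dropping to 12 bits would leave every fourth bin occupied and report 12.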
MustyMarie: I still haven't seen anyone mention the split ADC approach used by Sony on the A7RII and other sensors.
I guess it is about video ease/speed versus a full 14/16-bit ADC, but does it somehow affect the ability to produce full 14-bit lossless raw files?
The dual conversion gain happens inside the Active Pixel Sensor (APS), so I doubt it has any effect on the bit depth of any downstream electronics.
Pandimonium: Wonders why the ISO 800 shot looks cleaner than the ISO 100. Also, why does Sony persist in getting crucial stuff wrong (like raw compression)?
@Azimuth46, There is a different conversion gain inside the pixel at ISO 100 and at ISO 640; and at ISO 32000 signal processing (noise reduction) kicks in, which falsely looks like improved noise.
Yes, there are essentially two ISO-invariant ranges; one starts at ISO 100 and the other at ISO 640 (some signal processing kicks in at ISO 32000). See: http://photonstophotos.net/Charts/RN_e.htm#Sony%20ILCE-7RM2_14