Of course all this would be a moot point if camera manufacturers
standardised their RAW files...
in a way, I support the companies doing their own raw formats.
the idea is that raw is a THIN pass-through of the essential info
coming off that sensor. how a vendor pulls the data out and writes
it to a file is ITS business. if the company finds that it's faster,
or takes less code, to encode one way rather than another, that's
probably a well-made technical decision.
should everyone have used EBCDIC (the old IBM character encoding)
just because IBM said so? EBCDIC was designed around hardware
considerations (as I understand that historical character format),
and there were gains in efficiency in encoding and using it on
IBM's own machines.
on other machines, ASCII was faster and more 'native' to the hardware.
in the same way, I do NOT expect many vendors to support DNG on the
camera. that's probably a very awkward format for hardware vendors
to have to encode to. I bet there is no silicon support, today, for
that format. and it's not cost-effective to add two encode formats
when one does everything the vendor needs.
as long as you can convert to DNG on your host (PC), adding DNG to
cameras makes less than no sense to me.
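for what it's worth, that host-side conversion is easy to script.
here's a minimal sketch in Python, assuming the Adobe DNG Converter
is installed at the path shown and accepts the -c (compressed DNG)
and -d (output directory) command-line flags; the install path,
flags, folder names, and .NEF extension are all assumptions you'd
adjust for your own setup:

    import subprocess
    from pathlib import Path

    # path to the converter binary -- an assumption, adjust for your install
    CONVERTER = r"C:\Program Files\Adobe\Adobe DNG Converter\Adobe DNG Converter.exe"

    src = Path("card_dump")   # folder of vendor raw files (here: Nikon .NEF)
    dst = Path("dng_out")     # where the converted .dng files should land
    dst.mkdir(exist_ok=True)

    for raw in sorted(src.glob("*.NEF")):
        # -c = write lossless-compressed DNG, -d = output directory
        # (flag names as I understand the converter's command-line notes;
        #  verify against your version)
        subprocess.run([CONVERTER, "-c", "-d", str(dst), str(raw)], check=True)

point that at a folder dumped off the card and you get DNGs without
the camera ever having to know the format exists.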
NOW, if someone comes up with a very small chip that will encode it
AND makes that chip free for all to use, THEN there might be DNG
support in cameras. I am not waiting for THAT event to happen, of
course...
--
Bryan (pics only:
http://www.flickr.com/photos/linux-works )
(pics and more: http://www.netstuff.org )