Sorry, but the amount of "information" in a digital file is directly measured in how many bits it contains. This has been the official technical definition of information since Claude Shannon (of Bell Labs) published his paper "A Mathematical Theory of Communication" in 1948. Since a 14-bit file contains only 16% more bits than a 12-bit file, it contains at most 16% more information (well, OK, 16.6666...%). The additional information may be less (or none), depending on the efficiency of the coding.
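For anyone who wants to check the arithmetic, here is a quick sketch of both ways of counting (nothing camera-specific, just the bit counts):

```python
# Comparing 12-bit and 14-bit samples two ways: as a bit count
# (Shannon's measure of information) and as a raw count of values.

bits_12, bits_14 = 12, 14

# Shannon's measure: information is proportional to the bit count itself.
extra_information = (bits_14 - bits_12) / bits_12
print(f"{extra_information:.4%}")   # 16.6667% more bits

# Linear count of representable values, by contrast:
values_12, values_14 = 2 ** bits_12, 2 ** bits_14
print(values_14 // values_12)       # 4 times as many values
```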
You can choose to make up your own theory of information and your own definitions if you wish, but that's what's taught in engineering courses.
"My" theory is that 1 bit contains two values, 1 and 0. That's the way I learned it, and that's the "theory" that has been paying for my daily bread for about 35 years, ever since I started working with computers. I don't know what Shannon meant by that, but perhaps that the actual improvement would only be 16% between 12 and 14 bits. I am pretty sure that he perfectly understood that a single bit has two values and thus represents two pieces of information, and that two bits double the data, representing four different values. Shannon knew that very well, since he is one of the computing pioneers: as a 21-year-old student at MIT he demonstrated that an electrical application of Boolean algebra could construct and resolve any logical, numerical relationship. In other words, he knew very well the power of each bit.

Shannon understood Boolean algebra and binary arithmetic perfectly well, as do I. But when he constructed his formalization of Information Theory, he decided he didn't wish to say that a symbol which could take on, say, 4 values contained twice the information of one that could take on two values. So he defined "information" as being related to the logarithm of the number of potential values, not the linear arithmetic count of potential values. In the case of binary arithmetic, this logarithm is simply the number of bits. Since I spent about 10 years in 3 different schools getting various engineering degrees and 32 years as a working engineer, manager, and consultant (including 14 years at Bell Labs) in digital communications, I will claim I probably understand a little of this stuff.
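Shannon's logarithmic measure is easy to write down in code. This is just an illustration of the definition for equally likely values, nothing specific to raw files:

```python
import math

def information_bits(num_values: int) -> float:
    """Shannon information of a symbol that can take on num_values
    equally likely values, measured in bits: log2(num_values)."""
    return math.log2(num_values)

print(information_bits(4096))    # 12.0 -- a 12-bit sample
print(information_bits(16384))   # 14.0 -- a 14-bit sample
print(information_bits(2))       # 1.0
print(information_bits(4))       # 2.0 -- 4 values is twice, not 4x, the info of 2
```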
If you think about it for a minute or two, I think you'd have to admit that it's not desirable to say that a 14-bit image file contains 4 times the information of a 12-bit file. After all, as others have pointed out here, most people see no visible difference between 12- and 14-bit images. Surely if there were 4 times as much information there, it would be obvious to anyone who looked at the image, wouldn't it? So intuitively, as well as technically, it's more sensible to say that the 14-bit file contains 16% more information (at most), rather than 4 times.
Perhaps your definition of information isn't the same as mine; for me, in computing, "information" equals data values. 14 bits can represent a total of 16384 values, while 12 bits can represent only 4096, which is one fourth as many. However, I am pretty sure that the actual improvement we can see is only about 16%, not four times, though that depends on other things. Nevertheless, the number of values is indeed four times greater, and a 14-bit file is 4 times as large as a 12-bit file since it contains 4 times more uncompressed data.
It's not my definition, it's Shannon's. Your intuition of what you call "actual improvement" is exactly the reason Shannon defined "information" as he did, i.e., making the measure of information logarithmic. However, a 14-bit file is NOT 4 times as large as a 12-bit file. In the case of uncompressed D300 NEF files, the 14-bit file is about 30% larger (25.3 MB vs. 19.4 MB), which is of course more than 16.7%, mostly because the 14-bit words pack into bytes less efficiently.
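The exact storage layout inside a NEF is Nikon's business, but the rough packing arithmetic goes like this. The per-sample byte counts below are assumptions for illustration, and a real file also carries headers, previews, and other overhead:

```python
# Rough packing arithmetic (the storage layouts are assumptions for
# illustration, not a description of the actual NEF container).

packed_12 = 12 / 8   # 1.5 bytes/sample if 12-bit words are packed tightly
packed_14 = 14 / 8   # 1.75 bytes/sample if 14-bit words were packed tightly
padded_14 = 16 / 8   # 2.0 bytes/sample if each 14-bit word sits in 16 bits

print(packed_14 / packed_12 - 1)   # ~0.167 -> the ideal 16.7% growth
print(padded_14 / packed_12 - 1)   # ~0.333 -> 33% growth with 16-bit padding

# Observed D300 file sizes quoted above:
print(25.3 / 19.4 - 1)             # ~0.30 -> about 30%, between the two
```

The observed ~30% landing between the tightly packed and fully padded cases is consistent with the point above: the extra size comes partly from the two extra bits and partly from less efficient packing.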
I don't have any axe to grind here, so I won't continue to argue the point. It just seems to me that since this is a technical forum, we'd be better served if we try to stay with language which is consistent with that of the engineers who design these systems we're talking about. You're free to think of it as you like, and I'm free to disagree, which I already have.
Peace,
Ray Ritchie