Mega = one million. This has been true for well over a century.
Except when it refers to bytes of memory or, more recently, pixels
in an image, in which case it means 2^20. This has been true for at
least five decades (i.e. since people "invented" bytes).
That is a good example of a "definition" vs. a "convention".
A definition is often established by a standards body, such as ISO, NIST, or the CIE. It usually has legal status; it will hold up in a court of law.
A "convention" is often specific to one field, or something in recent use, or something just a bit confusing.
For example, the metric prefixes have precise legal definitions, set down by standards organizations, and held up in countless legal tests all over the world.
Hard drive manufacturers advertise capacity in legal, official "megabytes" and "gigabytes", where one megabyte = 1,000,000 bytes and one gigabyte = 1,000,000,000 bytes.
By mathematical coincidence, 2^10 approximates 10^3, so there is a computer-industry kilo, mega, and giga that are useful conventions, but not legal definitions. The strict and precise terms are "kibi", "mebi", and "gibi".
http://physics.nist.gov/cuu/Units/binary.html
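This is also why a drive sold as "500 GB" shows up as roughly 465 "GB" in an operating system that reports sizes in binary units. Here is a quick sketch of the arithmetic in Python (the 500 GB figure is just an example, not any particular drive):

  # Decimal (SI) prefixes: what drive makers and standards bodies mean.
  KB, MB, GB = 10**3, 10**6, 10**9

  # Binary prefixes: the computer-industry convention (kibi, mebi, gibi).
  KiB, MiB, GiB = 2**10, 2**20, 2**30

  # The "coincidence": 2^10 is only about 2.4% larger than 10^3,
  # and the gap grows with each step up the prefix ladder.
  print(f"KiB/KB = {KiB / KB:.4f}")   # 1.0240
  print(f"MiB/MB = {MiB / MB:.4f}")   # 1.0486
  print(f"GiB/GB = {GiB / GB:.4f}")   # 1.0737

  # A drive advertised as "500 GB" (decimal), reported in gibibytes.
  advertised_bytes = 500 * GB
  print(f"500 GB = {advertised_bytes / GiB:.1f} GiB")  # about 465.7 GiB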
Despite these terms being more precise, I have never been able to say "kibibyte" out loud, and I continue to use (or misuse) the conventions "kilo", "mega", and "giga".
BTW, I still claim that the Foveon sensor has 3.4 megapixels; the
Bayer ones have (e.g.) 6.0 "one-third" pixels ;-)
You should join the "MegaSomethings" and "MegaPops" thread.
http://forums.dpreview.com/forums/read.asp?forum=1027&message=6503328
--
Ciao!
Joe
http://www.swissarmyfork.com