I'm old-school. Variable analog gain adds much less noise than any sensor or A/D, and the delay is zero. Not bad.
I agree. In most cases it's a better engineering solution than a wider ADC. I'm not a fan of the term 'analog gain', though. You're not gaining analogs, and 'gain' is itself an analog concept, applied to whatever quantity is being 'gained' (in the case of a camera's VGA it's voltage). In the digital domain, if you make a number larger it's called multiplication.
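The engineering point above, that amplifying before the ADC beats multiplying after it, can be sketched numerically. This is an illustration only: the 12-bit ADC, the 16x gain and the signal level are made-up values, not any particular camera's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 12-bit ADC quantizing a 0..1 V input range.
FULL_SCALE = 1.0
LEVELS = 4096

def adc(volts):
    """Quantize a voltage to a 12-bit code (clipped to full scale)."""
    code = np.round(np.clip(volts, 0, FULL_SCALE) / FULL_SCALE * (LEVELS - 1))
    return code.astype(int)

def rms(x):
    return np.sqrt(np.mean(x**2))

signal = rng.uniform(0, 0.05, 10_000)   # dim signal, ~5% of full scale

# Analog gain: amplify the voltage 16x *before* the ADC.
analog = adc(signal * 16)

# Digital gain: quantize first, then multiply the codes by 16.
digital = adc(signal) * 16

# Un-quantized reference, in code units.
ideal = signal * 16 / FULL_SCALE * (LEVELS - 1)

print(rms(analog - ideal))    # roughly one fine quantization step
print(rms(digital - ideal))   # roughly 16x larger: the coarse steps get multiplied too
```

The digital path can never recover the levels between the coarse codes, which is why the analog VGA (or a wider ADC) is the better tool for lifting a dim signal.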
Analog gain and digital gain were very commonly used terms when I did electronic design, but that was decades ago. However, I just did a Google search, and they are still commonly used by circuit designers.
I've never heard the terms "gaining analogs" or "gaining a quantity" though.
As I said, you're not gaining analogs, which was the point. On the quantity, electronic engineers will talk about voltage gain or current gain and occasionally charge gain. That's 'gaining a quantity', the quantity being voltage, current or charge.
And gain can be greater or less than unity in an electronic circuit.
Yes, they are. Within a discipline people tend to have their own framework of jargon. Electronic engineers are well accustomed to talking about 'gain', but they will generally say which kind of gain it is (unless it's obvious from context), so they'll talk about 'current gain' or 'voltage gain'. When digital circuits began to be incorporated into analog systems, designers found that digital multiplication could provide the same function as 'gain', so they called it 'digital gain'.
In the world of computer science things were different. Circuits that performed multiplication were called 'multipliers'. For analog computing the variable gain amplifiers that did this function were still called 'multipliers' (and still are).
Photography is neither electronic engineering nor computer science, so if we adopt those communities' terminology without understanding it, we confuse ourselves. I say this with a fair amount of confidence, because I am an electronic engineer, a computer scientist and a photographer.
The reason I'm concerned about terminology is that poor use can lead thought patterns down a garden path. That's the case here. If we consider a camera as a black box, it takes light in at one end and puts out perceptual specifications at the other. It doesn't emit light. Inside the black box a translation is made from the input to the output, and gain is no part of that translation. There is no reason why any arbitrary amount of light might not be translated to any arbitrary lightness (lightness being the component of that perceptual specification which says how light or dark something should look). Internal 'gains' are as irrelevant to the essence of that conversion as are the details of the computer code used to do it.

So, you do not need to invoke 'gain' to explain how a smaller amount of light translates to a lighter image, and doing so sometimes leads to erroneous thought patterns. People logically assume that 'gain' means that something is being 'gained', and then the thing being gained becomes either light or some unspecified analog to light that they often call 'signal'.
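The black-box translation above can be made concrete with a small sketch. It uses the standard CIE 1976 L* formula for lightness, but the raw value and the 3-stop shift are invented for illustration; the point is only that the same captured light can be rendered at any lightness by choosing a different translation, with no 'gain' involved.

```python
import numpy as np

def cie_lstar(y):
    """CIE 1976 lightness L* (0..100) from relative luminance y (0..1)."""
    delta = 6 / 29
    return np.where(y > delta**3,
                    116 * np.cbrt(y) - 16,       # cube-root segment
                    116 * y / (3 * delta**2))    # linear segment near black

# A dim capture: raw value proportional to the light that arrived.
raw = 0.05

# Rendering the same light 3 stops 'lighter' is just a different
# translation table; nothing inside is amplified. (Shift is made up.)
normal  = cie_lstar(raw)
lighter = cie_lstar(min(raw * 2**3, 1.0))
print(normal, lighter)
```

Both outputs describe how light the rendered tone should look; which one the black box emits is a design choice about the translation, not a consequence of any internal 'gain'.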