bill1
>> I think the evolution of digital cameras will have more to do with
>> what people want than what is possible. Consumer digicams won't go
>> that much beyond 10mp simply because you don't need much more than
>> that (you don't even need that). Very little of it will have much
>> to do with the physical limits of possibility.

> Could be. OTOH, 99% of camera users (a) never print larger than
> 4 x 6, and (b) even if they did, don't have the camera technique to
> (consistently) get quality good enough for enlargements much bigger
> than that. IOW, 1.5 to 2 megapixels would be plenty. Yet 3MP has
> become the entry-level baseline, 5 MP the standard in low-end to
> midrange ones, and 8 MP the high end.
>
> I see no (market-related) reason the megapixel race should stop. I
> predict we'll continue to see higher resolutions, until we hit some
> limit where megapixels will become irrelevant even in controlled
> testing -- possibly somewhere around 40-60 MP.
This kind of thing has precedent. Look no further than computer CPUs. The rivalry between Intel and AMD precipitated a "Megahertz war" (or "megahurts", as some called it).

The original Athlon scaled up MHz far better than the Intel equivalent (the P3). Intel's response was the P4, which lowered IPC (instructions per clock) in favour of an architecture that could be clocked much higher. AMD responded with a model-number rating, although this wasn't the first time such a rating had been used: Cyrix had used it for their 586 chips. Intel's response was predictable: model numbers, it claimed, were some kind of con.
What drove this was entirely consumer demand and marketing. It's easy for a consumer to realise that a 3GHz P4 is faster than a 2GHz P4. At one point MHz was a fairly decent comparison between different chips too; the P4 changed that.
Intel has since flip-flopped. Future P4s beyond about 4GHz have been canned. The mobile chip (the Pentium-M, which is basically a P3 with some funky power management and other stuff thrown in) will replace it. Unfortunately (for Intel) the P-M at 1.7GHz does about as much work as a P4 at 2.8GHz, so Intel has to eat its earlier words and marketing. What's more, they've decided to drop MHz ratings in favour of model numbers (more eating of words).
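The reason a 1.7GHz chip can keep up with a 2.8GHz one is that throughput is roughly IPC times clock, not clock alone. A rough sketch of that relationship (the IPC values below are illustrative guesses, not measured figures):

    # Toy model: relative throughput ~ instructions-per-clock * clock (GHz).
    # The IPC numbers are made up for illustration, not benchmark results.
    def relative_perf(ipc, ghz):
        return ipc * ghz

    p4 = relative_perf(ipc=1.0, ghz=2.8)          # high clock, lower IPC
    pentium_m = relative_perf(ipc=1.65, ghz=1.7)  # lower clock, higher IPC
    print(p4, pentium_m)                          # 2.8 vs ~2.8 -- roughly even despite the clock gap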
Anyway, the moral of the story is this: at about 4GHz Intel decided that frequency one-upmanship is (with current technology) either not feasible or not cost-effective.
I predict digicams will reach a similar point, and sooner rather than later. Firstly, people will start to realise that on a non-poster print you simply can't see the difference between 12 and 20mp, but you can see the difference another 2 f-stops of dynamic range makes. It won't change completely: "20 megapixels" on a bullet list of marketing material will always have some advantage over "12 megapixels", but megapixels aren't the only determining factor when someone buys a camera.
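On the resolution side the arithmetic is simple: 20mp over 12mp buys only about 29% more linear detail, and at non-poster sizes both are already past the ~300 ppi usually quoted as the limit for prints. A rough sketch (the 8 x 12 inch print size and 3:2 aspect ratio are assumptions for illustration):

    import math

    # Linear resolution gain from 12MP to 20MP
    print(round(math.sqrt(20 / 12), 2))   # ~1.29, i.e. about 29% more pixels per inch

    # Pixels per inch on an assumed 8 x 12 inch (3:2) print
    for mp in (12e6, 20e6):
        width_px = math.sqrt(mp * 3 / 2)  # image width in pixels for a 3:2 frame
        print(int(width_px / 12))         # ~353 ppi vs ~456 ppi, both past the ~300 ppi mark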
If anything, a crunch in camera technology is more likely than it was in computers, because cameras rely on something far more tangible and easy to understand: what you can see. Computers vary in performance for many reasons; the CPU is only one factor of many (eg memory speed, bus speed, graphics card, etc.). But if a camera shop puts up a display of the same composition taken with 5, 10, 15 and 20mp cameras, the differences (or lack thereof) are instantly recognisable.
--
'A colour-sense is more important, in the development of
the individual, than a sense of right or wrong.'
-- Oscar Wilde