When will Canon focus on IQ instead of MP!?

Over in Nikonland the boys are wondering when Nikon will catch up with Canon as far as MP.
Says who?

D3X is 24 MP
But their "other" cameras have been "stuck" at 12MP for a few years now. They either have to contract Sony to design a higher-MP APS-C sensor, or contract to buy one that Sony has already designed, since they are getting mega-bulk discounts for having bought so many of the same 12MP APS-C sensor from Sony.
 
Ask ANY pro photographer what they want, a FF camera with large pixels or a crop camera with more but smaller pixels. Any sane photog will give you the same answer.
You know, it seems the people on here that complain that the 1D line is aps-h are usually those that don't own a 1-series and would not buy one. They are arm-chair sports photographers and sensor designers.

I remember many sports photographers on here that were happy the 1D4 is APS-H. It really is the better design for Canon's system. Its cross-type AF points only function with f/2.8 or faster lenses, so you're able to use a 300 or 400 f/2.8 and have 39 cross-type AF sensors. You can use a shorter lens and have the FOV of a longer lens.

Now consider FF. You'd have to buy a 500 or 600 f/4 to get the same FOV as APS-H, and oops!! you lose your cross-type AF sensors because your lens is now f/4 instead of f/2.8. Or you crop and use a shorter lens, but then you really are giving away your resolution, and if you're going to do that, you might as well just use a 1D2, because you'd end up with about 8MP by cropping 12MP FF to APS-H sensor size. Canon knows what they are doing.
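A quick back-of-the-envelope check of that cropping arithmetic (assuming an approximate 1.3x linear crop factor for APS-H):

```python
# Rough check of the cropping arithmetic: how many megapixels are left
# after cropping a full-frame image down to the APS-H field of view?
ff_mp = 12.0        # full-frame resolution, MP
crop_factor = 1.3   # approximate Canon APS-H linear crop factor

# A linear crop scales both width and height, so pixel count scales
# by 1 / crop_factor**2.
cropped_mp = ff_mp / crop_factor ** 2
print(f"{cropped_mp:.1f} MP")  # about 7 MP, in line with the ~8MP cited
```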
 
Because the management at Canon sees Sony as more of a threat, at this point, than Nikon. And Sony is relentlessly pushing pixel counts.
Sure, Sony has a 24MP FF sensor, but their entry-level models (all but the a700) haven't gone anywhere in 18 months. They refreshed their three entry-level cameras this year, kept the sensor the same, and have not made a replacement for the a700.

So how can you say they relentlessly push pixel counts?!
In their advertising. That, and SteadyShot, is all they talk about. The constant plugging of pixel counts has an effect on the marketplace. But they're the ones who have pushed pixel counts to the levels at which they are. And that was pretty relentless. That they haven't introduced a new entry level camera isn't really relevant, and don't forget about the a850.
--
Skip M
http://www.shadowcatcherimagery.com
http://www.pbase.com/skipm
http://skipm.smugmug.com/
'Living in the heart of a dream, in the Promised Land!'
John Stewart
 
Ask ANY pro photographer what they want, a FF camera with large pixels or a crop camera with more but smaller pixels. Any sane photog will give you the same answer.
You know, it seems the people on here that complain that the 1D line is aps-h are usually those that don't own a 1-series and would not buy one. They are arm-chair sports photographers and sensor designers.

I remember many sports photographers on here that were happy the 1D4 is APS-H. It really is the better design for Canon's system. Its cross-type AF points only function with f/2.8 or faster lenses, so you're able to use a 300 or 400 f/2.8 and have 39 cross-type AF sensors. You can use a shorter lens and have the FOV of a longer lens.

Now consider FF. You'd have to buy a 500 or 600 f/4 to get the same FOV as APS-H, and oops!! you lose your cross-type AF sensors because your lens is now f/4 instead of f/2.8. Or you crop and use a shorter lens, but then you really are giving away your resolution, and if you're going to do that, you might as well just use a 1D2, because you'd end up with about 8MP by cropping 12MP FF to APS-H sensor size. Canon knows what they are doing.
I thought the cross-type sensors worked to f/5.6 on the 1-series. I know that AF is read up to f/8, rather than the f/5.6 of the lesser cameras.
--
Skip M
http://www.shadowcatcherimagery.com
http://www.pbase.com/skipm
http://skipm.smugmug.com/
'Living in the heart of a dream, in the Promised Land!'
John Stewart
 
You know, DxOMark is just one of many ways of building a measurement, and it does not take into account overall IQ, the AA filter, processor, lens, etc.

For instance, the Nikon D90 has a higher DxO score than even the new D300s, and the D3x and 5D Mk II have better scores than most MF cameras. You are not going to tell me DxO is consistent with IQ?

In numerous tests the IQ of the 5D Mk II is better than the D3x, so who needs to catch up?
 
Yeah... I read parts of the DxO website and found that their results are essentially per-pixel. So, like you said, even if each pixel reads more noise, if there are more of them (like going from APS-C to full frame), you still end up with an S/N improvement overall. So Canon probably feels this is fine, since they can also market a higher MP count.

When I complain about high MP, it's more about the practicality of handling large files, which isn't necessary for 99.9% of the shots that I take. It would just be nicer if the default file sizes were smaller (due to lower MP), and have even better IQ due to that (since more tech/materials of the sensor could go into each pixel rather than the supporting electronics).
Shoot sRAW mode, you probably gain a stop less noise too.
 
Most of those people are going to be kicking themselves in the rear in ten years, when their images look pixelated and jagged on their 100MP monitors, with mild-but-large noise grain.
Ten years is too long to keep up in the digital industry. Even the guys with 15MP will be kicking themselves unless they accept that one cannot keep pace with current technology without upgrading the rest.

It is just how it is with digital - I know that my 7-year-old digitized pictures aren't going to fill the consumer monitors of the future. But they are still a valuable part of my memories, even though they are 2MP. And last but not least, only after those 7 years do I own a monitor which can display them fully at 1:1 (1600x1200), so I think it'll be some more years yet before they become really pixelated, once 8+ MP monitors hit the consumer sector.

So there. And I was only talking about 2MP. Moore's law (that Moore from Intel ;-)) is at work, but not as fast as expected.
--
Cheers,
Martin

 
What I contend is that IQ is going up at the pixel level: lower noise, better SNR (read: DR), better sharpness, better color. At the same time, MP counts are also going up.
The sensors of the 450D, 500D & 50D are not better than that of the 40D. Even resolution, which was supposed to be the main beneficiary, increased only in the nominal pixel size of images, and very slightly, if at all, in resolving power. In order to produce a higher-MP sensor that can resolve significantly more with no negative impact on DR, colours, etc., the production cost of the sensor, and consequently the price tag of the camera, has to go higher. That is not always possible, because manufacturers have to keep prices at certain points defined by the class of each camera. It is particularly difficult to maintain both the price and the quality at the same time when the MP increases are big and come quickly one after the other, as Canon has been doing lately.
 
What you don't seem to understand is that if a 6400 ISO 12MP image taken by the D3s is better in absolute terms than a 6400 ISO 16 MP image taken by the 1dIV (which appears to be the case), then the same crop will still be better with the D3s.
No. The more you crop an image, the noisier it gets at its final size. Cropping an image is exactly the same thing as using a smaller sensor; less resolution, and more noise.
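As a back-of-the-envelope illustration of why cropping costs SNR at a fixed output size (assuming a shot-noise-limited exposure; the sensor dimensions below are approximations):

```python
import math

# Shot-noise-limited SNR scales with the square root of the photons
# collected; at fixed scene brightness and exposure, the photon count
# scales with the sensor area actually used for the final image.
def relative_snr(area_used, area_full):
    """SNR of the cropped image relative to the full-frame one."""
    return math.sqrt(area_used / area_full)

ff_area = 36.0 * 24.0     # full frame, mm^2
apsh_area = 28.1 * 18.7   # approximate Canon APS-H, mm^2

print(round(relative_snr(apsh_area, ff_area), 2))  # ~0.78
```

The same scaling applies to any crop: halve the area used and the shot-noise SNR of the final print drops by a factor of sqrt(2).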

--
John

 
It's been shown that Canon, among others, uses a noise reduction program apart from the JPEG processing.
Noise avoidance. There is no sign of signal-robbing filter-based "noise reduction", after the fact, in the RAW data.
I'm not sure why you think that it's an "absurd theory" that increasing pixel density increases noise, it's physics.
NO!!! It is not "physics"; it is BS. It's only physics to confused people. Noise from breaking down into more pixels is just an illusion of invalid perspective.

Imagine you had a square tray, with an array of 10x10 smaller trays inside it. Imagine that you had a cardboard grid that sat over it and had a 20x20 array of compartments in it. Now, you take a bunch of marbles and sprinkle them in, trying to form a shape. Now, you pull up the finer cardboard grid, and groups of marbles from 4 original bins combine into one. Did you just lessen the noise of this image? Of course not; you have only lost resolution.
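The tray analogy can be simulated with random photon counts. A minimal sketch in plain Python (Poisson arrivals approximated by many small Bernoulli trials): merging 2x2 fine pixels into one coarse bin changes the per-pixel statistics but neither creates nor destroys any of the captured signal.

```python
import random

random.seed(0)

# Each fine pixel counts "marbles": Poisson-like arrivals with mean 9,
# approximated here by 1000 Bernoulli trials of probability 9/1000.
def photon_count(mean, trials=1000):
    p = mean / trials
    return sum(1 for _ in range(trials) if random.random() < p)

# A 20x20 "fine grid" of pixels.
fine = [[photon_count(9) for _ in range(20)] for _ in range(20)]

# Pull up the cardboard grid: merge each 2x2 group into one coarse bin.
coarse = [[fine[2*r][2*c] + fine[2*r][2*c + 1] +
           fine[2*r + 1][2*c] + fine[2*r + 1][2*c + 1]
           for c in range(10)] for r in range(10)]

# Binning neither adds nor removes a single "marble"...
assert sum(map(sum, fine)) == sum(map(sum, coarse))
# ...it only trades resolution for higher per-pixel counts (mean 9 -> 36,
# so per-pixel SNR roughly doubles, from sqrt(9)=3 to sqrt(36)=6).
```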

--
John

 
The problem is, compared side by side at higher ISO, Canon's IQ at 400 and below is stellar, but the 5D Mk II above ISO 1000 shows way too much noise in the blacks. Adding more processors does not fix that; software was never a solution for an inherent hardware issue. It is a known fact that a FF sensor has only so much real estate: the more MP you put on it, the smaller the pixels are; the smaller they are, the less pixel pitch; and pixel pitch determines light-gathering capability, which is a front-line issue.

So at what point do we say enough is enough, concentrate on dynamic range and usable ISOs, and stop being given ridiculous pixel counts that only fulfil part of the IQ equation - resolution (which does not increase in fixed relationship to the number of pixels), dynamic range, and light-gathering capability?

I say my 1D Mk III, at 10MP and APS-H, could put a stunning 40x50 on the wall. Anyone tell me here how many people blow their images up past 24x36? So where the heck are we going with the pixel race? I want more usable ISO, more headroom, and more dynamic range. I love Canon, but I think Nikon's model with FF sensors - limit the MP and keep the pixel pitch up to balance ISO - is a great way to go. Anyone agree? I hope Canon is listening.
 
Shoot sRAW mode, you probably gain a stop less noise too.
Neither sRAW, nor downsampling, nor software binning really reduce image noise.

--
Yes they do -- they eliminate high frequency image noise, albeit in a crude and inefficient way, by throwing away image detail at the same rate that they eliminate noise at fine scales. You are correct though that noise at coarser scales (spatial frequencies well below Nyquist of the downsampled image) is unaffected by any of these approaches.
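Emil's point can be demonstrated numerically: box-average downsampling halves the per-pixel (fine-scale) noise, while the average over any coarse block, and hence the noise at coarse spatial scales, is mathematically identical before and after. A sketch in plain Python:

```python
import random

random.seed(1)
N = 64
# A pure-noise "image": independent Gaussian noise, sigma = 1.
noise = [[random.gauss(0, 1) for _ in range(N)] for _ in range(N)]

# 2x2 box-average downsample (a crude stand-in for binning/downsizing).
down = [[(noise[2*r][2*c] + noise[2*r][2*c + 1] +
          noise[2*r + 1][2*c] + noise[2*r + 1][2*c + 1]) / 4
         for c in range(N // 2)] for r in range(N // 2)]

def std(vals):
    m = sum(vals) / len(vals)
    return (sum((v - m) ** 2 for v in vals) / len(vals)) ** 0.5

flat = lambda img: [v for row in img for v in row]

# Fine-scale noise: averaging 4 samples halves the per-pixel sigma.
print(round(std(flat(noise)), 2), round(std(flat(down)), 2))  # ~1.0 vs ~0.5

# Coarse-scale noise: the mean over a 4x4 block of the original equals
# the mean over the matching 2x2 block of the downsample, so fluctuations
# well below Nyquist of the downsampled image are untouched.
mean_orig = sum(noise[r][c] for r in range(4) for c in range(4)) / 16
mean_down = sum(down[r][c] for r in range(2) for c in range(2)) / 4
assert abs(mean_orig - mean_down) < 1e-12
```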

--
emil
http://theory.uchicago.edu/~ejm/pix/20d/
 
It's been shown that Canon, among others, uses a noise reduction program apart from the JPEG processing.
Noise avoidance. There is no sign of signal-robbing filter-based "noise reduction", after the fact, in the RAW data.
Noise avoidance, as you call it, often results in loss of resolution, negating gains made by increasing pixel count.
I'm not sure why you think that it's an "absurd theory" that increasing pixel density increases noise, it's physics.
NO!!! It is not "physics"; it is BS. It's only physics to confused people. Noise from breaking down into more pixels is just an illusion of invalid perspective.

Imagine you had a square tray, with an array of 10x10 smaller trays inside it. Imagine that you had a cardboard grid that sat over it and had a 20x20 array of compartments in it. Now, you take a bunch of marbles and sprinkle them in, trying to form a shape. Now, you pull up the finer cardboard grid, and groups of marbles from 4 original bins combine into one. Did you just lessen the noise of this image? Of course not; you have only lost resolution.
John, it isn't an illusion. How do you account for the increased noise in sensors with high pixel counts and small sensor sizes? Consistently, when pixel count is increased and the size of the sensor isn't, noise goes up. The bucket analogy is a valid one. You know that, you've discussed it in the past from just that standpoint.

--
Skip M
http://www.shadowcatcherimagery.com
http://www.pbase.com/skipm
http://skipm.smugmug.com/
'Living in the heart of a dream, in the Promised Land!'
John Stewart
 
It's been shown that Canon, among others, uses a noise reduction program apart from the JPEG processing.
Noise avoidance. There is no sign of signal-robbing filter-based "noise reduction", after the fact, in the RAW data.
I'm not sure why you think that it's an "absurd theory" that increasing pixel density increases noise, it's physics.
NO!!! It is not "physics"; it is BS. It's only physics to confused people. Noise from breaking down into more pixels is just an illusion of invalid perspective.
No, it's just the "photonic reality": the light/signal itself gets more and more inaccurate (noisy) with increasing sampling rates, and the sensor resolves this increased variation/noise in the light itself perfectly, which unfortunately isn't the case with the 'real' image detail (because lenses aren't perfect, there's the AA filter, etc.).
Imagine you had a square tray, with an array of 10x10 smaller trays inside it. Imagine that you had a cardboard grid that sat over it and had a 20x20 array of compartments in it. Now, you take a bunch of marbles and sprinkle them in, trying to form a shape. Now, you pull up the finer cardboard grid, and groups of marbles from 4 original bins combine into one. Did you just lessen the noise of this image? Of course not; you have only lost resolution.

--
John

 
It's been shown that Canon, among others, uses a noise reduction program apart from the JPEG processing.
Noise avoidance. There is no sign of signal-robbing filter-based "noise reduction", after the fact, in the RAW data.
Noise avoidance, as you call it, often results in loss of resolution, negating gains made by increasing pixel count.
"Noise avoidance" methods, like correlated double sampling, or banding correction, do not lose detail. There is a big difference between filtering away noise, which is gambling, and preventing noise from being ever allowed to contaminate the signal.
I'm not sure why you think that it's an "absurd theory" that increasing pixel density increases noise, it's physics.
NO!!! It is not "physics"; it is BS. It's only physics to confused people. Noise from breaking down into more pixels is just an illusion of invalid perspective.

Imagine you had a square tray, with an array of 10x10 smaller trays inside it. Imagine that you had a cardboard grid that sat over it and had a 20x20 array of compartments in it. Now, you take a bunch of marbles and sprinkle them in, trying to form a shape. Now, you pull up the finer cardboard grid, and groups of marbles from 4 original bins combine into one. Did you just lessen the noise of this image? Of course not; you have only lost resolution.
John, it isn't an illusion. How do you account for the increased noise in sensors with high pixel counts and small sensor sizes?
There is nothing to account for, because it does not happen, unless you are stuck in a pixel-centric illusion.
Consistently, when pixel count is increased and the size of the sensor isn't, noise goes up. The bucket analogy is a valid one. You know that, you've discussed it in the past from just that standpoint.
No, it does not. I don't know where you get this crazy, mistaken idea that noise increases with pixel density. Every significant case of lower density coinciding with lower noise has really been down to newer technology, not the lower density itself.

Even when you are comparing a higher-res image with more noise to a lower-res one with less noise, I can still often appreciate the higher resolution more, anyway.

--
John

 
NO!!! It is not "physics"; it is BS. It's only physics to confused people. Noise from breaking down into more pixels is just an illusion of invalid perspective.
No, it's just the "photonic reality", which is that the light/signal itself gets more and more inaccurate (noisy) with increasing sampling rates,
Nonsense. It is an illusion, because the observer is assuming that the natural, equalizing thing to do is to increase pixel display magnification along with the increased density.

and the sensor resolves this increased variation/noise in the light itself perfectly, which unfortunately isn't the case with the 'real' image detail (because lenses aren't perfect, and the AA-filter, etc.).

Light itself is "noisy", by our idealistic standards. There is no such thing as even light. Light is like vibrating raindrops of individual photons, with infinite resolution and infinite local noise levels at the virtual analog pixel level.

Let's say you did somehow have a spatially analog sensor, with no binning. What size would you view it at on a 22" analog monitor? Would you zoom in infinitely to match the "pixel density"? The deeper you go, the more you will see highly isolated dots of individual photons, and the more likely the screen will be completely black, because the area represented has no photon strikes at all.

That is the reality which we are attempting to record. Information like that would be optimal, but unwieldy. It would not be noisy, except in literal presentation.

--
John

 
Even when you are comparing a higher-res image with more noise to a lower-res one with less noise, I can still often appreciate the higher resolution more, anyway.
Nope.
To the human eye, noise is more visible/obtrusive than blur from lack of detail.

Check out this Stanford University paper, aptly named ‘Resolution and Light Sensitivity Tradeoff with Pixel Size’:
http://www.imageval.com/public/Papers/ResolutionSensitivityTradeoff_SPIE06

Everything else being equal, smaller-sized pixels do result in more noise.

But since the technology in newer sensors mitigates the noise to some extent, one gets the false impression that megapixels increases do not increase noise.

But make a low-res sensor using the same new technology and the difference will be obvious.
 
