Would a 500 megapixel camera give more zoom?

JohnHarlin

There is talk of a 500 megapixel camera being developed.

So I was wondering whether you would get more zoom the more megapixels a camera has.

Say, for example, you took a photo of the planet Jupiter at 500 megapixels. How much more zoom would you get than with a 10 megapixel camera?
 
Well, theoretically you could get sqrt(500/10) ≈ 7.07x digital zoom. In practice, due to diffraction and lens aberrations, it would be maybe half of that (with excellent optics) or even less.

By the way, for a good photo of Jupiter you need a lens or telescope with a diameter of 100 mm / 4 inches or larger. No amount of megapixels will achieve that with a small lens, because of the limited angular resolution of smaller-diameter optics.
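A quick back-of-the-envelope sketch in Python of both claims, assuming 550 nm green light and a typical apparent Jupiter diameter of ~45 arcsec (both assumptions, not measurements):

```python
import math

# Equivalent "digital zoom" from cropping: scales with linear resolution,
# i.e. the square root of the megapixel ratio.
zoom = math.sqrt(500 / 10)
print(f"theoretical crop zoom: {zoom:.2f}x")            # ~7.07x

# Rayleigh diffraction limit for a 100 mm aperture: theta = 1.22 * lambda / D.
wavelength_m = 550e-9                                   # assumed green light
aperture_m = 0.100
theta_arcsec = math.degrees(1.22 * wavelength_m / aperture_m) * 3600
print(f"resolution limit: {theta_arcsec:.2f} arcsec")   # ~1.4 arcsec

# Jupiter spans roughly 35-50 arcsec, so a 100 mm aperture resolves only a
# few dozen elements across the disc, no matter how many megapixels you have.
jupiter_arcsec = 45                                     # assumed typical value
print(f"elements across Jupiter: {jupiter_arcsec / theta_arcsec:.0f}")  # ~33
```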
 
All true.

What you do get, however, is an almost complete lack of aliasing, false colour, etc., and much better hue resolution at high frequencies, since the blue and red channels are now both 125 MP.

Effectively you have a full colour resolution of 125 MP, which would be quite awesome for landscape photography. Those red poppies would no longer disappear in green fields.
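For reference, those per-channel counts follow directly from the Bayer layout, where each 2x2 tile holds one red, two green, and one blue photosite; a minimal check:

```python
def bayer_channels_mp(total_mp: float) -> dict:
    """Per-channel photosite counts for a standard Bayer mosaic
    (each 2x2 tile: 1 red, 2 green, 1 blue)."""
    return {"red": total_mp / 4, "green": total_mp / 2, "blue": total_mp / 4}

print(bayer_channels_mp(500))  # {'red': 125.0, 'green': 250.0, 'blue': 125.0}
```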
 
Megapixels don't give you "zoom", a.k.a. magnification; lenses do that. What megapixels give you is detail/resolution. Higher-megapixel images can be cropped in more, but this doesn't scale linearly: to find out how much more crop room you have, you need the pixel dimensions (for example, 4000 x 6000 pixels = a 24 megapixel sensor).

Now the reality is that 500 megapixels would require lenses with extreme resolving power and absurdly fast apertures. I'm no expert, but at full-frame sensor size I'm not even sure diffraction allows for a lens that can resolve 500 MP.
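A rough sketch of why, assuming a 36 x 24 mm sensor, 550 nm green light, and the common (rule-of-thumb, not exact) requirement that the Airy disk fit within about two pixels:

```python
import math

SENSOR_W_MM, SENSOR_H_MM = 36.0, 24.0   # full frame
PIXELS = 500e6
WAVELENGTH_UM = 0.55                    # assumed green light

# Pixel pitch for 500 MP on full frame.
pitch_um = math.sqrt(SENSOR_W_MM * SENSOR_H_MM / PIXELS) * 1000
print(f"pixel pitch: {pitch_um:.2f} um")                    # ~1.31 um

# Largest f-number whose Airy disk (diameter 2.44 * lambda * N) still fits
# within ~2 pixels -- a rule of thumb, not a hard cutoff.
max_n = 2 * pitch_um / (2.44 * WAVELENGTH_UM)
print(f"diffraction-limited beyond about f/{max_n:.1f}")    # ~f/2
```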
 
So I was wondering whether you would get more zoom the more megapixels a camera has.
Just a reminder: 'any' camera can be a 500 MP camera with stitching :)
 
You can get one of those 100 MP cell phones and check if it gives more "zoom" right now.
 
When I get one (a long time after they first arrive) I'll let you know :)
 
One thing that will never change, even with a zillion pixels, is spatial expansion or compression (perspective). Those are purely optical, unless there is some AI someday that can alter that.

Mike
 
You can get one of those 100 MP cell phones and check if it gives more "zoom" right now.
You'd need to make sure it's an actual 100 MP sensor. My Pixel 6 has a 50 MP sensor, but it also has a quad-Bayer array, so the effective resolution ends up being just over 12 MP.

Also, I remember watching a Northrup video where they compared a 100 MP Samsung phone to a Canon RP with the 24-240 RF lens on it, and the Canon resolved far more detail. Diffraction is a killer.
 
So I was wondering whether you would get more zoom the more megapixels a camera has.
The resolution you get from any lens will be severely limited by that lens's f/stop, even if you get a great lens that is well corrected for aberrations. As others mentioned, it's diffraction that will limit the camera's performance.

A series of articles explains it:

https://blog.kasson.com/the-last-word/whats-your-q/

Up to a point, the camera will perform better with more megapixels; beyond that, no.
--
http://therefractedlight.blogspot.com
 
The philosophy presented by Jim Kasson in that series of articles is a little different from the common approach to choosing a suitable lens and sensor combination. Usually, people want a pixel-sharp image, which leads to a lens that outresolves the sensor. This leads to aliasing and moiré artifacts due to the Bayer CFA. Jim outlines the opposite approach: he derives a sensor resolution such that all the information provided by the lens is captured by the sensor without artifacts.

By his calculations, diffraction-limited f/8 optics require 1.6 um pixels at Q = 2.8, which corresponds to roughly a 340 MP FF sensor.
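Assuming the usual definition Q = λ·N / pixel pitch (at Q = 2 the diffraction cutoff sits exactly at the sensor's Nyquist frequency) and ~550 nm light, those figures check out:

```python
SENSOR_W_UM, SENSOR_H_UM = 36000, 24000   # full frame, in microns
WAVELENGTH_UM = 0.55                      # assumed green light

def pitch_for_q(f_number: float, q: float) -> float:
    """Pixel pitch (um) from Q = wavelength * N / pitch."""
    return WAVELENGTH_UM * f_number / q

def full_frame_mp(pitch_um: float) -> float:
    return (SENSOR_W_UM / pitch_um) * (SENSOR_H_UM / pitch_um) / 1e6

p = pitch_for_q(8, 2.8)
print(f"{p:.2f} um -> {full_frame_mp(p):.0f} MP")  # ~1.57 um -> ~350 MP,
                                                   # i.e. roughly the 340 MP above
```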
 
Still not sure that will deal with false colour moiré.

You need to double it, given that red and blue are sampled at half the sensor frequency, and you might also want to use blue as the defining wavelength, in which case you need more spatial resolution again, plus the sharpest aperture.

That requires more like 4,000 MP.

However, if you also account for human acuity and the MTF of displays and prints, all of which are multiplied to get the perceived sharpness, I would think that about 500 MP would be enough for practical purposes.

You could also use it in a quad-Bayer arrangement, which would make for much easier processing - no demosaicing required, and 125 full-colour megapixels.
 
According to his simulations, false colour is almost gone at Q = 2 and totally gone at Q = 2.8; see the examples here: https://blog.kasson.com/the-last-word/the-effect-of-q-on-false-color-and-aliasing/

The Airy disk diameter is around 10 um in green light at f/8, so 1.6 um pixels are already a lot smaller than that. And don't forget that he assumed perfect optics in his simulation; any aberrations will decrease the MP requirements.
 
If you read further, he mentions that even at Q = 2.8 there is some moiré in a high-contrast chart, and that ideally the Q factor should be 4, but he decided to average the values.

He is also using f/8, not the sharpest aperture, which is usually f/4 in the centre.

So for a Q of 4 and f/4, you would still need about 4,000 MP to eliminate moiré entirely.
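Plugging those numbers into the same Q = λ·N / pitch relation, with green light for the baseline and blue as the defining wavelength, lands in the same ballpark:

```python
# Q = wavelength * N / pitch; solve for pitch at Q = 4 and f/4,
# then count pixels on a 36 x 24 mm full-frame sensor.
for wavelength_um, label in [(0.55, "green"), (0.45, "blue")]:
    pitch_um = wavelength_um * 4 / 4
    mp = (36000 / pitch_um) * (24000 / pitch_um) / 1e6
    print(f"{label}: {pitch_um:.2f} um -> {mp:.0f} MP")
# green: ~2857 MP; with blue as the defining wavelength: ~4267 MP,
# i.e. "about 4,000 MP"
```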
 
I think it is fair to note that this analysis is what's used particularly by "ground-observing orbiting telescopes", otherwise known as spy satellites. From what I can tell, Q = 2 is a balanced optical design, but that's for monochrome images.

A lot of photographers prefer lenses that 'outresolve' the sensor, because it often looks good at 100%. Fake detail is better than no detail when it comes to art photos - and look at the popularity of AI enhancement, which is all about fakery - but as you might imagine, that could be disastrous when making military decisions.
 
So for a Q of 4 and f/4, you would still need about 4,000 MP to eliminate moiré entirely.
Ah, OK, I wasn't sure if you were talking about f/8 or not. Yes, it mathematically works out that way at large apertures. It's too bad that Jim didn't provide a simulation at Q = 4 but stopped at Q = 2.8.

Nevertheless, multi-gigapixel sensors face two serious issues:
  1. The depth of field where they could provide an advantage is really small. In this DOF calculator, the circle of confusion can be set down to 1 um for a demonstration of how thin it gets at such a setting (see the sketch below): https://www.dofmaster.com/dofjs.html
  2. Increasing the number of pixels by two orders of magnitude also decreases the amount of light hitting each pixel by that factor, so pixel-level image quality at base ISO would resemble what we currently get around ISO 10000. This might cause more issues than it solves. Based on what I have seen in the local studio test scene, aliasing artifacts and moiré seem to get worse with increasing ISO. I'm not really sure why that happens, but it looks like this:

https://www.dpreview.com/reviews/im...=1&x=-0.599030059785108&y=0.35124828146481696
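For a sense of point 1, here is a thin-lens DOF sketch using the standard hyperfocal approximation (the 50 mm / f/4 / 2 m values are purely illustrative):

```python
def total_dof_mm(focal_mm, f_number, coc_mm, subject_mm):
    """Total depth of field from the standard hyperfocal approximation."""
    h = focal_mm**2 / (f_number * coc_mm) + focal_mm       # hyperfocal distance
    near = subject_mm * (h - focal_mm) / (h + subject_mm - 2 * focal_mm)
    far = subject_mm * (h - focal_mm) / (h - subject_mm)   # assumes subject < h
    return far - near

# 50 mm at f/4, subject at 2 m: conventional FF CoC vs a 1 um CoC.
print(f"CoC 30 um: {total_dof_mm(50, 4, 0.030, 2000):.0f} mm")  # ~380 mm
print(f"CoC  1 um: {total_dof_mm(50, 4, 0.001, 2000):.1f} mm")  # ~12.5 mm
```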
 
A lot of photographers prefer lenses that 'outresolve' the sensor, because it often looks good at 100%. Fake detail is better than no detail when it comes to art photos - and look at the popularity of AI enhancement, which is all about fakery - but as you might imagine, that could be disastrous when making military decisions.
It would be quite funny if an upsampling AI trained on images of ICBM launchers started hallucinating them everywhere :-)
 
Apart from the limitations due to lens performance and diffraction, "digital zoom", i.e. cropping, will be blighted by noise issues, owing to the fewer photons reaching the cropped area of the sensor compared to the entire frame when viewed at the same print/display size (the larger area being aided by a greater degree of pixel binning).
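In shot-noise-limited terms: cropping to 1/k of the frame width keeps 1/k² of the photons at the same display size, so SNR drops roughly by a factor of k. A toy calculation (the 7.07x crop echoes the sqrt(500/10) figure from earlier):

```python
import math

# Shot-noise-limited approximation: cropping to 1/k of the frame width keeps
# 1/k^2 of the photons; SNR goes as the square root of the photon count, so
# at a fixed display size SNR drops by a factor of k.
for k in (2, 4, 7.07):
    photons_kept = 1 / k**2
    snr_drop = k
    print(f"{k}x crop: {photons_kept:.1%} of the light, "
          f"SNR down {snr_drop:.2f}x ({20 * math.log10(snr_drop):.1f} dB)")
```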
 
Nevertheless, multi-gigapixel sensors face two serious issues:
  1. The depth of field where they could provide an advantage is really small. In this DOF calculator, the circle of confusion can be set down to 1 um for a demonstration of how thin it gets at such a setting: https://www.dofmaster.com/dofjs.html
It depends what advantage you mean. It will largely eliminate aliasing, and the lens will then define the performance. So it's more about aliasing than detail.

If we create larger images, we always make noise more visible when we look closely. That won't change.

If the performance is limited by the lens, then it's entirely predictable if we know the lens performance.
  2. Increasing the number of pixels by two orders of magnitude also decreases the amount of light hitting each pixel by that factor, so pixel-level image quality at base ISO would resemble what we currently get around ISO 10000. This might cause more issues than it solves. Based on what I have seen in the local studio test scene, aliasing artifacts and moiré seem to get worse with increasing ISO.
Shows how useful it would be to eliminate aliasing.
 
