Sorry if this has been posted before, but I am having trouble finding any information on it. Any help would be greatly appreciated.
There is something that has been bothering me for the longest time, and it relates to stopping a lens down. We all know that as the aperture is decreased, the depth of field increases but the amount of available light decreases. What I can never get my mind around is this: how is it possible for the image on the sensor, and in the viewfinder, to be exactly the same regardless of the aperture size?
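(To put rough numbers on the light-loss part: the light gathered scales with the aperture area, so it goes as 1/N² for f-number N. A quick sanity-check sketch, with the f-stop values just picked as examples:)

```python
# Relative light gathered scales with aperture area, i.e. 1/N^2 for
# f-number N, so each full stop (N grows by sqrt(2)) halves the light.
f_stops = [2.8, 4.0, 5.6, 8.0]  # example values, one stop apart
reference = f_stops[0]
for n in f_stops:
    relative = (reference / n) ** 2  # light relative to the widest stop
    print(f"f/{n}: {relative:.2f}x the light of f/{reference}")
```

Running this shows each successive stop passing roughly half the light of the one before it, which is the "less light" half of the trade-off; the depth-of-field half is a separate geometric effect.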
For example, when looking through the lens directly, one sees a circular shape. When the lens is stopped down, the circle shrinks. Clearly, the image the eye can see is no longer the same; the stopped-down view shows much less information. Another way to visualize it is to imagine looking at a bullseye. When the lens is wide open, the entire bullseye can be seen; when stopped down, just the center point is visible.
However, this is not the case for the view on the sensor or in the viewfinder. The viewfinder and sensor show the same image, just at different light levels, regardless of the aperture. How can this be? Please, if anyone can shed some light (pun intended) on this matter, I would greatly appreciate it.
--
http://www.flickr.com/photos/xinogage