Paul Belanger
Active member
The Mac now has a 5K monitor, which should be really good. Most likely it is.
One issue I have noticed with higher resolution monitors, though, is that you get less magnification when using the 100% crop view to check out details like noise and sharpness while photo editing.
Let me explain how I see it. Let's say one has a standard 1920x1080 monitor. When viewing the entire image the software has to scale it down, and artifacts appear because the camera's pixels cannot be matched to screen pixels on a one for one basis. So most if not all photo editing programs offer zoom views like 50%, 100%, 200%, etc. The 100% view crops the image so as to present a pixel for pixel view on the screen: the software compares the camera's pixel dimensions with the screen's and shows just the portion of the camera image that fits, one image pixel per screen pixel.
Example: my monitor is 1680x1050 and my D90 produces images of about 4100x2800. So when I set the view to 100% to check details I see only a 1680x1050 window of the total image, roughly 40% of each dimension. Using a D7100, with about 6000x4000, on the same monitor at 100% I again see 1680x1050, which is a smaller portion of the image, just over a quarter of each dimension, so more magnification in a sense. I suspect the software simply maps image pixels to screen pixels one for one, so the fraction you see along each side depends only on how the image dimensions compare with the monitor's.
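To put rough numbers on that, here is a quick back-of-the-envelope sketch in Python (just the arithmetic, not any particular editor's code), using the approximate pixel counts above:

```python
# Rough sketch: what fraction of the image a 100% view can show,
# using the approximate pixel counts from the post.

def visible_fraction(image_w, image_h, screen_w, screen_h):
    """At 100% zoom one image pixel maps to one screen pixel,
    so the window shows at most screen_w x screen_h image pixels."""
    frac_w = min(screen_w / image_w, 1.0)
    frac_h = min(screen_h / image_h, 1.0)
    return frac_w, frac_h

monitor = (1680, 1050)

for name, size in [("D90 (~12 MP)", (4100, 2800)),
                   ("D7100 (~24 MP)", (6000, 4000))]:
    fw, fh = visible_fraction(*size, *monitor)
    print(f"{name}: {fw:.0%} of the width, {fh:.0%} of the height visible at 100%")

# Output:
# D90 (~12 MP): 41% of the width, 38% of the height visible at 100%
# D7100 (~24 MP): 28% of the width, 26% of the height visible at 100%
```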
As a general rule, as I understand it, the higher the camera's resolution and the lower the monitor's resolution, the greater the magnification of your image at 100% crop.
So on a higher resolution display like the newer 5K monitors, some cameras, like the D90, or any camera whose images are 5K or smaller, will gain no magnification at 100%. In fact some images will not even fill the screen and may appear smaller than the available screen area.
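A quick check of that with the same sort of arithmetic (a 5K display is 5120x2880):

```python
# Rough check: does the whole image fit on screen at 100% zoom?
# A 5K display is 5120 x 2880 pixels.

def fits_at_100(image_w, image_h, screen_w=5120, screen_h=2880):
    return image_w <= screen_w and image_h <= screen_h

print(fits_at_100(4100, 2800))  # D90-class image: True, the whole frame fits without filling the screen
print(fits_at_100(6000, 4000))  # D7100-class image: False, still slightly larger than the display
```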
Perhaps some enterprising software developer might consider using some form of binning on the monitor, i.e. a lower effective monitor resolution, to provide a pseudo 100% detail view with greater magnification.
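For what it's worth, the effect would be much like rendering each image pixel as a 2x2 block of screen pixels, i.e. a nearest-neighbour 200% view. A minimal sketch with Pillow, assuming Pillow is installed and using a placeholder file name:

```python
# Minimal sketch of a "pseudo 100%" view: show each image pixel as a
# 2x2 block of screen pixels (nearest-neighbour upscale), which is what
# binning the monitor down to half resolution would amount to.
from PIL import Image

BIN = 2  # treat 2x2 screen pixels as one "binned" pixel

img = Image.open("example.jpg")  # placeholder file name
# Take the slice that will fill a 1680x1050 screen after upscaling.
crop = img.crop((0, 0, 1680 // BIN, 1050 // BIN))
pseudo_100 = crop.resize((crop.width * BIN, crop.height * BIN), Image.NEAREST)
pseudo_100.show()
```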
