Oly, perhaps a new player in this market segment

Started Aug 18, 2012 | Discussions thread
crames
Regular Member • Posts: 192
Re: Oly, perhaps a new player in this market segment
In reply to bobn2, Aug 24, 2012

bobn2 wrote:

crames wrote:

bobn2 wrote:

(trimmed to fit)

I think your example shoots you in the foot. The enlarged one clearly looks (and is) noisier.

Not on my monitor. Everything is larger, but I would not say it looks noisier. If it looks noisier to you despite being smoothly interpolated, I would guess that your display has something to do with it.

If, as you say, the noise increases when enlarging, why would the noise be affected differently, relative to the signal, resulting in a lower SNR? How does enlarging differentiate between signal and noise?

Point being that the enlarged image has the same noise, as measured by its standard deviation.

You are using different scales of measurement. Compare the SD over 1mm square samples of the original and the enlarged image; you'll find the SD is clearly not the same.

I did that, and measuring the noise over multiple patches and different images confirms that the noise is not increased. It seems you don't want to spend 30 seconds in Photoshop or ImageJ to see this for yourself.
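For anyone who does want to try it, here is a minimal sketch of that test in Python (numpy/scipy, my own toy numbers, not a measurement from a real camera): make a flat gray patch with known noise, enlarge it 2x with band-limited (sinc) interpolation, and compare the standard deviations.

import numpy as np
from scipy.signal import resample  # FFT-based (sinc) resampling

rng = np.random.default_rng(0)
# Flat gray patch with Gaussian noise of SD 5 standing in for photon noise.
patch = 100.0 + 5.0 * rng.standard_normal((256, 256))

# Enlarge 2x, one axis at a time, without low-pass filtering anything away.
big = resample(resample(patch, 512, axis=0), 512, axis=1)

print(f"original SD: {patch.std():.2f}")  # ~5.0
print(f"enlarged SD: {big.std():.2f}")    # ~5.0, i.e. enlarging adds no noise

(FFT resampling treats the image as periodic, so expect a tiny wobble near the borders; the patch SDs still agree to within sampling error.)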

Just think about what is happening in the spatial-frequency domain.

Obviously we don't all use crops, because the signal is smoothed the same as the noise. And a P&S would be less noisy than a D800 only if its pixels could capture more photons.

Same mistake, counting photons in input (camera) samples, not output (printer, display) samples, the things the eye actually sees.

Same mistake as what?

I explained in one of my replies to Michael. The noise is determined by the number of photons in the image. There are fewer photons in the crop, hence the crop is more noisy.

What matters in this case is the number of photons in the pixels, because that is what establishes the proportion of photon noise in each element of the image.
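For reference, photon arrival is Poisson, so a pixel that collects on average N photons shows shot noise of sqrt(N) and hence an SNR of sqrt(N). A quick simulation, with assumed photon counts that aren't tied to any particular camera:

import numpy as np

rng = np.random.default_rng(1)
for n_photons in (100, 10_000):               # small pixel vs. large pixel
    samples = rng.poisson(n_photons, 100_000)  # simulated photon counts
    print(n_photons, samples.mean() / samples.std(), np.sqrt(n_photons))
    # measured SNR matches sqrt(N): ~10 for 100 photons, ~100 for 10,000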

Again, wrong samples, display samples are what needs to be counted, not camera samples.

So how do the display samples explain it better than the camera samples?

The crop has the same proportion of photon noise, which doesn't change when enlarging.

Wrong, the number of photons represented in each display sample does change when enlarging.

When enlarging it's easy to keep the original samples and interpolate new ones in-between, so it's not necessarily the case that the number of photons changes. The interpolated samples add neither signal nor noise if the interpolation is done correctly.
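You can even check that an integer-factor sinc enlargement keeps the original samples untouched and only fills in new ones between them (same kind of toy patch as in the sketch above):

import numpy as np
from scipy.signal import resample

rng = np.random.default_rng(0)
patch = rng.standard_normal((128, 128))
big = resample(resample(patch, 256, axis=0), 256, axis=1)

# Every second sample of the 2x enlargement is an original sample.
print(np.allclose(big[::2, ::2], patch))  # True, to numerical precision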

Viewing two digital images with different pixel counts at the same size inevitably involves re-sampling them differently. Re-sampling happens inevitably in your workflow.

Right. But you have a choice of enlarging the smaller image, where the noise doesn't change, vs. reducing the larger image, which decreases noise. Completely different results.
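The reduction side is just as easy to show. In this toy sketch a 2x reduction by block-averaging pools four samples per output pixel, so the noise SD drops by sqrt(4) = 2:

import numpy as np

rng = np.random.default_rng(2)
img = 100.0 + 8.0 * rng.standard_normal((512, 512))  # SD-8 noise

# Reduce 2x by averaging each 2x2 block into one output pixel.
small = img.reshape(256, 2, 256, 2).mean(axis=(1, 3))

print(f"before: SD {img.std():.2f}")    # ~8.0
print(f"after:  SD {small.std():.2f}")  # ~4.0, averaging halves the noise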

Actually, all that matters is whether the camera pixellation is coarse enough that it affects the visible resolution in the viewed image. In that case the 'upsampled' image is observed under a smaller bandwidth (low-pass filtered) and the noise (and detail) will be less. I don't think anyone is discussing comparing the full sensor and crop such that the crop's output sample rate is so low that it affects the visible resolution. If so, no-one would be arguing about the noise; they would be berating the crop image for being so blurred.

It's not necessary to low-pass filter the crop when up-sampling; it fits within the wider bandwidth, so you do not have to remove any data from the crop.
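One way to convince yourself that nothing is thrown away: sinc-enlarge and then sinc-reduce, and the crop comes back unchanged (to numerical precision), which could not happen if enlarging had low-pass filtered any data out. A one-dimensional sketch:

import numpy as np
from scipy.signal import resample

rng = np.random.default_rng(3)
row = rng.standard_normal(256)   # one row of a hypothetical crop

up = resample(row, 512)          # 2x enlarge: the spectrum is zero-padded
down = resample(up, 256)         # 2x reduce: the (empty) upper band is dropped

print(np.allclose(down, row))    # True: no data was removed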

Well, your scenario says "enlarge", so how does that happen without upsampling?

'Enlarge' is a matter of spatial dimension. It means taking an image that was physically 24x36mm or 13x17mm and 'enlarging' it so that it is 240x360mm. The amount of resampling necessary depends entirely on the actual sample rates of the capture and output devices, and might be up or down. In terms of observed bandwidth, all that matters is whether the sampling rate is high enough to exceed the effective 'sampling rate' of the eye.
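To put illustrative numbers on that (the 300 ppi printer and the pixel counts below are assumptions, not figures from this thread):

MM_PER_INCH = 25.4
PRINT_MM = (240, 360)
PRINTER_PPI = 300

# Output samples the print needs along each dimension.
needed = tuple(round(mm / MM_PER_INCH * PRINTER_PPI) for mm in PRINT_MM)
print("output samples needed:", needed)  # (2835, 4252)

# Whether a capture is up- or down-sampled depends only on its pixel count.
for name, pixels in (("36 MP full sensor", (4912, 7360)),
                     ("2x crop of it", (2456, 3680))):
    factor = needed[0] / pixels[0]
    print(f"{name}: resample {factor:.2f}x ({'up' if factor > 1 else 'down'})")

So the same 240x360mm print downsamples the full capture (about 0.58x) but upsamples the crop (about 1.15x).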

(Cameras don't "pixelate" - that is an artifact of the way one chooses to display an image.)

Explain to me how cameras don't divide a continuous image into a number of square discrete samples.

Pixellate usually refers to representing a pixel as a square area, on the display side of things. In cameras, it's probably better to say "sample", since the photons falling on the square/rectangular/L-shaped pixel are integrated to a shapeless, dimensionless point.

OK. Test for yourself. Fit a zoom lens. Set it to double the minimum FL. Take a photo. Zoom to the minimum FL. Take a crop with the same framing as the original. Compare them at the same size on any output medium there is. So long as the viewing conditions are sufficient to show noise, there will be more on the crop.

Depends on how you make them the same size. If you enlarge, as you originally proposed, the photon-noise SNR doesn't change.

Wrong, try it.

If you were talking about some perceptual effect of enlarging that, in some circumstances, makes the noise in an image more visible without actually increasing the physically measurable noise, I might buy it.
