Studio Scene at 4:3 downsampled to 4K, Z8 and GFX 100

Hi Jim

If one downloads the RAW files of each of those from DPreview's tool and views them, without zooming, by flicking from one to the other, then two things become (very?) apparent:

1. The quality of each image is significantly higher than what has been supplied here (which to me doesn't look any better than an iPhone pic).
You seem to have missed the part about downsizing to 4K. This is a thread about differences after that downsampling.
People looking at their GFX 100 images on screen may not have downsampled the image to 4K beforehand. They compare images of different resolutions as shown on a 4K monitor.

The question is whether downsampling the images to 4K emulates what people see on the monitor when they are not manually downsampled.
There is an actual screen shot in this thread.
2. The GFX one is significantly sharper; it brings out details that are not just mushy on the Z8 but, in certain cases (like the old black-and-white photos), simply not there.

So, sure, nobody can argue that your test doesn't produce any significant difference.
As I said elsewhere, there are people who claim that there are great differences between a 100 MP MF sensor and a high-MP FF sensor when the images are shown in their entirety on a 4K screen.
 
If you view the entire image on a 4K screen, it needs to be downsampled. If you don't downsample, you're looking at a crop. This is not a thread about looking at crops.
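The fit-to-screen arithmetic behind this point can be sketched in a few lines. The pixel dimensions below (11648 x 8736 for the GFX 100, 3840 x 2160 for a UHD 4K monitor) are my own assumptions, not figures stated in this thread:

```python
# Fitting an assumed GFX 100 frame (11648 x 8736) on a UHD 4K monitor
# (3840 x 2160): viewing the whole image forces a large downsample.
img_w, img_h = 11648, 8736
scr_w, scr_h = 3840, 2160

scale = min(scr_w / img_w, scr_h / img_h)   # fit the whole frame on screen
out_w, out_h = round(img_w * scale), round(img_h * scale)

print(f"{out_w} x {out_h} ({out_w * out_h / 1e6:.1f} MP shown)")  # 2880 x 2160 (6.2 MP shown)
print(f"linear downsampling factor: {1 / scale:.1f}x")            # 4.0x
```

So, under these assumptions, a 4K monitor shows roughly a 6 MP rendering of the 102 MP file, about a 4x linear reduction.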
LR must do a pretty good job of downsampling. My GFX raw files look spectacular on my 32-inch 6K monitor (so much better than anything I've ever seen) at full-screen fit and at full res (which is of course partial).

Pretty exciting and amazing monitor tech going on now. I want more. 8K please.

Jim, I'm not arguing with your downsampling math or crop lessons (on other threads), denying science, or declaring the Earth is flat.

You were the one who taught me to keep the LR JPEG exports full size so that the JPEG downsampling algorithms on the various platforms could do their work, while still retaining the ability for people to download the full file and view it in ways of their choosing.

But I still see what I see, and there is a visual difference between FF and GFX. But both are amazing, so pick and choose between them, like we both do when we shoot our GFX 100 MP and Leica 60 MP FF cameras.
 
You are up early Jim.

I'm jet lagged from 3 months in Europe and have been waking up at 4 AM at my daughter's house in Chapel Hill NC. She had a 9 pounder 2 days ago. Grandkid number 7.
 
Jim, with respect Sir, is this yet another demonstration or editorial you have come up with that is designed to soothe the irritation you feel when I state this irrefutable fact...

I can see the image fidelity difference between APSC, FF, High Res FF and GFX Medium Format on my 32-inch pro 4K and 6K monitors at both 1:1 and full screen.

I will go further. I can see it on my 15-inch Dell 4K touch laptop monitor. And no one can prove that I can't. Not even you, who I believe can accomplish miracles.

Sensor size matters, as do MPs and great glass. That I know you can agree with. Our only difference is you think I can't see it on my monitors, and I know I can.

Is this test an attempt to counteract me stating this fact of what I can clearly see, or is it unrelated?

There is no test, demonstration or editorial that you can devise or write that can make me (and so many others) unsee what we clearly see, and what I have seen almost every day for the past 5 years in LR on my monitors with GFX, high-res FF and APSC files in my workflow and viewing.

If this demonstration is unrelated to my statement here and your irritation with me here on our Medium Format Forum, then I stand corrected and apologize for bothering you. I will make it up to you in some other way. My affection for you is almost limitless and I have your back. Make a command, and I will obey.

But my statements about GFX image fidelity, CoC and DOF have spawned a large number of Kasson threads, tests, demonstrations, proclamations, protestations, charts, editorials and even actual articles while ginning up a lot of attention out there with the Great Unwashed Masses of DPR (and lurkers from beyond) who mostly know far less about all of this than you and me.

And Fuji itself actually enjoys our friendly arguments!

You are welcome, Jim.... It is my honor to be your inspiration in this matter.

Bonus - I am about to post my final trip shots from Corfu, and they are OK, but not as good as the ones from Napoli, Rome, Genoa, Lake Como, Venice, Bologna, Verona, Padova, Ferrara, Palermo, Siracusa, Ravenna, Modena and various vineyards and coastal areas.

But I am going to post them anyway for the enjoyment of the Board. It gives them something to "coach" me on.
Confirmation bias? Placebo effect?
 
Confirmation bias? Placebo effect?
That along with the denial of science and apparent inability to do math by my two eyeballs.
 
Developed in Lr with default settings except for WB.

Night and day? I think not. There are some differences.
When you downsample a 102 MP or 41 MP image to less than 6 MP, I tend to suspect that the downsampling algorithm may be the most important contributor to the results, at least in terms of resolution, sharpness, acutance, chromatic aberrations, and noise. What are your thoughts on this?
Are there significant differences between downsampling settings and algorithms? What kind of downsampling has Jim used?
Yes, there are significant differences among downsampling settings and algorithms, and the interaction of those with the properties of the source image and the degree of downsampling may well affect the qualities of the results.
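How much the algorithm alone can matter is easy to show with a toy one-dimensional "image" (my own illustration; the numbers are made up, not from Jim's test). Downsampling a pixel-level checkerboard 8x by naive decimation versus box averaging gives opposite results:

```python
# A 64-sample "image" alternating 1, 0, 1, 0 ...: detail at the Nyquist limit.
signal = [1.0 if i % 2 == 0 else 0.0 for i in range(64)]
factor = 8

# Decimation (nearest neighbour): keep every 8th sample. Every kept index
# is even, so the output is all 1.0 -- pure aliasing, "detail" that the
# downsampled image should not contain.
nearest = [signal[i * factor] for i in range(len(signal) // factor)]

# Box average: mean of each 8-sample window. The unresolvable detail is
# correctly averaged away to mid-grey.
box = [sum(signal[i * factor:(i + 1) * factor]) / factor
       for i in range(len(signal) // factor)]

print(nearest)  # [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
print(box)      # [0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5]
```

Real resamplers (Lanczos, bicubic, etc.) sit between these extremes, trading aliasing against softness, which is why filter choice and settings show up in the result.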
Can you point me to tests/analysis of various downsampling algorithms used in current post-processors?
https://blog.kasson.com/?s=downsampling
Do you use the terms downsampling, resize and resampling interchangeably (e.g. in this post: https://blog.kasson.com/the-last-wo...viewing-natural-images-qimage-downsampling/)? If not, what is done in each case? Thx.
 
Developed in Lr with default settings except for WB.

Night and day? I think not. There are some differences.
When you downsample a 102 MP or 41 MP image to less than 6 MP, I tend to suspect that the downsampling algorithm may be the most important contributor to the results, at least in terms of resolution, sharpness, acutance, chromatic aberrations, and noise. What are your thoughts on this?
Are there significant differences between downsampling settings and algorithms? What kind of downsampling has Jim used?
Yes, there are significant differences among downsampling settings and algorithms, and the interaction of those with the properties of the source image and the degree of downsampling may well affect the qualities of the results.
Can you point me to tests/analysis of various downsampling algorithms used in current post-processors?
https://blog.kasson.com/?s=downsampling
Do you use the terms downsampling, resize and resampling interchangeably (e.g. in this post: https://blog.kasson.com/the-last-wo...k-viewing-natural-images-qimage-downsampling/)? If not, what is done in each case? Thx.
In proper context, resize and resampling mean the same thing. Downsampling is a special case of resampling, in which the output has fewer pixels than the input.

Clear?

Jim
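Jim's distinction can be sketched with a toy nearest-neighbour resampler (my own illustration, not what Lightroom or any real resizer uses): one routine handles both directions, and "downsampling" is simply the call where the output pixel count is smaller than the input's.

```python
def resample(pixels, out_n):
    """Toy nearest-neighbour resampler: map each output index to an input index."""
    in_n = len(pixels)
    return [pixels[min(in_n - 1, i * in_n // out_n)] for i in range(out_n)]

row = [10, 20, 30, 40, 50, 60, 70, 80]

down = resample(row, 4)   # downsampling: fewer output pixels than input
up   = resample(row, 12)  # upsampling: more output pixels than input

print(down)  # [10, 30, 50, 70]
print(up)    # [10, 10, 20, 30, 30, 40, 50, 50, 60, 70, 70, 80]
```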
 
Confirmation bias? Placebo effect?
That along with the denial of science and apparent inability to do math by my two eyeballs.
I remember designing and building a preamp, swapping it into my system to replace a store-bought preamp, and being amazed by how much better it sounded. After using it for a few months, I started swapping it back and forth with the old preamp, and slowly came to the realization that the differences were fairly small.

Jim
 
Developed in Lr with default settings except for WB.

Night and day? I think not. There are some differences.
When you downsample a 102 MP or 41 MP image to less than 6 MP, I tend to suspect that the downsampling algorithm may be the most important contributor to the results, at least in terms of resolution, sharpness, acutance, chromatic aberrations, and noise. What are your thoughts on this?
Are there significant differences between downsampling settings and algorithms? What kind of downsampling has Jim used?
Yes, there are significant differences among downsampling settings and algorithms, and the interaction of those with the properties of the source image and the degree of downsampling may well affect the qualities of the results.
Can you point me to tests/analysis of various downsampling algorithms used in current post-processors?
https://blog.kasson.com/?s=downsampling
Do you use the terms downsampling, resize and resampling interchangeably (e.g. in this post: https://blog.kasson.com/the-last-wo...k-viewing-natural-images-qimage-downsampling/)? If not, what is done in each case? Thx.
In proper context, resize and resampling mean the same thing. Downsampling is a special case of resampling, in which the output has fewer pixels than the input.

Clear?

Jim
Thx, like this, possibly simplified?

Resize: same number of pixels, but different size in length and height @100%

Resampling: different number of pixels, either higher or lower than original pixel count

Downsampling: lower pixel count than the original.

Resizing used for printing; resampling (downsampling or upsampling) used for screen viewing. Downsampling done at export to avoid a lower-quality adaptation to full-screen viewing?
 
Developed in Lr with default settings except for WB.

Night and day? I think not. There are some differences.
When you downsample a 102 MP or 41 MP image to less than 6 MP, I tend to suspect that the downsampling algorithm may be the most important contributor to the results, at least in terms of resolution, sharpness, acutance, chromatic aberrations, and noise. What are your thoughts on this?
Are there significant differences between downsampling settings and algorithms? What kind of downsampling has Jim used?
Yes, there are significant differences among downsampling settings and algorithms, and the interaction of those with the properties of the source image and the degree of downsampling may well affect the qualities of the results.
Can you point me to tests/analysis of various downsampling algorithms used in current post-processors?
https://blog.kasson.com/?s=downsampling
Do you use the terms downsampling, resize and resampling interchangeably (e.g. in this post: https://blog.kasson.com/the-last-wo...k-viewing-natural-images-qimage-downsampling/)? If not, what is done in each case? Thx.
In proper context, resize and resampling mean the same thing. Downsampling is a special case of resampling, in which the output has fewer pixels than the input.

Clear?

Jim
Thx, like this, possibly simplified?

Resize: same number of pixels, but different size in length and height @100%
That is a legitimate definition, but it doesn't correspond to how most people use the term. For anything but PDFs and printing, images of the same aspect ratio and pixel count are virtually the same size.

Resampling: different number of pixels, either higher or lower than original pixel count

Downsampling: lower pixel count than the original.

Resizing used for printing; resampling (downsampling or upsampling) used for screen viewing. Downsampling done at export to avoid a lower-quality adaptation to full-screen viewing?
 

Developed in Lr with default settings except for WB.

Night and day? I think not. There are some differences.
Thank you for the comparison, Jim.
Forgive me, but I want to ask: I feel like when we downsample GFX files to match a full-frame camera, we are trying to show how much they are all the same. Why don't we upscale the Z8 photo and compare it with the GFX to show the difference a larger sensor + higher-resolution camera makes at its native resolution compared to a full-frame camera?

I always think this when I see a comparison of GFX with a FF camera. Why downgrade the high-res image? Why not upscale the FF image and show how MF is superior?

--
IG: https://www.instagram.com/manzurfahim/
website: https://www.manzurfahim.com
 

Developed in Lr with default settings except for WB.

Night and day? I think not. There are some differences.
Thank you for the comparison, Jim.
Forgive me, but I want to ask: I feel like when we downsample GFX files to match a full-frame camera, we are trying to show how much they are all the same. Why don't we upscale the Z8 photo and compare it with the GFX to show the difference a larger sensor + higher-resolution camera makes at its native resolution compared to a full-frame camera?
That would make almost perfect sense if we were talking about printing. What I do is a twist on that. I resample both images to a resolution higher than either.

But in this case, we're talking about viewing the full image on a monitor.
I always think this when I see a comparison of GFX with a FF camera. Why downgrade the high-res image? Why not upscale the FF image and show how MF is superior?
 
Developed in Lr with default settings except for WB.

Night and day? I think not. There are some differences.
When you downsample a 102 MP or 41 MP image to less than 6 MP, I tend to suspect that the downsampling algorithm may be the most important contributor to the results, at least in terms of resolution, sharpness, acutance, chromatic aberrations, and noise. What are your thoughts on this?
Are there significant differences between downsampling settings and algorithms? What kind of downsampling has Jim used?
Yes, there are significant differences among downsampling settings and algorithms, and the interaction of those with the properties of the source image and the degree of downsampling may well affect the qualities of the results.
Can you point me to tests/analysis of various downsampling algorithms used in current post-processors?
https://blog.kasson.com/?s=downsampling
Do you use the terms downsampling, resize and resampling interchangeably (e.g. in this post: https://blog.kasson.com/the-last-wo...k-viewing-natural-images-qimage-downsampling/)? If not, what is done in each case? Thx.
In proper context, resize and resampling mean the same thing. Downsampling is a special case of resampling, in which the output has fewer pixels than the input.

Clear?

Jim
Thx, like this, possibly simplified?

Resize: same number of pixels, but different size in length and height @100%
That is a legitimate definition, but it doesn't correspond to how most people use the term. For anything but PDFs and printing, images of the same aspect ratio and pixel count are virtually the same size.

https://www.mathworks.com/help/matlab/ref/imresize.html
Resampling: different number of pixels, either higher or lower than original pixel count

Downsampling: lower pixel count than the original.

Resizing used for printing; resampling (downsampling or upsampling) used for screen viewing. Downsampling done at export to avoid a lower-quality adaptation to full-screen viewing?
Thx Jim, so my definitions of resampling / downsampling / upsampling are OK? But I am still not clear on the resize definition: what is the goal and result of resizing? The link you provided is not helping me, sorry.
 

Developed in Lr with default settings except for WB.

Night and day? I think not. There are some differences.
Thank you for the comparison, Jim.
Forgive me, but I want to ask: I feel like when we downsample GFX files to match a full-frame camera, we are trying to show how much they are all the same. Why don't we upscale the Z8 photo and compare it with the GFX to show the difference a larger sensor + higher-resolution camera makes at its native resolution compared to a full-frame camera?

I always think this when I see a comparison of GFX with a FF camera. Why downgrade the high-res image? Why not upscale the FF image and show how MF is superior?

It’s really hard to compare two different formats in a way everyone agrees with. I believe the best way is to print both to the same size. Then you can argue at what size the difference becomes apparent.

--
... Mike, formerly known as Rod. :)
... https://www.flickr.com/photos/198581502@N02/
 
Developed in Lr with default settings except for WB.

Night and day? I think not. There are some differences.
When you downsample a 102 MP or 41 MP image to less than 6 MP, I tend to suspect that the downsampling algorithm may be the most important contributor to the results, at least in terms of resolution, sharpness, acutance, chromatic aberrations, and noise. What are your thoughts on this?
Are there significant differences between downsampling settings and algorithms? What kind of downsampling has Jim used?
Yes, there are significant differences among downsampling settings and algorithms, and the interaction of those with the properties of the source image and the degree of downsampling may well affect the qualities of the results.
Can you point me to tests/analysis of various downsampling algorithms used in current post-processors?
https://blog.kasson.com/?s=downsampling
Do you use the terms downsampling, resize and resampling interchangeably (e.g. in this post: https://blog.kasson.com/the-last-wo...k-viewing-natural-images-qimage-downsampling/)? If not, what is done in each case? Thx.
In proper context, resize and resampling mean the same thing. Downsampling is a special case of resampling, in which the output has fewer pixels than the input.

Clear?

Jim
Thx, like this, possibly simplified?

Resize: same number of pixels, but different size in length and height @100%
That is a legitimate definition, but it doesn't correspond to how most people use the term. For anything but PDFs and printing, images of the same aspect ratio and pixel count are virtually the same size.

https://www.mathworks.com/help/matlab/ref/imresize.html
Resampling: different number of pixels, either higher or lower than original pixel count

Downsampling: lower pixel count than the original.

Resizing used for printing; resampling (downsampling or upsampling) used for screen viewing. Downsampling done at export to avoid a lower-quality adaptation to full-screen viewing?
Thx Jim, so my definitions of resampling / downsampling / upsampling are OK?
Yes.
But I am still not clear on the resize definition / what is the goal and result of resizing?
I use resizing and resampling interchangeably. What you're calling resizing is simply a change to the metadata.
The link you provided is not helping me, sorry.
It describes how Matlab uses the term resize.
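The "metadata only" point can be made concrete with some illustrative numbers (mine, not from the thread): changing the DPI tag changes the implied print size while leaving every pixel, and the pixel count, untouched.

```python
# An assumed 24 MP image: the pixels never change, only the DPI metadata,
# so only the implied print size is different.
pixels_wide, pixels_high = 6000, 4000

for dpi in (300, 150):
    width_in = pixels_wide / dpi
    height_in = pixels_high / dpi
    print(f"{dpi} dpi -> {width_in:.1f} x {height_in:.1f} inches, "
          f"still {pixels_wide * pixels_high / 1e6:.0f} MP")
# 300 dpi -> 20.0 x 13.3 inches, still 24 MP
# 150 dpi -> 40.0 x 26.7 inches, still 24 MP
```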
 

Developed in Lr with default settings except for WB.

Night and day? I think not. There are some differences.
Thank you for the comparison, Jim.
Forgive me, but I want to ask: I feel like when we downsample GFX files to match a full-frame camera, we are trying to show how much they are all the same. Why don't we upscale the Z8 photo and compare it with the GFX to show the difference a larger sensor + higher-resolution camera makes at its native resolution compared to a full-frame camera?

I always think this when I see a comparison of GFX with a FF camera. Why downgrade the high-res image? Why not upscale the FF image and show how MF is superior?
It’s really hard to compare two different formats in a way everyone agrees with. I believe the best way is to print both to the same size. Then you can argue at what size the difference becomes apparent.
The assertion that inspired this post wasn't about printing.

 

Developed in Lr with default settings except for WB.

Night and day? I think not. There are some differences.
Thank you for the comparison, Jim.
Forgive me, but I want to ask: I feel like when we downsample GFX files to match a full-frame camera, we are trying to show how much they are all the same. Why don't we upscale the Z8 photo and compare it with the GFX to show the difference a larger sensor + higher-resolution camera makes at its native resolution compared to a full-frame camera?

I always think this when I see a comparison of GFX with a FF camera. Why downgrade the high-res image? Why not upscale the FF image and show how MF is superior?
It’s really hard to compare two different formats in a way everyone agrees with. I believe the best way is to print both to the same size. Then you can argue at what size the difference becomes apparent.
The assertion that inspired this post wasn't about printing.

--
https://blog.kasson.com
The difference between on-screen viewing and printing is that you can actually print large enough to see a difference. I don’t know of any screen big enough, with enough resolution, to do that, other than by zooming in.

--
... Mike, formerly known as Rod. :)
... https://www.flickr.com/photos/198581502@N02/
 
That's a JPEG file. Compression on top of compression, in one case.
Sure. But this file looks 99% the same as the raw, on the left, and on my monitor.
Here's what I see in Ps, as a PNG.

e78d69aa4684430a86d793dae879994c.jpg.png

Oh, rats! DPR turned it into a JPEG.

At least both images have the same compression.

There is no question that there is less Bayer color aliasing in the 100 MP image.
Thanks. So I see three things now:

1. The top image is the better one, is that the GFX?
No, the top image is from the Z8. That's why there's more color aliasing.
2. Both images are way better than what I saw (my image to the right)

3. My RAW GFX looks very similar indeed to what you supplied and what you see.
Jim, I looked at the studio pics today on my 14-inch laptop and I can't see a difference, unlike when I looked at them yesterday on my 42-inch monitor. I think that is an important parameter to keep in mind when we're saying there is or there is not a difference; the same goes if someone like Greg is viewing things on a 6K vs. a 4K monitor. Similarly, I also think it's important to state the dimensions your supplied images were downscaled to: are those dimensions enough to cover the area of a 14-inch or a 42-inch monitor?

--
Apollon
http://www.flickr.com/photos/apollonas/
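Apollon's monitor-size point can be quantified as pixel density. The diagonal sizes are the ones he mentions; treating both panels as 3840 x 2160 is my assumption:

```python
import math

# Pixels per inch of a 3840 x 2160 panel at two diagonal sizes.
def ppi(diag_inches, w_px=3840, h_px=2160):
    return math.hypot(w_px, h_px) / diag_inches

print(f"42-inch 4K: {ppi(42):.0f} ppi")  # ~105 ppi
print(f"14-inch 4K: {ppi(14):.0f} ppi")  # ~315 ppi
```

The same downsampled pixels are rendered three times larger on the 42-inch panel, which plausibly explains why differences visible there vanish on the laptop at a similar viewing distance.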
 

Developed in Lr with default settings except for WB.

Night and day? I think not. There are some differences.
Thank you for the comparison, Jim.
Forgive me, but I want to ask: I feel like when we downsample GFX files to match a full-frame camera, we are trying to show how much they are all the same. Why don't we upscale the Z8 photo and compare it with the GFX to show the difference a larger sensor + higher-resolution camera makes at its native resolution compared to a full-frame camera?

I always think this when I see a comparison of GFX with a FF camera. Why downgrade the high-res image? Why not upscale the FF image and show how MF is superior?
It’s really hard to compare two different formats in a way everyone agrees with. I believe the best way is to print both to the same size. Then you can argue at what size the difference becomes apparent.
The assertion that inspired this post wasn't about printing.
The difference between on-screen viewing and printing is that you can actually print large enough to see a difference. I don’t know of any screen big enough, with enough resolution, to do that, other than by zooming in.
Which is the point of this thread. It's not that you can't see a difference on a 4K screen when you're viewing the full image. It's that the difference is subtle, and not the night and day delta that some have claimed.

 
