A7iii vs A7Riii

Started 3 months ago | Discussions
SafariBob Contributing Member • Posts: 843
Re: A7iii vs A7Riii

tqlla wrote:

SafariBob wrote:

tqlla wrote:

SafariBob wrote:

tqlla wrote:

SafariBob wrote:

andrewD2 wrote:

SafariBob wrote:

andrewD2 wrote:

That's not really true. The pixels in a screen have a sub-pixel for each color; the camera does not. Hence a 4K display, which is 8mpx, is equivalent to a 32mpx photo. Typically, to generate one high-quality pixel you need 4; this is why high-quality 1080p is supersampled from 4K. This basically implies that a super-high-quality 4K image needs up to 144mpx.

A 4k wide image at 3:2 isn't 8MP.

I am talking about a 4K image, not “4K wide”

The calculation you are making is the required camera resolution. The camera resolution is cropped for a 16:9 screen. You need to take that into account.

as you say, that’s bizarre. Let’s do the math.

3:2, 4K wide = 4096 * 4096 * 2/3 = 11.1mpx full colour = 44 camera mpx.

so in order to fill a 4K display with information at each sub pixel you need 44mpx

16:9, 4K wide = 4096 * 4096 * 9/16 = 9.4mpx full colour= 37.6 mpx camera

Used British spelling if that helps cognition for ya

Your math is just way off here.

4K at 3x2 = 3840x2560 = 9.8304MP
4K at 16x9 = 3840x2160 = 8.2944MP
8K on a 3x2 sensor would require 7680x5120 = 39.3216MP
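
For anyone who wants to double-check these figures, here is a quick Python sanity check (the labels are just mine, describing width and aspect ratio):

def megapixels(width, aspect_w, aspect_h):
    # Height follows from the width and the aspect ratio; area in megapixels.
    height = round(width * aspect_h / aspect_w)
    return width, height, width * height / 1e6

for label, w, aw, ah in [
    ("3840 wide, 16:9 (UHD 4K)", 3840, 16, 9),
    ("3840 wide, 3:2",           3840, 3, 2),
    ("4096 wide, 16:9",          4096, 16, 9),
    ("7680 wide, 3:2 (8K wide)", 7680, 3, 2),
]:
    width, height, mp = megapixels(w, aw, ah)
    print(f"{label}: {width}x{height} = {mp:.2f} MP")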

I don't know that much about color detail on a per-pixel level.

read the other post. That is why this is not the correct interpretation. There are different kinds of 4K: what you are talking about is Ultra HD (3840 wide); 4K is 4096. But it's the same for practical purposes.

But my assumption is that each pixel on a camera sensor is more than just a Red or Green or Blue.

it ain’t.

Is that something you learned from Ken Rockwell?

lol, touché

Open a picture with your photo editor. Using the editor's magnifier, zoom into a single pixel in a white portion of the image.

If what you believe is true, that pixel would be red, green or blue.

It's not, because it is interpolated. Just like Clear Image Zoom.

edit: this is just marketing from the camera companies, just like hard drive makers count in powers of ten while OSes count in powers of two, to make the drives sound bigger. Camera makers exaggerate by 300% though, not 2-8%.
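
To put the hard-drive analogy in numbers, here is a quick check of the decimal vs binary gap that the 2-8% figure refers to (it creeps up to about 9% for terabytes):

# Drive makers count in powers of ten, operating systems in powers of two.
for prefix, power in [("KB", 1), ("MB", 2), ("GB", 3), ("TB", 4)]:
    decimal = 1000 ** power
    binary = 1024 ** power
    gap = 100 * (1 - decimal / binary)
    print(f"{prefix}: {decimal:,} vs {binary:,} bytes -> {gap:.1f}% gap")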

Okay, I see what you are saying. But still a 42MP sensor has 42MP of resolution regardless of how the color definition of each pixel is created.

Yes and no: it has 21mp of green data and 10.5mp each of red and blue... and while it does have 42mpx of spatial resolution, that trade-off mattered more when pixels were expensive. Because the human eye is much more sensitive to luminance than chrominance, it was a reasonable trade-off. JPEG, MPEG etc. use the same assumption. But the Bayer interpolation still makes people prefer Foveon over Bayer, or film over digital. So to make that go away, you need some excess resolution.
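
To put numbers on that split, assuming the usual RGGB layout (half the photosites green, a quarter each red and blue):

def bayer_split(sensor_mp):
    # Raw colour samples on an RGGB Bayer sensor.
    return {"green": sensor_mp / 2, "red": sensor_mp / 4, "blue": sensor_mp / 4}

print(bayer_split(42))   # {'green': 21.0, 'red': 10.5, 'blue': 10.5}
print(bayer_split(24))   # {'green': 12.0, 'red': 6.0, 'blue': 6.0}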

 SafariBob's gear list:
Sony a7R II Sony 70-400mm F4-5.6 G SSM Sony FE 35mm F2.8 Sony Vario-Tessar T* FE 16-35mm F4 ZA OSS
kolyy Senior Member • Posts: 1,109
Re: A7iii vs A7Riii

SafariBob wrote:

andrewD2 wrote:

in a camera, each “pixel” contains just one color. Hence the pixel in a camera is comparable to a sub pixel in a display. For each full color site, there are 2 green, one red and one blue in a camera.

thus an 8k display contains 32 “camera” pixels, which are actually 24 rgb sub pixels.

in order to make the effect of pixels, interpolation, etc negligible, you need to oversample.

No, I'm not the one confused here.
Yes, the Bayer array is there on the camera sensor meaning you get RGGB per 4 pixels.
So ok, if you want to bin each 4 RGGB pixels to a superpixel you get ONE factor of 4 in your calculation. Bayer interpolation is better than that, but OK, let's run with the x4 factor for the SENSOR array.
Your OTHER subpixel screen factor is bogus. However many subpixels the screen uses to be able to show ONE image pixel, we group those subpixels together in the same way and call each, well, we just call it "a pixel".

By your calculations you'd need a 683MP camera for an 8K screen. Your calculation is off by a factor that doesn't exist.
Andrew
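
(For reference, here is the 2x2 RGGB binning Andrew describes, as a minimal numpy sketch. It is an illustration only; real raw converters interpolate rather than bin.)

import numpy as np

def bin_rggb(raw):
    # Collapse an RGGB Bayer mosaic (R G / G B repeating) into half-resolution
    # full-colour "superpixels": one R, one B and the average of the two greens.
    r  = raw[0::2, 0::2]
    g1 = raw[0::2, 1::2]
    g2 = raw[1::2, 0::2]
    b  = raw[1::2, 1::2]
    return np.dstack([r, (g1 + g2) / 2.0, b])   # shape (H/2, W/2, 3)

mosaic = np.random.rand(8, 8)         # stand-in for raw sensor data
print(bin_rggb(mosaic).shape)         # (4, 4, 3): a quarter of the pixel count, full colour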

You need to oversample. The cameras correct for optical aberrations etc.

4x (which effectively is 1x) is probably fine, but what I am saying is that 16x (which is effectively 4x) probably is sufficient to outresolve any pixel issues in all but the most extreme cases.

Why do you think film was mastered in 4K before being transferred to 1080p Blu-ray? Why does 70mm exist? Why do film studios shoot in 6K or beyond when cinemas are 4K? And that is moving images, where resolution is much less discernible.

those are rhetorical questions. No need to answer. And forgive me if I don’t. Read my original post. Nothing there is wrong or controversial.

Edit: when I bought my first DSLR, it was 6mp, and people were making the same arguments back then. My second was 12mp. It's blatantly obvious with today's equipment which is which. And that's part of it too: you keep your images for life, presumably, and it's a bit sad when the resolution just isn't there. Not always. Sometimes a less resolved picture has more ambiance. Photographers frequently add grain in post. But storage is so cheap these days; do your efforts justice and capture what you can.

I wonder if you have ever tried what you argue for in real life. I crop images down to 8Mpx quite regularly and it is very hard to pick out an 8Mpx image among 24Mpx ones when viewing on a UHD screen. Bayer interpolation works very well indeed. Granted, if you magnify the image way beyond normal viewing, you can find artifacts compared to an oversampled image.

As for video, there is an advantage in oversampling, but it is much smaller than you think. Try to compare an oversampled 4K video from the A7III with the one without oversampling from an APS-C crop (choose "4K APS-C" from the drop down list for the A7 III on the right):

https://www.dpreview.com/reviews/image-comparison/fullscreen?attr29_0=sony_a7iii&attr29_1=sony_a7iii&attr29_2=sony_a7riii&attr29_3=sony_a7sii&attr72_0=4k&attr72_1=4k-apsc&attr72_2=4k&attr72_3=4k&normalization=full&widget=602&x=-0.1486276679678514&y=-0.5674316472873174

 kolyy's gear list:
Canon G9 X II Panasonic Lumix DMC-GM5 Panasonic Lumix DMC-GX85 Sony a7 III
SafariBob Contributing Member • Posts: 843
Re: A7iii vs A7Riii

kolyy wrote:

I wonder if you have ever tried what you argue for in real life

all the time. As an example, in my version of Lightroom, when you edit, photos are line-skipped; when you bake a JPEG, they are oversampled. Huge difference.

. I crop images down to 8Mpx quite regularly and it is very hard to pick out an 8Mpx image among 24Mpx ones,

I do that all the time also. I shoot a lot of wildlife, so there is tons of cropping. If you feel that way, why do you have full frame? You could do well with apsc or even 1 inch.

when viewing on a UHD screen

I do have a uhd screen, but mostly use a laptop with quarter the resolution, very visible there.

. Bayer interpolation works very well indeed. Granted, if you magnify the image way beyond normal viewing, you can find artifacts, compared to an oversampled image.

it's extremely apparent. How old are you? You may need decent eyesight to see it, but it should be clearly visible to anyone 40 or below, or to anyone older with decent eyesight.

As for video, there is an advantage in oversampling, but it is much smaller than you think. Try to compare an oversampled 4K video from the A7III with the one without oversampling from an APS-C crop (choose "4K APS-C" from the drop down list for the A7 III on the right):

the oversampled a7iii is markedly better than the unoversampled a7sii, if there is significant detail. The a7s 1080p (oversampled) is shockingly better than the a7 stock 1080p

https://www.dpreview.com/reviews/image-comparison/fullscreen?attr29_0=sony_a7iii&attr29_1=sony_a7iii&attr29_2=sony_a7riii&attr29_3=sony_a7sii&attr72_0=4k&attr72_1=4k-apsc&attr72_2=4k&attr72_3=4k&normalization=full&widget=602&x=-0.1486276679678514&y=-0.5674316472873174

this is clearly visible in your own tool. I don’t get why people insist on misinforming others.

 SafariBob's gear list:
Sony a7R II Sony 70-400mm F4-5.6 G SSM Sony FE 35mm F2.8 Sony Vario-Tessar T* FE 16-35mm F4 ZA OSS
kolyy Senior Member • Posts: 1,109
Re: A7iii vs A7Riii

SafariBob wrote:

kolyy wrote:

I wonder if you have ever tried what you argue for in real life

all the time. As an example, in my version of Lightroom, when you edit, photos are line-skipped; when you bake a JPEG, they are oversampled. Huge difference.

This has no relevance for the discussion.

. I crop images down to 8Mpx quite regularly and it is very hard to pick out an 8Mpx image among 24Mpx ones,

I do that all the time also. I shoot a lot of wildlife, so there is tons of cropping. If you feel that way, why do you have full frame? You could do well with apsc or even 1 inch.

Your comment makes zero sense to me. Why exactly should I move to a smaller sensor and lose all the flexibility the large one provides me? You seem to think resolution is all that matters. As for me, I only need enough. And a sharp picture on a UHD screen is enough resolution for me.

when viewing on a UHD screen

I do have a uhd screen, but mostly use a laptop with quarter the resolution, very visible there.

Are you seriously claiming that you can see a difference between an 8MPx and a 24Mpx image on a 2Mpx FHD screen? I would suggest you make a blind experiment to get back to reality. I hope you understand we are talking about displaying the whole image, not magnifying it.
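
If you want to make that blind test easy to run, here is a minimal sketch using Pillow (the file name and the 24MP-to-8MP figures are placeholders):

from PIL import Image
import random

src = Image.open("original_24mp.jpg")            # placeholder file name

# Simulate an ~8MP capture: scale the linear dimensions by sqrt(8/24).
scale = (8 / 24) ** 0.5
low = src.resize((int(src.width * scale), int(src.height * scale)), Image.LANCZOS)

# Bring both versions to the same screen size, e.g. UHD or FHD.
screen = (3840, 2160)                            # try (1920, 1080) as well
a, b = src.copy(), low.copy()
a.thumbnail(screen, Image.LANCZOS)
b.thumbnail(screen, Image.LANCZOS)

# Save under shuffled labels so you do not know which is which while viewing.
images = [a, b]
random.shuffle(images)
for label, img in zip(["A", "B"], images):
    img.save(f"blind_{label}.png")               # note the true mapping elsewhere to score yourself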

. Bayer interpolation works very well indeed. Granted, if you magnify the image way beyond normal viewing, you can find artifacts, compared to an oversampled image.

its extremely apparent. How old are you? You may need decent eyesight to see it, but should be clearly visible to anyone 40 or below. Or if you have decent eyesight.

As for video, there is an advantage in oversampling, but it is much smaller than you think. Try to compare an oversampled 4K video from the A7III with the one without oversampling from an APS-C crop (choose "4K APS-C" from the drop down list for the A7 III on the right):

the oversampled a7iii is markedly better than the unoversampled a7sii, if there is significant detail. The a7s 1080p (oversampled) is shockingly better than the a7 stock 1080p

I wanted you to compare the oversampled vs "unoversampled" 4K on the A7 III, not 1080p, which is line-skipped.

https://www.dpreview.com/reviews/image-comparison/fullscreen?attr29_0=sony_a7iii&attr29_1=sony_a7iii&attr29_2=sony_a7riii&attr29_3=sony_a7sii&attr72_0=4k&attr72_1=4k-apsc&attr72_2=4k&attr72_3=4k&normalization=full&widget=602&x=-0.1486276679678514&y=-0.5674316472873174

this is clearly visible in your own tool. I don’t get why people insist on misinforming others.

The difference I am talking about is indeed visible in a magnified view, but completely negligible at normal viewing distances.

 kolyy's gear list:
Canon G9 X II Panasonic Lumix DMC-GM5 Panasonic Lumix DMC-GX85 Sony a7 III
SafariBob Contributing Member • Posts: 843
Re: A7iii vs A7Riii

kolyy wrote:

I wonder if you have ever tried what you argue for in real life

all the time. As an example, in my version of Lightroom, when you edit, photos are line-skipped; when you bake a JPEG, they are oversampled. Huge difference.

This has no relevance for the discussion.

then I am unsure whether you are discussing the same topic, that’s essentially oversampling vs not.

. I crop images down to 8Mpx quite regularly and it is very hard to pick out an 8Mpx image among 24Mpx ones,

I do that all the time also. I shoot a lot of wildlife, so there is tons of cropping. If you feel that way, why do you have full frame? You could do well with apsc or even 1 inch.

Your comment makes zero sense to me. Why exactly should I move to a smaller sensor and lose all the flexibility the large one provides me?

you are claiming an 8mpx crop is indiscernible from a 24mpx full capture, why not get an rx100?

You seem to think resolution is all that matters.

where did I make this statement?

As for me, I only need enough. And a sharp picture on a UHD screen is enough resolution to me.

when viewing on a UHD screen

I do have a uhd screen, but mostly use a laptop with quarter the resolution, very visible there.

Are you seriously claiming that you can see a difference between an 8MPx and a 24Mpx image on a 2Mpx FHD screen?

an 8mpx crop, definitely. An 8mpx image from an 8mp sensor of equivalent technology. Probably not in many cases, depending how much detail there is. No such sensor exists though, but it’s frequently not difficult to see difference between a7s and a7r2 on 2mp screen:

I would suggest you make a blind experiment to get back to reality. I hope you understand we are talking about displaying the whole image, not magnifying it.

I have done it many times to justify upgrades, despite people spreading erroneous tropes about resolution not mattering.

. Bayer interpolation works very well indeed. Granted, if you magnify the image way beyond normal viewing, you can find artifacts, compared to an oversampled image.

it's extremely apparent. How old are you? You may need decent eyesight to see it, but it should be clearly visible to anyone 40 or below, or to anyone older with decent eyesight.

please answer this question. It really does make a difference whether you are 27 or 70. If you are 70, your experience really isn’t relevant for people under 40.

As for video, there is an advantage in oversampling, but it is much smaller than you think. Try to compare an oversampled 4K video from the A7III with the one without oversampling from an APS-C crop (choose "4K APS-C" from the drop down list for the A7 III on the right):

the oversampled a7iii is markedly better than the unoversampled a7sii, if there is significant detail. The a7s 1080p (oversampled) is shockingly better than the a7 stock 1080p

I wanted you to compare the oversampled vs "unoversampled" 4K on the A7 III, not 1080p, which is line-skipped.

there is no unoversampling without cropping or line skipping

https://www.dpreview.com/reviews/image-comparison/fullscreen?attr29_0=sony_a7iii&attr29_1=sony_a7iii&attr29_2=sony_a7riii&attr29_3=sony_a7sii&attr72_0=4k&attr72_1=4k-apsc&attr72_2=4k&attr72_3=4k&normalization=full&widget=602&x=-0.1486276679678514&y=-0.5674316472873174

this is clearly visible in your own tool. I don’t get why people insist on misinforming others.

The difference I am talking about is indeed visible in a magnified view, but completely negligible at normal viewing distances.

maybe you are happy with DVD. I find a good DVD decent, but I prefer Blu-ray. A great 4K Blu-ray is even better.

 SafariBob's gear list:
Sony a7R II Sony 70-400mm F4-5.6 G SSM Sony FE 35mm F2.8 Sony Vario-Tessar T* FE 16-35mm F4 ZA OSS
NatureBX Regular Member • Posts: 236
Re: A7iii vs A7Riii

The high resolution does matter. I love zooming in on a 42MP image, even on an iPhone or iPad, and cropping if needed. If resolution doesn't matter, then why not just use an iPhone with its 12mp camera? There is really no need to argue here. If they love 24mp or feel that is enough, then so be it. Other people want 42MP, or even the 61mp of the a7riv.

 NatureBX's gear list:
Sony RX100 VII Sony a7R III Sony FE 70-200 F4 Sony FE 85mm F1.4 GM Sony FE 50mm F1.4 ZA +2 more
EarthQuake Senior Member • Posts: 2,846
Re: A7iii vs A7Riii

NatureBX wrote:

The high resolution does matter. I love zooming in on a 42MP image, even on an iPhone or iPad, and cropping if needed. If resolution doesn't matter, then why not just use an iPhone with its 12mp camera? There is really no need to argue here. If they love 24mp or feel that is enough, then so be it. Other people want 42MP, or even the 61mp of the a7riv.

Because an A7 has a full frame sensor. The iPhone doesn't. What is relevant in this comparison is the difference in sensor area, not the number of pixels. Otherwise the 41MP sensor in that Nokia phone would be on par with an A7r III.

Yes, some people enjoy pixel peeping, and others love to brag on the internet that they have the camera with the most pixels, so if that's your thing, sure, get an A7r. Or jump up to a Fuji GFX 100 - that has even more pixels, and obviously more pixels are always better, so why settle for second best? Or get that Phase One IQ4 150, go big or go home, right?
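
To put the sensor-area point in numbers (the full frame, APS-C and 1-inch dimensions are the usual nominal ones; the phone figure is a rough assumption):

# Approximate sensor dimensions in mm.
sensors = {
    "Full frame (A7 series)": (36.0, 24.0),
    "APS-C (Sony)":           (23.5, 15.6),
    '1"-type (RX100)':        (13.2, 8.8),
    "Typical phone sensor":   (6.2, 4.6),    # assumption; varies a lot by model
}

ff_area = 36.0 * 24.0
for name, (w, h) in sensors.items():
    area = w * h
    print(f"{name}: {area:.0f} mm^2 "
          f"(full frame collects ~{ff_area / area:.1f}x the light at the same f-stop and shutter speed)")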

kolyy Senior Member • Posts: 1,109
Re: A7iii vs A7Riii

SafariBob wrote:

kolyy wrote:

I wonder if you have ever tried what you argue for in real life

all the time. As an example, in my version of Lightroom, when you edit, photos are line-skipped; when you bake a JPEG, they are oversampled. Huge difference.

This has no relevance for the discussion.

then I am unsure whether you are discussing the same topic, that’s essentially oversampling vs not.

Line-skipping has a very detrimental effect on IQ. It's thus not relevant for our discussion about the difference between oversampled and native resolution images.
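
As a toy illustration of why line-skipping is so destructive, here is a small numpy sketch; it is a deliberate simplification, not what the cameras literally do:

import numpy as np

# A 24x24 "sensor" seeing one-pixel-wide vertical stripes.
sensor = np.tile(np.array([0.0, 1.0]), (24, 12))   # each row is 0,1,0,1,...

# Line skipping: read every other row and column. The stripes alias away completely.
skipped = sensor[::2, ::2]

# Oversampled readout: average 2x2 blocks. The stripes become an even grey instead.
oversampled = sensor.reshape(12, 2, 12, 2).mean(axis=(1, 3))

print(skipped[0, :6])       # [0. 0. 0. 0. 0. 0.]   detail gone entirely
print(oversampled[0, :6])   # [0.5 0.5 0.5 0.5 0.5 0.5]   rendered as flat grey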

. I crop images down to 8Mpx quite regularly and it is very hard to pick out an 8Mpx image among 24Mpx ones,

I do that all the time also. I shoot a lot of wildlife, so there is tons of cropping. If you feel that way, why do you have full frame? You could do well with apsc or even 1 inch.

Your comment makes zero sense to me. Why exactly should I move to a smaller sensor and lose all the flexibility the large one provides me?

you are claiming an 8mpx crop is indiscernible from a 24mpx full capture, why not get an rx100?

Huh? The RX100 has 20Mpx. You make no sense. The difference between a 24Mpx A7 III and a 20MPx RX100 is not in the number of pixels. It's in the amount of light you can send to these pixels.

You seem to think resolution is all that matters.

where did I make this statement?

I am under that impression as you do not seem to consider anything beyond the number of pixels. To me, a 10Mpx image produced by a sharp, large aperture lens on an APS-C sized sensor area with fantastic SNR is much preferable to a 20Mpx image from a poor lens on a tiny sensor, limited by noise or diffraction.

As for me, I only need enough. And a sharp picture on a UHD screen is enough resolution to me.

when viewing on a UHD screen

I do have a uhd screen, but mostly use a laptop with quarter the resolution, very visible there.

Are you seriously claiming that you can see a difference between an 8MPx and a 24Mpx image on a 2Mpx FHD screen?

an 8mpx crop, definitely. An 8mpx image from an 8mp sensor of equivalent technology. Probably not in many cases, depending how much detail there is. No such sensor exists though, but it’s frequently not difficult to see difference between a7s and a7r2 on 2mp screen:

I am sorry, but you are viewing images on a low resolution screen. I am not sure how I can take you seriously. I use my 65" 4K TV, 24" 4K LCD monitor, 14" 4K notebook screen and 10" 4Mpx tablet. All of these have very good pixel densities for the typical viewing distance, and images with sufficient detail look fantastic on them. An FHD screen of similar size is very noticeably soft and does not show much of the detail, nor the artifacts.

The difference between 4K and FHD is much much larger than any difference between a 24Mpx and a 8Mpx image on a 4K screen. Let that sink in.

I would suggest you make a blind experiment to get back to reality. I hope you understand we are talking about displaying the whole image, not magnifying it.

I have done it many times to justify upgrades, despite people spreading erroneous tropes about resolution not mattering.

If you have done this on an FHD screen, then I am not sure what to say, except that you are imagining things. That is a low resolution screen which smears everything.

. Bayer interpolation works very well indeed. Granted, if you magnify the image way beyond normal viewing, you can find artifacts, compared to an oversampled image.

it's extremely apparent. How old are you? You may need decent eyesight to see it, but it should be clearly visible to anyone 40 or below, or to anyone older with decent eyesight.

please answer this question. It really does make a difference whether you are 27 or 70. If you are 70, your experience really isn’t relevant for people under 40.

I am not going to discuss my age with you, but I am not anywhere near 70, lol. My eyesight is certainly good enough to inspect images from close distance on my 4K laptop, which I am writing this comment on.

As for video, there is an advantage in oversampling, but it is much smaller than you think. Try to compare an oversampled 4K video from the A7III with the one without oversampling from an APS-C crop (choose "4K APS-C" from the drop down list for the A7 III on the right):

the oversampled a7iii is markedly better than the unoversampled a7sii, if there is significant detail. The a7s 1080p (oversampled) is shockingly better than the a7 stock 1080p

I wanted you to compare the oversampled vs "unoversampled" 4K on the A7 III, not 1080p, which is line-skipped.

there is no unoversampling without cropping or line skipping

Line-skipping is irrelevant. You are supposed to compare a native resolution image to an oversampled image. That means going back and taking a look at the A7 III in full-frame 4K mode (oversampled) and in the APS-C 4K mode (close to native resolution). Take a look at the magnified image, and then try to download the files and view them full screen.

https://www.dpreview.com/reviews/image-comparison/fullscreen?attr29_0=sony_a7iii&attr29_1=sony_a7iii&attr29_2=sony_a7riii&attr29_3=sony_a7sii&attr72_0=4k&attr72_1=4k-apsc&attr72_2=4k&attr72_3=4k&normalization=full&widget=602&x=-0.1486276679678514&y=-0.5674316472873174

this is clearly visible in your own tool. I don’t get why people insist on misinforming others.

The difference I am talking about is indeed visible in a magnified view, but completely negligible at normal viewing distances.

maybe you are happy with DVD. I find a good DVD decent, but I prefer Blu-ray. A great 4K Blu-ray is even better.

Blu-ray is mostly FHD, which is pretty low res. It seems to me that I actually have a higher standard for detail than you.

 kolyy's gear list:
Canon G9 X II Panasonic Lumix DMC-GM5 Panasonic Lumix DMC-GX85 Sony a7 III
hungrylau Senior Member • Posts: 1,311
Re: A7iii vs A7Riii

AshleyMateoLeeRoy wrote:

I am leaning towards getting the a7iii, but the a7Riii is coming down in price. Which do you recommend, and what's the biggest difference between the two?

I had a similar choice to make before purchasing my a7iii. Price was a concern, since the difference basically meant 3/4 the cost of another new FE lens or, likely, 100% of the cost of a used FE lens.

I went with the a7iii in the end because I really wanted to focus on low light and have the option to do more video.

One thing to note: you should also consider going into a physical store and handling your various options, including the a6400 and various lenses, the a7iii and lenses, and the a7riii and lenses.

So effectively, price was the biggest difference to my mind, although others differ.

For example: https://www.youtube.com/watch?v=i4H4iuw6qms

 hungrylau's gear list:
Sony a7 III Sony FE 55mm F1.8 Sony Vario-Tessar T* FE 16-35mm F4 ZA OSS Zeiss Loxia 35 Zeiss Loxia 50
NatureBX Regular Member • Posts: 236
Re: A7iii vs A7Riii

EarthQuake wrote:

Because an A7 has a full frame sensor. The iPhone doesn't. What is relevant in this comparison is the difference in sensor area, not the number of pixels. Otherwise the 41MP sensor in that Nokia phone would be on par with an A7r III.

Yes, some people enjoy pixel peeping, and others love to brag on the internet that they have the camera with the most pixels, so if that's your thing, sure, get an A7r. Or jump up to a Fuji GFX 100 - that has even more pixels, and obviously more pixels are always better, so why settle for second best? Or get that Phase One IQ4 150, go big or go home, right?

If you are a knowledgeable photographer, or have used or read reviews of the a7iii and a7riii, then you should already know what both cameras are capable of, so there is no need to compare them with a Nokia phone or bring up the sensor, since both the a7riii and a7iii are full frame. Given that we only need to talk about resolution, some people want to be able to zoom in or crop more, so 42MP handles that well, while 24mp leaves little room for cropping. If you are already happy with your 24mp camera then stick with it, but that doesn't mean it will work for everyone else. Again, this is personal preference, so there is really no need to argue about it. As for bragging about high megapixels, those people are probably newbies. The main focus shouldn't be on the camera alone; lenses are more important. Furthermore, even the best gear does you no good if you don't have it with you when you need it. This is why people use both a full frame camera and a compact camera interchangeably.

Let me guess: the a7iii is probably your only camera, and you've never used an a7riii.

 NatureBX's gear list:
Sony RX100 VII Sony a7R III Sony FE 70-200 F4 Sony FE 85mm F1.4 GM Sony FE 50mm F1.4 ZA +2 more