RX100 Mark VI Review: Is this camera worth $1200?

Are you saying most people shoot Full Frame, the sensor size you were comparing it to?
I'm saying most people who shoot "RAW" are shooting on a DSLR (either APS-C or full-frame sensor). This market is by FAR the largest for RAW shooters. If you know what RAW is, your experience is probably coming in from using a DSLR (likely the popular entry-level DSLRs like those Canon Rebels or Nikon D7500 or something like that).

Most point-n-shoots/compacts don't have RAW processing, and slapping RAW on those isn't going to be what you'd expect, generally speaking.
 
True blown highlights can never be recovered even with RAW.
It depends on your definition of "blown." If you mean that the JPG processing just capped it at (255, 255, 255) but the sensor was able to capture brighter values, then... well -- I'd imagine RAW would be able to restore that. This would vary, of course, based on the RAW encoding/decoding algorithm & manufacturer, but theoretically those additional bits could be used to store that additional information.
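
To make that concrete, here's a toy sketch (made-up numbers, not tied to any particular camera or raw format) of why the extra bits can matter: if the sensor recorded distinct values above the level the JPG pipeline maps to pure white, a raw file with headroom keeps those differences, while the clipped 8-bit JPG cannot give them back.

import numpy as np

# hypothetical 14-bit sensor values for three neighbouring pixels (sensor max 16383)
raw = np.array([14000.0, 15000.0, 16000.0])
jpeg_white = 12000.0   # hypothetical level the in-camera JPG pipeline maps to pure white

jpg = np.clip(raw / jpeg_white, 0, 1)      # JPG: everything above the white point collapses to 1.0
raw_scaled = raw / 16383.0                 # raw: the three values stay distinct

print(np.round(jpg * 0.7, 3))              # pull "exposure" down: [0.7 0.7 0.7] -- no detail to recover
print(np.round(raw_scaled * 0.7, 3))       # [0.598 0.641 0.684] -- detail survives
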
My definition of a blown highlight, which is the generally accepted one, is when part of the sensor receives so much light that it becomes saturated and the photodiodes are maxed out to the point of clipping. At that point nothing is ever recoverable. If anything is recoverable it's not truly "blown".

 
My definition of a blown highlight, which is the generally accepted one, is when part of the sensor receives so much light that it becomes saturated and the photodiodes are maxed out to the point of clipping.
lol ok, I think that "generally" most people don't even know what a photodiode is so how can this be the "generally accepted" answer?

A "blown highlight" is when the colors get capped out at white -- it doesn't even need to hit (255,255,255) for that matter, if it were to hit (240, 240, 240) even you'd still say "hey those highlights look blown out. Can we see if we can recover some detail there?" Technically it might imply checking the histogram and seeing if they're being clipped out at the far edges.

Most "general people" don't know or care how RAW works, we all look at JPGs in the end. If you cared about photodiodes, then simply looking at a JPG you would not be able to precisely determine whether the image has "blown highlights" since you wouldn't know what the sensor/photodiode was doing at that point in time and whether it was oversaturated or not, lol.

On another note, stop trying to sound smarter than me because you're not.
 
On another note, stop trying to sound smarter than me because you're not.
I don't consider myself smarter than anyone but I do have 2 degrees, one in Electrical Technology and one in Biology, both earned in the 60's.
 
I think it's good to have 4K these days. You get more flexibility in editing the footage and 4K is pretty standard now for TVs.
4K is standard for TV hardware, but not necessarily TV content. I think most shows are still in 1080p. Tech-wise, HDR has been a bigger visual difference than 4K. Many people cannot tell the difference between 4K and 1080, and one primary reason for this is that in video there is often motion blur anyway. (The case is different for stills.) I might also add, if you really want to create "UHD" content for high-end TVs, you'll need an HDR + 4K workflow, which balloons hardware requirements. Pros will go to this length and sell the content to reap profits, but if you're not intending to sell your content like this, then I'd recommend making life easier on yourself and creating in 1080p, at least until computer hardware catches up.
Why does it matter if there is any TV content in 4K right now? For people creating memories or any other kind of video they value, they can view that on the hardware available right now, in full quality. They can also make sure their video quality is as good as possible far into the future, a future in which 1080p will probably be a good joke.

Lots of devices can record 4K and also play that content back without issues, hardware is only a problem if editing the content, and there is no real need to do that for most users. They can always edit it later as hardware becomes more powerful and widely available.

Why store memories or other valuable work in an inferior format?
 
On the positive side, it was great to be able to adjust white balance and not lose detail -- it was just the shadows/highlights where I expected a bit more.

RAW implies uncooked, but there are still variations of RAW (12-bit vs 14-bit, for instance). From the article mentioned, I think RX100 is doing only 11-bit.
Possibly. If so, I would love for you to show me the difference in results produced from an 11-bit RAW file and a RAW file with a higher bit depth - but do not compare across sensor sizes. Here's a hint: It will not make the difference in poor highlight/shadow recovery that you referenced in the video.
This is what I meant by saying that RX100 "RAW" may not be the same "RAW" that you are familiar with from standard DSLRs.
What you said is that it's not necessarily real RAW. It is in fact as real as the RAW files from lots of cameras.
Is there any way to identify the bit depth of the RAW file?
 
Why store memories or other valuable work in an inferior format?
I used to think this way too... but in reality, I don't think people ever go back like this. Especially not for video, which is too intensive in terms of editing it into something consumable + adding music and effects... You'd have to re-export the entire video; it'd be so much work for little gain. I can't imagine many people, 10 years from now, trying to re-render some vacation video.

RAW photos are a bit different, since a photo isn't compiled into some collage...
 
On the positive side, it was great to be able to adjust white balance and not lose detail -- it was just the shadows/highlights where I expected a bit more.

RAW implies uncooked, but there are still variations of RAW (12-bit vs 14-bit, for instance). From the article mentioned, I think RX100 is doing only 11-bit.
Possibly. If so, I would love for you to show me the difference in results produced from an 11-bit RAW file and a RAW file with a higher bit depth - but do not compare across sensor sizes. Here's a hint: It will not make the difference in poor highlight/shadow recovery that you referenced in the video.
This is what I meant by saying that RX100 "RAW" may not be the same "RAW" that you are familiar with from standard DSLRs.
What you said is that it's not necessarily real RAW. It is in fact as real as the RAW files from lots of cameras.
Is there any way to identify the bit depth of the RAW file?
Probably, with tools that I don't possess. RawDigger would be one, I guess.
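
If you're comfortable with a little Python, rawpy (a LibRaw wrapper) can at least show the nominal numbers. This is only a sketch, the filename is a placeholder, and Sony's compressed ARW can report a 14-bit container even when the lossy compression curve reduces effective precision, so RawDigger is still the better tool for a real answer.

import math
import rawpy

with rawpy.imread("DSC00001.ARW") as raw:         # placeholder filename
    print("white level:", raw.white_level)        # e.g. 16383 would imply a 14-bit container
    print("max raw value:", raw.raw_image.max())
    print("nominal bits:", math.ceil(math.log2(raw.white_level + 1)))
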
 
4K is old hat already; we will soon expect this to be 6K plus and will be calling 4K an inferior format. Planning for a format that will have long-term immunity against being comparatively poor is unrealistic.

That said, there is a growing consensus that most folk are not able to discriminate between 4K and HD content; for many, HD is equal to the task, and it is a thinking error, driven by marketing propaganda, to think otherwise.

Don't get me wrong, 4K is marvellous as a format, but then again, if you buy into the idea that nothing else will do, then where do you stand on stills? Will only MF do, as the rest are inferior?
 
I think it's good to have 4K these days. You get more flexibility in editing the footage and 4K is pretty standard now for TVs.
4K is standard for TV hardware, but not necessarily TV content. I think most shows are still in 1080p. Tech-wise, HDR has been a bigger visual difference than 4K. Many people cannot tell the difference between 4K and 1080, and one primary reason for this is that in video there is often motion blur anyway. (The case is different for stills.) I might also add, if you really want to create "UHD" content for high-end TVs, you'll need an HDR + 4K workflow, which balloons hardware requirements. Pros will go to this length and sell the content to reap profits, but if you're not intending to sell your content like this, then I'd recommend making life easier on yourself and creating in 1080p, at least until computer hardware catches up.
Why does it matter if there is any TV content in 4K right now? For people creating memories or any other kind of video they value, they can view that on the hardware available right now, in full quality. They can also make sure their video quality is as good as possible far into the future, a future in which 1080p will probably be a good joke.

Lots of devices can record 4K and also play that content back without issues, hardware is only a problem if editing the content, and there is no real need to do that for most users. They can always edit it later as hardware becomes more powerful and widely available.

Why store memories or other valuable work in an inferior format?
+1

Why would I want to shoot 1080p when I can shoot 4K? Even if my delivered format were HD, I could use 4K as the source to either crop in for a different view or downsize for a superior image. HDR is not necessary for UHD & only complicates matters. The difference between the detail in HD & UHD is clearly visible.
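
For what it's worth, the crop-in case is easy to do even outside an NLE. Here's a rough sketch (assuming ffmpeg is installed and on your PATH; the filenames, the 3840x2160 source size, and the centred crop offset are all placeholders) that pulls a full-quality 1080p window out of a UHD frame:

import subprocess

subprocess.run([
    "ffmpeg", "-i", "clip_4k.mp4",
    "-vf", "crop=1920:1080:960:540",   # take a 1080p window out of the middle of a 3840x2160 frame
    "-c:v", "libx264", "-crf", "18",   # re-encode the cropped video
    "-c:a", "copy",                    # keep the audio as-is
    "clip_hd_crop.mp4",
], check=True)
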
 
4K is old hat already; we will soon expect this to be 6K plus and will be calling 4K an inferior format. Planning for a format that will have long-term immunity against being comparatively poor is unrealistic.

That said, there is a growing consensus that most folk are not able to discriminate between 4K and HD content; for many, HD is equal to the task, and it is a thinking error, driven by marketing propaganda, to think otherwise.
What consensus? If you can't see the difference between 4K & 1080p either you need glasses or your screen is too small or you are watching from too far away.

The commonest size for HD TVs was 46". Today the commonest size for a 4K TV is 60". The difference is chalk & cheese. 4K is like looking through a window.
 
Why store memories or other valuable work in an inferior format?
Because memories are not dependent on video quality. I have some videos of my oldest grandson taken in 2005 in 480x360 or something similar. The memories are not diminished by the poor quality. That said, there's a cutoff beyond which video IQ improvements don't add much unless viewed at huge sizes. With still photography that point is often considered to be 8 mp. I think HD is the point for video when viewing up to a 60" screen +/-. What next, 8K? I can imagine 8K on huge movie screens would be nice.

There will always be something better. Hasselblad makes a 100mp camera. If you don't use that camera all your stills are from an inferior format. Get my point?

I mentioned earlier that 4K would be nice for the ability to zoom into HD when post processing. I can see a use for that while editing and making a video more interesting. This could replace using multiple cameras in some cases.

--
Tom
Look at the picture, not the pixels
 
4K is old hat already; we will soon expect this to be 6K plus and will be calling 4K an inferior format. Planning for a format that will have long-term immunity against being comparatively poor is unrealistic.

That said, there is a growing consensus that most folk are not able to discriminate between 4K and HD content; for many, HD is equal to the task, and it is a thinking error, driven by marketing propaganda, to think otherwise.
What consensus? If you can't see the difference between 4K & 1080p either you need glasses or your screen is too small or you are watching from too far away.
Maybe you are watching too close! The fact is for most people HD is good enough for video and 8mp for stills. That doesn't mean there's something wrong with them.
The commonest size for HD TVs was 46". Today the commonest size for a 4K TV is 60". The difference is chalk & cheese. 4K is like looking through a window.
One of my cameras is 42mp. Do you have a camera with that much resolution? If you don't, and can't see or don't care about the difference between 42mp and 20mp for example, you must need glasses or be looking at too small a size.

My point is stop criticizing someone else who disagrees with you on the need for 4K. It's none of your business what they like. Instead you shoot 4K and let them be happy with HD.

Me I can see the difference between 4K and HD but it's not enough to matter for me. I don't demand the same quality in video as I do for stills because the constant movement tends to obscure a lack of resolution. It's when you freeze the image on the screen that the difference becomes obvious.
 
Why store memories or other valuable work in an inferior format?
...The memories are not diminished by the poor quality...

There will always be something better. Hasselblad makes a 100mp camera. If you don't use that camera all your stills are from an inferior format. Get my point?
Yes, good point.

I guess what I was trying to say is that, in my personal opinion, I would utilise the best quality format available to me (at that time and in that situation) to capture whatever it is I value.

For example I don't have a 100 MP camera, so even though that's what I would like to use, I can't. Although if I had a camera that records both 1080p and 4K, why would I use 1080p?

Obviously situations are different: some people have storage issues, their cameras can't record 4K in a useful manner, they have nobody to pass the recordings onto, etc. It reminds me a lot of other similar discussions, such as *RAW vs JPEG etc.

So I believe it's useful to think about the situation and if we have a good enough reason to not use the best quality format we have. I don't think not being able to edit the footage right now is a good enough reason, at least not for me.

*I also regret not shooting a lot of photos in RAW (when I easily could have but just didn't know about it) when I started using a RAW capable camera.
 
Why store memories or other valuable work in an inferior format?
...The memories are not diminished by the poor quality...

There will always be something better. Hasselblad makes a 100mp camera. If you don't use that camera all your stills are from an inferior format. Get my point?
Yes, good point.

I guess what I was trying to say is that, in my personal opinion, I would utilise the best quality format available to me (at that time and in that situation) to capture whatever it is I value.

For example I don't have a 100 MP camera, so even though that's what I would like to use, I can't. Although if I had a camera that records both 1080p and 4K, why would I use 1080p?
You've already made your choice, which is fine. The question in your mind is actually: why would someone out there use 1080p?
Obviously situations are different: some people have storage issues, their cameras can't record 4K in a useful manner, they have nobody to pass the recordings onto, etc. It reminds me a lot of other similar discussions, such as *RAW vs JPEG etc.
Exactly. So you do know the answer: Not everyone has the same priorities ... so not everyone cares about maximum quality when using a camera. I don't, for example. Good enough is good enough.
So I believe it's useful to think about the situation and if we have a good enough reason to not use the best quality format we have. I don't think not being able to edit the footage right now is a good enough reason, at least not for me.
Again, that's fine.
*I also regret not shooting a lot of photos in RAW (when I easily could have but just didn't know about it) when I started using a RAW capable camera.
I've known how to shoot and process RAW for many years, and I still shoot 95% JPEG. No regrets here at all.

On the other hand, I have no doubt that I'm more meticulous and critical (even obsessive/compulsive) about other aspects of my life than some others out there. We're all different.
 
Although if I had a camera that records both 1080p and 4K, why would I use 1080p?
Simply, it is still a pain to process 4K video. It takes forever, and often you need to set up proxies where you edit on a "proxy" 720P video and then render out the 4K over many hours. It bogs down your system. If you have an 18-core beastly machine at home, then sure go ahead and shoot your 4K -- even then, the visual difference isn't going to be too noticeable. Even watching 4K movies at home, I rarely stop and say "Oh hey look at all that detail", usually there's too much bokeh & motion blur in videos to notice.
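
For anyone curious what that proxy step actually looks like, here's a minimal sketch (assuming ffmpeg is installed and on your PATH; the folder names are placeholders) that batches out 720p proxies you'd cut with before relinking the 4K originals for the final render:

import subprocess
from pathlib import Path

proxy_dir = Path("proxies")                # placeholder output folder
proxy_dir.mkdir(exist_ok=True)

for clip in Path("footage_4k").glob("*.mp4"):   # placeholder source folder
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-vf", "scale=1280:720",           # shrink to 720p so the edit timeline stays responsive
        "-c:v", "libx264", "-crf", "23", "-preset", "fast",
        "-c:a", "aac",
        str(proxy_dir / clip.name),
    ], check=True)
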
 
One thing we didn't discuss so much, that I want to get your opinion on, is that the RX100 VI has an f/7.6 equivalent "bokeh" aperture at 24mm. This is compared to the RX100 V, at f/4.9. That works out to roughly a 1.3-stop (about 2.4x) difference in background-blurring ability (the equivalent aperture here describes depth of field, not the exposure).
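
For anyone who wants to check the arithmetic, here's the back-of-the-envelope version (assuming the usual ~2.7x crop factor for the 1" sensor and the native f/2.8 vs f/1.8 maximum apertures at 24mm equivalent):

import math

crop = 2.7                                # approximate crop factor for a 1" sensor
f_vi, f_v = 2.8, 1.8                      # native max apertures at 24mm equiv: RX100 VI vs RX100 V

print(round(f_vi * crop, 1))              # ~7.6 equivalent aperture (depth of field) for the VI
print(round(f_v * crop, 1))               # ~4.9 for the V
print(round((f_vi / f_v) ** 2, 1))        # ~2.4x difference in light-gathering area
print(round(2 * math.log2(f_vi / f_v), 1))  # ~1.3 stops difference in depth of field / blur
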

Now... this really got me thinking. I love bokeh at wide angles; I think it's really great for separating people/subjects from the background. I hardly ever would choose to shoot at f/8 wide-angle unless it were like a panoramic vista, since I tend to like some bokeh background-blurring behind my subject matter.

The additional 8x zoom is still appealing, but this really gets me thinking about whether I want to go around shooting at a 24mm f/8 equivalent all the time. The RX100 V's f/4.9 is more acceptable in this regard. It's a difficult trade-off for the zoom.
 
4K is old hat already; we will soon expect this to be 6K plus and will be calling 4K an inferior format. Planning for a format that will have long-term immunity against being comparatively poor is unrealistic.

That said, there is a growing consensus that most folk are not able to discriminate between 4K and HD content; for many, HD is equal to the task, and it is a thinking error, driven by marketing propaganda, to think otherwise.
What consensus? If you can't see the difference between 4K & 1080p either you need glasses or your screen is too small or you are watching from too far away.
Maybe you are watching too close! The fact is for most people HD is good enough for video and 8mp for stills. That doesn't mean there's something wrong with them.
4K video on the Sony RX10/RX100 gives you 24/25/30 8mp images per second. Why put up with "good enough"?
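
And pulling one of those ~8mp frames (3840x2160 is about 8.3 megapixels) out of a clip is straightforward if you have ffmpeg on your PATH. A sketch, with placeholder timestamp and filenames:

import subprocess

subprocess.run([
    "ffmpeg", "-ss", "00:00:05",        # seek to the moment you want
    "-i", "clip_4k.mp4",
    "-frames:v", "1",                   # grab a single frame
    "still_8mp.png",
], check=True)
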
 
