Raw vs JPEG, part 2

Halina123

Digidog wrote in a previous thread:

You can't capture a raw+JPEG in one 'capture' and have ideal exposure for both.

The test should be to properly expose for raw: shoot it. Properly expose for JPEG: shoot it. Compare the two.

The reason there's so much misunderstanding of the difference between the two captures is that those who tell us their findings often didn't do the testing correctly. So it's not useful data.

----

Does anyone know of a camera that has a Raw+JPEG setting and automatically exposes properly (i.e. differently) for raw and for JPEG?

All my cameras give identical exposures in Raw+JPEG mode.
 
Digidog wrote in a previous thread:

You can't capture a raw+JPEG in one 'capture' and have ideal exposure for both.

The test should be to properly expose for raw: shoot it. Properly expose for JPEG: shoot it. Compare the two.

The reason there's so much misunderstanding of the difference between the two captures is that those who tell us their findings often didn't do the testing correctly. So it's not useful data.

----

Does anyone know of a camera that has a Raw+JPEG setting and automatically exposes properly (i.e. differently) for raw and for JPEG?

All my cameras give identical exposures in Raw+JPEG mode.
That's probably because the camera shoots a raw, records it, and processes a JPEG from that RAW with whatever JPEG settings are in effect. The camera's computer assumes you wanted that rather washed-out, bright shot you set up because you were using ETTR to get the best possible RAW.

I suppose there could be a camera whose firmware could dial that exposure back to bring down the highlights, etc., producing a better JPEG.

What I guess I'm saying is that since the image always starts out as RAW, it's really a processing question, not an exposure question. And I think the answer to whether there is a camera that can tweak the JPEG production from a proper ETTR exposure to come up with something like one would do on the computer is probably no. But I hope I'm wrong.

I'm also a bit curious whether there's a camera (or camera phone) that perhaps always shoots ETTR, even if the user is unaware of it and never even sees the washed-out image, as in a P&S with no EV controls.
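As far as I know no camera firmware exposes this, but the processing step I'm describing is easy to sketch. Here's a rough, hypothetical illustration (the function name and numbers are mine, not any real camera's pipeline) of rendering a normal-looking JPEG from an ETTR-exposed raw: scale the linear data back down so the midtone lands at 18% grey, then apply a simple gamma curve standing in for the tone curve.

```python
import numpy as np

def render_jpeg_from_ettr(raw_linear, target=0.18, midtone_pct=50):
    """Hypothetical sketch: pull an ETTR-exposed linear raw back to a
    normal-looking rendering before JPEG encoding.

    raw_linear: float array in [0, 1], exposed to the right.
    target: linear value the scene midtone should land on (18% grey).
    midtone_pct: percentile treated as the scene midtone.
    """
    midtone = np.percentile(raw_linear, midtone_pct)
    gain = target / max(midtone, 1e-6)   # gain < 1 pulls exposure back down
    pulled = np.clip(raw_linear * gain, 0.0, 1.0)
    # simple gamma-2.2 encode to 8-bit, standing in for the real tone curve
    return (pulled ** (1 / 2.2) * 255).round().astype(np.uint8)

# A synthetic ETTR frame whose midtone sits about a stop high, near 0.36:
frame = np.clip(np.random.default_rng(0).normal(0.36, 0.1, (64, 64)), 0, 1)
jpeg_like = render_jpeg_from_ettr(frame)
```

A real JPEG engine also does demosaicing, white balance, and a proper tone curve, of course; the point is only that the pull-back itself is a trivial scale in linear space.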
 
I look at it this way: when I'm shooting RAW (which is nearly always), I'm thinking about how I'm going to push and pull things in post-processing later. If I were shooting JPEGs, thinking that I likely wouldn't be post-processing them, I'd likely shoot more conservatively, especially as far as letting shadows go dark, since with RAW I know I can get some detail back at the processing stage.

From my way of looking at it, if I were the type of shooter who does a more straight-up conversion of my RAW files, rather than the more radical adjustments that I make, my exposure would probably be pretty similar to what I'd use shooting JPEGs... Hard to compare, though, because I think I end up getting contrasts and tones that I likely wouldn't be able to get with straight-out-of-the-camera JPEGs anyway...

--
my flickr:
www.flickr.com/photos/128435329@N08/
 
Digidog wrote in a previous thread:

You can't capture a raw+JPEG in one 'capture' and have ideal exposure for both.

The test should be to properly expose for raw: shoot it. Properly expose for JPEG: shoot it. Compare the two.

The reason there's so much misunderstanding of the difference between the two captures is that those who tell us their findings often didn't do the testing correctly. So it's not useful data.

----

Does anyone know of a camera that has a Raw+JPEG setting and automatically exposes properly (i.e. differently) for raw and for JPEG?
No such camera.
All my cameras give identical exposures in Raw+JPEG mode.
And Andrew's observation is correct.

Using a digital camera, if you want the best possible image quality then make full use of the sensor. Expose the sensor to full saturation without clipping diffuse highlights. The problem with that is it's tricky and dangerous. My favorite analogy is an electrified fence. Your goal is to get as close to the fence as you possibly can, but DO NOT TOUCH IT! Clip the diffuse highlights in the raw file (touch the fence) and you crash and burn.

So when the camera manufacturers adjust, calibrate, and tweak their cameras' metering systems and processing software, they understandably take a conservative position and stand a little back from the fence -- like 20% or even 40% of the sensor's capacity back from the fence.
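To put that margin in stops: assuming metering places the top of the scene at 60% or 80% of sensor saturation (my illustrative numbers for "40% or 20% back from the fence"), the unused headroom works out to:

```python
import math

# Headroom left on the table, in stops (EV), if metering places the
# brightest diffuse values at some fraction of full sensor saturation.
for level in (0.80, 0.60):   # "20% or 40% back from the fence"
    headroom_ev = math.log2(1.0 / level)
    print(f"metered to {level:.0%} of saturation -> {headroom_ev:.2f} EV unused")
```

So even the conservative 40%-back case leaves only about three quarters of a stop unused, which is consistent with the "pretty minor" difference in ordinary scenes.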

Shooting only JPEGs, the photographer is unaware of the above and learns to manipulate the camera to get the best possible JPEG. In most cases this is no harm, no foul, since the sensor's total dynamic range exceeds what is needed in the final JPEG and the difference that would show up is pretty d*mn minor.

When can this difference matter? In high-contrast light and backlight conditions, where the full dynamic range of the sensor can make a difference and the photographer knows that and intends to take full advantage.

Here's a backlight example from the garden:

First, the camera JPEG, SOOC except for re-sizing.

SOOC JPEG

The diffuse highlights in the JPEG are clipped (crash and burn). When I took the photo I added +.3 EC and fully expected the clipped highlights in the JPEG. But the raw file is not clipped. To prevent my camera's processor from clipping the JPEG I would have had to give up the +.3 EC and very likely go with a -.3 EC. That's 2/3 stop less exposure. In high-contrast light like this I switched gears into "electric fence mode" and went for a full sensor exposure.

Imagine the photo with -.3 EC. It's going to be darker. Conventional wisdom teaches that it's often necessary to increase exposure beyond the meter reading in backlight, but that conventional wisdom also accepts blown highlights. I don't accept the blown highlights and I don't have to accept the blown highlights if I get a full saturation raw file (and don't touch the fence).
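Putting a number on that 2/3-stop swing, using the rule that each stop halves the light:

```python
# Dropping from +1/3 EC to -1/3 EC is a 2/3-stop cut in exposure;
# each stop halves the light, so the sensor receives 2**(-2/3) of it.
delta_ev = (-1 / 3) - (1 / 3)     # -0.667 stops
light_ratio = 2.0 ** delta_ev     # fraction of the light reaching the sensor
print(f"{delta_ev:.3f} EV -> {light_ratio:.2f}x the light")
```

Roughly 0.63x the light, which is why the -.3 EC version would be visibly darker.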

processed from raw
 
We all shoot RAW; some of us copy the RAW from the buffer, some of us don't, but we all shoot RAW. The camera can process RAW files without an issue; it is only if you want to do something special on a PC that it can be good to keep the RAW.

If conditions are good, the RAW file in the buffer is enough and can be discarded if the JPEG settings in camera are fine.

This discussion is 10 years old and needs an upgrade.

--
" Use the shutter button on the headset cord " - Leonardo Da Vinci
 
We all shoot RAW; some of us copy the RAW from the buffer, some of us don't, but we all shoot RAW. The camera can process RAW files without an issue; it is only if you want to do something special on a PC that it can be good to keep the RAW.

If conditions are good, the RAW file in the buffer is enough and can be discarded if the JPEG settings in camera are fine.

This discussion is 10 years old and needs an upgrade.
 
I still mainly shoot in JPEG as it saves memory space, is easier to handle across devices, saves time, and usually the results are fine. I still shoot some RAW if I feel the subject is particularly important or detailed, though I often don't have the time to process them.

Besides, most of the time the PC-processed image from RAW to JPEG can look similar to the in-camera processed image, particularly in regard to the colour saturation. The colour from the RAW often looks so much better than the JPEG version output from the camera, so I've tried converting from the RAW myself, only to find that the PC-processed image's colours look similar or even slightly more washed out than the camera's version. So sometimes I don't feel it's worth the bother, even though I can get more out of the shadows and highlights. Maybe I need to learn some new post-processing techniques/software.

It's a pity we can't do a personal set-up of how we want the camera to process the data captured by the sensor: how much compression, how much colour saturation, how much shadow/highlight correction, etc. Of course, this would be for advanced users only!
 
I still mainly shoot in JPEG as it saves memory space, is easier to handle across devices, saves time, and usually the results are fine. I still shoot some RAW if I feel the subject is particularly important or detailed, though I often don't have the time to process them.

Besides, most of the time the PC-processed image from RAW to JPEG can look similar to the in-camera processed image, particularly in regard to the colour saturation. The colour from the RAW often looks so much better than the JPEG version output from the camera, so I've tried converting from the RAW myself, only to find that the PC-processed image's colours look similar or even slightly more washed out than the camera's version. So sometimes I don't feel it's worth the bother, even though I can get more out of the shadows and highlights. Maybe I need to learn some new post-processing techniques/software.
Yes, you need to practice or look for online tutorials.

It's a pity we can't do a personal set-up of how we want the camera to process the data captured by the sensor: how much compression, how much colour saturation, how much shadow/highlight correction, etc. Of course, this would be for advanced users only!
On many good cameras the user can set up compression, noise reduction, colour saturation, contrast, sharpness, tone characteristics, etc., and the camera develops the JPEG according to your taste.

But most photographers prefer to shoot raw and do it afterwards, because they can do a better job on a large-screen computer with a more powerful processor.
 
I still mainly shoot in JPEG as it saves memory space, is easier to handle across devices, saves time, and usually the results are fine. I still shoot some RAW if I feel the subject is particularly important or detailed, though I often don't have the time to process them.

Besides, most of the time the PC-processed image from RAW to JPEG can look similar to the in-camera processed image, particularly in regard to the colour saturation. The colour from the RAW often looks so much better than the JPEG version output from the camera, so I've tried converting from the RAW myself, only to find that the PC-processed image's colours look similar or even slightly more washed out than the camera's version. So sometimes I don't feel it's worth the bother, even though I can get more out of the shadows and highlights. Maybe I need to learn some new post-processing techniques/software.
Yes, you need to practice or look for online tutorials.
It's a pity we can't do a personal set-up of how we want the camera to process the data captured by the sensor: how much compression, how much colour saturation, how much shadow/highlight correction, etc. Of course, this would be for advanced users only!
On many good cameras the user can set up compression, noise reduction, colour saturation, contrast, sharpness, tone characteristics, etc., and the camera develops the JPEG according to your taste.

But most photographers prefer to shoot raw and do it afterwards, because they can do a better job on a large-screen computer with a more powerful processor.
I don't have a problem with how you choose to process your images, but I really think the last paragraph could be in error. It's very likely accurate for most who post here on the DPReview open forum, but in my experience with the photographers and hobbyists in the area where I live, it would not be an accurate description of them.

I also see evidence daily on these forums that most cameras are capable of producing straight-out-of-the-camera JPEGs that blow away the raw-processing abilities of many who choose to do it themselves after the fact. There are some folks who do a pretty good job of it, but there seem to be at least as many who need a lot more practice before they can equal what the camera can do. And there are some who can make real messes of what might have been a great shot. I guarantee you that I would be one of those who can make a real mess if I chose to attempt it. The hotshots at the manufacturer who write the programs are certainly far better and more talented than I am.
 
I also see evidence daily on these forums that most cameras are capable of producing straight-out-of-the-camera JPEGs that blow away the raw-processing abilities of many who choose to do it themselves after the fact. ... And there are some who can make real messes of what might have been a great shot. And I guarantee you that I would be one of those who can make a real mess if I chose to attempt it.
The safe way to start into raw is to install the raw converter that the camera manufacturer recommends, load a raw file into it, and save a TIFF or a JPEG, changing nothing. The output will match the out-of-camera JPEG very closely. From there, one can see what changes to the raw converter settings (brightness correction, white balance, sharpening, cropping, ...) improve the output.
 
Maybe you are one of the ones who know how to use it. But you don't have to look far to see some real messes people have made trying to turn out what they thought was going to be the ultimate image. It's easy to find examples proving that not just anybody can make a great conversion from raw; it takes talent that some folks just plain don't have. For them, it's best left to the camera.
 
Digidog wrote in a previous thread:

You can't capture a raw+JPEG in one 'capture' and have ideal exposure for both.
Of course I can. In very low light, the shutter speed and the aperture are at their critical values for DOF and motion blur; RAW or JPEG does not matter for the exposure then.
 
Just my opinion: I use a raw file for every photograph I take. The reason (and I may be deluded) is that I always expect the next photograph I take to be a lifetime best. If it is a special image, I have the file at its best possible quality, giving me all the options for editing depending on intended use.
 
Digidog wrote in a previous thread:

You can't capture a raw+JPEG in one 'capture' and have ideal exposure for both.

The test should be to properly expose for raw: shoot it. Properly expose for JPEG: shoot it. Compare the two.

The reason there's so much misunderstanding of the difference between the two captures is that those who tell us their findings often didn't do the testing correctly. So it's not useful data.

----

Does anyone know of a camera that has a Raw+JPEG setting and automatically exposes properly (i.e. differently) for raw and for JPEG?
No such camera.
All my cameras give identical exposures in Raw+JPEG mode.
And Andrew's observation is correct.

Using a digital camera, if you want the best possible image quality then make full use of the sensor. Expose the sensor to full saturation without clipping diffuse highlights. The problem with that is it's tricky and dangerous. My favorite analogy is an electrified fence. Your goal is to get as close to the fence as you possibly can, but DO NOT TOUCH IT! Clip the diffuse highlights in the raw file (touch the fence) and you crash and burn.

So when the camera manufacturers adjust, calibrate, and tweak their cameras' metering systems and processing software, they understandably take a conservative position and stand a little back from the fence -- like 20% or even 40% of the sensor's capacity back from the fence.

Shooting only JPEGs, the photographer is unaware of the above and learns to manipulate the camera to get the best possible JPEG. In most cases this is no harm, no foul, since the sensor's total dynamic range exceeds what is needed in the final JPEG and the difference that would show up is pretty d*mn minor.

When can this difference matter? In high-contrast light and backlight conditions, where the full dynamic range of the sensor can make a difference and the photographer knows that and intends to take full advantage.

Here's a backlight example from the garden:

First, the camera JPEG, SOOC except for re-sizing.

SOOC JPEG

The diffuse highlights in the JPEG are clipped (crash and burn). When I took the photo I added +.3 EC and fully expected the clipped highlights in the JPEG. But the raw file is not clipped. To prevent my camera's processor from clipping the JPEG I would have had to give up the +.3 EC and very likely go with a -.3 EC. That's 2/3 stop less exposure. In high-contrast light like this I switched gears into "electric fence mode" and went for a full sensor exposure.

Imagine the photo with -.3 EC. It's going to be darker. Conventional wisdom teaches that it's often necessary to increase exposure beyond the meter reading in backlight, but that conventional wisdom also accepts blown highlights. I don't accept the blown highlights and I don't have to accept the blown highlights if I get a full saturation raw file (and don't touch the fence).

processed from raw
This is a very good example of the difference between JPG and RAW. In direct comparison the JPG almost looks dull. Without directly comparing a JPG to its associated RAW, the JPG would look quite acceptable. And this is the very reason that JPG shooters can and will be content with their results. Modern cams have indeed quite capable JPG engines.

The reason for the noticeable difference can be - and has been - described simply: no matter how good the in-camera processor is, the computing power of the PC/Mac is dramatically superior. And no matter how carefully the in-camera settings have been chosen, the processing possibilities of a good RAW processor like Adobe ACR are much more versatile. And not to forget: the OOC JPG is a final result which can of course be tweaked quite a bit, but the RAW is just a starting point with all processing options left open.

Having been a strong advocate of JPGs until about two years ago, I've completely changed my mind. There has not been a single image - out of thousands - that did not look better - slightly or even significantly - when processed with ACR as compared to the OOC image.

Andreas
 
Digidog wrote in a previous thread:

You can't capture a raw+JPEG in one 'capture' and have ideal exposure for both.
Of course I can. In very low light, the shutter speed and the aperture are at their critical values for DOF and motion blur; RAW or JPEG does not matter for the exposure then.
Even in that use case there's a good chance that your "exposure" settings will differ. I put the word exposure in quotes because you failed to address ISO in your example. Putting aside the whole exposure triangle/definitional issue, when you factor in ISO, the applicability of your exception to Digidog's rule is narrowed further. How much narrower will depend on the particular camera used, its high-ISO behavior, and the DR of the scene. Bear in mind that most low-light scenes have visible light sources that create a potential DR tradeoff.
 
This is a very good example of the difference between JPG and RAW. In direct comparison the JPG almost looks dull. Without directly comparing a JPG to its associated RAW, the JPG would look quite acceptable. And this is the very reason that JPG shooters can and will be content with their results. Modern cams have indeed quite capable JPG engines.
While I agree that you can almost always do better by shooting raw, exposing properly for raw, and processing yourself, to be fair, the example here isn't a particularly good one. The JPEG version shows very little clipping. In fact, nothing meaningful has been lost as a result of the exposure used in the shot. Thus, for purposes of the raw shot, there was certainly no need to reduce exposure by -.3 EC. If anything, exposure should have been increased. I'm quite confident that if Ysarex posted the raw histograms generated by RawDigger for the raw version, we'd see a significant amount of unused highlight headroom. As for the difference in look, that's mostly due to the different tone curves used, WB differences, camera-position differences, etc., and not to the superiority of the raw capture itself.

To illustrate my points, I've taken screenshots of the two versions with the associated histograms and gamut warning (for sRGB) turned on. Note in particular how few gray gamut-warning spots appear in the top (OOC JPEG) rendering. Note that the OOC JPEG histograms indicate very little highlight clipping and no shadow clipping at all, while the rendering from the raw original has quite a bit of shadow clipping. Of course, this could be corrected in processing, but at the expense of introducing more noise relative to the OOC JPEG, since the raw version was less exposed.

OOC JPEG=top; Raw rendering=bottom
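For anyone who wants to check their own renderings the same way, the clipping part of those histograms is just a count of pixels pinned at the ends of the 8-bit range. A small sketch (the function name and the toy image are mine, not from RawDigger or any particular tool):

```python
import numpy as np

def clipping_report(img_8bit):
    """Fraction of pixels clipped at the black and white ends of an
    8-bit rendering, the kind of thing a histogram's end spikes show."""
    img = np.asarray(img_8bit)
    total = img.size
    return {
        "shadow_clip": np.count_nonzero(img == 0) / total,
        "highlight_clip": np.count_nonzero(img == 255) / total,
    }

# Toy gradient with a small blown patch, standing in for a real photo:
toy = np.tile(np.arange(256, dtype=np.uint8), (8, 1))
toy[:, -4:] = 255  # push a few columns into highlight clipping
report = clipping_report(toy)
```

Running the same report on the OOC JPEG and the raw rendering would make the "very little highlight clipping" vs "quite a bit of shadow clipping" comparison numerical instead of visual.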
 
This discussion is 10 years old and needs an upgrade.
I thought the OP was trying that, but per usual people are discussing the quality of RAW vs JPEG and other wheezy topics.

The question was whether a camera body exists that essentially does what a RAW ETTR photographer does: in an auto or semi-auto mode it selects a "correct" exposure that is pushed to the clipping point, and then, when producing the JPEG in-camera, renders that RAW into an image with the exposure corrected, so that the highlights are brought down and the shadows up, as one would do in post-processing.
 
Digidog wrote in a previous thread:

You can't capture a raw+JPEG in one 'capture' and have ideal exposure for both.

The test should be to properly expose for raw: shoot it. Properly expose for JPEG: shoot it. Compare the two.

The reason there's so much misunderstanding of the difference between the two captures is that those who tell us their findings often didn't do the testing correctly. So it's not useful data.

----

Does anyone know of a camera that has a Raw+JPEG setting and automatically exposes properly (i.e. differently) for raw and for JPEG?

All my cameras give identical exposures in Raw+JPEG mode.
JPEGs are for noobs who don't want any control over how their pictures look (or don't care about making them look better).
 
This is a very good example of the difference between JPG and RAW. In direct comparison the JPG almost looks dull. Without directly comparing a JPG to its associated RAW, the JPG would look quite acceptable. And this is the very reason that JPG shooters can and will be content with their results. Modern cams have indeed quite capable JPG engines.
While I agree that you can almost always do better by shooting raw, exposing properly for raw, and processing yourself, to be fair, the example here isn't a particularly good one. The JPEG version shows very little clipping. In fact, nothing meaningful has been lost as a result of the exposure used in the shot. Thus, for purposes of the raw shot, there was certainly no need to reduce exposure by -.3 EC. If anything, exposure should have been increased. I'm quite confident that if Ysarex posted the raw histograms generated by RawDigger for the raw version, we'd see a significant amount of unused highlight headroom. As for the difference in look, that's mostly due to the different tone curves used, WB differences, camera-position differences, etc., and not to the superiority of the raw capture itself.
You are right about part of the different look of the two images. But the message is still valid - at least from my point of view - that RAW-processed images are better.

And one more point should be mentioned: if the OOC JPGs are good enough for the purpose, they can easily be accepted as a final result. But if they call for some - even minor - tweaking, this procedure should be applied not to the JPGs but to the RAWs. The time and effort is the same as with JPG processing, but the result is much more rewarding.

Bottom line: either OOC JPGs untouched or processed RAWs.

Andreas
 
