Iliah Borg
Forum Pro
> would there be a visible difference in the final print between exposing for JPEG and exposing for RAW?

Take your camera, shoot, process, print, scan, and post. Don't expect somebody to do your homework.

>> would there be a visible difference in the final print between exposing for JPEG and exposing for RAW?
> Take your camera, shoot, process, print, scan, and post. Don't expect somebody to do your homework.

What a sensible idea; actually testing something yourself (with proper steps) to make your own, educated decisions!

> A basic assumption in this thread has been that one needs to absolutely maximize certain aspects of image quality.

GIGO: Garbage In, Garbage Out!

> Assume your camera is set to RAW+JPEG at ISO 100. You take two images. One is exposed with the intent of producing the best possible RAW file. The other is exposed with the intent of producing the best possible out-of-camera JPEG.

The in-camera JPEG is either overexposed because you wish to ideally expose the raw, or the raw is underexposed because you ideally exposed the JPEG. It's simple to test (why doesn't everyone take the 5 minutes to do so?).

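As an aside, the statistics behind that test can be sketched in a few lines. This is a minimal, hypothetical simulation (Python/NumPy; the photon counts and the +1 EV figure are made-up illustrations, not measurements from any camera): photon shot noise follows a Poisson distribution, so capturing more light and then normalizing brightness in the raw converter gives a cleaner result than capturing less light so the out-of-camera JPEG looks right.

    # Hypothetical sketch: compare pixel noise at a JPEG-oriented exposure with an
    # ETTR capture (+1 EV) that is pulled back down during raw conversion.
    # The photon counts below are illustrative assumptions, not camera measurements.
    import numpy as np

    rng = np.random.default_rng(0)
    pixels = 1_000_000
    mean_photons_jpeg = 200   # exposure chosen so the in-camera JPEG looks right
    mean_photons_ettr = 400   # +1 EV more light, assumed still below raw clipping

    jpeg_exposed = rng.poisson(mean_photons_jpeg, pixels)
    ettr_exposed = rng.poisson(mean_photons_ettr, pixels) / 2.0  # normalize -1 EV in the raw converter

    def snr(x):
        return x.mean() / x.std()

    print(f"SNR at JPEG-oriented exposure: {snr(jpeg_exposed):.1f}")           # ~14
    print(f"SNR at ETTR exposure, pulled down 1 EV: {snr(ettr_exposed):.1f}")  # ~20, about sqrt(2) better

Whether that difference survives rendering, printing, and normal viewing distances is exactly what the suggested RAW+JPEG test would show.
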
> Will there be a visible difference between the resulting two images?

Yes, and you can figure this out on your own by spending less time asking and actually doing the tests yourself!

> Knowing how big a visible difference it makes is an important factor in knowing how important it is to properly expose the RAW, and when you should do so.

Knowing by testing before making statements is an ideal first step in answering your questions. DO try it.

> I don't think anyone can tell the difference. I believe photography is more about the subject matter of what's happening in the picture rather than if it's technically perfect. So I think that a batch of 100 random images, containing 50 JPEGs and 50 raws, all processed and resized for web view at 4K, would result in no one guessing within 10.

If it is not, why the RAW option in the first place?

If raw is so completely overwhelmingly better, then why is JPEG on every camera ever made including the flagship models?
> All I hear about are personal examples, which isn't fair at all. I could easily post-process a JPEG to look superior to a raw. I could purposefully take a picture with a wide dynamic range and prove raw is superior. Real pictures aren't usually like that.

They are actually more challenging than that. Blown highlights are far more common than challenging DR scenes. Poor WB is another example.

> robgendreau wrote: That's probably because what's happening is that the camera shoots a raw, records it, and processes a JPEG from that RAW.

Of course that's what happens. All cameras shoot this raw file. You can decide if you want to keep it or toss it.

> With whatever settings are there for JPEG. The camera's computer assumed you wanted that rather washed-out, bright shot that you set up because you were using ETTR to get the best possible RAW.

The data is simply vastly different. Treating a JPEG and a raw the same is like assuming an ISO 100 transparency film and an ISO 400 color negative should be treated the same. They should not. Anyone testing these differences (in film or digital) can see this with about 5 minutes of testing.

> What I guess I'm saying is that since the image always starts out as RAW, it's really a processing question, not an exposure question.

Exactly! And the data is different (one is linearly encoded and needs rendering).

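For anyone new to the "linearly encoded" point: raw data is roughly proportional to the light collected, while a JPEG stores gamma-encoded values, which is why an unrendered raw looks dark and flat. Here is a small sketch of the standard sRGB transfer curve (the formula is standard; the sample values are just illustrations, and a real raw converter does far more than this):

    # Map a linear light value (0..1) to an sRGB-encoded value (0..1).
    def srgb_encode(linear: float) -> float:
        if linear <= 0.0031308:
            return 12.92 * linear
        return 1.055 * linear ** (1 / 2.4) - 0.055

    # Middle grey (~18% in linear light) lands around 46% after encoding.
    for value in (0.01, 0.18, 0.5, 1.0):
        print(f"linear {value:.2f} -> sRGB {srgb_encode(value):.3f}")
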
> I'm also a bit curious if there's a camera (or camera phone) that perhaps always shoots ETTR, even if the user is unaware of it and never even sees the washed-out image, as in a P&S with no EV controls.

Exposure Compensation, anyone?

> You are living in a dream world. I also see evidence daily on these forums that most cameras are capable of producing straight-out-of-the-camera JPEGs that blow away the raw processing abilities of many who choose to do it themselves after the fact.

A world where a lot of people don't have a clue, or a creative idea, how to render their raws perhaps, or use inferior raw processing software they have no idea how to control.

Pure and simple.
> The short answer is that JPEG is an easier workflow, and many people are happy with the results....

Here we go again. Are people happy because they actually, properly tested both options, or are people who never tested this just somewhat lazy and ignorant, and happy that they're happy? Big difference.

> If raw is so completely overwhelmingly better, then why is JPEG on every camera ever made, including the flagship models?
...
> There are many situations where the results from shooting JPEG are more than "good enough".

How would you know if you didn't try the other process? Oh, speculate.

> The long answer:

The two should be connected at the hip.

I think you are confusing the concept of a "better tool" with "better results".
> JPEG shooters are quite capable of producing excellent quality results.

Could they do better? In many cases, especially with guidelines from those who've done some testing, yes. Much better quality, more flexible image data to produce an image.

> RAW shooters have more flexibility in post processing.

And that doesn't ever constitute better results (than JPEG shooters?).

> This gives them a wider range of options when processing the file.

And that doesn't ever constitute better results (than JPEG shooters?).

> If the in-camera JPEG processing is a good match for the scene being photographed, then there may not be an advantage to shooting RAW for that image.

It's not a good match unless you can colorimetrically measure the scene and what you produced from the rendering. You can't. So don't go there.

> If there is something in the image that the in-camera JPEG processing won't handle well, then there can be a tremendous advantage to shooting RAW.

And that doesn't ever constitute better results (than JPEG shooters?).

> Thus the assertion that RAW is "better" is not incompatible with the assertion that one can get excellent results from JPEG.

Who said you can't get excellent results from raw? The only way to KNOW you can do better is to properly capture the JPEG, then the raw, the latter being properly exposed.

> I don't think anyone can tell the difference.

Speak for yourself!

>> A basic assumption in this thread has been that one needs to absolutely maximize certain aspects of image quality.
> GIGO: Garbage In, Garbage Out!

Yes. Start with bad assumptions, and you will get bad conclusions.

> Why not maximize the image data quality? You CAN see a difference!

Maximizing image quality is not the same as making a visible difference.

> Start with bad assumptions, and you will get bad conclusions.

F for logic.

> The in-camera JPEG is either overexposed because you wish to ideally expose the raw, or the raw is underexposed because you ideally exposed the JPEG.

Although you don't address it, this brings up the question of why cameras are designed in this fashion...

> But most photographers prefer to shoot raw and do it afterwards because they can do a better job on a large-screen computer with a more powerful processor.

Do you have evidence to prove this?

> Yes. Start with bad assumptions, and you will get bad conclusions.
> Maximizing image quality is not the same as making a visible difference.

You CAN see a difference! I can see a difference.

> Although you don't address it, this brings up the question of why cameras are designed in this fashion...
> If ETTR is the strategy that provides the best quality, why don't cameras do this by default?

It does, for raw! You just have to understand the basics of CFAs, image capture, raw data and its encoding and, more importantly, how to properly expose for it.

> If the camera knows that the RAW has been exposed in this way, the camera can certainly adjust for that when it creates the in-camera JPEG.

The PHOTOGRAPHER knows, or should know. The camera always produces raw data. Always.

> As it is clearly possible to design a camera that uses ETTR and can correctly produce a JPEG from that RAW, why don't all cameras work this way?

ETTR is an old term for optimal exposure, for raw. Nothing more. If you treat the JPEG and raw identically in terms of exposure, you're being as silly as someone who, out of ignorance, treats ISO 100 transparency film and ISO 400 color negative film the same way.

> Clearly there must be a reason.

For some who don't understand the basics of photography and exposure? There is too little education and testing.

No one denies that there are images where ETTR can help. In the example you linked ETTR helped reduce shadow noise in a scene that had high dynamic range.
> Do you have evidence to prove this?

I meant to say most photographers on DPR. And my survey in another thread confirms this.

> No one denies that there are images where ETTR can help. In the example you linked, ETTR helped reduce shadow noise in a scene that had high dynamic range.

No one should deny it.

> However, this tells us nothing about the visibility of the difference for more mainstream images.

DO your own testing.

> It does, for raw! You just have to understand the basics of CFAs, image capture, raw data and its encoding and, more importantly, how to properly expose for it.
> When you buy film, there's no assumption that you're a rube with zero idea how to ideally expose for that media. You test. At least some of us in the old days did with film, just as we do with digital.

I'm sorry, I don't see how your answer addresses the question.

Your position is based on the premise that you know what people want, even when they have other ideas....
Who said you can't get excellent results from raw? The only way to KNOW you can do better is to properly capture the JPEG, then the raw, the latter being properly exposed. You can get an inferior raw if you also capture the JPEG the same way; that's a fact. You can get a boatload more DR from the raw than the camera JPEG! You can get a wider bit depth and color gamut. Those are the facts of the data!
...
>> If ETTR is the strategy that provides the best quality, why don't cameras do this by default?
> I'm sorry, I don't see how your answer addresses the question.

Good question!

If I understand your position, it is that one should use a different exposure strategy when one wants to maximize the quality of the information in the RAW file, as opposed to when one wants to maximize the quality of an in-camera JPEG.
The reason is that the in-camera JPEG processing will produce an image that is "too bright" if the RAW file is properly exposed.
My question was: why are modern cameras designed in this fashion? If the best quality is always obtained by ETTR, why doesn't the in-camera meter do this by default? The in-camera JPEG processing would know about this and take it into account.
If this is the best strategy in the general case, why aren't cameras designed this way?
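
For what it's worth, the mechanism the question assumes could be sketched roughly like this (purely hypothetical camera logic, not any real firmware): meter normally, push the exposure toward raw clipping by the measured highlight headroom, record the push, and have the JPEG engine pull its rendering down by the same amount so the out-of-camera JPEG still looks normal.

    # Hypothetical "ETTR by default" logic: expose up by the available highlight
    # headroom, and compensate the JPEG rendering down by the same amount.
    # Illustrative only; no particular camera's metering or pipeline is implied.
    def ettr_plan(highlight_headroom_ev: float) -> tuple:
        """Return (capture EV bias, JPEG rendering compensation)."""
        push = max(0.0, highlight_headroom_ev)
        return push, -push

    capture_bias, jpeg_comp = ettr_plan(highlight_headroom_ev=1.3)  # 1.3 EV is just an example figure
    print(f"capture at {capture_bias:+.1f} EV, render the JPEG at {jpeg_comp:+.1f} EV")

Whether the trade-offs of doing this automatically are worth it is, of course, the question being debated above.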