An interesting link for those who only shoot RAW

http://www.michaelfurtman.com/jpeg_myths.htm

It sounds like a fairly dated article, but he raises some interesting points. Possibly not enough to change anybody's point of view if they're a RAW-only shooter, but still worth a look.

This one's not a bad read either: Shoot JPEG instead of RAW

:-)
I only shoot RAW because I get the maximum image quality my camera can deliver and it is easier to use, e.g. it's a trivial fix if you get white balance wrong, and you can fiddle all day with your images non-destructively. I can create OOC JPEGs from the RAW files at any time using the camera manufacturer's software.

If I want JPEGs for the web I could shoot RAW + JPEG but I don't bother since it is so easy to convert RAW files to JPEG.

If I were a pro producing hundreds of images for clients who demand JPEGs, or if I hated computers and image-editing software, I might well think differently.
Clients do want JPEGs, not raw. They want an image they can use, not one they need to cook first.
 
Clients do want JPEGs, not raw. They want an image they can use, not one they need to cook first.
That depends on the client. Some insist on RAW only. Magazines like to do the preparation and cooking themselves rather than have it half-baked.

Danny.

--
Birds, macro, motor sports.... http://www.birdsinaction.com
Flickr albums ..... https://www.flickr.com/photos/124733969@N06/sets/
The need for speed ..... https://www.flickr.com/photos/130646821@N03/
 
The main points of this article are pretty much common knowledge to most of us who are serious digital photographers. I like to shoot RAW and do so almost 100% of the time. I like to process the photos, though... if I weren't doing that and wanted out-of-the-camera images, I know I really wouldn't be losing any perceivable quality with JPEG. If I shot hundreds of frames each time I went out with my camera, I'd be worried about storage space using RAW, but I don't do that... I shoot far less.

I know I can process from JPEGs too; before I got a camera that shoots RAW I did exactly that and, for the most part, got pretty good results. No doubt JPEGs have more latitude than film, as mentioned in the article. But do they have more latitude than RAW? I don't think so.

The bottom line is that often I don't know initially which direction I'm going to go in with any given image that I shoot. I might experiment a bit and find that I really want to push the exposure one way or another, or recover some detail from shadows, highlights, etc., or adjust for a color balance I'd mistakenly set incorrectly. RAW seems much better for this kind of thing, and the article didn't really refute that.

For someone who shoots deliberately and processes every image, there doesn't seem to be a benefit to shooting JPEG. For other kinds of shooters there isn't going to be a benefit to shooting RAW. So... why all this argument about one being a better way to go than the other?

--
my flickr:
www.flickr.com/photos/128435329@N08/
 
Many cameras are capable of producing excellent JPEGs, but if you think it's only about image quality then you are missing the point of shooting RAW. It's about having options. There is only so much you can do with a JPEG if you want to change the look. You can't uncook an overcooked JPEG, so if you decide your in-camera settings over-sharpened the scene you are screwed. I sometimes like to develop multiple versions of a photo: I might have a more muted version as well as a punchier version. And if you want to apply some aggressive adjustments, RAW will stand up to the processing long after a JPEG starts to fall apart. RAW also makes it possible to recover more detail from blown highlights or underexposed shadows. And as RAW converters improve, you can get even better results. I have re-processed some ten-year-old Nikon NEF files using Lightroom and gotten better results than I could get using ten-year-old software.
Well said... that's pretty much how I feel about it; it isn't so much about quality but more a question of greater options.
 
I will never shoot only JPEG because I don't trust any camera to get WB right all the time. Period. There are other reasons too, but that one alone is enough to keep me with RAW forever.
 
The point is that his JPEG pictures are nicer looking than 99% of the people I have met who shoot RAW. He's a great photographer and he shows that JPEG is a great format to shoot in.
 
JPEG just wastes good data. Let's say the camera starts with a 12 bit raw file read off the sensor. It then throws away 4 bits per pixel. Then it interpolates data and adds another 16 bits per pixel of made up color information. So now each pixel is 24 bits, 2/3 of which is false information. So it takes this huge amount of data and has to compress it down to be even smaller than the original raw.
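To put rough numbers on that accounting, here's a minimal sketch; the 12-bit sensor value and 8-bit-per-channel JPEG are the assumptions from the paragraph above, not a statement about any particular camera:

```python
# Back-of-envelope per-pixel bit accounting, following the argument above.
raw_bits_per_photosite = 12    # what the sensor actually measured (one colour per photosite)
jpeg_bits_per_channel = 8      # JPEG stores 8 bits per channel
jpeg_channels = 3              # R, G, B after demosaicing

bits_discarded = raw_bits_per_photosite - jpeg_bits_per_channel  # tonal precision thrown away
jpeg_bits_per_pixel = jpeg_bits_per_channel * jpeg_channels      # 24 bits per output pixel
bits_interpolated = jpeg_bits_per_pixel - jpeg_bits_per_channel  # demosaiced, not measured

print(f"discarded per photosite: {bits_discarded} bits")
print(f"JPEG pixel: {jpeg_bits_per_pixel} bits, of which {bits_interpolated} "
      f"({bits_interpolated / jpeg_bits_per_pixel:.0%}) are interpolated rather than measured")
# -> discarded per photosite: 4 bits
# -> JPEG pixel: 24 bits, of which 16 (67%) are interpolated rather than measured
```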

One time I took some sports pictures with a camera somebody had accidentally left on JPEG. The images were much worse, even leaving post-processing aside. You can see the compression if you look closely.
How close ?
If you crop to the size of a player from a picture taken at 200mm, and view on a monitor, you can see the compression effects.
 
I read the article. Too many technical errors suggest the author doesn't know what he's talking about. Beyond that, he misses the salient points. He provides a perfect photographic example of this in the raptor image he posted.
I concur with you, Ysarex. Not only is the article outdated, it's also inaccurate.

BillyInya, if you are happy with the settings your camera applies to your pics and don't intend to post-process them, JPG is perfect for everything: displaying, printing, etc.

My camera is far from smart in its settings. I'm seldom satisfied with its options. I use DxO OpticsPro to automatically correct all the defects and then manually tweak things my way. It's extremely fast. This is a RAW-only job. To me it's a no-brainer: I shoot 100% RAW.

Nick
 
1. Modern cameras (even quite cheap ones) have sensors with native DR of 12-13 stops (several have more and the trend is upwards) while virtually all makers restrict their JPG output to about 9 stops. It's true that there are plenty of times when the scene DR is 9 stops or under, but also plenty where it's over 9 stops. Shooting JPG is throwing away up to 1/3 of the luminance data that you bought your camera to record.
That's interesting. I need to correct highlights and shadows on most of my shots, hence retrieving data that are present in the RAW files. I'm constantly wondering why this idiot (my camera) clipped them. Is there a rationale why camera makers downgrade the JPGs compared to what the sensor actually recorded from the scene, using only part of the DR? Photos would look nicer with good looking skies and more detailed shadows. Do they want to make them bland?
2. Colour fidelity. This is partly related to the DR factor but also reflects the fact that having 12- (or 14-) bits of information allows more precise description of colours. It's true that eventually the output will fall to 8-bits (for most screens and printers) but all editing degrades fidelity so we want as little degradation as possible.
That's why I export JPGs or TIFs only for publishing, and very often trash those files. It's the RAWs I keep, save and view on screen. It's not a problem to print them directly either. I've been very happy doing that for about a decade. Today I re-process some shots I took long ago, since my technique and my tools have improved a lot.

There would be no way to do that with JPGs. Disk space is not an issue.

Nick
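
As an aside, a quick back-of-envelope check of the figures in the points quoted above: the "up to 1/3 of the luminance data" claim in point 1 and the bit-depth point in point 2. The 13-stop and 9-stop numbers are taken from that quote, not measured from any camera; a minimal sketch:

```python
# Point 1: stops of dynamic range captured by the sensor vs. kept by a typical
# in-camera JPEG tone curve (both figures as assumed in the quoted post).
sensor_stops = 13
jpeg_stops = 9
fraction_lost = (sensor_stops - jpeg_stops) / sensor_stops
print(f"stops discarded: {sensor_stops - jpeg_stops} of {sensor_stops} (~{fraction_lost:.0%})")
# -> stops discarded: 4 of 13 (~31%), i.e. roughly "up to 1/3"

# Point 2: distinct tonal levels per channel at different bit depths.
for bits in (8, 12, 14):
    print(f"{bits:>2}-bit: {2 ** bits:>6} levels per channel")
# ->  8-bit:    256 / 12-bit:   4096 / 14-bit:  16384
```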
 
The point is that his JPEG pictures are nicer looking than 99% of the people I have met who shoot RAW. He's a great photographer and he shows that JPEG is a great format to shoot in.
Well stated.
 
Any article that starts with the title "the real truth" is stupid and a waste of time.

When it comes to topics such as RAW vs JPG, there is no real truth. There are only opinions and personal preferred workflows. So when someone calls his opinion "the real truth" it is worth nothing.

Moti

--
http://www.pixpix.be
http://www.musicalpix.com
 
Any article that starts with the title "the real truth" is stupid and a waste of time.

When it comes to topics such as RAW vs JPG, there is no real truth. There are only opinions and personal preferred workflows. So when someone calls his opinion "the real truth" it is worth nothing.
Moti has spoken! We have the real truth from Moti which is that there is no real truth!
 
Any article that starts with the title "the real truth" is stupid and a waste of time.

When it comes to topics such as RAW vs JPG, there is no real truth. There are only opinions and personal preferred workflows. So when someone calls his opinion "the real truth" it is worth nothing.
Moti has spoken! We have the real truth from Moti which is that there is no real truth!
Maybe you should learn to read and try not to take things out of context before making silly comments, because I didn't say anything like that.

What I said was: "When it comes to topics such as RAW vs JPG, there is no real truth, there are only opinions and preferred workflows." Sorry that you are not able to understand the difference.

Moti
 
Any article that starts with the title "the real truth" is stupid and a waste of time.

When it comes to topics such as RAW vs JPG, there is no real truth. There are only opinions and personal preferred workflows. So when someone calls his opinion "the real truth" it is worth nothing.
Moti has spoken! We have the real truth from Moti which is that there is no real truth!
Maybe you should learn to read and try not to take things out of context before making silly comments, because I didn't say anything like that.

What I said was: "When it comes to topics such as RAW vs JPG, there is no real truth, there are only opinions and preferred workflows." Sorry that you are not able to understand the difference.
I was in context - never left the subject. And I understood everything you said - so no need to feel sorry for me.

Again, you said that "when it comes to raw vs. jpg, there is no real truth..." That is your real truth - that there is no real truth (with respect to jpg vs. raw). So you have established that there is an absolute and this absolute is that there is no absolute. It therefore follows that no one can have an opinion about raw vs. jpg that is more or less truthful or factual than another.

Unfortunately, this is not correct. There are absolutes about raw vs. jpg which can be demonstrated outside of opinion and preferred workflows. How you prefer to work and what you care about - that is in the realm of opinion for sure. But there are some very real and objective truths in this subject space.

What I was pointing out is that you are applying a very specific set of terms to a broad and subjective part of the conversation, and this ultimately leads to logical fallacy. I.e., you are mixing terms. There IS real truth and there ARE opinions and preferred workflows at the same time. These do not exclude each other.

And back to the article - his real truth is junk. Most of us are plainly seeing things in his own images that he claims aren't there. In fact, there is a lot of bunk in his article that I find laughable.
 
Any article that starts with the title "the real truth" is stupid and a waste of time.

When it comes to topics such as RAW vs JPG, there is no real truth. There are only opinions and personal preferred workflows. So when someone calls his opinion "the real truth" it is worth nothing.
Moti has spoken! We have the real truth from Moti which is that there is no real truth!
Maybe you should learn to read and try not to take things out of context before making silly comments, because I didn't say anything like that.

What I said was: "When it comes to topics such as RAW vs JPG, there is no real truth, there are only opinions and preferred workflows." Sorry that you are not able to understand the difference.
And back to the article - his real truth is junk. Most of us are plainly seeing things in his own images that he claims aren't there. In fact, there is a lot of bunk in his article that I find laughable.
Oh I see, so this is YOUR real truth... Wow, what do they call it? Pot, kettle, black?

I just can't figure out why you replied in such a silly, pseudo-philosophical and rude way, when actually we share the same opinion about the article.

Moti

--
http://www.pixpix.be
http://www.musicalpix.com
 
1. Modern cameras (even quite cheap ones) have sensors with native DR of 12-13 stops (several have more and the trend is upwards) while virtually all makers restrict their JPG output to about 9 stops. It's true that there are plenty of times when the scene DR is 9 stops or under, but also plenty where it's over 9 stops. Shooting JPG is throwing away up to 1/3 of the luminance data that you bought your camera to record.
That's interesting. I need to correct highlights and shadows on most of my shots, hence retrieving data that are present in the RAW files. I'm constantly wondering why this idiot (my camera) clipped them. Is there a rationale why camera makers downgrade the JPGs compared to what the sensor actually recorded from the scene, using only part of the DR?
I don't know this for certain, but here's what I believe the explanation is.

The final output picture is seen on a medium, whether it be paper, screen or whatever, that has a relatively low tonal range. Put this another way - the contrast between the darkest point on the picture and the brightest is quite small.

Our eyes don't register tonal range as such; they work by detecting differences between neighbouring sensors (rods and/or cones). Those differences are, of course, contrast. What matters most to us looking at a scene is local contrast in quite small patches of the total scene. Our brains blend those patches into apparently seamless whole views.

Come back to photographs, though, and (unless we resort to selective editing in PP) we can't control local contrast but only global contrast. Now, if the top and bottom of the tonal range are fixed - as the output device makes them - there's only a given amount of global contrast available. Spread the wide DR (= tonal range) of a whole scene into the narrow tonal range of the output and at any point the local contrast is reduced.

As a general rule we tend to notice the mid-tones more than the tonal extremes; that's why all conversions from raw data need a tone response curve that is some sort of S-shape. Experience has shown that for the viewing devices usually available a DR in the image (file or negative) works well if it is held to about 9EV.
Photos would look nicer with good looking skies and more detailed shadows. Do they want to make them bland?
No; it's actually the other way round. What they want is for most scenes to look contrasty - the opposite of bland - so they compress the total DR to achieve this. If we put the clock back a bit, film is quite tolerant of overexposure of the skies so many photos survived the narrow tonal range of film and paper at a relatively modest cost - shadow areas became blocked.

This was so common for so long that many people came to accept that photos miss shadow detail (my sister-in-law once complained that it's wrong to open shadows because we don't see that way). Early digital cameras also had narrow DR, so this way of seeing things was perpetuated.

It's only quite recently that the ability to get decent detail in shadows and good colours in skies at the same time as acceptable mid-tone contrast has become possible, and then only by using different tone curves than typical in-camera JPG.
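
For anyone curious what that looks like in practice, here's a minimal sketch of the idea: a generic S-shaped curve (a smoothstep over log exposure) squeezing a wide scene range into a fixed 9-stop window, with everything outside the window clipping. It's an illustration of the principle described above, not any manufacturer's actual JPEG engine:

```python
def s_curve_tone_map(scene_ev, dr_kept=9.0):
    """Map a scene value (in EV relative to middle grey) to an output tone in 0..1.

    Values more than dr_kept/2 stops from middle grey clip to black or white;
    inside the window a smoothstep gives the S-shape: steep (contrasty) in the
    mid-tones, gentle toward the extremes.
    """
    t = (scene_ev + dr_kept / 2) / dr_kept   # normalise the kept window to 0..1
    t = min(max(t, 0.0), 1.0)                # clip what falls outside the window
    return t * t * (3 - 2 * t)               # smoothstep -> S-shaped response

# A wide scene (about +/-6 EV around middle grey) pushed through a 9-stop curve:
# anything beyond +/-4.5 EV clips, which is exactly the highlight and shadow
# detail a raw file still holds on to.
for ev in range(-6, 7):
    print(f"{ev:+3d} EV -> {s_curve_tone_map(ev):.2f}")
```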
 
Any article that starts with the title "the real truth" is stupid and a waste of time.

When it comes to topics such as RAW vs JPG, there is no real truth. There are only opinions and personal preferred workflows. So when someone calls his opinion "the real truth" it is worth nothing.
Moti has spoken! We have the real truth from Moti which is that there is no real truth!
Maybe you should learn to read and try not to take things out of context before making silly comments, because I didn't say anything like that.

What I said was: "When it comes to topics such as RAW vs JPG, there is no real truth, there are only opinions and preferred workflows." Sorry that you are not able to understand the difference.
And back to the article - his real truth is junk. Most of us are plainly seeing things in his own images that he claims aren't there. In fact, there is a lot of bunk in his article that I find laughable.
Oh I see, so this is YOUR real truth... Wow, what do they call it? Pot, kettle, black?

I just can't figure out why you replied in such a silly, pseudo-philosophical and rude way, when actually we share the same opinion about the article.
Yes, we do agree on that. This guy is unbelievable.

Sorry for coming across as rude. I was just pointing out that it's not so much about truth vs. non-truth when we get down to the JPG vs. raw argument. It's more about what you prefer. It can't be proven that JPG "is as good as raw" - that's a broad statement sure to get people fired up. It can be proven that "JPEG is good enough for..." or "raw allows for better..." (for example).
 
JPEG just wastes good data. Let's say the camera starts with a 12 bit raw file read off the sensor. It then throws away 4 bits per pixel. Then it interpolates data and adds another 16 bits per pixel of made up color information. So now each pixel is 24 bits, 2/3 of which is false information. So it takes this huge amount of data and has to compress it down to be even smaller than the original raw.

One time I took some sports pictures with a camera somebody had accidentally left on JPEG. The images were much worse, even leaving post-processing aside. You can see the compression if you look closely.
How close ?
If you crop to the size of a player from a picture taken at 200mm, and view on a monitor, you can see the compression effects.
This makes no sense whatsoever without seeing how much of the original frame the player was occupying. You can fill the frame with the player if you're close enough, but if you're some distance off the player could be minute in relation to his surroundings, so how much of a crop are you talking about here? Care to show the picture before cropping?
 
"One of the great myths in digital imaging – adopted as gospel by both photographers and editors – is that JPEG images are so inferior to RAW as to make these images unsuitable for professional work."

Except I don't think I've ever heard anyone say jpegs aren't good enough for professional work.

- Dennis
 
The point is that his JPEG pictures are nicer looking than 99% of the people I have met who shoot RAW.
Well Billy, I can't speak for any of the other people you've met who shoot RAW, but, as for me, I'm not much to look at. I'm old and unkempt with a scraggly beard. My wife is always complaining about my appearance. I go out in sweat pants with my shirt tails hanging out and she says I embarrass her. I'm way overdue for a haircut. So yep, Furtman's photos are nicer looking than I am. You're right about that.
 
