malch
Guest
"no, i am not wrong. but you are not quite right either. the fact is that a jpeg CAN be just as good as a raw."

Can be, in theory. Almost never is, in reality.
This isn't hard. The camera sensor and ADC capture a whole bunch of data. That data is what goes into a RAW file.
The data can also be used to generate an in-camera JPEG. However, that's performed by an algorithm with limited adjustments. Furthermore, those adjustments can only be set before you trigger the shutter. It also involves an immediate and irreversible reduction to 8 bits of precision. Data are permanently lost and that is a mathematical fact.
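The irreversibility is easy to demonstrate. A minimal sketch (assuming a hypothetical 12-bit sensor, which is typical but not stated in the post): once consecutive sensor levels are mapped onto the 8-bit range, formerly distinct values collapse together, and nothing downstream can recover the difference.

```python
def to_8bit(value_12bit):
    # Map a 12-bit level (0..4095) onto the 8-bit range (0..255).
    return value_12bit * 255 // 4095

# Sixteen consecutive 12-bit midtone levels...
levels_12bit = list(range(2000, 2016))
levels_8bit = [to_8bit(v) for v in levels_12bit]

print(len(set(levels_12bit)))  # 16 distinct values going in
print(len(set(levels_8bit)))   # far fewer distinct values coming out
```

Roughly sixteen 12-bit levels land on each 8-bit level, which is exactly the "data are permanently lost" point above.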
When you shoot RAW, more adjustments are possible, they're possible after the fact, and greater precision can be retained throughout the process until you finally drop back to an 8-bit JPEG.
You may be quite happy with your in-camera JPEGs. They may be outstanding, award-winning photographs. And if you (and maybe your "customers") are happy with them, that is just hunky dory. But as a matter of physics and math, the RAW file can virtually always be used to create a "better" image, assuming sound post-processing skills.
Just consider things like noise reduction and sharpening. The algorithms available within, say, CS4 are substantially better than those implemented in the camera firmware. And when shooting RAW they can be applied to the original data at full precision, versus data that has already been reduced to 8 bits and subjected to lossy compression.
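The precision point can be sketched too. This hedged example (bit depths and the tone curve are illustrative assumptions, not anything from a real camera or CS4) applies the same strong shadow-brightening curve to 12-bit data and to data already quantized to 8 bits; the 8-bit path comes out with far fewer distinct output levels, which is what posterization/banding looks like in practice.

```python
def stretch_shadows(x, in_max, out_max=255):
    # A simple gamma-like brightening curve applied per pixel value.
    return round((x / in_max) ** 0.5 * out_max)

shadow_12bit = range(256)  # a deep-shadow region at 12-bit precision
shadow_8bit = [v * 255 // 4095 for v in shadow_12bit]  # same region after 8-bit quantization

out_from_12 = {stretch_shadows(v, 4095) for v in shadow_12bit}
out_from_8 = {stretch_shadows(v, 255) for v in shadow_8bit}

print(len(out_from_12), "distinct levels when adjusting the 12-bit data")
print(len(out_from_8), "distinct levels when adjusting the 8-bit data")
```

The adjustment itself is identical in both branches; only the precision of the input differs, and that alone determines how smooth the result can be.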
I'll accept that your in-camera JPEGs are simply wonderful: enough to make Ansel Adams cry. But if you shot RAW and processed them carefully, they could be better.