Skipper494: All very fine, but in film days we got grain from particles in the film, nothing from the film itself. As an engineer with quite a lot of experience in sensor design, I can assure you that most noise comes from the proximity of photosites, from wiring, or even from poor software, which is why lower-resolution large sensors have less noise and more dynamic range.
Compare the Nikon D700 and the Sony NEX-7, for instance. Consider how good the Fuji S2 Pro was in its day. Or compare a lower-resolution 1/1.7" sensor with a higher-resolution 1/2.33" one, such as the Nikon P7000 with the Panasonic FZ35.
Frankly, too much emphasis is put on high numbers and marketing, instead of good photography results.
@Mike Engles: If you understand what tone reproduction is, you can easily see that no gamma is needed for vision (other than a very mild one to compensate for flare), as the original scene has no gamma either. You are misled by the poor wording of "Human vision, under common illumination conditions ... , follows an approximate gamma or power function".
57even: One big problem with ETTR - channel clipping. The histogram may tell you when the JPEG is clipping, but it usually uses the green channel. If you have a three-colour histogram, it's easier.
> If you have a three-colour histogram, it's easier

No, it is not - the green channel normally clips first.
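The "green clips first" point can be sketched numerically. The channel responses below are made-up illustrative values for a neutral daylight subject, not measurements from any camera; on most Bayer sensors green is simply the most sensitive channel, so it reaches raw saturation at the lowest exposure:

```python
# Hypothetical relative raw responses to a neutral daylight subject.
# On most Bayer sensors the green channel is the most sensitive.
RESPONSE = {"R": 0.55, "G": 1.00, "B": 0.70}
CLIP = 1.0  # normalized raw saturation level

def exposure_to_clip(channel):
    """Relative exposure at which a channel reaches raw saturation."""
    return CLIP / RESPONSE[channel]

first_to_clip = min(RESPONSE, key=exposure_to_clip)
print(first_to_clip, exposure_to_clip(first_to_clip))  # prints "G 1.0"
```

So a luminance-only (or green-weighted) histogram already tracks the channel that clips first in the raw data; the three-colour histogram mainly helps spot red or blue clipping on strongly coloured subjects.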
@Mike Engles> gamma 2.2 (this needed for our vision)
We see nature as it is, at gamma = 1. Gamma 2.2 serves as compensation for CRT tubes, which have their own gamma.
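The compensation point is easy to verify numerically: a scene-linear value encoded with a 1/2.2 power and then displayed by a CRT whose response is roughly a 2.2 power comes back out linear, so the end-to-end system gamma is 1 (the sample luminances below are just illustrative):

```python
# Sketch of the gamma round trip: scene-linear light is encoded with a
# 1/2.2 power for the display, and a CRT's ~2.2 power response undoes it,
# so the viewer sees (approximately) linear light again.
scene_linear = [0.0, 0.18, 0.5, 1.0]  # relative scene luminances (18% = midtone)

encoded = [v ** (1 / 2.2) for v in scene_linear]  # camera/encoder side
displayed = [v ** 2.2 for v in encoded]           # CRT's native response

for s, d in zip(scene_linear, displayed):
    assert abs(s - d) < 1e-6  # the two gammas cancel: end-to-end gamma = 1
```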
cpt kent: Seems to be a lot of folks exclaiming brilliance without questioning. Sources? References? Research? Further reading?
Shot noise is fundamental physics. See the works of Walter Schottky on vacuum tubes, Campbell's theorem, the experiments of Ernest Rutherford and Hans Geiger, the mathematics of Harry Bateman, etc.
It is rather amazing that such a well-known phenomenon, 100+ years after it was discovered, described, and quantified, is so often overlooked.
> I assume you mean most electronic noise, 'cause most of the noise comes from the light itself
Shot noise is not only photon shot noise, but junction shot noise too ;)
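The square-root behaviour behind all of this can be checked with a quick Poisson simulation (a sketch with made-up photon counts, not data from any sensor): SNR grows as the square root of the mean count, so quadrupling the light roughly doubles the SNR.

```python
import math
import random
import statistics

def poisson_sample(rng, lam):
    """Draw one Poisson-distributed photon count (Knuth's algorithm)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        p *= rng.random()
        k += 1
    return k - 1

def snr(mean_photons, trials=5000, seed=42):
    """Signal-to-noise ratio of simulated photon arrivals per pixel."""
    rng = random.Random(seed)
    counts = [poisson_sample(rng, mean_photons) for _ in range(trials)]
    return statistics.mean(counts) / statistics.stdev(counts)

# Shot noise: SNR grows as sqrt(signal), so 4x the photons -> ~2x the SNR.
ratio = snr(400) / snr(100)
assert 1.8 < ratio < 2.2
```

This is exactly the statistics Schottky described for electrons in a vacuum tube; photons at a photosite obey the same arrival process.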
falconeyes: Interesting and important article.
However, it should have used fewer words. The article makes a simple matter look more complicated than it really is, and that may discourage some from reading it.
Everybody who thinks that noise is (mostly) a camera artefact should read the article, though.
You're right. It starts with per-column black levels. This was obvious already on the original 5D. But it does not end there.
Anastasiadis Lazaros Thessaloniki Wedding: I read the article and I personally don't agree with a lot of things. Raindrops inside tubes don't behave like light in photos; sensor size does not matter; the technology and quality of the sensor do not depend on its size. This article tries too hard to impress the amateur photographer with magic tricks, but it actually offers nothing new - except maybe some confusion about how you should take photos without much noise.
The original objects used to study this type of noise were shot towers (used to manufacture lead shot), so, yes, the falling-drop analogy works ;)
RichRMA: I thought "output size" was determined by pixel count?
"In turn, this is why we talk about different sensor sizes representing an image quality/size/price balance: because, so long as the sensor's electronic performance is similar, the effect of shot noise means that sensor size is the major determinant of image quality. Yes, pixel count can make some difference, but shot noise tends to play a much bigger role, if you compare images at the same output size."
If you cut a FF sensor in 4 parts, essentially making it a 4/3rds size sensor, the individual pixels will still render the same noise as when it was a full-frame sensor. Overall light collection does not affect the response, since the pixel size does not change, and therefore the noise will not change. You could do an experiment by inserting a physical cropping mechanism in front of the FF sensor and see this is the case. All that will happen is that the angle of view of the scene will be cut in half, which has no impact on noise.
Yes, it needs to be done with raw data out of the camera. There are also other ways to compare noise, but those methods are not directly visual.
Upsizing a demosaicked image is not a convincing procedure. You can try upsizing binned images.
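The quoted passage's "same output size" comparison can be simulated directly with shot noise alone (a sketch with made-up numbers: at a common output size, one "full-frame" output pixel averages four sensor pixels' worth of light, while a "crop" output pixel gets one pixel's worth):

```python
import math
import random
import statistics

def poisson_sample(rng, lam):
    """One Poisson-distributed photon count (Knuth's algorithm)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        p *= rng.random()
        k += 1
    return k - 1

rng = random.Random(0)
MEAN_PHOTONS = 200  # hypothetical mean photons per sensor pixel, same exposure
TRIALS = 4000

# Crop sensor: at the common output size, one output pixel maps to one
# sensor pixel's worth of collected light.
crop = [poisson_sample(rng, MEAN_PHOTONS) for _ in range(TRIALS)]

# Full frame (4x the area): at the same output size, each output pixel
# averages four sensor pixels, i.e. 4x the total light contributes.
full = [sum(poisson_sample(rng, MEAN_PHOTONS) for _ in range(4)) / 4
        for _ in range(TRIALS)]

snr_crop = statistics.mean(crop) / statistics.stdev(crop)
snr_full = statistics.mean(full) / statistics.stdev(full)

# 4x the light -> about 2x the SNR at the same output size
assert 1.8 < snr_full / snr_crop < 2.2
```

Per individual pixel the noise is indeed identical, as the comment above says; the difference only appears once both images are brought to the same output size, because the larger sensor integrated more total light per output pixel.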
Iliah Borg: > your camera's exposure meter and exposure guides are all based around correctly exposing the midtones of JPEGs
I would use "correct placement on the brightness scale" maybe instead of "exposing". Technically, the current fashion for tone reproduction is like this: start the shoulder at L*≈70, but place the skin tone (L*≈65) where it belongs - that is, the accurate reproduction starts at skin tone, which is higher than midtone.
Tone means brightness here. Any sane generic tone curve is designed in such a way that "correct exposure" results in skin brightness being as it is expected.
I'm trying to say that "correctly exposed" is in fact "correct brightness".
Yes, ISO standard insists on 118 on sRGB scale for midtone placement.
Out of necessity, all generic tone curves need to work with rated ISO so as to place skin tone (starting at L* close to 70, with the "darker than light" being 67) close to where it belongs.
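The L* figures in this sub-thread map onto the 8-bit sRGB scale via the standard CIE L* and sRGB (IEC 61966-2-1) transfer curves. A sketch for checking the numbers, nothing camera-specific:

```python
def lstar_to_linear(L):
    """CIE 1976 L* -> linear relative luminance Y in 0..1."""
    return ((L + 16) / 116) ** 3 if L > 8 else L / 903.3

def linear_to_srgb8(Y):
    """Linear luminance -> 8-bit sRGB value (IEC 61966-2-1 encoding)."""
    v = 12.92 * Y if Y <= 0.0031308 else 1.055 * Y ** (1 / 2.4) - 0.055
    return round(255 * v)

# Midtone: L* ~ 50 (about 18% reflectance) encodes to ~119, in line with
# the 118 midtone placement the ISO standard calls for.
# The skin-tone figures quoted above sit noticeably higher on the scale:
for L in (50, 65, 70):
    print(f"L* {L} -> Y {lstar_to_linear(L):.3f} -> sRGB {linear_to_srgb8(lstar_to_linear(L))}")
```

Running this puts L* 65 around sRGB 158 and L* 70 around 171, which is why accurate skin-tone placement sits well above the 118 midtone.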
zsedcft: I was under the impression that ETTR is often not a good idea. If the subject is the brightest thing in the scene (a wedding dress or clouds, for example), exposing so the wedding dress is almost clipping is a bad idea. The closer you get to clipping, the fewer tones you have available for highlights. I use ETTR on my EOS M for timelapses, but I avoid the very top of the histogram when I am actually holding the camera. It may not happen as much with 14/16-bit RAW files, but I believe it is still an issue.
(If you can afford it) just get a D800/D800E/D810/A7R and you will have a camera that you can shoot at ISO 100 and recover everything in post-processing. I have found that ISO 100 with the exposure pushed in Lightroom looks the same as ISO 1600 from the camera. Shadow noise is always better than blown highlights IMO, so I am usually pretty conservative with exposure. If you want the best of both worlds in landscape, just shoot an exposure bracket and pick the best frame or blend them.
For ETTR to work well, the clipping point needs to be within 3% of linear response; the in-camera raw data compression (if lossy) needs to be independent of the raw values; and the converter needs to apply exposure compensation in a linear fashion.
James Bligh: The DPReview D750 review does not say a single word about the internal reflection of the D750. I cannot help but say the review is flawed, given that Iliah Borg has said the following.
(Quote) This issue is too easy to trigger, and when asked which camera to buy (that's the season, you know) - I can't say "D750". (Unquote)
> This issue is too easy to trigger
Did I say - "on every D750 camera"?
Mikael Risedal: Why can't that be true? Raw is raw, and as long as there is no clipping in any channel you can do the WB later on against, for example, a white or grey surface - which is done anyway.
And again, there are other solutions than Adobe's, like www.qpcard.com, which is easier, cheaper, faster and better.
First, it depends on the balancing algorithm, some will cause clipping where raw data is not clipped.
Second, the starting point for the exposure when one shoots for calibration and profiling is what an incident exposure meter tells you, and that is far from clipping. One can bracket up from there and compare results - preferably using numbers, not eyeballing, as we are talking calibration and profiling here.
Third, close to clipping point is where one can easily run into non-linearities, and skew the profile.
I have a QPCard and see no magic in it. It glares in daylight, it is not durable in the field, and I do not see it providing better accuracy. I wonder how it is easier. Speed is irrelevant. The price is not significantly lower; replacing it twice as often as the Passport eats up any savings.
If you wish I can collect some QPCards and measure them to see if the manufacturing tolerances and aging affect accuracy. It is easy with a spectrophotometer ;)
john Clinch: Do different cameras of the same make and model see color differently? If they do, why? I've never heard of sample variation between cameras, AF fine-tune aside.
A 6% difference, easily. No two sensor batches are exactly the same.
Steve, the following needs a complete revision: "As I am capturing raw files, I will have the camera set to the AdobeRGB color space and not sRGB, which is better suited to JPEG shooting. AdobeRGB gives a wider color gamut, and is the best option for raw images that will subsequently be edited on a computer. You should use the same color space for both the calibration shot and subsequent images which will use the same profile."
The colour space setting in the camera does not affect raw data.
Kuturgan: Why not show today's Sochi? It is not the same city as it was 5 years ago.
Here is today's: http://www.youtube.com/watch?feature=player_embedded&v=z574CEtM6LY
More on Sochi http://nashe-nasledie.livejournal.com/1508962.html