Why doesn't medium format have high ISOs?

Film kept getting better, and soon people were actually shooting weddings and serious work on 35mm. The "revolution" came in 1961, when Vietnam got hot: American and European journalists hit the place with MF gear and saw what the Japanese were doing with the new Nikon F.

Next thing you know, Blads and Konis are littering the beaches, and the entire field of wartime journalism has converted to 35mm.
Are you certain of your history here? I was recently thumbing through "This is War!," by David Douglas Duncan, surely one of the most amazing war photographers ever to have hit a battlefield. He embedded himself with marines in Korea, and the photography is absolutely stunning. The last page of the book goes into photographic details, and he states that he used two Leicas and had replaced his German lenses with those new Nikkors made in occupied Japan. He said that by this point (the war was ongoing when the book was published), every photographer who came through Japan on his way to Korea was using Nikkors.

Korea is often mentioned as the turning point in Japan's economic recovery, and the birth of the Nikon legend. Duncan sure makes it sound like 35mm dominated the Korean battlefield, a full decade before we got involved in Vietnam.

Quotes from the book:

"The photographs in the book were made with a Leica IIIC, 35mm camera. During assignments two of these Leicas were carried, one on each side of my body, slung from their leather straps which went around the neck and crossed like ammunition bandoliers in front of my chest. All my rolls of film were in my back-pack, along with a toothbrush, bar of soap, bottle of insect repellent, single blanket, extra pair of socks and a waterproof poncho..."

"...The reason for two Leicas being used was fundamental -- one was fitted with the standard 50mm lens, and the other with a telephoto. ... My Leicas were used around the battlefields of Korea under almost every imaginable handicap -- from the humid, dust-soaked summer months, to the unforgettably cold days of winter near the Changjin Reservoir. The cameras kept working perfectly, even after the film itself started breaking during winding -- just from cold."

"Every photograph in This is War! was taken with a Leica camera, but fitted with Nikkor lenses . . . made in occupied Japan. Prior to the outbreak of the Korean War, Horace Bristol, former Life and Fortune photographer now living in Tokyo, and I began experimenting with the whole new line of Nikkor lenses, made by the Nippon Optical Company, Tokyo, and discovered, to our utter amazement, that their three standard lenses for 35mm cameras were far superior, in our opinions, to any standard 35mm lenses available on the open market -- British, American or German."

"As the Korean War progressed, and other magazine and newspaper photographers arrived in Tokyo, the reputation of the new lenses spread until, within a matter of only three months, there was scarcely a photographer working out of Japan who was not using Nikkors on his cameras."

--
http://models.stevemelvin.com
 
There are always some people who are against medium format. I enjoy it. I have three of them besides my D3X and D3. For me there is no better detail, and especially color, than what a modern MF system delivers. It puts my D3X to shame. But what do I know? I'm just a photographer, not a gearhead! Let others here obsess over tech stuff that nobody wants to hear or needs to take an image. Wasted time, my friend...

BTW, have you seen the files from the new H4D-40? I was blown away. Better than what the 31 can deliver. Check it out..
But that's not the real issue.

Rayman said his images weren't at 100%. Well, that's the neatest thing about digital imaging, the more you downsize the images, the better the high ISO looks. A Nikon D3, at web size 1024x768, about 1/4 size, looks insanely clean at ISO 125,000.
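The effect described here is just averaging: downsizing pools many noisy pixels into one, and independent noise shrinks by the square root of the number of pixels pooled. A minimal numpy sketch (the signal and noise levels are made-up illustrative numbers, not measurements from any camera):

```python
import numpy as np

rng = np.random.default_rng(0)

# Uniform grey target with additive Gaussian "sensor" noise.
signal = 100.0
noisy = signal + rng.normal(0.0, 10.0, size=(1024, 1024))

# Downsize by averaging 4x4 blocks (roughly a 1/4-size web view).
small = noisy.reshape(256, 4, 256, 4).mean(axis=(1, 3))

snr_full = signal / noisy.std()
snr_small = signal / small.std()
# Averaging n=16 independent pixels cuts the noise sigma by
# ~sqrt(16)=4, so the small image's SNR is roughly 4x higher.
print(round(snr_small / snr_full, 1))  # ~4
```

That factor-of-four SNR gain (two stops' worth) is why web-sized crops flatter any camera's high ISO.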

Print the images, side by side, at a decent size. That's all you need to do to see whether MF's more pixels and allegedly "better pixels" actually makes a difference. I've done this, MF and FF, on the same shoot.

I'm convinced that the fact that more MF shooters and image customers don't try side-by-side comparisons is the only thing that keeps Blad and P1 in business.

--
Rahon Klavanian 1912-2008.

Armenian genocide survivor, amazing cook, scrabble master, and loving grandmother. You will be missed.

Ciao! Joseph

http://www.swissarmyfork.com
Joseph, it looks like you haven't looked at the pictures carefully ... you would have seen that they are larger than 100%, not smaller.

I did it this way BECAUSE everything smaller looks OK on screen, and I wanted to show the difference between the systems.
MF isn't bad up to ISO 1600 if you choose the right back or camera.
I chose the 31 on purpose because of that.
But I'm with you that MF has a lot less funding for R+D.
The format is still bigger and has some advantages.....
Peter
 
Thank you, Joseph, for your usual core dump. I appreciate the detailed knowledge you contribute which is not otherwise widely available. It's all the little extras that move things off track.

The "interline" structure permits parallel processing of capture and data transfer. The net result is faster processing where it counts - repeated capture - even if the processing speed is somewhat slower. Computers work the same way. Faster processor clocks do not necessarily mean faster processing. The number and quality of the steps are usually more important.

The Nikon D1x and D100 employ electronic shutters for all speeds higher than 1/250, which cannot be accomplished with a "full frame transfer" sensor. If not "interline", then what? (Full-frame in this context refers to the architecture, not physical size relative to the camera format.)

You correctly observe that MFD backs are designed for 16-bit imaging, which requires more processing and (probably) hand-selection and tuning of components. This is the core of my argument that MFD backs are designed for optimum image quality (at the expense of speed), which you oddly deny in your opening paragraph.

CCDs do run hotter than CMOS sensors. For example, Sony, in the description of their 1/2" EX1 and EX3 cameras, chose to use CMOS rather than CCD sensors to reduce heat build-up. Most MFD backs use some form of active heat dissipation - my CFV has a fan, and many backs use thermoelectric cooling.

DPReview and the Luminous-Landscape site refer to the traditional step-wedge approach to measuring useable dynamic range. The image is useable when there is sufficient contrast to distinguish one test patch from the next. DxO measures dynamic range from the noise floor to the maximum output, which renders the real difference between MFD and small-format sensors insignificant. You can have a useable image in the presence of noise (remember Tri-X and grain?), but there is not as much contrast at low levels in a small-format sensor (or with color negative film) as with MFD.
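The gap between the two definitions can be shown with a back-of-the-envelope calculation. The sensor numbers below (full well, read noise, SNR threshold) are invented for illustration, not measurements of any real back:

```python
import math

full_well = 60000.0   # electrons at clipping (illustrative)
read_noise = 15.0     # electrons RMS (illustrative)

# DxO-style "engineering" DR: noise floor up to clipping.
eng_dr = math.log2(full_well / read_noise)

# Step-wedge-style "useable" DR: the darkest patch must still
# reach some minimum SNR (here 5:1) against shot + read noise.
def snr(signal):
    return signal / math.sqrt(signal + read_noise ** 2)

floor = 1.0
while snr(floor) < 5.0:
    floor *= 1.05
useable_dr = math.log2(full_well / floor)

print(round(eng_dr, 1), round(useable_dr, 1))  # ~12 vs ~9.4 stops
```

The "useable" figure comes out a couple of stops lower than the engineering figure for the same sensor, which is why the two measurement camps talk past each other.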

How does metamerism apply to digital imaging? With prints, on the other hand, the change in color depending on the viewing light constitutes a FAILURE of metamerism.

There is no color, per se, in RAW images (capitalized for convenient emphasis), it is all in how the RAW file is translated to an RGB image. There have been huge improvements in that process since I started shooting digital, and especially in the way Hasselblad (and presumably other) RAW files are treated.

Dyes in color film and Bayer arrays are rightfully designed to cover the visible spectrum reasonably evenly. The human eye is a poor model, since the dyes which impart color vision are unevenly distributed (emphasis on red/green) ( http://en.wikipedia.org/wiki/Color_vision ). The third dye appears to be relatively new in the evolutionary sense, possibly to distinguish threats and food from subtle changes in a predominantly green environment. Reptiles and birds have much better color vision, some having a fourth color. The idea in film and digital imaging is to reproduce nature as well as possible, and let the human eye and brain do what they must to interpret the results.

Finally, the Hasselblad, being completely manual, lends itself to nature photography and closeups, where automatic focusing and exposure are not very effective. That aside, how is taking one's time with certain types of photography a sin? Being dogmatic about it is another thing, which I am not.

The following example compares, among other things, the superior dynamic range of MFD (lower) to film (Ektar 100, upper). Look in the shadows, especially inside the roof blowers and under the maintenance bridge. You can clearly see the blades and supports in the CFV version, but not in film (the Nikon scanner has ample dynamic range to capture whatever is on the film). You can also see Moire in the CFV shot, which is problematic for repetitive detail near the limits of resolution. You do not get Moire with a D3, nor the same level of detail. Moire and aliasing are seldom an issue with landscapes or closeups in nature (the task, not the John Shaw title). There is some detail in the brightest highlights of the CFV version, despite the fact they represent the reflection of sunlight from bright metal.

http://forums.dpreview.com/galleries/3292675740/photos/403141/m0091024a_sf_0007-v-digital-resolution
 
More often they say something like:

"MF cameras have incredible clarity"

"MF cameras give me incredible richness in my files, unlike anything I have ever seen"

And a bunch of other terms that don't actually mean anything. :)
I usually respond back, making up a bunch of similar terms: ambiance, fidelity, etc.

The "clarity" guy will usually explode. I had one go on for pages...

--
Rahon Klavanian 1912-2008.

Armenian genocide survivor, amazing cook, scrabble master, and loving grandmother. You will be missed.

Ciao! Joseph

http://www.swissarmyfork.com
 
As in "measurable"...
Thank you, Joseph, for your usual core dump. I appreciate the detailed knowledge you contribute which is not otherwise widely available.
You're welcome...
It's all the little extras that move things off track.
I've put some of those in here, too.
The "interline" structure permits parallel processing of capture and data transfer. The net result is faster processing where it counts - repeated capture - even if the processing speed is somewhat slower.
That would only be true for situations where the exposure time is a substantial fraction of the frame interval. And the cameras that I've used with interline transfer sensors are not set up to take advantage of this. Three or four MF backs, plus the D1 and D1X, all "stutter" for an image transfer time between two long exposures.
Computers work the same way. Faster processor clocks do not necessarily mean faster processing. The number and quality of the steps are usually more important.

The Nikon D1x and D100 employ electronic shutters for all speeds higher than 1/250, which cannot be accomplished with a "full frame transfer" sensor.
D100 did no such thing, thank God.

It's pretty easy to see which cameras did or did not use interline sensors.
  • D100 had the same top shutter speed (1/4000 sec) and x-sync speed (1/125 sec) as the N80 body.
  • D70 had a much higher top shutter speed (1/8000 sec) and x-sync speed (1/500 sec) than the N70 body. It also had all the other baggage that goes with an interline sensor, reduced dynamic range when compared to the D100, and much greater vulnerability to blooming. So many D70 images where a blown highlight turns into a band across the image.
The D1X did, but it was a botch in so many ways. Between the rectangular pixels and the interline sensor, it's the reason the D100 shocked the world by having better image quality in pretty much every visible and measurable way than the "flagship" did.

Nikon has been very careful never to repeat that mistake.

And no Nikon that employed an interline transfer sensor operated the way you describe: "employ electronic shutters for all speeds higher than 1/250."

D1X, D70, D40, etc. did nothing at all different at speeds above or below 1/250, or any other arbitrary threshold. They all clear the sensor before the first curtain opens, start the exposure by clearing again after the first curtain is fully open, do the transfer to shielded storage at the end of the exposure time, and only then do they close the second curtain. That's as true at 30 seconds as it is at 1/4000 sec.
If not "interline", then what? (Full-frame in this context refers to the architecture, not physical size relative to the camera format.)
  • Interline on D1, D1h, and D1x, because it was 1999 and Nikon thought that customers would tolerate the interline sensor to get the x-sync bumped from 1/250 to 1/500 sec, and the top shutter speed bumped from 1/8,000 sec to 1/16,000 sec. This turned out to be wrong.
  • Full frame on D100 and D200, because Nikon learned something from D1 by 2001 and wouldn't even try interline on an advanced amateur camera.
  • Interline on D70, because in 2004 they thought that the N75 body was so unimpressive that customers would put up with an interline sensor to get the low-cost body bumped from 1/90 sec sync to 1/500 sec, and the top shutter from 1/2,000 sec to 1/8,000 sec. (D70 can do 1/16,000 sec shutter with a firmware hack, it's deliberately "defeatured"). And this turned out to be wrong, too.
  • Full frame on D80, because Nikon learned that the D70 niche wouldn't tolerate interline sensors.
  • Interline on D40, because Nikon thought they'd finally found the niche where customers would accept an interline sensor to get higher sync and shutter speeds.
  • Full frame on D40X, because it turned out that interline had no place, anywhere.
Sony, Pentax, Leica, Epson, Oly, and Panasonic never repeated the interline mistake.
You correctly observe that MFD backs are designed for 16-bit imaging, which requires more processing
Processors are pretty much either 8 bit or 16 bit these days. There's no processing difference between 12, 14, and 16 bit images.
and (probably) hand-selection and tuning of components.
Definitely. That's been a major problem with high bit count converters for decades.
This is the core of my argument that MFD backs are designed for optimum image quality (at the expense of speed), which you oddly deny in your opening paragraph.
It's not "oddly". MF backs are designed for simplicity of design, basically implementing Kodak or DALSA "cookbook" circuitry.
CCDs do run hotter than CMOS sensors.
Nope. I've measured them with a non-contact Omega, and I have the data sheets on an awful lot of them. I've even watched sensors on a thermograph, to locate the heat infiltration problems in industrial cameras.
For example, Sony, in the description of their 1/2" EX1 and EX3 cameras, chose to use CMOS rather than CCD sensors to reduce heat build up. Most MFD backs use some form of active heat dissipation - my CFV has a fan, many backs use thermoelectic cooling.
That's mainly because they have the processor sophistication of a 1995 video game. Even a monster like the KAF-50100 only draws 180 mW during readout (and nothing but leakage during exposure). It's not sensor heat that you're getting rid of, it's heat from everything else.

The scientific community calls this "amp glow", because the side of the sensor nearer the output amplifiers gets heated up by low noise analog circuitry that dissipates a lot more heat than the sensor does.

(to be continued)

--
Rahon Klavanian 1912-2008.

Armenian genocide survivor, amazing cook, scrabble master, and loving grandmother. You will be missed.

Ciao! Joseph

http://www.swissarmyfork.com
 
DPReview and the Luminous-Landscape site refer to the traditional step-wedge approach to measuring useable dynamic range. The image is useable when there is sufficient contrast to distinguish one test patch from the next. DxO measures dynamic range from the noise floor to the maximum output, which renders the real difference between MFD and small-format sensors insignificant. You can have a useable image in the presence of noise (remember Tri-X and grain?), but there is not as much contrast at low levels in a small-format sensor (or with color negative film) as with MFD.

How does metamerism apply to digital imaging? With prints, on the other hand, the change in color depending on the viewing light constitutes a FAILURE of metamerism.
That's a failure of "illuminant metamerism". Cameras suffer from failures of "observer metamerism".
There is no color, per se, in RAW images (capitalized for convenient emphasis),
I see no need to be yelled at.

Of course there's color in raw images. It's the color that corresponds to the spectral response curves of the sensor filters.
it is all in how the RAW file is translated to an RGB image. There have been huge improvements in that process since I started shooting digital, and especially in the way Hasselblad (and presumably other) RAW files are treated.
MF files are very difficult to process. The cost-saving omission of AA filters leaves the software folks with a real mess to deal with.
Dyes in color film and Bayer arrays are rightfully designed to cover the visible spectrum reasonably evenly. The human eye is a poor model, since the dyes which impart color vision are unevenly distributed (emphasis on red/green)
That's the quantity of elements, and has nothing to do with spectral responses.

But since you mention unequal distribution, that's also true of Bayer, twice as many green as red or blue. But that's scalars, not vectors. The definition of colorimetric response specifies linear combinations of tristimulus values, transformed by a 3x3 matrix. Even if the number of sensing elements affected the output, it would do so in a linear way, which would be scaled right out in a colorimetric calculation.
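The claim above, that a per-channel scalar gain washes right out of a colorimetric 3x3 transform, can be checked in a few lines. The matrix and raw values below are invented purely for illustration (no real camera's matrix):

```python
import numpy as np

# Hypothetical camera-to-XYZ matrix (illustrative values only).
M = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.7, 0.1],
              [0.0, 0.1, 0.9]])

raw = np.array([120.0, 340.0, 80.0])  # averaged R, G, B responses

# Bayer has 2x as many green sites; suppose pooling them doubled
# the green signal. That gain is a scalar per channel ...
gained = raw * np.array([1.0, 2.0, 1.0])

# ... so it folds into the matrix as a diagonal correction and
# cancels exactly: the colorimetric result is unchanged.
M_corrected = M @ np.diag([1.0, 0.5, 1.0])

assert np.allclose(M_corrected @ gained, M @ raw)
```

Any linear per-channel effect of element counts disappears in calibration this way; only the spectral response curves themselves determine the colorimetric behavior.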
I can give you some pointers to much better sources of information, if you're actually interested in learning about things like observer metamerism or colorimetric vision.
Finally, the Hasselblad, being completely manual, lends itself to nature photography and closeups, where automatic focusing and exposure are not very effective. That aside, how is taking one's time with certain types of photography a sin? Being dogmatic about it is another thing, which I am not.
Taking one's time is not a sin.

The sin is being so undisciplined that you need a camera with more awkward controls to "force" you to slow down. I shoot the Nikon D3 in manual exposure and manual focus most of the time. But one lever and two clicks of one dial engage the automation when I (not a camera maker) decide I need it.
The following example compares, among other things, the superior dynamic range of MFD (lower) to film (Ektar 100, upper).
No. They compare only one user's scanning technique.
 
The D1x does not use the mechanical shutter to effect any speed faster than 1/250 second. I know because I tested it myself after reading of this behavior. The entire frame is exposed at once by an electronic flash, at all the speeds on the shutter dial, provided the flash is triggered using the PC port rather than the shoe. (A Nikon flash is disabled at speeds greater than 1/500 second if fired from the shoe. I have other flash units, but only my SB-800 is shorter than 1/16000 at low power. The exposure is only attenuated if the flash has a longer duration than the shutter speed, e.g., at higher power settings.)

There are several other Nikon cameras with the same behavior, perhaps not specifically the D100.

The shape of the D1x pixels has nothing to do with this behavior, and is completely irrelevant to this discussion - another core dump I suppose. It was a pretty good camera in its time. I never again purchased nor used 35mm film after a brief comparison. I'm not particularly sorry if you don't approve ;-)
 
The D1x does not use the mechanical shutter to effect any speed faster than 1/250 second. I know
I don't care what you claim to "know". What you are arguing about is not what I stated.

That is a standard tactic of yours: you do not quote what someone says, you cut everything and provide some paraphrase that only exists in your own little world.

So, one more time, here is what I said.

"D1X, D70, D40, etc. did nothing at all different at speeds above or below 1/250, or any other arbitrary threshold. They all clear the sensor before the first curtain opens, start the exposure by clearing again after the first curtain is fully open, do the transfer to shielded storage at the end of the exposure time, and only then do they close the second curtain. That's as true at 30 seconds as it is at 1/4000 sec."

It is entirely true, and consistent with the results that you observed.
because I tested it myself after reading of this behavior. The entire frame is exposed at once by an electronic flash, at all the speeds on the shutter dial, provided the flash is triggered using the PC port rather than the shoe. (A Nikon flash is disabled at speeds greater than 1/500 second if fired from the shoe. I have other flash units, but only my SB-800 is shorter than 1/16000 at low power. The exposure is only attenuated if the flash has a longer duration than the shutter speed, e.g., at higher power settings.)

There are several other Nikon cameras with the same behavior, perhaps not specifically the D100.
Then why did you say it was the D100?

That is another characteristic of yours: your posts are very highly detailed, and so many of those details are incorrect that it casts doubt on pretty much everything you say.

You stated that "Most small format CCDs have an "interline" design,"

I countered that most don't, and provided examples.
The shape of the D1x pixels has nothing to do with this behavior, and is completely irrelevant to this discussion - another core dump I suppose.
Absolutely amazing. So, you're the one true arbiter of what is or is not relevant to a discussion?
  • You brought up the whole transfer speed issue, which is obviously irrelevant, because it's totally wrong.
  • You brought up the "thermal noise" issue, which is a well debunked urban legend.
  • You went on and on about "astronomical CCDs".
  • The "Much of the image processing can occur in the sensor itself," canard about CMOS.
  • And then you went on about the incredibly relevant "I prefer the Hasselblad for more 'contemplative' photography" observation.
And then in the next post, you
  • Tried to waffle and defend your earlier interline comments ("The "interline" structure permits parallel processing of capture and data transfer. The net result is faster processing where it counts - repeated capture"), except that no camera actually does that. You either made it up, or misinterpreted something you read and mostly forgot.
  • Made an argument about the ratio of red, green, and blue cones having something to do with color accuracy.
  • Added in film comparisons. B is better than A because B is better than C, that's a pretty basic logical fallacy.
It was a pretty good camera in its time. I never again purchased nor used 35mm film after a brief comparison. I'm not particularly sorry if you don't approve ;-)
You have no understanding, at all, of what I do not approve of.

Oh, and by the way, this is so good that it should be a signature line...

"Being dogmatic about it is another thing, which I am not. "

You're about as dogmatic as it gets, Ed.

--
Rahon Klavanian 1912-2008.

Armenian genocide survivor, amazing cook, scrabble master, and loving grandmother. You will be missed.

Ciao! Joseph

http://www.swissarmyfork.com
 
The 850 and 900 are "broken". Seriously, at least two engineers have torn them down to determine why Nikon D3X, with the same sensor, gets two stops better low light capability.
What did they find?
and the "color of Sony" is always mentioned as a plus for the system
No, it's not. Sorry.
Perhaps not always. But sometimes it is stated as a fact...: http://forums.dpreview.com/forums/read.asp?forum=1037&message=31460404&q=borg+a900+respect&qf=m

kind regards,

Jonas
 
I am speaking about the user, not the camera, so I repeat myself:

Using a pro camera at ISO 12,400 is irrelevant, unless the bud is a photojournalist who only posts thumbnails on a website or in a newspaper!

I really doubt that any pro would sell a big enlargement made from noisy P&S-quality shots. The only purpose of extreme ISO is photojournalism, magazines & thumbnails.
Thumbnails?????????????????? :O

What are you talking about?
You seem never to have tested the D3S, and you don't really know what the pros are doing.

These are ISO 12,800:
(no other camera can do this)





 
All I see is washed-out colors, grey-yellows, JPEG artifacts, loss of detail, noise & hot pixels. OK if you postprocess it & print it at A4. But extreme ISOs are not suited for landscape or fashion blown up on wall posters; maybe some artists could use it for a certain purpose, but that's all!

That said, in the upcoming years, we might see medium format cameras using top-notch technology & be able to shoot something tasteful at those ISO ranges.
 
"D1X, D70, D40, etc. did nothing at all different at speeds above or below 1/250, or any other arbitrary threshold. They all clear the sensor before the first curtain opens, start the exposure by clearing again after the first curtain is fully open, do the transfer to shielded storage at the end of the exposure time, and only then do they close the second curtain. That's as true at 30 seconds as it is at 1/4000 sec."

In other words, the D1x employs an electronic shutter to determine the length of exposure, which cannot be done with a full-frame transfer type of sensor. That's precisely what I said, with one exception. The D1x varies the shutter speed below 1/250 second, using the mechanical shutter to control the exposure. The exposure is truncated electronically above 1/250 second. This according to Nikon technical support.

There are several ways to "mask" memory, including interline transfer sensors. It should not be necessary to "clear the sensor" with a separate action just prior to the shot, unless the sensor is employed in a live-view mode, as in a P&S camera. It is sufficient to keep it in an idle state.

Most current DSLRs and all MFD cameras control the exposure with the mechanical shutter only. The back is armed before the shutter is opened, then disarmed and the data read into memory after the shutter is closed.
 
All I see is washed-out colors, grey-yellows, JPEG artifacts, loss of detail, noise & hot pixels. OK if you postprocess it & print it at A4. But extreme ISOs are not suited for landscape or fashion blown up on wall posters; maybe some artists could use it for a certain purpose, but that's all!

That said, in the upcoming years, we might see medium format cameras using top-notch technology & be able to shoot something tasteful at those ISO ranges.
Well, the fact is these kinds of high-ISO photos produce more money most of the time.

Oh, anyway, landscape is fine as a hobby, but making money from it is very, very rare. The era of Ansel Adams is over.

Once in a while, yes, it sells, but as a continuous monthly earner the D3S is the biggest money-maker in the Nikon line.

And you said they are not pro??

DOH ... :O
 
You are twisting all my words. We are talking about medium format & image quality in this topic. I specifically said that, under certain circumstances, pros will use extreme ISOs, but not for art shots. If you are one of those, good for you. I do sell art prints.

Even photojournalists shooting war photography used the finest cameras, films & lenses to be able to blow up the pictures for galleries.
 
Kettle, meet black. :D
There are always some people who are against medium format. I enjoy it. I have three of them besides my D3X and D3. For me there is no better detail, and especially color, than what a modern MF system delivers. It puts my D3X to shame. But what do I know? I'm just a photographer, not a gearhead! Let others here obsess over tech stuff that nobody wants to hear or needs to take an image. Wasted time, my friend...

BTW, have you seen the files from the new H4D-40? I was blown away. Better than what the 31 can deliver. Check it out..
 
The cost-saving omission of AA filters leaves the software folks with a real mess to deal with.

I doubt that the omission of anti-aliasing filters is a "cost-saving" device. When you pay that much for a sensor to replace 4x5 film (much less MF film), you want the maximum sharpness available. Aliasing (e.g., staircasing) is easily removed in post-processing. Moire is much harder to remove after the fact, but is seldom a problem unless there is highly repetitive detail (as in my example). Even fabric is seldom a problem unless it is held nearly flat in the image.
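Moire with repetitive detail is ordinary aliasing: a pattern finer than the Nyquist limit of the pixel grid folds down to a coarse false band. A minimal one-dimensional sketch (the pattern frequency is an arbitrary illustrative choice):

```python
import numpy as np

# A repetitive subject detail at 0.95 cycles/pixel is far beyond
# the Nyquist limit (0.5). Sampling it on the pixel grid folds it
# down to a coarse 0.05-cycle band -- the visible moire.
pitch = 100                     # one row of 100 pixels
f_detail = 0.95                 # cycles per pixel
x = np.arange(pitch)
sampled = np.sin(2 * np.pi * f_detail * x)

spec = np.abs(np.fft.rfft(sampled))
alias_bin = int(spec.argmax())  # dominant frequency, cycles per 100 px
print(alias_bin)                # 5, i.e. 0.05 cycles/pixel, not 0.95
```

An AA filter suppresses the >0.5-cycle detail before sampling, which is exactly why omitting it trades moire risk for sharpness.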

Dyes in color film and Bayer arrays are rightfully designed to cover the visible spectrum reasonably evenly. The human eye is a poor model, since the dyes which impart color vision are unevenly distributed (emphasis on red/green)

That's the quantity of elements, and has nothing to do with spectral responses.

I refer to the dyes which the human eye uses to respond to color, not the relative quantity of sensitive cells. The properties of the dyes which permit color vision are a matter of chemistry, not perception.

But since you mention unequal distribution, that's also true of Bayer, twice as many green as red or blue. But that's scalars, not vectors. The definition of colorimetric response specifies linear combinations of tristimulus values, transformed by a 3x3 matrix. Even if the number of sensing elements affected the output, it would do so in a linear way, which would be scaled right out in a colorimetric calculation.

You are confusing spectral distribution (the color of the Bayer filters) with luminosity (twice as many green as red or blue). Bayer's intention was to mimic the response of the human eye, which is twice as sensitive to green as the other colors. This bias is removed in processing the final colors. To this extent, the green "bias" serves to reduce noise in the results, much like pre-emphasis does for sound recordings.

The sin is being so undisciplined that you need a camera with more awkward controls to "force" you to slow down.

Nothing forces me to use a manual camera; it is my choice and I take pleasure from the diversion. The fact that the image quality of a Hasselblad is better than that of my D3 is a bonus. I'm glad you mentioned that I can use a D3 in manual mode. Is that a new feature, or are you just patronizing me ;)
 
Well, we get another lecture in sensor theology here.
Simply science and engineering, not theology.
By that reasoning, with read noise at, say, 1/8 = 2^-3 electron, we only need 32 = 2^5 electrons to achieve 8 stops DR! ;)
I think you've just discovered that little thing about noise being scale dependent. Talking about noise on the basis of a single observation makes no sense. Over a number of observations, the DR is indeed 8 stops. And since the shot noise SNR of the whites will be only 5.7, it'll be nicely dithered so it'll look smoothly gradated.

Well, noise in a single pixel (one observation) means nothing, so you'll have to define the range of observations over which the noise is measured.
And, as variances are additive, the variance from pooling, say 4 pixels, with read noise A is 4A.
Wrong. Still Poisson (mostly, apart from the bit which is quantisation noise). A is the standard deviation; the noise from pooling four pixels is 2A.
Unless the read noise of each pixel is less than 1/4 of that of a 4x bigger pixel,
Which it is under strict scaling.
the total S/N-ratio will become smaller by pooling this way, resulting in less dynamic range recoverable.
Larger, if you deal with the SD rather than the variance, which is what you should be doing.
And, everything else being equal and random allowed to be random, this seems to be what we observe. BUT variances won't be fully additive unless the engineers allow them to be.
The read noise is mostly thermal noise so it's pretty good white noise.
It is only for the Poisson shot noise, with variance equal to the expected value and distributions reproducing additively, that we have this wonderful property of full additivity. Which means that by pooling, we can get the S/N ratio higher "at no cost".
I can't even think what you're trying to say here. But the conclusion is wrong: by 'pooling' (by which I assume you mean 'binning') all that has been done is that a low-pass filter has been applied to the shot noise and the signal, so the cost is high frequencies, aka resolution.
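The disagreement over 4A vs. 2A dissolves once variance and standard deviation are kept apart. A quick simulation makes this concrete (illustrative numbers, not from either poster: per-pixel signal of 100 e- and read noise SD A = 2 e-):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative numbers: per-pixel signal and read-noise SD A.
signal, A, n = 100.0, 2.0, 1_000_000

# Four independent pixel readings, each with Gaussian read noise of SD A.
pixels = signal + rng.normal(0.0, A, size=(n, 4))
pooled = pixels.sum(axis=1)

# Variances add: pooled variance ~ 4*A^2, so pooled SD ~ 2*A ...
print(pooled.std())  # ~4.0 (= 2A)

# ... while the signal quadruples, so the SNR doubles vs. a single pixel.
print((4 * signal) / pooled.std())  # ~100, vs. 50 for one pixel
```

So both sides are describing the same fact: the variance of the pooled read noise is 4A^2 (additive variances), the standard deviation is 2A, and since the signal grows by 4x, the SNR improves by 2x.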
I have heard exactly this same theological exegesis before, from another nick, with the same Canon-blessings. Seemingly no capability or willingness at all to acknowledge that ALL sensor development strategies have their strengths and weaknesses, and that many of the problems with Canon's approach are widely acknowledged.
I have no idea who you are talking about. Nikon's sensors lead the way in QE; that is a major achievement. The best would be the QE Nikon gets with the read noise Canon achieves - that would be a rather good sensor.
In a very authoritative way, but rather void of documentation, we are told how things REALLY are. So we better not check for ourselves!
Where are your sources, then? I will refer you to Emil Martinec's rather good tutorial on this, http://theory.uchicago.edu/~ejm/pix/20d/tests/noise/ , or several others if you don't like that.
I feel this is kind of trolling,
Your feeling is wrong
and that it defeats much of the intention of such forums. It may be a very conscious strategy from Canon, in that they have some large problems with their approach which they at present don't want to have openly and penetratingly discussed, and rather try to turn sensor discussions into theology and name-calling.
and that's all just paranoid nonsense
I'll just pick one example here, it is typical and very central to the theological argumentation:
" Shrinking pixels is quite a good way of improving most sensor characteristics"

True or false?
True
For almost all practical purposes, measurements indicate that it is, at best, only partially true.
Cite the measurements, explain how they say this.
It IS true for resolution, and when read noise is low there is little sense in capturing electrons way beyond the bit depth of the RAW file. But as soon as we enter high-ISO territory, even the sampling strategies may become problematic. Six stops above base ISO, if a full exposure at base is 32000 electrons, we have only 500 left. If we say that gives 9 stops of DR, halfway down the tonal curve we are working with 30-60 electrons. That may, for example, affect the color precision, as the Bayer filter doesn't allow us to simply pool the closest pixels. We may have to go down to a far lower resolution to get acceptable quality.
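The electron budget being described works out like this (using the post's own figures of a 32000-electron full well and ISO raised six stops above base):

```python
import math

base_full_well = 32000  # electrons at base ISO (figure from the post)
stops_up = 6            # ISO raised six stops above base

# Each stop of ISO halves the usable electron count at saturation.
electrons_left = base_full_well // 2**stops_up
print(electrons_left)  # 500

# Four stops below that 500-electron ceiling (mid-tones in a 9-stop
# range) leaves ~31 electrons; three stops down leaves ~62 - the
# "30-60 electrons" in the post. Shot-noise SNR there is only ~6-8.
print(round(math.sqrt(500 / 2**4)))  # ~6
```

At those counts, shot noise alone dominates the mid-tones, which is the basis of the color-precision worry above.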
Someone who doesn't understand information. If you know where every photon lands, you know everything about the image projected onto the sensor. The closer you get to that, the more scope you have to make what you want of the image. If you throw it away, all you can base your imagery on is guesswork and blur.
And the point is, that high ISO performance IS relevant for many applications, so any changes affecting that performance may be problematic. The "smaller is better" theology is also contradicted by the performance of the D3s sensor.
An outlier, entirely predicted by its exceptional quantum efficiency, and you have no evidence that has anything to do with pixel size.
 
Well, we get another lecture in sensor theology here.
Simply science and engineering, not theology.
Spouts the high priest of read noise, shot noise, thermal noise, quantization noise, pixels, quantum efficiency and equivalence. As I wrote earlier . . .
Such an innocent-appearing, very short question. What HorsePix doesn't realize is that it's a trap, and when sprung he'll be inundated with reams and reams of esoteric verbiage that goes round and round and never leads anywhere that's even remotely useful.
. . . and as is apparent, the trap has been sprung. Tyrone Wellhung, aka Shull Bitter, aka . . . is back in his element and yet another thread suffers.

By that reasoning, with read noise at, say, 1/8 = 2^-3 electron, we only need . . .
<yawn> . . . to check off the IGNORE USER box.
 
And, as variances are additive, the variance from pooling, say 4 pixels, with read noise A is 4A.
Correct.
Nope.
Still Poisson (mostly, apart from that bit which is quantisation noise). A is the standard deviation, the noise from pooling four pixels is 2A.
Also correct. Variance is the square of std dev.

--
emil
http://theory.uchicago.edu/~ejm/pix/20d/
 
