GabrielZ: Only costs 22 grand, I'll buy a dozen...ha ha. But seriously why go for this when you can get a D800/D800e with virtually the same resolution, add to it a Zeiss prime and you've got a system with the same image quality for a fraction of the price. Still...a nice camera none the less.
@Canon Pro: with a high-speed-sync-capable speedlight, you can use flash up to 1/4000, no ND filters required. An old school friend suggested that NDs would work just as well, but I'd have to have three of them to fit the different lens sizes.
With high speed sync, one can switch from flash to ambient at the flick of a switch without having to fumble around with filters.
No need for a $21,000 camera to accomplish that.
obeythebeagle: Don't take my Kodachrome, or HDR, or B&W, away. It's all fun. HDR is Ansel Adams's brain on LSD.
I formulated my own developer for Tech Pan. It was difficult to mix but gave far better results than Kodak's Technidol developer, so I can always make my own. But the only Tech Pan I can find is eight-year-old stock that has been stored at room temperature.
I hope that whoever buys Kodak's patents will recognize the qualities of Tech Pan for fine arts photography and start making it again.
If that doesn't happen, I'll probably never shoot another roll or sheet of film. And when Canon gets off its behind and gives us the rumored 46MP camera, even Tech Pan will be surpassed.
CaseyComo: Call me old-fashioned, but I prefer the look of a single exposure. If the sky is too bright, expose for the shadows and use a grad ND filter.
@gasdive: I'm with you on that. I don't like the graduated filter look; grads can only be used in certain circumstances, and you're stuck with whatever you get when you take the shot. I cringe when I watch CSI: Miami, which overuses an orange grad filter. It makes it look like Miami has a bad smog problem!
HDR offers more control and flexibility. One can make the image scream "HDR," or one can apply just enough of the effect to balance the tonal values and make the image pop a little more than a straight shot.
There are lots of filter effects that were OK when that was all we had, but they look dated today.
dark goob: Just think, if you had an external HDMI monitor you wouldn't have to bend to the ground. Or a camera with a rotating screen.
Why doesn't Canon put a damn rotating screen on the Mark III? I would buy one if it had it.
In certain conditions an iPad is the right tool, but in the field I want something smaller and easier to handle that doesn't take up too much space, like the Samsung Note.
But I'd rather have a dedicated device with physical buttons and dials that mirror the camera's. I don't like touchscreen shutter releases. The camera controller apps I've looked at have certain functions that are not part of the camera system, like an intervalometer and expanded HDR exposure bracketing. Touchscreen focus select would be nice.
But there are compatibility issues with apps and phones. My brand new phone would have to be upgraded and rooted for the app to work. That is too complicated and risky.
A dedicated device would not require its own internal operating system, would run off the camera battery, would not have to be replaced when I changed phones, and would probably be more rugged and waterproof than an iPad or smartphone.
I'd much rather have a dedicated piece of camera hardware.
elefteriadis alexandros: HDR? Just use film...
I actually agree with your assessment of the sample images above. They are not great examples of the true potential of realistic renderings of HDR imaging.
May I add that it's a little bit unfair to judge my abilities based upon someone else's images! My work does not look anything like the above samples. I don't shoot to display online, I shoot to print. My philosophy is that the photograph is not complete until it has been consigned to paper. Much of my work is black and white, which was not addressed here. So please don't disparage my work based upon someone else's.
Francis Sawyer: These aren't HDR. They're LDR. Step one is educating the public about the misused terminology that pervades this fad.
If you see something in a JPEG online, you know it's not HDR. HDR requires a file format that can store it, like EXR. It also can't be viewed on any normal computer monitor.
People often confuse High Dynamic Range Capture and tonemapping.
It is tonemapping that creates the oversaturated and oversharpened images, not the dynamic range of the original scene.
Tonemapping can be applied to single LDR images as well. If the dynamic range of the original scene was within the range of the camera sensor, then the working image file can have as much shadow and highlight information as a high dynamic range scene captured with multiple exposures and rendered as an "HDR" image.
A high dynamic range scene captured via the multi-shot HDR method can be rendered as a rather conventional-looking image. It will simply reveal more highlight and shadow detail than a single-shot image would.
SDPharm is correct. HDR refers to the method of capture, not the final outcome.
Tonemapping is just the method employed to render the HDR image to the desired result.
The "HDR look" isn't the result of HDR and should be referred to as the "tonemapped look."
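To make the capture-versus-rendering distinction concrete, here is a minimal sketch of what a global tonemapping operator does. The operator (a simple Reinhard-style curve) and the luminance values are illustrative only, not any particular tool's pipeline:

```python
import numpy as np

def reinhard_tonemap(luminance):
    """Global Reinhard-style operator: L / (1 + L).
    Compresses unbounded HDR scene luminance into [0, 1) so a
    normal low-dynamic-range display can show it."""
    return luminance / (1.0 + luminance)

# A toy HDR "scene": deep shadow, midtone, bright area, and a
# highlight ten thousand times brighter than the shadow.
hdr = np.array([0.01, 0.5, 4.0, 100.0])
ldr = reinhard_tonemap(hdr)
```

The capture (the `hdr` array) holds the full range of the scene; the tonemapped `ldr` result is what you actually see on screen. How aggressive that mapping looks is entirely a rendering choice, not a property of the capture.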
They took my Tech Pan film away years ago. Now I need a 46MP camera to be happy!
You can still have the look of a single exposure with HDRI. With HDR capture, we can record more information than we really need; how much of it you use is up to you. It's better to have more information to work with than to be struggling to tame a highlight or open a shadow that has no detail.
The average person now spends hours a day staring at brightly lit TV screens, computer monitors and smartphones. That's what their eyes are attuned to. Neon colors are even big in fashion.
Many artists think in order to be seen they have to be as bright and garish as everyone else. They may be correct, but that doesn't make them right.
The question is: Who is YOUR target audience? Is it the viewer who appreciates subtleties, or is it the viewer who wants to be dazzled? It's like pop music vs. classical. The masses are drawn to overprocessed HDR images like moths to a candle.
There really is no right or wrong, it is merely a matter of taste.
Why don't Canon and all the other DSLR makers make their own proprietary external monitors?
I can use an app to connect my smartphone to my DSLR and control the camera from the phone. Put the camera on a pole with a tripod head and one doesn't even have to bend over to get close to the ground.
But those apps all have their limitations.
Canon makes cameras with touchscreens that allow the user to change settings on the screen. All they need to do is make a larger touchscreen that can be connected to the camera with a cable...or maybe even WiFi.
If the app writers can create an app for this, I'm sure it is within the technological capability of Canon or Nikon to create an off-camera controller. They could do better than the cell phone apps by replicating all the buttons and dials of the camera on the remote monitor!
Wouldn't that be cool?
Wye Photography: After reading this I am still not a 'fan' of HDR. The images on the first page look 'overcooked', the first two images on the second page, I admit, do look nice. The last image on page 2 looks like it has been done with one of those water colour apps on an iPad. A waste of an EOS 5D MkIII.
I don't think this article convinces me that HDR, and I quote, "emulate(s) the world we see with our own eyes". My world certainly doesn't look anything like these images. In dynamic range, perhaps; in colour, NO!
Then again, this article is for those ALREADY into HDR.
It is nothing new for photographers to overcook special techniques. I remember when I was a teenager I learned how to use color filters with black and white film...or how to solarize a print, or how to sandwich my negatives with Kodalith masks. I went crazy creating wacky, over-the-top special effects. I made a lot of crazy surrealistic prints, but in the end I learned how to control the techniques and use them as a means of controlling the quality of more conventional-looking images.
Whenever a new technique is introduced, people will have a tendency to overdo it to create unusual-looking images, but in time, they outgrow it. Eventually the overcooked images are regarded as cheap amateur manipulations. Skilled practitioners learn to restrain themselves and apply just the right amount of the effect.
Unfortunately, many amateurs are using simple one-size-fits-all HDR processors on point-and-shoot cameras and cell phones. They don't know any better; they think it's cool.
PascallacsaP: Read the light, be patient, and you don't need HDR.
True, this served photographers well for decades, and many great photographs were made during that time. But throughout the history of photography, photographers have struggled with the inherent limitations of film. That gave rise to the Zone System and artificial lighting.
The aesthetics of what defines a great image were determined by the limitations of film. Photographers had to work with what they had. Ansel Adams's stock in trade was his ability to create prints with beautifully rich and detailed shadows and highlights that made them "pop" in a way that others did not.
Many scenes are within the dynamic range of film and require no special technique to render them.
And when using artificial light the photographer is simply using the manipulation of light to bring the dynamic range of the scene within the range of the film or sensor.
HDRI is not a requirement, it is merely an option. HDRI simply captures the full DR, but the image does not have to use it all.
I used the Zone System in large format black and white for many years and used every technique I knew of to try to squeeze more range out of the film without turning it to mush. I mastered the craft of B&W printing so I could render all the details in the negative...without turning the image to mush.
To a more limited degree I applied Zone methods to color.
Multi-exposure HDR imaging blows Zone System B&W out of the water in its ability to capture the full dynamic range of a scene (...without turning it to mush). Even more so when compared to color film.
Long before digital, I used the three-exposure technique with 4x5 B&W. With three enlargers, careful alignment, a lot of burning and dodging, and darkroom trickery, I could, after a few hours, have a handful of really gorgeous prints. Thus I was able to bring the full DR of a high-contrast scene to the print.
That still doesn't compare to what can be done with digital HDRI.
Shooting architecture on 4x5 film, I was often faced with the challenge of balancing dark shadows with bright lights and windows, utilizing split exposures, artificial lighting, reflectors, scrims, gobos, diffusers and filters...all in an effort to capture in camera a relatively 'normal' looking image.
Digital photography was a godsend because I could bracket my shots and composite the various components in Photoshop to create a "normal" image.
So right from the get-go, I saw HDR imaging as a means of bringing out shadow details and taming blown-out highlights to create a "normal" image. It's really just one more technique for quality control, as are artificial lighting and the Zone System.
One could just as easily create wild and wacky surrealistic images in Photoshop and with film. While it is fun to experiment, it's HDRI's ability to render highlight and shadow detail that I find most useful. I use just enough to make my images pop without looking obviously tonemapped.
Laminated: DPReview - thank you for posting the details of some of the cameras on the Curiosity rover. I'm surprised you've got so much detail here - typically this info is tightly controlled.
To those underwhelmed by the cameras, speaking as an engineer who has designed parts of a camera in Mars orbit, I can tell you:
- Engineers at Malin, JPL and NASA know what they are doing, and being cost effective is a daily concern.
- The optical engineers do know how to design a camera, and they understand light.
- This is a science mission balanced with generating public support.
- The system requirements are very demanding for such a mission: the launch is incredibly violent, weight is tracked to the tenth of a gram, space is cold (duh), temperature swings on Mars are very large, etc.
- Power is provided by solar panels, not rechargeable Li-Ion batteries.
- Communication to Mars is not like plugging in a USB cable; if something breaks, you can't take it in for service.
The development of these craft takes many, many years, and just like Voyager, they end up being equipped with old, outdated equipment.
I actually made some of the components that were installed on the Viking Mars landers. They were upgraded versions of devices in use in oil refineries, chemical plants and nuclear power plants, designed to endure conditions that make a trip to Mars look like a walk in the park. Between the time the parts were made and the actual launch, production of the device was discontinued and it was replaced by new (and very well tested) technology that was not only more sensitive and accurate, but also more reliable and durable than what was installed on the lander. The new devices were being installed in oil refineries a few years before Viking's launch. One of the refineries exploded and burned, but all the new devices survived the blast and fire.
With 17 cameras on board, there's no reason one of them could not be more advanced.
Alizarine: Wow so much ranting on the image quality...
Me, I'm happy the United States' NASA still goes on missions like this, in the name of discovery and learning, while its residents and the rest of the world concentrate on luxury... or getting there.
These planetary probes take more than a decade to design and build. By the time they get around to launching, the technology on board is old and outdated. NASA needs to modify its design and testing policies to allow the newest and best technology to be installed on the probes shortly before launch. As an insurance measure they could still include the old but thoroughly tested and proven technology, while allowing a place for newer technology to be installed.
Since Curiosity has 17 cameras on board, they could have made room for at least one state-of-the-art camera. Given the quality of images that we can get from tiny point-and-shoot cameras, there's no reason why they couldn't have installed a more advanced version on board.
Even the engineers who designed the craft so many years ago had no idea how quickly and how far digital cameras would advance before it was launched.
SDPharm: So I downloaded the image, rotated it, and cropped out the computer-rendered gray image. The resulting image is about 3700 x 2800, or about 10MP. What's going on here? How does the 2MP camera produce a 10MP image?
It's a composite image.
The compositing software that allows us to make panoramas, high-res stitches and HDR images, and the hardware and software that led to the shifting-sensor backs of the early-generation professional digital cameras (like the early-2000s Imacon), are the result of technology developed by NASA for the Viking landers.
The signal uplink to the orbiting satellites cannot handle large image files. What I find puzzling is that they did not equip the satellites with more powerful transmitters. That was not a matter of technological capability, but simply size and weight. If NASA had bigger boosters, then they could send bigger probes, satellites and landers. The manned Apollo command and landing modules were huge in comparison to the typical planetary probe. If large boosters like the Saturn V were used for planetary probes, NASA engineers would not be inhibited by size and weight, and they could send bigger and better technology.
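The mosaic arithmetic behind a 2MP camera producing a ~10MP frame is easy to check. A rough sketch (the tile size, grid, and overlap here are illustrative guesses, not the rover's actual capture parameters):

```python
def mosaic_size(tile_w, tile_h, cols, rows, overlap=0.10):
    """Pixel dimensions of a stitched mosaic of overlapping tiles.
    Each successive tile advances by (1 - overlap) of a tile
    width/height, so neighbors share a registration strip."""
    step_w = int(tile_w * (1 - overlap))
    step_h = int(tile_h * (1 - overlap))
    width = step_w * (cols - 1) + tile_w
    height = step_h * (rows - 1) + tile_h
    return width, height

# Illustrative numbers: nine ~2MP (1600x1200) frames in a 3x3 grid
# with 10% overlap stitch into a frame well past 10MP.
w, h = mosaic_size(1600, 1200, 3, 3)
megapixels = w * h / 1e6
```

So a handful of overlapping 2MP frames comfortably accounts for a ~10MP composite; the stitcher only pays the overlap strips as overhead.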
gwales: Curiosity isn't on Mars to take pictures...
Then why does it have 17 cameras?
Without pictures, Curiosity would have no way of knowing where to go and where to use its myriad of devices to test the soil. The rover cannot make those decisions on its own. They are made by scientists on the ground, who examine the images to determine which sites and rocks to go to and test. One of the most interesting finds on Mars was discovered in an image and only confirmed by the test instruments.
To the scientists, the images serve a practical function, but to the rest of us, the visual images are very important. We want to see what it looks like on the surface, and we view these pictures with wonder.
Visual imagery has always been, and still is, the most important tool of discovery and exploration. The mysterious features of Jupiter, Saturn, Europa, Titan and the other moons that have piqued our curiosity were discovered through visual imagery.
A visit to Mars is a photo op we could not fail to take advantage of.
How did DPReview come to the conclusion that the D3200 competes with the Canon T3?
The T3 is nothing more than a bare-bones entry-level DSLR for consumers who've never owned a DSLR and don't know how to buy one.
To anyone who wants a T3 I say, "You really should buy a $1300 camera...because by the time you figure out how to use the camera, you will realize that it's less camera than you want, and you'll end up spending another $800 for a T3i sooner rather than later."
Regardless of any shortcomings some may find in the D3200, it far outclasses the T3.
I think the D3200 would appeal more to the kind of photographer who might buy a Canon T3i or T4i. With its higher resolution and lower price point, it'll certainly give those cameras a run for the money.
If I didn't already own a bagful of Canon glass, I'd give this camera serious consideration. And if the recent trend of Canon and Nikon continues, I may dump that glass and switch to Nikon.
facedodge: Consumers will soon learn that it's not all about Megapixels with camera phones catching up. Until then... 24MP will be a key selling feature. It's the first item they list on the stat sheet.
"Hey Joe, what about this red one over here? It's $150 more expensive than that Canon, but it's got twice the pixels."
Only someone who doesn't understand the relationships between sensor size, focal length, perspective, angle of view, depth of field, optical resolution, digital noise and dynamic range would think that a 24MP camera phone is equivalent to a DSLR.
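The sensor-size relationships above can be sketched numerically. A hedged example: the sensor diagonal below is a typical published figure for a small phone sensor, and "equivalence" here means matching angle of view and depth of field on full frame, nothing more:

```python
def crop_factor(sensor_diag_mm, full_frame_diag_mm=43.27):
    """Ratio of the full-frame diagonal to a smaller sensor's
    diagonal; scales focal length and f-number for equivalence."""
    return full_frame_diag_mm / sensor_diag_mm

def equivalent(focal_mm, f_number, factor):
    """Full-frame focal length and f-number giving the same angle
    of view and depth of field as the smaller-sensor combination."""
    return focal_mm * factor, f_number * factor

# A small phone sensor with roughly a 6 mm diagonal (illustrative):
phone_factor = crop_factor(6.0)                       # ~7.2x
eq_focal, eq_f = equivalent(4.0, 2.0, phone_factor)   # a 4mm f/2 phone lens
```

The phone's 4mm f/2 lens behaves roughly like a ~29mm f/14 lens on full frame for framing and depth of field, which is why pixel count alone says little about DSLR equivalence.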
Charrick: There are so many people who hate more megapixels. I know that more megapixels decreases the size of each pixel (that is, image sensor element), thus allowing it to gather less light. I'm not disputing that.
But with the D800, I thought people would have learned that, at least in low to moderate ISO settings, more megapixels DOES translate into a sharper picture with more details. Some people are pretending that technological innovation with sensor sensitivity to light stopped in 2006. And if that were the case, then perhaps 6 megapixel sensors would be good enough.
I, for one, am glad that some companies are pushing the envelope. I don't like pixels just for their own sake, but it's clear that at the 24MP range, pictures taken in daytime will probably look better than with, say, a 12-16MP sensor of the same size. Then again, I take far more pictures in the daytime than in the middle of the night or in candle-lit rooms.
Oh, one more thing:
Since the beginning of digital photography, one of the most important commandments was: "Thou shalt not waste pixels." In other words, frame the image as close to the final cropping as possible, so as to avoid having to crop any pixels out.
With high-MP images, we can go ahead and crop the image and still make 8x10 prints or larger, without having to uprez.
Who needs a $12,000 600mm lens when, for less money, you can get a 36MP camera and a couple of great lenses and just crop the image on the rare occasions that 300mm isn't enough?
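The crop-instead-of-telephoto trade-off is easy to quantify. A rough sketch, taking a 36MP frame as roughly 7360x4912 pixels and 300 PPI as a common print-quality rule of thumb (both assumptions, not fixed standards):

```python
def crop_for_reach(width_px, height_px, focal_mm, target_focal_mm):
    """Pixel dimensions left after cropping a frame to simulate
    the angle of view of a longer lens."""
    scale = focal_mm / target_focal_mm
    return int(width_px * scale), int(height_px * scale)

def max_print_ppi(width_px, height_px, print_w_in, print_h_in):
    """Pixels per inch available for a given print size."""
    return min(width_px / print_w_in, height_px / print_h_in)

# Crop a 36MP frame shot at 300mm down to a 600mm field of view,
# then check what's left for a 10x8 inch print.
w, h = crop_for_reach(7360, 4912, 300, 600)   # half the frame each way, ~9MP
ppi = max_print_ppi(w, h, 10, 8)
```

A 2x crop keeps a quarter of the pixels, yet still clears 300 PPI for a 10x8, which is the arithmetic behind cropping standing in for extreme telephoto reach.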
Many photographers have gotten so used to cropping in camera and saving pixels that they have forgotten that once, long ago in a galaxy far, far away, photographers shot film and cropped their images in the darkroom.
With more pixels, we can throw away the "Thou shalt not waste pixels" commandment.