Camera Operation

The Nexus 5’s camera app main view is minimalist. There’s a mode selector under the shutter button, and the circle above it opens the menu and displays current settings information (auto flash is enabled here). Touching the screen moves the focus reticule and biases exposure towards that part of the scene.

Google’s native camera app is a mixed bag, with a superficially simple interface that never quite gels into anything particularly intuitive or functional. You can jump to it from the lock screen via a shortcut (though “jump” might not be the best word, as we’ll discuss in the performance section). The main view is uncluttered. The big blue button takes a picture. Tapping anywhere on the screen sets focus momentarily on that point, with exposure appropriately biased towards that part of the scene.
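An aside for the technically curious: this tap-to-focus-and-meter behavior maps onto the focus and metering regions exposed by the KitKat-era android.hardware.Camera API. Below is a rough sketch of how an app might wire it up; we haven’t seen Google’s code, and the helper class and the pre-converted tapRect rectangle are our own illustrative inventions.

```java
import android.graphics.Rect;
import android.hardware.Camera;
import java.util.Collections;

/** Sketch of tap-to-focus with exposure bias, using the KitKat-era Camera API. */
class TapToFocusHelper {
    /**
     * Biases focus and exposure toward a tapped point. Assumes tapRect has
     * already been mapped into the API's -1000..1000 driver coordinate space.
     */
    static void focusAndMeterAt(Camera camera, Rect tapRect) {
        Camera.Parameters params = camera.getParameters();

        if (params.getMaxNumFocusAreas() > 0) {
            params.setFocusAreas(Collections.singletonList(new Camera.Area(tapRect, 1000)));
        }
        if (params.getMaxNumMeteringAreas() > 0) {
            // The metering area is what biases exposure toward the tapped spot.
            params.setMeteringAreas(Collections.singletonList(new Camera.Area(tapRect, 1000)));
        }
        camera.setParameters(params);

        // Run one autofocus pass on the new region; the stock app appears to
        // drop back to continuous focus shortly afterwards.
        camera.autoFocus(new Camera.AutoFocusCallback() {
            @Override
            public void onAutoFocus(boolean success, Camera cam) {
                // success == true means the pass converged on the tapped area.
            }
        });
    }
}
```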

The app focuses continuously but still checks focus before shooting unless you hold the shutter button down to lock it; the picture is then taken the instant you release. Weirdly, the camera can’t quite decide whether the press-and-hold also locks exposure. It seems to partially recalculate exposure when the picture is taken: lock exposure on a bright part of the frame and recompose, and you’ll get a shot that’s darker than you’d expect without the lock but brighter than the exposure you actually locked in. If this is confusing, that’s because it doesn’t really make sense as a camera behavior.
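For what it’s worth, the platform does offer an explicit auto-exposure lock that holds until it’s released, so a consistent lock-and-recompose behavior is entirely possible. Here’s a sketch of what that might look like, again with the android.hardware.Camera API; the helper class is our own hypothetical.

```java
import android.hardware.Camera;

/** Sketch: a press-and-hold exposure lock that stays locked until release. */
class ExposureLockHelper {
    /** Call when the shutter button is pressed and held. */
    static void lockExposure(Camera camera) {
        Camera.Parameters params = camera.getParameters();
        if (params.isAutoExposureLockSupported()) {
            // Freeze AE at the current scene; recomposing afterwards should
            // not change brightness until the lock is released.
            params.setAutoExposureLock(true);
            camera.setParameters(params);
        }
    }

    /** Call after the picture has been taken (or the hold is cancelled). */
    static void unlockExposure(Camera camera) {
        Camera.Parameters params = camera.getParameters();
        if (params.isAutoExposureLockSupported()) {
            params.setAutoExposureLock(false);
            camera.setParameters(params);
        }
    }
}
```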

Face detection is always on and generally works well, though it can be fooled by sunglasses.
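Face detection, too, is a stock platform feature rather than anything exotic; roughly speaking, an app enables it like this (a sketch assuming the preview is already running, with the helper class our own):

```java
import android.hardware.Camera;

/** Sketch: hardware face detection via the Camera API (preview must be running). */
class FaceDetectionHelper {
    static void enableFaceDetection(Camera camera) {
        if (camera.getParameters().getMaxNumDetectedFaces() > 0) {
            camera.setFaceDetectionListener(new Camera.FaceDetectionListener() {
                @Override
                public void onFaceDetection(Camera.Face[] faces, Camera cam) {
                    // Each face reports a bounding rect (in driver coordinates)
                    // and a confidence score from 1 to 100; obscured eyes tend
                    // to drag the score down or drop the face entirely.
                    for (Camera.Face face : faces) {
                        // e.g. draw face.rect on the preview overlay
                    }
                }
            });
            camera.startFaceDetection();
        }
    }
}
```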

Below the button is the mode selector for switching between stills, video, panorama, and Photo Sphere shooting. There’s no shortcut to start video recording from the stills screen. Tapping the circle above the shutter button opens the first level of settings, which lets you toggle HDR+ (more on that in the Features section), set exposure compensation, change the flash mode (on/off/auto), or switch between the front and rear cameras. There’s also a "sliders" icon there that takes you to the second level of settings for toggling location tagging, controlling white balance, setting a self-timer, picking a scene mode, and changing the output resolution.

The first level menu also pops up if you touch and hold your finger on the screen. The idea seems to be that with the options arrayed in arcs, you could smoothly slide through the menus to your desired setting with a graceful swipe. It kind of works, but it can also feel labored and fiddly. There’s quite a pause before the menu displays, and sometimes it doesn’t pop up at all. Then you need to be very precise when sliding since the settings icons are so small.

The first level of the settings menu, accessible by tapping the circle above the shutter button or holding a finger on the screen: HDR+, exposure compensation, more settings, flash mode, front/rear camera.
The second level of settings: location stamping, self-timer, resolution, white balance, scene mode.

If you don’t use the touch-and-hold technique, you need at least two taps to change anything. And because of the menu layout, you might need two hands: for example, while holding the phone in landscape orientation, your right index finger can hit the settings circle, but you’ll then need your left hand to reach the HDR+ toggle, which compromises your grip on the phone.

Despite a fair amount of manual control, there’s one setting that’s glaringly absent: ISO. Apple has never let users take control of this crucial setting, but most phones from other makers provide the option. It’s particularly useful on handsets with optical image stabilization, since it lets you hold the ISO down in scenes where a long exposure is appropriate, resulting in a lower-noise image (or push it up to force a higher shutter speed when there’s subject movement). We can’t figure out why Google provides no fewer than eight resolution options (from 8MP down to QVGA) but omits this useful, and largely standard, control.
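Ironically, ISO isn’t even hard to reach at the driver level: on many handsets, particularly Qualcomm-based ones, it’s exposed through undocumented vendor-specific keys in Camera.Parameters, which is how some third-party apps get at it. The sketch below probes for those keys; bear in mind they’re vendor extensions, not part of the documented API, and there’s no guarantee any given device honors them.

```java
import android.hardware.Camera;

/** Sketch: probing for the undocumented, vendor-specific ISO parameter. */
class IsoProbe {
    // These keys are vendor extensions (common on Qualcomm devices), not part
    // of the documented Camera API, and may be absent or named differently on
    // any given handset.
    static final String KEY_ISO = "iso";
    static final String KEY_ISO_VALUES = "iso-values";

    /** Returns the supported ISO values, e.g. "auto,100,200,400,800", or null. */
    static String supportedIsoValues(Camera camera) {
        return camera.getParameters().get(KEY_ISO_VALUES);
    }

    /** Tries to set a fixed ISO; silently does nothing if unsupported. */
    static void setIso(Camera camera, String isoValue) {
        Camera.Parameters params = camera.getParameters();
        if (params.get(KEY_ISO_VALUES) != null) {
            params.set(KEY_ISO, isoValue);   // e.g. "400"
            camera.setParameters(params);
        }
    }
}
```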

And that’s not the app’s only quirk. The viewfinder display is actually a cropped preview of what the camera captures, with the final image having more vertical coverage than the preview image. What you see is not what you get. There may be a universe where this is a good idea, but we’re not living in it. The same goes for the video mode, though there the discrepancy is a bit less and you end up with extra horizontal coverage.

The camera app gives a cropped preview of the final shot. Is Google trying to evoke the charm of an old-school tunnel viewfinder? Why not simulate some parallax error for the fun of it?
Note the extra vertical coverage in the actual image.
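Our best guess is that the mismatch comes down to aspect ratios: the saved still is 4:3, while the on-screen preview is cropped to something wider, so the top and bottom of the eventual photo never appear in the viewfinder. A quick sketch of how you’d check the discrepancy on a device (the CropCheck class is our own illustration):

```java
import android.hardware.Camera;
import android.util.Log;

/** Sketch: compare preview and picture aspect ratios to gauge the viewfinder crop. */
class CropCheck {
    static void logAspectRatios(Camera camera) {
        Camera.Parameters params = camera.getParameters();
        Camera.Size preview = params.getPreviewSize();
        Camera.Size picture = params.getPictureSize();

        double previewRatio = (double) preview.width / preview.height;
        double pictureRatio = (double) picture.width / picture.height;

        // If the preview is wider than the still (say ~16:9 against 4:3), a
        // screen-filling viewfinder hides the top and bottom of the frame,
        // which would show up as exactly the extra vertical coverage seen in
        // the saved photo.
        Log.d("CropCheck", String.format("preview %.2f:1 vs picture %.2f:1",
                previewRatio, pictureRatio));
    }
}
```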

There are good camera apps on Android (FlavioNet’s Camera FV-5 comes to mind, though it doesn’t completely support the Nexus 5 as of this writing) but third-party apps rarely tie into the hardware as neatly as the native app. So while you have other options, it’s a shame that Google’s native camera isn’t more satisfying to use.