Last week Apple announced its iPhone 13 and iPhone 13 Pro lineup, which incorporate major advancements to its camera technology.

While we're waiting to get our hands on the devices, we wanted to break down the improvements the new models bring compared to the previous generation. But here's the gist of it: Apple managed to cram the larger sensor and optics seen in last year's iPhone 12 Pro Max into the much smaller 13 Mini this year, all the way down to the sensor-shift stabilization technology. So when it comes to this year's iPhone cameras, think 'bigger'.



iPhone 13 vs. 12

Diagonally arranged camera array on the iPhone 13 and 13 Mini. A diagonal arrangement ensures that both horizontal and vertical scene detail produce non-zero disparity in both portrait and landscape orientations, to aid in depth-map generation in Portrait Mode.

Last year's iPhone 12 models didn't see any step-up in sensor size from the 11 models until you got to the 12 Pro Max. Sensor size, along with the lens's maximum aperture, is the largest determinant of image quality, since dynamic range and low light performance are intimately linked to how much scene light you capture.

This year, there are significant upgrades to sensor sizes, starting with Apple bringing the largest sensor in its largest phone from last year down into its smallest new phone, the iPhone 13 Mini. Below we break down all the technical specs of the iPhone 13 (and 13 Mini, which shares identical specs) vs. the iPhone 12 (and 12 Mini) from last year:

| Camera | Lens | Pixel Pitch | Sensor Area | Equiv. aperture | Stabilization | Focus |
|---|---|---|---|---|---|---|
| iPhone 13 Wide | 26mm equiv. F1.6 | 1.7µm | 35.2mm² (1/1.9") | F8.2 | Sensor-shift | Dual-pixel AF |
| iPhone 12 Wide | 26mm equiv. F1.6 | 1.4µm | 23.9mm² (1/2.55") | F9.9 | Optical | Dual-pixel AF |
| iPhone 13/12 Ultra Wide | 13mm equiv. F2.4 | 1.0µm | 12.2mm² (1/3.4") | F20.2 | None | Fixed focus |
The iPhone 13's 1/1.9" wide (main) sensor captures 47% more light than the 1/2.55" sensor in the iPhone 12, thanks to the extra 11.3mm2 of surface area on the chip. The above specs are equivalent for the base and mini models.

Like last year's iPhone 12 Pro Max, the iPhone 13 and 13 Mini models capture ~47% more light than the iPhone 12 and 12 Pro models, thanks to the 47% larger sensor surface area resulting from the 1.7µm pixels (up from 1.4µm). The 13 and 13 Mini models also gain the sensor-shift image stabilization from the 12 Pro Max. This should enable longer exposures for low light photos (or fewer blurry ones, anyway), and might also help capture more steady video footage. To get a sense for the improved low light ability this year's iPhone 13 will bring over last year's 12 and 12 Pro models, have a look at this 30s Night Mode shot of the starry sky, comparing the 12 Pro Max to the 12 Pro, below. The ultra-wide camera remains the same.

iPhone 12 Pro Max (full-size JPEG) | iPhone 12 Pro (full-size JPEG)
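The 47% figure falls straight out of the sensor dimensions. As a quick sanity check, here is the arithmetic in Python, using the numbers from the spec table above:

```python
# Light capture scales with sensor surface area (and, at a fixed pixel
# count, with the square of the pixel pitch). Figures from the spec table.
area_12, area_13 = 23.9, 35.2    # mm^2, main camera sensors
pitch_12, pitch_13 = 1.4, 1.7    # µm, pixel pitch

gain_area = area_13 / area_12 - 1            # area ratio
gain_pitch = (pitch_13 / pitch_12) ** 2 - 1  # same gain, via pixel pitch
print(f"{gain_area:.0%} more light, {area_13 - area_12:.1f} mm^2 extra")
# -> 47% more light, 11.3 mm^2 extra
```

Both routes land on the same ~47%, since the pixel count is unchanged between generations.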

The display has been improved on the 13 and 13 Mini as well, which is particularly important when it comes to viewing the HDR stills and Dolby Vision HDR video the iPhones shoot (no, not that HDR, this HDR). It's 28% brighter and can achieve 800 nits for standard (SDR) content, maxing out at 1200 nits for HDR stills and video (a spec previously reserved for the Pro models). These maximum brightness levels can be sustained for longer periods of time due to increased display efficiency. A Ceramic Shield on top of the display adds durability, as well.

iPhone 13 Pro / Max vs. 12 Pro / Max

This year, if you want the best camera, you no longer have to choose the bigger Pro Max model - both the 13 Pro and 13 Pro Max feature the same sensors, optics, stabilization and features. The three cameras in the Pro models span a 6x optical focal length range (13mm to 77mm equiv.).

Compared to the base 13 and 13 Mini models, the Pro models gain the following:

  • Larger main camera sensor with 1.9µm pixels and wider F1.5 aperture
  • 77mm equivalent F2.8 '3x' telephoto camera with PDAF and OIS
  • Upgraded ultra-wide with a wider F1.8 lens for better low light performance, and AF for Macro photography

Let's take a closer look at the main 'wide' camera, and how it compares to last year's iPhone 12 Pro and 12 Pro Max models:

Wide (main) cameras:

| Camera | Lens | Pixel Pitch | Sensor Area | Equiv. aperture | Stabilization | Focus |
|---|---|---|---|---|---|---|
| iPhone 13 Pro / Max | 26mm equiv. F1.5 | 1.9µm | 44mm² (1/1.65") | F6.8 | Sensor-shift + Optical | Dual-pixel AF |
| iPhone 12 Pro | 26mm equiv. F1.6 | 1.4µm | 23.9mm² (1/2.55") | F9.9 | Optical | Dual-pixel AF |
| iPhone 12 Pro Max | 26mm equiv. F1.6 | 1.7µm | 35.2mm² (1/1.9") | F8.2 | Sensor-shift | Dual-pixel AF |

The 1.9µm pixel pitch of the 13 Pro and Pro Max 1/1.65"-type* main camera sensor allows it to achieve a 44mm2 surface area, which gathers 84% more light than last year's 1/2.55"-type sensor in the 12 Pro, and 25% more light than the 1/1.9"-type sensor in last year's 12 Pro Max and this year's 13 and 13 Mini models. The lens has been upgraded from an F1.6 to a wider F1.5 aperture, bringing in 14% more light.
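The percentages above are simple ratios of sensor area and, for the lens, the squared ratio of F-numbers. A quick check in Python:

```python
# Sensor-area ratios give the light-gathering gains quoted above;
# the aperture gain is the squared ratio of the F-numbers.
area_13pro, area_12promax, area_12pro = 44.0, 35.2, 23.9  # mm^2

print(f"vs 12 Pro:     {area_13pro / area_12pro - 1:.0%}")     # -> 84%
print(f"vs 12 Pro Max: {area_13pro / area_12promax - 1:.0%}")  # -> 25%
print(f"F1.6 -> F1.5:  {(1.6 / 1.5) ** 2 - 1:.0%} more light") # -> 14%
```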

The larger sensors, bigger entrance pupils, and faster apertures mean a significantly bigger camera module on this year's 13 Pro (right) compared to last year's 12 Pro (left).

That makes the sensor and lens combination F6.8 equivalent (read our article on equivalence), so you can achieve some background blur and subject separation optically, without the help of Portrait Mode. F6.8 equivalent also puts the iPhone 13 Pro's main camera only roughly 2.5 EV short of the light gathering capability of a full-frame camera with an F2.8 lens attached, assuming parity in sensor efficiency, microlens design, light gathering ability, etc. In reality that assumption is probably optimistic, so the difference is likely greater, but this is still remarkable; especially once you consider the extra light the smartphone captures thanks to its use of multi-frame tile-based image fusion (a simplistic version of which is described here).
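The equivalence arithmetic can be sketched as follows, assuming a 4:3 sensor aspect ratio (to recover a diagonal from the quoted 44mm² area; actual sensor proportions may differ slightly) and a ~43.3mm full-frame diagonal:

```python
import math

# Equivalent aperture = F-number x crop factor.
FF_DIAGONAL = 43.3  # mm, full-frame diagonal

area = 44.0                      # mm^2, 13 Pro main sensor
w = math.sqrt(area * 4 / 3)      # 4:3 aspect -> area = w * (3w/4)
diag = math.hypot(w, 0.75 * w)   # sensor diagonal, mm
crop = FF_DIAGONAL / diag

equiv_f = 1.5 * crop                          # F1.5 lens
ev_behind_ff = 2 * math.log2(equiv_f / 2.8)   # vs full frame at F2.8
print(f"~F{equiv_f:.1f} equivalent, ~{ev_behind_ff:.1f} EV behind")
```

This lands at roughly F6.8 and about two and a half stops behind the hypothetical full-frame F2.8 setup, matching the figures above.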

This is where smartphones capture multiple images in a sequence and align them intelligently to cope with moving elements in the scene: essentially making up temporally for what they lack in spatial light capture (sensor size).
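As a toy illustration of that temporal trade (not Apple's actual tile-based pipeline), averaging N noisy frames of the same scene cuts random noise by roughly the square root of N:

```python
import random
import statistics

random.seed(42)
TRUE_SIGNAL = 100.0

def noisy_frame(sigma=10.0):
    """One 'capture' of the scene value, with noise lumped into one term."""
    return TRUE_SIGNAL + random.gauss(0, sigma)

# Fuse 9 aligned frames by plain averaging (real pipelines weight and
# reject tiles to handle motion, but the noise benefit is similar).
singles = [noisy_frame() for _ in range(5000)]
fused = [statistics.fmean(noisy_frame() for _ in range(9)) for _ in range(5000)]

print(round(statistics.stdev(singles)))  # ~10 (single frame)
print(round(statistics.stdev(fused)))    # ~3  (10 / sqrt(9))
```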

This year's Pro models capture roughly 100% and 40% more light in the dark compared to last year's Pro and Pro Max, respectively

So, bigger sensors and brighter apertures are better, but what does all this mean compared to the previous generation models? You can expect the iPhone 13 Pro to show at least a 1 EV improvement in low light performance compared to last year's 12 Pro, and a 0.5 EV improvement compared to last year's 12 Pro Max and this year's 13 and 13 Mini models. That's because the combination of the larger sensor and brighter lens on the 13 Pro main camera lets in roughly 100% and ~40% more light than the 12 Pro and 12 Pro Max / 13 main cameras, respectively. Apple claims a bit more (2.2x and 47%), so there may be other factors or efficiencies at play, but the numbers are broadly similar. Video, which doesn't benefit from as much multi-frame fusion as stills (though at least two frames of varying exposure are combined per output frame), should see a particularly notable improvement in quality.
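Those EV figures are just the base-2 log of the combined light ratios (sensor area ratio times squared aperture ratio):

```python
import math

def light_ratio(area_a, f_a, area_b, f_b):
    """How much more light camera A gathers than camera B."""
    return (area_a / area_b) * (f_b / f_a) ** 2

vs_12pro = light_ratio(44.0, 1.5, 23.9, 1.6)     # ~2.1x
vs_12promax = light_ratio(44.0, 1.5, 35.2, 1.6)  # ~1.4x
print(f"{math.log2(vs_12pro):.1f} EV, {math.log2(vs_12promax):.1f} EV")
# -> 1.1 EV, 0.5 EV
```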

Furthermore, the combination of sensor-shift and optical lens stabilization this year, compared to either optical or sensor-shift stabilization in last year's models, should allow for longer hand-held exposures, further improving Night Mode. When held steady enough (on a tripod), last year's Pro models shot 30s exposures of 10 frames, 3s each; this year's Pro models with combination IS are capable of shooting 3 frames, 10s each in Night Mode, presumably to reduce the impact of read noise. It's possible this combination may also benefit video stabilization.
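A simplified model shows why fewer, longer frames help: per-frame read noise stacks in quadrature across the fused frames, while the total 30s of signal stays fixed. (The 2 e⁻ read noise below is an assumed value for illustration, not a measured iPhone figure.)

```python
import math

READ_NOISE = 2.0  # e- per readout (assumed, for illustration only)

def stacked_read_noise(n_frames, sigma=READ_NOISE):
    # Independent noise sources add in quadrature across fused frames.
    return sigma * math.sqrt(n_frames)

old = stacked_read_noise(10)  # 12 Pro: 10 frames x 3s
new = stacked_read_noise(3)   # 13 Pro:  3 frames x 10s
print(f"{old / new:.2f}x less read noise")  # -> 1.83x less read noise
```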

Ultra wide cameras:

Last year's iPhone 12 Pro models were capable of some stunning results with the ultra-wide module, thanks to the availability of Night Mode and ProRaw. (1s night mode)

The ultra-wide cameras have also been upgraded, now with a brighter F1.8 aperture (compared to F2.4 on last year's models), and with phase-detect autofocus. That makes the ultra-wide camera now F15.1 full-frame equivalent, up from F20.2 on last year's models. That amounts to 78% (0.83 EV) more light, a big helping hand for that small 1/3.4"-type sensor.

| Camera | Lens | Pixel Pitch | Sensor Area | Equiv. aperture | Stabilization | Focus |
|---|---|---|---|---|---|---|
| iPhone 13 Pro / Max | 13mm equiv. F1.8 | 1.0µm | 12.2mm² (1/3.4") | F15.1 | None | Sparse PDAF |
| iPhone 12 / Pro / Max | 13mm equiv. F2.4 | 1.0µm | 12.2mm² (1/3.4") | F20.2 | None | Fixed focus |
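The 78% and 0.83 EV figures come from the aperture change alone, since the sensor itself is unchanged. A quick check in Python:

```python
import math

# Same 1/3.4"-type sensor, so only the F-number changes the light math:
# light gathered scales with the square of the F-number ratio.
gain = (2.4 / 1.8) ** 2
ev = math.log2(gain)
print(f"{gain - 1:.0%} more light, {ev:.2f} EV")  # -> 78% more light, 0.83 EV
```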

The addition of autofocus allows the ultra-wide to focus down to 2 cm for some pretty compelling macro photography, judging from the samples Apple shared.

Apple also claims a 'faster sensor', which we interpret as 'faster readout speed'. Faster readout speeds offer a number of benefits, including reduced rolling shutter artifacts and less banding under artificial lighting. Additionally, faster sensor readout can theoretically improve electronic image stabilization in video by increasing the interval between when one frame of video has been read out and the next one needs to be acquired. But we have no way of knowing yet if the 13 Pro models realize this benefit.
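Apple hasn't published readout times, so the numbers below are purely illustrative, but they show why a faster readout leaves more slack per frame for stabilization and processing:

```python
# Hypothetical readout times -- Apple publishes no such figures.
FPS = 30
frame_period_ms = 1000 / FPS  # ~33.3 ms budget per frame at 30p

for readout_ms in (16.0, 8.0):  # slower vs faster (assumed) sensors
    idle_ms = frame_period_ms - readout_ms
    print(f"{readout_ms:.0f} ms readout -> {idle_ms:.1f} ms of slack")
```

Halving the (assumed) readout time here grows the idle window from about 17ms to about 25ms per frame.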

Telephoto cameras:

The telephoto camera has been upgraded in a couple of ways: first, at 77mm equivalent it's now 3x that of the wide camera's 26mm equivalent field-of-view, up from the 2.5x 65mm equiv. module on last year's 12 Pro Max, or the 2.0x 52mm equiv. module on the 12 Pro. That gives you more reach, and more potential for subject isolation. With that increase in 'zoom' though comes a decrease in light gathering ability: the F2.0 and F2.2 apertures on last year's Pro and Pro Max models, respectively, are replaced with an F2.8 aperture, which turns out to be roughly F23.8 equivalent in full-frame terms.

So don't expect much subject-background separation outside of Portrait Mode. None of this is too surprising: as focal length increases, the physical size of the aperture must increase to maintain the same F-number, and that means, you guessed it, bigger optics. There's only so much room inside these tiny camera modules.

| Camera | Lens | Pixel Pitch | Sensor Area | Equivalent aperture | Stabilization | Focus |
|---|---|---|---|---|---|---|
| iPhone 13 Pro / Max | 77mm equiv. F2.8 | 1.0µm | 12.2mm² (1/3.4") | F23.8 | Optical (OIS) | Sparse PDAF |
| iPhone 12 Pro Max | 65mm equiv. F2.2 | 1.0µm | 12.2mm² (1/3.4") | F18.7 | Optical (OIS) | Sparse PDAF |
| iPhone 12 Pro | 52mm equiv. F2.0 | 1.0µm | 12.2mm² (1/3.4") | F17.4 | Optical (OIS) | Sparse PDAF |

The second, arguably more exciting improvement to the telephoto module is the availability of Night Mode. Last year, iPhone 12 brought Night Mode to the ultra-wide camera, but this year, it's finally available on all three cameras (if you thought you were getting Night Mode on the telephoto module previously, it was only because light levels had dropped so low that the iPhone was switching to the 1x module and cropping in).

This is a welcome addition (Google's Pixel 4, for example, enjoyed it on its 48mm equiv. module back in 2019) which should bring a dramatic improvement to low light photos shot with the telephoto module, which are otherwise hampered by the small sensor and the (relatively) narrow aperture.

Photographic Styles

iPhone photos have a look. It changes year to year, but they tend to be consistently well exposed with relatively mild contrast (this contrast is enhanced, though, when viewed in HDR on iPhone's OLED displays). Colors are pleasing without appearing overdone, with bright blue, cyan-ish skies, skin tones that one would rarely describe as dull, and generally neutral white balance, though starting with the iPhone 11 there's been a slight greenish tint overall. It's quite different from the vibrant, warm output of Samsung, or the cool, contrasty look of Google's Pixel phones. iPhone output can be described as pleasing across a wide variety of scenarios due to this choice to remain balanced, and it's a great starting point from which to introduce your own personal style.

Enter Photographic Styles. If you've often found yourself editing your iPhone photos to add a bit of tint, perhaps some punch to your photos - especially if you find yourself doing so repeatedly - you'll appreciate 'Photographic Styles', a way to personalize your iPhone camera to your own tastes. It essentially allows you to dial in your individual preferences and have them applied to all the images you shoot. But it goes deeper than that.

A representation of Apple's multi-frame image processing pipeline, with the discrete steps in the process identified. Pay particular attention to the 'semantic rendering' mask, which allows Apple to apply various tone operations selectively to different elements of the scene, such as skies and faces.

This isn't a simple filter being applied after you've shot your photo. Instead, it's integrated into the advanced multi-frame image processing pipeline, applying local edits at the appropriate stages so that - as we understand it - edits meant to cool the overall image don't inadvertently cool skin tones, or vice versa. And since it's being applied directly to the multi-frame ISP, it's all being done in real-time as the photo is rendered, so you can preview the effect as you're shooting.

You start by choosing from the following default presets to change the overall look of your photo: Standard, Rich Contrast, Vibrant, Warm and Cool. You can then tweak each one of these to your own liking with individual controls over tone and warmth.

According to Apple's Rebecca Pujols, each mode uses Apple's semantic understanding - its discrete knowledge of various portions of the photo like skies vs. grass vs. faces - to apply appropriate local adjustments, whilst preserving skin tones. We're not sure if that means that skin tones are preserved entirely no matter the preset, or whether skin tones are treated differently based on the individual adjustment or parameter, but the gist of it is that different scene elements can be adjusted independently. We'll be digging into this once we've received samples of the phones. We do know that currently, Photographic Styles is incompatible with Apple ProRaw.
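To make the idea concrete, here's a toy sketch of how a semantically masked adjustment could work. The function, mask format and numbers are our own illustration, not Apple's pipeline: a global warmth shift is blended out wherever a "skin" mask is confident, so skin tones are left alone.

```python
# Toy sketch of semantically masked adjustment (our interpretation, not
# Apple's pipeline): warm the image, but blend the adjustment out
# wherever the skin mask is high, preserving skin tones.
def apply_style(pixels, skin_mask, warmth=20):
    out = []
    for (r, g, b), skin in zip(pixels, skin_mask):
        w = warmth * (1.0 - skin)  # zero shift where the mask says skin
        out.append((min(255, r + w), g, max(0, b - w)))
    return out

scene = [(120, 110, 100), (200, 160, 140)]  # e.g. sky-ish, skin-ish pixel
mask  = [0.0, 1.0]                          # 1.0 = confident skin
print(apply_style(scene, mask))
# first pixel warmed (R up, B down), second left untouched
```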

During the keynote, Pujols also mentioned that Smart HDR 4, Apple's latest iteration of its intelligent multi-frame capture and fusion pipeline, now uses a learning-based approach to identify individual subjects in a photo and process different skin tones individually.

Video

In the video department, Apple has made a big play by introducing Cinematic Mode, which is actually a number of things. It's essentially a 'Video Portrait Mode' and a way to change focus (and depth-of-field effect) after-the-fact, something we've dug into deeper in our article here. We think it could revolutionize video for the consumer, and in a few generations perhaps even for a budding cinematographer. See the video below for a demonstration from Apple, and here's a music video by an independent producer (NSFW).

Apple has said that ProRes video will also be made available in a future update to the 13 Pro and Pro Max, at up to 1080/30p with the 128GB storage option and up to 4K/30p with the 256GB to 1TB storage options. Apple says it has also focused this year on using machine learning to intelligently apply sharpening selectively to different subjects in the scene. In general, some of the semantic rendering used to process various portions of photos differently, in a context-sensitive manner, is now possible in video.

Display

The display on the 13 Pro models has also been improved. First, efficiency improvements allow the display to attain 1000 nits peak brightness outdoors, up from 800 nits last year. We can't stress enough how useful this is for camera use in bright conditions. Many Android devices with highly rated screens (capable of 1000 nits in HDR mode, for example) become quite difficult to use outdoors when bright sunlight overwhelms the phone's display.

Furthermore, new to the iPhone 13 Pro models is ProMotion, which allows the display to slow down or speed up its refresh rate as necessary, depending on what's happening on screen. Scrolling slowly? The display will slow down to 10Hz. Watching high frame rate video or scrolling through your photo library quickly? The display will speed up to 120Hz. Want your cinematic look preserved at 24p without 3:2 pulldown? Not a problem. The ability to slow the display down to 10Hz conserves battery life, and 120Hz is of course no problem for the near-instantaneous response of the OLED technology driving the display.

While the non-Pro models don't feature ProMotion technology, and aren't quite as bright, they're just as competitive in the resolution department: the 13 has the exact same size and resolution as the 13 Pro (6.1", 2532 x 1170, 460 ppi), the 13 Mini has an even higher pixel density (476 ppi, 5.4", 2340 x 1080), and the 13 Pro Max features the largest screen (6.7", 2778 x 1284, 458 ppi).
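Those pixel densities are easy to verify from the panel specs: ppi is the diagonal resolution in pixels over the diagonal size in inches. Our arithmetic lands within a couple of ppi of Apple's quoted figures, which presumably round the physical diagonal slightly differently:

```python
import math

# PPI = diagonal resolution (pixels) / diagonal size (inches).
displays = {
    "iPhone 13 / 13 Pro": (2532, 1170, 6.1),
    "iPhone 13 Mini":     (2340, 1080, 5.4),
    "iPhone 13 Pro Max":  (2778, 1284, 6.7),
}
for name, (w_px, h_px, diag_in) in displays.items():
    ppi = math.hypot(w_px, h_px) / diag_in
    print(f"{name}: ~{ppi:.0f} ppi")
```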

HDR

OLED displays, like those on recent iPhones, are capable of brighter whites and darker blacks than older displays. On these displays, the shadow region would be brighter and able to express more contrast, while the sky would be brighter still and more distinct from midtones, just as it would be in the real world. However, it's impossible to convey the capabilities in a JPEG image like the one above, viewed on an SDR display, so the difference you see here - between what this photo would look like on a typical smartphone vs. an iPhone - is far more dramatic in reality.

We'll also take a moment here to point out that the display, in combination with Apple's decision with the iPhone X to enable what we term 'HDR display' or 'HDR playback' of still images (and video, for that matter), is what sets Apple iPhone devices apart from its competition. While a detailed explanation is beyond the scope of this article, HDR playback allows stills and video to have far more output dynamic range - to appear less flat - than standard dynamic range (SDR) output (see above).

Brights appear much brighter than midtones, and both midtones and highlights appear far brighter than shadows, which can be rendered deep and dark, yet still well defined and visible. That means tonal relationships between objects in the real world are better preserved - reflections of sunlight on a lake appear much brighter than the surrounding water as they do in reality (which won't be the case when you view this image on your computer monitor). And thanks to Apple's use of the wide P3 color space, photos can contain brighter and more saturated tones.

The iPhone 12 lineup saw some dramatic improvements to HDR playback, with nearly all still images enjoying the benefit of the increased output dynamic range (with some caveats), and video - which has enjoyed HDR capture for some time now - finally getting a proper HDR output format in the form of Dolby Vision for increased output or displayed dynamic range. We look forward to assessing further improvements to Apple's HDR rendering of its stills and videos in the future.

We hope that gives you a comprehensive look at what to expect from Apple's new devices. What would you like us to investigate and take deeper dives into? Please share your thoughts in the comments.


*The inch-type measurement for sensor size is arguably archaic at this point, and made more sense back when there were only a limited set of available sensor sizes. Today, we continue to see new sensor sizes appear, particularly in smartphones, so some of the inch-types we've indicated here are only approximations and in some cases previously unheard of (e.g. 1/1.65"-type). In such cases, we've included them for comparison purposes only.