Camera app developer Hipstamatic says it has found a way to use the depth data generated by the iPhone X to improve the way its TinType app works out which areas of a picture to render out of focus. The depth information the new phone captures has allowed Hipstamatic's developers to identify a genuine plane of focus, instead of guessing and simulating the effect with software blurring alone.

A portrait taken with the new app, alongside a map showing the area of the image the camera takes to be the subject and where the plane of sharp focus should be

Hipstamatic founder Ryan Dorshorst says that the TrueDepth feature of the new iPhone X provides information at every pixel about how far away that part of the scene is, so once a subject is identified it is a much easier job to determine what counts as background, as well as what lies in front of and behind the subject – and to blur only those areas. This allows the developers not so much to improve the impression of a tintype's characteristics as to recreate the extremely shallow depth of field we associate most with large format cameras.

In previous versions of the app, a ring of blur was placed around the subject based on where the camera was focused, but this was only really effective when the subject was a person positioned in the right part of the frame. Version 2.1 instead bases the decision about where the blur should go on real depth data, so the effect can be applied convincingly in a much wider range of situations.
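The technique described above amounts to masking by depth: pixels whose measured distance falls within a band around the subject stay sharp, and everything else is blurred. The sketch below is purely illustrative – it is not Hipstamatic's actual implementation – and uses NumPy with a crude box blur on a synthetic depth map to show the idea.

```python
import numpy as np

def box_blur(img, radius=2):
    """Crude box blur: average each pixel over a (2r+1) x (2r+1) window."""
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img, dtype=float)
    size = 2 * radius + 1
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / size ** 2

def depth_based_blur(image, depth, focus_depth, tolerance):
    """Keep pixels whose depth is near focus_depth sharp; blur the rest.

    `depth` is a per-pixel distance map, as a phone's depth sensor
    might provide (values and units here are made up for the example).
    """
    in_focus = np.abs(depth - focus_depth) <= tolerance  # boolean focus mask
    blurred = box_blur(image)
    # Composite: sharp original where in focus, blurred version elsewhere
    return np.where(in_focus, image, blurred)

# Synthetic example: a "subject" at depth 1.0 in the centre of the frame,
# with a "background" at depth 5.0 everywhere else.
depth = np.full((8, 8), 5.0)
depth[2:6, 2:6] = 1.0
image = np.random.rand(8, 8)
result = depth_based_blur(image, depth, focus_depth=1.0, tolerance=0.5)
```

With real depth data the focus band would be chosen from the tapped focus point rather than hard-coded, and a graduated blur (stronger further from the focal plane) would replace the single box blur, but the masking principle is the same.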

The app is only available for iPhone users, and can be downloaded from the Apple App Store. For more information on the TinType app see Hipstamatic's TinType page.