In a patent filing called "System and method for capturing images," first submitted in September 2013 and published yesterday, Apple describes a system that directs light from two different lenses onto a single image sensor using an array of mirrors and lenses. With two lenses capturing light from opposing directions, a single image sensor in a smartphone could capture images with both the front and rear cameras. Essentially, you could get identical image quality on front and back, smaller and thinner designs, and possibly cost savings in manufacturing.

The system uses a series of electrically switchable mirrors that can change between reflective and transmissive modes when an electric current is applied. This allows the direction of incoming light to be controlled without any moving parts. One or more of those mirrors redirect incoming light to an internal lens element before it hits the sensor. A movable lens element also compensates for the different distances from the front and rear lenses to the sensor.
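As a rough sketch of the control logic such a system might need (all names, states, and values here are hypothetical illustrations, not taken from the patent), the mirror array can be modeled as elements that are either reflective or transmissive, with a controller setting their states to route exactly one lens's light path onto the shared sensor:

```python
from enum import Enum

class MirrorMode(Enum):
    REFLECTIVE = "reflective"      # redirects light toward the sensor
    TRANSMISSIVE = "transmissive"  # lets light pass straight through

class CameraController:
    """Hypothetical controller for a dual-lens, single-sensor camera."""

    def __init__(self):
        # One switchable mirror per lens path; both start transmissive.
        self.mirrors = {"front": MirrorMode.TRANSMISSIVE,
                        "rear": MirrorMode.TRANSMISSIVE}
        # Illustrative positions for the movable lens element that would
        # compensate for the differing path lengths to the sensor.
        self.lens_offsets_mm = {"front": 1.2, "rear": 0.4}

    def select_lens(self, lens: str) -> float:
        """Make the chosen lens's mirror reflective and the other
        transmissive, so only one light path reaches the sensor,
        with no moving mirrors involved. Returns the lens offset."""
        for name in self.mirrors:
            self.mirrors[name] = (MirrorMode.REFLECTIVE if name == lens
                                  else MirrorMode.TRANSMISSIVE)
        return self.lens_offsets_mm[lens]

controller = CameraController()
controller.select_lens("front")
print(controller.mirrors["front"].value)  # reflective
print(controller.mirrors["rear"].value)   # transmissive
```

Switching cameras here is just a state change on the mirror array plus one lens adjustment, which is what lets the design avoid moving a mirror mechanically.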

It would be interesting to find out whether any light is "lost" within the mirror and lens system, requiring faster apertures or more sensitive sensors to make up for it. As usual, there is no way of knowing whether this kind of technology will ever make it into a production device, but the idea of capturing images on the main sensor, no matter which way you are holding your phone, is certainly intriguing.

Source: USPTO | Via: Apple Insider