Image: Caltech

Smartphone cameras have improved considerably over the past few years, but despite innovations such as image stacking and dual cameras with image-fusion technology, they are still limited by the laws of physics. This becomes particularly evident with the 'tele' lenses that have cropped up on some recent high-end dual-camera smartphones, such as the iPhone 7 Plus or Xiaomi Mi6.

Due to space constraints in slim smartphone bodies, these tele lenses use smaller sensors and offer considerably slower apertures than their wide-angle counterparts, which makes them far less usable in low light. Now, however, it looks like a research team at Caltech may have found a solution to the problem. The team has developed an 'optical phased array' chip that uses algorithms instead of a lens to focus the incoming light. A time delay, which can be as short as a quadrillionth of a second, is added to the light captured at different locations on the chip, allowing focus to be modified without a lens.
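To get an intuition for how per-element time delays can replace a lens, here is a minimal sketch of classic delay-and-sum beamforming, which works on the same principle. This is an illustration only, not the Caltech team's actual algorithm; the element count, spacing, and optical frequency are assumed values chosen for the example.

```python
import math

# Hypothetical delay-and-sum sketch (NOT the Caltech chip's algorithm):
# an 8-element receiver row "steers" toward a light source by applying
# a compensating time delay at each element, then summing the signals.

C = 3e8                  # speed of light, m/s
FREQ = 193e12            # assumed ~1550 nm optical carrier, Hz
N = 8                    # elements, matching one row of the 8x8 grid
SPACING = C / FREQ / 2   # assumed half-wavelength element spacing

def array_power(source_deg, steer_deg):
    """Summed power when the array is steered toward steer_deg
    while a plane wave actually arrives from source_deg."""
    total = complex(0.0)
    for n in range(N):
        pos = n * SPACING
        # geometric arrival delay at element n (seconds) --
        # for these dimensions, on the order of femtoseconds
        arrival = pos * math.sin(math.radians(source_deg)) / C
        # compensating delay applied electronically
        steer = pos * math.sin(math.radians(steer_deg)) / C
        phase = 2 * math.pi * FREQ * (arrival - steer)
        total += complex(math.cos(phase), math.sin(phase))
    return abs(total) ** 2

# Scan steering angles: the response peaks when the applied delays
# exactly cancel the arrival delays, i.e. at the source direction.
powers = {a: array_power(20.0, a) for a in range(-60, 61, 5)}
best = max(powers, key=powers.get)
```

When the compensating delays match the geometry of the incoming wavefront, the contributions add coherently and the summed power peaks; changing the delay pattern re-points or re-focuses the array with no moving parts, which is what allows the instantaneous "zoom" described below.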

Professor Ali Hajimiri says the system 'can switch from a fisheye to a telephoto lens instantaneously - with just a simple adjustment in the way the array receives light.' The existing 2D lensless camera array consists of an 8x8 grid of 64 sensors and is capable of capturing a low-resolution image of a barcode. The image quality is still a long way from current smartphone cameras, but at this point the system is only a proof of concept, and potential commercial applications are a few years in the future. The team's next objective is to use larger, more sensitive receivers capable of capturing higher-resolution images.