Apple's Photos app and other gallery solutions have used AI (artificial intelligence) technology for years to identify objects and scenes for image sorting, searching, and categorization. But according to code found in the firmware of Apple's HomePod, the same technology may soon be implemented in the iPhone Camera app, working in real time (rather than post-capture) to optimize camera settings such as exposure, white balance, and HDR for specific scenes.

The new feature is called "SmartCam," and it takes the widely used face detection technology to a new level. The code—discovered as part of a firmware leak for Apple's HomePod—lists several scene types the feature could recognize, including baby photos, pets, the sky, snow, sports, sunsets, fireworks, foliage, documents, and more. This reads much like a list of conventional camera scene modes, but without the need to select them manually in the camera menu—the camera would simply recognize the scene and adjust its settings accordingly.
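Apple has not published any details of how SmartCam works, but the idea of mapping a detected scene label to a bundle of camera settings can be sketched in a few lines. The preset values, the `settings_for_scene` function, and the label names below are purely illustrative assumptions, not Apple's actual implementation:

```python
# Hypothetical sketch: map a detected scene label to camera-setting
# presets, the way a "SmartCam"-style feature might. The scene labels
# echo those reported from the HomePod firmware; all preset values
# are invented for illustration.

SCENE_PRESETS = {
    "sunset":    {"white_balance": "warm", "hdr": True,  "exposure_bias": -0.3},
    "snow":      {"white_balance": "cool", "hdr": False, "exposure_bias": 0.7},
    "fireworks": {"white_balance": "auto", "hdr": False, "exposure_bias": -1.0},
    "document":  {"white_balance": "auto", "hdr": False, "exposure_bias": 0.0},
}

# Neutral fallback when the classifier reports an unknown scene.
DEFAULT_PRESET = {"white_balance": "auto", "hdr": True, "exposure_bias": 0.0}

def settings_for_scene(label: str) -> dict:
    """Return the camera-setting preset for a detected scene label,
    falling back to a neutral default for unrecognized scenes."""
    return SCENE_PRESETS.get(label, DEFAULT_PRESET)

# Snow scenes, for example, would get a positive exposure bias so the
# bright snow isn't underexposed to gray.
print(settings_for_scene("snow"))
print(settings_for_scene("picnic"))  # unknown label -> default preset
```

In a real camera pipeline, the scene label would come from a machine-learning classifier running on each preview frame, and the chosen preset would be fed to the capture hardware before the shutter fires—which is what distinguishes this from the post-capture tagging Photos already does.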

The SmartCam feature was not announced by Apple at its Worldwide Developers Conference, which likely means it won't be made available to older iPhones via an update to iOS 11. It's possible Apple wants to retain the feature as a unique selling proposition for the next-generation iPhone models, which are expected to be announced in September.