Apple already uses AI to help with iPhone image processing via the Neural Engine that has been part of A-series chips since the iPhone 8 and iPhone X. But future iPhones could see some AI processing performed directly by the camera sensor itself …
The Verge reports that Sony – which supplies the advanced image sensors used in current iPhones – is working on building AI processing directly into the sensor itself. It has already announced the first such chip, though that model is geared to commercial rather than consumer applications.
The IMX500 is for retail and industrial uses, such as facial recognition to support checkout-free stores, but the company is clearly thinking ahead to smartphone applications.
Many applications rely on sending images and videos to the cloud for analysis. That round trip can be slow and insecure, exposing data to hackers. In other scenarios, manufacturers have to install dedicated processing cores on devices to handle the extra computational demand, as with the latest high-end phones from Apple, Google, and Huawei.
But Sony says its new image sensor offers a more streamlined solution than either of these approaches.
Whether Apple would take advantage of such sensors is unclear. The company puts a lot of work into its own image-processing algorithms, but it’s possible that future iPhones could use sensors that handle some AI work on-chip before handing off to the A-series chip for further processing, potentially boosting both performance and efficiency.