Apple is developing a dedicated artificial intelligence chip to offload tasks like speech recognition and facial recognition on its mobile devices, according to Bloomberg.
The chip, internally known as the Apple Neural Engine, could help improve battery life and overall performance, the report said.
Apple is looking to include the chip in both the iPhone and iPad, and is said to have begun testing future iPhone prototypes with it. It's not clear when the dedicated chip could arrive, or whether it will be included in the next iPhone release this fall.
Apple has been rumored to be boosting its artificial intelligence offerings to compete more fiercely against Google and Amazon, which have seemingly pulled ahead in the AI market for the time being. The Apple Neural Engine could also be included in future Apple products like self-driving cars or AI-powered glasses.
Apple could discuss its AI plans during WWDC in June, Bloomberg said. Its competitor, Google, introduced new AI offerings in May that extend to phones, connected speakers, and cars.
This isn’t the first time Apple has developed a dedicated chip. It introduced its M-series motion coprocessor with the iPhone 5S and the W1 wireless chip with the AirPods.
Apple also has an AR team made up of hundreds of engineers. In March, Bloomberg reported that the team is headed by Mike Rockwell, who previously ran the hardware and new technologies groups at Dolby. Rockwell reports to Dan Riccio, head of the iPhone and iPad hardware engineering groups.
Apple’s move into AR makes sense. The company has made several large AR-focused acquisitions, including PrimeSense and FlyBy, the maker of AR-camera software. Apple CEO Tim Cook has called AR a better technology than VR, one meant for everyone rather than a niche market.