Apple is reportedly planning to bring cameras to its Apple Watch lineup within the next two years, with advanced AI-powered features like Visual Intelligence on board. According to Bloomberg’s Mark Gurman in his latest Power On newsletter, these cameras will be integrated into the devices differently depending on the model. On the standard Apple Watch Series, the camera will be embedded “within the display,” while on the Apple Watch Ultra it will be positioned on the side, near the Digital Crown and button. This upgrade would allow the smartwatch to “observe its surroundings” and use artificial intelligence to provide users with useful, real-time information.
This isn’t limited to the Apple Watch. Gurman also hints that Apple is exploring similar camera technology for future AirPods, aiming to enhance their functionality with AI-driven capabilities. The concept mirrors the Visual Intelligence feature that first launched with the iPhone 16. On the iPhone, this tool uses the phone’s camera to perform tasks like extracting event details from a flyer and adding them to your calendar, or identifying a restaurant to fetch more information about it. Currently, Visual Intelligence relies on AI models from external companies, but Apple is working on developing its own in-house AI technology to power these features by 2027, the expected release year for the camera-equipped Apple Watch and AirPods.
The push for AI in Apple’s wearable devices ties into broader efforts led by Mike Rockwell, a key figure at the company. Gurman notes that Rockwell, who previously led the Vision Pro project, is now tasked with overseeing the much-needed upgrade to Siri’s language model, which has faced delays. Alongside this, he continues to contribute to visionOS, the software likely to support another upcoming Apple wearable: AI-enhanced AR glasses.