How Apple AirPods with camera may differ from Meta AI glasses, details here



A report says Apple is in the final stages of launching new AirPods with built-in cameras. The new AirPods will also come with AI capabilities and could become Apple’s first wearable device designed specifically for the artificial intelligence era. While the company has already faced several delays in rolling out the new generative AI-powered Siri, it does not want to lose ground in the wearable AI gadget market to OpenAI and Meta, both of which are already working on similar AI-powered wearable devices.

Bloomberg reported that the new AirPods are currently being tested and already look and function much like the expected final product. Apple had earlier planned to launch the earbuds as early as the first half of this year, possibly alongside the new MacBook Neo. However, the report said the company was forced to postpone the launch because the new generative AI version of Siri is not ready yet.

How will cameras in Apple AirPods work?

The cameras in these earbuds are not meant for taking selfies or recording videos. Instead, they will function like eyes for Apple’s internal AI assistant, Siri, helping it observe and understand what is happening around the user.

Siri will reportedly use low-resolution visual data from the cameras to answer users’ questions about their surroundings.

What is the purpose of AI-powered AirPods?

The concept is expected to work similarly to the live visual mode in OpenAI’s ChatGPT, where users can point a camera at an object and ask questions related to it.

For example, if a user is looking at ingredients in a kitchen, they could ask the earbuds what meal they can make using those items. The AI would analyse what the cameras see and provide suggestions.

The earbuds could also remind users about something they saw earlier or improve navigation by using real-world visual information.
The AirPods will also reportedly come with a small LED light that turns on whenever the cameras are in use. The light is meant to let others know the cameras are active, although it remains unclear how noticeable it will be given how small earbuds are.

How are Apple AirPods different from Meta smart glasses?

While both the upcoming AirPods and Meta smart glasses are AI-powered wearable gadgets, the biggest difference lies in the design.
Meta smart glasses are designed to be worn on the eyes, while the upcoming AirPods are earbuds designed to be worn in the ears.

In terms of design, apart from slightly longer stems to fit the cameras, the new earbuds are expected to look similar to the AirPods Pro 3.
In terms of capabilities, several features are expected to overlap between the two devices: built-in AI assistance, live translation, navigation support, voice-controlled messaging and calls, and, reportedly, AI memory and recall features.

However, some Meta smart glasses features such as photo and video capture from a first-person perspective, gesture controls, and fitness and sports integration are not expected to be available on the new AirPods.


Published On:

May 9, 2026 17:41 IST


