Apple is advancing its wearable technology strategy with three new AI-powered devices currently in development, according to recent reports. The company is working on smart glasses, an AI-enabled pendant, and upgraded AirPods featuring integrated cameras, all designed to work in tandem with a significantly enhanced version of Siri. These products represent Apple’s most concerted effort yet to expand beyond the iPhone into ambient computing through wearable form factors.
The initiative, first detailed by Bloomberg’s Mark Gurman, positions the upcoming wearables as direct extensions of Apple Intelligence, the company’s broader AI framework introduced across its ecosystem in 2024. Unlike the immersive mixed-reality approach of the Vision Pro headset, these new devices are intended to be lightweight, unobtrusive accessories that enhance everyday interactions through voice, visual input, and contextual awareness. All three products are expected to rely heavily on the iPhone for processing and connectivity, functioning as peripheral sensors rather than standalone computers.
Central to the vision is a next-generation iteration of Siri capable of interpreting real-time visual and auditory data from the wearer’s surroundings. This upgraded assistant would enable users to ask questions about objects in their environment, receive live translation, capture photos and videos via voice command, and get context-aware reminders based on what the device “sees.” The integration of cameras across all three wearables marks a notable shift in Apple’s privacy-conscious design philosophy, suggesting confidence in on-device processing to mitigate data exposure risks.
Among the trio, the AI-powered AirPods with cameras are reported to be closest to release, with Gurman indicating a potential launch later in 2026. These earbuds would retain the familiar AirPods form factor while incorporating tiny outward-facing cameras to gather environmental data for Siri processing. The smart glasses, meanwhile, are said to be undergoing internal prototyping, with hardware engineering teams having received test units as of early 2026. Production could begin as early as December 2026, pointing to a possible market release in 2027.
The AI pendant, described as resembling a flat disc similar in size to the AirTag 2, offers the most flexibility in wearability. It may include both a clip for attachment to clothing or bags and a hole for threading a cord to wear it as a necklace. Constructed from aluminum and glass, the device is positioned as a minimalist alternative to screen-based wearables, relying entirely on voice interaction and environmental sensing. Gurman notes that while earlier rumors suggested an AI pin similar to the Humane AI Pin, the pendant format better aligns with Apple’s aesthetic and ergonomic preferences.
Design Philosophy and User Experience Focus
Apple’s approach to these wearables emphasizes subtlety and utility over technological spectacle. The smart glasses, in particular, are being designed to resemble conventional eyewear rather than head-mounted displays. Multiple sources indicate the frames will be made from acetate, a lightweight, durable material commonly used in prescription glasses, and will avoid any form of visual overlay in the lenses. This distinguishes them from augmented reality devices that project digital content into the user’s field of view.
Instead, functionality will be delivered through audio cues, haptic feedback (where applicable), and direct iPhone integration. Users will be able to interact with Siri using voice commands to initiate calls, play music, dictate notes, or retrieve information about their surroundings. The glasses may also support features like live transcription of conversations, environmental audio analysis, and gesture-based controls via subtle head movements — all processed through the paired iPhone.
Camera capabilities are expected to vary slightly across devices. The smart glasses are rumored to include a dual-camera system: one high-resolution sensor for photos and video, and a secondary monochrome sensor optimized for depth perception and environmental mapping, similar in principle to the LiDAR scanner found in recent iPhone Pro models. This second camera could enable the device to estimate distances, detect surface textures, and recognize objects with greater accuracy — foundational capabilities for contextual AI responses.
Market Positioning and Competitive Landscape
By focusing on audio-first, visually aware wearables that offload computation to the iPhone, Apple aims to differentiate its offerings from competitors like Meta’s Ray-Ban Stories and upcoming generations of AI glasses from startups such as Humane and Brilliant Labs. While those products often emphasize standalone operation or social media integration, Apple’s strategy centers on seamless interoperability within its existing ecosystem, prioritizing reliability, privacy, and ease of use for iPhone users.
Pricing expectations remain unconfirmed, but industry analysts suggest the smart glasses could compete in the $299 to $499 range — aligning with Meta’s current pricing for its Ray-Ban smart glasses. The AirPods with cameras would likely carry a premium over standard models, potentially placing them above the current $249 price point of AirPods Pro. The AI pendant, if released, may be priced competitively with AirTag accessories or positioned as a standalone lifestyle accessory.
Crucially, none of these devices are expected to require a cellular connection or operate independently of the iPhone. This design choice reinforces Apple’s strategy of using wearables as extensions of its flagship product rather than replacements, reducing complexity in both hardware design and user onboarding. It also allows the company to leverage its existing investments in silicon, AI models, and software security without duplicating them across multiple form factors.
Privacy, Ethics, and User Trust
The integration of always-on cameras in consumer wearables raises significant privacy considerations, both for users and those in their vicinity. Apple has historically taken a cautious approach to sensor deployment, often limiting functionality until robust safeguards are in place. For these new devices, the company is expected to rely on on-device processing for visual data, ensuring that raw images or video are not transmitted to Apple’s servers unless explicitly permitted by the user for specific tasks like photo storage or cloud-based editing.
Indicators such as recording lights, audible cues, and clear user controls for disabling sensors are likely to be included to address concerns about covert surveillance. Apple’s recent emphasis on transparency in AI training data usage and its public stance against surveillance capitalism may shape how these features are communicated to consumers. Still, the success of the product line will depend heavily on public perception and regulatory scrutiny, particularly in regions with strict biometric data laws.
Apple has not yet issued any official statement confirming the development of these specific wearables. As of the latest available information, the projects remain in the internal prototyping and testing phase, with no public roadmap or regulatory filings indicating imminent release. Consumers seeking updates are advised to monitor Apple’s official newsroom, SEC filings, and verified reports from journalists with a track record of accurate supply chain insights.
As wearable AI continues to evolve, Apple’s latest moves signal a belief that the next major shift in personal computing will not come from immersive headsets, but from discreet, context-aware accessories that understand the world through the user’s senses — and respond helpfully, without demanding constant attention.