Apple Tests Four Smart Glasses Designs as It Prepares to Take On Meta Ray-Bans
Bloomberg's Mark Gurman revealed Apple is testing at least four frame styles for AI-powered smart glasses set for a 2027 launch, powered by a custom N401 chip and dual cameras. The no-display device positions Apple squarely against Meta's wildly popular Ray-Ban smart glasses, while a simultaneous leadership shake-up sees AI chief John Giannandrea retire.
Apple is testing at least four distinct frame designs for its upcoming AI-powered smart glasses, according to a sweeping report by Bloomberg’s Mark Gurman published over the weekend. The report — which also disclosed the surprise departure of Apple’s top AI executive — offers the most detailed window yet into a product that could define the next chapter of wearable computing.
Four Designs, Premium Materials
The prototypes under evaluation span two broad shape families: rectangular and oval. Within those categories, Apple is testing a large rectangular frame, a slimmer rectangular version said to closely resemble the glasses worn by CEO Tim Cook, a larger oval or circular frame, and a smaller oval or circular variant. Engineers are working with acetate as the primary body material — a choice that signals Apple’s intention to position the glasses as a premium fashion accessory rather than a utilitarian gadget.
Confirmed colorways in active testing include classic black, ocean blue, and a warm light brown. The front-facing cameras will be arranged in an oval pattern surrounded by subtle indicator lights — a nod to both aesthetics and the regulatory and social norms around ambient recording that have tripped up earlier smart-glasses products.
The N401 Chip and a Camera-First Philosophy
Under the hood, Apple’s glasses will run on the N401, custom silicon designed in-house and based on the same S-series architecture that powers the Apple Watch. The chip is deliberately modest: the glasses have no display of any kind, so all computationally intensive processing is offloaded to a paired iPhone. Two cameras handle distinct jobs — one optimized for photo and video capture, the other dedicated to computer vision tasks like object recognition and spatial awareness.
This camera-first, display-free approach is a studied departure from the Vision Pro’s inside-out computing philosophy. Apple is betting that consumers who found the Vision Pro too isolating want ambient intelligence woven into regular eyewear rather than a headset that seals them off from the world.
AI Features: Siri Takes Center Stage
The primary interface is an upgraded Siri, which Apple has been rebuilding with large-language-model capabilities over the past two years. On these glasses, Siri will handle a full suite of ambient tasks: reading out notifications, playing music, managing phone calls, providing turn-by-turn navigation through audio, and performing live translation.
Visual Intelligence — a feature already shipping on the iPhone 16 and later — gets a persistent, always-available form factor here. Users will be able to point their gaze at an object, sign, menu, or landmark and ask Siri about it without reaching for a phone. Apple’s partnership with Google means some queries will route through a custom Gemini model, supplementing on-device capabilities when cloud inference is needed.
The glasses are designed to work in concert with two other ambient-AI devices Apple is reportedly developing in parallel: a camera-equipped AI pendant that can be clipped to clothing for continuous visual context, and camera-equipped AirPods that feed audio-spatial context to Siri. Together, the three products would form an always-on sensory network feeding Apple Intelligence.
Giannandrea Out, Subramanya In
Gurman’s report included a significant organizational revelation: John Giannandrea, Apple’s Senior Vice President for Machine Learning and AI Strategy since 2018, is retiring. Giannandrea, who was poached from Google where he ran Search and AI, presided over the foundational architecture of Apple Intelligence but was widely criticized internally and by analysts for the slow, stumbling rollout of advanced Siri features.
His replacement is Amar Subramanya, a former Microsoft AI executive. The timing is notable: Apple is entering a period where its AI consumer products — including these glasses — will be judged directly against offerings from Meta, Google, and increasingly Amazon. Subramanya’s deep enterprise AI background at Microsoft suggests Apple wants sharper execution discipline as it accelerates product delivery.
The Meta Threat
The competitive urgency is real. Meta’s Ray-Ban smart glasses — developed in partnership with EssilorLuxottica — have become a runaway hit, with Meta shipping several million units and iterating rapidly on capabilities. Meta’s latest version supports real-time AI vision, voice commands through Meta AI, and live translation, features that overlap directly with what Apple is building.
Meta’s strategic advantage is its established fashion partnership and a price point well below what Apple will likely charge. Apple’s counter-bet is that it can deliver a tighter hardware-software integration, superior audio and camera quality, and a premium brand identity that justifies a premium price — much as it did with AirPods when it entered a crowded wireless earphone market in 2016.
Third-party analysts estimate Apple could ship between 8 million and 15 million smart glasses units in the product’s first full year if pricing lands below $500 — roughly comparable to the top of the AirPods lineup. IDC forecasts the global smart glasses market to exceed $25 billion in annual revenue by 2028, with AI-enabled wearables accounting for the vast majority of growth.
Timeline and What Comes Next
Apple is targeting an announcement either at its fall 2026 event or in early 2027, with retail availability in spring or summer 2027. The company typically runs parallel prototype tracks before committing to a final industrial design, so the existence of four distinct frame styles does not necessarily mean all four will ship — Apple may converge on two or even one before finalization.
What is clear is that the timeline has firmed up considerably from the exploratory phase that characterized Apple’s wearables road map just twelve months ago. With Meta accelerating, Google testing its own AI glasses through partnerships, and Amazon rumored to be working on smart glasses tied to Alexa, Apple no longer has the luxury of a patient, multi-year runway.
The next wave of personal computing is being worn on faces. Apple, characteristically, is arriving later than the pioneers. But it rarely arrives last.