Qualcomm CEO Reveals Secret AI Device Partnerships With OpenAI and Meta—The Smartphone Era Is Ending
Cristiano Amon disclosed that Qualcomm is developing undisclosed AI form factors with 'pretty much all' major AI companies, explicitly naming OpenAI and Meta. The new Snapdragon Wear Elite chip—supporting 2B-parameter models on-device—is the silicon foundation for what Amon calls the 'ecosystem of you': glasses, earbuds, and autonomous agents replacing the smartphone as the center of digital life.
Cristiano Amon has spent years building the chipset inside the device you’re reading this on. Now the Qualcomm CEO is telling anyone who will listen that the smartphone—Qualcomm’s primary revenue engine for two decades—is approaching the end of its reign as the center of digital life. And he’s not speculating. He’s already deep inside the plans to replace it.
In a candid interview published May 9, Amon disclosed that Qualcomm is actively working with most of the major AI companies on undisclosed hardware form factors. “There are some secret form factors that I cannot tell you about,” he said. “But I think we’re working with pretty much all of them.” He explicitly named OpenAI and Meta—while declining to name the others—as partners developing AI-native devices that, by 2028, could meaningfully absorb workloads currently handled by smartphones.
The “Ecosystem of You”
Amon’s organizing concept is what he calls the “ecosystem of you”: a constellation of always-worn devices—smart glasses with cameras, earbuds with microphones, pins, pendants, maybe a ring—all coordinated by an AI agent that sees what you see, hears what you hear, and operates continuously on your behalf across software and real-world environments.
“If AI understands what we say, what we hear, what we see—glasses are very close to your eyes, your ears, your mouth,” Amon said. The vision is not a single replacement device but a distributed personal AI layer, where compute lives on your body rather than in your pocket.
The key shift is agentic: rather than a device you pick up and interact with deliberately, this ecosystem runs in the background. Your agent books the meeting, routes the task, answers the voicemail, and surfaces only what requires your attention. The smartphone becomes, at most, an incidental node in a network primarily anchored to your face, ears, and clothing.
Snapdragon Wear Elite: The Chip Behind the Vision
The hardware enabler for this vision is the Snapdragon Wear Elite platform, which Qualcomm announced at MWC 2026 in Barcelona. It is the first wearable chip to carry Qualcomm’s “Elite” designation—a brand the company has used only for its highest-performance mobile silicon—and the first wearable SoC to integrate Qualcomm’s dedicated Hexagon NPU.
The NPU matters because it enables local inference of AI models of up to two billion parameters directly on the device, without a round-trip to the cloud. For a wearable that is meant to hear and see everything you encounter, cloud-dependent inference would introduce latency, burn battery on the radio, and raise obvious privacy concerns. Running even a moderately capable language model at the edge addresses all three.
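To see why a two-billion-parameter ceiling is a plausible fit for a wearable's memory budget, a back-of-envelope calculation helps. The quantization levels below are illustrative assumptions of my own, not Qualcomm specifications:

```python
# Rough footprint of a 2B-parameter model's weights at common
# quantization levels (illustrative arithmetic, not vendor data).
PARAMS = 2_000_000_000

def footprint_gib(bits_per_weight: int) -> float:
    """Weight storage in GiB at a given precision."""
    return PARAMS * bits_per_weight / 8 / 2**30

for bits in (16, 8, 4):
    print(f"{bits}-bit weights: ~{footprint_gib(bits):.2f} GiB")
```

At 4-bit quantization the weights shrink to under 1 GiB, which is the kind of number that starts to fit alongside a wearable OS in a few gigabytes of RAM; at full 16-bit precision the same model would need nearly 4 GiB for weights alone.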
The performance numbers against the previous generation (Snapdragon W5+ Gen 2) are substantial: five times faster single-core CPU performance, seven times faster GPU throughput. Battery life is extended by 30% in comparable use profiles, and the platform supports rapid charging to 50% capacity in approximately ten minutes. These are not incremental improvements—they represent the kind of generational leap that typically unlocks new use case categories.
Qualcomm describes the platform as enabling devices that “see what you see, hear what you hear,” with multimodal processing across voice, vision, and location running locally or in a hybrid cloud configuration depending on context and connectivity.
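The "hybrid" framing implies a runtime decision about where each inference runs. A minimal sketch of what such a router might look like, with hypothetical names and thresholds of my own (this is not a Qualcomm API):

```python
# Illustrative edge/cloud inference router. All names and thresholds
# are hypothetical; only the 2B on-device ceiling comes from the article.
from dataclasses import dataclass

ON_DEVICE_PARAM_LIMIT = 2_000_000_000  # Wear Elite's stated local ceiling

@dataclass
class Request:
    model_params: int        # parameter count the task requires
    privacy_sensitive: bool  # e.g. raw camera or microphone frames
    network_ok: bool         # usable connectivity right now

def route(req: Request) -> str:
    """Return 'device' or 'cloud' for one inference request."""
    if req.privacy_sensitive or not req.network_ok:
        return "device"  # keep raw sensor data local; work offline
    if req.model_params <= ON_DEVICE_PARAM_LIMIT:
        return "device"  # fits the NPU, so skip the round-trip latency
    return "cloud"       # needs a larger model and network is available

print(route(Request(1_000_000_000, False, True)))
```

The design choice the sketch illustrates: privacy and connectivity constraints force local execution first, and the cloud is reserved for tasks that genuinely exceed what the NPU can hold.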
OpenAI’s Hardware Push
Qualcomm’s disclosure that it is powering OpenAI’s first push into hardware adds a significant new data point to what has been a well-telegraphed but unconfirmed strategic direction. OpenAI has been reported for months to be developing a consumer device designed around its AI models—separate from ChatGPT as a software product—in collaboration with former Apple design chief Jony Ive.
Reporting from earlier this week indicated a 2027 timeline for the first OpenAI-branded device, with MediaTek as one chip supplier. Amon’s comments suggest Qualcomm is also in that conversation. Whether the eventual device is phone-adjacent hardware or a purpose-built AI wearable—Amon’s framing strongly implies the latter—remains undisclosed.
Meta’s parallel effort is more visible: the Ray-Ban smart glasses built with EssilorLuxottica already sell in volume, and Samsung has announced AI glasses of its own (model numbers SM0200P and SM0200J) as a next-generation product. Meta’s Muse Spark AI model is understood to be designed in part for distributed, wearable deployment.
Why 2028?
Amon’s choice of 2028 as the inflection point for “meaningful workload shift” from phones to new form factors is deliberate. It reflects the product cycles of the companies he is working with, the 6G timeline (which Amon has called the connectivity layer that makes ubiquitous AI agents economically viable), and the maturation of on-device AI inference silicon.
6G standardization is expected to complete around 2028, and early commercial deployments in South Korea and Japan are anticipated in the same window. The combination of ubiquitous high-bandwidth low-latency connectivity and on-device inference capable of running billion-parameter models creates the infrastructure layer that agentic wearables require to function reliably in real-world conditions.
The 2028 framing also buys time for a consumer behavior shift. The smartphone has a two-decade head start as the primary screen. Displacing it requires a device that is not merely technically capable but socially comfortable—one that people are willing to wear all day, in all contexts, without self-consciousness. Meta’s smart glasses data, Apple’s Vision Pro sales trajectory, and the consumer response to AI pins like Humane’s product (which ultimately struggled) are all informing how aggressively companies are willing to push on this timeline.
What This Means for the Semiconductor Landscape
For Qualcomm, the strategic logic is straightforward: if the smartphone unit count peaks or declines, the company needs to be the preferred silicon supplier for whatever replaces it. Snapdragon Wear Elite is the opening bid. The Edge AI chip market—covering wearables, smart glasses, earbuds with intelligence, automotive AI, and ambient computing—is projected to grow at a compound annual rate exceeding 40% through the end of the decade.
The deeper play is the software and platform layer. Qualcomm has been building out its AI Hub, a repository of optimized models designed to run on its NPUs, and the Qualcomm AI Studio developer toolkit. If the “ecosystem of you” runs on Qualcomm silicon, the company’s relevance to every AI application developer grows commensurately—whether or not those developers write a line of code targeting a Qualcomm product directly.
Nvidia has dominated the AI infrastructure conversation for two years. Qualcomm is positioning itself as the company that owns the inference edge—the last meter between the AI and the human body. The race for that position, with Samsung, MediaTek, Apple Silicon, and eventually custom chips from OpenAI and Meta all in contention, may prove to be the defining hardware competition of the second half of this decade.