
WWDC 2026 Preview: iOS 27 Brings Gemini-Powered Siri, Third-Party AI Extensions

Apple's Worldwide Developers Conference runs June 8–12 in Mountain View, and it could be the company's most consequential software event in years. iOS 27 is expected to debut a redesigned Siri backed by Google's Gemini, a dedicated Siri app, and a new "Extensions" system that lets users route AI queries to Claude, Grok, and other third-party chatbots.

5 min read

Apple’s 37th annual Worldwide Developers Conference opens on June 8, 2026, at the Shoreline Amphitheatre in Mountain View, California. Every WWDC carries weight, but this year’s edition arrives at an inflection point: the company that spent the better part of two years defending a cautious, on-device-first approach to AI now has one of the boldest Siri transformations in the assistant’s history ready to show developers. iOS 27 is shaping up to be the most AI-intensive release Apple has ever shipped.

Siri 2.0: From Voice Command to Full Chatbot

The centerpiece of WWDC 2026 will almost certainly be a complete overhaul of Siri — the virtual assistant that, despite nearly fifteen years of incremental improvements, has consistently lagged behind rivals from Google, OpenAI, and Amazon. That changes with iOS 27.

The redesigned Siri, expected to be publicly demonstrated for the first time at the June 8 keynote, will function as a full conversational chatbot. It will support web search, image generation, content summarization, coding assistance, file analysis, and multi-step command execution — capabilities that bring Apple’s assistant into genuine parity with ChatGPT, Google Gemini, and Anthropic’s Claude.

The visual redesign is also significant. When users invoke Siri, the Dynamic Island at the top of the screen will display a “Search or Ask” prompt. Activation will trigger a “thin glow” around the Dynamic Island — a design language that breaks from the full-screen animation of previous versions and signals that Siri is now meant to feel more like an ambient intelligence layer than a discrete app.

Apple is also introducing a dedicated Siri app, preinstalled on iOS 27. The app allows for sustained back-and-forth conversation and maintains a full conversation history — a feature users of ChatGPT and Google Gemini have enjoyed for years. For Siri, it marks a philosophical shift: from single-turn commands to continuous, context-aware dialogue.

Google Gemini at the Core

Powering much of this transformation is Google. Under the Apple-Google partnership finalized in January 2026, the next generation of Apple Foundation Models will be built on a custom 1.2 trillion-parameter Gemini model developed specifically for Apple’s ecosystem. Apple is reportedly paying approximately $1 billion per year for access.

The arrangement gives Apple “a lot more freedom with Google’s tech” than was initially disclosed. Apple can use the foundation model to produce smaller, task-specific models: some run entirely on-device, while heavier requests are routed through Private Cloud Compute, the server-side architecture that preserves the privacy guarantees Apple considers a key competitive differentiator. For users, the experience will feel like Siri: seamless, integrated, and private. Behind the scenes, it’s Google’s most advanced infrastructure.

This partnership represents a remarkable realignment. For years, Apple and Google competed directly on AI features. The collaboration — structured so that Apple retains control over the user experience while Google supplies the underlying intelligence — may prove to be one of the most consequential partnerships in the consumer AI era.

Third-Party AI Extensions: Ending OpenAI’s Exclusive

Perhaps the most structurally significant announcement expected at WWDC 2026 is what Apple is calling Siri “Extensions” — a new system in iOS 27 Settings that lets users route AI queries to third-party chatbots. According to multiple reports, the launch lineup includes Claude from Anthropic, Google Gemini, Grok from xAI, and potentially others.

This ends the exclusive arrangement that OpenAI has held with Apple since the original Apple Intelligence rollout. OpenAI’s ChatGPT integration — announced with considerable fanfare at WWDC 2024 — has given the company privileged access to hundreds of millions of iPhone users. With Extensions, that access becomes non-exclusive. Claude, Gemini, and Grok will be able to compete on equal footing within the iOS interface.

The implications for the AI industry are significant. iPhone users represent one of the most valuable distribution channels in consumer technology. The ability to set a preferred AI assistant as a first-class citizen within iOS — accessible directly from Siri — gives every major AI lab a new vector for user acquisition and retention. Anthropic, in particular, stands to gain from access to Apple’s demographic: wealthier, more technically engaged users who are precisely the audience that AI subscription services need to convert.

Android XR Smart Glasses: From Concept to Product

WWDC 2026 also unfolds against the backdrop of Google’s Android XR smart glasses platform, which debuted as a concept at last year’s I/O conference and is now moving toward shipping hardware. Multiple hardware partners are involved in the smart glasses effort, potentially enabling a range of price points when devices ship — competitive pressure that shapes the context for anything Apple shows in the wearables space.

Apple’s own augmented reality and mixed reality efforts are expected to remain largely in the background at this year’s conference. The Vision Pro has been on the market for over a year, and while enterprise adoption has grown, consumer uptake remains limited. Developers attending WWDC will likely see incremental improvements to visionOS rather than a major platform announcement.

Developer Tools and AI Infrastructure

Beyond the headline Siri news, WWDC 2026 will feature an expanded set of developer tools for integrating AI into apps. Apple is expected to announce deeper APIs for on-device model inference, improved Core ML tooling, and expanded support for the Foundation Models framework that lets third-party apps run Apple’s models locally.
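The Foundation Models framework that Apple shipped with iOS 26 gives a sense of what these deeper APIs could build on. A minimal sketch of an on-device prompt using that existing framework follows; the exact API surface may evolve in iOS 27, and the availability check and instructions string here are illustrative.

```swift
import FoundationModels

// Check that the on-device system model is available on this hardware
// before creating a session; older devices may not support it.
let model = SystemLanguageModel.default
guard case .available = model.availability else {
    fatalError("On-device model unavailable on this device")
}

// A session holds conversation state; the instructions string steers
// the model's behavior across every prompt in the session.
let session = LanguageModelSession(
    instructions: "You summarize text in one sentence."
)

// Prompting is asynchronous; inference runs entirely on-device.
let response = try await session.respond(
    to: "Summarize: WWDC 2026 runs June 8–12 in Mountain View."
)
print(response.content)
```

Because the session object accumulates context, follow-up calls to `respond(to:)` can reference earlier turns — the same continuous-dialogue model the new Siri app is expected to expose to users.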

The agentic coding toolchain is also on the agenda. Apple has been building features that allow Xcode to leverage AI for code completion, documentation generation, and automated testing — positioning the company’s IDE to compete more directly with GitHub Copilot and Cursor in the developer productivity space.

What WWDC 2026 Means for Apple’s AI Narrative

Since the original Apple Intelligence announcement in 2024, critics have pointed to the slow rollout and modest capabilities of Apple’s AI features compared to rivals. The company faced particular pressure after OpenAI’s GPT-5 series raised the bar dramatically, and after Google’s Gemini became the AI engine of choice for a growing number of Apple users who sidestepped Siri entirely.

WWDC 2026 is Apple’s answer. By packaging Gemini’s raw capability in Apple’s privacy-first UX, and by opening the platform to multiple AI providers, Apple is positioning itself not as an AI developer but as the premier AI delivery mechanism for mobile. Rather than trying to out-research OpenAI or Anthropic, Apple is betting on distribution, design, and ecosystem lock-in.

Whether that strategy succeeds will depend on how well iOS 27’s Siri actually performs in daily use. The history of AI assistant announcements is littered with impressive demos that fell short of expectations in the hands of real users. June 8 will tell us whether Apple’s Gemini-powered transformation is the real thing — or another promise deferred.

The developer keynote begins at 10:00 a.m. PT on June 8. Sessions run through June 12, with developer labs and workshops offering direct access to Apple engineers throughout the week.

