Apple to Let Users Pick Their AI: iOS 27 Opens Apple Intelligence to Gemini, Claude, and More

Apple plans to introduce an “Extensions” framework in iOS 27 that lets users swap in third-party AI models — including Google's Gemini and Anthropic's Claude — across Siri, Writing Tools, and Image Playground. The move marks Apple's biggest pivot on AI openness yet and signals a platform war brewing inside the world's most valuable consumer device.

4 min read

Apple is preparing a fundamental change to how it handles artificial intelligence on the iPhone. According to a Bloomberg report published this week, iOS 27, iPadOS 27, and macOS 27 — set to be unveiled at WWDC on June 8 — will include a new “Extensions” framework that lets users route AI tasks to whichever third-party model they prefer, including Google’s Gemini, Anthropic’s Claude, and others.

The move represents a striking shift in strategy for a company that has historically guarded its platform with near-religious zeal. Less than two years after Apple launched Apple Intelligence as a tightly controlled, privacy-first AI layer, the company appears to be acknowledging a hard truth: users want choice, and Apple can no longer afford to bet the iPhone's AI experience on its own models alone.

What “Extensions” Actually Does

The Extensions framework functions as an AI routing layer inside Apple Intelligence. Once a user installs an app like Gemini or Claude from the App Store, that app surfaces as an available engine inside Siri, Writing Tools, and Image Playground. An iPhone owner drafting an email can choose to run the edit through Anthropic’s Claude instead of Apple’s built-in model; a user in Image Playground can opt for Gemini’s image generation capabilities instead of Apple’s diffusion pipeline.

Crucially, Apple is also planning a voice personalization layer on top of this. Queries handled by Apple’s own systems will use a distinct Siri voice, while responses generated by a third-party model — say, Claude — could use a separate audio signature, giving users an audible cue about which AI is speaking. It is a subtle but elegant UX solution to the identity problem of multi-model devices.

The system does not replace Siri outright; Apple appears determined to keep its own assistant as the default orchestration layer. But the signal is clear: Siri becomes a surface, not the brain.
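To make the routing idea concrete, here is a purely illustrative sketch of how such a layer could work conceptually: an orchestrator keeps a registry of installed engines and falls back to the first-party model when no preference is set. Every name in it (`ModelProvider`, `ModelRouter`, and so on) is invented for this article — Apple has published no Extensions API, so this is speculation about the shape of the design, not real code for it.

```typescript
// Hypothetical sketch of an AI-routing layer. All names are invented;
// nothing here reflects an actual Apple API.

interface ModelProvider {
  name: string;
  respond(prompt: string): string;
}

// Stand-in for the built-in, on-device model.
const appleModel: ModelProvider = {
  name: "Apple Intelligence",
  respond: (prompt) => `[on-device] ${prompt}`,
};

class ModelRouter {
  private installed = new Map<string, ModelProvider>();
  constructor(private fallback: ModelProvider) {}

  // Installing a third-party app would surface its engine here.
  register(provider: ModelProvider): void {
    this.installed.set(provider.name, provider);
  }

  // Siri remains the orchestration layer; the user's chosen engine
  // handles generation, falling back to the first-party model when
  // no preference is set or the named engine is not installed.
  handle(prompt: string, preferred?: string): string {
    const engine =
      (preferred && this.installed.get(preferred)) || this.fallback;
    return engine.respond(prompt);
  }
}

const router = new ModelRouter(appleModel);
router.register({
  name: "Claude",
  respond: (prompt) => `[Claude] ${prompt}`,
});

console.log(router.handle("Draft a reply", "Claude"));
console.log(router.handle("Set a timer"));
```

The fallback behavior is the interesting design question: in this sketch, a missing or uninstalled engine silently routes back to the first-party model, which mirrors the report's claim that Siri stays the default orchestrator rather than being replaced.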

Why Apple Is Opening Up Now

The timing is deliberate. Apple Intelligence launched to middling reviews in late 2024 and has struggled to keep pace with rapidly advancing models from OpenAI, Google, and Anthropic. Rivals have released multiple flagship models since Apple’s last major AI update, and the gap in raw capability — particularly for complex writing, coding, and image generation tasks — has become visible even to mainstream users.

Regulatory pressure is also a factor. The European Union’s Digital Markets Act has already forced Apple to allow browser engine competition on iOS; AI model choice fits neatly within the same regulatory logic. By opening up proactively, Apple gets to design the integration on its own terms rather than being compelled by Brussels or a future U.S. rule.

There is a commercial calculus here too. Apple Intelligence has been one of the weakest justifications for upgrading to an iPhone 16 or iPhone 17. A framework that brings the world’s best AI models natively into iOS — with the privacy and on-device processing advantages Apple can uniquely offer — becomes a compelling upgrade story that no Android OEM can replicate at the same level of system integration.

The Platform Stakes

For Google and Anthropic, the Extensions framework is a distribution windfall. Getting into the default AI layer of two billion active Apple devices is arguably more valuable than any standalone app launch. Both companies have already established relationships with Apple: Gemini powers the Siri backend on iOS 26.4, and Claude is available via ChatGPT-style extensions in current builds. Formalizing these integrations at the OS level deepens the moat.

OpenAI is notably absent from the initial Bloomberg report’s list of supported providers, though that may reflect negotiating timing rather than a permanent exclusion. OpenAI and Apple have had a rocky relationship since the original ChatGPT integration was announced at WWDC 2024, and the competitive dynamics have sharpened considerably since OpenAI began developing its own AI smartphone hardware.

For smaller AI labs, the Extensions API could be transformative — or simply clarifying about where they stand. A well-designed integration surface could give specialized models (a coding-focused assistant, a multilingual model optimized for Southeast Asian languages, a domain-specific legal writing tool) a direct path to iPhone users. But Apple has historically made third-party integrations just difficult enough to favor its first-party alternatives, and the App Store review process could become a gatekeeper in a new form.

Privacy Architecture

One open question is how Apple threads the privacy needle. Its existing approach to third-party AI — routing queries through Apple’s servers as a privacy intermediary, or running inference on-device — has been central to its Apple Intelligence pitch. The Bloomberg report does not detail how on-device processing interacts with third-party cloud models, which by definition require sending data to external servers.

Apple will almost certainly require Extension providers to agree to strict data handling terms as a condition of App Store access. The company has done this with HealthKit, HomeKit, and other sensitive data frameworks. But the “your data never leaves your device” framing becomes harder to maintain when the underlying model is a cloud service run by Alphabet or Anthropic.

This is the tension Apple must resolve before WWDC. How it explains the privacy architecture will determine whether Extensions is perceived as an enhancement of Apple Intelligence or a compromise of it.

What to Watch at WWDC

Apple’s Worldwide Developers Conference begins June 8 at Apple Park in Cupertino. The Extensions framework — assuming the Bloomberg report is accurate — will be among the headline announcements, alongside iOS 27 features aimed at matching Android rivals, a refreshed Siri with improved on-device reasoning, and, likely, a preview of the AI models powering the next generation of Apple silicon.

The developer community’s reaction will matter enormously. If Apple restricts Extensions to a small set of approved partners or buries the settings deep in menus, the practical impact will be limited. If the framework is genuinely open and prominently surfaced, it could establish iOS as the default deployment target for every major AI lab on the planet — and reshape the economics of the AI model business in the process.

Either way, the age of Apple doing AI entirely in-house is over.

