Apple has announced it will open Siri to rival AI assistants in iOS 27, introducing a new "Extensions" system that will allow Google Gemini, Anthropic Claude, and other AI services to integrate directly with Siri.
The move ends OpenAI's exclusive partnership with Apple and represents a significant strategic pivot: rather than betting on its own AI capabilities, Apple is becoming the platform through which users access the AI of their choice.
The Platform Play
Apple's current AI strategy, launched with iOS 18 and expanded in iOS 26, involves a combination of on-device Apple Intelligence models and a privileged integration with ChatGPT. When Siri encounters requests beyond what its on-device models can handle, it can route them to ChatGPT — with user permission. That's been OpenAI's exclusive lane.
iOS 27 changes this. The Extensions framework opens that routing capability to any AI provider that builds an integration. The practical result: a user who prefers Claude for writing or Gemini for research will be able to set it as Siri's backend, rather than defaulting to ChatGPT.
The move benefits users in the short term, but more importantly it's a defensive strategic bet for Apple. The company has struggled to build a competitive frontier AI model — Apple Intelligence has been widely criticized as underpowered compared to Claude, GPT-4o, and Gemini. Rather than continuing to pour resources into a model-capability race it's losing, Apple is pivoting to own the interface layer.
What This Means for the Ecosystem
For Google, getting Gemini into Siri is a major distribution win. Google already has a deep relationship with Apple via the Gemini distillation deal — Apple has "complete access" to Gemini in Google's data centers to train smaller on-device AI models. A Siri Extension would add consumer-facing distribution on top of that infrastructure relationship.
For Anthropic, the same opportunity applies. Claude's growing reputation for coding and reasoning tasks gives it a distinct user base among developers and knowledge workers who use iPhones. A native Siri integration would make Claude accessible to those users without requiring them to switch between apps.
For OpenAI, losing exclusivity is a material setback. The ChatGPT integration with Apple was one of the most significant distribution deals in the company's history — putting it in front of iPhone users by default. Being one option among several is a very different position from being the default.
Google's Countermove
The Apple announcement landed the same day Google made its own major moves. The company launched Gemini 3.1 Flash Live, its highest-quality voice model yet, powering Search Live's global rollout across 200+ countries. It also rolled out tools to import full chat history and memories from rival AI apps directly into Gemini.
The moves look like a direct response to the same dynamic Apple is navigating: as AI assistants multiply, the battle is increasingly about distribution, stickiness, and integration depth — not just raw model capability. Google is betting on search and voice as its moat. Apple is betting on the phone itself.
Both strategies could work. What neither can afford is to be locked out of the conversation entirely.
The Bigger Picture
The Siri Extensions announcement is part of a broader pattern playing out across the industry: the AI assistant layer is becoming a commodity, and platform ownership is where the durable advantage lies.
Apple, Google, and Microsoft are all making variations of the same bet: don't just build an AI model, own the surface through which people access AI. Apple's version involves being the phone. Google's involves being search and the browser. Microsoft's involves being Office and Windows.
The era of single-vendor AI dominance through exclusive platform partnerships appears to be ending. iOS 27 just confirmed it.