Apple is developing a significantly upgraded version of Siri for iOS 27 that will support multi-step requests, Bloomberg reports — the kind of compound instructions that voice assistants have promised for years but never reliably delivered. If it ships as described, it would represent the single largest functional leap for Siri since its 2011 debut.
What Multi-Command Actually Means
The concept is straightforward: instead of treating each voice request as an isolated transaction, Siri would understand and execute chained instructions. Ask for directions to a restaurant, then tell Siri to text those directions to a friend — all in one breath.
This is not a new idea. Google Assistant and Amazon Alexa have both experimented with multi-step commands, with mixed results. The technical challenge is not just natural language parsing — it is maintaining context across steps, resolving ambiguities, and handling failures gracefully when one step in the chain does not work as expected.
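The chaining problem described above can be sketched in a few lines. The sketch below is purely illustrative (every handler name and context key is invented for this example, and it reflects nothing about Apple's actual Siri architecture): each step reads from and writes to a shared context, so a later step can resolve "those directions," and a mid-chain failure stops execution rather than blindly running steps that depend on the failed one.

```python
# Illustrative sketch of multi-step command execution with shared context.
# All handler names and context keys are invented for this example; they
# do not reflect Apple's actual implementation.

class StepFailed(Exception):
    """Raised when one step in the chain cannot complete."""

def get_directions(context, destination):
    # Pretend to fetch a route; store it so later steps can reference it.
    route = f"Route to {destination}: 3.2 miles via Main St"
    context["last_result"] = route
    return route

def send_text(context, recipient):
    # "Those directions" resolves to whatever the previous step produced.
    payload = context.get("last_result")
    if payload is None:
        raise StepFailed("nothing to send; no earlier step produced a result")
    return f"Texted {recipient}: {payload!r}"

def run_chain(steps):
    """Execute steps in order, sharing context; stop gracefully on failure."""
    context, results = {}, []
    for handler, kwargs in steps:
        try:
            results.append(handler(context, **kwargs))
        except StepFailed as err:
            results.append(f"Could not complete this step ({err}).")
            break  # don't run steps that may depend on the failed one
    return results

# "Get directions to Luigi's, then text them to Sarah" as a two-step chain:
results = run_chain([
    (get_directions, {"destination": "Luigi's"}),
    (send_text, {"recipient": "Sarah"}),
])
```

Even this toy version makes the hard parts visible: if the first step fails, the chain must degrade gracefully instead of texting Sarah an empty message, which is exactly the error-recovery problem the article describes.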
Apple has reportedly been working on this capability since June 2024 as part of the broader Apple Intelligence initiative. The two-year development timeline suggests this is more than a surface-level feature addition — it requires fundamental changes to how Siri processes intent, manages state, and interacts with on-device apps and services.
Beyond Multi-Step: The Full iOS 27 Siri Overhaul
Multi-command support is part of a larger set of upgrades targeting Siri's overall intelligence and utility. According to Bloomberg's reporting, the iOS 27 Siri update also includes:
Personal context awareness. Siri will draw on information from your apps, messages, and on-device data to provide more relevant responses. Ask "when does my flight land?" and Siri will check your email confirmations rather than offering a web search.
Screen awareness. Siri will be able to see and understand what is currently displayed on your screen, allowing commands like "send this to Sarah" without needing to specify what "this" refers to.
World Knowledge Answers. A web summary feature that lets Siri provide synthesized answers to factual questions by drawing on online sources — similar to the AI-generated summaries Google has been rolling out in Search.
Standalone Siri app. Apple is reportedly designing a dedicated Siri application for more complex, chatbot-style interactions — separate from the overlay interface that appears when you activate Siri today.
AI-enhanced keyboard. The default keyboard will gain grammar corrections and improved word suggestions powered by Apple Intelligence models.
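Personal context awareness and screen awareness share one underlying mechanism: resolving an underspecified reference ("this," "it") against ambient state. A toy illustration of that resolution order, with invented names and a priority scheme that is this sketch's assumption rather than anything Apple has described:

```python
# Toy illustration of reference resolution against ambient context.
# The sources consulted and their priority order are assumptions made
# for this sketch, not Apple's documented behavior.

def resolve_reference(phrase, screen_item=None, recent_items=None):
    """Map a vague referent like "this" to a concrete object, preferring
    what is currently on screen over recently used items."""
    if phrase in ("this", "that", "it"):
        if screen_item is not None:
            return screen_item       # screen awareness: "send this to Sarah"
        if recent_items:
            return recent_items[-1]  # fall back to the most recent item
        return None                  # genuinely ambiguous: ask the user
    return phrase                    # already concrete, nothing to resolve

# "Send this to Sarah" while a photo is on screen:
target = resolve_reference("this", screen_item="IMG_2041.jpg")
```

The last branch matters as much as the first two: when no context disambiguates the referent, the assistant has to ask a follow-up question rather than guess.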
The Timeline Question
Apple is expected to preview these capabilities at WWDC on June 8, with a target release alongside iOS 27 in September. But Bloomberg notes that some features may initially launch with a "Preview" label — Apple's way of signaling that a feature is functional but still being refined.
This phased approach has become Apple's standard playbook for AI features. Apple Intelligence itself launched in late 2024 with a subset of its promised capabilities, and features like notification summaries were later revised after user complaints about accuracy.
The Preview label tempers expectations, a useful hedge for a company that has historically preferred to ship polished products over beta experiments. It also gives Apple cover if the multi-command functionality is not robust enough for all use cases at launch.
Why It Matters for Apple's AI Strategy
Siri's limitations have been a persistent weak point in Apple's ecosystem. While competitors invested heavily in conversational AI, Apple's assistant remained largely transactional — good at timers, reminders, and music playback, but unreliable for anything requiring genuine reasoning or multi-step execution.
The multi-command upgrade, combined with the recently announced support for third-party AI assistants through iOS 27's Siri Extensions, suggests Apple is pursuing a dual strategy: make its own assistant significantly smarter while simultaneously allowing users to route requests to rival AI systems like ChatGPT or Gemini when appropriate.
This is a pragmatic approach. Apple does not need Siri to be the best conversational AI on the market — it needs Siri to be the best at orchestrating tasks within the Apple ecosystem. Multi-command support is squarely aimed at that goal: not answering open-ended questions, but executing complex workflows across Apple apps and services with minimal friction.
Whether the September release delivers on this ambition will depend on how well Apple has solved the hard problems of context persistence, error recovery, and cross-app coordination. The two-year development timeline suggests the company knows how difficult these problems are. The Preview label suggests it is not entirely sure it has solved them.