Apple has spent the better part of three years promising a smarter Siri. With iOS 27, the company appears ready to actually deliver one, and to go significantly further than most people anticipated. Based on reports from Bloomberg, 9to5Mac, and MacRumors, along with code analysis by independent researchers, iOS 27 is shaping up to be the most consequential software update Apple has shipped since the original App Store. This is not a polish release. It is a structural reimagining of what an iPhone is supposed to do.
What Is iOS 27 and When Does It Come Out?
Apple is expected to unveil iOS 27 during its WWDC 2026 keynote on June 8. The first developer beta will likely be available immediately following the keynote, with a public beta to follow in July, in line with Apple's usual cadence. The full release is expected in September 2026.
The update spans iPhone, iPad, and Mac simultaneously, with the AI changes running across iOS 27, iPadOS 27, and macOS 27 as a unified platform push.
Why iOS 27 Is Different From Every iOS Update Before It
Most iOS updates are iterative. New emoji. Refined animations. A tweaked lock screen. iOS 27 is architecturally different because it treats the iPhone not as a device that runs apps, but as a platform that coordinates intelligence. The distinction matters.
Bloomberg's Mark Gurman reports that Apple intends to treat iOS 27 similarly to Mac OS X Snow Leopard, prioritizing performance and stability over a long list of new features. That framing is technically accurate but undersells the ambition. Snow Leopard made the Mac faster. iOS 27 is attempting to make the iPhone smarter in ways that change how users interact with every surface of the operating system.
The core strategy has three pillars: a rebuilt Siri, an open AI model ecosystem through a feature called Extensions, and a wave of on-device AI tools baked into system apps.
The New Siri: From Voice Assistant to Conversational AI
Siri has been the most criticized piece of software Apple ships. iOS 27 is the company's most serious attempt to fix that.
The upgraded Siri is rumored to offer a standalone Siri app for iPhone, a chatbot-style interface similar to ChatGPT and Gemini, world knowledge built on a large language model foundation, a new design tied to the Dynamic Island, the ability to perform multiple actions with a single request, and integrations with more third-party AI agents.
iOS 27 is likely to introduce a new Siri interface in the Dynamic Island. When a user triggers Siri, the Dynamic Island will reportedly show a "Search or Ask" prompt, accompanied by a glowing cursor.
The new Siri will be built on an entirely new foundation model that uses Google's Gemini as its base, with Apple modifications, enhancements, and guardrails on top. That is a significant disclosure. Apple is not simply licensing a chatbot. It is building its own intelligence layer on top of a foundation model it does not fully control, which raises real questions about differentiation, data handling, and long-term competitive positioning.
Apple also plans to finally deliver the delayed personalized Siri features originally shown in 2024, such as asking Siri about a contact's flight and lunch reservation based on information retrieved from the Mail and Messages apps. Tim Cook confirmed on a recent earnings call that Apple looks forward to bringing a more personalized Siri to users this year.

Extensions: Apple Opens the Door to Claude, Gemini, and Beyond
The most strategically significant announcement in iOS 27 may not be Siri at all. It is Extensions.
According to Bloomberg, Apple is referring to this new capability as Extensions. In test versions of iOS 27, Apple explains that Extensions allow users to access generative AI capabilities from installed apps on demand, through Apple Intelligence features such as Siri, Writing Tools, Image Playground, and more.
Users will reportedly have their pick of which third-party AI models they want to use for a host of tasks. Models from Google and Anthropic are being tested.
For example, Google and Anthropic could add support for the Extensions system through the Gemini and Claude apps, respectively. Users could then choose those models to power features like Siri, Writing Tools, and more. This is in addition to Apple's existing deal with Google to use Gemini to power native Siri and Apple Intelligence features.
For Siri specifically, users will be able to select a different voice for conversations that use external models. This is designed to help users quickly understand which AI source is handling their query.
Apple will also create a specific App Store section listing compatible AI apps that users can download, making it easier to onboard into the Extensions ecosystem.
This is a meaningful strategic shift. Apple has historically kept its operating system tightly sealed. Allowing third-party AI models to power core system features like Writing Tools and Siri is the kind of opening that, in previous years, would have been unthinkable.
AI Photo Editing: Extend, Enhance, and Reframe
Apple is planning a major overhaul of the built-in photo-editing features for iPhone, iPad, and Mac, leaning heavily on artificial intelligence to better compete with Android devices. The company is developing a new suite of tools powered by its Apple Intelligence platform for iOS 27, iPadOS 27, and macOS 27, slated for release this fall.
Apple's Photos app is reportedly gaining three new AI-powered editing tools. Extend lets users generate additional image content beyond the original frame. Enhance uses AI to automatically improve color, lighting, and overall image quality. Reframe is designed primarily for spatial photos and allows users to shift perspective after the shot is taken.
All of this AI processing is expected to happen directly on the iPhone, powered by Apple Intelligence. On-device processing reinforces Apple's long-standing commitment to user privacy and data security by keeping personal photos from being uploaded to external servers.
For users who have watched Android manufacturers ship generative photo tools for two years, this will feel overdue. For Apple, the emphasis on privacy-preserving, on-device processing is the deliberate differentiator.
Visual Intelligence Gets Smarter
iOS 27 may give Visual Intelligence the ability to use the camera and AI to understand real-world text and objects, then automatically perform useful actions. Reports suggest it will be able to read and analyze nutrition labels to identify calories and nutrients such as protein, carbohydrates, and fats, and save that information to the Health app for tracking. Visual Intelligence may also scan a business card or document containing contact details and automatically add the relevant information to the Contacts app.
Apple is also expected to expand Siri's smart home capabilities, including AI vision recognition features that allow devices to recognize users and what they are doing while protecting privacy. These AI-powered features will be critical to future products like smart home cameras and smart glasses.
Smaller Changes With Real Impact
Not everything in iOS 27 is large-scale infrastructure. Several smaller features signal where Apple's AI is quietly embedding itself into daily use.
Autocorrect may gain alternative word suggestions, with Apple working on a change that could resemble tools like Grammarly. In addition to correcting misspelled words, the updated keyboard could offer alternative word choices.
Safari's Tab Groups feature may gain automatic naming through Apple Intelligence, similar to how Apple currently uses AI to create custom podcast chapters in Apple Podcasts and generate new Reminders sections to organize tasks. iOS 27 will reportedly be able to automatically name Tab Groups based on their contents as an optional feature.
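Apple's actual naming heuristic is not public, but the idea is easy to illustrate: derive a label from what the tabs have in common. The sketch below uses the most frequent host across the group as an invented stand-in for whatever Apple Intelligence actually does.

```python
# Hypothetical stand-in for automatic Tab Group naming: name the group after
# the most common host among its tabs. Apple's real heuristic is not public.
from collections import Counter
from urllib.parse import urlparse


def suggest_group_name(urls):
    hosts = []
    for url in urls:
        host = urlparse(url).hostname or ""
        # Merge www.apple.com and apple.com into one bucket.
        hosts.append(host.removeprefix("www."))
    counts = Counter(h for h in hosts if h)
    if not counts:
        return "Untitled Group"
    return counts.most_common(1)[0][0]


tabs = ["https://www.apple.com/ios", "https://apple.com/siri", "https://9to5mac.com"]
name = suggest_group_name(tabs)
```

A model-based approach would presumably summarize page content rather than count hosts, but the interface is the same: tabs in, short label out, applied only when the user opts in.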
iOS 27 will reportedly support 5G satellite internet connectivity, though this functionality may be limited to the upcoming iPhone 18 Pro, iPhone 18 Pro Max, and iPhone Ultra models with Apple's next-generation C2 modem.
Which iPhones Will Support iOS 27?
The iPhone 11, iPhone 11 Pro, iPhone 11 Pro Max, and iPhone SE 2 are not expected to support iOS 27. Apple tends to drop devices from new OS releases after approximately seven years, so their omission is consistent with historical patterns.
As is already the case, Apple Intelligence requires at least an iPhone 15 Pro, so the standard iPhone 15 and older models will not gain any AI features.
Users running an iPhone 16 or later are expected to get the full Apple Intelligence feature set, with the iPhone 15 Pro models covering most of it. Users on the iPhone 12 through the standard iPhone 15 will receive the core iOS 27 update, but without the AI-dependent capabilities.
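The reported tiers reduce to a simple gate, sketched below. The cutoffs mirror the rumors above (iPhone 12 as the iOS 27 floor, iPhone 16 for the full Apple Intelligence set); Pro-model exceptions are deliberately omitted, and the function is an invented illustration, not Apple's logic.

```python
# Invented sketch of the reported iOS 27 support tiers, keyed on iPhone
# generation number. Pro-model exceptions are omitted for simplicity.
def support_tier(generation: int) -> str:
    if generation < 12:
        return "unsupported"          # iPhone 11 family and SE 2 are dropped
    if generation <= 15:
        return "core-only"            # iOS 27 without AI-dependent features
    return "full-apple-intelligence"  # iPhone 16 and later
```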
What This Means for the Broader AI Industry
Apple's entry into the AI model marketplace as a neutral platform rather than a closed ecosystem is a significant development. Google, Anthropic, and other model makers would gain direct, system-level access to iOS, something that has never been possible before. That changes the competitive dynamics considerably.
For users, the practical result is that the smartphone becomes a hub that coordinates between different AI services rather than being locked to a single assistant. The iPhone stops being an Apple Intelligence device and starts being an AI platform that happens to be made by Apple.
Whether Apple can execute on this in time for a September launch is the real question. The company has missed Siri-related deadlines repeatedly over the past two years. WWDC on June 8 will reveal how much of what has been reported is actually ready to ship.