Apple’s AI Pivot: Siri May Become a Full Chatbot in iOS 27
Apple’s first wave of Siri improvements may only be a couple of months away, but the company appears to have much bigger ambitions for its beleaguered voice assistant.
At a Glance: A Two-Step Siri Evolution
- Phase 1 (Spring 2026): iOS 26.4 brings a contextual Siri powered by Apple Foundation Models (AFM) v10, an 8x jump in model size over the current version.
- Phase 2 (Fall 2026): iOS 27 debuts “Campos,” a full-fledged AI chatbot comparable to Gemini 3.0.
- The Core Shift: Apple moves from a voice-first assistant to a text-and-voice chatbot deeply integrated into core apps like Mail, Photos, and Xcode.
While the company was happy to partner with OpenAI to leverage ChatGPT as a Siri extension, it was apparent from the start that this was likely to be little more than a stop-gap while it got Siri up to speed. Moving forward, Apple isn’t going to want to cede territory to its rivals, and there’s more to AI assistants than just voice interaction.
Right now, ChatGPT and Gemini are the two dominant players in the interactive chatbot space, and both have a strong presence on Apple's devices. Despite Siri's integration with ChatGPT and Apple's new partnership to use Google's AI technology, both chatbots remain distinct apps that exist outside the Apple ecosystem, and that's unlikely to change.
Today, Bloomberg’s Mark Gurman confirmed what many of us had already suspected: despite Apple’s quiet protestations to the contrary, the company is indeed planning to turn Siri into something that can stand toe-to-toe with these other tools.
In a new report, Gurman says Apple plans to fully enter the “generative AI race dominated by OpenAI and Google,” with a new chatbot that it’s already developing for a potential release in iOS 27 later this year.
The chatbot — code-named Campos — will be embedded deeply into the iPhone, iPad and Mac operating systems and replace the current Siri interface, according to people familiar with the plan. Users will be able to summon the new service the same way they open Siri now, by speaking the “Siri” command or holding down the side button on their iPhone or iPad.
Mark Gurman
“Campos” is expected to go a step beyond even the contextual Siri that Apple promised during its 2024 Worldwide Developers Conference (WWDC). While that delayed promise may have been a bit of smoke and mirrors in its initial presentation, Apple has been hard at work making it a reality, and its willingness to ally with one of its biggest rivals shows just how serious it is.
To be clear, this will be distinct from what Gurman calls the “non-chatbot update” to Siri. That’s still on track to arrive in iOS 26.4, which is just around the corner and could go into early beta as soon as next month. The iOS 26.4 update will fulfill Apple’s 2024 promises, adding the ability to analyze on-screen content and personal data from apps like Mail, Messages, and Calendar to get more contextual awareness. Gurman says it should also be better at searching the web for information.
Enter Campos…
However, that’s just the first phase in a much broader plan. If Apple can get things moving fast enough, it could debut the new Siri chatbot at this year’s WWDC in June as part of its plans for iOS 27. Whether it will ship in iOS 27.0 is another matter, but hopefully Apple has learned its lesson from 2024 and won’t show it off unless it’s absolutely certain it can deliver it during the iOS 27 lifecycle, which will run into early 2027.
The Campos upgrade will build on those initial features, plus all the things that Siri can already do, such as making phone calls, setting timers, launching the camera, playing music, controlling home devices, and more. Gurman says that Apple also plans to integrate Siri even more deeply into all of its core apps, from Music, Podcasts, and TV to Mail and Photos, letting users do more with just their voice. Some of this is already there, and some of it aligns with the promises Apple made previously, so the notion isn’t entirely new, but it definitely has the potential to take things to another level. It could also eventually replace Spotlight, since it would offer a much more intelligent search interface.
Interestingly, Apple also plans to seriously up the stakes in its Google partnership. While the short-term iOS 26.4 Siri improvements are expected to be based on Apple Foundation Models (AFM) v10, a 1.2-trillion-parameter version of Google’s AI models that represents an 8x jump over the current models, Campos will turn the dial up to (version) 11, an even higher-end model that’s comparable to Gemini 3.0.
| OS Version | Model | Estimated Power | Interaction Style |
| --- | --- | --- | --- |
| Current | AFM v1 | ~150B parameters | Voice command |
| iOS 26.4 | AFM v10 | 1.2 trillion parameters | Contextual/action-based |
| iOS 27.0 | AFM v11 | Gemini 3.0 class | Full generative chatbot |
However, Apple might also be willing to surrender some of its control to get there, as insiders say that it’s discussing hosting Campos directly on Google’s more powerful AI servers rather than Apple’s Private Cloud Compute infrastructure. It’s an open question how privacy considerations will factor into this, but the arrangement would likely involve Apple hosting or leasing Google-built servers with their tensor processing units (TPUs), rather than running Campos on Google’s own AI farms.
If Apple goes down that road, it may merely be another temporary solution, as it’s reportedly building the new system so that the underlying models can be easily swapped out, leaving the door open for a pivot back to Apple’s own in-house models once they’re ready. Several analysts have already pointed out that Apple is unlikely to depend on Google for any longer than it needs to, and the current Gemini deal is more about short-term expediency than long-term strategy.
Siri Gets Chattier

According to Gurman, the Campos upgrade will offer both voice and typing-based interactions, although it’s not yet clear how that will be delivered. It’s been possible to type to Siri for several years now, and Apple made that capability more accessible in iOS 18 with the launch of Apple Intelligence, but the typed experience remains far less prominent than a dedicated chatbot interface.
Like ChatGPT and Google Gemini, Apple’s chatbot will allow users to search the web for information, create content, generate images, summarize information and analyze uploaded files. It also will draw on personal data to complete tasks, being able to more easily locate specific files, songs, calendar events and text messages.
Mark Gurman
If Apple expects to create something that will make iPhone users willing to leave rival chatbots behind, it will likely need to create a more robust interface, possibly baking it into a standalone app. Gurman says Apple is testing a “standalone Siri app” right now, but adds that the company “doesn’t plan to offer that version to customers,” preferring to integrate the software the way Siri is today.
That may not be the ideal approach, but it would also align with the strategies Apple has used in the past with its home and iCloud platforms. Both HomeKit and iCloud Drive launched as background services without dedicated companion apps; the Home and Files apps didn’t arrive until two or three iOS versions later. However, Apple also allowed third-party apps to tie into those frameworks, and the Campos-powered Siri will likely have an open API that gives developers access to the chatbot features, much like the Apple Intelligence frameworks in iOS 26. Whether Apple will allow Campos-powered chatbots on the App Store is another matter.
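For context, the developer framework mentioned above already exists: iOS 26 ships the FoundationModels framework, which exposes Apple Intelligence's on-device model through a session-based API. Here's a minimal sketch of how an app taps into it today; the prompt and function name are illustrative, and any future Campos API remains pure speculation.

```swift
import FoundationModels

// Sketch of the FoundationModels API available on iOS 26 / macOS 26.
// The prompt text and function name are illustrative examples only.
func summarizeNotes() async throws -> String {
    // Check that the on-device system model is ready to use
    // (it can be unavailable on unsupported hardware or while downloading).
    let model = SystemLanguageModel.default
    guard model.availability == .available else {
        return "Apple Intelligence model unavailable on this device."
    }

    // A session holds conversation state, much like a chatbot thread.
    let session = LanguageModelSession(
        instructions: "You are a concise assistant."
    )

    // Send a prompt and await the model's reply.
    let response = try await session.respond(
        to: "Summarize: buy milk, call dentist, finish the report."
    )
    return response.content
}
```

If Campos does get an open API, something in this shape, a session object plus deeper hooks into personal data, would be the obvious starting point, but nothing about it has been confirmed.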