Will iOS 18 Improve Siri on the HomePod?

HomePod Siri | Credit: Apple

Perhaps the most significant change coming with this year’s iOS 18 release will be a much more intelligent Siri voice assistant. That’s all thanks to Apple Intelligence, but it also means that how Siri responds and what it can do could vary considerably depending on which device you’re talking to.

Many of us have been hoping these kinds of improvements would come to Siri for nearly a decade. While Siri was arguably state-of-the-art when it came to the iPhone 4S in 2011, it’s never managed to advance significantly beyond its humble beginnings.

While Siri floundered, competitors that launched three to four years later have managed to surpass it in key areas, particularly when it comes to answering general knowledge questions. Siri may still be better at setting reminders, starting workouts, and turning on the lights in your home, but it’s often stymied by even the most straightforward questions.

However, this year, Apple is taking the gloves off in the battle between voice assistants, promising a Siri that will better understand us, engage in more natural conversations, and know enough about our lives to give us the answers we need without having to spell out all the details.

That all sounds great, but it comes with a pretty significant catch: this new-and-improved Siri will only be fully available on devices that support Apple Intelligence — the new on-device AI features that provide the necessary large language models.

That means an iPhone 15 Pro or later, or a Mac or iPad equipped with Apple’s M-series silicon. What’s conspicuously missing here is the one device that some of us talk to Siri on the most: the HomePod.

Considering the stringent requirements for Apple Intelligence, it’s no surprise that a mere HomePod doesn’t have the chops for it. The latest full-sized HomePod, released in early 2023, only has an Apple S7 chip — the same one found in the Apple Watch Series 7 — and is believed to have only 1GB of RAM. There’s no Neural Engine in that chip, but even the Apple Watch Series 9, which does have a Neural Engine in the S9, isn’t on the list.

Apple hasn’t specifically said the HomePod won’t be able to support Apple Intelligence in some way — perhaps the company will figure out a way to work through a nearby iPhone 15 Pro or other Apple Intelligence-capable device — but we’re betting against it, and so is Bloomberg’s Mark Gurman.

In fact, Gurman suggests that we may never see a traditional HomePod with Apple Intelligence support. In this week’s Power On newsletter, Gurman says it’s simply “too low-volume of a product to waste the engineering time.” Instead, Gurman says that Apple is focusing on the next generation of HomePods, which will do far more than just sit there.

This is the somewhat mythic personal robotics HomePod that Gurman has hinted at before: “an entirely new robotic device with a display that includes Apple Intelligence at its core.” We’ve also seen evidence that Apple is working on a HomePod with a seven-inch display that might be more suitable for Apple Intelligence, as it could also handle some of the visual generative AI features.

What About Cloud-Based AI?

One of the greatest strengths of Apple Intelligence is also its greatest weakness. Apple has insisted on building a large language model that runs almost entirely on its own silicon, relying on cloud-based servers as little as possible.

This is fantastic for both privacy and performance, but it’s leaving a lot of devices out of the party. That’s not just the HomePod but also the Apple Watch, last year’s base iPhone 15 models, and even the Vision Pro.

However, the interesting point is that Apple Intelligence doesn’t do all of its processing on-device. Apple has also built a Private Cloud Compute architecture to do the heavy lifting for requests beyond the capabilities of even Apple’s most powerful chips.

So, the question often arises as to why Apple can’t simply allow older devices to go straight to the cloud. After all, the servers are there, right? Why not use them to handle Siri requests from older and less powerful devices, just like Siri works now?

The short answer is that it’s not built that way. Apple has designed Private Cloud Compute to augment on-device processing, not to replace it entirely. When an iPhone 15 Pro or M4 iPad Pro decides it needs more horsepower to handle a more sophisticated request, it doesn’t simply throw up its virtual hands and hand the entire request off to Apple’s Private Cloud Compute systems.

Instead, it still processes what it can on-device and hands over the pieces it can’t handle to the cloud for processing. In essence, the two AI engines work cooperatively, like building a house; you can cut down the wood and frame the structure on-site, but custom fixtures are farmed out to contractors.

This reduces the load on Apple’s Private Cloud Compute servers while also ensuring that as much of your personal data as possible stays on your device, and you still get to take advantage of the performance benefits of your A17 Pro or M4 chip. After all, there’s no sense in letting that powerful Neural Engine sit idle if it can still accomplish at least some of what you’re asking it to do.
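To picture how that division of labor might work, here’s a minimal Swift sketch of the idea described above. Everything in it is invented for illustration: the types, the complexity check, and the routing logic are not real Apple APIs, just a toy model of a single request being split between on-device processing and a cloud hand-off.

```swift
import Foundation

// Hypothetical illustration of the hybrid split described above: the device
// handles what it can locally and only sends the remainder to the cloud.
// None of these types correspond to actual Apple Intelligence APIs.

struct AssistantTask {
    let description: String
    let estimatedComplexity: Int   // toy stand-in for a real capability check
}

struct AssistantRequest {
    let tasks: [AssistantTask]
}

/// Toy router: run simple sub-tasks locally and batch the rest
/// for a (hypothetical) Private Cloud Compute endpoint.
func handle(_ request: AssistantRequest, localComplexityLimit: Int = 5) {
    var cloudBatch: [AssistantTask] = []

    for task in request.tasks {
        if task.estimatedComplexity <= localComplexityLimit {
            print("On-device: \(task.description)")
        } else {
            cloudBatch.append(task)
        }
    }

    // Only the pieces the device can't handle ever leave the device.
    if !cloudBatch.isEmpty {
        let summary = cloudBatch.map(\.description).joined(separator: ", ")
        print("Sent to Private Cloud Compute: \(summary)")
    }
}

// Example: one request whose pieces land on both tiers.
let request = AssistantRequest(tasks: [
    AssistantTask(description: "Parse the spoken request", estimatedComplexity: 2),
    AssistantTask(description: "Look up a calendar entry", estimatedComplexity: 3),
    AssistantTask(description: "Summarize a long email thread", estimatedComplexity: 9),
])

handle(request)
```

The real system is vastly more sophisticated, of course, but the shape is the same: the hand-off happens piece by piece rather than per request, which is why a device with no local model to do that first pass can’t simply skip straight to the cloud.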



[The information provided in this article has NOT been confirmed by Apple and may be speculation. Provided details may not be factual. Take all rumors, tech or otherwise, with a grain of salt.]
