Apple and Google partnership: the new Siri update explained
So the headline is that the big Siri update we have all been waiting for is finally coming in 2026, thanks to the Apple and Google partnership. We knew this was coming. It was a rumor for a while, and we have talked about it in the past.
Over the last couple of months, Apple continued to spin its wheels and Siri continued to struggle. This deal is essentially an admission from a four trillion dollar company that it hasn't been able to figure out AI on its own. Instead, Apple will be partnering with a company that has.
This Apple and Google partnership is one of the most interesting developments in the tech world because everyone assumes Apple is always on the bleeding edge.
Now, this feels like the moment we can all point to and say that they are losing the AI race. Honestly, I don’t blame them. Building these models is hard.
But if you are Apple and you have been working on Siri and it is just bad, then this move makes perfect sense, even if it feels a little hilarious.
The vanilla statement and the reality
Apple put out a classic, vanilla statement about the deal. They said that after careful evaluation, they determined Google's technology provides the most capable foundation for Apple's foundation models.
They are excited about the innovative new experiences it will unlock for users. It is a very standard corporate line, but the implications are huge.
We probably won’t get the full details and answers until Apple’s WWDC event in June, which is likely when they will formally announce this revamped Siri.
But the shocker is that Apple is basically saying they couldn’t get their own house in order in time to compete with what is already out there. It reminds me of the situation with search engines.
Apple doesn’t make a search engine, so they partner with Google to bring the best one available to Safari users. This is just the 2026 version of that, but for AI.

What happens to OpenAI
This news is particularly interesting because of Apple’s recent partnership with OpenAI. As it stands now, Siri kicks out to ChatGPT for any complex request that requires world knowledge or multi-step reasoning.
But if this new Google foundation beefs up Siri and allows it to do all this stuff on its own, it won’t need to kick out to ChatGPT anymore.
Does this mean the OpenAI partnership is dead? We don’t know the terms of the deals yet. It is possible Apple wants multiple options, but it seems redundant to have two different massive AI partners for the same voice assistant.
If Google’s Gemini models are the foundation, it might just squeeze everyone else out.
How good will the new Siri actually be
I tend to think this partnership will make Siri pretty good. Gemini on Android phones is already quite impressive. They have looped in a lot of the old Google Assistant features, so the product as a whole is solid. I would say it is comfortably in the top three of all AI models.
The real question is how much Apple builds on top of those foundation models. They will undoubtedly have a massive privacy focus because that is their brand. When they integrated ChatGPT, they did a lot of work to make sure user data was protected.
Using Google’s models as a foundation doesn’t necessarily mean your data is being sent to Google’s servers for training, as they can run these models on the device.
I am curious to see how they customize the functionality to make it feel like an Apple product rather than just a Google skin.

Will we get Circle to Search
One feature that I am personally hoping for is a real version of "Circle to Search" on the iPhone. Right now, Apple Intelligence has a weird half-step version where you take a screenshot and highlight the part you want to look up, which then runs a Google image search.
But on Android, Circle to Search is one of the most useful AI features of the last five years. You can circle anything on your screen, text or images, to get info immediately.
If this Apple and Google partnership brings that full functionality to the iPhone, that would be a massive win for users.

Interface vs models: who wins
There is a thought that has been popping into my head: maybe who owns the models isn’t as important as who controls the interface. Think about tools like Raycast on the Mac.
It is a Spotlight replacement that lets you choose which AI model you want to chat with. You can pick ChatGPT, Gemini, or even Grok.
At the end of the day, it doesn’t really matter which model I am choosing. I am just picking the tool that works best for me in that moment. But the interface I use to interact with that tool is Raycast.
Raycast gets the user because they are delivering the experience. Apple is doing this deal to keep users on the iPhone. If you were thinking about switching to a Pixel or a Samsung phone because their AI tools were better, this move is designed to take away that reason to leave.
Final thoughts on this Apple and Google partnership
It is a bit of a PR hit for Apple to admit they need help, but in the long run, if Siri actually starts working well again, users won’t care who is powering it. They just want their phone to be smart.
This Apple and Google partnership ensures that Apple stays relevant in the conversation while they figure out their long term strategy.
I am genuinely curious about how much these LLMs actually matter to you when you are buying a new phone. Is a smarter Siri enough to keep you on an iPhone, or are you looking for something more?
It is going to be a very interesting couple of years as we see how this actually rolls out to our devices.