iOS 26 is Coming: Here's What We Know

iOS 26 harnesses the power of Apple Intelligence, a rebuilt Siri, and new AI tools. Here’s what’s actually changing and what still needs work.
Every September, iOS gets its annual glow-up, which usually means a handful of shiny new features wrapped in Apple’s “biggest update ever” marketing. 

But this year is different.

iOS 26 is the first iPhone OS built from the ground up with Apple Intelligence at its core. That means a rebuilt Siri, deeper AI integration, and a whole lot of promises about making your iPhone more useful and, apparently, more “delightful.”

So before the hype hits overdrive, here’s what’s actually coming (and where Apple’s still got some homework to do).

iOS 26 Release Date and Supported Devices

Apple officially unveiled iOS 26 at their Worldwide Developers Conference (WWDC) in June, with a public release projected for mid-September.

Based on what we know now, the headline features, including Apple Intelligence and the new Siri, won’t be available on every device, even after release. From what I’ve seen, you’ll need an iPhone 16 or newer to get the full AI experience. Yes, you’ll still get the basics on older models, but without the latest Apple silicon, you’re essentially running “iOS 26 Lite.”

There’s also the usual regional rollout caveat. Expect the US to get most Apple Intelligence features first, with other countries playing catch-up over the following months. It’s not exactly new behaviour for Apple, but it’s worth keeping in mind if you live outside the States, especially if you’ve been eyeing the AI tools as a reason to upgrade.

Siri’s Big Rebuild

Apple’s calling this the “most advanced Siri ever,” and technically, they’re right. But it’s also the most expensive Siri ever, depending on how far you want to take it.

There are now three “flavours” of Siri:

  • Standard Siri. Free, built into iOS 26, handles basic tasks and some new Apple Intelligence features.
  • Siri + ChatGPT. Routes certain queries through OpenAI for better answers, but still free.
  • Siri + ChatGPT Plus. $19.99/month for higher usage limits, better voice models, and extra features like larger file and photo processing.

On paper, it’s a huge leap. In practice… it’s still hit-or-miss.

For example, with multi-timer support, Siri failed my “5 minutes for sausages, 12 minutes for fries” test by lumping them together into a single timer. And if you need to delete any alarms? Prepare for a mini interrogation as Siri repeatedly asks which one you mean, even when you’ve literally just set it.

Compare that to Google Gemini, which not only nails the multi-timer request but can delete them all in one go without the back-and-forth. Granted, Gemini’s not flawless either (I’ve had it spit out chunks of raw code when setting alarms), but at least the end result works without me wanting to give up and do it manually.

The bigger frustration is Siri’s habit of handing off to ChatGPT whenever a request gets even slightly complex, which is fine if you’re paying for Plus. But otherwise, you’re left with an assistant that suddenly “can’t help with that” unless you subscribe. That’s a slippery slope for a core OS feature.

Apple Intelligence Features

Apple’s big selling point for iOS 26 is that it’s been “built for Apple Intelligence from the ground up.” That translates to a mix of on-device AI tricks and cloud-assisted heavy lifting.
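For the developer-curious, the on-device half of that split is meant to be exposed through the Foundation Models framework Apple previewed at WWDC. Here’s a minimal sketch of what calling the on-device model might look like; the type and method names (`SystemLanguageModel`, `LanguageModelSession`, `respond(to:)`) are based on Apple’s previews and could change before release, so treat this as an assumption rather than final API.

```swift
import FoundationModels  // Apple's on-device model framework (API assumed from WWDC previews)

// Ask the on-device model for a one-sentence summary, falling back when it's unavailable.
func summarise(_ text: String) async throws -> String {
    guard case .available = SystemLanguageModel.default.availability else {
        return "On-device model not available on this hardware."
    }
    let session = LanguageModelSession(instructions: "Summarise the text in one sentence.")
    let response = try await session.respond(to: text)
    return response.content
}
```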

In theory, this should make Siri feel more like a genuine assistant than a voice-controlled search bar. In reality, though, I anticipate lots of inconsistencies. Half the time, Siri punts the request over to ChatGPT, which has no access to your email, messages, or files, so it comes back with a generic “I couldn’t find anything”... or worse, a web search result.

Apple Intelligence also brings AI-generated images and edits, but right now, they’re lagging behind the competition. Simple object removal works okay, but when I tested more complex edits (like filling in half of my face from a selfie), Samsung’s AI nailed it while Apple’s attempt looked like something from a bad deepfake meme.

In short, the ambition is there, and when it works, it’s genuinely useful. But if you’ve seen what Google Gemini or Samsung AI can do, you’ll notice Apple’s still a step behind in execution, and we’ll see if iOS 26 tightens things up. 

Everyday Usability Changes

Not every iOS 26 change is flashy, but some of the smaller tweaks will make a difference (at least when they work as intended).

Shopping lists are a good example. Siri will happily add items for you, and it usually gets it right the first time. It’s not as fast or as conversational as Gemini, which feels more like talking to someone who’s already halfway to the shop, but it gets the job done.
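For context, the way apps plug their own actions into Siri is Apple’s App Intents framework, and a shopping-list action is about the simplest possible case. The sketch below is purely illustrative: the `ShoppingList` store and the intent itself are hypothetical, but the AppIntents types are the real framework.

```swift
import AppIntents

// Hypothetical in-memory store standing in for a real app's persistence layer.
final class ShoppingList {
    static let shared = ShoppingList()
    private(set) var items: [String] = []
    func add(_ item: String) { items.append(item) }
}

// An intent Siri can invoke along the lines of "Add milk to my shopping list in <app name>".
struct AddShoppingItemIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Shopping Item"

    @Parameter(title: "Item")
    var item: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        ShoppingList.shared.add(item)
        return .result(dialog: "Added \(item) to your shopping list.")
    }
}
```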

Messaging is a mixed bag. With WhatsApp, Siri is actually more reliable than Gemini, which still has an annoying habit of “not finding” contacts that are sitting right there in the app. Siri will find them, ask for your message, and send it without drama. The problem is when you try to share photos or videos. Apple’s AI still can’t send media to WhatsApp directly, so you’re stuck doing that part manually.

Media control is smoother. Playing Spotify playlists, skipping tracks, or jumping into Apple Music works fine, though the AI layer doesn’t really add much beyond what Siri’s been able to do for years.

In short, iOS 26’s everyday upgrades feel more like quality-of-life polishing than ground-breaking change, and if you use a mix of Apple and third-party apps, you’ll quickly hit the same integration walls we’ve been complaining about for ages.

New Visual and Context Tools

One of the more useful Apple Intelligence upgrades in iOS 26 is the expansion of its “Visual Intelligence” features. Previously, you could only point your camera at something in the real world and ask Siri what you were looking at. Now, it works on screenshots too.

Take a screenshot of an event poster, and Siri can recognise the date, time, and location, then offer to create a calendar event for you. Or, snap a screenshot of an online product, and you can reverse image search it or hand it over to ChatGPT for “more information.” 
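The calendar step itself is bread-and-butter EventKit, which is partly why the feature feels more like plumbing than magic. Here’s a minimal sketch of creating an event from whatever details get pulled off a screenshot; the function and its parameters are hypothetical stand-ins, but the EventKit calls are the standard ones:

```swift
import EventKit

// Create a calendar event from details extracted off a screenshot (values are placeholders).
func addEvent(title: String, start: Date, end: Date) async throws {
    let store = EKEventStore()

    // iOS 17+ prompt for full calendar access
    guard try await store.requestFullAccessToEvents() else { return }

    let event = EKEvent(eventStore: store)
    event.title = title
    event.startDate = start
    event.endDate = end
    event.calendar = store.defaultCalendarForNewEvents

    try store.save(event, span: .thisEvent)
}
```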

It’s a neat step forward, but right now, it’s less “AI magic” and more “AI middleman.”

Privacy and Online Requirements

Apple’s been loud about the privacy angle with iOS 26. And to be fair, some of it is legit. When you ask Siri a question that requires ChatGPT, it’ll prompt you first before sending anything to OpenAI. That means you won’t accidentally feed your personal data to an external AI without knowing about it.

But here’s the reality check: most of the flashy Apple Intelligence features still require an internet connection. Without it, you’re back to the “offline Siri” skill set (setting timers, toggling settings, launching apps). Anything involving AI-generated content, contextual queries, or multi-step tasks just stops cold.
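If you want a feel for how that online/offline split plays out in code, here’s a tiny sketch of the connectivity check an app might run before offering a cloud-backed feature. It uses Apple’s Network framework; the gating logic is just an assumption for illustration, not how Apple actually does it:

```swift
import Network

// Watch connectivity and flip between "full AI" and "offline basics" modes.
let monitor = NWPathMonitor()
monitor.pathUpdateHandler = { path in
    if path.status == .satisfied {
        print("Online: cloud-backed Apple Intelligence features can run.")
    } else {
        print("Offline: stick to timers, toggles, and launching apps.")
    }
}
monitor.start(queue: DispatchQueue(label: "connectivity.monitor"))
```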

And while Apple might have stronger privacy messaging, it doesn’t mean your data never leaves the device. In certain cases, processing still happens in the cloud; Apple just routes it through their own servers (or OpenAI’s) instead of another random provider.

The bottom line: iOS 26’s privacy prompts are a nice touch, but “on-device processing” doesn’t mean you’re invisible. If you want to keep something truly private, don’t put it into any AI assistant.

iOS 26 Limitations to Expect

Even with all the Apple Intelligence hype, there are some practical limits you’ll run into with iOS 26. And it’s better to know them now than be surprised later.

  • First, offline mode. Yes, Siri can still set alarms, start timers, toggle Bluetooth, and launch apps without the internet. But any AI-heavy request just won’t happen unless you’re online. 
  • Second, hallucinations. That’s the AI term for when it confidently gives you the wrong answer. In my testing, I’ve had Apple Intelligence say one thing, then change its answer completely when I asked it to double-check, only to then confirm the second answer was “definitely correct.” This isn’t unique to Apple, but it means you can’t blindly trust every response.
  • Third, hardware and regional limits. The full Apple Intelligence feature set appears to require at least an iPhone 16, and possibly newer, with many of the best tricks rolling out in the US first. If you’re on an older model (like iPhone 15) or outside Apple’s priority markets, you might be looking at a very stripped-down experience for months.

In other words: iOS 26 is aiming high, but it’s still bound by the usual AI caveats, including internet dependence, occasional inaccuracy, and the “latest hardware only” tax.

Final Thoughts Before Release

iOS 26 is shaping up to be Apple’s most ambitious update in years, because it’s betting big on Apple Intelligence.

If you’re already an iPhone user, iOS 26 will be worth the update; it’s the biggest step forward Apple’s made in a long while. But if you’re comparing it to Android 16 with Google Gemini, it’s clear Apple’s still chasing rather than leading.

We’ll see how much changes between now and the public release, but right now, iOS 26 feels like the start of something exciting.

About the author
Pete Matheson
