The air buzzed with that familiar blend of over-caffeinated anticipation, jet lag, over-the-top optimism, and just enough competitive anxiety to keep the conversations sharp. Between the App Store veterans and the AI start-up hopefuls, the vibe was unmistakable: Apple needed to bring something fresh — not just feature parity with Google, not just a clever new icon — but a vision. Preferably one that didn’t look like it had been panic-coded after OpenAI name-dropped Jony Ive barely a week earlier.

The Look: Liquid Glass and the Cult of Clarity
Let’s talk about the translucent elephant in the room: Liquid Glass. It’s not just a design refresh; it’s a full-on philosophical shift. Every icon, menu, and window now floats in a semi-transparent cocoon that’s equal parts minimalism and digital opulence. It’s the biggest aesthetic overhaul since skeuomorphism got the boot — and yes, it’s stunning. At times, it also feels like Apple accidentally trained a neural net on bubble wrap, but the ambition behind it is clear (pun definitely intended).

It’s also everywhere: iOS 26, macOS Tahoe, iPadOS 26, watchOS, tvOS — even visionOS. Critics have pointed out some usability concerns — transparency can be gorgeous and confusing all at once — but for most of us at the conference, the immediate response was awe. Think: ‘Apple’s designers ate glass, and it was delicious.’ On the second day of the event I attended a developer activity in the Big Sur auditorium, and there were moments during the live demos when the whole room wowed out loud.
The Feel: Whispered AI and Loud Silence on Siri
What didn’t get shouted from the rooftops — or even mentioned much at all — was Siri. If she was at WWDC, she was wearing an invisibility cloak behind a frosted Liquid Glass pane.
Instead, Apple introduced its AI updates in a far more subdued, Apple-esque fashion: less “here’s our model card,” more “your iPhone just knows.” The headline act here wasn’t a flashy ChatGPT competitor. It was contextual intelligence, sprinkled smartly across features like:
- Call Screening with voice transcription à la Google Pixel
- Hold Assist, your new AI call butler
- Live Translation, on-device and almost real-time
- Visual Intelligence, which lets you interrogate screenshots like a digital Poirot
It’s all smart, cautious, and hyper-integrated — and in typical Apple fashion, built with privacy front and centre (which is great IMHO). The on-device commitment remains strong, even if the model itself isn’t breaking benchmark records (yet).
The unspoken narrative here? Apple is doing AI on its own terms. And while it might be playing catch-up in model capabilities, it’s betting on trust and ecosystem integration as its ace.
Foundation Models
What was interesting to me and a lot of other attendees was the Foundation Models framework. Apple is quietly opening a powerful new chapter for developers: a new toolkit that puts Apple’s on-device intelligence directly into your hands.
The framework allows developers to build on the same privacy-preserving, latency-crushing models that power Apple Intelligence, but within their own apps. That means smarter features, no server round-trips, and experiences that work even offline.

Craig Federighi put it simply (and elegantly): Apple’s models are getting better, faster, and now — more open. “We’re giving developers direct access to the on-device foundation model… powerful, fast, built with privacy, and available even when users are offline,” he said. “We think this will ignite a whole new wave of intelligent experiences.”
And that’s the real story here. Want your education app to generate personalised quizzes on the fly? (Think Kahoot! — I had the pleasure of meeting one of the developers behind the API; well done, Konstantinos!) Want your nature app to understand voice queries deep in the woods with zero signal? Now you can — with zero inference costs and full control. In short: this isn’t just Apple shipping smarter apps. It’s Apple giving you the tools to do the same.
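To make that concrete, here’s a minimal sketch of what calling the on-device model could look like in an app. The `LanguageModelSession` and `respond(to:)` API shapes follow what Apple showed at WWDC 2025, but treat the details (and the quiz-master framing) as illustrative rather than definitive — the framework requires an Apple Intelligence-capable device and the exact surface may evolve.

```swift
import FoundationModels

// Sketch: generate a quiz question entirely on-device, no server round-trip.
// Runs only on Apple Intelligence-capable hardware (iOS 26 / macOS Tahoe era).
func makeQuizQuestion(about topic: String) async throws -> String {
    // Instructions steer the session, much like a system prompt.
    let session = LanguageModelSession(
        instructions: "You are a friendly quiz master writing questions for students."
    )
    // One round-trip to the local foundation model — works offline.
    let response = try await session.respond(
        to: "Write one multiple-choice question about \(topic)."
    )
    return response.content
}
```

Because inference happens locally, this is the “zero inference costs” promise in practice: no API keys, no per-token billing, and the nature-app-in-the-woods scenario works with no signal at all.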
The Substance: Macifying the iPad, Gaming Grows Up, and More
If you’ve ever used your iPad and wished it would just act like a Mac already — iPadOS 26 is your moment. Windowed apps, a proper menu bar, Exposé — the tablet has finally evolved into the laptop-replacement Apple once teased us with, back in the days when keyboards and USB-C were heresies.
Meanwhile, Apple’s subtle takeover of the gaming world took another step with the new Games app — finally acknowledging that half a billion people don’t just want mobile games buried inside the App Store. It’s sleek, social, and looks like Apple Arcade just got a second wind.
macOS Tahoe continued the Liquid Glass party and signalled the quiet farewell to Intel Macs. RIP, spinning fans. You’ll be missed by no one.
The Edge: visionOS, watchOS and Apple TV
visionOS is still trying to convince people to strap a ski goggle to their face, but the addition of spatial widgets and co-viewing features means Apple’s not giving up. There’s something eerily beautiful about sticking a floating calendar next to your actual kettle and then walking around it. If you’re a developer in this space, the spatial anchoring APIs alone are worth a tinker. And there, I’ll come out and say it… I like it!
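If you fancy that tinker, here’s roughly what pinning content near the kettle looks like. `AnchorEntity` and its plane target are standard RealityKit API; the view name, panel sizes, and the kettle scenario are my own illustrative choices, not anything Apple shipped.

```swift
import SwiftUI
import RealityKit

// Sketch: anchor a placeholder panel to a real-world horizontal surface,
// e.g. the countertop next to that kettle. visionOS only.
struct KettleWidgetView: View {
    var body: some View {
        RealityView { content in
            // Anchor to any table-classified horizontal plane
            // at least 20 cm x 20 cm.
            let anchor = AnchorEntity(
                .plane(.horizontal,
                       classification: .table,
                       minimumBounds: SIMD2<Float>(0.2, 0.2))
            )
            // A flat slab standing in for a spatial widget.
            let panel = ModelEntity(
                mesh: .generatePlane(width: 0.25, height: 0.15),
                materials: [SimpleMaterial(color: .white, isMetallic: false)]
            )
            anchor.addChild(panel)
            content.add(anchor)
        }
    }
}
```

The nice part is that the system handles plane detection and persistence for you — your entity simply stays put as you walk around it.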
Meanwhile, watchOS got more than a birthday cake. Liquid Glass made it to the wrist, alongside an AI-powered Workout Buddy that talks to you through your AirPods. It’s either deeply motivating or vaguely dystopian — depending on your sleep score.
Apple TV got its own frosted makeover too, with new glassy menus and some tempting TV+ content (hello, Murderbot). But the real news might be that Maps and CarPlay now look like Google Maps and Android Auto. Which isn’t revolutionary — unless you’re Apple, where sometimes admitting your rivals had good ideas is the boldest move of all.

A Transitional Year in Full Transparency
WWDC 2025 wasn’t about shock and awe. It wasn’t trying to out-AI OpenAI or out-meme Google. What it did was show Apple doubling down on what it’s always done best: platform-wide consistency, refined user experience, and feature creep dressed up as elegance.
Is Liquid Glass revolutionary? Perhaps not, but it’s cohesive, ambitious, and a signal that Apple’s long design drought is over. And it makes full use of the hardware behind it.
Is Apple’s AI strategy leading the charge? Not yet. But I have my hopes: Apple is quietly embedding intelligence where it matters, rather than making it the headline act.
As a developer and designer, it felt like a good year. A necessary breath before the next sprint. There’s polish. There’s focus. There’s a whisper of what’s coming. And then there was the delightful experience of coming to Apple Park with fellow developers. I was lucky enough to be invited to the premiere of F1 at the Steve Jobs Theater on the Tuesday evening. A true ‘one more thing’ to have enjoyed.
