We’ve just returned from AWE 2025, where we wandered through the convention centre and expo halls for three days, having interesting discussions with old friends and new. We enjoyed the talks by Evan Spiegel and the team from Meta, and some of the many panel discussions. It feels like the convergence of AI and wearable technology has reached a tipping point that is genuinely transformative rather than merely incremental. So, what do we do with this, now that it’s here?
Ori Gets The Shot
Ori Inbar, founder of AWE, demoed our Pool Assist Spectacles lens live on stage in his opening keynote. It was excellent to see this coming together.
Moving Computing Outdoors
It’s encouraging to see the next generation of wearable devices being designed with the real world in mind. Snap emphasised this with its Spectacles, and Niantic’s demo of the AI assistant Hello, Dot (developed with the brilliant team at Liquid City) underscored the shift: instead of being tethered to indoor environments, these devices bring contextual intelligence, offering insights into street art and architecture as you move through the city. The shift is clear: we’re moving from the home into the streets. More here.
Snap Yellow Everywhere
Snap had the most visible presence at the show. With their announcement of smaller, more advanced, and publicly available Spectacles for 2026, it felt like a serious commitment to moving beyond their current developer-focused approach.
There was a large team from Snap on-site, including engineers happy to help directly on in-progress projects. This is what Snap’s team always does extremely well: they make you feel seen, and they are approachable in a way that is unique among the tech giants.
What's particularly significant is that Evan specifically emphasized that next year’s devices will likely run whatever you build today. The device is pointless without an ecosystem of content, and that is where Snap really leads the charge, if only because Lens Studio is such a powerful and well-designed proprietary tool.
Creative Applications and Culture
The STYLY team's collaboration with the Tokyo Dome, The Moon Cruise, is a ticketed immersive space-travel experience that showed how spatial computing can enhance rather than replace real-world experiences, overlaying real-time performance data onto the stadium itself. More on the Tokyo Dome experience here.
Helen Papagiannis captured this wider shift perfectly in her talk about "Reality Modding" - the idea that reality is becoming editable. Though I have to wonder: do most people actually want reality to be that customizable?
Platform Convergence
Android XR's emergence as a unified platform felt significant. Seeing XREAL's Project Aura and Samsung's Project Moohan both running the same OS suggests we're moving past fragmentation, while Qualcomm's new on-device AI chips show the infrastructure is catching up to the vision.
The Practical Reality
Of course, challenges remain. Mostly, the awkwardness of explaining to non-industry-insiders what you're doing with this thing on your face, gesturing wildly into nothingness. Social acceptance is still a massive hurdle. But for the first time, it felt like the remaining engineering problems are being steadily solved rather than standing as fundamental barriers.
I think most people at AWE would agree we're building toward a post-smartphone world where spatial computing becomes the primary interface with reality. Let’s see what the wider public thinks about this, because it still feels like an inside secret. But AWE 2025 felt like the moment when that shifted from ‘the future’ to ‘the present’, as Ori himself emphasized on stage.
The Glasses Are Getting Smarter
The AR wearables space is moving faster than most of us can keep track of. For our own sake and anyone else’s, we thought we’d try to line up what we’re seeing out there right now.