Here’s when Apple’s AR glasses might be ready for launch – it’s sooner than you think

Apple's smart specs may land as soon as the end of 2026 – but they're just the start, with a grander AR vision to come

(Image: Meta's Ray-Ban smart glasses)

We’ve heard rumours of Apple’s AR smart glasses for years now – even before the Vision Pro headset hit the shelves. And since it arrived, we’ve heard that Apple’s main focus is shrinking the tech down into a pair of smart specs.

Now, it looks like Apple might be finally ready to make this device a reality. According to Bloomberg’s Mark Gurman, Apple is aiming for a release by the end of 2026.

The glasses are shaping up to be one of the most powerful offerings available. They're expected to come with built-in speakers, microphones and cameras, all powered by an Apple-designed chip. We're talking about a device that lets you take calls, listen to music, get directions and even use Siri without pulling out your phone. There's mention of live translation features too, though how well any of this will work is still very much up in the air. Especially given Siri's… reputation.

But apparently this isn't Apple's big AR vision just yet. Gurman notes that proper augmented reality glasses are still "years away." Instead, this sounds more like Apple's answer to Meta's Ray-Ban smart glasses – albeit with better build quality, according to sources at Apple. Considering Meta has already sold over a million pairs and Google is now working with Xreal and others on a rival Android XR platform, Apple will be entering an already crowded field.

But there's one big question hanging over Apple's effort: AI. Or rather, Apple's shaky track record with it. There are said to be internal concerns that the company's ongoing struggles with AI could hold this product back. Meta and Google are already pairing their wearables with their own advanced AI systems, Llama and Gemini. Apple, by contrast, currently leans on third parties – Google Lens and OpenAI – for its iPhone-based Visual Intelligence features.

That might be fine for now, but a future product like this will need something more native, more integrated. There’s a lot riding on the AI chops Apple announces at WWDC 2025 in a few weeks.

Connor Jewiss

About

Connor is a writer for Stuff, working across the magazine and the Stuff.tv website. He has been writing for around nine years, both online and in print. Connor has attended the biggest tech expos, including CES, MWC, and IFA, and has contributed as a judge on panels at them. He has also been interviewed as a technology expert on TV and radio by national news outlets including France24. Connor has experience with most major platforms, though he holds a special place in his heart for macOS, iOS/iPadOS, electric vehicles, and smartphone tech. Just like everyone else around here, he's a fan of gadgets of all sorts. Aside from writing, Connor is involved in the startup and venture capital scene, which keeps him at the front of new and exciting tech – he is always on the lookout for innovative products.

Areas of expertise

Mobile, macOS, EVs, smart home