

Why partially sighted people need smart glasses more than you

There's more to Google Glass and co than sneaky snaps of strangers. Sophie Charara talks to the inventors and researchers behind the smart specs that improve poor vision

How smart are smart glasses, really?

Sure, video chats in the corner of your eye: that’s clever. Navigation at eye level is nice, too. And winking to take a pic? Genius.

But the real boffins are working on eyewear that dramatically improves the vision of people with severe sight loss. A team of Oxford University and RNIB researchers and scientists won the 2014 Google Impact Challenge this summer for prototype AR smart glasses, which are fitted with a 3D camera and project clear, bright images onto the lenses in real time.

They look bonkers and are only just being tested out of the lab. But for millions of people, these smart glasses could be much more than an early-adopter tech toy.

Google Glass and Epson Moverio

Smart glasses are filtering, very slowly, into the mainstream (or at least onto Silicon Valley faces) thanks to Google Glass. And Stephen Hicks, inventor of the smart glasses that won the Google Impact Challenge, thinks the time is ripe for wearable AR tech for all kinds of uses, not just visual entertainment.

“Wearable displays, of the type you would actually like to wear, are becoming smaller, cheaper and much higher quality,” says Hicks. “Google Glass is one, but also Epson’s Moverio is a great starting place for our type of visual assistance. While these technologies may be appropriate for visually impaired people in the first instance, the types of developments we are coming up with – cameras, software, displays and user interfaces – will eventually become part of a wearable visual technology that I think we would all find useful.”

READ MORE: One day, you’ll be a Glasshole too…

Since starting the project, Stephen Hicks and John Worsfold, his partner in crime at the Royal National Institute of Blind People’s Innovation and Development team, have decided to use existing third-party hardware instead of going ahead with building a completely new device.

“We want to be as agnostic as possible in which hardware we use,” says Worsfold. “Because that market itself will evolve, there will be new innovations and we need to be able to capitalise on them when they come along.

“You can’t develop a pair of augmented reality glasses on half a million pounds (the winnings from the Google Impact Challenge). So being able to ride the back of third-party developments and consumer glasses is absolutely vital for us. But there have also been massive steps forward in separate disciplines, whether you’re talking about depth cameras or image processing. One of the challenges is to bring all that together in a form that lasts longer than 20 minutes on a battery.”

Next-gen wayfinding

The impact on the millions of people in the UK and abroad with partial vision could be huge if the teams can get a low-cost device into production.

Worsfold says there is a real business case for it. More importantly, if the smart glasses are discreet enough not to mark people out as having a disability then they can be used as a tool to increase confidence and independence. It could mean the difference between sticking to the same route every day and taking a risk with a new journey.

“Why is it important? This image processing is probably the most intuitive way to improve someone’s sight loss problems because what you’re using is their own sense; their own will to be able to see things.

“It is not like having an invasive implant where you have to learn how to use that. This is something that uses their sight – what they’ve been hanging onto for what could be years or decades. We can use that or amplify that to give people better situational awareness and an understanding of a scene around them.”
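To get a feel for the kind of image processing Worsfold is describing, here’s a deliberately simple sketch of one approach the Oxford prototype has been reported to use: its 3D camera measures depth, and nearby obstacles are rendered bright while the background falls away to dark. The function below is purely illustrative (the thresholds and the linear mapping are our assumptions, not the team’s actual code).

```python
import numpy as np

def depth_to_brightness(depth_m, near=0.5, far=4.0):
    """Map a depth frame (in metres) to an 8-bit brightness image.

    Nearer objects come out brighter; anything at or beyond `far`
    renders black. Illustrative only - the real glasses' pipeline
    is far more sophisticated.
    """
    d = np.clip(depth_m, near, far)
    # Linear ramp: near -> 255 (bright), far -> 0 (dark)
    scaled = (far - d) / (far - near)
    return (scaled * 255).astype(np.uint8)

# A toy 2x2 "depth frame": a close object, mid-range clutter, a far wall
frame = np.array([[0.5, 1.0],
                  [2.0, 4.0]])
print(depth_to_brightness(frame))
```

In a wearable, a loop like this would run on every camera frame and feed the transparent display, so residual vision does the rest: the wearer sees their own surroundings, just with the nearest hazards made impossible to miss.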

Blimey, bionic glasses

There are still plenty of tech and design challenges to work around. Weight is one, and Hicks expects it to be a problem for the next few years. Power is another: at the moment the prototype developed at Oxford needs wiring up to a box in the wearer’s pocket or a laptop in a backpack.

“Screen size is my major focus at the moment,” says Hicks. “Both in terms of field of view (how wide your screen is) and thickness. Not only do we have to make it wide and lightweight, it also has to be thin and transparent. There have been a couple of breakthroughs in this area just this year and I’m excited about working with these inventors and their incredible concepts throughout 2015.”

Worsfold agrees that transparency poses an extra challenge, and that it’s crucial to persuading people to wear super vision-enhancing specs.

“Obviously our glasses have to be transparent, whereas conventional AR glasses don’t necessarily need to be,” he says. “Ours do for two reasons. We want to use people’s residual vision – and from a social interaction point of view, if you stick a big, black mask in front of your face, you suddenly become less approachable.”

Worsfold’s first reaction when he read about the initial ideas Hicks had presented to the Royal Society in 2011 was ‘Blimey, someone’s making bionic glasses. What a load of hype.’ Now, this smart-glasses tech has jumped through hoops set by the RNIB, the NHS and Google’s Impact Challenge.

What’s next? Porting the software to Android to harness the power of smartphones, refining face-enhancement techniques to improve recognition and, of course, getting to the stage where smart glasses of all shapes and sizes are socially acceptable on the street. Early adopters, at the ready…

Sophie Charara is a reviewer on Stuff.tv

