Why partially sighted people need smart glasses more than you

There's more to Google Glass and co than sneaky snaps of strangers. Sophie Charara talks to the inventors and researchers behind the smart specs that improve poor vision

How smart are smart glasses, really?

Sure, video chats in the corner of your eye: that's clever. Navigation at eye level is nice, too. And winking to take a pic? Genius.

But the real boffins are working on eyewear that dramatically improves the vision of people with severe sight loss. A team of Oxford University and RNIB researchers and scientists won the 2014 Google Impact Challenge this summer for prototype AR smart glasses, which are fitted with a 3D camera and project clear, bright images onto the lenses in real time.

They look bonkers and are only just being tested out of the lab. But for millions of people, these smart glasses could be much more than an early-adopter tech toy.

Next-gen wayfinding

The impact on the millions of people in the UK and abroad with partial vision could be huge if the teams can get a low-cost device into production.

Worsfold says there is a real business case for it. More importantly, if the smart glasses are discreet enough not to mark people out as having a disability then they can be used as a tool to increase confidence and independence. It could mean the difference between sticking to the same route every day and taking a risk with a new journey.

“Why is it important? This image processing is probably the most intuitive way to improve someone's sight loss problems because what you're using is their own sense; their own will to be able to see things.

“It is not like having an invasive implant where you have to learn how to use that. This is something that uses their sight – what they've been hanging onto for what could be years or decades. We can use that or amplify that to give people better situational awareness and an understanding of a scene around them.”

Blimey, bionic glasses

There are still plenty of tech and design challenges to work around. Weight is one and Hicks expects it to be a problem for the next few years. Power is another: at the moment the prototype developed at Oxford needs wiring up to a box in the wearer's pocket or laptop in a backpack.

“Screen size is my major focus at the moment,” says Hicks. “Both in terms of field of view (how wide your screen is) and thickness. Not only do we have to make it wide and lightweight, it also has to be thin and transparent. There have been a couple of breakthroughs in this area just this year and I'm excited about working with these inventors and their incredible concepts throughout 2015.”

Worsfold agrees about the extra challenge that transparency poses, and about how crucial it is to getting people to wear vision-enhancing specs.

“Obviously our glasses have to be transparent, whereas conventional AR glasses don’t necessarily need to be,” he says. “Ours do for two reasons. We want to use people’s residual vision – and from a social interaction point of view, if you stick a big, black mask in front of your face, you suddenly become less approachable.”

Worsfold's first reaction when he read about the initial ideas Hicks had presented to the Royal Society in 2011 was 'Blimey, someone's making bionic glasses. What a load of hype.' Now, this smart-glasses tech has jumped through hoops set by the RNIB, the NHS and Google's Impact Challenge.

What's next? Porting the software to Android to harness the power of smartphones, refining face-enhancement techniques to improve recognition and, of course, getting to the stage where smart glasses of all shapes and sizes are socially acceptable on the street. Early adopters, at the ready…