So I’ve turned up at your house for dinner (thanks for the invite, by the way), and I’ve got a camera attached to my face. Not only that, but I’m holding up an iPad at arm’s length so that I can fill any dull pauses in conversation by going on the internet. So, I’m here – and these soufflés are a revelation, by the way – but I’m also maybe filming you, or taking pictures of your house, or maybe I’m browsing the web while you’re talking.
And that’s basically the reason people don’t like Google Glass: whatever advanced technological abilities it may give you, the ones people notice are the ability to invade other people’s privacy and the ability to ignore people to their faces.
But while you can buy Google Glass, it isn’t consumer technology. It’s several years away from being something you’d actually buy. People who wear Glass today are like the stockbrokers who strutted around shouting into briefcase-sized mobile phones in the mid-80s – and who, like today’s Glassholes, were thought rude or laughable for doing so. They’re awkwardly encumbered by a technology that isn’t ready for mass use, and because there are no social conventions around smart glasses yet, no one has figured out how to use them politely.
That doesn’t mean that in 10 years’ time there won’t be a smartglass equivalent of the Nokia 3210.
And when you think about it, it’ll take less of a leap for smartglasses to enter our lives than it did for mobile phones. In countries like the US and UK, more than 60% of the population already wears glasses, so the infrastructure’s already there, and thanks to smartphones and tablets, we’re already used to carrying the internet with us everywhere we go.
I’ve already seen some prototype screens that fit into the frame of a regular pair of specs, which looks a great deal less creepy than Glass, and there’s research underway to create displays that fit into contact lenses, so you could have augmented reality information overlaid on your world and no one would really be able to tell.
How fun, and how useful, could that be? You could stare at the night sky and choose to see information about the stars and planets; you could look at a building and see diagrams of its internal structure; you could sit on a plane and look through the floor at the mountains and rivers beneath.
No more walking around looking at maps on your phone – signs with tailored directions hover in the air, visible only to you. You’d know how many calories you’d burned on your run, what appointments you had and when you needed to buy milk – because that information would just appear, hovering in your field of view, when convenient.
Google is far from the only company working towards this goal. Car companies are already putting heads-up displays in new models – your speed, the speed limit and your sat-nav directions just hang in the air, as if printed on the road – and it’s a lot better than looking down at the dashboard.
Virtual reality companies are busy turning bulky prototype headsets into sleek, high-resolution devices, but they won’t stop there: Palmer Luckey, founder of Oculus, has told me his eventual goal is to fit the same kind of display that goes into his company’s virtual reality headsets into a regular pair of glasses, creating a device that overlays the real world with immersive, high-resolution graphics.
When that happens, an AR device will become as commonplace and necessary as a smartphone today. And it still won’t be cool to film people without asking.
Will Dunn is the editor of Stuff magazine