High-tech gadgets and apps charge premium prices and promise greater independence for people with visual impairments when out and about. Alex Lee considers the promising gizmos consigned to dusty corners, unaffordable wearable tech, and the importance of human intervention.
In 2014, shortly after losing my eyesight, I backed a project on the crowdfunding website Indiegogo. Bereft and wanting to take back my independence without having to use a cane, I stumbled upon a new navigational tool called the Sunu Band, which used ultrasonic technology.
In reality it was just a modern-day reimagining of Leslie Kay’s Sonic Torch of the 1950s. You strapped the Sunu Band onto your wrist, and it would vibrate discreetly when it detected that an object was about to come into your path. You’d hold your arm in the middle of your body and gesticulate left and right – and upward if you wanted to sweep for any low-hanging branches.
It sounded life-changing. But after backing the project and waiting a few months for the product to be delivered, I used it for a couple of weeks and never did again. Despite the band costing me something like £300, it now lies in a drawer somewhere, collecting dust.
Maybe the tech just wasn’t very good. Or maybe it was because I got frustrated with having to constantly recharge the wristband, annoyed that it could die at any second. Whatever it was, I didn’t get on with the device.
The market is filled with hundreds of gadgets and apps like the Sunu Band – blind tech experiments, I like to call them. Not all of them are great. Some have been designed with input from the visually impaired, some without. Creating a truly game-changing bit of assistive tech doesn’t always pan out as planned, but sometimes the simplest ideas have the biggest impact.
The pricey promise of Bluetooth beacons
In March 2014, the Royal Society for Blind Children’s Youth Forum (then the Royal London Society for Blind People) launched the first ever manifesto for young visually impaired people living in England. One of the issues raised in the manifesto was independent travel and transportation.
The Youth Forum kicked off a partnership with design and tech firm ustwo to make that a reality. Together they launched a project called Wayfindr, with one goal: to create an app that would help visually impaired people get from one point in a London Underground station to another, completely independently.
To achieve this, Bluetooth low-energy beacons would be deployed all across the London Underground. Beacons use radio waves to send signals to devices, like smartphones, and can give a user a rough approximation of their location: the weaker the received signal, the further away the beacon is likely to be.
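For the technically curious, the ranging idea is straightforward. A minimal sketch of how a phone might turn a beacon’s received signal strength (RSSI) into a rough distance, using the standard log-distance path-loss model – the calibration values here (the expected signal strength at one metre, and the path-loss exponent) are illustrative assumptions, not figures from Wayfindr or any real deployment:

```python
def estimate_distance(rssi_dbm: float,
                      tx_power_dbm: float = -59.0,
                      path_loss_exponent: float = 2.0) -> float:
    """Rough distance in metres from a beacon, given received signal strength.

    tx_power_dbm is the RSSI you would expect at exactly 1 m from the
    beacon (advertised by the beacon itself); the path-loss exponent is
    roughly 2 in open space and higher indoors, where walls and crowds
    absorb the signal.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))


# At the calibration point (-59 dBm), the estimate is 1 metre;
# a 20 dB weaker signal suggests roughly 10 metres in open space.
print(estimate_distance(-59.0))
print(estimate_distance(-79.0))
```

In practice RSSI fluctuates wildly in a crowded Tube station, so real systems smooth readings over time and triangulate between several beacons rather than trusting any single estimate – one reason the engineering is harder than the idea suggests.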
After trialling the technology at Pimlico station and then at Euston station, it appeared to be a huge success. Then, well, “It just fizzled out,” says Daniel Smith, who was part of the Youth Forum at the time. The app was never made. “Messing around in a Tube station on an app with infrastructural development like Bluetooth beacons – it’s just a hell of a project to get off the ground,” he adds.
Bluetooth beacons have been used successfully in some European cities. In Lyon, France, every bus stop, roughly 90 per cent of the crossings and around 200 private and public buildings are fitted with Bluetooth sound beacons. If a visually impaired person wants to know exactly where they are, all they have to do is ping the beacon closest to them and listen for the verbal message telling them which building they’re at, whether it’s safe to cross the road, or which bus or tram is next to arrive at the stop.
But Smith is correct in his assessment of the situation – it is a large infrastructural undertaking. “The Bluetooth beacon thing has just been going on for years,” says Mike Wald, a professor in electronics and computer science, who leads research into accessible technologies at the University of Southampton.
He says that while architects now design buildings with intelligent bits of software, accessibility still isn’t being taken into account. While technology developed for visually impaired people has arguably improved the world for sighted people, through things like speech synthesis, blind people are still being forgotten when it comes to the fundamentals of getting around.
“The whole issue with Bluetooth beacons is – do you build them in when you build a building, and the same idea with pavements, or are they retrofitted?” Wald asks. “The beacons are a clever idea, but the cost of it is why they haven’t taken off.”
Bluetooth beacons only last a couple of years on a cell battery and need to be replaced on a regular basis. They also need to be placed strategically at certain points around buildings and cities in order to work. And if one fails, the whole system grinds to a halt. If you’re a visually impaired person trying to travel independently and your lifeline stops working, then the assistive technology isn’t fit for purpose. For similar reasons, many non-accessibility-related Bluetooth beacon projects fell by the wayside in 2016 and 2017.
Humans to the rescue
While there was a lot of promise in low-energy beacon technology, it seems to have been shelved. For now, at least. In the meantime, a compromise has emerged – crowd-sourced help. In January 2015, visually impaired Danish furniture craftsman Hans Jørgen Wiberg launched a free mobile app called Be My Eyes.
The app connects blind and visually impaired people to sighted volunteers through a live video feed accessed via your phone’s rear camera. As of 2021, there are four million signed-up volunteers around the world ready to help visually impaired people do basic tasks, like read their mail, or find a grocery item in a supermarket.
The same year that Be My Eyes launched, another company called Aira was also founded. Aira, a paid subscription service, connects trained sighted agents to visually impaired people. Aira agents also have access to the individual’s GPS location, and help direct them from place to place, while also noting any obstacles in the person’s path. Both apps show off exactly why assistive tech doesn’t simply have to mean OCR (optical character recognition) technology, speech synthesis or artificial intelligence, but can involve other people too.
When I first started using Be My Eyes, it didn’t feel like true independence. I used to be able to walk around, look up at a sign and order from a menu without the assistance of another human being. Now I was allowing people to peek into my life, helping me do things that I could have easily done on my own before. But my opinion of it has changed over the years.
“The thing I’ve always loved about Be My Eyes, and I’ve watched it grow, is that it’s a simple solution,” says Cathy Holloway, a professor of interaction design and innovation at University College London and the co-founder of the Global Disability Innovation Hub. “Do we need object recognition for absolutely everything, or can we have object recognition for 90 per cent of things, but actually, sometimes, even if you had the automated stuff, would you prefer to pick up the phone and ask?”
And Holloway is right. One day, I’m using a whole series of apps to find the location of a bar, determined to get there on my own. I’m using Microsoft’s Soundscape app, which reads out the names of shops and restaurants, in combination with Google Maps and, while they get me as close to my destination as possible, I still find myself lost in central London.
I could have purchased a £1,800 device called the OrCam, which could read shop signs out using artificial intelligence. Or maybe I could have brought along my £3,000 IrisVision virtual-reality goggles, which let me zoom into anything in my field of view with incredible clarity. All of these things would have helped me find my way there on my own. But it still would have taken me time or cost an eye-watering sum of money.
Until I finally decide to swallow my pride, pull up the Aira app, and call for help. Within a minute, I’m walking through the door of the bar, thanking my agent.
About the contributors
Alex is a tech and culture journalist. He is currently tinkering with gadgets and writing about them for the Independent. You may have previously seen his work in the Guardian, Wired and Logic magazine. When he’s not complaining about his struggles with accessibility, you’ll likely find him in a cinema somewhere, attempting to watch the latest science-fiction film.
Ian Treherne was born deaf. His degenerative eye condition, which by default naturally cropped the world around him, gave him a unique eye for capturing moments in time. Using photography as a tool, a form of compensation for his lack of sight, Ian is able to utilise the lens of the camera, rather than his own, to sensitively capture the beauty and distortion of the world around him, which he is unable to see. Ian Treherne is an ambassador for the charity Sense, has worked on large projects about the Paralympics with Channel 4 and has been mentored by photographer Rankin.