What do you think of when you hear about Google Glass? Is it Google's strict stance on porn and facial recognition apps, or the potential for wild augmented reality video games? Google Glass can do even more than that, but you probably never imagined Google's AR glasses helping the blind see.
Say hello to OpenGlass, a set of custom tools for Google Glass from Dapper Vision that allow visually impaired users to “see” the world around them. The two tools now available in OpenGlass are called Question-Answer and Memento – both help people see in an entirely new way. Here’s how they work:
(Question-Answer) The user takes a picture and asks a question. Both get sent to cloud workers (Mechanical Turk) and Twitter users, who answer it. The answer is then read aloud to the user through Glass. Note: the voice you hear in the video is what the user hears, though it is slightly muffled because Glass uses a bone-conduction speaker (best heard through headphones; captions at the bottom of the screen note important user feedback).
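The Question-Answer flow described above can be sketched roughly as follows. This is not OpenGlass's actual code; the worker functions stand in for Mechanical Turk and Twitter responders, and `read_aloud` stands in for Glass's text-to-speech output:

```python
def ask_crowd(question, workers):
    """Broadcast a question to crowd workers; return the first answer received."""
    for worker in workers:
        answer = worker(question)
        if answer:
            return answer
    return None

def read_aloud(text):
    # Stand-in for Glass's bone-conduction text-to-speech output.
    return f"[spoken] {text}"

# Hypothetical workers: one unresponsive, one who answers the question.
workers = [
    lambda q: None,
    lambda q: "A red coffee mug" if "holding" in q else None,
]

print(read_aloud(ask_crowd("What am I holding?", workers)))
# → [spoken] A red coffee mug
```

In the real system the photo would travel with the question and the answer would arrive asynchronously, but the shape of the pipeline — fan the question out, take the first usable reply, speak it back — is the same.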
(Memento) A native app streams real-time video frames to a cluster, which performs image matching against a dataset of images and annotations created by a sighted user. Annotations associated with matching images are sent back to Glass and read aloud to the user.
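The Memento matching step can be illustrated with a toy sketch. The descriptors and annotations below are invented stand-ins (a real system would use image features like SIFT or ORB rather than tiny hand-written vectors), but the logic — compare each incoming frame against a pre-annotated dataset and speak back the best match — follows the description above:

```python
import math

# Hypothetical dataset: descriptor vector -> annotation recorded by a sighted user.
dataset = {
    (0.9, 0.1, 0.0): "front door of the apartment",
    (0.1, 0.8, 0.2): "kitchen stove, knobs on the right",
}

def similarity(a, b):
    """Cosine similarity between two descriptor vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def match_frame(frame, dataset, threshold=0.9):
    """Return the annotation of the best-matching dataset image, or None."""
    best, best_score = None, threshold
    for descriptor, annotation in dataset.items():
        score = similarity(frame, descriptor)
        if score >= best_score:
            best, best_score = annotation, score
    return best

print(match_frame((0.85, 0.15, 0.05), dataset))
# → front door of the apartment
```

The threshold matters: a frame that resembles nothing in the dataset should return no annotation rather than a wrong one read aloud to the user.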
In short, Google Glass isn't some miracle device that will give sight to the blind. It does, however, go a long way toward making the lives of visually impaired people easier. I'm especially a fan of crowdsourcing answers from Twitter users to tell people what they're looking at.
I don’t know if Google ever envisioned Glass helping the visually impaired, but it’s absolutely amazing to see developers doing things like this with the hardware. It makes one wonder what people will think of next.