How the Blind Point a Smartphone at Everyday Control Panels—and Hear Prompts on Which of Those Microwave Buttons to Push. They Can Even Order Up Braille Labels.

By Lori Cameron
Published 04/19/2019

Push-button interfaces are everywhere—microwaves, toasters, coffee makers, thermostats, printers, copiers, checkout terminals, kiosks, and remote controls. And while they afford most of us great convenience, they are largely inaccessible to people who are visually impaired.

But two new technologies aim to change that—VizLens and Facade, say researchers Anhong Guo and Jeffrey P. Bigham of Carnegie Mellon University in their study “Making Everyday Interfaces Accessible: Tactile Overlays by and for Blind People.”

“Making a physical environment accessible to blind people generally requires sighted assistance. VizLens and Facade put blind users at the center of a crowdsourced, computer-vision-based workflow that lets them make the environment accessible on their own terms,” write Guo and Bigham.


Challenges faced by the visually impaired in using digital interfaces

People who are blind or visually impaired have a tough enough time as it is navigating their environment. When it comes to digital interfaces, blind users face several unique challenges:

  • Flat digital touchpads have replaced physical buttons, which blind users could previously distinguish with their fingers.
  • Blind people must rely on a sighted assistant to identify button functions and apply Braille labels to home appliances.
  • Because blind people cannot remember all the abbreviations and functions of complex interfaces, they might choose to label only a few functions, limiting their access.
  • If Braille labels wear out from frequent use, which happens often with kitchen appliances, blind users lose access to those functions and need sighted help to relabel the buttons.

In addition to having buttons and controls identified, people who are visually impaired need a way to easily reproduce and attach Braille labels to a digital interface.


That’s where VizLens and Facade come in.

How VizLens helps people who are blind

To begin, the blind user takes a picture of the digital interface.

How?

VizLens users take a picture of an interface they would like to use, such as that of a microwave oven. This image is interpreted quickly by crowd workers in parallel. The system then uses computer vision to give instantaneous interactive feedback and guidance on using the interface through a mobile (left) or wearable device (right).

When the user holds the smartphone up to the interface, the app can tell if it is partially out of frame by detecting whether the corners of the interface are inside the camera frame. If they are not, the app will say something like “Move phone to the right.” If it detects no interface at all, it will say, “No object.”
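
The article does not spell out the exact guidance logic, but a minimal sketch of the idea in Python might look like the following; the function name, coordinate convention, and wording of the prompts are our own assumptions, not VizLens’s actual code.

```python
# A minimal sketch (not VizLens's actual code) of the framing feedback
# described above: given the detected corners of the interface in the camera
# frame, decide whether the whole panel is visible and, if not, which way
# the user should move the phone.

def framing_hint(corners, frame_w, frame_h):
    """corners: list of (x, y) corner points of the interface, or None."""
    if not corners:
        return "No object"

    xs = [x for x, _ in corners]
    ys = [y for _, y in corners]

    hints = []
    if min(xs) < 0:
        hints.append("left")    # panel spills past the left edge of the frame
    if max(xs) > frame_w:
        hints.append("right")
    if min(ys) < 0:
        hints.append("up")
    if max(ys) > frame_h:
        hints.append("down")

    if not hints:
        return "Interface in view"
    return "Move phone to the " + " and ".join(hints)


# Example: the right edge of the panel falls outside a 640x480 frame.
print(framing_hint([(50, 40), (700, 45), (60, 400), (710, 410)], 640, 480))
# -> "Move phone to the right"
```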

Once an image is captured, the user sends it to the “crowd” to view and label all of the parts of the interface. For these apps, the developers use Amazon Mechanical Turk, a crowdsourcing platform where tens of thousands of workers are always online completing all kinds of human intelligence tasks.


Within a few minutes, crowd workers rate the image quality (whether it is blurry or partially cropped), mark the layout of the interface, outline its buttons and other controls, and describe each button (for example, “baked potato” or “start/pause”). These results are verified by majority vote and stored on the app’s server for later retrieval.
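
The study’s verification code is not reproduced in the article, but the majority-vote step can be illustrated with a short sketch; the data layout and agreement threshold below are assumptions made for the example.

```python
# Illustrative sketch of majority-vote verification of crowd answers:
# each worker labels the same button region, and a label is kept only
# when enough workers agree on it.
from collections import Counter

def aggregate_button_labels(worker_labels, min_agreement=2):
    """worker_labels: {button_id: list of labels submitted by crowd workers}."""
    verified = {}
    for button_id, labels in worker_labels.items():
        normalized = [label.strip().lower() for label in labels]
        label, votes = Counter(normalized).most_common(1)[0]
        if votes >= min_agreement:      # keep only labels with clear agreement
            verified[button_id] = label
    return verified

answers = {
    "btn_3": ["baked potato", "Baked Potato", "popcorn"],
    "btn_7": ["start/pause", "start", "stop"],
}
print(aggregate_button_labels(answers))
# {'btn_3': 'baked potato'}  -- btn_7 has no clear majority in this example
```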

Later, when blind users want to use the interface, they open the VizLens mobile app, point the phone toward the interface, and hover a finger over the buttons.

VizLens gives users auditory feedback when they hover their finger over the buttons of different kinds of digital interfaces.

As can be seen in the video above, computer vision matches the crowd-labeled reference image to the live camera image. VizLens then detects which button the user is pointing at and tells the user what it is.
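
The paper’s implementation details are not given in the article, but feature matching with a homography is a standard way to align a labeled reference photo with a live camera frame. The sketch below, using OpenCV, shows the general idea; the function name, thresholds, and data layout are assumptions for illustration, not VizLens’s actual API.

```python
# Sketch of the matching step: align the crowd-labeled reference image with
# the live frame, project the user's fingertip into reference coordinates,
# and look up which labeled button it is over.
import cv2
import numpy as np

def find_button_under_finger(reference_img, live_frame, fingertip_xy, buttons):
    """buttons: {label: (x, y, w, h)} rectangles in reference-image coordinates."""
    sift = cv2.SIFT_create()
    kp_ref, des_ref = sift.detectAndCompute(reference_img, None)
    kp_live, des_live = sift.detectAndCompute(live_frame, None)

    # Match features and keep only confident matches (Lowe's ratio test).
    matches = cv2.BFMatcher().knnMatch(des_live, des_ref, k=2)
    good = [m[0] for m in matches
            if len(m) == 2 and m[0].distance < 0.75 * m[1].distance]
    if len(good) < 10:
        return None  # too few matches: the interface is probably not in view

    # Homography that maps live-frame coordinates to reference coordinates.
    src = np.float32([kp_live[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if H is None:
        return None

    # Project the fingertip into the reference image and find the button.
    pt = cv2.perspectiveTransform(np.float32([[fingertip_xy]]), H)[0][0]
    for label, (x, y, w, h) in buttons.items():
        if x <= pt[0] <= x + w and y <= pt[1] <= y + h:
            return label  # e.g. "start/pause", which the app then reads aloud
    return None
```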

How Facade works

Facade uses the same image capturing and crowdsourcing tech that VizLens does—but with one important difference.

After gathering all the data about a digital interface, Facade allows users to make 3D prints of tactile overlays for appliance interfaces in just minutes.

If users don’t have a 3D printer at home, they can use a mail-order service like 3D Hubs.

Facade’s crowdsourced fabrication pipeline enables blind people to independently make physical interfaces accessible by producing a 3D-printed overlay of tactile buttons.

Tactile overlays can be 3D-printed for any digital interface, with customized words or abbreviations marking the buttons. The layout of the tactile overlay is stored in the Facade app so that, if users need to replace it, they can easily print and apply a new one.
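
Facade’s actual fabrication pipeline is more sophisticated than this, but a toy sketch can show how a stored button layout might be turned into printable geometry; the OpenSCAD output, dimensions, and marker shapes below are purely illustrative assumptions.

```python
# Toy illustration: emit OpenSCAD source for a thin base plate with a raised
# marker over each button, so the overlay can simply be reprinted when a
# label wears out. All dimensions are in millimetres and are assumptions.

def overlay_scad(panel_w, panel_h, buttons, base_t=1.0, marker_t=1.5):
    """buttons: {label: (x, y, w, h)} button positions on the panel, in mm."""
    parts = [f"cube([{panel_w}, {panel_h}, {base_t}]);  // base plate"]
    for label, (x, y, w, h) in buttons.items():
        parts.append(
            f"translate([{x}, {y}, {base_t}]) "
            f"cube([{w}, {h}, {marker_t}]);  // tactile marker: {label}"
        )
    return "\n".join(parts)

layout = {"start/pause": (10, 10, 15, 8), "baked potato": (10, 25, 15, 8)}
with open("overlay.scad", "w") as f:
    f.write(overlay_scad(120, 80, layout))
# Render the .scad file in OpenSCAD and export an STL for printing in a
# flexible material such as PolyFlex or SemiFlex.
```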

Samples of printed tactile overlays and legends generated by Facade. Users can choose to print a legend for the abbreviations. If they do not have a 3D printer at home, overlays can be printed by a commercial mail-order service such as 3D Hubs using PolyFlex or SemiFlex materials.

Static vs. dynamic interfaces

Static interfaces include the types of displays you see on microwave ovens, printers, and remote controls. The buttons don’t change, so a single reference image is enough.

Dynamic interfaces include the types of displays you see on kiosks or checkout terminals, where pushing a button takes the user to a new screen with a different layout, so a single reference image is not enough.
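
One plausible way to handle such screens (our assumption, not a detail from the study) is to keep a labeled reference image for every screen and pick the best-matching one for each live frame before looking up buttons on it:

```python
# Sketch of dynamic-interface handling: choose which stored screen the
# camera is currently looking at by counting good feature matches.
import cv2

def best_matching_screen(live_frame, screens):
    """screens: {screen_name: labeled reference image}; returns the best match."""
    sift = cv2.SIFT_create()
    matcher = cv2.BFMatcher()
    _, des_live = sift.detectAndCompute(live_frame, None)

    best_name, best_score = None, 0
    for name, ref in screens.items():
        _, des_ref = sift.detectAndCompute(ref, None)
        matches = matcher.knnMatch(des_live, des_ref, k=2)
        good = [m[0] for m in matches
                if len(m) == 2 and m[0].distance < 0.75 * m[1].distance]
        if len(good) > best_score:
            best_name, best_score = name, len(good)
    return best_name  # e.g. the payment screen of a checkout terminal
```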


VizLens and Facade promise to make it easier for the visually impaired to interact with both types of interfaces.

These two new apps “solve the long-standing challenge of making everyday interfaces accessible by tightly integrating the complementary strengths of the end user, the crowd, computer vision, and fabrication technology,” Guo and Bigham say.

 


About Lori Cameron

Lori Cameron is a Senior Writer for the IEEE Computer Society and writes regular features for its digital platforms. Contact her at l.cameron@computer.org. Follow her on LinkedIn.