Future accessibility tools will be smart peripherals

REDONDO BEACH, Calif. -- While webmasters and systems administrators struggle with kludgy software add-ons, one visionary is experimenting with techniques to virtually let the blind see and the deaf hear.

Neil G. Scott, a computer scientist and chief engineer for the Archimedes Project at Stanford University, has been studying the accessibility issue for years. His current research focuses on networks of intelligent peripherals that make computer output understood by sight- or hearing-impaired people.

Scott and his students use a visual total access port, or VTAP, that connects to a computer, converting various screen objects into a bitmap data stream. The stream feeds into a prototype device called a graphical user interface accessor, which in turn directs it to any of several output assistive devices.
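The pipeline described above -- a bitmap stream fanned out to whichever assistive devices are attached -- can be sketched roughly as follows. All class and method names here are illustrative assumptions, not taken from Scott's prototype:

```python
# Hypothetical sketch of the VTAP-to-GUI-accessor pipeline.
# Names (GUIAccessor, AssistiveDevice, etc.) are illustrative only.

class AssistiveDevice:
    """Base class for an output assistive device."""
    def render(self, screen_object):
        raise NotImplementedError

class AudioDevice(AssistiveDevice):
    def render(self, screen_object):
        return f"audio: {screen_object['label']}"

class BrailleDevice(AssistiveDevice):
    def render(self, screen_object):
        return f"braille: {screen_object['label']}"

class GUIAccessor:
    """Routes screen objects from the VTAP stream to attached devices."""
    def __init__(self):
        self.devices = []

    def attach(self, device):
        self.devices.append(device)

    def dispatch(self, stream):
        # Fan each screen object out to every attached device.
        return [dev.render(obj) for obj in stream for dev in self.devices]

accessor = GUIAccessor()
accessor.attach(AudioDevice())
accessor.attach(BrailleDevice())
out = accessor.dispatch([{"type": "icon", "label": "Microsoft Excel"}])
```

The point of the design is that the accessor, not the application, decides where output goes, so new devices can be attached without changing the computer being accessed.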

In Scott's prototype, icons come out as musical chords with a sharp audio rise and a slow decay, ending with the words associated with the icon, such as "Microsoft Excel."
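The chord shape described -- a quick rise followed by a slow fade -- is essentially an amplitude envelope applied to a set of partials. A minimal sketch, assuming hypothetical attack and decay parameters (not Scott's actual synthesis code):

```python
import math

def chord_envelope(t, attack=0.01, decay=1.0):
    """Amplitude envelope: quick linear rise, slow exponential decay.
    Parameter values are illustrative assumptions."""
    if t < attack:
        return t / attack                     # sharp rise to full amplitude
    return math.exp(-(t - attack) / decay)    # slow fade afterward

def chord_sample(t, freqs=(261.63, 329.63, 392.0)):
    """One sample of a C-major chord (an arbitrary example chord),
    shaped by the envelope above."""
    tone = sum(math.sin(2 * math.pi * f * t) for f in freqs) / len(freqs)
    return chord_envelope(t) * tone
```

Halfway through the attack the envelope is at half amplitude, and a second after the peak it has decayed to roughly a third, which is the "high rise, slow decay" character the prototype uses to make icons audibly distinct.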

A haptic display, a type of mouse driven by X- and Y-axis motors, tracks the outlines of frames and windows as the mouse moves over them -- think of it as a computerized Ouija board. The haptic display can lodge itself on top of a radio button or slide off a menu option that's grayed out on the screen. Or the GUI accessor can drive a braille display or text reader.
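The lodge-on or slide-off behavior amounts to the motors applying a force toward enabled controls and away from disabled ones. A toy sketch of such a control law, with an invented force model (not Scott's actual hardware logic):

```python
import math

def haptic_force(cursor, widgets, gain=1.0):
    """Return an (fx, fy) force on the motorized mouse.
    Enabled widgets attract the cursor (it 'lodges' on them);
    grayed-out widgets repel it (it 'slides off').
    The unit-vector force model is an illustrative assumption."""
    cx, cy = cursor
    fx = fy = 0.0
    for w in widgets:
        dx, dy = w["x"] - cx, w["y"] - cy
        dist = math.hypot(dx, dy) or 1e-9    # avoid division by zero
        sign = 1.0 if w["enabled"] else -1.0
        fx += sign * gain * dx / dist
        fy += sign * gain * dy / dist
    return fx, fy

# An enabled button to the right pulls the mouse toward it;
# the same button grayed out pushes the mouse away.
pull = haptic_force((0, 0), [{"x": 10, "y": 0, "enabled": True}])
push = haptic_force((0, 0), [{"x": 10, "y": 0, "enabled": False}])
```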

"Finding the best output for each picture or image is the challenge," Scott said last week in an interview during the Digital Government Research Conference, where he demonstrated the results of his latest research.

--Thomas R. Temin

