Project Goal

HandSight is a vision-augmented touch system intended to support activities of daily living for people with visual impairments, through sensing and feeding back non-tactile information about the physical world as it is touched. 

Keywords

Accessibility, Visual Impairments, Wearables, Real-Time OCR

Role

User Research, Prototype Testing, Prototyping

Team

Catherine Jou, Lee Stearns, Anis Abboud, and others from the HandSight team led by Jon Froehlich at the UMD HCI Lab


User Testing: Text Reading Prototype


Prototype Features

I became involved in this project when the prototype was already in the final stages of development and the team was gearing up to test it. This phase of the project focused specifically on helping a visually impaired individual read digital and printed text through touch. The prototype was testable in two modes:

  • An iPad app mode that read words aloud as the user traced them with a finger, and logged precise path measurements.
  • A finger-mounted camera mode that used a NanEye camera to scan and read back printed text on paper.

The other component of the prototype was audio and haptic feedback to let the user know where their finger was in relation to a target location in a piece of text. The feedback system was controlled by a wrist-mounted Arduino Pro Micro board, with haptic actuators secured above and below the user's finger to provide directional feedback.
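
To give a sense of how this kind of feedback loop can be driven, here is a minimal Arduino-style sketch of the idea. The pin numbers, serial protocol, and intensity scaling are assumptions for illustration, not the team's actual firmware.

    // Minimal sketch of directional haptic control for line tracing.
    // Pin numbers, the serial protocol, and the up/down convention are
    // assumed for illustration.

    const int MOTOR_ABOVE = 9;   // actuator above the finger (assumed pin)
    const int MOTOR_BELOW = 10;  // actuator below the finger (assumed pin)

    void setup() {
      pinMode(MOTOR_ABOVE, OUTPUT);
      pinMode(MOTOR_BELOW, OUTPUT);
      Serial.begin(9600);  // vertical offset streamed from the tracker
    }

    void loop() {
      if (Serial.available() > 0) {
        // Signed offset of the fingertip from the current text line.
        int offset = Serial.parseInt();
        // Buzz on the side the finger should move toward; intensity
        // scales with distance so drift feels proportionally stronger.
        int strength = constrain(abs(offset) * 4, 0, 255);
        analogWrite(MOTOR_ABOVE, offset < 0 ? strength : 0);
        analogWrite(MOTOR_BELOW, offset > 0 ? strength : 0);
      }
    }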

Prototype Testing

The initial prototypes for this project had already been built when I joined the HandSight research team, so my primary role in this phase was to help carry out user testing of the text reading prototype. The objective of the study was to explore how to continuously guide a blind user’s finger across text. The procedure began by letting participants explore a document freely by touch to gain a spatial understanding of its layout (photos, text, whitespace). We then took participants through reading tasks under different feedback conditions (audio, haptic).

Though preliminary, our results suggested that participants valued the ability to access printed material and that, in contrast to previous findings, audio finger guidance may result in the best reading performance.


Design + Build: Hand Navigation Prototype


Design & Prototyping

The next phase of the project investigated providing body-mounted directional haptic feedback to a visually impaired user. We settled on this direction after a series of interviews with visually impaired users about their daily routines and challenges, along with an extensive literature review of the wearable haptic technologies that had been explored.

We prototyped actuators mounted on a ribbon-cable bracelet and programmed them with different vibration patterns mapped to directional signals and movements. We developed an interface in Processing to test the patterns (directional, targeted, pulsating, and interpolation), as well as an Android app to facilitate directional target acquisition tasks.
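
The interpolation pattern, for instance, can split vibration intensity between the two motors nearest the target direction. The Arduino-style sketch below illustrates one way to do that mapping; the motor count, pin assignments, and demo loop are assumptions, and a real band would likely drive this many motors through a driver board rather than directly from the microcontroller's pins.

    // Illustrative direction-to-motor mapping for an N-motor wristband,
    // interpolating between the two actuators nearest the cue direction.
    // Motor count and pins are assumed for illustration.

    const int NUM_MOTORS = 8;
    const int MOTOR_PINS[NUM_MOTORS] = {3, 5, 6, 9, 10, 11, 12, 13};

    void setup() {
      for (int i = 0; i < NUM_MOTORS; i++) {
        pinMode(MOTOR_PINS[i], OUTPUT);
      }
    }

    // Vibrate toward angleDeg (0-360, clockwise from the top of the
    // wrist), splitting intensity between the two bracketing motors.
    void vibrateToward(float angleDeg, int strength) {
      float slice = 360.0 / NUM_MOTORS;  // span covered by one motor
      int lower = ((int)(angleDeg / slice)) % NUM_MOTORS;
      int upper = (lower + 1) % NUM_MOTORS;
      float t = (angleDeg - lower * slice) / slice;  // 0..1 in the pair
      for (int i = 0; i < NUM_MOTORS; i++) {
        int level = 0;
        if (i == lower) level = (int)(strength * (1.0 - t));
        if (i == upper) level = (int)(strength * t);
        analogWrite(MOTOR_PINS[i], level);
      }
    }

    void loop() {
      // Demo: sweep the cue once around the wrist.
      for (int deg = 0; deg < 360; deg += 5) {
        vibrateToward(deg, 200);
        delay(50);
      }
    }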

Testing

The group ran a controlled lab experiment on a later version of the prototype to evaluate the impact of directional wrist-based vibro-motor feedback on hand movement, comparing lower-fidelity (4-motor) and higher-fidelity (8-motor) wristbands. Twenty blindfolded participants completed a series of trials, each consisting of interpreting a haptic stimulus and executing a 2D directional movement on a touchscreen. We compared the two conditions in terms of movement error and trial speed, and also conducted a deeper exploratory analysis of how specific directions affected performance.
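
As a concrete example of a movement-error metric, the snippet below computes the angular difference between the cued direction and the direction the participant actually moved; this definition and its coordinate convention are assumptions for illustration, not necessarily the exact metric used in the study.

    // Illustrative movement-error metric: the smallest absolute angle
    // (degrees) between the cue and the executed movement direction.
    // This assumed definition is for illustration only.

    #include <cmath>
    #include <cstdio>

    const double PI = 3.14159265358979323846;

    // cueDeg: stimulus direction; (dx, dy): the participant's movement,
    // assuming a math-style frame with y pointing up.
    double directionalError(double cueDeg, double dx, double dy) {
      double movedDeg = std::atan2(dy, dx) * 180.0 / PI;
      double diff = std::fmod(movedDeg - cueDeg, 360.0);
      if (diff > 180.0)  diff -= 360.0;
      if (diff < -180.0) diff += 360.0;
      return std::fabs(diff);
    }

    int main() {
      // Example trial: cue at 90 degrees ("up"); the participant moved
      // mostly upward with a slight rightward drift (~11 degrees off).
      std::printf("error = %.1f degrees\n",
                  directionalError(90.0, 0.2, 1.0));
      return 0;
    }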

The results showed that doubling the number of haptic motors reduced directional movement error, but not to the extent we expected.


Publications


We submitted an article with our findings from the text reading HandSight study to the ACM Transactions on Accessible Computing (TACCESS) journal for review. We also submitted a paper on the navigation HandSight prototype to the Graphics Interface conference (GI).

Evaluating Haptic and Auditory Directional Guidance to Assist Blind People in Reading Printed Text Using Finger-Mounted Cameras
Lee Stearns, Ruofei Du, Uran Oh, Catherine Jou, Leah Findlater, David A. Ross, Jon E. Froehlich
ACM Transactions on Accessible Computing (TACCESS 2016). To Appear.

Evaluation of User Performance using Wrist-based Haptic Directional Guidance for Hand Movement
Jonggi Hong, Lee Stearns, Tony Cheng, Jon E. Froehlich, David Ross, Leah Findlater
Proc. Graphics Interface Conference (GI 2016). To Appear.