GIST: Wearable Gestural Interface for Remote Spatial Perception
Spatial perception is challenging for people who are blind due to the limited reach and sensing range of the hands. We present GIST, a wearable gestural interface that offers spatial perception functionality through the novel appropriation of the user's hands into versatile sensing rods.
How it works
Using a wearable depth-sensing camera (Kinect), GIST analyzes the visible physical space and allows blind users to access spatial information about this space using different hand gestures. GIST supports four different spatial perception functions:
- Color sensing: the "V" gesture (index and middle finger) acts as a color sensing tool. The index and middle fingers define a square, and the dominant color of that area is sampled. Users can change the distance between the fingers to sample a smaller, more precise area or a larger one.
- Human presence sensing: the fist gesture (no extended fingers) senses the presence of a human.
- Point depth-sensing: pointing with the index finger conveys the distance to the object or human the arm is pointing at. An area of 20 by 20 pixels at the center of the index finger is sampled from the depth map to report the average depth. Distance is reported in meters or feet, depending on the user's preference.
If the human presence or color sensing gesture is held in the same position for three seconds, GIST automatically switches to point depth-sensing.
- Area depth-sensing: the open hand gesture (five fingers) extends point depth-sensing: instead of a 20x20-pixel area, a rectangular area defined by the user's thumb, middle, and ring fingers is sampled, and GIST reports the smallest depth in that area. This gesture may be useful for finding objects that cannot be found by color (e.g., a white cup on a white table) but that stand out in depth. The gesture is also dynamic: contracting or expanding the fingers samples a smaller or larger area.
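The two depth-sensing functions above can be sketched as simple depth-map queries: point depth-sensing averages a 20x20-pixel window at the fingertip, and area depth-sensing takes the minimum over a hand-defined rectangle. The sketch below is illustrative, not the authors' implementation; the synthetic depth map, window clipping, and function names are assumptions.

```python
import numpy as np

def point_depth(depth_map, cx, cy, win=20):
    # Point depth-sensing: average depth (in meters) over a win x win
    # window centered on the index fingertip (cx, cy), clipped to the frame.
    h, w = depth_map.shape
    half = win // 2
    y0, y1 = max(0, cy - half), min(h, cy + half)
    x0, x1 = max(0, cx - half), min(w, cx + half)
    return float(depth_map[y0:y1, x0:x1].mean())

def area_depth(depth_map, x0, y0, x1, y1):
    # Area depth-sensing: smallest depth inside the rectangle the
    # user's fingers define, so the nearest object in the area is reported.
    return float(depth_map[y0:y1, x0:x1].min())

# Synthetic 480x640 depth map: a 2 m background with a 1 m object,
# standing in for a Kinect depth frame.
depth = np.full((480, 640), 2.0)
depth[200:240, 300:340] = 1.0  # the object

print(point_depth(depth, 320, 220))          # fingertip over the object -> 1.0
print(area_depth(depth, 250, 150, 400, 300)) # rectangle containing it  -> 1.0
```

Reporting the minimum (rather than the average) for area sensing is what lets a white cup on a white table stand out: the cup is simply the nearest surface in the sampled rectangle.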
Using a wearable depth-sensing camera, GIST provides spatial information about whatever the user's arm is pointed at.
Vinitha Khambadkar and Eelke Folmer. GIST: A Gestural Interface for Remote Spatial Perception. Proceedings of the ACM Symposium on User Interface Software and Technology (UIST '13), pages 301-310, St. Andrews, October 2013. [20% acceptance rate]