AUTOMATED CREATION AND DETAILED ANNOTATION OF AUDIO/TACTILE MAPS USING SCALABLE VECTOR GRAPHICS (SVG)

J.A. Miele

The Smith-Kettlewell Eye Research Institute, Rehabilitation Engineering Research Center, San Francisco, USA

jam@ski.org

 

Smith-Kettlewell’s Tactile Maps Automated Production (TMAP) project has demonstrated the feasibility of using geographic information systems in conjunction with embossing technologies to allow blind individuals to independently produce high-quality tactile street maps for use in wayfinding. Street names are indicated by placing abbreviated Braille labels around the perimeter of the map, where text is unlikely to conflict with graphical features. This labeling technique minimizes tactile clutter and is extremely effective for representing grid-like street networks. However, idealized grids account for only a small fraction of real-world street maps, and, owing to a multitude of factors, Braille alone is often inadequate for the complex task of annotating tactile maps. Consequently, the last two years of TMAP development have emphasized audio/tactile functionality.

Audio/tactile maps use a tactile figure as a spatial key to a set of auditory annotations provided by a computer. Each annotation is indexed to the location of the tactile feature it labels; when that feature is selected, the corresponding recording is played. Multiple annotations can also be associated with a single map feature by placing them in a hierarchical data structure, allowing different types of information (or layers) to be toggled on and off, requested, or presented to the user in different ways.

Automated creation of audio/tactile maps requires that both the tactile component and the audio annotations be produced without human intervention. The Scalable Vector Graphics (SVG) standard is an ideal medium for representing all aspects of the audio/tactile map: SVG can represent the graphical components as coordinate-based objects, and annotations can be associated with any graphical object through description tags.
By using synthetic speech, text annotations can be spoken, allowing text-based data to be included seamlessly without the need for costly recordings.

A number of platforms for representing and interacting with audio/tactile maps currently exist or are under development, including tablet-, camera-, and stylus-based technologies. The aim of the TMAP project is to maximize the availability of tactile street maps for visually impaired travelers; Smith-Kettlewell is therefore committed to collaborating with a variety of third-party developers to ensure that TMAP technology is compatible with the widest possible range of audio/tactile platforms. This session will provide detailed descriptions of the automated map-creation process, as well as demonstrations of the current platforms for interacting with audio/tactile maps.
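To make the SVG encoding described above concrete, the sketch below builds a minimal annotated map using Python's standard `xml.etree.ElementTree`. It is not TMAP's actual implementation; the street name, coordinates, and layer `id` are illustrative placeholders. A `<g>` group stands in for a toggleable information layer, a coordinate-based `<path>` represents one street segment, and nested `<title>` and `<desc>` elements hold two levels of annotation (a short label and a longer description) that a synthetic-speech front end could read aloud.

```python
import xml.etree.ElementTree as ET

SVG_NS = "http://www.w3.org/2000/svg"


def make_annotated_map():
    """Build a minimal SVG map with one annotated street segment.

    Returns the SVG document as a string. All names and coordinates
    are hypothetical examples, not real TMAP output.
    """
    # Register the SVG namespace as the default so serialized tags
    # appear without a prefix (e.g. <path> rather than <ns0:path>).
    ET.register_namespace("", SVG_NS)

    svg = ET.Element(f"{{{SVG_NS}}}svg", width="400", height="300")

    # One group per information layer; layers can be toggled on and off
    # by the interactive platform.
    streets = ET.SubElement(svg, f"{{{SVG_NS}}}g", id="streets")

    # A street segment as a coordinate-based path object.
    segment = ET.SubElement(
        streets, f"{{{SVG_NS}}}path",
        d="M 10 150 L 390 150", stroke="black",
    )

    # Hierarchical annotations: a short spoken label plus a fuller
    # description, both attached directly to the graphical object.
    title = ET.SubElement(segment, f"{{{SVG_NS}}}title")
    title.text = "Main St"
    desc = ET.SubElement(segment, f"{{{SVG_NS}}}desc")
    desc.text = "Main Street, two-way, runs east to west"

    return ET.tostring(svg, encoding="unicode")
```

When the tactile copy of this map is placed on a touch-sensitive platform, selecting the embossed line for the segment would look up the corresponding SVG element and pass its `<title>` or `<desc>` text to a speech synthesizer.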