TOUCH, HEAR AND SEA: A SIMULATOR FOR THE BLIND SAILOR’S GEOGRAPHICAL REPRESENTATION

M. Simonnet1, R.D. Jacobson2, J. Rowell3

1 - European Center for Virtual Reality, Psychology, Brest, France.

2 - Investigating Multi Modal Representation of Spatial Environments, Geography, Calgary, Canada

3 - Anglia Polytechnic University, Geography, Cambridge, UK.

mathieu.simonnet@univ-brest.fr

 

Given the wide-open spaces that characterise maritime environments, it is entirely possible for blind sailors to helm a sailboat using primary cues such as the sensation of the wind. Because a visually impaired person can understand where they are and how to reach a chosen destination, they are even capable of acting as navigator for a crew. This is partly due to the contrast between areas covered by sea and by land. While the absence of vision considerably reduces the perception of distant features on land, restricting the ability to take bearings or identify a position on a map, this matters less at sea, where landmarks, paths and edges are not so instrumental to wayfinding. At present, sighted sailors use GIS software, such as Maxsea or Fugawi, to establish itineraries, set a course and monitor their position. These tools update the location of the boat on the map, easily and quickly linking micro-scale representations to macro-scale positions in the physical world in real time. We predict that simulating navigation on a tactile map could encourage different kinds of spatial understanding.

In this respect, the paper describes an innovation that merges sonic output with haptic force feedback through the Phantom Omni interface, providing a virtual reality simulation that could answer blind sailors' requirements. By ensuring that the auditory, vocal and haptic specifications make sense to visually impaired people, the system offers not only an accessible maritime map but also information about the relationship of the boat to the actual environment. There is mounting evidence that delivering spatial information through several sensory channels is more effective than relying on a single output. To help people develop a more complete picture of their surroundings, we have therefore designed a piece of software called “Seatouch” that provides spatial information via a combination of methods.
Because tactile discrimination is relatively coarse, fingertip exploration alone gives access to only limited detail. Technology that combines sonification, vocalization and force feedback enables visually impaired people to access information that touch alone would not offer: geographic features; distances and orientations from the boat or between two elements; the names of buoys and beacons; depth; and the position of the boat. Using scientific methods, the paper also reports how the multimodal interface compares with traditional tactile maps for sailing purposes.
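As an illustration of the kind of spatial information such a system might vocalise, the distance and bearing from the boat to a charted feature can be derived from their coordinates. The sketch below is our own illustrative example, not part of Seatouch itself, and the coordinates are hypothetical; it uses the standard haversine and initial-bearing formulas.

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (nautical miles) and initial bearing (degrees)
    from point 1 (e.g. the boat) to point 2 (e.g. a buoy)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    # Haversine formula for the central angle between the two points.
    a = (math.sin((phi2 - phi1) / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    angle = 2 * math.asin(math.sqrt(a))
    distance_nm = math.degrees(angle) * 60  # 1 arc-minute of great circle ≈ 1 nautical mile
    # Initial bearing, normalised to 0–360 degrees.
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return distance_nm, bearing

# Hypothetical example: a boat off Brest and a nearby buoy.
d, b = distance_and_bearing(48.35, -4.55, 48.40, -4.45)
print(f"Buoy: {d:.1f} NM, bearing {b:.0f} degrees")
```

Such a result ("five miles, bearing fifty-three degrees") is exactly the sort of egocentric relation that speech or sonified output can convey while the user keeps their fingers on the map.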