TY - GEN
T1 - Acoustic-tactile rendering of visual information
AU - Silva, Pubudu Madhawa
AU - Pappas, Thrasyvoulos N.
AU - Atkins, Joshua
AU - West, James E.
AU - Hartmann, William M.
PY - 2012
Y1 - 2012
N2 - In previous work, we have proposed a dynamic, interactive system for conveying visual information via hearing and touch. The system is implemented with a touch screen that allows the user to interrogate a two-dimensional (2-D) object layout by active finger scanning while listening to spatialized auditory feedback. Sound is used as the primary source of information for object localization and identification, while touch is used both for pointing and for kinesthetic feedback. Our previous work considered shape and size perception of simple objects via hearing and touch. The focus of this paper is on the perception of a 2-D layout of simple objects with identical size and shape. We consider the selection and rendition of sounds for object identification and localization. We rely on the head-related transfer function for rendering sound directionality, and consider variations of sound intensity and tempo as two alternative approaches for rendering proximity. Subjective experiments with visually-blocked subjects are used to evaluate the effectiveness of the proposed approaches. Our results indicate that intensity outperforms tempo as a proximity cue, and that the overall system for conveying a 2-D layout is quite promising.
KW - HRTF
KW - immersive environments
KW - virtual reality
KW - visual substitution
KW - visually impaired
UR - http://www.scopus.com/inward/record.url?scp=84859460653&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84859460653&partnerID=8YFLogxK
U2 - 10.1117/12.916166
DO - 10.1117/12.916166
M3 - Conference contribution
AN - SCOPUS:84859460653
SN - 9780819489425
T3 - Proceedings of SPIE - The International Society for Optical Engineering
BT - Proceedings of SPIE-IS&T Electronic Imaging - Human Vision and Electronic Imaging XVII
PB - SPIE
T2 - Human Vision and Electronic Imaging XVII
Y2 - 23 January 2012 through 26 January 2012
ER -