In our daily lives, we usually perceive an event through more than one sensory modality (e.g., vision, hearing, touch). Multimodal integration and interaction therefore play an important role in how we use objects and recognize events in our environment. A virtual environment (VE) is a computer simulation of a realistic-looking, interactive world. VEs should take into account the multisensory nature of human perception and communicate with the user not only through vision but also through other modalities; besides vision, hearing and touch are the most commonly used communication channels. Recently, a variety of products with additional tactile input and output capabilities have been developed (e.g., the Apple iPhone and other touch-screen devices, the Nintendo Wii). Some of these devices open new possibilities for interacting with a computer, including through the auditory modality, and binaural synthesis and rendering are becoming key technologies for multimedia products. Virtual environments are no longer limited to academic research; they have commercial applications, particularly in the medical, gaming, and entertainment industries. Thus, the quality of VEs is becoming increasingly important, and user interaction with a VE is a key factor in the perception of that quality. Several studies have examined the quality of displays, input and output devices (for different modalities), and software and hardware issues; however, multimodal user interaction should also be examined. This paper focuses on the parameters that influence the quality of audio-tactile VEs.