This paper presents a study on the auditory perception of a nearby wall via a speaking avatar in virtual acoustic environments (VAEs) for headphone reproduction. The avatar represents the user in the virtual world and can be controlled by the user's own tracked movements. The only sound source is the avatar's mouth. The scenario is realized with dynamic binaural synthesis based on oral binaural room impulse responses (OBRIRs). A psychoacoustic experiment tested for audible cues of the nearby virtual wall at different distances in different rooms, using both measured and simulated OBRIRs. The results encourage taking echolocation aspects into account in the design of interactive virtual acoustic environments.
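The core rendering principle named in the abstract, binaural synthesis from OBRIRs, amounts to convolving the dry mouth signal with a two-channel room impulse response measured from the mouth to the two ears. The sketch below illustrates this for a single, static head orientation (the paper's system is dynamic, switching OBRIRs with the tracked movements); the signal lengths, function names, and the toy impulse responses are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.signal import fftconvolve

def render_binaural(dry_speech, obrir_left, obrir_right):
    """Convolve a dry (anechoic) mouth signal with the left- and
    right-ear OBRIRs to obtain a static binaural rendering for
    headphone playback. Returns a (2, N) array: left, right."""
    left = fftconvolve(dry_speech, obrir_left)
    right = fftconvolve(dry_speech, obrir_right)
    return np.stack([left, right], axis=0)

# Toy example: 1 s of noise stands in for speech; the synthetic IRs
# contain a direct-sound tap plus one delayed tap mimicking an early
# reflection from a nearby wall (values are illustrative only).
fs = 48000
rng = np.random.default_rng(0)
speech = rng.standard_normal(fs)
ir_l = np.zeros(256)
ir_l[0], ir_l[120] = 1.0, 0.3   # direct sound + early wall reflection
ir_r = np.zeros(256)
ir_r[0], ir_r[90] = 1.0, 0.25
out = render_binaural(speech, ir_l, ir_r)
print(out.shape)  # (2, 48255): two channels, length fs + 256 - 1
```

In a dynamic system, the OBRIR pair would be exchanged (with crossfading) whenever head tracking reports a new pose, so that the wall reflection stays geometrically consistent with the user's movement.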