In many studies of 3D sound localization via headphones, static sounds and individualized HRTFs have been used. When dynamic cues and nonindividualized HRTFs are used, the presentation time required for accurate localization becomes of interest, because turning the head toward the direction of the sound takes time. This paper presents an experimental study of subjects' reaction times for localizing 3D sound presented via headphones. Thirty-one volunteers (16 males and 15 females) participated. The experiment was conducted in a noise-isolated chamber. A Huron Lake CP4 system was used to generate the 3D sound, and a Flock of Birds motion-tracking system was used to introduce the dynamic cue for localization and to register head movement. A coin-drop sound with a bandwidth of 0 Hz to 11,500 Hz was used as the test stimulus. The sound was presented at arbitrary positions on the horizontal plane at a constant distance from the center of the head. The subjects were instructed to localize the sound as quickly as possible by pointing the midline of the head toward the direction of the sound within ±10° of azimuth. The results showed that most subjects localized the sound in less than 11 seconds. The shortest average reaction time was 5.8 s for female and 5.3 s for male subjects; the longest reaction time was 32 s for female and 42 s for male subjects. Large individual differences were found, which may be due to the generic HRTFs used in the experiment. Localization adaptation was observed in the experiment; it occurred after longer exposure to the same sound stimulus from the same location. The present study could not identify the exposure time required for localization adaptation to develop.