We present an acoustic navigation experiment in virtual reality (VR), in which participants were asked to locate and navigate towards an acoustic source within an environment of complex geometry using only acoustic cues. We implemented a procedural generator of complex scenes, capable of creating environments of arbitrary dimensions, with multiple rooms and custom frequency-dependent acoustic properties of the surface materials. For audio generation we used a real-time dynamic sound propagation engine that produces spatialized audio with reverberation by means of bi-directional path tracing (BDPT) and is capable of modeling acoustic absorption, transmission, scattering, and diffraction. This framework enables the investigation of how various simulation properties affect the ability to navigate a virtual environment. To validate the framework, we conducted a pilot experiment with 10 subjects in 30 environments and studied the influence of diffraction modeling by comparing navigation performance in conditions with and without diffraction. The results suggest that listeners can successfully navigate VR environments using only acoustic cues. In the studied cases we did not observe a significant effect of diffraction on navigation performance. A significant number of participants reported strong motion sickness, which highlights the ongoing issues of locomotion in VR.