Friday, September 30, 9:45 am — 11:15 am (Rm 409A)
Moving Virtual Source Perception in 3D Space—Sam Hughes, University of York - York, UK; Gavin Kearney, University of York - York, UK
This paper investigates the rendering of moving sound sources over real-world loudspeaker arrays and over virtual loudspeaker arrays for binaural listening in VR experiences. Near-field compensated Higher Order Ambisonics (HOA) and Vector Base Amplitude Panning (VBAP) are examined for both spatial accuracy and tonal coloration with moving sound source trajectories. A subjective listening experiment conducted over 6-, 26-, and 50-channel real and virtual spherical loudspeaker configurations investigates the accuracy of spatial rendering and tonal effects. The results show how well VBAP and different orders of HOA suit moving-source rendering and illustrate subjective similarities and differences between real and virtual loudspeaker arrays. This session is part of the co-located AVAR Conference, which is not included in the normal convention All Access badge.
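The abstract does not give the authors' decoder details, but the core VBAP step is standard: the source direction is expressed in the vector base formed by a loudspeaker pair, and the resulting gains are power-normalized. A minimal 2-D sketch, assuming NumPy and azimuths in degrees; the function name and setup are illustrative, not taken from the paper:

```python
import numpy as np

def vbap_2d_gains(source_az_deg, spk1_az_deg, spk2_az_deg):
    """Standard 2-D VBAP: gains for a loudspeaker pair such that the
    gain-weighted sum of speaker direction vectors points at the source."""
    def unit(az_deg):
        a = np.radians(az_deg)
        return np.array([np.cos(a), np.sin(a)])

    # Columns of L are the loudspeaker direction vectors (the "vector base").
    L = np.column_stack([unit(spk1_az_deg), unit(spk2_az_deg)])
    g = np.linalg.solve(L, unit(source_az_deg))  # express source in that base
    return g / np.linalg.norm(g)                 # power-normalize: g1^2 + g2^2 = 1

# A source midway between speakers at +/-30 degrees gets equal gains.
g = vbap_2d_gains(0.0, -30.0, 30.0)
```

For a moving source, the gains are simply recomputed each block as the source azimuth changes, which is what makes panning-based rendering of source trajectories inexpensive.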
Disparity in Horizontal Correspondence of Sound and Source Positioning: The Impact on Spatial Presence for Cinematic VR—Angela McArthur, BBC R&D - London, UK; Queen Mary University of London - London, UK
This study examines the extent to which disparity in azimuth between a sound cue and its image target can be varied in cinematic virtual reality (VR) content before presence is broken. Disparity is applied both consistently and inconsistently across five otherwise identical sound-image events. The investigation explores spatial presence, a sub-construct of presence, hypothesizing that consistently applied disparity in horizontal audio-visual correspondence elicits a higher tolerance before presence is broken than inconsistently applied disparity does. Non-specialists need guidance on how subjective judgments interact with spatial presence in sound positioning if they are to leverage VR's spatial sound environment. Although approximate compared with visual localization, auditory localization is paramount for VR: it is independent of lighting conditions, omnidirectional, less subject to occlusion, and creates presence.
Lateral Listener Movement on the Horizontal Plane (Part 2): Sensing Motion through Binaural Simulation in a Reverberant Environment—Matthew Boerum, McGill University - Montreal, QC, Canada; Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT) - Montreal, QC, Canada; Bryan Martin, McGill University - Montreal, QC, Canada; Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT) - Montreal, QC, Canada; Richard King, McGill University - Montreal, QC, Canada; Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT) - Montreal, QC, Canada; George Massenburg, Schulich School of Music, McGill University - Montreal, QC, Canada; Centre for Interdisciplinary Research in Music Media and Technology (CIRMMT) - Montreal, QC, Canada
In this multi-part study, first-person horizontal movement between two virtual sound source locations in an auditory virtual environment (AVE) was investigated by evaluating the sensation of motion perceived by the listener. A binaural cross-fading technique simulated this movement, while real binaural recordings of motion, made with a motion apparatus and a mounted head and torso simulator (HATS), served as a reference. Trained listeners evaluated the sensation of motion across the real and simulated conditions in two environment-dependent experiments: Part 1 (semi-anechoic) and Part 2 (reverberant). Results from Part 2 were proportional to those of Part 1 despite the presence of reflections: the simulation again provided the greatest sensation of motion, showing that real binaural recordings convey less sensation of motion than the simulation does.
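The program does not describe the authors' cross-fading implementation, but an equal-power crossfade between binaural renders at the start and end positions is one common way to approximate such movement. A minimal sketch, assuming NumPy and binaural signals shaped (n_samples, 2); the function and the signal setup are illustrative assumptions, not the authors' method:

```python
import numpy as np

def binaural_motion_crossfade(start_sig, end_sig):
    """Equal-power crossfade from a binaural render at the start position
    to one at the end position, approximating listener movement.
    Both inputs are float arrays shaped (n_samples, 2)."""
    n = min(len(start_sig), len(end_sig))
    t = np.linspace(0.0, 1.0, n)[:, None]  # normalized time, 0 -> 1
    g_start = np.cos(t * np.pi / 2)        # fades out
    g_end = np.sin(t * np.pi / 2)          # fades in
    # Equal-power law: g_start**2 + g_end**2 == 1 at every sample,
    # so perceived loudness stays roughly constant during the "move".
    return g_start * start_sig[:n] + g_end * end_sig[:n]

# Example: fade a 220 Hz tone from the left channel to the right channel.
fs = 48000
samples = fs  # one second
tone = np.sin(2 * np.pi * 220 * np.arange(samples) / fs)
left_only = np.column_stack([tone, np.zeros(samples)])
right_only = np.column_stack([np.zeros(samples), tone])
out = binaural_motion_crossfade(left_only, right_only)
```

A sine/cosine equal-power law is chosen here over a linear crossfade because linear gains dip in total power mid-fade, which can be heard as a loudness drop during the simulated movement.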