Program: Paper Session 4


[Banner image: 2019 AES International Conference on Immersive and Interactive Audio. March 27-29, 2019. York, UK.]



Paper Session 4 - Moving Sound and Moving Listeners

Chair: Jude Brereton


P4-1: "Perceptual Evaluation of Variable-Orientation Binaural Room Impulse Response Rendering"

Markus Zaunschirm, Franz Zotter and Matthias Frank

In the current effort to improve sound for virtual auditory environments, realism and audio quality in head-tracked binaural rendering are again becoming important. While rendering based on static dummy-head measurements achieves high audio quality and externalization, it lacks the realism of an interactive response to changes in head orientation. Motion-tracked binaural (MTB) rendering has been presented as a head-tracked rendering algorithm for recordings made with circular microphone arrays on rigid spheres. In this contribution, we investigate the algorithm proposed for MTB rendering and adapt it for variable-orientation rendering using binaural room impulse responses (BRIRs) measured for multiple discrete orientations of an artificial head. The experiment investigates in particular the perceptual implications of the angular resolution of the multi-orientation BRIR sets and of the time/frequency resolution of the algorithm.


P4-2: "The Impact of Head Movement on Perceived Externalization of a Virtual Sound Source with Different BRIR Lengths"

Song Li, Roman Schlieper and Jürgen Peissig

The present study aims to investigate the influence of head movement on the perceived externalization of a virtual sound source with various lengths of binaural room impulse responses (BRIRs). For this purpose, non-individual BRIRs were measured in a listening room and truncated to different lengths. The truncated BRIRs were convolved with speech and music signals, and the resulting binaural signals were presented over headphones. During each presentation, subjects were either asked to perform head movements or to remain stationary. The experimental results revealed that head movements can substantially improve the externalization of virtual sound sources rendered by short BRIRs, especially for frontal sound sources. In contrast, head movements have no substantial influence on externalization for virtual sound sources generated by long BRIRs.


AES - Audio Engineering Society