Program: Paper Session 1


[Image: 2019 AES International Conference on Immersive and Interactive Audio. March 27–29, 2019. York, UK.]


Paper Session 1 - Auditory Interaction and Engagement

Chair: Damian Murphy


P1-1: "Characteristics of Whole-body Vibration Induced Through Low-frequency Sound Exposure"

Adam Ford, Ludovico Ausiello and Chris Barlow

Within audio-tactile playback systems, the induced vibration is often calibrated subjectively, with no objective frame of reference. Using a broadband excitation signal, the sound-induced vibration characteristics of the torso were identified, including the magnitude response, amplitude conversion efficiency, and subjective perceptual thresholds. The effect of additional factors, such as Body Mass Index (BMI), was considered. The human torso was shown to act as a Helmholtz cavity, while an increase in BMI was shown to reduce the peak vibration amplitude. The body was further shown to behave as a linear transducer of sound into vibration, leading to the production of a novel conversion table. Perceptual tests identified a frequency-dependent threshold of 94–107 dBZ required to induce a perceivable whole-body vibration.


P1-2: "Analysis and training of human sound localization behavior with VR application"

Yun-Han Wu and Agnieszka Roginska

In this paper, a VR training application is built to help improve human sound localization performance with generic head-related transfer functions. Subjects go through four phases in the experiment — tutorial, pre-test, training, and post-test — in which they are instructed to trigger a sound stimulus and report its perceived location by rotating their head to face that direction. The data captured automatically during each trial includes the correct and reported positions of the stimulus, the reaction time, and the head rotation sampled every 50 ms. The analysis shows a statistically significant improvement in subjects' performance.


P1-3: "In-Virtualis: A Study on the Impact of Congruent Virtual Reality Environments in Perceptual Audio Evaluation of Loudspeakers"

Alejandro Saurì Suárez, Neofytos Kaplanis, Stefania Serafin, and Søren Bech

The practical benefits of conducting evaluations of acoustical scenes in laboratory settings are evident in the literature. Such approaches, however, may introduce an audio-visual incongruity, as assessors are physically in a laboratory room whilst auditioning another space, e.g., an auditorium.

In this report it is hypothesised that presenting congruent audio-visual stimuli improves the experience of an auralised sound field. Measured sound fields were reproduced over a 3D loudspeaker array. Expert assessors evaluated these in two visual conditions: a congruent room, and a dark environment. The results indicate a tendency towards improved plausibility and decreased task difficulty in the congruent condition. The visual conditions did not, however, reach statistical significance, indicating the need for further experiments with a larger sample size, interface improvements, and more realistic graphics.
