
Program: Paper Session 11


2019 AES International Conference on Immersive and Interactive Audio, March 27-29, 2019, York, UK

Paper Session 11: Content creation tools and workflows for XR

Chair: Lorenzo Picinali

 

P11-1: "Immersive audio programming in a virtual reality sandbox"

Nikolaj Andersson, Cumhur Erkut and Stefania Serafin

Immersive sandboxes for music creation in Virtual Reality (VR) are becoming widely available. Some sandboxes host Virtual Reality Musical Instruments (VRMIs), but usually offer only basic components such as oscillators, sample-based instruments, or simple step sequencers. In this paper, after describing MuX (a VR sandbox) and its basic components, we present new elements developed for the environment, focusing on lumped and distributed physically inspired models for sound synthesis. A simple interface was developed to control the physical models with gestures, expanding the interaction possibilities within the sandbox. A preliminary evaluation shows that, as the number and complexity of the components increase, it becomes important to provide users with ready-made machines instead of having them build everything from scratch.

http://www.aes.org/e-lib/browse.cfm?elib=20443
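
The abstract's lumped physically inspired models are not detailed above; as a generic sketch of the technique (not MuX's actual components), a single damped mass-spring oscillator struck by an impulse can be simulated as follows, in Python:

    import numpy as np

    # Minimal sketch of a lumped physically inspired model: one damped
    # mass-spring oscillator excited by a strike. Illustrative only;
    # the paper's MuX components are not reproduced here.
    def struck_oscillator(freq_hz=440.0, damping=3.0, dur_s=1.0, sr=48000):
        """Integrate x'' = -(w^2) x - 2*damping*x' with semi-implicit Euler."""
        w = 2.0 * np.pi * freq_hz
        dt = 1.0 / sr
        n = int(dur_s * sr)
        x, v = 0.0, 1.0          # strike: initial velocity, zero displacement
        out = np.empty(n)
        for i in range(n):
            v += (-(w * w) * x - 2.0 * damping * v) * dt
            x += v * dt
            out[i] = x
        return out / np.max(np.abs(out))  # normalize to [-1, 1]

    tone = struck_oscillator(freq_hz=220.0, damping=4.0)

A distributed model (e.g., a waveguide string) would replace the single mass-spring pair with many coupled elements, at correspondingly higher cost.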

 

P11-2: "Aesthetic modification of room impulse responses for interactive auralization"

Keith Godin, Hannes Gamper and Nikunj Raghuvanshi

Interactive auralization workflows in games and virtual reality today employ manual markup coupled with designer-specified acoustic effects that lack spatial detail. Acoustic simulation can model such detail, yet it is uncommon because realism often does not align perfectly with aesthetic goals. We show how to integrate realistic acoustic simulation while retaining designer control over aesthetics. Our method eliminates manual zone placement, provides spatially smooth transitions, and automates re-design for scene changes. It proceeds by computing perceptual parameters from simulated impulse responses, then applying transformations based on novel modification controls presented to the user. The result is an end-to-end physics-based auralization system with designer control. We present case studies that demonstrate the viability of this approach.

http://www.aes.org/e-lib/browse.cfm?elib=20444
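
The paper's specific perceptual parameters and modification controls are not reproduced here; one standard example of deriving a perceptual parameter from a simulated impulse response is reverberation time (RT60) via Schroeder backward integration. A minimal sketch, assuming the impulse response decays by at least 25 dB:

    import numpy as np

    def rt60_schroeder(ir, sr, lo_db=-5.0, hi_db=-25.0):
        """Estimate RT60 from an impulse response via Schroeder backward
        integration, extrapolating the -5 dB to -25 dB decay slope (T20)."""
        energy = np.cumsum(ir[::-1] ** 2)[::-1]       # backward-integrated energy
        edc_db = 10.0 * np.log10(energy / energy[0])  # energy decay curve in dB
        i_lo = np.argmax(edc_db <= lo_db)             # first sample below -5 dB
        i_hi = np.argmax(edc_db <= hi_db)             # first sample below -25 dB
        t20 = (i_hi - i_lo) / sr
        return 3.0 * t20                              # extrapolate 20 dB -> 60 dB

A designer-facing control in the spirit of the paper could then, for example, rescale the impulse-response tail so the estimated RT60 matches an aesthetic target.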

 

P11-3: "Reproducing Real World Acoustics in Virtual Reality Using Spherical Cameras"

Luca Remaggi, Hansung Kim, Philip J. B. Jackson and Adrian Hilton

Virtual Reality (VR) systems have been extensively explored, with several research communities investigating the different modalities involved. For the audio modality, one of the main issues is generating sound that is perceptually coherent with the visual reproduction. Here, we propose a pipeline for creating plausible interactive reverb from visual information: first, we characterize the acoustics of a real environment given a pair of spherical cameras; then, we reproduce reverberant spatial sound within a VR scene using the estimated acoustics. The evaluation is performed by extracting the room impulse responses (RIRs) of four virtually rendered rooms. Results show agreement, in terms of objective metrics, between the synthesized acoustics and those calculated from RIRs recorded in the respective real rooms.

http://www.aes.org/e-lib/browse.cfm?elib=20445
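
The camera-based estimation stage is beyond a short example, but the final rendering step the abstract describes, auralizing a dry source with an estimated room impulse response, is essentially a convolution. A minimal sketch, assuming a mono dry signal and an RIR array (the function name is a placeholder, not the authors' code):

    import numpy as np
    from scipy.signal import fftconvolve

    def auralize(dry, rir, wet_gain=0.7):
        """Render a dry mono signal in an estimated room by convolving it
        with the room impulse response (RIR) and mixing in the direct sound."""
        wet = fftconvolve(dry, rir)[: len(dry)]   # reverberant (wet) signal
        out = dry + wet_gain * wet                # dry/wet mix
        return out / np.max(np.abs(out))          # avoid clipping

Here the RIR would come from the paper's camera-based estimation stage; any measured or simulated impulse response works as a stand-in.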

 

P11-4: "Efficient Encoding and Decoding of Binaural Sound with Resonance Audio"

Marcin Gorzel, Andrew Allen, Ian Kelly, Julius Kammerl, Alper Gungormusler, Hengchin Yeh and Frank Boland

Resonance Audio is an open-source project designed for creating and controlling dynamic spatial sound in Virtual and Augmented Reality (VR/AR), gaming, and video experiences. It also provides integrations with popular game-development platforms and, as a preview plugin, with digital audio workstations. The Resonance Audio binaural decoder is used in YouTube VR to provide cinematic spatial audio experiences. This paper describes the core sound spatialization algorithms used in Resonance Audio.

http://www.aes.org/e-lib/browse.cfm?elib=20446
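
Resonance Audio's full pipeline (higher-order Ambisonics plus HRTF-based binaural decoding) lives in its open-source repository; as a minimal illustration of the encoding step such spatializers build on, the sketch below encodes a mono signal into first-order Ambisonics using the ACN channel order and SN3D normalization that Resonance Audio also uses:

    import numpy as np

    def encode_foa(mono, azimuth, elevation):
        """Encode a mono signal into first-order Ambisonics (ACN/SN3D:
        W, Y, Z, X channels) for a source at the given angles in radians."""
        w = mono                                      # omnidirectional
        y = mono * np.sin(azimuth) * np.cos(elevation)
        z = mono * np.sin(elevation)
        x = mono * np.cos(azimuth) * np.cos(elevation)
        return np.stack([w, y, z, x])                 # shape: (4, n_samples)

A binaural decoder such as Resonance Audio's would then filter these channels with spherical-harmonic-domain HRTFs to produce the stereo headphone output.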

 
