AES E-Library Search Results

Search Results (Displaying 1-10 of 22 matches)

Moving Virtual Source Perception in 2D Space

This paper investigates the rendering of moving sound sources over real-world loudspeaker arrays and over virtual loudspeaker arrays for binaural listening in VR experiences. Near-Field Compensated Higher Order Ambisonics (HOA) and Vector Base Amplitude Panning (VBAP) are examined for both spatial accuracy and tonal coloration with moving sound source trajectories. A subjective listening experiment is presented over 6-, 26-, and 50-channel real and virtual spherical loudspeaker configurations to investigate the accuracy of spatial rendering and tonal effects. The results show the applicability of different degrees of VBAP and HOA to moving-source rendering and illustrate subjective similarities and differences between real and virtual loudspeaker arrays.
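
The abstract names VBAP as one of the two rendering methods under test. As background, here is a minimal sketch of pairwise 2D VBAP gain computation in Python; the six-loudspeaker ring in the example and the constant-power normalization are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def vbap_2d(source_az_deg, speaker_az_deg):
    """Pairwise 2D VBAP: per-loudspeaker gains for one source direction."""
    speakers = np.sort(np.asarray(speaker_az_deg, dtype=float))
    gains = np.zeros(len(speakers))

    def unit(az_deg):
        a = np.radians(az_deg)
        return np.array([np.cos(a), np.sin(a)])

    p = unit(source_az_deg)
    n = len(speakers)
    for i in range(n):
        j = (i + 1) % n  # adjacent pair on the ring, with wrap-around
        base = np.column_stack([unit(speakers[i]), unit(speakers[j])])
        g = np.linalg.solve(base, p)  # g[0]*l_i + g[1]*l_j = source direction
        if np.all(g >= -1e-9):        # both gains non-negative -> active pair
            gains[i], gains[j] = g
            break

    gains /= np.linalg.norm(gains) or 1.0  # constant-power normalization
    return speakers, gains

# Example: a source at 20 degrees on a hypothetical 6-loudspeaker ring.
print(vbap_2d(20.0, [0, 60, 120, 180, 240, 300]))
```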


Disparity in Horizontal Correspondence of Sound and Source Positioning: The Impact on Spatial Presence for Cinematic VR

This study examines the extent to which disparity in azimuth location between a sound cue and image target can be varied in cinematic virtual reality (VR) content before presence is broken. It applies disparity both consistently and inconsistently across five otherwise identical sound-image events. The investigation explores spatial presence, a sub-construct of presence, hypothesizing that consistently applied disparity in horizontal audio-visual correspondence elicits a higher tolerance before presence is broken than inconsistently applied disparity does. Guidance on how subjective judgments interact with spatial presence for sound positioning is needed if non-specialists are to leverage VR’s spatial sound environment. Although approximate compared with visual localization, auditory localization is paramount for VR: it is independent of lighting conditions, omnidirectional, less subject to occlusion, and creates presence.


Lateral Listener Movement on the Horizontal Plane (Part 2): Sensing Motion through Binaural Simulation in a Reverberant Environment

In a multi-part study, first-person horizontal movement between two virtual sound source locations in an auditory virtual environment (AVE) was investigated by evaluating the sensation of motion as perceived by the listener. A binaural cross-fading technique simulated this movement, while real binaural recordings of motion were made as a reference using a motion apparatus and a mounted head and torso simulator (HATS). Trained listeners evaluated the sensation of motion across real and simulated conditions in two contrasting environment-dependent experiments: Part 1 (semi-anechoic) and Part 2 (reverberant). Results from Part 2 were proportional to those from Part 1, despite the presence of reflections. The simulation again provided the greatest sensation of motion, showing that real binaural recordings convey less sensation of motion than the simulation.
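
The cross-fading technique referred to above can be illustrated with a generic equal-power fade between two pre-rendered binaural signals; the fade law and timing are assumptions for illustration, as the abstract does not specify them.

```python
import numpy as np

def crossfade_binaural(binaural_a, binaural_b):
    """Equal-power crossfade from one binaural render to another.

    binaural_a, binaural_b: arrays of shape (num_samples, 2), equal length,
    rendered at the start and end listener positions respectively.
    """
    n = binaural_a.shape[0]
    t = np.linspace(0.0, 1.0, n)[:, None]   # fade position 0..1 per sample
    gain_a = np.cos(0.5 * np.pi * t)        # fade out the start position
    gain_b = np.sin(0.5 * np.pi * t)        # fade in the end position
    return gain_a * binaural_a + gain_b * binaural_b

# Example with two seconds of placeholder noise at 48 kHz.
fs = 48000
start_render = 0.1 * np.random.randn(2 * fs, 2)
end_render = 0.1 * np.random.randn(2 * fs, 2)
moving_render = crossfade_binaural(start_render, end_render)
```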


Object-Based 3D Audio Production for Virtual Reality Using the Audio Definition Model

This paper presents a case study of the production of a virtual reality experience with object-based 3D audio rendering using professional tools and workflows. An object-based production was created in a common digital audio workstation with real-time dynamic binaural sound rendering and visual monitoring of the scene on a head-mounted display. The Audio Definition Model is a standardized metadata model for representing audio content, including object-based, channel-based, and scene-based 3D audio. Using the Audio Definition Model, the object-based audio mix could be exported to a single WAV file. Plug-ins were built for a game engine, in which the virtual reality application and graphics were authored, to allow import of the object-based audio mix and custom dynamic binaural rendering.
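
To make the object-based idea concrete, the sketch below models a scene as audio tracks plus time-stamped position metadata, which is the information an object-based renderer consumes. This is an illustrative data structure only, not the actual Audio Definition Model XML schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PositionKeyframe:
    time_s: float         # time of this metadata block
    azimuth_deg: float    # horizontal angle, 0 = straight ahead
    elevation_deg: float
    distance_m: float

@dataclass
class AudioObject:
    name: str
    channel_index: int    # track index in the multichannel WAV carrying the stems
    positions: List[PositionKeyframe] = field(default_factory=list)

# A two-object scene: the audio lives in one multichannel WAV, while the
# positions travel as metadata for the renderer to interpret at playback time.
scene = [
    AudioObject("narration", 0, [PositionKeyframe(0.0, 0.0, 0.0, 1.0)]),
    AudioObject("drone", 1, [
        PositionKeyframe(0.0, -90.0, 10.0, 3.0),
        PositionKeyframe(5.0,  90.0, 10.0, 3.0),   # pans left to right over 5 s
    ]),
]
```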


Virtually Replacing Reality: Sound Design and Implementation for Large Scale Room Scale VR Experiences

Audio for Virtual Reality (VR) presents a significant array of challenges and augmentations to the traditional requirements of sound designers working in the video games industry. The change in the player's perspective and embodiment requires additional tools and makes consideration of object size, spacing, and spatial design a more significant part of the sound design process. The author presents her approach to these tasks from the perspective of developing audio for the large-scale, room-scale VR developer Zero Latency. Focusing on the design considerations and processes required in this unique medium, this presentation is designed to give insight into this large-scale version of VR technology.


Crafting Cinematic High End VR Audio for Etihad Airways

MediaMonks were approached by Etihad Airways, via their ad agency The Barbarian Group, to create a Virtual Reality experience taking place aboard their Airbus A380, the world's largest and most luxurious non-private airplane. Challenges included capturing audio, including dialogue, aboard the real plane; crafting an experience that encourages repeated viewing; and combining a sense of truthful realism with a sense of dream-like luxury without relying on a musical score, all in a head-tracked spatialized mix. Artistic conventions around non-diegetic sound and their psychological impact in VR also required consideration.


Efficient, Compelling, and Immersive VR Audio Experience Using Scene Based Audio/Higher Order Ambisonics

Scene-based audio (SBA), also known as Higher Order Ambisonics (HOA), combines the advantages of object-based and traditional channel-based audio schemes. It is particularly suitable for enabling a truly immersive (360°/180°) VR audio experience. SBA signals can be efficiently rotated and binauralized, which makes realistic VR audio practical on consumer devices. SBA also provides convenient mechanisms for acquiring live soundfields for VR. MPEG-H is a newly adopted compression standard that can efficiently compress HOA for transmission and storage; it is the only known standard that provides compressed HOA end-to-end. Our paper describes a practical end-to-end chain for SBA/HOA-based VR audio. Given its advantages over other formats, SBA should be “the format of choice” for a compelling VR audio experience.
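
The claim that SBA signals can be efficiently rotated is easy to illustrate at first order: a yaw rotation of the whole scene is a small matrix applied per sample. The sketch below assumes traditional B-format channel ordering (W, X, Y, Z); higher orders need full spherical-harmonic rotation matrices but remain cheap.

```python
import numpy as np

def rotate_foa_yaw(foa, yaw_rad):
    """Rotate a first-order ambisonic scene about the vertical axis.

    foa: array of shape (num_samples, 4), channels ordered (W, X, Y, Z).
    yaw_rad: rotation angle in radians; W and Z are unaffected by yaw.
    """
    w, x, y, z = foa[:, 0], foa[:, 1], foa[:, 2], foa[:, 3]
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    x_rot = c * x - s * y
    y_rot = s * x + c * y
    return np.column_stack([w, x_rot, y_rot, z])

# Head tracking: counter-rotate the scene by the listener's head yaw
# before binauralizing, so sources stay fixed in the virtual world.
fs = 48000
scene = np.zeros((fs, 4))
compensated = rotate_foa_yaw(scene, -np.radians(30.0))
```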


Soundfield Navigation using an Array of Higher-Order Ambisonics Microphones

A method is presented for soundfield navigation through estimation of the spherical harmonic coefficients (i.e., the higher-order ambisonics signals) of a soundfield at a position within an array of two or more ambisonics microphones. An existing method based on blind source separation is known to suffer from audible artifacts, while an alternative method, in which a weighted average of the ambisonics signals from each microphone is computed, is shown to necessarily introduce comb-filtering and degrade localization for off-center sources. The proposed method entails computing a regularized least-squares estimate of the soundfield at the listening position using the signals from the nearest microphones, excluding those that are nearer to a source than to the listening position. Simulated frequency responses and predicted localization errors suggest that, for interpolation between a pair of microphones, the proposed method achieves both accurate localization and minimal spectral coloration when the product of angular wavenumber and microphone spacing is less than twice the input expansion order. It is also demonstrated that failure to exclude from the calculation those microphones that are nearer to a source than to the listening position can significantly degrade localization accuracy.
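
The regularized least-squares step at the heart of the proposed method can be illustrated in its generic (Tikhonov) form. The matrix relating the unknown coefficients at the listening position to the observed microphone signals is specific to the paper and is only stubbed out with random values here.

```python
import numpy as np

def regularized_least_squares(A, b, reg):
    """Tikhonov-regularized solve of A x ≈ b for one frequency bin.

    A  : (m, n) complex matrix mapping the unknown HOA coefficients at the
         listening position to the observed array coefficients (placeholder).
    b  : (m,) observed coefficients.
    reg: regularization weight; larger values trade accuracy for robustness.
    """
    normal_matrix = A.conj().T @ A + reg * np.eye(A.shape[1])
    return np.linalg.solve(normal_matrix, A.conj().T @ b)

# Toy example with a random overdetermined system.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 4)) + 1j * rng.standard_normal((8, 4))
b = rng.standard_normal(8) + 1j * rng.standard_normal(8)
x = regularized_least_squares(A, b, reg=1e-2)
```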

Open Access

This paper is Open Access, which means you can download it for free.


Immersive Audio Rendering for Interactive Complex Virtual Architectural Environments

In this study we investigate methods for sound propagation in complex virtual architectural environments for spatialized audio rendering in immersive virtual reality (VR) scenarios. Over the last few decades, sound propagation models for complex building structures have been designed and investigated using geometrical approaches (GA) and hybrid techniques. Sound propagation requires fast simulation tools that can incorporate a sufficient number of dynamically moving sound sources, room acoustical properties, and reflections and diffraction from interactively changing surface elements in VR environments. Using physically based models, we achieved a reasonable trade-off between sound quality and system performance. Furthermore, we describe the sound rendering pipeline for integrating the simulation into a virtual scene.
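
As a reminder of what a geometrical-approach propagation model computes, the sketch below evaluates a single first-order image source against one rigid wall: mirror the source across the wall plane, then derive the reflection's path length and arrival delay. Real engines add higher orders, frequency-dependent absorption, and diffraction; the geometry here is a made-up example, not one from the paper.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s

def first_order_image_source(source, listener, wall_x):
    """Image source for a single rigid wall in the plane x = wall_x.

    Returns the reflection path length and its extra delay relative to the
    direct sound, for a point source and a point listener.
    """
    source = np.asarray(source, dtype=float)
    listener = np.asarray(listener, dtype=float)

    image = source.copy()
    image[0] = 2.0 * wall_x - source[0]   # mirror the source across the wall

    direct_path = np.linalg.norm(listener - source)
    reflect_path = np.linalg.norm(listener - image)
    extra_delay_s = (reflect_path - direct_path) / SPEED_OF_SOUND
    return reflect_path, extra_delay_s

# Source 2 m from a wall at x = 0, listener 3 m further into the room.
path_m, delay_s = first_order_image_source([2.0, 1.0, 1.5], [5.0, 2.0, 1.5], wall_x=0.0)
print(f"reflection path {path_m:.2f} m, arriving {delay_s * 1000:.2f} ms after the direct sound")
```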



Immersive Audio for VR

Object-based sound creation, packaging, and playback of content is now prevalent in cinema and home theater, delivering immersive audio experiences. This has paved the way for virtual reality sound, where precision of sound is necessary for complete immersion in a virtual world.


