AES E-Library

Deep Neural Networks for Cross-Modal Estimations of Acoustic Reverberation Characteristics from Two-Dimensional Images

In augmented reality (AR) applications, reproduction of acoustic reverberation is essential for creating an immersive audio experience: the audio component should simulate the acoustics of the environment the user is actually in. Previously, sound engineers could program all reverberation parameters in advance when the scene was known or the audience remained in a fixed position. In AR, however, the environment is not known beforehand, so the parameters cannot all be pre-programmed, and adjusting them with conventional methods is difficult. Motivated by the observation that skilled acoustic engineers can estimate reverberation parameters from an image of a room, we trained a deep neural network (DNN) to estimate reverberation parameters from two-dimensional images. The results suggest that a DNN can estimate acoustic reverberation parameters from a single image.
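The abstract does not specify the network architecture or the parameter set used in the paper. The following is only a minimal sketch of one plausible image-to-parameter regression setup in PyTorch; the ResNet-18 backbone, the six-dimensional output (e.g., RT60 in six octave bands), and all names are illustrative assumptions, not the authors' method.

import torch
import torch.nn as nn
from torchvision import models


class ReverbParamEstimator(nn.Module):
    """Hypothetical CNN regressor mapping one room image to reverberation parameters.

    The backbone and output dimension are assumptions for illustration only;
    the paper's actual architecture is not described in this abstract.
    """

    def __init__(self, num_params: int = 6):
        super().__init__()
        backbone = models.resnet18(weights=None)  # image feature extractor
        backbone.fc = nn.Identity()               # drop the classification head, keep 512-d features
        self.backbone = backbone
        self.regressor = nn.Sequential(
            nn.Linear(512, 128),
            nn.ReLU(),
            nn.Linear(128, num_params),           # predicted reverberation parameters
        )

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        # image: (batch, 3, 224, 224) RGB tensor of a single room photograph
        features = self.backbone(image)
        return self.regressor(features)


if __name__ == "__main__":
    model = ReverbParamEstimator()
    dummy = torch.randn(1, 3, 224, 224)           # one 2-D room image
    print(model(dummy).shape)                     # torch.Size([1, 6])
    # Training such a model would minimize a regression loss, e.g.
    # nn.MSELoss()(model(images), measured_params), against reverberation
    # parameters measured in the pictured rooms.

Training a model like this would require paired data of room photographs and measured reverberation parameters; the abstract does not state what dataset or loss the authors used.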

Permalink: https://www.aes.org/e-lib/browse.cfm?elib=19512
