Cloud-Enabled Interactive Sound Propagation for Untethered Mixed Reality
Citation & Abstract
M. Chemistruck, K. Storck, and N. Raghuvanshi, "Cloud-Enabled Interactive Sound Propagation for Untethered Mixed Reality," Paper 3-5, (2020 August).
Abstract: We describe the first system for physically-based wave acoustics including diffraction effects within a holographic experience shared by multiple untethered devices. Our system scales across standalone mobile-class devices, from a HoloLens to a modern smart phone. Audio propagation in real-world scenes exhibits perceptually salient effects that complement visuals. These include diffraction losses from obstruction, re-direction (“portaling”) of sounds around physical doorways and corners, and reverberation in complex geometries with multiple connected spaces. Such effects are necessary in mixed reality to achieve a sense of presence for virtual people and things within the real world, but have so far been computationally infeasible on mobile devices. We propose a novel cloud-enabled system that enables such immersive audio-visual scenarios on untethered mixed reality devices for the first time.
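To make the abstract's rendering effects concrete, the following is a minimal, hypothetical Python sketch (not taken from the paper) of how an untethered client might apply cloud-computed per-source acoustic parameters before spatialization: a diffraction (obstruction) loss, a portaled arrival direction, and a reverberation decay time. The names AcousticParams and apply_to_emitter, and the specific mapping to gains, are illustrative assumptions rather than the paper's actual API.

from dataclasses import dataclass

@dataclass
class AcousticParams:
    # Hypothetical per-source parameters a cloud acoustics bake might return.
    obstruction_db: float      # diffraction loss caused by occluding geometry
    portal_direction: tuple    # unit vector toward the doorway/corner the sound arrives from
    reverb_time_s: float       # decay time of the connected spaces around the listener

def apply_to_emitter(dry_gain: float, params: AcousticParams):
    """Turn cloud-computed parameters into simple per-emitter render controls."""
    # Attenuate the direct path by the obstruction (diffraction) loss.
    direct_gain = dry_gain * 10.0 ** (-params.obstruction_db / 20.0)
    # Render the direct path from the portaled direction rather than the straight
    # line to the source, so occluded sounds seem to come around doorways and corners.
    render_direction = params.portal_direction
    # Longer decay times imply a larger reverb send in connected spaces.
    reverb_send = dry_gain * min(1.0, params.reverb_time_s / 2.0)
    return direct_gain, render_direction, reverb_send

# Example: a source behind a wall, portaled through a doorway to the listener's left.
print(apply_to_emitter(1.0, AcousticParams(obstruction_db=12.0,
                                           portal_direction=(-1.0, 0.0, 0.0),
                                           reverb_time_s=1.4)))

In the system the paper describes, such parameters would be computed by wave simulation in the cloud and delivered to each device; the mapping above is only a sketch of how the abstract's effects (obstruction, portaling, reverberation) translate into per-emitter controls.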
@article{chemistruck2020cloud-enabled,
  author={Chemistruck, Michael and Storck, Kyle and Raghuvanshi, Nikunj},
  journal={Journal of the Audio Engineering Society},
  title={Cloud-Enabled Interactive Sound Propagation for Untethered Mixed Reality},
  year={2020},
  volume={},
  number={},
  pages={},
  doi={},
  month={August},
}
TY - paper
TI - Cloud-Enabled Interactive Sound Propagation for Untethered Mixed Reality
SP -
EP -
AU - Chemistruck, Michael
AU - Storck, Kyle
AU - Raghuvanshi, Nikunj
PY - 2020
JO - Journal of the Audio Engineering Society
IS -
VO -
VL -
Y1 - August 2020
Open Access
Authors:
Chemistruck, Michael; Storck, Kyle; Raghuvanshi, Nikunj
Affiliations:
Microsoft Mixed Reality; Microsoft Research (see document for exact affiliation information).
AES Conference:
2020 AES International Conference on Audio for Virtual and Augmented Reality (August 2020)
Paper Number:
3-5
Publication Date:
August 13, 2020
Permalink:
http://www.aes.org/e-lib/browse.cfm?elib=20882