A Platform for Audiovisual Telepresence Using Model- and Data-Based Wave-Field Synthesis
We present a platform for real-time transmission of immersive audiovisual impressions using model- and data-based audio wave-field analysis/synthesis and panoramic video capturing/projection. The audio subsystem considered in this paper is based on microphone arrays with different element counts and directivities, as well as weakly directional loudspeaker arrays. We report on both linear and circular setups that feed different wave-field synthesis systems. Extending this, we present first findings for a data-based approach derived from experimental simulations. This data-based wave-field analysis/synthesis (WFAS) approach combines a cylindrical-harmonic decomposition of cardioid array signals with angular windowing and a directional delay term that together enforce causal plane-wave synthesis. Specifically, our contributions include (1) a high-resolution telepresence environment that is omnidirectional in both the auditory and visual modalities, and (2) a study of data-based WFAS with realistic microphone directivities as a contribution towards real-time holophonic reproduction.
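The cylindrical-harmonic decomposition step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the array size, harmonic order, cardioid model, and the raised-cosine order-domain window are all assumptions chosen for the example.

```python
import numpy as np

# Assumed parameters (illustrative only, not from the paper).
N_MICS = 32          # elements in the uniform circular array
MAX_ORDER = 8        # highest circular-harmonic order retained
phi = 2 * np.pi * np.arange(N_MICS) / N_MICS  # microphone azimuths

# Single-frequency snapshot: a unit-amplitude plane wave from direction
# phi_s, captured by first-order cardioid capsules (directivity
# 0.5 * (1 + cos(angle))), ignoring propagation phase across the array.
phi_s = np.pi / 3
signals = 0.5 * (1 + np.cos(phi - phi_s)) + 0j

# Circular-harmonic coefficients via a DFT over the angular dimension:
#   a_m = (1/N) * sum_n p(phi_n) * exp(-j * m * phi_n)
orders = np.arange(-MAX_ORDER, MAX_ORDER + 1)
coeffs = np.array(
    [np.mean(signals * np.exp(-1j * m * phi)) for m in orders]
)

# Angular (order-domain) windowing to taper high orders before synthesis;
# a raised-cosine taper is one common choice.
window = 0.5 * (1 + np.cos(np.pi * orders / (MAX_ORDER + 1)))
coeffs_windowed = coeffs * window
```

For this cardioid pattern the decomposition is exact and concentrates all energy in orders m = -1, 0, 1, which is what makes low-order circular arrays attractive for holophonic capture.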