Spatial Reconstruction-Based Rendering of Microphone Array Room Impulse Responses
Citation & Abstract
L. McCormack, N. Meyer-Kahlen, and A. Politis, "Spatial Reconstruction-Based Rendering of Microphone Array Room Impulse Responses," J. Audio Eng. Soc., vol. 71, no. 5, pp. 267-280 (2023 May). doi: https://doi.org/10.17743/jaes.2022.0072
Abstract: A reconstruction-based rendering approach is explored for the task of imposing the spatial characteristics of a measured space onto a monophonic signal while also reproducing it over a target playback setup. The foundation of this study is a parametric rendering framework, which can operate either on arbitrary microphone array room impulse responses (RIRs) or Ambisonic RIRs. Spatial filtering techniques are used to decompose the input RIR into individual reflections and anisotropic diffuse reverberation, which are reproduced using dedicated rendering strategies. The proposed approach operates by considering several hypotheses involving different rendering configurations and thereafter determining which hypothesis reconstructs the input RIR most faithfully. With regard to the present study, these hypotheses involved considering different potential reflection numbers. Once the optimal number of reflections to render has been determined over time and frequency, the array directional responses used to reconstruct the input RIR are substituted with spatialization gains for the target playback setup. The results of formal listening experiments suggest that the proposed approach produces renderings that are perceptually more similar to reference responses when compared with the use of an established subspace-based detection algorithm. The proposed approach also demonstrates similar or better performance than that achieved with existing state-of-the-art methods.
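To make the hypothesis-testing step concrete, below is a minimal Python/NumPy sketch of one possible per-tile selection, written from the abstract alone rather than from the paper's actual implementation. It assumes plane-wave steering vectors, a least-squares fit of candidate reflection directions, and a simple relative-improvement threshold standing in for the paper's reconstruction-fidelity criterion; all function names, the DoA candidate list, and the threshold value are illustrative.

# Hypothetical sketch (not the authors' reference implementation) of the
# reconstruction-based hypothesis test described in the abstract: for each
# time-frequency tile of the array RIR, try rendering k = 0..k_max reflections,
# reconstruct the array signals under each hypothesis, and keep the k whose
# reconstruction matches the input tile most closely.
import numpy as np

def steering_vectors(doas, mic_positions, freq, c=343.0):
    """Plane-wave steering vectors A (mics x directions) for one frequency bin.
    doas: (D, 2) array of (azimuth, elevation) in radians."""
    u = np.stack([np.cos(doas[:, 0]) * np.cos(doas[:, 1]),
                  np.sin(doas[:, 0]) * np.cos(doas[:, 1]),
                  np.sin(doas[:, 1])], axis=-1)             # (D, 3) unit vectors
    delays = mic_positions @ u.T / c                        # (M, D) seconds
    return np.exp(-2j * np.pi * freq * delays)              # (M, D)

def select_reflection_count(x_tile, doa_candidates, mic_positions, freq,
                            k_max=4, min_improvement=0.1):
    """Return (k, reflection signals, residual energy) for one time-frequency tile.

    x_tile:         (M,) STFT coefficients of the array RIR in this tile.
    doa_candidates: (D, 2) candidate reflection directions, strongest first.
    A hypothesis with k reflections is kept only if it lowers the reconstruction
    error by at least `min_improvement` relative to k - 1; the leftover residual
    is treated as the anisotropic diffuse stream.
    """
    k_best, s_best = 0, np.zeros(0, dtype=complex)
    err_best = np.linalg.norm(x_tile) ** 2                  # k = 0: all diffuse
    for k in range(1, min(k_max, len(doa_candidates)) + 1):
        A = steering_vectors(doa_candidates[:k], mic_positions, freq)
        s, *_ = np.linalg.lstsq(A, x_tile, rcond=None)      # reflection amplitudes
        err = np.linalg.norm(x_tile - A @ s) ** 2           # reconstruction error
        if err < (1.0 - min_improvement) * err_best:        # accept only a clear gain
            k_best, s_best, err_best = k, s, err
        else:
            break
    return k_best, s_best, err_best

# Rendering (conceptual): with k fixed, the array steering vectors A are swapped
# for target-setup spatialization gains G (e.g. loudspeaker panning gains toward
# the same DoAs), giving a directional output G @ s, while the residual
# x_tile - A @ s is rendered through the diffuse/reverberant stream.

The final comment marks where the array directional responses would be exchanged for target-setup spatialization gains, mirroring the substitution described in the abstract.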
@article{mccormack2023spatial,
  author  = {McCormack, Leo and Meyer-Kahlen, Nils and Politis, Archontis},
  journal = {Journal of the Audio Engineering Society},
  title   = {Spatial Reconstruction-Based Rendering of Microphone Array Room Impulse Responses},
  year    = {2023},
  volume  = {71},
  number  = {5},
  pages   = {267--280},
  doi     = {10.17743/jaes.2022.0072},
  month   = {May},
}
TY - JOUR
TI - Spatial Reconstruction-Based Rendering of Microphone Array Room Impulse Responses
SP - 267
EP - 280
AU - McCormack, Leo
AU - Meyer-Kahlen, Nils
AU - Politis, Archontis
PY - 2023
JO - Journal of the Audio Engineering Society
IS - 5
VL - 71
Y1 - May 2023
ER -
Open Access
Authors:
McCormack, Leo; Meyer-Kahlen, Nils; Politis, Archontis
Affiliations:
Department of Information and Communications Engineering, Aalto University, Espoo, Finland; Department of Information and Communications Engineering, Aalto University, Espoo, Finland; Faculty of Information Technology and Communication Sciences, Tampere University, Finland
JAES Volume 71 Issue 5 pp. 267-280; May 2023
Publication Date:
May 9, 2023
Permalink:
http://www.aes.org/e-lib/browse.cfm?elib=22130