Measuring audio-visual speech intelligibility under dynamic listening conditions using virtual reality
Citation & Abstract
A. H. Moore, T. Green, M. Brookes, and P. A. Naylor, "Measuring audio-visual speech intelligibility under dynamic listening conditions using virtual reality," Paper 45, (2022 August). doi:
Abstract: The ELOSPHERES project is a collaboration between researchers at Imperial College London and University College London which aims to improve the efficacy of hearing aids. The benefit obtained from hearing aids varies significantly between listeners and listening environments. The noisy, reverberant environments which most people find challenging bear little resemblance to the clinics in which consultations occur. In order to make progress in speech enhancement, algorithms need to be evaluated under realistic listening conditions. A key aim of ELOSPHERES is to create a virtual reality-based test environment in which alternative speech enhancement algorithms can be evaluated using a listener-in-the-loop paradigm. In this paper we present the sap-elospheres-audiovisual-test (SEAT) platform and report the results of an initial experiment in which it was used to measure the benefit of visual cues in a speech intelligibility in spatial noise task.
@article{moore2022measuring,
  author={Moore, Alastair H. and Green, Tim and Brookes, Mike and Naylor, Patrick A.},
  journal={Journal of the Audio Engineering Society},
  title={Measuring audio-visual speech intelligibility under dynamic listening conditions using virtual reality},
  year={2022},
  volume={},
  number={},
  pages={},
  doi={},
  month={August},
  abstract={The ELOSPHERES project is a collaboration between researchers at Imperial College London and University College London which aims to improve the efficacy of hearing aids. The benefit obtained from hearing aids varies significantly between listeners and listening environments. The noisy, reverberant environments which most people find challenging bear little resemblance to the clinics in which consultations occur. In order to make progress in speech enhancement, algorithms need to be evaluated under realistic listening conditions. A key aim of ELOSPHERES is to create a virtual reality-based test environment in which alternative speech enhancement algorithms can be evaluated using a listener-in-the-loop paradigm. In this paper we present the sap-elospheres-audiovisual-test (SEAT) platform and report the results of an initial experiment in which it was used to measure the benefit of visual cues in a speech intelligibility in spatial noise task.},
}
TY - paper
TI - Measuring audio-visual speech intelligibility under dynamic listening conditions using virtual reality
SP -
EP -
AU - Moore, Alastair H.
AU - Green, Tim
AU - Brookes, Mike
AU - Naylor, Patrick A.
PY - 2022
JO - Journal of the Audio Engineering Society
IS -
VO -
VL -
Y1 - August 2022
AB - The ELOSPHERES project is a collaboration between researchers at Imperial College London and University College London which aims to improve the efficacy of hearing aids. The benefit obtained from hearing aids varies significantly between listeners and listening environments. The noisy, reverberant environments which most people find challenging bear little resemblance to the clinics in which consultations occur. In order to make progress in speech enhancement, algorithms need to be evaluated under realistic listening conditions. A key aim of ELOSPHERES is to create a virtual reality-based test environment in which alternative speech enhancement algorithms can be evaluated using a listener-in-the-loop paradigm. In this paper we present the sap-elospheres-audiovisual-test (SEAT) platform and report the results of an initial experiment in which it was used to measure the benefit of visual cues in a speech intelligibility in spatial noise task.
ER -
Open Access
Authors:
Moore, Alastair H.; Green, Tim; Brookes, Mike; Naylor, Patrick A.
Affiliations:
Imperial College London, London, UK; University College London, London, UK (see document for exact affiliation information)
AES Conference:
AES 2022 International Audio for Virtual and Augmented Reality Conference (August 2022)
Paper Number:
45
Publication Date:
August 15, 2022
Subject:
Paper
Permalink:
http://www.aes.org/e-lib/browse.cfm?elib=21876