Privacy-Aware Acoustic Assessments of Everyday Life
Citation & Abstract
J. Bitzer, S. Kissner, and I. Holube, "Privacy-Aware Acoustic Assessments of Everyday Life," J. Audio Eng. Soc., vol. 64, no. 6, pp. 395-404 (2016 June). doi: https://doi.org/10.17743/jaes.2016.0020
Abstract: Hearing devices are common tools for enhancing people's ability to interact with their acoustic environments. However, it is difficult to evaluate the benefit of those tools or to measure acoustically challenging situations in natural environments. This paper proposes a way to measure the most important features of everyday acoustic environments by extracting a limited set of features while not compromising the privacy of partners and bystanders. National laws on how to handle audio privacy differ considerably among countries. The authors propose using a smartphone as the recorder and splitting the feature extraction into two phases: initial feature processing on the smartphone and later processing on a more powerful computer. For a given feature set, a statistical analysis shows comparable results whether the features are extracted from the original audio or with the new privacy-aware method. A comparison shows that different scenarios yield separable features with the new extraction method.
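The two-phase idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' actual feature set or code, which is not specified here; it only assumes that phase one reduces each audio frame on the device to a few coarse, non-invertible features (here, RMS level and zero-crossing rate) so the raw waveform never needs to be stored, and that phase two later computes statistics over those stored features on a desktop machine. The frame length, feature choice, and percentile summary below are illustrative assumptions.

# Illustrative sketch only (assumed features, not the paper's method).
# Phase 1 runs on the device and keeps per-frame features, never audio.
import numpy as np

def phase1_frame_features(audio, fs, frame_ms=125):
    """Reduce an audio buffer (1-D float array in [-1, 1]) to per-frame
    RMS level in dBFS and zero-crossing rate; the caller discards the
    raw buffer afterwards, so no speech content is stored."""
    frame_len = int(fs * frame_ms / 1000)
    n_frames = len(audio) // frame_len
    feats = np.empty((n_frames, 2))
    for i in range(n_frames):
        frame = audio[i * frame_len:(i + 1) * frame_len]
        rms_db = 20 * np.log10(np.sqrt(np.mean(frame ** 2)) + 1e-12)
        zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2  # crossings per sample
        feats[i] = (rms_db, zcr)
    return feats

def phase2_summary(feats):
    """Later, offline: statistics over the stored features, e.g. level
    percentiles commonly used to characterize sound environments."""
    levels = feats[:, 0]
    return {"L90": np.percentile(levels, 10),   # level exceeded 90% of the time
            "L50": np.percentile(levels, 50),
            "L10": np.percentile(levels, 90)}   # level exceeded 10% of the time

Because only such reduced features leave the device, the original conversation cannot be reconstructed, which is the privacy property the abstract refers to.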
@article{bitzer2016privacy-aware,
  author  = {Bitzer, Joerg and Kissner, Sven and Holube, Inga},
  journal = {Journal of the Audio Engineering Society},
  title   = {Privacy-Aware Acoustic Assessments of Everyday Life},
  year    = {2016},
  volume  = {64},
  number  = {6},
  pages   = {395--404},
  month   = {June},
  doi     = {10.17743/jaes.2016.0020},
}
TY - JOUR
TI - Privacy-Aware Acoustic Assessments of Everyday Life
AU - Bitzer, Joerg
AU - Kissner, Sven
AU - Holube, Inga
JO - Journal of the Audio Engineering Society
VL - 64
IS - 6
SP - 395
EP - 404
PY - 2016
Y1 - 2016/06
DO - 10.17743/jaes.2016.0020
ER -
Open Access
Authors:
Bitzer, Joerg; Kissner, Sven; Holube, Inga
Affiliation:
Jade University of Applied Sciences, Institute of Hearing Technology and Audiology, Oldenburg, Germany
JAES Volume 64, Issue 6, pp. 395-404; June 2016
Publication Date:
June 27, 2016
Permalink:
http://www.aes.org/e-lib/browse.cfm?elib=18298