Annotation and Analysis of Recorded Piano Performances on the Web
Citation & Abstract
L. Fyfe, D. Bedoya, and E. Chew, "Annotation and Analysis of Recorded Piano Performances on the Web," J. Audio Eng. Soc., vol. 70, no. 11, pp. 962–978, November 2022. doi: https://doi.org/10.17743/jaes.2022.0057
Abstract: Advancing knowledge and understanding of performed music is hampered by a lack of annotation data for music expressivity. To enable large-scale collection of annotations and exploration of performed music, the authors have created a workflow built around CosmoNote, a Web-based citizen science tool for annotating musical structures created by the performer and experienced by the listener during expressive piano performances. For annotation tasks with CosmoNote, annotators can listen to the recorded performances and view synchronized music visualization layers, including the audio waveform, recorded notes, extracted audio features such as loudness and tempo, and score features such as harmonic tension. Annotators can zoom into specific parts of a performance to see the visuals and hear the audio from just that part. Performed musical structures are annotated using boundaries of varying strengths, regions, comments, and note groups. By analyzing the annotations collected with CosmoNote, performance decisions can be modeled and analyzed to aid the understanding of expressive choices in musical performances and to discover the vocabulary of performed musical structures.
@article{fyfe2022annotation,
  author={Fyfe, Lawrence and Bedoya, Daniel and Chew, Elaine},
  journal={Journal of the Audio Engineering Society},
  title={Annotation and Analysis of Recorded Piano Performances on the Web},
  year={2022},
  volume={70},
  number={11},
  pages={962--978},
  doi={https://doi.org/10.17743/jaes.2022.0057},
  month={November},
}
TY - JOUR
TI - Annotation and Analysis of Recorded Piano Performances on the Web
SP - 962
EP - 978
AU - Fyfe, Lawrence
AU - Bedoya, Daniel
AU - Chew, Elaine
PY - 2022
JO - Journal of the Audio Engineering Society
IS - 11
VL - 70
Y1 - November 2022
Open Access
Authors:
Fyfe, Lawrence; Bedoya, Daniel; Chew, Elaine
Affiliations:
STMS Laboratoire (UMR9912) – CNRS, IRCAM, Sorbonne Université, Ministère de la Culture, Paris 75004, France; STMS Laboratoire (UMR9912) – CNRS, IRCAM, Sorbonne Université, Ministère de la Culture, Paris 75004, France; Department of Engineering, King's College London, London WC2R 2LS, United Kingdom (See document for exact affiliation information.)
JAES Volume 70 Issue 11 pp. 962–978; November 2022
Publication Date:
November 15, 2022
Permalink:
http://www.aes.org/e-lib/browse.cfm?elib=22020