Grateful Live: Mixing Multiple Recordings of a Dead Performance into an Immersive Experience
Citation & Abstract
T. Wilmering, F. Thalmann, and M. B. Sandler, "Grateful Live: Mixing Multiple Recordings of a Dead Performance into an Immersive Experience," Paper 9614, (2016 September).
Abstract: Recordings of historical live music performances often exist in several versions, either recorded from the mixing desk, on stage, or by audience members. These recordings highlight different aspects of the performance, but they also typically vary in recording quality, playback speed, and segmentation. We present a system that automatically aligns and clusters live music recordings based on various audio characteristics and editorial metadata. The system creates an immersive virtual space that can be imported into a multichannel web or mobile application allowing listeners to navigate the space using interface controls or mobile device sensors. We evaluate our system with recordings of different lineages from the Live Music Archive’s Grateful Dead collection.
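The abstract describes automatically aligning multiple recordings of the same performance. The paper itself details the method; as a purely illustrative sketch (not the authors' algorithm), the core idea of estimating the time offset between two recordings can be shown with cross-correlation of coarse energy envelopes. All names here (`energy_envelope`, `estimate_offset`) are hypothetical.

```python
# Illustrative sketch only: offset estimation between two recordings of the
# same performance via cross-correlation of frame-wise energy envelopes.
# This is NOT the paper's published method, just the general technique.
import numpy as np

def energy_envelope(signal, frame=256):
    """Frame-wise RMS energy: a coarse, noise-robust summary of the audio."""
    n = len(signal) // frame
    frames = signal[:n * frame].reshape(n, frame)
    return np.sqrt((frames ** 2).mean(axis=1))

def estimate_offset(ref, other, frame=256):
    """Return the offset (in samples) that best aligns `other` to `ref`."""
    a = energy_envelope(ref, frame)
    b = energy_envelope(other, frame)
    a = a - a.mean()
    b = b - b.mean()
    corr = np.correlate(a, b, mode="full")
    lag_frames = corr.argmax() - (len(b) - 1)
    return lag_frames * frame

# Synthetic demo: `audience` is the `desk` recording delayed by 4096 samples.
rng = np.random.default_rng(0)
desk = rng.standard_normal(48000) * np.sin(np.linspace(0, 20, 48000)) ** 2
audience = np.concatenate([np.zeros(4096), desk])[:48000]
print(estimate_offset(desk, audience))  # negative: audience lags the desk feed
```

A real pipeline would also have to handle differing playback speeds and segmentation, which simple cross-correlation does not address.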
@article{wilmering2016grateful,
  author={Wilmering, Thomas and Thalmann, Florian and Sandler, Mark B.},
  journal={Journal of the Audio Engineering Society},
  title={Grateful Live: Mixing Multiple Recordings of a Dead Performance into an Immersive Experience},
  year={2016},
  volume={},
  number={},
  pages={},
  doi={},
  month={September},
  abstract={Recordings of historical live music performances often exist in several versions, either recorded from the mixing desk, on stage, or by audience members. These recordings highlight different aspects of the performance, but they also typically vary in recording quality, playback speed, and segmentation. We present a system that automatically aligns and clusters live music recordings based on various audio characteristics and editorial metadata. The system creates an immersive virtual space that can be imported into a multichannel web or mobile application allowing listeners to navigate the space using interface controls or mobile device sensors. We evaluate our system with recordings of different lineages from the Live Music Archive’s Grateful Dead collection.},
}
TY - paper
TI - Grateful Live: Mixing Multiple Recordings of a Dead Performance into an Immersive Experience
SP -
EP -
AU - Wilmering, Thomas
AU - Thalmann, Florian
AU - Sandler, Mark B.
PY - 2016
JO - Journal of the Audio Engineering Society
IS -
VL -
Y1 - September 2016
AB - Recordings of historical live music performances often exist in several versions, either recorded from the mixing desk, on stage, or by audience members. These recordings highlight different aspects of the performance, but they also typically vary in recording quality, playback speed, and segmentation. We present a system that automatically aligns and clusters live music recordings based on various audio characteristics and editorial metadata. The system creates an immersive virtual space that can be imported into a multichannel web or mobile application allowing listeners to navigate the space using interface controls or mobile device sensors. We evaluate our system with recordings of different lineages from the Live Music Archive’s Grateful Dead collection.
ER -
Open Access
Authors:
Wilmering, Thomas; Thalmann, Florian; Sandler, Mark B.
Affiliation:
Queen Mary University of London, London, UK
AES Convention:
141 (September 2016)
Paper Number:
9614
Publication Date:
September 20, 2016
Subject:
Spatial Audio: Production
Permalink:
http://www.aes.org/e-lib/browse.cfm?elib=18418