Comparison of Human and Machine Recognition of Electric Guitar Types
Citation & Abstract
R. Profeta and G. Schuller, "Comparison of Human and Machine Recognition of Electric Guitar Types," Paper 10315, (2019 October).
Abstract: The classification of musical instruments for instruments of the same type is a challenging task not only to experienced musicians but also in music information retrieval. The goal of this paper is to understand how guitar players with different experience levels perform in distinguishing audio recordings of single guitar notes from two iconic guitar models and to use this knowledge as a baseline to evaluate the performance of machine learning algorithms performing a similar task. For this purpose we conducted a blind listening test with 236 participants in which they listened to 4 single notes from 4 different guitars and had to classify them as a Fender Stratocaster or an Epiphone Les Paul. We found out that only 44% of the participants could correctly classify all 4 guitar notes. We also performed machine learning experiments using k-Nearest Neighbours (kNN) and Support Vector Machines (SVM) algorithms applied to a classification problem with 1292 notes from different Stratocaster and Les Paul guitars. The SVM algorithm had an accuracy of 93.9%, correctly predicting 139 audio samples from the 148 present in the testing set.
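The abstract describes a k-Nearest Neighbours baseline for the two-class guitar problem. As a minimal illustration of how such a classifier works, here is a stdlib-only sketch; the feature vectors, their dimensionality, and the feature names are hypothetical placeholders, since the actual study extracted features from 1292 recorded guitar notes.

```python
# Minimal sketch of a kNN baseline for a two-class guitar problem.
# Features and values here are synthetic placeholders, not the paper's data.
from collections import Counter
import math

def knn_predict(train, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    order = sorted(range(len(train)),
                   key=lambda i: math.dist(train[i], query))
    votes = Counter(labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]

# Hypothetical 2-D feature vectors (e.g. spectral centroid, brightness).
train = [(0.20, 0.10), (0.25, 0.15), (0.80, 0.90), (0.85, 0.95)]
labels = ["Stratocaster", "Stratocaster", "Les Paul", "Les Paul"]

print(knn_predict(train, labels, (0.22, 0.12)))  # → Stratocaster
```

The SVM used in the paper replaces the distance-vote rule with a learned decision boundary, but consumes the same kind of per-note feature vectors.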
@article{profeta2019comparison,
  author  = {Profeta, Renato and Schuller, Gerald},
  journal = {Journal of the Audio Engineering Society},
  title   = {Comparison of Human and Machine Recognition of Electric Guitar Types},
  year    = {2019},
  volume  = {},
  number  = {},
  pages   = {},
  doi     = {},
  month   = {October},
}
TY - CPAPER
TI - Comparison of Human and Machine Recognition of Electric Guitar Types
SP -
EP -
AU - Profeta, Renato
AU - Schuller, Gerald
PY - 2019
JO - Journal of the Audio Engineering Society
IS -
VO -
VL -
Y1 - October 2019
ER -
Authors:
Profeta, Renato; Schuller, Gerald
Affiliation:
Ilmenau University of Technology, Ilmenau, Germany
AES Convention:
147 (October 2019)
Paper Number:
10315
Publication Date:
October 8, 2019
Subject:
Posters: Perception
Permalink:
http://www.aes.org/e-lib/browse.cfm?elib=20687