Music Emotion and Genre Recognition Toward New Affective Music Taxonomy
The exponential growth of electronic music distribution creates natural pressure for fine-grained musical metadata. Starting from the premise that a primary motive for listening to music is its emotional effect, the diversion it offers, and the memories it awakens, we propose a novel affective music taxonomy that combines a global music genre taxonomy (e.g., Classical, Jazz, Rock/Pop, and Rap) with emotion categories such as Joy, Sadness, Anger, and Pleasure in a complementary way. This paper covers all essential stages of an automatic genre/emotion recognition system, from careful music data collection through to performance evaluation of various machine learning algorithms. In particular, we present a novel classification scheme, called the consecutive dichotomous decomposition tree (CDDT), which is specifically parametrized for multi-class classification problems with a very large number of classes, e.g., the sixteen music categories in our case. An average recognition accuracy of 75% across these 16 categories demonstrates the practical feasibility of the proposed affective music taxonomy.
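The abstract does not spell out how the CDDT is built, but one plausible reading is a tree of binary ("dichotomous") decisions that repeatedly splits the set of music categories in half until a single category remains. The sketch below illustrates that idea with a toy nearest-centroid rule at each node; the paper's actual per-node classifiers, features, and class orderings are assumptions here, not taken from the source.

```python
# Hedged sketch of a consecutive dichotomous decomposition tree (CDDT):
# a 16-way decision is reduced to a cascade of binary decisions.
# The binary classifier at each node is a toy nearest-centroid rule,
# standing in for whatever classifier the paper actually trains per node.
from statistics import mean


def centroid(vectors):
    """Component-wise mean of a list of feature vectors."""
    return [mean(col) for col in zip(*vectors)]


def dist(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))


class CDDTNode:
    def __init__(self, classes):
        self.classes = classes          # categories still in play at this node
        self.left = self.right = None   # child subtrees
        self.c_left = self.c_right = None  # centroids deciding the split


def build_cddt(X, y, classes):
    """Recursively halve the class set, fitting one binary split per node."""
    node = CDDTNode(classes)
    if len(classes) == 1:
        return node                     # leaf: a single category remains
    half = len(classes) // 2
    left_set, right_set = set(classes[:half]), set(classes[half:])
    node.c_left = centroid([x for x, lab in zip(X, y) if lab in left_set])
    node.c_right = centroid([x for x, lab in zip(X, y) if lab in right_set])
    node.left = build_cddt(X, y, classes[:half])
    node.right = build_cddt(X, y, classes[half:])
    return node


def predict(node, x):
    """Walk the tree, taking the binary decision at each internal node."""
    while len(node.classes) > 1:
        node = node.left if dist(x, node.c_left) <= dist(x, node.c_right) else node.right
    return node.classes[0]
```

With sixteen categories this yields four consecutive binary decisions per query, which is one way such a scheme could remain tractable as the class count grows; how the real CDDT orders and parametrizes its splits is not specified in the abstract.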