Transfer Learning for Music Classification and Regression Tasks Using Artist Tags

[Figure: A diagram of the proposed framework]

Abstract

In this paper, a transfer learning method that exploits artist tags for general-purpose music feature extraction is presented. The feature vector extracted from the last convolutional layer of a deep convolutional neural network (DCNN) trained with artist tags is shown to be effective for music classification and regression tasks. Artist tags are not only abundant in the music community, and therefore easy to collect, but they also contain high-level abstract information about artists and the music they release. To train the network, a dataset containing 33,903 30-second clips annotated with artist tags was created. In the proposed system, the model is first trained to predict artist tags from audio content. The trained model is then transferred to extract features that are used to perform music genre classification and music emotion recognition tasks. The experimental results show that the features learned from artist tags in this transfer learning setting can be effectively applied to music genre classification and music emotion recognition.
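The paper's implementation is not reproduced on this page; the sketch below is only a minimal illustration of the two-stage pipeline the abstract describes, assuming PyTorch, mel-spectrogram inputs, and illustrative layer sizes, tag counts, and shapes that are not taken from the paper.

```python
# Hypothetical sketch of the two-stage transfer learning pipeline:
# (1) train a DCNN to predict artist tags, (2) reuse the last
# convolutional layer's output as a general-purpose feature vector.
import torch
import torch.nn as nn

class ArtistTagDCNN(nn.Module):
    """DCNN trained to predict artist tags (multi-label) from audio."""

    def __init__(self, n_tags: int = 100):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.BatchNorm2d(128), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # pool the last conv layer to a fixed-size vector
        )
        self.head = nn.Linear(128, n_tags)  # artist-tag prediction head

    def features(self, x: torch.Tensor) -> torch.Tensor:
        """Feature vector taken from the last convolutional layer."""
        return self.conv(x).flatten(1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))  # tag logits for a multi-label loss

# Stage 1 (sketch): train on artist tags with a multi-label objective.
model = ArtistTagDCNN(n_tags=100)
mel_batch = torch.randn(8, 1, 96, 1292)          # e.g. 30-s clips as mel-spectrograms
tag_targets = torch.randint(0, 2, (8, 100)).float()
loss = nn.BCEWithLogitsLoss()(model(mel_batch), tag_targets)
loss.backward()

# Stage 2 (sketch): freeze the network and transfer the learned features
# to a downstream task such as genre classification or emotion recognition.
model.eval()
with torch.no_grad():
    feats = model.features(mel_batch)            # (batch, 128) transferred features
genre_clf = nn.Linear(feats.size(1), 10)         # e.g. 10 genres; trained on feats
```

In this setup only the small downstream classifier is trained on the target task, which is what makes the artist-tag features "general-purpose" in the sense the abstract describes.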

Type
Conference paper
Publication
In Proceedings of the 7th Conference on Sound and Music Technology