Learning to Recognize Musical Genre from Audio, Challenge Overview

Michaël Defferrard, Sharada P. Mohanty, Sean F. Carroll, Marcel Salathé
The Web Conference, 2018

Here we summarize our experience running a challenge with open data for musical genre recognition. These notes motivate the task and the challenge design, show some statistics about the submissions, and present the results.

@inproceedings{fma_challenge,
  title = {Learning to Recognize Musical Genre from Audio},
  subtitle = {Challenge Overview},
  author = {Defferrard, Micha\"el and Mohanty, Sharada P. and Carroll, Sean F. and Salath\'e, Marcel},
  booktitle = {The 2018 Web Conference Companion},
  year = {2018},
  publisher = {ACM Press},
  isbn = {9781450356404},
  doi = {10.1145/3184558.3192310},
  archiveprefix = {arXiv},
  eprint = {1803.05337},
  url = {https://arxiv.org/abs/1803.05337},
}

Resources

The PDF is available at arXiv and TheWebConf.

Related: slides, data, code, crowdAI challenge, TheWebConf track.

Compilation

Compile the LaTeX source into a PDF with make. Run make clean to remove temporary files and make arxiv.zip to prepare an archive for upload to arXiv.
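For reference, a typical build sequence (assuming a standard TeX distribution and GNU make are installed) looks like this:

make            # compile the LaTeX source into the paper PDF
make clean      # remove temporary build files
make arxiv.zip  # pack the sources into an archive for arXiv upload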

Figures

All the figures, along with the code and data to reproduce them, are in the analysis folder. While the figure PDFs are stored in the repository, they can be regenerated with make.
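For instance, to regenerate the figure PDFs (assuming the dependencies of the analysis code are installed):

cd analysis
make            # rebuild the figure PDFs from the stored code and data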
