This repository contains the code used and data collected for "Tap to Drums: Extending Monophonically Tapped Rhythms to Polyphonic Drum Generation". Here you will find the PureData patches used to run the experiment, the associated Python files for the experiment, and other support and initializer files.
Abstract: In this paper, we explore the literature surrounding rhythm perception to develop algorithms that extract a monophonic rhythm from a polyphonic drum pattern. We develop machine learning models for those algorithms to predict a pattern's location in a polyphonic-similarity-based 2-D latent rhythm space. Following that, we had 25 subjects tap along to polyphonic drum patterns to explore the behaviors involved in reproducing complex rhythms. The model was able to predict the location of a monophonic rhythm in the rhythm space reasonably well (MAE=0.039, SD=0.057). Subjects tapped the intended velocities more accurately as they became more experienced with the system. The model failed to predict the location of the subject-tapped monophonic rhythms (MAE=0.4580, SD=0.076), highlighting the need for a more thorough subject-rated investigation into refining a tap-to-polyphonic-drums pipeline.
Keywords: rhythm, rhythm perception, rhythm similarity, tapping, rhythm space
To run the experiment:
- Connect a drum pad and headphones to the computer. Start PureData and set the Audio Output and MIDI Input to the connected equipment.
- Open tap_tests.pd, then run tap_tests.py (see the launch sketch after this list).
- Test data is saved in tap-to-drums/results; subjects must manually save the first test.
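For reference, a minimal launch sequence is sketched below, assuming the repository root as the working directory and Python 3 on the PATH; the exact invocation may differ on your system.

```
# Open tap_tests.pd in PureData first, then start the experiment
# script from the repository root:
python tap_tests.py
```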
To see results:
- Open data_analysis.py
- Set the flags for the approaches you want to analyze to True (see the sketch after this list).
- Run the script to see the graphical results.
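The flag names below are hypothetical; check data_analysis.py for the actual variables. This is only a sketch of the pattern, showing what setting an approach's flag to True looks like:

```python
# Hypothetical excerpt of the flag section in data_analysis.py;
# the real variable names may differ. Setting a flag to True includes
# that approach's analysis and plots in the run.
ANALYZE_EXTRACTION_MODEL = True   # model-predicted rhythm-space locations
ANALYZE_SUBJECT_TAPPING = False   # analysis of subject-tapped rhythms

# After editing the flags, run the script as usual:
#   python data_analysis.py
```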
If you have any questions or comments, please contact Peter Clark (peterjosephclark1@gmail.com).