A Multi-Scale Attentive Transformer for Multi-Instrument Symbolic Music Generation
Please visit HaRry-qaq.github.io to listen to demos from the different experiments.
Please go to MSAT/msat and create the environment with the following command.
conda env create -f environment.yml
Please go to baseline/mmt to train the MMT-note, MMT-bar, and MMT-track baselines. The preprocessing code is also in baseline/mmt; go to that folder first.
You can get the SOD dataset with:
wget https://qsdfo.github.io/LOP/database/SOD.zip
Get a list of filenames for each dataset.
find data/sod/SOD -type f \( -name "*.mid" -o -name "*.xml" \) | cut -c 14- > data/sod/original-names.txt
Note: The glob patterns are quoted so the shell does not expand them, and the parentheses make -type f apply to both extensions. Change the character offset in the cut command for other datasets to match the length of their path prefix.
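For reference, the offset comes from the length of the dataset prefix: `data/sod/SOD/` is 13 characters, and `cut -c 14-` (1-indexed) keeps everything from character 14 onward. A quick sanity check of the arithmetic:

```python
# "cut -c 14-" is 1-indexed: it keeps characters 14 onward,
# i.e. it drops the 13-character prefix "data/sod/SOD/".
prefix = "data/sod/SOD/"
path = "data/sod/SOD/foo/bar.mid"

assert len(prefix) == 13
print(path[len(prefix):])  # equivalent of cut -c 14-  -> foo/bar.mid
```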
Convert the MIDI and MusicXML files into MusPy files for processing.
python mmt/convert_sod.py
Extract a list of notes from the MusPy JSON files.
python mmt/extract.py -d sod
Set the file path in the corresponding place in the code, then run:
python mmt/cuthang_1024.py
Convert the note-level representation into bar-level and track-level representations. Remember to modify the specified paths within the files.
python mmt/representation-bar.py -d sod
python mmt/representation-track.py -d sod
Split the processed data into training, validation and test sets.
python mmt/split.py -d sod
Create a sod folder with a processed/note subfolder inside the data folder. Put the CSV produced by cuthang_1024.py in data/sod/processed/note and the TXT files produced by split.py in data/sod/processed.
Likewise create data/sod-bar/processed/note. Put the CSV produced by representation-bar.py there and the TXT files produced by split.py in data/sod-bar/processed.
Likewise create data/sod-track/processed/note. Put the CSV produced by representation-track.py there and the TXT files produced by split.py in data/sod-track/processed.
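The three layouts above can be created in one go. A minimal sketch (the copy targets in the comments are placeholders; use whatever filenames your preprocessing run produced):

```python
import os

# Create data/<name>/processed/note for each representation
# (note-level, bar-level, and track-level).
for name in ("sod", "sod-bar", "sod-track"):
    os.makedirs(os.path.join("data", name, "processed", "note"), exist_ok=True)

# Then copy the outputs into place, e.g. (paths are placeholders):
#   shutil.copy("<csv from cuthang_1024.py>", "data/sod/processed/note/")
#   shutil.copy("<txt from split.py>", "data/sod/processed/")
```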
Please go to baseline and train the MMT-note, MMT-bar, and MMT-track baselines.
MMT-note:
python mmt/train.py -d sod -o exp/sod/ape -g 0
MMT-bar:
python mmt/train.py -d sod-bar -o exp/sod-bar/ape -g 0
MMT-track:
python mmt/train.py -d sod-track -o exp/sod-track/ape -g 0
Go to the MSAT/ folder.
Load the previously trained MMT-note and MMT-track checkpoints into the model, i.e., change path1 and path2 in train_multi_scale.py to point to those checkpoints.
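A minimal sketch of the kind of partial loading this step performs. The sub-module names (`note_model`, `track_model`) and the tiny linear layers are illustrative stand-ins, not the actual structure in train_multi_scale.py:

```python
import torch
import torch.nn as nn

# Illustrative stand-in for the multi-scale model; the real sub-module
# names inside train_multi_scale.py may differ.
class MultiScale(nn.Module):
    def __init__(self):
        super().__init__()
        self.note_model = nn.Linear(4, 4)   # stands in for the MMT-note transformer
        self.track_model = nn.Linear(4, 4)  # stands in for the MMT-track transformer

model = MultiScale()

# Pretrained stand-ins; in the real setup their weights would come
# from torch.load(path1) / torch.load(path2).
pretrained_note = nn.Linear(4, 4)
pretrained_track = nn.Linear(4, 4)

model.note_model.load_state_dict(pretrained_note.state_dict())
model.track_model.load_state_dict(pretrained_track.state_dict())

# Freeze the loaded weights so only the remaining parameters are trained.
for sub in (model.note_model, model.track_model):
    for p in sub.parameters():
        p.requires_grad = False
```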
Prepare the sod-bar data under MSAT/ in the same way as for MMT-bar.
Run the following command to train the remaining parameters:
python -m torch.distributed.launch --nproc_per_node=3 msat/train_multi_scale.py -o exp/sod-bar/ape
Generate new samples using a trained model.
python msat/generate.py -d sod-bar -o exp/sod-bar/ape -g 0
Evaluate the trained model.
Go to MSAT/evaluate/.
Modify the paths specified in each script and run the following:
python evaluate-local.py
python evaluate-instrument-corr.py
python evaluate-track-var.py