Irislucent/motif-encoder

This repository contains the code for the paper "Motif-Centric Representation Learning for Symbolic Music".

## How to synthesize data

```shell
cd dataset
python preprocess.py --data_dir "your dataset directory, default is pop909" --save_dir "directory to save preprocessed data"
python metaphor_by_rules.py --data_dir "directory of preprocessed data" --save_dir "directory to save metaphorized data" --n_metaphors "number of data views"
python split_train_val.py --data_dir "directory of metaphorized data" --save_dir "directory to save train/val datasets"
```
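The three steps above form a pipeline: each script reads the previous one's output directory. A minimal Python driver for the whole run might look like the sketch below; the directory names and the `n_metaphors` value are illustrative placeholders, not defaults taken from the repo.

```python
# Hypothetical driver for the three-step synthesis pipeline.
# All directory names and the n_metaphors value are placeholders.
import subprocess

def synthesis_commands(data_dir="pop909", pre_dir="preprocessed",
                       meta_dir="metaphorized", split_dir="splits",
                       n_metaphors=4):
    """Build the three pipeline invocations in dependency order."""
    return [
        ["python", "preprocess.py",
         "--data_dir", data_dir, "--save_dir", pre_dir],
        ["python", "metaphor_by_rules.py",
         "--data_dir", pre_dir, "--save_dir", meta_dir,
         "--n_metaphors", str(n_metaphors)],
        ["python", "split_train_val.py",
         "--data_dir", meta_dir, "--save_dir", split_dir],
    ]

def run_pipeline(**kwargs):
    # Each step consumes the previous step's output, so run sequentially
    # from inside the dataset/ directory and stop on the first failure.
    for cmd in synthesis_commands(**kwargs):
        subprocess.run(cmd, cwd="dataset", check=True)
```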

## How to generate the real dataset from labels

```shell
cd dataset
python label_to_real_data.py --data_dir "directory of original data" --labels_dir "../data" --chunks_dir "directory of preprocessed data" --output_dir "directory to save relabeled data"
python split_train_val.py --data_dir "directory of relabeled data" --save_dir "directory to save train/val datasets"
```

## How to train a model

```shell
python run_training.py --config "your-config-file.yaml"
```

Two example .yaml config files are provided, corresponding to the "contrastive/" and "regularized/" setups, respectively.
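The two example files define the actual schema; purely for orientation, a config of this shape might look like the sketch below. Every key name here is a hypothetical placeholder, so consult the provided examples for the real fields.

```yaml
# Hypothetical sketch only -- the real schema is defined by the two
# example .yaml files shipped with the repo.
data_dir: dataset/splits   # placeholder: where split_train_val.py wrote train/val data
save_dir: checkpoints      # placeholder: where training checkpoints would go
```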

## How to run motif-based music visualization

Set the model checkpoint path in the "active_checkpoint" entry of "your-config-file.yaml".
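For example, the relevant line of the config might look like this; the key name comes from the instructions above, while the path itself is an illustrative placeholder.

```yaml
# "active_checkpoint" is the entry named above; the path is a placeholder.
active_checkpoint: checkpoints/model_best.pt
```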

```shell
python run_visualization.py --config "your-config-file.yaml" --input_path "path to target .mid file"
```
