Compound Word Transformer

Authors: Wen-Yi Hsiao, Jen-Yu Liu, Yin-Cheng Yeh and Yi-Hsuan Yang

Paper (arXiv) | Audio demo (Google Drive) | Blog | Colab notebook

Official PyTorch implementation of the AAAI 2021 paper "Compound Word Transformer: Learning to Compose Full-Song Music over Dynamic Directed Hypergraphs".

We present a new variant of the Transformer that processes multiple consecutive tokens at once at each time step. The proposed method greatly reduces the length of the resulting sequence and therefore improves training and inference efficiency. We employ it to learn to compose expressive Pop piano music of full-song length (involving up to 10K individual tokens per song). In this repository, we open-source our AIlabs.tw Pop1K7 dataset and the code for unconditional generation.
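As a minimal illustration of the idea (not the repository's actual encoding code), the sketch below packs groups of per-type tokens into one compound word per time step; the token strings and fixed group size are made up for the example:

```python
# Illustrative sketch: grouping per-type tokens into one compound word
# per time step, which shortens the sequence the Transformer consumes.
from typing import List, Tuple

# A flat, one-token-per-step sequence (token names are hypothetical):
flat_sequence = [
    "beat_0", "pitch_60", "duration_480", "velocity_64",
    "beat_4", "pitch_64", "duration_240", "velocity_70",
]

def to_compound_words(tokens: List[str], group_size: int = 4) -> List[Tuple[str, ...]]:
    """Pack every `group_size` consecutive tokens into one compound word,
    so the model sees one tuple per time step instead of one token."""
    return [tuple(tokens[i:i + group_size]) for i in range(0, len(tokens), group_size)]

compound_sequence = to_compound_words(flat_sequence)
print(len(flat_sequence))      # 8 time steps as individual tokens
print(len(compound_sequence))  # 2 time steps as compound words
```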

Dependencies

  • python 3.6
  • Required packages:
    • madmom
    • miditoolkit
    • pytorch-fast-transformers

chorder is our in-house rule-based symbolic chord recognition algorithm, developed by our former intern, joshuachang2311, who is also a jazz pianist 🎹.
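Since the import names do not always match the package names (pytorch-fast-transformers, for instance, is imported as fast_transformers), here is a quick, illustrative environment check:

```python
# Illustrative check that the listed dependencies are installed.
import madmom        # beat and downbeat tracking
import miditoolkit   # MIDI parsing and manipulation
import chorder       # rule-based symbolic chord recognition
from fast_transformers.builders import TransformerEncoderBuilder  # pytorch-fast-transformers
```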

Model

In this work, we conduct two scenarios of generation:

  • unconditional generation

    • To see the experimental results and discussion, please refer to here.
  • conditional generation: lead sheet to full MIDI (ls2midi)

    • [Work in progress] We plan to open source the code associated with this part in the future:
      • melody extraction (skyline; see the sketch after this list)
      • objective metrics
      • model
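For reference, "skyline" is a classic melody-extraction heuristic: at each onset, keep only the highest-sounding note. The sketch below is a minimal illustration of that heuristic, not the ls2midi code to be released:

```python
# Minimal sketch of the classic "skyline" melody-extraction heuristic:
# among notes sharing an onset, keep only the highest-pitch one.
from collections import defaultdict
from typing import List, NamedTuple

class Note(NamedTuple):
    start: int   # onset time in ticks
    end: int     # offset time in ticks
    pitch: int   # MIDI pitch number

def skyline(notes: List[Note]) -> List[Note]:
    by_onset = defaultdict(list)
    for note in notes:
        by_onset[note.start].append(note)
    # Keep the highest note at each onset, then sort the melody by time.
    return sorted((max(group, key=lambda n: n.pitch) for group in by_onset.values()),
                  key=lambda n: n.start)

notes = [Note(0, 480, 60), Note(0, 480, 72), Note(480, 960, 67), Note(480, 960, 76)]
print(skyline(notes))  # keeps pitch 72 at onset 0 and pitch 76 at onset 480
```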

Dataset

To prepare your own training data, please refer to the documentation for further details.
Or, you can start with our AIlabs.tw Pop1K7 dataset, which is available here.
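As a starting point for inspecting the raw data, the sketch below reads a MIDI file with miditoolkit (one of the listed dependencies); song.midi is a placeholder path, and the full preprocessing pipeline is described in the documentation above:

```python
# Illustrative: inspect a MIDI file with miditoolkit before preprocessing.
from miditoolkit.midi.parser import MidiFile

midi = MidiFile("song.midi")               # placeholder path
print(midi.ticks_per_beat)                 # time resolution of the file
for inst in midi.instruments:              # one track per instrument
    for note in inst.notes[:5]:            # each note carries onset, offset,
        print(note.start, note.end, note.pitch, note.velocity)  # pitch, velocity
```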

Demo: Colab Notebook

The Colab notebook is now available here.
Thanks to our intern AdarshKumar712 for organizing the code.

Acknowledgement
