SOTA Google's Perceiver-AR Music Transformer Implementation and Model (Python; updated May 9, 2023)
Music remixer based on MusicGen-Chord
A Cog implementation of the fine-tuner for Meta's MusicGen
A Bach music generator built with artificial intelligence. The model combines a VQ-VAE with a decoder-only Transformer: MIDI sequences one quarter note in length are compressed into 16 codebooks by the VQ-VAE, and the Transformer learns to generate the codebook sequence, which is then decoded back into a MIDI score.
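The description above hinges on the vector-quantization step: each quarter-note segment embedding is snapped to its nearest codebook entry, and the Transformer then models the resulting sequence of discrete indices. A minimal NumPy sketch of that step (the function name, dimensions, and random data are illustrative assumptions, not the repository's actual code):

```python
import numpy as np

def quantize(embeddings, codebook):
    # embeddings: (T, D) continuous quarter-note segment embeddings
    # codebook:   (K, D) learned codebook vectors (K = 16 in the description)
    # Return the index of the nearest codebook entry for each segment.
    dists = ((embeddings[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    return dists.argmin(axis=1)  # (T,) discrete codes for the Transformer to model

rng = np.random.default_rng(0)
codebook = rng.normal(size=(16, 8))   # hypothetical: 16 entries, embedding dim 8
segments = rng.normal(size=(32, 8))   # hypothetical: 32 quarter-note segments
codes = quantize(segments, codebook)
print(codes.shape)  # (32,) — one discrete code per segment
```

A decoder-only Transformer trained on such index sequences can then generate new code sequences autoregressively, which the VQ-VAE decoder maps back to MIDI.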
Chord conditioning implemented for MusicGen
Code and demo for paper: Zhao et al., "Q&A: Query-Based Representation Learning for Multi-Track Symbolic Music re-Arrangement," IJCAI 2023
[DEPRECATED] Morpheus Music AI implementation spin-off :)
Algorithmic and AI MIDI Drums Generator Implementation
A PyTorch implementation of a VAE-based musical model to generate and interpolate piano notes using the Nottingham dataset.
Music generation using a Long Short-Term Memory (LSTM) neural network. The gennhausser project uses the TensorFlow and music21 libraries to create a synthetic dataset, train an LSTM model, and generate music sequences.
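LSTM pipelines of this kind typically integer-encode the notes and slice them into fixed-length input/target windows before training. A minimal sketch of that windowing step in plain Python (the function name and toy melody are assumptions for illustration, not gennhausser's actual code):

```python
def make_windows(notes, seq_len):
    # notes: list of integer-encoded pitches (e.g. MIDI note numbers)
    # Build (input window, next-note target) pairs for LSTM training.
    pairs = []
    for i in range(len(notes) - seq_len):
        pairs.append((notes[i:i + seq_len], notes[i + seq_len]))
    return pairs

melody = [60, 62, 64, 65, 67, 69, 71, 72]  # toy ascending C-major scale
pairs = make_windows(melody, seq_len=4)
print(len(pairs))    # 4 training pairs
print(pairs[0])      # ([60, 62, 64, 65], 67)
```

At generation time, the trained model is fed a seed window and repeatedly samples the next note, sliding the window forward one step each time.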
Symbolic music generation using multi-agent systems
Generate piano music using an LSTM-based musical model.
Generating MIDI files with AI, using a piano-note database and libraries such as music21