# WavetableCVAE

Demo video: `WavetableCVAE_Demo.mov`

## Abstract

"WavetableCVAE" is an attempt to provide intuitive timbre control by generating wavetables conditionally with a CVAE (conditional variational autoencoder).

The code for the deep learning part is available here.

Plug-ins for DAW are available in this repository.

The Japanese paper is available here.

An English paper is in preparation.
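As background, a CVAE conditions both the encoder and the decoder on a label vector, so at generation time the labels steer the output. The sketch below is a generic toy CVAE, not the paper's architecture; the wavetable length (600), condition size (3), and layer widths are illustrative assumptions only.

```python
import torch
import torch.nn as nn

class TinyCVAE(nn.Module):
    """Minimal conditional VAE: encoder and decoder both see the condition."""
    def __init__(self, wave_len=600, cond_dim=3, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(wave_len + cond_dim, 128), nn.ReLU())
        self.to_mu = nn.Linear(128, latent_dim)
        self.to_logvar = nn.Linear(128, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + cond_dim, 128), nn.ReLU(),
            nn.Linear(128, wave_len), nn.Tanh(),  # wavetable samples in [-1, 1]
        )

    def forward(self, x, cond):
        h = self.encoder(torch.cat([x, cond], dim=-1))
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterization trick: sample z while keeping gradients
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.decoder(torch.cat([z, cond], dim=-1)), mu, logvar

wave = torch.randn(4, 600)   # a batch of single-cycle wavetables
cond = torch.rand(4, 3)      # e.g. continuous timbre labels (illustrative)
recon, mu, logvar = TinyCVAE()(wave, cond)
print(recon.shape)  # torch.Size([4, 600])
```

At inference the encoder is dropped: sample z from the prior, pick the desired condition values, and decode directly to a wavetable.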

## Directory Structure

```
.
├── conf                   <- hydra config data
├── data                   <- project data
├── src                    <- source code
│   ├── check              <- visualization of generated results
│   ├── dataio             <- Lightning datamodules
│   ├── models             <- Lightning models
│   ├── tools              <- utility tools
│   ├── utils.py           <- utility scripts
│   └── train.py           <- run training
├── torchscript            <- ckpt file
├── .gitignore             <- list of files ignored by git
├── requirements.txt       <- file for installing python dependencies
└── README.md
```

## Installation

### Creation of Virtual Environment

```shell
conda create --name <name> python=3.8.5 -y
conda activate <name>
```

### Install

```shell
pip install -r requirements.txt
```

## Usage

### Train

```shell
python ./src/train.py
```

### How to change settings

Parameters in various parts of the project can be changed by editing `conf/config.yaml`.
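As an illustration of the Hydra-style layout, a config of roughly this shape groups parameters by component; the key names below are hypothetical, not the repository's actual ones:

```yaml
# conf/config.yaml -- key names are illustrative only
train:
  max_epochs: 100
  batch_size: 32
model:
  latent_dim: 16
```

Hydra also allows overriding any key from the command line with dotted paths, e.g. `python ./src/train.py train.batch_size=64`, without editing the file.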

## Note

The dataset is downloaded automatically on the first run.

Switching between CPU and GPU is also determined automatically.
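Automatic device selection in PyTorch typically follows the pattern below; this is a common idiom rather than the repository's exact code, and the tensor shape is only an example.

```python
import torch

# Use the GPU when CUDA is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Model parameters and input tensors must be moved to the chosen device.
model_input = torch.zeros(1, 600).to(device)
print(device.type)
```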

## License

"WavetableCVAE" is licensed under the CC BY-NC 4.0 license.

## Acknowledgment

This research was supported by the 12th Cybozu Labo Youth.