This project implements a Conditional Variational Autoencoder (CVAE) to generate shapes conditioned on emotions, using the EmoSet dataset. It explores the intersection of emotion recognition and generative models to create visual representations based on emotional input.
- Python 3.6+
- PyTorch
- torchvision
- PIL
- matplotlib
See requirements.txt for detailed dependencies.
- Clone the repository:
git clone https://github.com/chirazedrine/generative-design-with-a-given-emotion-using-cvae.git
- Install the required packages:
pip install -r requirements.txt
Download the EmoSet dataset, unzip it, and place it in `./dataset/EmoSet-118K`.
Run the training script:
python train_model.py
Generate shapes based on emotions:
python generate_images.py
- dataset/: Dataset preparation scripts.
- models/: Contains the CVAE model definition.
- generation/: Script for generating images based on emotions.
- requirements.txt: Project dependencies.
This project is licensed under the MIT License - see the LICENSE file for details.
- Thanks to the creators of the EmoSet dataset for their valuable resource. This project is inspired by research on emotion recognition and generative models.
- Reference: Jingyuan Yang, Qirui Huang, Tingting Ding, Dani Lischinski, Danny Cohen-Or, and Hui Huang, "EmoSet: A Large-scale Visual Emotion Dataset with Rich Attributes," in Proceedings of the IEEE/CVF International Conference on Computer Vision, pp. 20383–20394, 2023.