# Data preparation for AFHQ-Cats experiments

The folder `AFHQ/prepare_data` contains the code to prepare data for the AFHQ-Cats experiments.

## Pre-trained CLIP model

We use the CLIP model to annotate the AFHQ-Cats data via prompts that encode the controlling attributes, so first install CLIP as a Python package:

```bash
pip install git+https://github.com/openai/CLIP.git
```

## Data generation

Generate 10k images from StyleGAN2-ADA together with their latent variables (both `w` and `z`):

```bash
bash scripts/run_gen_batch.sh
```
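For reference, here is a minimal sketch of what this batch-generation step amounts to, assuming the official StyleGAN2-ADA (PyTorch) codebase is on your `PYTHONPATH` and a pre-trained AFHQ-Cats checkpoint is available. The checkpoint path and output filenames are placeholders; the actual logic lives in `scripts/run_gen_batch.sh`:

```python
# Sketch: sample z, map it to w, synthesize an image, and save all three.
# Assumes the stylegan2-ada-pytorch repo (dnnlib, legacy) is importable.
import numpy as np
import PIL.Image
import torch

import dnnlib
import legacy

NETWORK_PKL = 'afhqcat.pkl'  # placeholder path to the AFHQ-Cats checkpoint
device = torch.device('cuda')

with dnnlib.util.open_url(NETWORK_PKL) as f:
    G = legacy.load_network_pkl(f)['G_ema'].to(device)

for seed in range(10000):
    # Sample z, then map to w (c=None since the model is unconditional).
    z = torch.from_numpy(np.random.RandomState(seed).randn(1, G.z_dim)).to(device)
    w = G.mapping(z, None, truncation_psi=0.7)
    img = G.synthesis(w, noise_mode='const')
    # Convert from [-1, 1] floats to uint8 RGB and save image + latents.
    img = (img.permute(0, 2, 3, 1) * 127.5 + 128).clamp(0, 255).to(torch.uint8)
    PIL.Image.fromarray(img[0].cpu().numpy(), 'RGB').save(f'img_{seed:05d}.png')
    np.savez(f'latent_{seed:05d}.npz', z=z.cpu().numpy(), w=w.cpu().numpy())
```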

Use the pre-trained CLIP model to annotate the generated images:

```bash
bash scripts/run_clip_labeling.sh
```
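Under the hood, this amounts to zero-shot labeling with attribute prompts. Below is a hedged sketch for a single binary attribute; the prompt pair and filename are illustrative, not the repository's exact prompts:

```python
# Sketch: label one image for one attribute by comparing CLIP similarities
# between the image and a pair of attribute prompts.
import clip
import torch
from PIL import Image

device = 'cuda' if torch.cuda.is_available() else 'cpu'
model, preprocess = clip.load('ViT-B/32', device=device)

# Hypothetical prompt pair; substitute the prompts used for your attribute.
prompts = ['a photo of a cat with open eyes',
           'a photo of a cat with closed eyes']
text = clip.tokenize(prompts).to(device)

image = preprocess(Image.open('img_00000.png')).unsqueeze(0).to(device)
with torch.no_grad():
    logits_per_image, _ = model(image, text)
    probs = logits_per_image.softmax(dim=-1)

label = probs.argmax(dim=-1).item()  # 0 or 1 for this attribute
```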

The resulting pairs of latent variables and labels will be used to train latent classifiers.
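As a rough illustration, such a latent classifier can be a small MLP over the `w` latents; the architecture and training loop below are assumptions for illustration, not the repository's exact setup:

```python
# Sketch: train a simple MLP classifier on (w latent, CLIP label) pairs.
import torch
import torch.nn as nn

class LatentClassifier(nn.Module):
    def __init__(self, w_dim=512, num_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(w_dim, 256), nn.ReLU(),
            nn.Linear(256, num_classes))

    def forward(self, w):
        return self.net(w)

def train(w_latents, labels, epochs=10):
    # w_latents: (N, w_dim) float tensor; labels: (N,) long tensor.
    clf = LatentClassifier(w_dim=w_latents.shape[1])
    opt = torch.optim.Adam(clf.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(clf(w_latents), labels)
        loss.backward()
        opt.step()
    return clf
```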