
🚀 Diffusion models are making the headlines as a new generation of powerful generative models.

However, much of the ongoing research focuses on solutions that are highly specific and require large computational resources for training.

🔰 DiffusionFastForward offers a general template for diffusion models for images that can be a starting point for understanding and researching diffusion-based generative models.

  • PyTorch Lightning to enable easy training!
  • 💸 You can run all experiments online on Google Colab - no need for your own GPU machine!
  • 🔎 Examples for both low-resolution and high-resolution data!
  • ⛺ Examples of latent diffusion!
  • 🎨 Examples of image translation with diffusion!

The code structure is simple, so that you can easily customize it to your own applications.

🚧 Disclaimer: This repository does not provide any model weights. The purpose of this software is to enable training new weights on previously unexplored types of data.

Contents

There are three elements integrated into this project:

  • 💻 Code
  • 💡 Notes (in notes directory)
  • 📺 Video Course (released on YouTube)

💻 Code

This repository offers a starting point for training diffusion models on new types of data. It can serve as a baseline that can be developed into more robust solutions tailored to the specific generative task at hand. A minimal training sketch follows the notebook list below.

It includes notebooks that can be run stand-alone:

  1. Open in Colab 01-Diffusion-Sandbox - visualizations of the diffusion process
  2. Open in Colab 02-Pixel-Diffusion - basic diffusion suitable for low-resolution data
  3. Open in Colab 03-Conditional-Pixel-Diffusion - image translation with diffusion for low-resolution data
  4. Open in Colab 04-Latent-Diffusion - latent diffusion suitable for high-resolution data
  5. Open in Colab 05-Conditional-Latent-Diffusion - image translation with latent diffusion
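
Each notebook wraps the model as a PyTorch Lightning LightningModule, so training on your own data boils down to building a DataLoader and calling Trainer.fit. Below is a minimal sketch of that workflow; the PixelDiffusionModule import and its constructor argument are hypothetical placeholders for the modules defined in the notebooks (not the repository's exact API), and CIFAR-10 upscaled to 64x64 simply stands in for your own dataset.

```python
import pytorch_lightning as pl
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Hypothetical placeholder for a diffusion LightningModule like the ones
# built in the notebooks (e.g. a pixel-space model); not the exact API.
from diffusion_module import PixelDiffusionModule

transform = transforms.Compose([
    transforms.Resize(64),                                    # match the 64x64 pixel-diffusion example
    transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),   # scale images to [-1, 1]
])
train_set = datasets.CIFAR10("data/", train=True, download=True, transform=transform)
train_loader = DataLoader(train_set, batch_size=64, shuffle=True, num_workers=2)

model = PixelDiffusionModule(image_size=64)                   # hypothetical constructor
trainer = pl.Trainer(max_epochs=100, accelerator="auto", devices=1)
trainer.fit(model, train_loader)
```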

Dependencies

Assuming torch and torchvision are installed:

pip install pytorch-lightning==1.9.3 diffusers einops
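
A quick sanity check that the pinned dependencies import correctly (plain Python, nothing repository-specific):

```python
import torch, torchvision
import pytorch_lightning as pl
import diffusers, einops

# Print versions to confirm the environment matches the pinned dependencies.
print("torch:", torch.__version__)
print("torchvision:", torchvision.__version__)
print("pytorch-lightning:", pl.__version__)   # expected 1.9.3
print("diffusers:", diffusers.__version__)
print("einops:", einops.__version__)
```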

💡 Notes

Short summary notes are released as part of this repository; their content overlaps with that of the notebooks!

  1. 01-Diffusion-Theory - visualizations of the diffusion process
  2. 02-Pixel-Diffusion - basic diffusion suitable for low-resolution data
  3. 03-Conditional-Pixel-Diffusion - image translation with diffusion for low-resolution data
  4. 04-Latent-Diffusion - latent diffusion suitable for high-resolution data
  5. 05-Conditional-Latent-Diffusion - image translation with latent diffusion

📺 Video Course (released on YouTube)

The course is released on YouTube and provides an extension to this repository. Some additional topics are covered, such as seminal papers and ongoing research work.


The current plan for the video course (links added upon publishing):


💰 Training Cost

Most examples use one of two model types, each trainable within about a day:

PixelDiffusion (good for small images 👶) - appropriate for low-resolution data; direct diffusion in pixel space. A minimal training-step sketch follows below.

| Image Resolution | Training Time | Memory Usage |
| :---: | :---: | :---: |
| 64x64 | ~10 hrs | ~4 GB |

[Sample outputs: out-pixel-conditional-1, out-pixel-conditional-2, out-pixel-conditional-3]
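
The training objective behind the pixel-space examples is the standard DDPM one: corrupt a clean image with Gaussian noise according to a variance schedule and train a network to predict that noise. A minimal sketch of a single training step, assuming a generic noise-prediction network eps_model(x_t, t) such as a U-Net (the linear schedule and hyperparameters are common defaults, not the repository's exact settings):

```python
import torch
import torch.nn.functional as F

T = 1000                                   # number of diffusion steps
betas = torch.linspace(1e-4, 0.02, T)      # linear noise schedule
alphas_bar = torch.cumprod(1.0 - betas, dim=0)

def training_step(eps_model, x0):
    """One DDPM training step in pixel space: predict the added noise."""
    b = x0.shape[0]
    t = torch.randint(0, T, (b,), device=x0.device)          # random timestep per image
    eps = torch.randn_like(x0)                                # Gaussian noise
    a_bar = alphas_bar.to(x0.device)[t].view(b, 1, 1, 1)
    x_t = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * eps      # forward (noising) process
    eps_pred = eps_model(x_t, t)                              # network predicts the noise
    return F.mse_loss(eps_pred, eps)
```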

LatentDiffusion (good for large images 🐋) - useful for high-resolution data; diffusion in a compressed latent space. A minimal encode/diffuse/decode sketch follows below.

| Image Resolution | Training Time | Memory Usage |
| :---: | :---: | :---: |
| 256x256 | ~20 hrs | ~5 GB |

[Sample outputs: out-latent-conditional-1, out-latent-conditional-2, out-latent-conditional-3]
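
The latent examples follow the standard latent-diffusion recipe: compress images with a pretrained autoencoder, run the same denoising diffusion in the compressed latent space, then decode the result. A minimal sketch using the AutoencoderKL class from diffusers; the checkpoint name and the 0.18215 scaling factor follow common Stable Diffusion VAE usage and are assumptions here, not something this repository prescribes.

```python
import torch
from diffusers import AutoencoderKL

# Pretrained KL autoencoder; checkpoint chosen purely for illustration.
vae = AutoencoderKL.from_pretrained("stabilityai/sd-vae-ft-ema").eval()
scale = 0.18215  # conventional latent scaling for this VAE family (assumption)

@torch.no_grad()
def to_latent(x):
    """Encode images in [-1, 1] to compressed latents (256x256x3 -> 32x32x4)."""
    return vae.encode(x).latent_dist.sample() * scale

@torch.no_grad()
def from_latent(z):
    """Decode latents back to image space."""
    return vae.decode(z / scale).sample

# Diffusion (training and sampling) then operates on the latents exactly as in
# the pixel-space example above, just with far smaller tensors.
```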


Other Software Resources

There are many great projects focused on diffusion generative models. However, most of them involve somewhat complex frameworks that are not always suitable for learning and preliminary experimentation.

Other Educational Resources

Some excellent materials have already been published on the topic! Huge respect to all of the creators 🙏 - check out their work, and show your support if it has helped you!

Blog Posts

🔮 Explanation Videos

🔧 Implementation Videos

🎓 Video Lectures/Tutorials