# FIRST - Frame Interpolation Refined with Stable diffusion via control neT

> This repository was archived by its owner on Oct 17, 2023 and is now read-only.

## Problem Statement

Traditional frame-interpolation models are trained mainly on the motion and color of pixels. As such, they work best when consecutive frames differ relatively little. When the difference is too significant, the estimation becomes unreliable and the algorithm essentially starts guessing, producing the ghosting artifacts that people usually associate with frame interpolation.

## Proposed Solution

Stable Diffusion, on the other hand, is a generative model capable of creating coherent pixels rather than artifacts. By combining ControlNet's newly introduced `reference_only` module, which, as the name implies, steers the generated result toward the provided reference images, with Stable Diffusion's img2img feature, the interpolated frame can be refined into a more cohesive result.
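
As a rough illustration, refining one frame boils down to a single img2img request that carries the two neighboring original frames as `reference_only` ControlNet units. The following is a minimal sketch, not the repository's actual code; it assumes a local Automatic1111 instance started with `--api` and the ControlNet extension installed, and field names such as `input_image` follow the extension's API at the time of writing:

```python
import base64
import requests

URL = "http://127.0.0.1:7860"  # default Automatic1111 port (see step 7 below)


def b64(path: str) -> str:
    """Read an image file and return it base64-encoded."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")


def refine(interpolated: str, prev_frame: str, next_frame: str, out_path: str) -> None:
    """Refine one interpolated frame via img2img, using the two
    neighboring original frames as reference_only ControlNet units."""
    payload = {
        "init_images": [b64(interpolated)],  # start img2img from the RIFE output
        "denoising_strength": 0.5,           # illustrative value
        "prompt": "",
        "steps": 20,
        "alwayson_scripts": {
            "controlnet": {
                "args": [
                    # reference_only is a preprocessor; no ControlNet model is needed
                    {"input_image": b64(prev_frame), "module": "reference_only", "model": "None"},
                    {"input_image": b64(next_frame), "module": "reference_only", "model": "None"},
                ]
            }
        },
    }
    r = requests.post(f"{URL}/sdapi/v1/img2img", json=payload)
    r.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(base64.b64decode(r.json()["images"][0]))
```

Because two ControlNet units are sent at once, the WebUI must allow at least two simultaneous models, which is exactly what step 1 of the instructions below configures.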

## Sample Images

The following base interpolated frames were generated using Flowframes with the RIFE model.

*(Comparison images: Previous Frame | RIFE | F.I.R.S.T | Next Frame)*

## Sample Videos

*(Video: RIFE)*

*(Video: F.I.R.S.T)*

The above content was used solely for research purposes.

## Code

A simple Python GUI that streamlines this process.

### Prerequisite

- [Automatic1111 Stable Diffusion WebUI](https://github.com/AUTOMATIC1111/stable-diffusion-webui) with the [ControlNet extension](https://github.com/Mikubill/sd-webui-controlnet) installed

### How to Use

1. In the settings of Automatic1111, set `Multi ControlNet: Max models amount` to 2 or above
   - It is recommended to set `Model cache size` to 2 as well
2. Launch Automatic1111 with the API enabled
   - Open `webui-user.bat` and add `--api` after `COMMANDLINE_ARGS=`
   - Set your checkpoint accordingly (Realistic / Anime)
3. (Optionally) Edit `parameters.py` to change the Stable Diffusion settings, such as steps or sampler
4. Launch `main.py` to open the GUI
   - On Windows, you can simply use `launch.bat`
5. Enter the path containing the frames
   - The path should contain both the original and the interpolated frames in sequence (the default behavior of Flowframes when the export is set to Image Sequence)
   - Only 2x interpolation is supported, meaning odd-numbered files should be original frames while even-numbered files should be interpolated frames
6. Enter the path to store the outputs
   - A path different from the source is recommended
7. Enter the port
   - The default is 7860
8. Press Run
9. Check the console for progress
10. Use FFmpeg to merge the frames into a video (see the sketch after this list)
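
As an example of the final step, merging the frames could look like the following sketch. The frame-name pattern, frame rate, and codec choices are assumptions; adjust them to match your own sequence:

```python
# A sketch of step 10: merge the refined frames with FFmpeg.
# Assumes frames are named frame_000001.png, frame_000002.png, ...
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-framerate", "60",      # output frame rate (2x the source after interpolation)
        "-i", "frame_%06d.png",  # input frame pattern; adjust to your file names
        "-c:v", "libx264",       # common H.264 encoder
        "-pix_fmt", "yuv420p",   # pixel format most players accept
        "output.mp4",
    ],
    check=True,
)
```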
