
VapourSynth Basics


For those who aren't familiar with the absolute basics of Python or VapourSynth, I'd recommend skimming through some introductory tutorials first.

The bulk of your image and video processing will be done in .vpy files. You should keep these in the "VapourSynthScripts" folder for convenience.

You can open those .vpy files in the VapourSynth Editor. The basic syntax is something like this:

#The file is "read" from top to bottom.
#But Python will ignore any line that starts with a "#"

#Tell Python to import VapourSynth:
import vapoursynth as vs
core = vs.get_core()
#now, "core" represents the core functions of VapourSynth
#and "vs" represents extra stuff, like formats

#Import the mvsfunc and muvsfunc scripts (calling them mvf and muf),
#plus the VSGAN module used further down:
import mvsfunc as mvf
import muvsfunc as muf
import VSGAN

#Get your image/video:
clip = core.ffms2.Source(r"C:/some folder/some video")
#"clip" now represents your media at C:/some folder/some video. 
#When you import or create anything, you can call it anything you want.

#"Core" functions, and .dll files that are imported by VapourSynth, 
#are prefixed with "core," and then their namespace
#This will usually be in the documentation/wiki for whatever function you're using.
clip = core.fmtc.bitdepth(clip, bits=16)
clip = core.knlm.KNLMeansCL(clip, d=2, a=3, h=0.8)
#In this case, we convert the clip to 16 bits, then denoise it with KNLMeansCL.
#Arguments are passed by name: the bit depth in fmtc, for example, is set with "bits=16".

#Upscale the clip
upscaled = VSGAN.Start(clip=clip, model=r"""ESRGANModels/ad_test_tf.pth""", scale=4, old_arch=False)
#You can assign the result to a differently named clip if you want;
#"clip" will still be available for later use.

#Downscale the upscaled clip to 1080p, then convert it to 10 bits with 4:2:0 chroma subsampling:
upscaled = muf.SSIM_downsample(upscaled, w=1920, h=1080)
upscaled = mvf.ToYUV(upscaled, css="420", depth=10)

#output the clip
upscaled.set_output()

You can check your script (in which case errors will be spit out at the bottom of the editor) or preview it.


Pan through your video or sequences of images with the arrow keys or by clicking on the bar. If you have any kind of before/after preview built into your script, tapping the left and right arrow keys is a great way to compare images. You can also zoom into your media with integer scaling.
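
If your script doesn't already have a before/after preview, one simple way to build one is to interleave the source and the processed clip, so each arrow-key press flips between "before" and "after". The sketch below just reuses the clip names from the example above; std.Interleave needs both clips to share the same format and dimensions, which is why the source is resized/converted first (depending on your source format, resize may also need a matrix_s argument):

#Resize/convert the source so it matches the processed clip
before = core.resize.Spline36(clip, width=upscaled.width, height=upscaled.height, format=upscaled.format.id)
#Alternate frames: before, after, before, after...
comparison = core.std.Interleave([before, upscaled])
comparison.set_output()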

If you don't want to use VSPipe directly, you can queue up video encoding jobs in the editor. If you're batch processing images, you can use the editor's "benchmark" function instead, since imwri writes images as frames are requested, without piping anything out of VapourSynth.
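
As a sketch of that image-writing workflow (the RGB conversion, PNG format, and output path here are just examples), append imwri.Write to the end of a script; every frame that gets requested, for instance by the benchmark function, is then written to disk:

#Convert to RGB and let imwri write each requested frame to disk.
#"%06d" in the filename is replaced with the frame number.
rgb = mvf.ToRGB(upscaled, depth=16)
written = core.imwri.Write(rgb, imgformat="PNG", filename=r"C:/some folder/output_%06d.png")
written.set_output()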

Other notes:

  • Python is whitespace-sensitive: indentation with tabs/spaces is part of the syntax, so keep it consistent.

  • Generally, processing at the highest practical bit depth with as few format conversions as possible is best. For most filters, that means working in 16 bits; a minimal sketch of that round trip follows this list.
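
As a minimal sketch (the source filter, denoiser, and final 8-bit output are just placeholders), convert up once before filtering and dither back down a single time at the end:

#Convert to 16 bits once, do all the filtering there, then dither back down.
clip = core.ffms2.Source(r"C:/some folder/some video")
clip = core.fmtc.bitdepth(clip, bits=16)
clip = core.knlm.KNLMeansCL(clip, d=2, a=3, h=0.8)
clip = mvf.Depth(clip, depth=8)
clip.set_output()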