MODNet: Is a Green Screen Really Necessary for Real-Time Portrait Matting?

arXiv Preprint | Supplementary Video

WebCam Video Demo [Offline][Colab] | Custom Video Demo [Offline] | Image Demo [WebGUI][Colab]

This is the official project of our paper Is a Green Screen Really Necessary for Real-Time Portrait Matting?
MODNet is a trimap-free model for portrait matting in real time under changing scenes.

News

  • [Dec 25 2020] Merry Christmas! 🎄 Release Custom Video Matting Demo [Offline] for user videos.
  • [Dec 15 2020] A cool WebGUI for image matting based on MODNet is built by the Gradio team!
  • [Dec 10 2020] Release WebCam Video Matting Demo [Offline][Colab] and Image Matting Demo [Colab].
  • [Nov 24 2020] Release arXiv Preprint and Supplementary Video.

Video Matting Demo

We provide two real-time portrait video matting demos based on a WebCam. When using the demos, you can move the WebCam around freely. If you are on Ubuntu, we recommend trying the offline demo for a higher frame rate; otherwise, you can use the online Colab demo.
We also provide an offline demo that lets you process your own videos.
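
For reference, here is a minimal sketch of what a frame-by-frame WebCam matting loop can look like in PyTorch. It assumes the repository exposes the model as src.models.modnet.MODNet, that a pre-trained checkpoint is available locally (the filename below is a placeholder), and that the last output of the forward pass is the predicted alpha matte; please consult the offline demo code for the exact interface.

import cv2
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F

from src.models.modnet import MODNet  # assumed module path in this repository

CKPT_PATH = 'pretrained/modnet_webcam_portrait_matting.ckpt'  # placeholder checkpoint name

# Assumption: the pre-trained weights were saved from a DataParallel-wrapped model.
modnet = nn.DataParallel(MODNet(backbone_pretrained=False))
modnet.load_state_dict(torch.load(CKPT_PATH, map_location='cpu'))
modnet.eval()

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]

    # Downscale to sides divisible by 32 (encoder requirement) to keep inference fast.
    nh, nw = 512, max(512 * w // h // 32 * 32, 32)
    rgb = cv2.cvtColor(cv2.resize(frame, (nw, nh)), cv2.COLOR_BGR2RGB)
    im = torch.from_numpy(rgb).permute(2, 0, 1)[None].float() / 255.0
    im = (im - 0.5) / 0.5  # normalize RGB input to [-1, 1]

    with torch.no_grad():
        _, _, matte = modnet(im, True)  # assumption: last output is the alpha matte

    matte = F.interpolate(matte, size=(h, w), mode='area')[0, 0].numpy()[..., None]

    # Composite the foreground over a solid green background for visualization.
    green = np.zeros_like(frame)
    green[:, :, 1] = 255
    comp = (frame * matte + green * (1.0 - matte)).astype(np.uint8)

    cv2.imshow('MODNet matting', comp)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()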

Image Matting Demo

We provide an online Colab demo for portrait image matting.
It allows you to upload portrait images and predict/visualize/download the alpha mattes.

You can also use this WebGUI (hosted on Gradio) for portrait image matting from your browser without any code!
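
If you prefer to run image matting locally rather than in the browser, the sketch below outlines single-image inference with PyTorch. The import path, the checkpoint filename, and the assumption that the matte is the last output of the forward pass reflect our reading of the repository layout and may need adjusting to match the actual demo code.

import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
from PIL import Image
from torchvision import transforms

from src.models.modnet import MODNet  # assumed module path in this repository

CKPT_PATH = 'pretrained/modnet_photographic_portrait_matting.ckpt'  # placeholder checkpoint name
REF_SIZE = 512  # target size of the longer image side at inference time (assumption)

# Normalize RGB input to [-1, 1].
to_tensor = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
])

def predict_matte(image_path, matte_path):
    # Assumption: the pre-trained weights were saved from a DataParallel-wrapped model.
    modnet = nn.DataParallel(MODNet(backbone_pretrained=False))
    modnet.load_state_dict(torch.load(CKPT_PATH, map_location='cpu'))
    modnet.eval()

    im = Image.open(image_path).convert('RGB')
    im_tensor = to_tensor(im)[None]  # 1 x 3 x H x W
    _, _, h, w = im_tensor.shape

    # Resize so both sides are multiples of 32, which the encoder requires.
    scale = REF_SIZE / max(h, w)
    nh = max(int(h * scale) // 32 * 32, 32)
    nw = max(int(w * scale) // 32 * 32, 32)
    im_small = F.interpolate(im_tensor, size=(nh, nw), mode='area')

    with torch.no_grad():
        _, _, matte = modnet(im_small, True)  # assumption: last output is the alpha matte

    # Upsample the matte back to the original resolution and save it as a grayscale PNG.
    matte = F.interpolate(matte, size=(h, w), mode='area')
    matte = (matte[0, 0].numpy() * 255).astype(np.uint8)
    Image.fromarray(matte).save(matte_path)

if __name__ == '__main__':
    predict_matte('portrait.jpg', 'portrait_matte.png')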

TO DO

  • Release training code (scheduled in Jan. 2021)
  • Release PPM-100 validation benchmark (scheduled in Feb. 2021)

License

This project (code, pre-trained models, demos, etc.) is released under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0) license.

Acknowledgement

We thank City University of Hong Kong and SenseTime for their support of this project.
We thank the Gradio team for their contributions to building the demos.

Citation

If this work helps your research, please consider citing:

@article{MODNet,
  author  = {Zhanghan Ke and Kaican Li and Yurou Zhou and Qiuhua Wu and Xiangyu Mao and Qiong Yan and Rynson W.H. Lau},
  title   = {Is a Green Screen Really Necessary for Real-Time Portrait Matting?},
  journal = {ArXiv},
  volume  = {abs/2011.11961},
  year    = {2020},
}

Contact

This project is currently maintained by Zhanghan Ke (@ZHKKKe).
If you have any questions, please feel free to contact kezhanghan@outlook.com.
