
NSFW_DETECTION

ABOUT:

Checks whether an input video contains nudity or any other NSFW content and labels the processed frames.

DEPENDENCIES:

TensorFlow, Python, NumPy, OpenCV

Model:

A small AlexNet-style convolutional network.
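The exact architecture is defined in the repository's model code. As a rough, hedged illustration only, a scaled-down AlexNet-style classifier with three output labels (matching NOT NUDE, NUDE, UNKNOWN in the OUTPUT section below) could look like this; the layer counts and filter sizes here are assumptions, not the configuration of the shipped NSFWmodel-8 checkpoint:

```python
# Illustrative sketch of a "small AlexNet" style classifier.
# Layer sizes are assumptions, not the actual NSFWmodel-8 configuration.
import tensorflow as tf

def small_alexnet(input_shape=(224, 224, 3), num_classes=3):
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(64, 11, strides=4, activation="relu",
                               input_shape=input_shape),
        tf.keras.layers.MaxPooling2D(3, strides=2),
        tf.keras.layers.Conv2D(128, 5, padding="same", activation="relu"),
        tf.keras.layers.MaxPooling2D(3, strides=2),
        tf.keras.layers.Conv2D(256, 3, padding="same", activation="relu"),
        tf.keras.layers.MaxPooling2D(3, strides=2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(512, activation="relu"),
        tf.keras.layers.Dropout(0.5),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
```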

Usage:

1. Unzip model8.

2. Open constants.py.

3. Change MODEL_SAVE_PATH to /path/to/model8/NSFWmodel-8.

4. Change VIDEO_SAVE_PATH to /path/to/your/video/folder (a short constants.py sketch follows these steps).

5. Open a terminal.

6. Run: python NSFW_Predictor.py
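
After steps 3 and 4, constants.py should end up looking roughly like the sketch below. This is only a reference for the two settings mentioned above; the real file may define additional constants, and both paths are placeholders you must replace with your own locations.

```python
# constants.py -- rough sketch; the real file may contain more settings.
# Both paths are placeholders: point MODEL_SAVE_PATH at the unzipped
# model8 checkpoint prefix and VIDEO_SAVE_PATH at your video folder.
MODEL_SAVE_PATH = "/path/to/model8/NSFWmodel-8"
VIDEO_SAVE_PATH = "/path/to/your/video/folder"
```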

OUTPUT:

1. The model checks every 50th frame for NSFW content (see the loop sketch after this list).

2. Shows each processed frame with its label. # LABELS = [NOT NUDE, NUDE, UNKNOWN]

3. Prints a per-video summary showing VIDEO_NUMBER, NAME, NUMBER_OF_PROCESSED_FRAMES, NSFW_SCORE.

4. The NSFW score is calculated as NSFW_Count / Total_frames. # 1.0 means 100% nudity, 0.0 means 0% nudity
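
Putting these items together, the per-video sampling and scoring loop behaves roughly like the sketch below. `classify_frame` is a placeholder for the model's per-frame prediction (it is not a function exported by NSFW_Predictor.py), and Total_frames is interpreted here as the number of processed frames; the actual script may differ in detail.

```python
# Hedged sketch of the frame-sampling and scoring logic described above.
import cv2

LABELS = ["NOT NUDE", "NUDE", "UNKNOWN"]
FRAME_STEP = 50  # the README states every 50th frame is checked

def score_video(video_path, classify_frame):
    cap = cv2.VideoCapture(video_path)
    processed = 0   # frames actually run through the model
    nsfw_count = 0  # frames labeled NUDE
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % FRAME_STEP == 0:
            label = classify_frame(frame)  # returns one of LABELS
            processed += 1
            if label == "NUDE":
                nsfw_count += 1
        frame_idx += 1
    cap.release()
    # NSFW score = NSFW_Count / Total_frames
    # (1.0 means 100% nudity, 0.0 means 0% nudity)
    return nsfw_count / processed if processed else 0.0
```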

References:

http://blog.clarifai.com/what-convolutional-neural-networks-see-at-when-they-see-nudity/

https://blog.algorithmia.com/improving-nudity-detection-nsfw-image-recognition/
