ImageDetection

This project detects harmful objects in images and determines whether an image is offensive for children. The algorithm searches images for:

1. Pornography, nudity, or other inappropriate human content.
2. Drugs, including cannabis plants.
3. Weapons, such as knives and guns.

The object recognition is performed by an image recognition model trained on a self-built dataset of train and test images with annotations. Written by Yocheved N. and Hila S.
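The code itself is not shown on this page, so the following is only a minimal sketch of what inference with such a trained detector could look like, assuming a PyTorch/torchvision Faster R-CNN model. The checkpoint path, label map, and confidence threshold below are hypothetical and not taken from this project.

```python
# Hypothetical inference sketch: load a trained detector and flag an image as
# offensive if any harmful-object detection exceeds a confidence threshold.
# The label map, checkpoint path, and threshold are illustrative only.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

HARMFUL_CLASSES = {1: "nudity", 2: "drugs", 3: "weapon"}  # hypothetical label map
SCORE_THRESHOLD = 0.5  # hypothetical confidence cutoff


def load_detector(checkpoint_path: str):
    # Faster R-CNN is one common choice for custom object detection;
    # the architecture actually used in this repository may differ.
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(
        weights=None, num_classes=len(HARMFUL_CLASSES) + 1  # +1 for background
    )
    model.load_state_dict(torch.load(checkpoint_path, map_location="cpu"))
    model.eval()
    return model


def is_offensive(model, image_path: str) -> bool:
    # Convert the image to a float tensor in [0, 1], as detection models expect.
    image = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        detections = model([image])[0]
    # Flag the image if any harmful class is detected with enough confidence.
    return any(
        score.item() >= SCORE_THRESHOLD and label.item() in HARMFUL_CLASSES
        for label, score in zip(detections["labels"], detections["scores"])
    )
```

In a design like this, the per-class detections could also be reported individually (e.g., which harmful object was found and where), rather than collapsed into a single offensive/safe verdict.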
