This project detects abusive images so they can be flagged as unsafe for children. The algorithm searches images for:

1. Porn, nudity, or other inappropriate human content.
2. Drugs, including weed plants.
3. Weapons, such as knives and guns.

The object recognition was built by training an image recognition model on a self-built dataset of train and test images with annotations.

Written by Yocheved N. and Hila S.
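The detection logic described above (per-category recognition feeding a child-safety verdict) could be sketched as follows. This is a minimal illustration only: the category names, thresholds, and function names are assumptions, not the project's actual code.

```python
# Illustrative sketch: combining per-category detector confidences into a
# child-safety verdict. Categories mirror the three groups listed above;
# thresholds and names are hypothetical, not the project's implementation.

UNSAFE_CATEGORIES = {
    "nudity": 0.5,   # porn, nudity, other inappropriate human content
    "drugs": 0.5,    # drugs, including weed plants
    "weapons": 0.5,  # knives, guns, etc.
}

def is_offensive_for_children(scores):
    """Given {category: confidence} scores from the trained model,
    return (verdict, flagged_categories)."""
    flagged = [cat for cat, threshold in UNSAFE_CATEGORIES.items()
               if scores.get(cat, 0.0) >= threshold]
    return (len(flagged) > 0, flagged)

# Example: the detector reports a high 'weapons' confidence.
verdict, flagged = is_offensive_for_children({"weapons": 0.92, "drugs": 0.1})
```

In this sketch an image is flagged as soon as any one category crosses its threshold, which keeps the decision conservative for a child-safety use case.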
Hila1/ImageDetection
About
This project detects harmful objects in images and determines whether an image is offensive for children.