Collection of scripts to aggregate image data for the purposes of training an NSFW Image Classifier
Updated Jan 21, 2024 · Shell
A free, open source, and privacy-focused browser extension to block “not safe for work” content built using TypeScript and TensorFlow.js.
Anti-spam/NSFW Telegram bot written in Python with Pyrogram.
Keras implementation of the Yahoo Open-NSFW model
REST API written in Python to classify NSFW images.
NudeNet: NSFW Object Detection for TFJS and NodeJS
A .NET image and video classifier used to identify explicit/pornographic content written in C#.
An NSFW Image Classification REST API for effortless Content Moderation built with Node.js, Tensorflow, and Parse Server
✅ CODAR is a framework built using PyTorch to analyze posts (text + media) and predict cyberbullying and offensive content.
[Android] NSFW (nude content) detector using Firebase AutoML and TensorFlow Lite
A deep learning implementation for identifying NSFW images.
Group Guardian is a Telegram bot for admins to maintain a safe community.
An NSFW Image Classifier including an Automation Engine for fast deletion & moderation built with Node.js, TensorFlow, and Parse Server
A repository dedicated to building a classifier to detect NSFW images and videos.
An internal hackathon for Kavach 23
A JavaScript image classifier used to identify explicit/pornographic content written in TypeScript.
Catgirl Sorting AI
Simple drop-in API to determine whether an image is NSFW using TensorFlow
Python package to apply the Safety Checker from Stable Diffusion.
Source code of an iOS app for my Keras-based NSFW classifier.