A tool for detecting viruses and NSFW material in WARC files
A Python package that applies the Safety Checker from Stable Diffusion.
A containerized REST API for image and video classification, built on Hugging Face Transformers.
A Keras implementation of the Yahoo Open-NSFW model.
An anti-spam/NSFW Telegram bot written in Python with Pyrogram.
A comprehensive NSFW image detection npm package that quickly identifies and filters explicit content to help keep online platforms safe.
A free, open source, and privacy-focused browser extension to block “not safe for work” content built using TypeScript and TensorFlow.js.
[READ-ONLY] CLI tool that uses machine learning to detect nudity in images.
A REST API written in Python to classify NSFW images.
A simple drop-in API to determine whether an image is NSFW, using TensorFlow.
An NSFW.js implementation for images, GIFs, and video; client-side NSFW detection via TensorFlow.js.
A .NET image and video classifier used to identify explicit/pornographic content written in C#.
A collection of scripts to aggregate image data for training an NSFW image classifier.
An anti-NSFW project in Python using a pre-trained model.
A keyword-based anti-NSFW classifier for Twitter, with a suspicious twist 🧐
Remove adult content from Discord channels using artificial intelligence.
NudeNet: NSFW Object Detection for TFJS and NodeJS
A browser interface for the NudeNet classifier.
Recursively filter (remove) NSFW images in a directory, backing up every image beforehand.