A simple JavaScript / TypeScript library to get a list of adult words.
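As a minimal sketch of how such a word-list library is typically consumed, the snippet below checks text tokens against a set of flagged words. The entries in `adultWords` are placeholders, not the library's actual list, and `containsAdultWord` is a hypothetical helper name.

```javascript
// Placeholder word list; a real library would ship a much larger,
// curated set of entries.
const adultWords = new Set(["badword1", "badword2"]);

function containsAdultWord(text) {
  // Normalize to lowercase and split on non-word characters,
  // then test each token against the set.
  return text
    .toLowerCase()
    .split(/\W+/)
    .some((token) => adultWords.has(token));
}

console.log(containsAdultWord("this contains badword1")); // true
console.log(containsAdultWord("perfectly clean sentence")); // false
```

Set lookup keeps each token check O(1), so the scan stays linear in the length of the input text.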
Updated May 10, 2023 · JavaScript
SensiSafe is an API for detecting inappropriate content on the Internet, specialized in detecting pornographic content uploaded by users.
A Node.js service for classifying images.
A browser extension that lets you navigate the web in line with your Islamic values, protects your privacy, and reduces browsing distractions by automatically detecting and blurring "Haram" content.
Nudity detection with JavaScript and the HTML canvas element.
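Canvas-based detectors of this style commonly read pixel data via `getImageData` and apply a skin-tone heuristic. The sketch below shows that idea on a flat RGBA array; the RGB thresholds are illustrative values, not the library's actual tuning, and a real detector would combine the ratio with region analysis before flagging anything.

```javascript
// Heuristic skin-tone test on one RGB pixel. Thresholds are
// illustrative; real detectors tune them against labeled data.
function isSkinPixel(r, g, b) {
  return (
    r > 95 && g > 40 && b > 20 &&
    r > g && r > b &&
    Math.max(r, g, b) - Math.min(r, g, b) > 15 &&
    Math.abs(r - g) > 15
  );
}

// `data` is a flat RGBA array, e.g. ctx.getImageData(0, 0, w, h).data
// in the browser. Returns the fraction of pixels classified as skin.
function skinRatio(data) {
  let skin = 0;
  const total = data.length / 4;
  for (let i = 0; i < data.length; i += 4) {
    if (isSkinPixel(data[i], data[i + 1], data[i + 2])) skin++;
  }
  return total === 0 ? 0 : skin / total;
}

// Two pixels: one skin-toned, one dark gray.
const sample = new Uint8ClampedArray([200, 120, 100, 255, 10, 10, 10, 255]);
console.log(skinRatio(sample)); // 0.5
```

A high skin ratio alone is a weak signal (faces and hands also count), which is why this heuristic is usually only the first stage of a detector.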