A free, open-source, and privacy-focused browser extension to block “not safe for work” content, built with TypeScript and TensorFlow.js.
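The core pattern behind such an extension is straightforward: find the images on a page, run each one through a client-side classifier, and hide whatever the model flags. A minimal content-script sketch is below; classifyImage is a hypothetical stand-in for a TensorFlow.js model call (a concrete classifier example appears under the next entry), not the extension's actual code.

// Illustrative content-script sketch: scan <img> elements, classify each,
// and blur the ones flagged as NSFW until the user reveals them.
async function classifyImage(img: HTMLImageElement): Promise<boolean> {
  // Hypothetical placeholder: a real implementation would run a
  // TensorFlow.js model here and threshold its output probability.
  return false;
}

async function filterPage(): Promise<void> {
  const images = Array.from(document.querySelectorAll<HTMLImageElement>('img'));
  for (const img of images) {
    if (await classifyImage(img)) {
      // Blur rather than remove, so the user can still opt to view it.
      img.style.filter = 'blur(24px)';
    }
  }
}

void filterPage();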
A JavaScript image classifier, written in TypeScript, for identifying explicit/pornographic content.
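This description matches the nsfwjs library. Assuming that is the project meant here, it exposes a load()/classify() API; a minimal browser sketch (the element id "photo" is illustrative):

// Minimal nsfwjs usage sketch, assuming this entry refers to the nsfwjs library.
import * as nsfwjs from 'nsfwjs';

async function checkImage(): Promise<void> {
  const img = document.getElementById('photo') as HTMLImageElement;
  const model = await nsfwjs.load(); // downloads the default model
  const predictions = await model.classify(img);
  // Each prediction pairs a class name (e.g. 'Porn', 'Sexy', 'Hentai',
  // 'Neutral', 'Drawing') with a probability, ordered by descending probability.
  console.log(predictions[0].className, predictions[0].probability);
}

void checkImage();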
An NSFW image classifier that detects pornographic and similar content, with support for both API and gRPC calls.
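As a rough illustration of calling a classification service like this over HTTP (the endpoint URL, form field name, and response shape below are assumptions for illustration, not this project's documented API):

// Hypothetical client call to an image-classification service.
async function classifyViaApi(file: Blob): Promise<void> {
  const form = new FormData();
  form.append('image', file); // field name is an assumption
  const res = await fetch('http://localhost:8080/classify', {
    method: 'POST',
    body: form,
  });
  const result = await res.json();
  console.log(result); // e.g. per-class scores returned by the service
}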