This RocketchatApp validates uploaded images against the Microsoft PhotoDNA cloud service and quarantines those identified as child abuse images (child pornography or CSEM).
Updated Jun 17, 2024 · TypeScript
An API endpoint that detects cuss words.
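A cuss-word detection endpoint of this kind typically normalizes the input text, tokenizes it, and checks tokens against a blocklist. The sketch below illustrates that approach; the word list, normalization, and handler shape are illustrative assumptions, not the repository's actual code.

```typescript
// Placeholder blocklist; a real service would load a maintained word list.
const BLOCKLIST = new Set(["damn", "crap"]);

// Lowercase the text, split on non-letter runs, and flag any blocklisted token.
function containsCussWords(text: string): boolean {
  return text
    .toLowerCase()
    .split(/[^a-z]+/)
    .some((token) => BLOCKLIST.has(token));
}

// Hypothetical framework-agnostic handler for the detection endpoint.
function handleDetect(body: { text: string }): { flagged: boolean } {
  return { flagged: containsCussWords(body.text) };
}
```

Token-level matching avoids false positives from substrings (e.g. a blocklisted word embedded inside an innocent longer word), at the cost of missing deliberately obfuscated spellings.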
Set rules to target abusive posts in Discussions