SafeImage is a project that provides ways to classify image content as NSFW (Not Safe For Work).
The SafeImage module provides:
- Command-line utilities to classify images.
- Classification workers that consume images from a Redis queue.
- A REST web service API.
- Clone repository:
git clone https://github.com/arquivo/SafeImage.git
- Build Docker Image:
docker build -t arquivo/safeimage .
- Run Docker Container:
docker run -it arquivo/safeimage <command>
You need to have the Caffe framework installed on your system in order to use SafeImage.
- Install SafeImage API:
pip install git+https://github.com/arquivo/SafeImage.git
- Add to PYTHONPATH the directory where Caffe is installed:
export PYTHONPATH=$PYTHONPATH:/opt/caffe/python
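After setting PYTHONPATH, you can quickly verify that Python is able to locate the Caffe bindings. This is a minimal sketch; `/opt/caffe/python` is just the example path used above, so adjust it to your own install:

```python
import importlib.util
import sys

# Directory added to PYTHONPATH above (adjust to your Caffe install).
caffe_python_dir = "/opt/caffe/python"
if caffe_python_dir not in sys.path:
    sys.path.append(caffe_python_dir)

def caffe_available() -> bool:
    """Return True if the 'caffe' module can be found on the current path."""
    return importlib.util.find_spec("caffe") is not None

if not caffe_available():
    print("Caffe bindings not found; check your PYTHONPATH.")
```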
cli-safeimage-test-tool --help
cli-safeimage-indexing --help
nsfw-resnet-worker --help
nsfw-squeezenet-worker --help
uwsgi uwsgi.ini
Send a POST request to the /safeimage endpoint with the following JSON body:

POST /safeimage
{
    "image": image_64
}

Replace 'image_64' with the base64-encoded image bytes.
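As an illustration, the request body can be built and sent from Python using only the standard library. This is a hedged sketch: the `http://localhost:5000` host and port are assumptions, not documented by this README, so point `url` at your own deployment:

```python
import base64
import json
import urllib.request

def build_payload(image_bytes: bytes) -> bytes:
    """Encode raw image bytes as the base64 JSON body expected by /safeimage."""
    image_64 = base64.b64encode(image_bytes).decode("ascii")
    return json.dumps({"image": image_64}).encode("utf-8")

def classify(image_bytes: bytes, url: str = "http://localhost:5000/safeimage") -> str:
    """POST the image to the SafeImage API and return the raw response body.

    The default URL is an assumption; replace it with your deployment's address.
    """
    request = urllib.request.Request(
        url,
        data=build_payload(image_bytes),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.read().decode("utf-8")

# Example: build the JSON body for some (dummy) image bytes.
payload = build_payload(b"\x89PNG...")
```

Calling `classify(open("photo.jpg", "rb").read())` would then return the service's classification response for that image.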