Automatically Moderate Images

This sample demonstrates how to automatically moderate offensive images uploaded to Firebase Storage. It uses the Google Cloud Vision API to detect whether an image contains adult or violent content and, if so, uses ImageMagick to blur the image.

Functions Code

See functions/index.js for the moderation code.

The detection of adult and violent content in an image is done using the Google Cloud Vision API. The image blurring is performed using ImageMagick, which is installed by default on all Cloud Functions instances. The image is first downloaded from the Firebase Storage bucket to the instance's local tmp folder using the google-cloud SDK.
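A minimal sketch of that flow is shown below. It assumes the firebase-functions v1 storage trigger, the @google-cloud/vision and @google-cloud/storage clients, and the child-process-promise package; names such as blurOffensiveImages and the blurredImage metadata flag are illustrative, and the actual code in functions/index.js may differ.

```js
const functions = require('firebase-functions');
const vision = require('@google-cloud/vision');
const {Storage} = require('@google-cloud/storage');
const {spawn} = require('child-process-promise');
const path = require('path');
const os = require('os');
const fs = require('fs');

const visionClient = new vision.ImageAnnotatorClient();
const storage = new Storage();

exports.blurOffensiveImages = functions.storage.object().onFinalize(async (object) => {
  // Skip images this function has already blurred (see "Trigger rules" below).
  // The `blurredImage` custom-metadata flag is an assumed marker.
  if (object.metadata && object.metadata.blurredImage) {
    return console.log('Image already blurred.');
  }

  // Ask the Cloud Vision API for a SafeSearch annotation of the uploaded file.
  const uri = `gs://${object.bucket}/${object.name}`;
  const [result] = await visionClient.safeSearchDetection(uri);
  const safeSearch = result.safeSearchAnnotation;
  if (safeSearch.adult !== 'VERY_LIKELY' && safeSearch.violence !== 'VERY_LIKELY') {
    return console.log('Image is OK.');
  }

  // Download the offending image to the instance's local tmp folder.
  const bucket = storage.bucket(object.bucket);
  const tempPath = path.join(os.tmpdir(), path.basename(object.name));
  await bucket.file(object.name).download({destination: tempPath});

  // Blur it in place with ImageMagick's `convert`, then upload it back,
  // marking it so the re-upload does not trigger another blur.
  await spawn('convert', [tempPath, '-channel', 'RGBA', '-blur', '0x24', tempPath]);
  await bucket.upload(tempPath, {
    destination: object.name,
    metadata: {metadata: {blurredImage: 'true'}},
  });
  fs.unlinkSync(tempPath);
  return console.log('Blurred image uploaded.');
});
```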

The dependencies are listed in functions/package.json.

Trigger rules

The function triggers on upload of any file to your Firebase project's default Cloud Storage bucket.
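Note that the blurred image is written back to the same bucket, so that write re-triggers the function. The sketch above guards against an infinite loop by setting a blurredImage custom-metadata flag on the re-upload and skipping files that carry it; the flag name is illustrative, and functions/index.js may use a different guard.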

Setting up the sample

  1. Create a Firebase project on the Firebase Console.
  2. In the Google Cloud Console, enable the Google Cloud Vision API. Note: Billing is required to enable the Cloud Vision API, so enable billing on your Firebase project by switching to the Blaze or Flame plan. For more information, have a look at the pricing page.
  3. Clone or download this repo and open the moderate-images directory.
  4. You must have the Firebase CLI installed. If you don't have it, install it with npm install -g firebase-tools and then configure it with firebase login.
  5. Configure the CLI locally by running firebase use --add and selecting your project from the list.
  6. Install dependencies locally by running: cd functions; npm install; cd - (steps 4-6 are consolidated in the commands after this list).
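Taken together, the command-line portion of the setup looks like this:

```sh
npm install -g firebase-tools    # install the Firebase CLI (step 4)
firebase login                   # authenticate the CLI (step 4)
firebase use --add               # select your Firebase project (step 5)
cd functions; npm install; cd -  # install the function's dependencies (step 6)
```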

Deploy and test

To test the sample:

  1. Deploy your Cloud Functions using firebase deploy (see the commands after this list).
  2. Go to the Storage tab of the Firebase Console and upload an image that contains adult or violent content. After a short time, the image will be replaced by a blurred version of itself.
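For example:

```sh
firebase deploy
# Instead of the Console, you can also upload a test image from the CLI;
# the bucket name below is an assumption (the project's default bucket is
# usually <project-id>.appspot.com):
gsutil cp test-image.jpg gs://<project-id>.appspot.com/
```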