# camemoji 😄🙂😐🙁🙁😡

Experimental project that uses the Google Cloud Vision API in Node.js to recognize the emotions of faces and map the results to emojis.

## Demo

*(camemoji demo GIF)*

## How it works

Uses Firebase to upload pictures from the webcam, and the Vision API to detect the emotion in each picture.

Currently the emotion mapper looks like this:

```js
"NORMAL": "😐",
"HAPPY": "🙂",
"VERY_HAPPY": "😄",
"SAD": "🙁",
"VERY_SAD": "😭",
"ANGRY": "😠",
"VERY_ANGRY": "😡",
"SURPRISED": "😮",
"HEADWEAR": "🤠"
```
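Applying this table to a Vision API face annotation could look like the sketch below. The likelihood fields (`joyLikelihood`, `sorrowLikelihood`, etc.) follow the Vision API's `faceAnnotation` shape; the priority order and the `LIKELY`/`VERY_LIKELY` thresholds are assumptions, not necessarily what this project does:

```javascript
// Emotion-key-to-emoji table from the README above.
const EMOJI = {
  NORMAL: "😐", HAPPY: "🙂", VERY_HAPPY: "😄",
  SAD: "🙁", VERY_SAD: "😭",
  ANGRY: "😠", VERY_ANGRY: "😡",
  SURPRISED: "😮", HEADWEAR: "🤠",
};

// Hypothetical mapper: pick an emotion key from a Vision API face
// annotation's likelihood fields. Checks are ordered so that the most
// distinctive signals (headwear, strong joy/sorrow/anger) win first.
function emotionFor(face) {
  if (face.headwearLikelihood === "VERY_LIKELY") return "HEADWEAR";
  if (face.joyLikelihood === "VERY_LIKELY") return "VERY_HAPPY";
  if (face.joyLikelihood === "LIKELY") return "HAPPY";
  if (face.sorrowLikelihood === "VERY_LIKELY") return "VERY_SAD";
  if (face.sorrowLikelihood === "LIKELY") return "SAD";
  if (face.angerLikelihood === "VERY_LIKELY") return "VERY_ANGRY";
  if (face.angerLikelihood === "LIKELY") return "ANGRY";
  if (face.surpriseLikelihood === "LIKELY" ||
      face.surpriseLikelihood === "VERY_LIKELY") return "SURPRISED";
  return "NORMAL";
}

const emojiFor = (face) => EMOJI[emotionFor(face)];
```

For example, an annotation with `joyLikelihood: "VERY_LIKELY"` maps to 😄, and a face with no strong likelihoods falls through to 😐.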

## Running the project

### Google Vision API

To run this project you'll need to set up a Google Cloud account with the Vision API enabled, create a service-account authentication file, and add it to the project root as auth.json.
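For reference, a service-account key file downloaded from the Google Cloud console typically has this shape (all values here are redacted placeholders; use your own downloaded file unchanged):

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "REDACTED",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "camemoji@your-project-id.iam.gserviceaccount.com",
  "client_id": "REDACTED",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```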

### Firebase

Additionally, file upload uses a Firebase project for storage: you'll need to put the API Key and the Storage Bucket in a firebase_auth.js file in the root of the project.
