Realtime Facial Emotion Recognition Application


Real-Time Facial Emotion Recognition System

The goal of this project is to identify seven key human emotions: HAPPY, NEUTRAL, SAD, SURPRISE, ANGER, DISGUST, and FEAR. The project uses the FER2013 dataset to train a convolutional neural network (CNN) classifier that takes grayscale face images as input and predicts the most probable emotion at the output layer. Training was done on Google Colab and took roughly 6.5 hours to fully train the model for 200 epochs. The trained model is then used as the baseline classifier in a real-time application implemented with OpenCV: detected faces are extracted from the video frames, passed to the model, and the predicted emotion is printed on screen in real time.
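The per-frame inference step described above can be sketched as follows. This is a minimal illustration, not the notebook's actual code: the label ordering in `EMOTIONS` and the helper names are assumptions, and in the real application the 48x48 grayscale face crop would come from OpenCV's Haar-cascade face detector before being fed to the CNN.

```python
import numpy as np

# Seven emotion classes from the README. The index order is an
# assumption -- FERSystem.ipynb may map class indices differently.
EMOTIONS = ["ANGER", "DISGUST", "FEAR", "HAPPY", "NEUTRAL", "SAD", "SURPRISE"]

def preprocess_face(face_roi: np.ndarray) -> np.ndarray:
    """Turn a 48x48 grayscale face crop into the (1, 48, 48, 1)
    batch tensor a FER2013-style CNN typically expects, with pixel
    values scaled to [0, 1]. (In the real app, cv2.resize would be
    used first to bring the detected face to 48x48.)"""
    face = face_roi.astype("float32") / 255.0
    return face.reshape(1, 48, 48, 1)

def predict_emotion(probabilities: np.ndarray) -> str:
    """Map the CNN's softmax output vector to an emotion label."""
    return EMOTIONS[int(np.argmax(probabilities))]
```

In the OpenCV loop, each detected face region would go through `preprocess_face`, the model's `predict` output through `predict_emotion`, and the resulting label would be drawn onto the frame.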

Dataset

Available on Kaggle: FER2013
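FER2013 is commonly distributed as a CSV in which each 48x48 grayscale face is stored as a single string of 2304 space-separated pixel values alongside an integer emotion label. A minimal sketch of decoding one such row, assuming that format:

```python
import numpy as np

def parse_fer_row(emotion: str, pixels: str) -> tuple[int, np.ndarray]:
    """Decode one FER2013 CSV row into (label, 48x48 uint8 image).

    `emotion` is the class index as a string and `pixels` is a string
    of 2304 space-separated grayscale values in row-major order.
    """
    image = np.array(pixels.split(), dtype=np.uint8).reshape(48, 48)
    return int(emotion), image
```

Rows decoded this way can be stacked into the training arrays for the CNN.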

Online Model

Available on Google Colab: FERSystem

Demo

Neutral Face

Sad Face

Angry Face

Surprised Face

Happy Face
