Emotion recognition using facial expressions

This project is the 2020 summer project of the Brain and Cognitive Society, Science and Technology Council, IIT Kanpur. As the name suggests, it aims to give a computer the ability to classify seven basic emotions from human facial expressions. The seven basic emotions we classify are:

  • Happy
  • Sad
  • Angry
  • Disgust
  • Fear
  • Surprise
  • Contempt

Dataset

We used the Extended Cohn-Kanade (CK+) dataset, which can be found here

Reference Paper

The reference paper we used is this research paper published on Springer.

Preprocessing

We performed face cropping as our preprocessing step; the three cropping variants we compared are listed under Evaluation. A sketch of such a step is shown below.
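The sketch below illustrates one possible face-detect-and-crop step, assuming OpenCV's Haar cascade frontal-face detector and a 48x48 grayscale target size; the detector, crop bounds, and target size are assumptions for illustration, not the project's exact settings.

```python
# Hypothetical preprocessing sketch, not the project's exact code.
# Assumes OpenCV's bundled Haar cascade and a 48x48 grayscale target size.
import cv2

FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def crop_face(image_path, target_size=(48, 48)):
    """Detect the largest face, crop it, and return a normalized grayscale array."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    faces = FACE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None  # no face detected; the caller can skip this image
    # Keep the largest detection (each CK+ frame contains a single subject).
    x, y, w, h = max(faces, key=lambda box: box[2] * box[3])
    face = cv2.resize(gray[y:y + h, x:x + w], target_size)
    return face.astype("float32") / 255.0  # scale pixel values to [0, 1]
```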

CNN model

We built a Sequential model consisting of two convolutional layers (Conv2D), two MaxPooling2D layers, and a Flatten layer, followed by an output Dense layer with softmax activation. The model is compiled with the Adam optimizer and categorical cross-entropy loss.
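A minimal sketch of the described model is given below, assuming Keras (TensorFlow); the filter counts, kernel sizes, and 48x48x1 input shape are assumptions, and the optional hidden Dense layer reflects the neuron counts varied in the Evaluation section.

```python
# A minimal sketch of the described Sequential model, assuming Keras/TensorFlow.
# Filter counts, kernel sizes, and the input shape are assumptions; the optional
# hidden Dense layer corresponds to the neuron counts compared in Evaluation.
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

NUM_CLASSES = 7  # Happy, Sad, Angry, Disgust, Fear, Surprise, Contempt

def build_model(hidden_units=256, input_shape=(48, 48, 1)):
    model = Sequential()
    model.add(Conv2D(32, (3, 3), activation="relu", input_shape=input_shape))
    model.add(MaxPooling2D((2, 2)))
    model.add(Conv2D(64, (3, 3), activation="relu"))
    model.add(MaxPooling2D((2, 2)))
    model.add(Flatten())
    if hidden_units > 0:  # hidden_units=0 means no hidden Dense layer
        model.add(Dense(hidden_units, activation="relu"))
    model.add(Dense(NUM_CLASSES, activation="softmax"))
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```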

Evaluation

We have evaluated our model on three different cropping methods:

  • Cropping with background
  • Cropping without background
  • Cropping without forehead

We also varied the number of neurons in the hidden Dense layer (0, 256, 512, and 1024). In addition, we performed ten-fold cross-validation on our model, keeping the cropping method fixed (without background) while varying the neuron count.
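A hedged sketch of such an evaluation loop for one cropping variant is shown below, reusing build_model from the CNN model section; the train/test split ratio, epoch count, and batch size are assumptions, not the project's settings.

```python
# Hypothetical evaluation loop over hidden-layer sizes for one cropping variant.
# build_model is the sketch from the "CNN model" section; the split ratio,
# epoch count, and batch size here are assumptions.
import numpy as np
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

def evaluate_neuron_counts(X, y, neuron_counts=(0, 256, 512, 1024)):
    """X: (N, 48, 48, 1) cropped faces; y: (N, 7) one-hot emotion labels."""
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)
    results = {}
    for units in neuron_counts:
        model = build_model(hidden_units=units)
        model.fit(X_train, y_train, epochs=30, batch_size=32, verbose=0)
        _, acc = model.evaluate(X_test, y_test, verbose=0)
        preds = np.argmax(model.predict(X_test, verbose=0), axis=1)
        cm = confusion_matrix(np.argmax(y_test, axis=1), preds)
        results[units] = (acc, cm)  # accuracy and confusion matrix per size
    return results
```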

Code

The whole assembled code can be found here.

How to run

To run the model, simply run the Python script facial_expression_recognition.py

Documentation

The documentation of this project can be found here.

Results

The results of the various evaluation methods we used are summarized in the tables below:

Cropping without background

No. of neurons Accuracy graph Confusion matrix
0 Link Link
256 Link Link
512 Link Link
1024 Link Link

Cropping without forehead

No. of neurons Accuracy graph Confusion matrix
0 Link Link
256 Link Link
512 Link Link
1024 Link Link

Cropping with background

No. of neurons Accuracy graph Confusion matrix
0 Link Link
256 Link Link
512 Link Link
1024 Link Link

Cross Validation

We performed ten-fold cross-validation on our dataset. Following the research paper, it was done with the same cropping method (without background) but with different neuron counts.
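A hedged sketch of such a ten-fold procedure is given below, reusing build_model from the CNN model section; the epoch count, batch size, and shuffling settings are assumptions.

```python
# Hypothetical ten-fold cross-validation sketch (cropping fixed, neuron count varied).
# build_model is the sketch from the "CNN model" section; epochs, batch size,
# and shuffling are assumptions.
import numpy as np
from sklearn.model_selection import KFold

def cross_validate(X, y, hidden_units, n_splits=10):
    """Return the mean test accuracy over n_splits folds (X, y are NumPy arrays)."""
    kfold = KFold(n_splits=n_splits, shuffle=True, random_state=42)
    accuracies = []
    for train_idx, test_idx in kfold.split(X):
        model = build_model(hidden_units=hidden_units)
        model.fit(X[train_idx], y[train_idx], epochs=30, batch_size=32, verbose=0)
        _, acc = model.evaluate(X[test_idx], y[test_idx], verbose=0)
        accuracies.append(acc)
    return float(np.mean(accuracies))
```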

No. of neurons Accuracy (%)
0 97.96
256 98.27
512 96.62
1024 97.14

Average accuracy (%)

In the paper Our model
97.38 97.49

Reference Links
