MAIS202
is an introductory machine learning bootcamp that taught me the basics, such as CNNs, linear regression, and Naive Bayes.
As a complement to the bootcamp, here is a project I wanted to tackle: Emotion Detection in Faces using ML.
The goal is to identify what emotion a person is feeling from a static picture or a real-time video of them.
- Started by reading the academic paper DeepEmotion2019.
- Created my CNN and got stuck on how to convert numbers to images, how to localize faces, and how to make a live webcam demo.
- Found two implementations that went over the correct code.
- Implemented and merged the two after understanding every class and function used.
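The first stuck point above, converting the dataset's raw numbers into images, comes down to reshaping FER-2013's space-separated pixel string into a 48x48 grayscale array. A minimal sketch, assuming Pillow and NumPy are installed (`pixels_to_image` is an illustrative name, not code from either repository):

```python
import numpy as np
from PIL import Image

def pixels_to_image(pixel_str, size=(48, 48)):
    """Turn one FER-2013 'pixels' string (space-separated 0-255 values)
    into a grayscale PIL image. FER-2013 images are 48x48 by default."""
    values = [int(v) for v in pixel_str.split()]
    arr = np.asarray(values, dtype=np.uint8).reshape(size)
    # Mode "L" = 8-bit grayscale, matching the dataset.
    return Image.fromarray(arr, mode="L")
```

The resulting `Image` can then be saved to disk or fed straight into a transform pipeline.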
I followed OmarSayed7's implementation of DeepEmotion2019 and built my understanding of this whole project thanks to him. I then integrated the live demo feature from DeepLearning_by_PhDScholar; however, when I test it, the face detection square doesn't localize my face.
This project is for educational purposes only.
- Database: FER-2013
- Main: Dataset setup, training loop, and a section to test on any image you input + webcam demo (not at peak performance).
- Deep_emotion: CNN structure with localization function.
- Data_loaders: DataLoader and PlainDataset classes.
- Generate_data: Converts the data from numbers to images using PIL.