nmirabets / emotion-recognition

This online application recognizes emotions from facial expressions in a webcam video feed. It uses an image recognition model trained on the FER-2013 dataset to provide real-time inference.
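As a rough illustration of what such a real-time loop involves, here is a minimal sketch using OpenCV for frame capture and an exported fast.ai learner. The file name `emotion_model.pkl`, the window title, and the overlay details are illustrative assumptions, not the app's exact code:

```python
import cv2
from fastai.vision.all import load_learner, PILImage

learn = load_learner('emotion_model.pkl')   # hypothetical exported fastai model
cap = cv2.VideoCapture(0)                   # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)         # OpenCV gives BGR, the model expects RGB
    label, _, probs = learn.predict(PILImage.create(rgb))
    cv2.putText(frame, f'{label} ({probs.max().item():.0%})', (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    cv2.imshow('PulseAI', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):                # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```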

Development of PulseAI - Emotion Recognition System

Facial expressions convey a significant amount of information about human emotions and intentions. In the age of artificial intelligence, the ability to automatically recognize and interpret these expressions can play a pivotal role in various applications, from enhancing human-computer interactions to offering insights into mental health. As a step towards harnessing this potential, we have developed an advanced image recognition system using the fast.ai library, leveraging the widely-used FER-2013 dataset.

FER-2013 Dataset Overview

The Facial Expression Recognition 2013 (FER-2013) dataset comprises 48x48-pixel grayscale images, each labeled with one of seven emotions: angry, disgust, fear, happy, sad, surprise, and neutral. With over 35,000 images spanning a wide range of subjects and demographics, it is a standard benchmark for training and evaluating facial expression recognition models.
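For illustration, a fast.ai `DataLoaders` for this dataset might be assembled as follows, assuming the images have been unpacked into per-emotion folders (the original release is a CSV of pixel strings, so the exact paths below are hypothetical):

```python
from fastai.vision.all import *

# Hypothetical layout after unpacking: data/fer2013/{train,test}/<emotion>/*.png
path = Path('data/fer2013')

dls = ImageDataLoaders.from_folder(
    path,
    train='train',
    valid='test',
    item_tfms=Resize(48),                                       # FER-2013 images are 48x48
    batch_tfms=aug_transforms(do_flip=True, max_rotate=10.0),   # light augmentation
    bs=64,
)

dls.show_batch(max_n=9)  # visual sanity check of images and their emotion labels
```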

Empowering the System with fast.ai

fast.ai is a deep learning library designed to make training complex models as intuitive and streamlined as possible. With its high-level APIs and robust underlying mechanisms, it lowers the barrier to entry for beginners while offering granular control for experienced users. In this project, the library enabled rapid prototyping, allowing us to experiment seamlessly with a variety of architectures and techniques.
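The high-level API keeps the core training loop to a handful of lines. A minimal sketch, assuming the `dls` object from the dataset example above (the backbone and epoch count are illustrative):

```python
from fastai.vision.all import *

# `dls` is the FER-2013 DataLoaders built in the dataset sketch above
learn = vision_learner(dls, resnet18, metrics=accuracy)  # ImageNet-pretrained backbone
learn.fine_tune(5)  # one frozen epoch, then five epochs with all layers unfrozen
```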

Model Selection and Performance

Our development journey included testing multiple neural network architectures. Starting from simpler convolutional networks, we progressively moved to more intricate models, tweaking hyperparameters and using fast.ai's built-in tools such as the learning rate finder. After rigorous training and validation, we selected the best-performing model, balancing accuracy against computational efficiency.
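The kind of workflow described above can be sketched as follows; the epoch count and exported file name are illustrative rather than the exact settings used here:

```python
# Probe for a reasonable learning rate, then run a longer training pass
lr = learn.lr_find().valley
learn.fine_tune(10, base_lr=lr)

# Inspect per-class behaviour across the seven emotions
interp = ClassificationInterpretation.from_learner(learn)
interp.plot_confusion_matrix()

# Export the trained pipeline for real-time webcam inference
learn.export('emotion_model.pkl')
```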

Presentation Link


Languages

Jupyter Notebook 99.6%, Python 0.4%