zhamilyaa / robt414_facial_expression

ROBT414 Human-Robot Interaction Final Project

ROBT 414

Title: Emotion Detection from text using PyTorch and facial expression generation in Blender using OpenPose

Authors: Dana Aubakirova, Zhamilya Saparova, Nurdaulet Zhuzbay

Contents

  1. General-info
  2. emotion-detection
  3. openpose-to-blender
  4. Libraries

General-info

This project aims to develop a deep-learning-based model for accurate semantic analysis of text which, in combination with 3D animation software, can produce human-like facial expressions.

emotion-detection

The Project Notebook

The emotion_detection.ipynb file contains the detailed implementation of the LSTM model for emotion detection from text. It includes the dataset description, the model pipeline, training output, detailed comments, and results.
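As a rough illustration of the kind of model the notebook implements, the sketch below shows a minimal PyTorch LSTM text classifier: embedded tokens are fed through an LSTM, and the final hidden state is projected onto class logits. All layer sizes, the class count, and the class name `EmotionLSTM` are assumptions for illustration, not the notebook's exact values.

```python
import torch
import torch.nn as nn

class EmotionLSTM(nn.Module):
    # Hypothetical sketch: dimensions chosen to match 50-d GloVe embeddings,
    # other hyperparameters are illustrative assumptions.
    def __init__(self, vocab_size, embed_dim=50, hidden_dim=64, num_classes=6):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        emb = self.embedding(x)          # (batch, seq_len, embed_dim)
        _, (h_n, _) = self.lstm(emb)     # h_n: (num_layers, batch, hidden_dim)
        return self.fc(h_n[-1])          # logits: (batch, num_classes)

model = EmotionLSTM(vocab_size=400001)
logits = model(torch.randint(0, 400001, (2, 10)))  # batch of 2 sequences, length 10
print(logits.shape)  # torch.Size([2, 6])
```

In practice the embedding layer would be initialized from the pre-trained GloVe vectors described below rather than learned from scratch.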

Dataset

The datasets folder contains train.csv (132 instances and labels) and test.csv (56 instances and labels), used for training and testing the model.

glove.6B.50d.txt is a file containing pre-trained GloVe embeddings used for word-vector representations. The embeddings can be downloaded from here.
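The GloVe text format stores one word per line followed by its vector components, so loading it into a lookup table is a short loop. The helper name `load_glove` below is a hypothetical sketch, not code from the notebook:

```python
import numpy as np

def load_glove(path):
    """Parse the GloVe text format: each line is a token followed by
    its space-separated vector components (50 floats for glove.6B.50d.txt)."""
    embeddings = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            embeddings[parts[0]] = np.asarray(parts[1:], dtype="float32")
    return embeddings
```

The resulting dict maps each token to its vector, which can then be used to build the embedding matrix for the model's vocabulary.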

openpose-to-blender

In this folder you will find the .blend and .json files for facial keypoints, and facial_transfer_blender.py. Please see the separate README file in this folder for how to call the script.
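For orientation, OpenPose writes each frame's detections to a JSON file where the face keypoints of a person are stored as a flat list of (x, y, confidence) triples. A minimal sketch of reading them back, assuming the standard OpenPose output layout (`people` / `face_keypoints_2d`) and a hypothetical helper name:

```python
import json

def load_face_keypoints(path):
    """Read an OpenPose output JSON and return the first person's face
    keypoints as a list of (x, y, confidence) tuples."""
    with open(path) as f:
        data = json.load(f)
    flat = data["people"][0]["face_keypoints_2d"]  # [x0, y0, c0, x1, y1, c1, ...]
    return [tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)]
```

A script like facial_transfer_blender.py would then map such keypoints onto the corresponding bones or shape keys of the face rig in the .blend file.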

Libraries

A list of the main libraries and tools used within the project:

  - PyTorch (LSTM model for emotion detection)
  - GloVe pre-trained word embeddings
  - OpenPose (facial keypoint extraction)
  - Blender (facial expression generation)
  - Jupyter Notebook
