ElliotHYLee / GestureClassifier

Undergrad Research Project - Human-Robot Interaction

Gesture Classification using Kinect

Kinect skeletal-frame-based gesture classification

Click for YouTube video:

Click for YouTube video (real-time application):

Prerequisites

TensorFlow

Python libraries (installed via the setup script):

chmod +x pythonReady.sh
yes "yes" | sudo sh pythonReady.sh

Run

python main.py Train
python main.py Test
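The two commands above suggest that main.py dispatches on a single mode argument. A minimal sketch of such a dispatcher is below; the function names train() and test() are assumptions for illustration, only the "Train"/"Test" strings come from the commands shown.

```python
# Hypothetical sketch of main.py's mode dispatch; only the "Train"/"Test"
# argument strings come from the README, everything else is assumed.
import sys

def train():
    # Placeholder for the training routine.
    return "training"

def test():
    # Placeholder for the evaluation routine.
    return "testing"

def main(argv):
    mode = argv[1] if len(argv) > 1 else "Train"
    if mode == "Train":
        return train()
    elif mode == "Test":
        return test()
    raise SystemExit(f"Unknown mode: {mode} (expected Train or Test)")

if __name__ == "__main__":
    print(main(sys.argv))
```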

To Do

  • Add the "ghost" class to the data set.
  • Plot the results in Python.
  • Apply a CNN to raw RGB images.
  • Apply face recognition for operator identification.

Problem Setup

The MS Kinect usually returns skeletal coordinates like those in the first picture.

However, it sometimes sees "ghosts," as shown in the second picture.

So, it is necessary to tell whether a skeletal stream comes from an actual human or not. Fortunately, human and "ghost" skeletons look different pattern-wise, so a neural network with a single hidden layer can easily classify the difference.
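The single-hidden-layer classifier described above can be sketched as a plain NumPy forward pass. The layer sizes here are assumptions: a Kinect v1 skeleton tracks 20 joints, so 20 x 3 coordinates = 60 inputs is a plausible feature vector; the hidden width, activation, and random weights are illustrative only.

```python
# Minimal sketch of a single-hidden-layer classifier forward pass in NumPy.
# Sizes are assumptions: 20 Kinect joints x 3 coords = 60 inputs; the
# hidden width (32) and random weights are illustrative, not from the repo.
import numpy as np

def forward(x, W1, b1, W2, b2):
    """Return class probabilities for one skeletal feature vector."""
    h = np.tanh(x @ W1 + b1)           # single hidden layer
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max())  # numerically stable softmax
    return e / e.sum()

rng = np.random.default_rng(0)
n_in, n_hidden, n_classes = 60, 32, 5
W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, n_classes))
b2 = np.zeros(n_classes)

x = rng.normal(size=n_in)              # stand-in for one skeletal frame
probs = forward(x, W1, b1, W2, b2)
print(probs.argmax())                  # predicted class index (0..4)
```

In practice the weights would be learned (e.g. with TensorFlow, as listed in the prerequisites); this sketch only shows the network shape.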

Next, four different gestures are fed in as well: idle, move forward, move back, and takeoff/landing. As a result, only five classes are needed; the fifth class is simply the "ghost" patterns.

  • Idle
  • Move Forward
  • Move Back
  • Takeoff/Landing
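The four gestures above plus the "ghost" class give the five training targets. A one-hot encoding of that label set might look like the sketch below; the class-to-index assignment is an assumption, not taken from the repo.

```python
# Sketch of the 5-class label scheme: four gestures plus "ghost".
# The index order is an assumption for illustration.
import numpy as np

CLASSES = ["Idle", "Move Forward", "Move Back", "Takeoff/Landing", "Ghost"]

def one_hot(label):
    """Encode a class name as a one-hot target vector for the network."""
    v = np.zeros(len(CLASSES))
    v[CLASSES.index(label)] = 1.0
    return v

print(one_hot("Ghost"))  # → [0. 0. 0. 0. 1.]
```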

Results

Accuracy = 0.90
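A figure like 0.90 is typically the fraction of frames whose predicted class matches the label. A sketch of that computation, with made-up predictions (the repo's actual evaluation code is not shown here):

```python
# Sketch of a classification-accuracy computation: the fraction of
# samples where the predicted class equals the true class.
# The prediction/label arrays below are made-up illustration data.
import numpy as np

def accuracy(predicted, actual):
    return float(np.mean(np.asarray(predicted) == np.asarray(actual)))

pred = [0, 1, 1, 2, 4, 3, 0, 2, 1, 4]   # hypothetical predictions
true = [0, 1, 1, 2, 4, 3, 0, 2, 2, 3]   # hypothetical ground truth
print(accuracy(pred, true))  # → 0.8
```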

Languages

  • MATLAB: 49.1%
  • Python: 45.5%
  • Shell: 3.7%
  • M: 1.7%