Read this in other languages: English, Portuguese.
This project detects hand gestures and maps the detections to actions, such as moving a video game character or controlling hardware.
- Use conda to create virtual environments.
- Use CUDA and cuDNN for better performance (verify that your GPU is compatible).
- Linux is recommended for all procedures.
Recommended Python version: 3.7.

```shell
conda create -n <environment-name> python=3.7
conda activate <environment-name>
```
One-line installation:

```shell
pip install tensorflow-gpu==1.15.2 opencv-python numpy scipy matplotlib
```

Or install the packages individually:

```shell
pip install tensorflow==1.15.2      # CPU
pip install tensorflow-gpu==1.15.2  # GPU
pip install opencv-python
pip install numpy
pip install scipy
pip install matplotlib
```
Download the repository or clone it by executing in the shell:

```shell
git clone https://github.com/jaaoop/ProjectGestus.git
```

After these steps the project is ready to use.
`ContinuousGesturePredictor.py` performs gesture detection in real time.

- Execute in the shell:

```shell
python ContinuousGesturePredictor.py
```

- When the file runs, the webcam starts and the recording is shown to the user.
- In the open window, a square is drawn and, during the first 30 frames, its area is captured as the background. With that in mind, keep your hand out of this area for better results.
- After the first 30 frames, the Thresholded and Statistics windows appear; at this moment the user must press 's' to start the detection and then position the hand inside the drawn square.
- The Thresholded and Statistics windows show the detected gesture; the user is free to move and test new detections.
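The 30-frame background capture described above can be sketched as a simple running average followed by background subtraction. This is a hypothetical simplification using only NumPy; the project's actual implementation may differ in weights and thresholds:

```python
import numpy as np

def accumulate_background(frames, weight=0.5):
    """Build a background model as a running weighted average of frames.

    Mirrors the idea of cv2.accumulateWeighted: each new frame is
    blended into the current background with the given weight.
    """
    background = None
    for frame in frames:
        frame = frame.astype(np.float64)
        if background is None:
            background = frame.copy()
        else:
            background = (1 - weight) * background + weight * frame
    return background

def segment_hand(background, frame, threshold=25):
    """Return a binary mask where the frame differs from the background."""
    diff = np.abs(frame.astype(np.float64) - background)
    return (diff > threshold).astype(np.uint8) * 255
```

With a real webcam, the first 30 frames of the square region would be fed to `accumulate_background`, and later frames compared against the result with `segment_hand` to produce the thresholded image.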
`LabelGenerator.py` generates new training gestures.

- Execute in the shell:

```shell
python LabelGenerator.py -n <gesture-name>
```

- When the file runs, the webcam starts and the recording is shown to the user.
- In the open window, a square is drawn and, during the first 30 frames, its area is captured as the background. With that in mind, keep your hand out of this area for better results.
- After the first 30 frames, the Thresholded and Statistics windows appear; at this moment the user must press 's' to start generating training and testing gesture pictures. Suggestion: move the hand around for more diverse results.
- While the new gesture is being created, the shell shows the progress. When finished, two folders are created, one under Train and one under Test, both named after the gesture.
Note: LabelGenerator accepts one additional parameter,

```shell
-t <image-number>
```

which defines the amount of training images; the test images are 10% of this value. By default the parameter is set to `-t 1000`.
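The command-line interface described above can be sketched with `argparse`. The `-n` and `-t` flags and the 1000-image default come from the text; the long option names and exact help strings here are assumptions:

```python
import argparse

def parse_args(argv=None):
    """Parse LabelGenerator-style command-line options (sketch)."""
    parser = argparse.ArgumentParser(
        description="Generate training images for a new gesture.")
    parser.add_argument("-n", "--name", required=True,
                        help="name of the new gesture")
    parser.add_argument("-t", "--train-images", type=int, default=1000,
                        help="number of training images (default: 1000)")
    return parser.parse_args(argv)

def image_counts(train_images):
    """Test images are 10 percent of the training images."""
    return train_images, train_images // 10
```

For example, `parse_args(["-n", "fist"])` keeps the default of 1000 training images, and `image_counts(1000)` yields 1000 training and 100 test images.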
`ModelTrainer.py` trains the model to detect new gestures.

- Make sure the Train and Test folders contain the same gesture folders.
- Execute in the shell:

```shell
python ModelTrainer.py
```

- Wait until the end of the training.

Note: ModelTrainer accepts one additional parameter,

```shell
-c True
```

which saves the training chart. By default the parameter is set to `-c False`.
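Saving a training chart like the one `-c True` enables might look like the sketch below. This is a hypothetical example using matplotlib (already listed in the dependencies); the real script's chart contents and filename may differ, and the `history` dict shape is an assumption:

```python
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

def save_training_chart(history, path="training_chart.png"):
    """Plot per-epoch training/validation accuracy and save it to disk.

    `history` is assumed to be a dict like {"acc": [...], "val_acc": [...]}.
    """
    epochs = range(1, len(history["acc"]) + 1)
    plt.figure()
    plt.plot(epochs, history["acc"], label="train accuracy")
    plt.plot(epochs, history["val_acc"], label="validation accuracy")
    plt.xlabel("epoch")
    plt.ylabel("accuracy")
    plt.legend()
    plt.savefig(path)
    plt.close()
    return path
```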
The following files behave in the same way as `ContinuousGesturePredictor.py`, except that each gesture is assigned an action, like moving a video game character or pressing a keyboard key. The file `basicApplication.py` provides the implementation's code structure without the assignments, so the user can set actions according to their needs.
`basicApplication.py` is a template for possible applications.

- Execute in the shell:

```shell
python basicApplication.py
```

- The file behaves in the same way as `ContinuousGesturePredictor.py` if the user doesn't make any gesture assignments.
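One hedged way to fill in the gesture-to-action assignments that the template leaves open is a simple dispatch table. The gesture names and actions below are illustrative, not the project's actual labels:

```python
def make_dispatcher(actions):
    """Return a function that runs the action registered for a gesture.

    `actions` maps gesture names to zero-argument callables. Unknown
    gestures are ignored, matching the template's default behavior of
    detecting without acting.
    """
    def dispatch(gesture):
        action = actions.get(gesture)
        if action is not None:
            return action()
        return None
    return dispatch

# Illustrative assignments -- replace with your own actions.
log = []
dispatch = make_dispatcher({
    "fist": lambda: log.append("jump"),
    "palm": lambda: log.append("stop"),
})
```

Each time the predictor reports a gesture name, calling `dispatch(name)` runs the corresponding action, so adding a new gesture only requires adding one entry to the table.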
`runwNotes.py` is an application demo for a basic text editor.

- Execute in the shell:

```shell
python runwNotes.py
```

- If a text editor window is open, the user will see the designated characters being written according to the detection.
Note: Additional dependencies might be needed for some applications.
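The character writing can be sketched as a mapping from detected gestures to keystrokes. In the real demo the keystroke would be sent to the editor window by a keyboard-automation library; this hypothetical sketch just collects the characters, and the gesture labels are assumptions:

```python
# Hypothetical gesture-to-character mapping; the demo's actual labels differ.
GESTURE_CHARS = {"fist": "a", "palm": "b", "peace": "c"}

def type_gestures(gestures, send_key=None):
    """Convert detected gestures into characters.

    `send_key` is called for each character; by default characters are
    only collected into a string. In the demo, this callback would press
    a real key in the focused editor window.
    """
    typed = []
    for gesture in gestures:
        char = GESTURE_CHARS.get(gesture)
        if char is None:
            continue  # unmapped gestures produce no keystroke
        if send_key is not None:
            send_key(char)
        typed.append(char)
    return "".join(typed)
```

Keeping `send_key` injectable separates the gesture logic from the keyboard automation, so the same mapping can be tested without an open editor.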
`runwMinecraft.py` is an application demo that controls character movement in Minecraft.

- Execute in the shell:

```shell
python runwMinecraft.py
```

- If the Minecraft window is open, the character will move according to the detection.
Note: Additional dependencies might be needed for some applications.
`runwArduino.py` is an application demo for Arduino.

- Execute in the shell:

```shell
python runwArduino.py
```

- If an Arduino is connected, the user will see commands being sent according to the detection.
Note: Additional dependencies might be needed for some applications.
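Sending commands to an Arduino is typically done over a serial port. In the sketch below the command bytes and gesture names are assumptions, and the writer is injectable: the real demo would pass a pyserial `serial.Serial` object, which provides the same `write()` method:

```python
# Hypothetical gesture-to-command table; the demo's actual protocol may differ.
GESTURE_COMMANDS = {"fist": b"1", "palm": b"0"}

def send_gesture_command(gesture, port):
    """Write the command byte for a gesture to a serial-like port.

    `port` only needs a write() method, so in the real demo it could be
    serial.Serial("/dev/ttyUSB0", 9600) from pyserial, while tests can
    pass an in-memory buffer.
    """
    command = GESTURE_COMMANDS.get(gesture)
    if command is None:
        return False  # no command assigned to this gesture
    port.write(command)
    return True
```

On the Arduino side, a matching sketch would read the byte with `Serial.read()` and toggle the corresponding pin.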
This project is part of the RAS Unesp Bauru projects. For more information about this and other projects, visit: https://sites.google.com/unesp.br/rasunespbauru/home.
This project is free and non-profit.
The project is based on the repository Hand Gesture Recognition using Convolution Neural Network, built using TensorFlow, OpenCV and Python.