Note: This repo is a fork of TwentyBN's Sense. The models showcased here were fine-tuned using the pre-trained weights and training tools available from the original repo.
For installation and instructions on running the included demos, please check out the README from the main repository. You will also have to download the pre-trained weights as described there.
This repo demonstrates an RGB-based gesture recognition system aimed at Human-Robot Interaction (HRI). The system has also been integrated into the ROS framework for use in HRI research.
The following gestures are supported:
List of gestures used | | |
---|---|---|---
Come forward | Handwave | Pointing | Rotate Arm Clockwise
Come forward | Pause | Resume | Rotate Arm anti-Clockwise
Move to the left | Start | Thumbs down | Watch out
Move to the right | Stop | Thumbs up |
Try it yourself:

```shell
PYTHONPATH=./ python examples/run_hri_recognition.py --use_gpu
```
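Before a recognized gesture can drive a robot (e.g. by publishing on a ROS topic), the classifier's label typically has to be mapped to a robot command and filtered by confidence. Below is a minimal, hypothetical sketch of that step; the command names and the confidence threshold are assumptions, and the label strings must match whatever the classifier actually emits:

```python
# Hypothetical mapping from recognized gesture labels to robot commands.
# The label strings and command names are illustrative assumptions only;
# they must be aligned with the classifier's actual output labels.
GESTURE_TO_COMMAND = {
    "Come forward": "MOVE_FORWARD",
    "Move to the left": "MOVE_LEFT",
    "Move to the right": "MOVE_RIGHT",
    "Stop": "HALT",
    "Pause": "PAUSE",
    "Resume": "RESUME",
}

def handle_prediction(label: str, confidence: float, threshold: float = 0.7):
    """Return a robot command for a confident prediction, else None.

    Low-confidence or unmapped gestures are dropped so the robot does not
    react to spurious classifications.
    """
    if confidence < threshold:
        return None  # ignore low-confidence predictions
    return GESTURE_TO_COMMAND.get(label)  # None for gestures without a command

print(handle_prediction("Stop", 0.9))   # HALT
print(handle_prediction("Stop", 0.4))   # None (below threshold)
```

In a ROS node, the returned command string could then be published on a command topic; the filtering keeps the control loop robust to classifier noise.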
Please refer to the original repo for requirements and installation steps.
The code is MIT-licensed, but the pre-trained weights come with a separate license. Please check the original Sense repo for more information.