There are 23 repositories under the gesture-recognition topic.
Linux multi-touch gesture recognizer
Human: AI-powered 3D Face Detection & Rotation Tracking, Face Description & Recognition, Body Pose Tracking, 3D Hand & Finger Tracking, Iris Analysis, Age & Gender & Emotion Prediction, Gaze Tracking, Gesture Recognition
Gesture recognition via CNN. Implemented in Keras + Tensorflow/Theano + OpenCV
A toolbox for skeleton-based action recognition.
ICCV 2023 Papers: Discover cutting-edge research from ICCV 2023, the leading computer vision conference. Stay updated on the latest in computer vision and deep learning, with code included. ⭐ support visual intelligence development!
gesture recognition toolkit
We present MocapNET, a real-time method that estimates the 3D human pose directly in the popular Bio Vision Hierarchy (BVH) format, given estimations of the 2D body joints originating from monocular color images. Our contributions include: (a) a novel and compact 2D pose NSRM representation; (b) a human body orientation classifier and an ensemble of orientation-tuned neural networks that regress the 3D human pose while also allowing the body to be decomposed into an upper and lower kinematic hierarchy, which permits recovery of the human pose even under significant occlusions; and (c) an efficient inverse kinematics solver that refines the neural-network-based solution, providing 3D human pose estimations consistent with the limb sizes of a target person (if known). All of the above yield a 33% accuracy improvement on the Human3.6M (H3.6M) dataset compared to the baseline method (MocapNET) while maintaining real-time performance.
Real-time Hand Gesture Recognition with PyTorch on EgoGesture, NvGesture, Jester, Kinetics and UCF101
Estimate hand pose using MediaPipe (Python version). This is a sample program that recognizes hand signs and finger gestures with a simple MLP using the detected key points.
Asset that improves touch input support (includes new gestures) in the Godot game engine. It also translates mouse input to touch input.
Simple finger detection (or gesture recognition) using OpenCV and Python with background subtraction.
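The core of such a background-subtraction pipeline is thresholded frame differencing. A minimal NumPy sketch of that step follows; the synthetic frames stand in for real camera input, and in the actual repo OpenCV would supply the capture, morphology, and contour/convexity-defect steps used to count fingers:

```python
import numpy as np

H, W = 120, 160
# Reference frame captured while the scene is empty (uniform gray here).
background = np.full((H, W), 50, dtype=np.uint8)

# Synthetic current frame: the background plus a bright "hand" region.
frame = background.copy()
frame[30:90, 40:100] = 200

# Background subtraction: absolute difference against the reference,
# then a threshold to produce a binary foreground mask.
diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
mask = (diff > 40).astype(np.uint8) * 255

# The mask isolates the hand silhouette; OpenCV's findContours and
# convexityDefects would then be applied to it to count extended fingers.
foreground_pixels = int((mask == 255).sum())
```

With these synthetic frames the mask contains exactly the 60x60 "hand" block, so `foreground_pixels` is 3600; on real video the threshold would need tuning to lighting and noise.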
This is a sample program that recognizes hand signs and finger gestures with a simple MLP using the detected key points. Handpose is estimated using MediaPipe.
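The keypoints-into-a-small-MLP approach described above can be sketched as follows. This is only an illustrative forward pass: the landmark vector and the network weights are random placeholders, since in the real program MediaPipe's hand-landmark model supplies 21 detected (x, y) keypoints and the MLP is trained on labeled hand-sign data:

```python
import numpy as np

rng = np.random.default_rng(0)

# MediaPipe's hand model yields 21 (x, y) landmarks -> 42-dim feature vector.
# A random vector stands in for real detected keypoints here.
landmarks = rng.random(42)

# Hypothetical hand-sign classes and an untrained two-layer MLP.
LABELS = ["open", "close", "pointer", "ok"]
W1, b1 = rng.standard_normal((42, 32)), np.zeros(32)
W2, b2 = rng.standard_normal((32, len(LABELS))), np.zeros(len(LABELS))

def classify(x):
    """Forward pass: ReLU hidden layer, then softmax over hand-sign classes."""
    h = np.maximum(0.0, x @ W1 + b1)
    logits = h @ W2 + b2
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return LABELS[int(np.argmax(probs))], probs

label, probs = classify(landmarks)
```

Because the feature vector is just 42 normalized coordinates per frame, even a very small trained MLP of this shape can run comfortably in real time alongside the landmark detector.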
Virtually controlling computer using hand-gestures and voice commands. Using MediaPipe, OpenCV Python.
Simple project to recognize hands in realtime. 👋 Serves as an Example for building your own object recognizer.
:hand: Recognizing "Hand Gestures" using OpenCV and Python.
Unified learning approach for egocentric hand gesture recognition and fingertip detection.
Control a DJI Tello 🛸 using hand gesture recognition on the drone's camera video stream. Feel free to contribute!
CVPR 2023-2024 Papers: Dive into advanced research presented at the leading computer vision conference. Keep up to date with the latest developments in computer vision and deep learning. Code included. ⭐ support visual intelligence development!
Very small gesture recognizer for JavaScript. Swipe, pan, tap, doubletap, longpress, pinch, and rotate.
Gesture-based app launcher for iOS
Resources about Sign Language Processing (e.g., Sign Language Recognition / Translation / Production)
Device-Free Gesture Tracking Using Acoustic Signals, MobiCom 2016
Rock, Paper, Scissors game implemented with TensorFlow.js and FingerPose
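Once FingerPose (or any gesture recognizer) yields a gesture label per player, the game itself reduces to a small mapping. A Python sketch of that step, with label names assumed for illustration:

```python
# Each key beats its value; any other pairing loses.
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def rps_winner(player: str, computer: str) -> str:
    """Decide the round given two recognized gesture labels."""
    if player == computer:
        return "draw"
    return "player" if BEATS[player] == computer else "computer"

result = rps_winner("rock", "scissors")  # "player"
```

The same three-way table works regardless of where the labels come from, so the recognizer and the game rules stay cleanly decoupled.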
Multimodal Gesture Recognition Using 3D Convolution and Convolutional LSTM
Write programs with hand gestures
Basic Gesture Recognition Using mmWave Sensor - TI AWR1642
Sign Language Detection system based on computer vision and deep learning using OpenCV and Tensorflow/Keras frameworks.
HTML5 Pac-Man game with gesture recognition
Intel RealSense toolkit for hand tracking and gesture recognition in Unity3D.
A GUI based on the Python API of OpenPose on Windows, using CUDA 10 and cuDNN 7. Supports body, hand, and face keypoint estimation and data saving. Real-time gesture recognition is realized through a two-layer neural network trained on skeletons collected via the GUI.