There are 38 repositories under the human-computer-interaction topic.
Awesome work on hand pose estimation/tracking
Computer Vision library for human-computer interaction. It implements Head Pose and Gaze Direction Estimation Using Convolutional Neural Networks, Skin Detection through Backprojection, Motion Detection and Tracking, and Saliency Maps.
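The library's own code is not reproduced here; purely as a minimal illustration of the skin-detection-by-backprojection technique it lists, the following Python/OpenCV sketch builds a hue-saturation histogram from a sample skin patch and backprojects it onto a frame (the file names, bin counts, and threshold are assumptions):

```python
import cv2
import numpy as np

# Assumed inputs: a small skin-patch sample image and a target frame (hypothetical files).
skin_sample = cv2.imread("skin_patch.png")
frame = cv2.imread("frame.png")

# Work in HSV; backprojection uses the hue-saturation histogram of the sample.
hsv_sample = cv2.cvtColor(skin_sample, cv2.COLOR_BGR2HSV)
hsv_frame = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# 2D histogram over H (0-180) and S (0-256), normalized to [0, 255].
hist = cv2.calcHist([hsv_sample], [0, 1], None, [30, 32], [0, 180, 0, 256])
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

# Backproject: each pixel gets the histogram value of its (H, S) bin,
# so skin-like pixels light up in the result.
backproj = cv2.calcBackProject([hsv_frame], [0, 1], hist, [0, 180, 0, 256], scale=1)

# Smooth and threshold to obtain a binary skin mask.
disc = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (9, 9))
backproj = cv2.filter2D(backproj, -1, disc)
_, mask = cv2.threshold(backproj, 50, 255, cv2.THRESH_BINARY)

skin_only = cv2.bitwise_and(frame, frame, mask=mask)
cv2.imwrite("skin_mask.png", mask)
```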
👀 Use machine learning in JavaScript to detect eye movements and build gaze-controlled experiences.
AAAI 2024 Papers: Explore a comprehensive collection of innovative research papers presented at one of the premier artificial intelligence conferences. Seamlessly integrate code implementations for better understanding. ⭐ Experience the forefront of progress in artificial intelligence with this repository!
Introducing Venocyber MD bot, the personal chuddy-buddy MD you were looking for: a powerful all-in-one WhatsApp chat bot created to cover your personal WhatsApp needs ✍️👋👋
Clojure(Script) library for phrasing spec problems.
An application that uses gesture recognition for custom control of desktop software, built with MediaPipe + Electron + React
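That project wraps MediaPipe in an Electron/React front end; as a rough, self-contained Python sketch of the hand-tracking core such gesture control relies on, the following counts raised fingers from a webcam using MediaPipe Hands (the finger-counting heuristic and thresholds are assumptions, not the project's own logic):

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # default webcam
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.6) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                # Simple heuristic: a finger is "up" if its tip is above its PIP joint.
                tips, pips = [8, 12, 16, 20], [6, 10, 14, 18]
                up = sum(hand.landmark[t].y < hand.landmark[p].y
                         for t, p in zip(tips, pips))
                cv2.putText(frame, f"fingers up: {up}", (10, 30),
                            cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
                mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("gesture demo", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```

A real desktop-control app would map gestures like these to OS-level actions instead of drawing an overlay.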
Quickly add MediaPipe Pose Estimation and Detection to your iOS app. Enable powerful features in your app powered by body or hand tracking.
A curated list of awesome affective computing 🤖❤️ papers, software, open-source projects, and resources
Easy-to-use Python command-line tool to generate a gaze point heatmap from a CSV file. 👁️
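The tool's exact CSV schema and options are not shown here; a minimal sketch of the same idea, assuming a CSV with pixel-coordinate columns "x" and "y" and a 1920x1080 screen, could look like this:

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from scipy.ndimage import gaussian_filter

# Assumed CSV layout: one gaze point per row, pixel coordinates in "x" and "y".
df = pd.read_csv("gaze_points.csv")   # hypothetical file name
width, height = 1920, 1080            # assumed screen resolution

# Accumulate gaze points into a 2D histogram over the screen, then blur it
# so isolated points spread into smooth "hot" regions.
heat, _, _ = np.histogram2d(df["y"], df["x"],
                            bins=[height // 4, width // 4],
                            range=[[0, height], [0, width]])
heat = gaussian_filter(heat, sigma=6)

plt.imshow(heat, cmap="jet", extent=[0, width, height, 0])
plt.axis("off")
plt.savefig("gaze_heatmap.png", bbox_inches="tight", dpi=150)
```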
openEMSstim: open-hardware module to adjust the intensity of EMS/TENS stimulators.
Toolkits to create a human-in-the-loop approval layer to monitor and guide AI agent workflows in real time.
Code and data belonging to our CSCW 2019 paper: "Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites".
Wearable computing software framework for intelligence augmentation research and applications. Easily build smart glasses apps, relying on built-in voice commands, speech recognition, computer vision, UI, sensors, smartphone connection, NLP, facial recognition, database, cloud connection, and more. This repo is in beta.
The official implementation for ICMI 2020 Best Paper Award "Gesticulator: A framework for semantically-aware speech-driven gesture generation"
A webcam-based virtual gesture mouse that is easy to use with your hands on the desk.
Part of the HAKE project; includes the reproduced SOTA models and the corresponding HAKE-enhanced versions (CVPR 2020).
Official PyTorch implementation of TriHorn-Net
Notes for the Human-Computer Interaction course (CS6750)
😎 Awesome lists about Speech Emotion Recognition
Augmented Reality (AR) app for shoe try-on and foot size measurement
All about human-AI interaction (HCI + AI).
Basketball coaches often sketch plays on a whiteboard to help players get the ball through the net. A new AI model predicts how opponents would respond to these tactics.
CrowdTruth framework for crowdsourcing ground truth for training & evaluation of AI systems
Demo for "MoSculp: Interactive Visualization of Shape and Time"
Interactive Visualization Interface for Multidimensional Datasets
Fist, palm, and hand detection & tracking for intelligent human-computer interaction: game character movement control with OpenCV in Java (Processing sketchbook).
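That repository is a Java/Processing sketchbook; purely to illustrate the detect-hand-pose-then-map-to-movement idea, here is a hedged Python/OpenCV sketch using Haar cascades (the fist/palm cascade files are placeholders, not files shipped with OpenCV, and the movement mapping is invented for illustration):

```python
import cv2

# Hypothetical cascade files trained for fists and open palms;
# OpenCV does not bundle these, so the paths are placeholders.
fist_cascade = cv2.CascadeClassifier("fist.xml")
palm_cascade = cv2.CascadeClassifier("palm.xml")

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    fists = fist_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    palms = palm_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    # Toy mapping to game input: a fist means "jump", an open palm steers
    # left or right depending on which half of the frame it appears in.
    if len(fists) > 0:
        print("jump")
    for (x, y, w, h) in palms:
        print("left" if x + w / 2 < frame.shape[1] / 2 else "right")

    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break
cap.release()
```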
This is the research repository for Vid2Doppler: Synthesizing Doppler Radar Data from Videos for Training Privacy-Preserving Activity Recognition.
The Python code detects facial landmarks and predicts expressions such as a smile from them. It automatically takes a photo of the person when they smile. When both eyebrows are raised, the system automatically plays music, and the music stops when you blink your right eye.
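The repository's landmark-based code is not reproduced here; as a rough stand-in for the smile-triggered photo capture it describes, this sketch uses OpenCV's bundled Haar cascades instead of facial landmarks (the eyebrow and blink triggers are omitted):

```python
import cv2

# These cascade files ship with opencv-python.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

cap = cv2.VideoCapture(0)
shot_taken = False
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h, x:x + w]
        # Smile detection needs a high minNeighbors value to suppress false positives.
        smiles = smile_cascade.detectMultiScale(roi, scaleFactor=1.7, minNeighbors=22)
        if len(smiles) > 0 and not shot_taken:
            cv2.imwrite("smile_capture.jpg", frame)  # snap a photo on the first smile
            shot_taken = True
    cv2.imshow("smile watcher", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```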