There are 31 repositories under the human-robot-interaction topic.
Code accompanying the ECCV 2020 paper "Trajectron++: Dynamically-Feasible Trajectory Forecasting With Heterogeneous Data" by Tim Salzmann*, Boris Ivanovic*, Punarjay Chakravarty, and Marco Pavone (* denotes equal contribution).
YARP - Yet Another Robot Platform
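For a flavor of YARP's port-based middleware, here is a minimal sketch using its Python bindings; the port name is a placeholder, and it assumes the bindings are installed and a yarpserver is reachable on the network.

```python
# Minimal sketch: publish a message over a YARP port via the Python
# bindings. Assumes a yarpserver is reachable; the port name is a
# placeholder chosen for illustration.
import yarp

yarp.Network.init()                 # connect to the YARP name server

port = yarp.BufferedPortBottle()
port.open("/demo/out")              # placeholder port name

bottle = port.prepare()             # reusable outgoing message buffer
bottle.clear()
bottle.addString("hello")
port.write()                        # deliver to any connected readers

port.close()
yarp.Network.fini()
```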
Nimble: Physics Engine for Biomechanics and Deep Learning
RT-GENE: Real-Time Eye Gaze and Blink Estimation in Natural Environments
[IROS 2020] se(3)-TrackNet: Data-driven 6D Pose Tracking by Calibrating Image Residuals in Synthetic Domains
Assistive Gym, a physics-based simulation framework for physical human-robot interaction and robotic assistance.
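Assistive Gym exposes its tasks through the standard OpenAI Gym interface, so a rollout can be sketched as below; the environment ID and the random policy are illustrative assumptions, not a verified part of the package.

```python
# Minimal sketch of an Assistive Gym rollout, assuming the standard
# OpenAI Gym API; the environment ID is an assumption and may differ
# across versions of the package.
import gym
import assistive_gym  # registers the assistive environments with Gym

env = gym.make('FeedingJaco-v1')  # assumed ID: robot-assisted feeding task
observation = env.reset()

done = False
while not done:
    action = env.action_space.sample()  # random actions, illustration only
    observation, reward, done, info = env.step(action)

env.close()
```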
Code for CVPR'18 spotlight "Weakly and Semi Supervised Human Body Part Parsing via Pose-Guided Knowledge Transfer"
A curated list of robot social navigation resources.
PaddleRobotics is an open-source robotics algorithm library based on Paddle, covering human-robot interaction, complex motion control, environment perception, SLAM localization, and navigation.
[ICRA 2023] Intention Aware Robot Crowd Navigation with Attention-Based Interaction Graph
URDF models of humans created for human-robot interaction experiments.
Code accompanying "The Trajectron: Probabilistic Multi-Agent Trajectory Modeling with Dynamic Spatiotemporal Graphs" by Boris Ivanovic and Marco Pavone.
A curated list of awesome human-robot interaction libraries and resources
[ICRA 2021] Decentralized Structural-RNN for Robot Crowd Navigation with Deep Reinforcement Learning
Software repository for estimating human dynamics
Gesture recognition for human-robot interaction: modelling, training, analysing, and recognising gestures with computer vision and machine learning techniques. This work was done at the Distributed Artificial Intelligence Lab (DAI Labor), Berlin.
Official codebase for Sirius: Robot Learning on the Job
[ICRA 2023] Intention Aware Robot Crowd Navigation with Attention-Based Interaction Graph -- Sim2real code on Turtlebot2i
[RA-L + ICRA22] Learning Sparse Interaction Graphs of Partially Detected Pedestrians for Trajectory Prediction
Learning human-aware robot navigation behavior from demonstrations via Maximum Entropy Inverse Reinforcement Learning.
Python controller for the Pepper humanoid robot. It allows writing apps in Python, includes examples of simple applications for Pepper, and provides a GUI to operate the robot and run custom apps.
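As a rough illustration of what a Python app for Pepper involves, here is a minimal sketch using the underlying NAOqi `qi` SDK, which controllers like this one typically wrap; the robot address is a placeholder, and this repo's own API will differ.

```python
# Minimal sketch: drive Pepper from Python through the NAOqi 'qi' SDK,
# which higher-level controllers typically wrap. The robot address is
# a placeholder; the repo's own wrapper API will differ.
import qi

session = qi.Session()
session.connect("tcp://192.168.1.10:9559")  # placeholder robot address

tts = session.service("ALTextToSpeech")     # standard NAOqi speech service
motion = session.service("ALMotion")        # standard NAOqi motion service

motion.wakeUp()                  # stiffen joints, move to ready posture
tts.say("Hello, I am Pepper.")   # simple text-to-speech call
motion.rest()                    # relax back to rest posture
```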
[arXiv 2024] "HEIGHT: Heterogeneous Interaction Graph Transformer for Robot Navigation in Crowded and Constrained Environments"
Implementation of SHARP: Shielding-Aware Robust Planning for Safe and Efficient Human-Robot Interaction (RA-L 2022)
Implementation of implicit dual control-based active uncertainty learning for human-robot interaction (WAFR 2022 & IJRR 2023)
Expressive robot face designed for tablets
OpenPHRI, a complete and generic solution for safe physical human-robot interactions
This is an official PyTorch implementation of "Gesture2Vec: Clustering Gestures using Representation Learning Methods for Co-speech Gesture Generation" (IROS 2022).
Provides the Pepper robot with conversational abilities to handle free open-domain dialogue.
A package for simple, expressive, and customizable text-to-speech with an animated face.
The human uses a BCI to participate in multi-robot strategy selection and can take control of a single robot at any time. For this purpose, a dedicated simulation system was developed, composed of: a non-invasive Emotiv EPOC BCI; a Qt-based graphical user interface (GUI); and a multi-robot simulation environment based on Gazebo (an open-source 3D robotics simulator).
Yet another repo for the Baxter collaboration task.