liangyuwei / sign_language_robot

Dynamic Movement Primitive based Motion Retargeting, along with the sign language robot constituted by ABB's YuMi dual-arm collaborative robot and Inspire Robotics' multi-fingered hands.

sign_language_robot

This is the code for the paper "Dynamic Movement Primitive based Motion Retargeting for Dual-Arm Sign Language Motions" accepted by ICRA 2021.
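For readers unfamiliar with Dynamic Movement Primitives, the sketch below rolls out a single-degree-of-freedom discrete DMP. It is a minimal illustration only; the gains, basis-function placement, and weights are assumptions and are not taken from this repository or the paper.

```python
# Minimal single-DOF discrete DMP rollout (illustrative; gains, basis
# placement, and weights are assumptions, not values from this repo).
import numpy as np

def dmp_rollout(y0, g, w, tau=1.0, dt=0.01,
                alpha_y=25.0, beta_y=6.25, alpha_x=8.0):
    n_bfs = len(w)
    # Basis-function centers spread along the canonical variable x in (0, 1]
    c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_bfs))
    h = 1.0 / (np.diff(c, append=c[-1]) ** 2 + 1e-8)  # basis widths
    x, y, z = 1.0, y0, 0.0
    traj = []
    for _ in range(int(tau / dt)):
        psi = np.exp(-h * (x - c) ** 2)
        # Forcing term shaped by the learned weights, scaled by (g - y0)
        f = (psi @ w) / (psi.sum() + 1e-10) * x * (g - y0)
        dz = (alpha_y * (beta_y * (g - y) - z) + f) / tau
        dy = z / tau
        dx = -alpha_x * x / tau
        z, y, x = z + dz * dt, y + dy * dt, x + dx * dt
        traj.append(y)
    return np.array(traj)

# With zero weights the DMP converges smoothly from y0 to the goal g;
# learned weights reshape the path while preserving the start and goal.
path = dmp_rollout(y0=0.0, g=1.0, w=np.zeros(20))
```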

Youtube Video: Click Here

Bilibili Video: Click Here

Structure

This repo contains code for recording and processing sign language demonstrations, performing motion retargeting, and related utilities. It will be re-organized soon.

About data collection

We use OptiTrack Motive and Wiseglove for recording human arm motions and finger movements. For further details, please refer to arm_hand_capture/README.md.

Dependencies

For data collection (see the sketch after this list):
  - usb_cam
  - vrpn_client_ros
  - rosserial_server
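As a rough sketch of how the motion-capture data enters ROS, the node below subscribes to a rigid-body pose topic published by vrpn_client_ros. The tracker name "RightHand" and the node name are placeholders, not names from this repo; the actual topic layout is documented in arm_hand_capture/README.md.

```python
#!/usr/bin/env python
# Sketch: listen to OptiTrack rigid-body poses streamed by vrpn_client_ros.
# "RightHand" is a placeholder tracker name, not one defined in this repo.
import rospy
from geometry_msgs.msg import PoseStamped

def pose_cb(msg):
    p = msg.pose.position
    rospy.loginfo("x=%.3f y=%.3f z=%.3f", p.x, p.y, p.z)

if __name__ == "__main__":
    rospy.init_node("mocap_listener")
    rospy.Subscriber("/vrpn_client_node/RightHand/pose", PoseStamped, pose_cb)
    rospy.spin()
```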

FAQ

  1. The error “sh: 1: v4l2-ctl: not found” means the v4l2 command-line tools are not installed; install them with sudo apt-get install v4l-utils

Languages

C++ 68.2%, Python 26.4%, CMake 3.8%, AMPL 1.2%, Dockerfile 0.2%, C 0.2%, Shell 0.1%, MATLAB 0.0%