HuMAn: Human Motion Anticipation

An AI human motion prediction algorithm

πŸ“ Table of Contents

🧐 About

The main inspiration for developing this algorithm is exoskeleton transparency control, which aims at achieving synchronization and synergy between the motions of the exoskeleton robot and the human user. By being able to predict future motions from a previous time sequence, HuMAn can provide anticipation to a chosen control strategy.

🏁 Getting Started

These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.

πŸ›  Prerequisites

What you need to install the software, and how to install it.

HuMAn is written in Python (currently version 3.8). Installing Python through Anaconda 🐍 is recommended because:

  • You gain access to Conda packages, apart from Pip packages;
  • Conda is a great tool for managing virtual environments (you can create one to install all the prerequisites for HuMAn)!

The other key dependencies are listed below (version numbers are given for reference; newer versions may work):

  • TensorFlow (version 2.4)

    • pip install tensorflow
  • NVIDIA CUDA Toolkit (version 11.0)

    • This is not mandatory, but highly recommended! An available NVIDIA GPU can speed up TensorFlow code to a great extent, when compared to running solely on CPU;
    • You can install it with Conda, enabling different versions of the toolkit to be installed in other virtual environments, or use the official installer from the NVIDIA website;
    • Ensure to pair TensorFlow and CUDA versions correctly (see this).
    • conda install cudatoolkit
  • STAR model (more about it below)

    • The authors of the STAR body model provide loaders based upon Chumpy, PyTorch and TensorFlow. I created a fork of their repository, to make pointing to the model (.npz files) directory easier and more flexible. You can install it using pip.
    • pip install git+https://github.com/Vtn21/STAR
  • Trimesh (version 3.9.1)

    • Used solely for visualizing AMASS recordings as body meshes, being thus not mandatory.
    • conda install -c conda-forge trimesh

πŸ—‚ Database and model

HuMAn uses the AMASS human motion database. Its data is publicly available, requiring only a simple account. The whole database (once uncompressed) amounts to around 23 GB of NumPy npz files, corresponding to more than 45 hours of recordings. Keep it in a directory of your choice.

AMASS data can be visualized using a series of body models, such as SMPL, SMPL-H (which includes hand motions), SMPL-X (SMPL eXpressive, with facial expressions), or the more recent STAR. HuMAn uses the STAR model, as it has fewer parameters than its predecessors while exhibiting more realistic shape deformations. You can download the models from their webpages, creating an account as you did for AMASS.

Please note that the body models are used here only for visualization and do not affect training. It is therefore straightforward to incorporate the other models for this purpose.
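
For reference, Trimesh only needs vertex and face arrays to display a mesh; in HuMAn those arrays come from the STAR model. The sketch below uses a placeholder tetrahedron so it stays self-contained, since the exact loader call depends on the fork:

import numpy as np
import trimesh

# Placeholder geometry: a tetrahedron stands in for a real body mesh.
# In practice, the vertices come from the STAR forward pass and the faces
# from the loaded model file.
vertices = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0]])
faces = np.array([[0, 1, 2], [0, 1, 3], [0, 2, 3], [1, 2, 3]])

mesh = trimesh.Trimesh(vertices=vertices, faces=faces, process=False)
mesh.show()  # opens an interactive viewer window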

Update the folder paths in the scripts as required. The example folder structure is given as follows:

.
β”œβ”€β”€ ...
β”œβ”€β”€ AMASS
β”‚   β”œβ”€β”€ datasets                          # Folder for all AMASS sub-datasets
β”‚   β”‚   β”œβ”€β”€ ACCAD                         # A sub-dataset from AMASS
β”‚   β”‚   β”‚   β”œβ”€β”€ Female1General_c3d        # Sub-folders for each subject
β”‚   β”‚   β”‚   β”‚   β”œβ”€β”€ A1 - Stand_poses.npz  # Each recording is a npz file
β”‚   β”‚   β”‚   β”‚   └── ...
β”‚   β”‚   β”‚   └── ...
β”‚   β”‚   β”œβ”€β”€ BMLhandball                   # Another sub-dataset (same structure)
β”‚   β”‚   β”‚   β”œβ”€β”€ S01_Expert                # Subject sub-folder
β”‚   β”‚   β”‚   └── ...
β”‚   β”‚   └── ...
β”‚   └── models                            # Folder for STAR model (and maybe others)
β”‚       └── star                          # The downloaded model
β”‚           β”œβ”€β”€ female.npz
β”‚           β”œβ”€β”€ male.npz
β”‚           └── neutral.npz
β”œβ”€β”€ HuMAn                                 # This repository
β”‚   └── ...
└── ...
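
To double-check a download, an individual recording can be inspected directly with NumPy (a minimal sketch; the path follows the example tree above and should be adapted to your setup, and the exact set of keys may vary between sub-datasets):

import numpy as np

# Path follows the example folder structure above; adapt it as needed.
recording = np.load("AMASS/datasets/ACCAD/Female1General_c3d/A1 - Stand_poses.npz")

# List the arrays stored in the recording (typically pose parameters,
# shape parameters, translations, gender and the mocap frame rate).
for key in recording.files:
    print(key, recording[key].shape)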

πŸ’» Installing

This repository is pip-compatible. The easiest way to use it is to clone it to a directory of your choice and install it as a pip package, which lets you import it from your own scripts:

git clone https://github.com/Vtn21/HuMAn
cd HuMAn
pip install -e .

🎈 Usage

After downloading and uncompressing the AMASS dataset, start by running the preprocessing script inside the scripts folder. Tweak it as necessary to create the TFRecord files that will provide the algorithm with training and validation data.
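
The feature layout of those files is defined by the preprocessing script itself, but counting the serialized examples is a quick sanity check that needs nothing beyond the file path (a minimal sketch; the filename is a placeholder):

import tensorflow as tf

# Placeholder filename: point this at a TFRecord file produced by the
# preprocessing script.
dataset = tf.data.TFRecordDataset("train.tfrecord")

# Count the serialized examples without parsing them.
num_examples = sum(1 for _ in dataset)
print("Examples:", num_examples)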

After that, use the training script to train your model; it automatically saves the training results. Call it with the command-line arguments "-d" ("dataset") and "-p" ("procedure"). The "train" procedure trains the model from scratch, while "transfer" loads the previously trained universal model and fine-tunes it on the selected dataset:

python train.py -d=universal -p=train
python train.py -d=bmlhandball -p=train
python train.py -d=bmlhandball -p=transfer
python train.py -d=mpihdm05 -p=train
python train.py -d=mpihdm05 -p=transfer

The evaluation folder contains several scripts for evaluating the trained model with a series of different metrics. It also contains the npz2mat script, which converts npz files to mat files so that results can be plotted in MATLAB. This is just a personal preference; it is entirely possible to use Matplotlib or another library for that purpose.
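
For reference, such a conversion boils down to a few lines with SciPy (a minimal sketch, not the repository's npz2mat script; the filenames are placeholders):

import numpy as np
from scipy.io import savemat

# Placeholder filenames: point these at an actual evaluation output.
data = np.load("results.npz")

# savemat expects a dict mapping variable names to arrays; unpack the
# NpzFile into one explicitly to keep the MATLAB variable names in view.
savemat("results.mat", {key: data[key] for key in data.files})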

🀝 Contributing

  • Fork the repo
  • Check out a new branch based and name it to what you intend to do:
    • git checkout -b BRANCH_NAME
  • Commit your changes
    • Please provide a git message that explains what you've done;
    • Commit to the forked repository.
      git commit -m "A short and relevant message"
  • Push to the branch
    • git push origin BRANCH_NAME
  • Make a pull request!

✍️ Author


Victor T. N. πŸ€–

Made with ❀️ by @Vtn21

πŸŽ‰ Acknowledgements
