Inspired by the arcade games Dance Dance Revolution and Just Dance, our web-based project lets anyone input any song they wish (via YouTube), and our deep learning (DL) model generates dance moves for them to follow. The game not only takes the song as input but also processes a live video stream of the user dancing, using a second DL model to detect their pose. The goal is an interactive game in which players imitate the AI-generated dance moves or challenge friends to score points.
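The scoring loop described above boils down to comparing the player's detected pose against the AI-generated reference pose on each frame. Below is a minimal, hypothetical sketch of such a comparison; the function name `pose_similarity` and the assumption that both poses arrive as equal-length lists of normalized (x, y) keypoints (as a 2D estimator like OpenPose could provide) are illustrative, not taken from the project's actual code.

```python
import math

def pose_similarity(reference, detected):
    """Return a similarity score in [0, 1]; 1.0 means a perfect match.

    Both arguments are equal-length sequences of (x, y) keypoints,
    assumed to be normalized to the unit image frame. (Hypothetical
    helper, not the project's actual scoring function.)
    """
    if len(reference) != len(detected) or not reference:
        raise ValueError("keypoint lists must be non-empty and equal length")
    # Average Euclidean distance between corresponding keypoints.
    total = 0.0
    for (rx, ry), (dx, dy) in zip(reference, detected):
        total += math.hypot(rx - dx, ry - dy)
    mean_dist = total / len(reference)
    # Map the mean distance to a score; sqrt(2) is the largest possible
    # distance between two points in a unit frame.
    return max(0.0, 1.0 - mean_dist / math.sqrt(2))
```

A game loop could call this per frame and accumulate the score over the duration of the song.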
docker build -t openpose -f openpose-Dockerfile .
docker run openpose
docker build -t hyw/openpose:v0 -f Dockerfile .
Please note that our initial repository is at this link. We removed all the heavy files to keep the repository lightweight.
- Ubuntu or a similar Linux distribution (we used Pop!_OS)
- NVIDIA GPU (we used an RTX 2060)
- 1080p webcam
- Learning2Dance
- CUDA toolkit version 11.1.1
- cuDNN version 8.1.0
- nvidia-docker
- Build the Learning2Dance Docker image and place the Learning2Dance folder inside the HYW directory
- Connect the webcam via USB to your machine
- Run this command from inside the HYW directory:
sudo nvidia-docker run -u root -it --device=/dev/video0 --device=/dev/video1 --network host -v $(pwd):/workspace hyw/openpose:v0
- Run:
python3 main.py
- Enjoy playing and dancing :)