The Musical Gestures Toolbox for Python is a collection of tools for video visualization and video analysis.
Video recordings can be turned into new visualizations that support analysis. The aim of creating such alternative displays from video recordings is to uncover features, structures, and similarities within the material itself, and in relation to, for example, score material. Three useful visualization techniques are motion images, motion history images, and motiongrams.
MGT can generate both dynamic and static visualizations, as well as some quantitative data (a minimal usage sketch follows the list):
- dynamic visualizations (video files)
  - motion videos
  - motion history videos
- static visualizations (images)
  - motion average images
  - motiongrams
  - videograms
- motion data (CSV files)
  - quantity of motion
  - centroid of motion
  - area of motion
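
As a rough impression of how these outputs are produced, here is a minimal usage sketch. It assumes the `MgVideo` class and the `motion()` and `history()` methods described in the toolbox documentation; exact method names and options may vary between versions, and `dance.avi` is just a placeholder file name.

```python
import musicalgestures

# Load a source recording (placeholder path) as an MgVideo object.
video = musicalgestures.MgVideo('dance.avi')

# Motion analysis: depending on its options this writes a motion video,
# motiongrams, and a CSV with quantity, centroid, and area of motion.
video.motion()

# Motion history video (method name assumed from the documentation).
video.history()
```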
Watch a 10-minute intro video to the toolbox:
The standard installation is via pip: paste and execute the following command in the Terminal (macOS, Linux) or PowerShell (Windows):
```
pip install musicalgestures
```
MGT is developed in Python 3 and relies on FFmpeg and OpenCV. See the wiki for more details on the installation process.
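
A quick way to verify that these dependencies are reachable before running the toolbox is a small sanity check like the one below. It is not part of MGT itself; it just confirms that FFmpeg is on the system PATH and that the OpenCV Python bindings import correctly.

```python
import shutil
import cv2

# FFmpeg is called as an external program, so it must be on the PATH.
print("FFmpeg:", shutil.which("ffmpeg") or "not found")

# OpenCV is used through its Python bindings (the cv2 package).
print("OpenCV:", cv2.__version__)
```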
The Jupyter notebook MotionGesturesToolbox.ipynb shows examples of how to use the toolbox.
This toolbox builds on the Musical Gestures Toolbox for Matlab, which in turn builds on the Musical Gestures Toolbox for Max.
The software is currently maintained by the fourMs lab at RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo.
If you use this toolbox in your research, please cite this article:
- Laczkó, B., & Jensenius, A. R. (2021). Reflections on the Development of the Musical Gestures Toolbox for Python. Proceedings of the Nordic Sound and Music Computing Conference.
Developers: Frida Furmyr, Marcus Widmer, Balint Laczko, Alexander Refsum Jensenius
This toolbox is released under the GNU General Public License v3.0.