I'm building a visualisation tool for live music performances. The app generates a video grid containing one video for every element in the song. For pre-recorded elements, the videos come from existing recordings; for elements that are played live, the video is captured from webcams, capture cards, or live streams.
This video explains the function of the app in detail.
This app is being developed live in weekly YouTube live streams, which are announced ahead of time on my YouTube channel. You can also find all the previous streams in the C++ Real-time video processing playlist.
This app has three external dependencies:
- CMake to build the project,
- pkg-config to locate shared libraries,
- FFmpeg to read and write video files.
On macOS, using Homebrew, run:
brew install cmake ffmpeg pkg-config
Then clone the repository together with its submodules:
git clone https://github.com/bartjoyce/video-app --recursive
The repository includes GLFW as a git submodule, so make sure to clone the repo recursively. If CMake complains that it can't find GLFW, the submodule was probably not initialised during cloning. In that case, try:
git submodule update --init
Inside the repo, create a build directory and run CMake within it:
mkdir build
cd build
cmake ..
make
./video-app
To test webcam capture, check out the test/webcam branch and rebuild:
git checkout test/webcam
cd build
cmake ..
make
./video-app
This will use AVFoundation to display your webcam feed on the OpenGL surface. (AVFoundation is Apple's capture framework, so this branch only works on macOS.)
- Fix `av_err2str` build problem in gcc (see issue).
- Fix `YUVJ` deprecation problem.
- Fix `sws_scale()` segmentation fault on gcc due to badly constructed output buffers.
- Consider switch to SDL?
- Replace `sws_scale()` with a hardware-accelerated alternative.
- Audio playback.
- Playback of multiple videos simultaneously.