uzh-rpg / vilib

CUDA Visual Library by RPG


How to make a live webcam test

feketerigo96 opened this issue · comments

I want to test vilib on my mobile robot, but the test binary is based on the EuRoC dataset. Can you tell me how to make a live webcam test?

Hi,
I hope that some general guidance will suffice here:

  1. Compile vilib into a shared library with make solib.

  2. Create a normal executable (that you link with our shared library), in which:
    a) Grab your camera's frames: depending on your camera, you should be able to do this with v4l2 or a custom driver - there are many examples online.
    b) As the algorithms presented in the paper work on grayscale images, you need your image in a grayscale representation, so you might need a YUYV/RGB(A)-to-grayscale conversion if that is not already the case.
    c) As you did not specify whether you want to do feature detection or tracking, I'd advise you to look at how the feature detector/tracker is initialized in the existing test executables and follow that; a rough sketch of the capture loop is shown after this list.
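A minimal sketch of steps a) and b), using OpenCV's VideoCapture as a stand-in for a v4l2 or custom camera driver. The vilib call is only a placeholder comment - the real class names and initialization parameters should be taken from the repository's test executables, not from this snippet.

```cpp
#include <opencv2/opencv.hpp>

int main() {
  cv::VideoCapture cap(0);                    // open the default webcam
  if (!cap.isOpened()) return 1;

  cv::Mat frame_bgr, frame_gray;
  while (cap.read(frame_bgr)) {
    // b) convert the color frame to the grayscale representation the algorithms expect
    cv::cvtColor(frame_bgr, frame_gray, cv::COLOR_BGR2GRAY);

    // c) hand frame_gray to the vilib detector/tracker, initialized the same way
    //    as in the EuRoC-based test binary (placeholder, not the real API):
    // detector->detect(frame_gray);

    cv::imshow("live", frame_gray);
    if (cv::waitKey(1) == 27) break;          // ESC quits
  }
  return 0;
}
```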

Once you have your initial version running, you could try loading the image directly into the 0th level of the image pyramid in device memory to omit the host-to-device transfer.
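A minimal sketch of that idea, assuming your capture pipeline already delivers the grayscale frame in device memory (e.g. a zero-copy capture path on a Jetson). The d_level0/level0_pitch buffer stands in for vilib's level-0 pyramid allocation, which the library manages internally - the names here are placeholders, not its API.

```cpp
#include <cuda_runtime.h>
#include <cstdint>

void copy_into_level0(const uint8_t* d_frame, size_t frame_pitch,
                      int width, int height,
                      uint8_t* d_level0, size_t level0_pitch) {
  // Device-to-device copy straight into pyramid level 0: no host staging
  // buffer and no host-to-device transfer on the critical path.
  cudaMemcpy2D(d_level0, level0_pitch,
               d_frame, frame_pitch,
               width, height,               // width in bytes (8-bit grayscale)
               cudaMemcpyDeviceToDevice);
}
```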

Thanks, that helps a lot.