aurora-opensource / xviz

A protocol for real-time transfer and visualization of autonomy data

Home Page: http://xviz.io

How to play live perception data without ROS

LeZhang13 opened this issue · comments

I can successfully convert KITTI and nuScenes data and visualize it at localhost:8080 on my Linux machine.
But now my team wants to use XVIZ and streetscape.gl to visualize our live perception data. We are not currently using ROS; is there any way to convert and visualize live perception data with a Python script? We will make our perception output look more like the KITTI or nuScenes data format.
We will run the perception algorithm in Python on an IPC in a Linux environment, and try to use AVS for visualization in the same environment.
Hope for your feedback.
Thanks!
@twojtasz

Messaging over a websocket is how you would send live data. For the Python case, you can see the websocket example here:
https://github.com/twojtasz/twojtasz.github.io/blob/master/jupyter-notebooks/XVIZ%20Simulated%20Scenario.ipynb
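To give a rough idea of what those websocket messages look like, here is a minimal sketch that hand-builds XVIZ v2 style JSON messages in plain Python. The stream name `/perception/objects` and the exact field layout are illustrative assumptions based on the XVIZ v2 JSON protocol, not a complete implementation; in practice you would likely use an XVIZ builder library, as the linked notebook does.

```python
import json

def build_metadata():
    """XVIZ v2 metadata, sent once per connection before any state updates.

    Declares a single polygon primitive stream; field names follow the
    XVIZ v2 JSON spec but are simplified for illustration.
    """
    return json.dumps({
        "type": "xviz/metadata",
        "data": {
            "version": "2.0.0",
            "streams": {
                "/perception/objects": {  # assumed stream name
                    "category": "PRIMITIVE",
                    "primitive_type": "POLYGON",
                },
            },
        },
    })

def build_state_update(timestamp, polygons):
    """One snapshot carrying perception detections as polygons.

    `polygons` is a list of vertex lists, each vertex an [x, y, z] triple.
    """
    return json.dumps({
        "type": "xviz/state_update",
        "data": {
            "update_type": "SNAPSHOT",
            "updates": [{
                "timestamp": timestamp,
                "primitives": {
                    "/perception/objects": {
                        "polygons": [{"vertices": v} for v in polygons],
                    },
                },
            }],
        },
    })

# Example: one detected object (a 4 m x 2 m box footprint) at t=0.0
msg = build_state_update(0.0, [[[0, 0, 0], [4, 0, 0], [4, 2, 0], [0, 2, 0]]])
```

Each perception cycle, your Python process would call `build_state_update` with the latest detections and push the resulting string over the open websocket connection (for example with a third-party websocket server library), which is the same pattern the notebook above follows.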

Let me know if this works for you.

Hi @twojtasz, thanks for your feedback. I will try it with my team!

@LeZhang13 Are you running on Linux or Mac? I keep getting an "update required" error even after `yarn bootstrap` ran successfully.

I will run on Linux, but I have been spending my time on the nuScenes data converter and have not tried live streaming yet.