Sejong University Selly Project Developers in 2020
This is autonomous-driving robot software for a delivery robot that drives on pavement only. The first test path is the pavement in front of Sejong University's main gate, along the road from the on-campus cafe 'Pandorosi' to the AI Center.
This is a vision-based autonomous driving algorithm that uses only camera sensors.
- autonomous-driving-vision : Image analysis to avoid obstacles :pushpin:
- road-segmentation : Segmentation for pavement driving
- object-detection : Object detection to distinguish moving obstacles such as cars from fixed objects such as trees
- VSLAM : Visual SLAM for mapping and localization, so the robot knows where it is
- If you use a Lidar sensor, you can use Cartographer for SLAM
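The road-segmentation module above produces a pavement mask from the camera image. As a minimal stand-in for the learned segmentation model, the idea can be sketched with a simple brightness-and-grayness threshold (the function name and threshold values below are illustrative assumptions, not the project's actual model):

```python
import numpy as np

def pavement_mask(bgr_image, gray_tolerance=30, min_value=80):
    """Mark pixels as pavement if they are bright and nearly gray.

    bgr_image: HxWx3 uint8 array. The thresholds are illustrative
    guesses, not values from the Selly project.
    """
    img = bgr_image.astype(np.int16)
    b, g, r = img[..., 0], img[..., 1], img[..., 2]
    # Pavement tends to be achromatic: all three channels close together.
    near_gray = (
        (np.abs(b - g) < gray_tolerance)
        & (np.abs(g - r) < gray_tolerance)
        & (np.abs(b - r) < gray_tolerance)
    )
    # And reasonably bright, to exclude dark shadows.
    bright = img.max(axis=-1) >= min_value
    return (near_gray & bright).astype(np.uint8) * 255
```

A trained segmentation network replaces this heuristic in practice, but the output contract is the same: a per-pixel binary mask the navigation stack can drive on.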
January 2020 - June 2020
- Embedded : Jetson nano, Arduino
- Vision : Python3, Jupyter Notebook, Tensorflow 2.0
- Navigation : C++, ROS, Android studio
| Requirement | Description |
|---|---|
| ZED2 | Camera sensor |
| ydlidar | Lidar sensor for testing |
| ROS_melodic | We need the Melodic version of ROS because the Jetson Nano runs Ubuntu 18.04 |
| Jetson_Nano | Main embedded board, running Ubuntu 18.04 |
| python | v3.6 or higher |
| opencv | v4.1.1 or higher |
| tensorflow | v2.0 or higher |
| git | We follow the GitHub flow |
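The version requirements in the table can be checked programmatically before running the stack. A small sketch (the helper names and the `REQUIRED` mapping are illustrative, not part of the project):

```python
import platform

def version_tuple(v):
    """Turn a dotted version string like '4.1.1' into a comparable tuple."""
    return tuple(int(p) for p in v.split(".") if p.isdigit())

def meets(installed, required):
    """True if the installed version satisfies the minimum requirement."""
    return version_tuple(installed) >= version_tuple(required)

# Minimum versions from the requirements table above.
REQUIRED = {"python": "3.6", "opencv": "4.1.1", "tensorflow": "2.0"}

if __name__ == "__main__":
    # Check the interpreter itself; opencv/tensorflow can be checked
    # the same way via cv2.__version__ and tf.__version__.
    assert meets(platform.python_version(), REQUIRED["python"]), "Python too old"
```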
- Motor-control : Motor control via the motor driver
- Navigation : Suggests a route from the current position to the destination using GPS and SLAM
- cartographer : 2D SLAM based on the LIDAR sensor
- selly_vision : Prototype image analysis to avoid obstacles
- software : Lidar, Raspberry Pi camera, and Arduino motor test code, plus practice code for the autonomous driving robot
- selly_motor : Arduino ROS subscriber and Jetson Nano ROS publisher, which control the motor
- navigation : selly_service, selly_motorControl
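In the selly_motor setup, the Jetson Nano publishes velocity commands that the Arduino subscriber turns into motor output. The underlying differential-drive math such a node might use can be sketched as follows (the function name and the wheel-base value are illustrative assumptions, not the project's actual code):

```python
def twist_to_wheel_speeds(linear_mps, angular_radps, wheel_base_m=0.3):
    """Differential-drive kinematics: convert a body twist
    (forward speed + turn rate) into left/right wheel speeds (m/s).

    wheel_base_m is the distance between the two wheels; 0.3 m is an
    illustrative value, not the actual Selly chassis measurement.
    """
    # Turning adds speed to the outer wheel and removes it from the inner one.
    left = linear_mps - angular_radps * wheel_base_m / 2.0
    right = linear_mps + angular_radps * wheel_base_m / 2.0
    return left, right
```

On the real robot, the publisher would send these two values over a ROS topic and the Arduino would map them to motor-driver PWM duty cycles.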