artefact_mapping

Detection, tracking, and mapping of object artefacts

Install

ROS and system dependencies

sudo apt install software-properties-common
sudo add-apt-repository "deb http://packages.ros.org/ros/ubuntu noetic main"
wget https://raw.githubusercontent.com/ros/rosdistro/master/ros.key -O - | sudo apt-key add -
sudo apt update
sudo apt install ros-noetic-desktop-full "ros-noetic-tf2-*" "ros-noetic-camera-info-manager*" --yes

sudo rosdep init
rosdep update
echo ". /opt/ros/noetic/setup.bash" >> ~/.bashrc
source ~/.bashrc

sudo apt install autotools-dev doxygen dh-autoreconf git git-lfs liblapack-dev libblas-dev libgtest-dev \
libreadline-dev libssh2-1-dev clang-format-6.0 python3-autopep8 python3-catkin-tools python3-pip python3-git \
python-setuptools python3-termcolor python3-wstool libatlas3-base python-is-python3 --yes

pip install -U requests

sudo apt install -y ccache &&\
echo 'export PATH="/usr/lib/ccache:$PATH"' | tee -a ~/.bashrc &&\
source ~/.bashrc && echo $PATH
ccache --max-size=10G

Setup catkin workspace

mkdir -p artefact_mapping_ws/src
cd artefact_mapping_ws
catkin init
catkin config --merge-devel # Necessary for catkin_tools >= 0.4.
catkin config --extend /opt/ros/noetic
catkin config --cmake-args -DCMAKE_BUILD_TYPE=RelWithDebInfo -DCMAKE_CXX_FLAGS=-fdiagnostics-color
cd src

Clone and build

git clone https://github.com/ethz-asl/artefact_mapping.git --recursive -b summer_school2021
catkin build artefact_mapping

Running the node

Source the workspace

source ~/artefact_mapping_ws/devel/setup.bash

Start mapping artefacts

Adapt the SMB name in the launch file below to the correct SMB number, then run it to start the object mapping node.

roslaunch artefact_mapping artefact_mapping.launch

Detections are published on the W_landmark topic in the odometry frame. Other parameters, such as those listed below, can also be added and adapted in the launch file.

| Flag | Default | Description |
|------|---------|-------------|
| object_tracker_image_topic | /camera/color/image_raw | ROS image topic on which to perform detection and tracking |
| image_topic_buffer_size | 200 | Buffer size of the ROS topic listener for incoming images |
| sensor_calibration_file | share/camchain.yaml | Camera calibration file used for retrieving the intrinsics |
| object_tracker_detection_period | 20 | Period with which to run object detection (YOLO). Frames in between are tracked with a faster method (KCF) to obtain object locations. With the default value the detector runs on every 20th frame |
| darknet_cfg_path | share/yolov3.cfg | Path to the network configuration file (the tiny version will run on a CPU but produces less accurate detections). See the share folder for options |
| darknet_weights_path | share/yolov3.weights | Path to the network weights; must match the cfg file. See the share folder for options |
| darknet_classes | 0 | Comma-separated list (no spaces) defining the tracked object classes. The association between object name and number can be found here (numbering starts from 0) |
| darknet_detection_threshold | 0.4 | Detection confidence threshold at which to start tracking an object |
| darknet_nms_threshold | 0.45 | Non-maximum suppression threshold, used to eliminate duplicate detections |
| tracker_confidence_threshold | 0.8 | Confidence threshold at which the tracker still considers it is following a valid object; if the confidence drops below this threshold, the track is terminated and the object pose triangulated |
| track_reassociation_iou | 0.3 | If a new detection and an existing track have an intersection-over-union (IoU) overlap above this threshold, they are merged instead of creating a new separate track |
| object_tracker_pose_buffer_length | 600 | TF buffer length in seconds, i.e. the time within which images must be processed before they no longer have a valid odometry transform associated |
| sensor_tf_frame | /blackfly_right_optical_link | Camera TF frame for triangulation |
| odom_tf_frame | /odom | Odometry TF frame for triangulation |
| publish_debug_images | false | Publish a debug topic /artefact_mapping/debug_image showing the detections and their tracking in real time |
| v | 0 | Verbosity of the node; increasing to 1 or 2 gives more debug messages and information on what is being detected, tracked, and published |
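Several of these thresholds interact; for instance, track_reassociation_iou compares each new detection against existing tracks by intersection over union. A minimal sketch of that computation (illustrative only, not the node's actual code; boxes assumed to be (x, y, width, height) tuples):

```python
def iou(a, b):
    """Return the intersection over union of two axis-aligned boxes (x, y, w, h)."""
    ax1, ay1, ax2, ay2 = a[0], a[1], a[0] + a[2], a[1] + a[3]
    bx1, by1, bx2, by2 = b[0], b[1], b[0] + b[2], b[1] + b[3]
    # Overlap rectangle; width/height clamp to zero when the boxes are disjoint.
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

# With the default threshold of 0.3, these two heavily overlapping boxes
# would be merged into one track rather than starting a new one:
print(iou((0, 0, 10, 10), (2, 2, 10, 10)) > 0.3)  # True
```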

Running on SMB

Preliminary steps

  1. Make sure the right arguments are passed to the launch file in Artefact_Mapping/artefact_mapping/launch/artefact_mapping.launch. These should be:
    1. smb_number = 264
    2. object_tracker_image_topic = /versavis/cam0/image_raw
    3. darknet_cfg_path = $(find artefact_mapping)/share/yolov3.cfg
    4. darknet_weights_path = $(find artefact_mapping)/share/yolov3.weights
    5. darknet_classes -> check with the judges for the correct artefacts
    6. odom_tf_frame = /tracking_camera_odom
    7. map_tf_frame = /map
    8. darknet_detection_threshold = 0.2
    9. image_topic_buffer_size = 1000
    10. Save the launch file
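Put together, these settings might look roughly as follows inside the launch file. This is a hedged sketch assuming the values are exposed with standard roslaunch `<arg>`/`<param>` syntax; the actual file may pass them differently (e.g. as node arguments), so match its existing structure rather than copying this verbatim:

```xml
<launch>
  <arg name="smb_number" default="264" />
  <param name="object_tracker_image_topic" value="/versavis/cam0/image_raw" />
  <param name="darknet_cfg_path" value="$(find artefact_mapping)/share/yolov3.cfg" />
  <param name="darknet_weights_path" value="$(find artefact_mapping)/share/yolov3.weights" />
  <!-- darknet_classes: set after checking with the judges -->
  <param name="odom_tf_frame" value="/tracking_camera_odom" />
  <param name="map_tf_frame" value="/map" />
  <param name="darknet_detection_threshold" value="0.2" />
  <param name="image_topic_buffer_size" value="1000" />
</launch>
```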

Deployment

  1. Open 4 terminals
  2. In terminal 1:
    1. $ ssh team8@10.0.4.5
    2. Password: smb
  3. In terminal 2:
    1. $ ssh team8@10.0.4.5
    2. Password: smb
    3. $ cd Artefact_Mapping
    4. $ source devel/setup.bash
    5. $ roslaunch artefact_mapping artefact_mapping.launch
  4. In terminal 3:
    1. $ hostname -I -> Remember the IP address it outputs
    2. $ export ROS_MASTER_URI=http://10.0.4.5:11311
    3. $ export ROS_IP=10.0.4.130 (or whatever hostname -I outputted)
    4. $ rqt_image_view
    5. In rqt_image_view, select the topic /versavis/cam0/image_raw. Make sure the camera feed from SMB264 is being broadcast. You can now see what the robot is seeing
  5. In terminal 4:
    1. $ export ROS_MASTER_URI=http://10.0.4.5:11311
    2. $ export ROS_IP=10.0.4.130
    3. rosbag record -O competition.bag /tf /tf_static /versavis/cam0/image_raw /artefact_mapping/image_debug
    4. A recording of the competition will be available in the home directory of this base station computer
  6. It is important to keep the connection to the robot during the run. Make sure the connection is reliable!
  7. Once the run is done, kill the tasks in terminals 2, 3, and 4
  8. Copy the artifacts.csv file from the SMB to the base station computer. In terminal 2:
    1. $ scp smb264@10.0.4.5:/tmp/artifacts.csv yoruseer@10.0.4.130:/home/yoruseer. This copies the artifacts.csv file from the robot to the base station computer
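Steps 4 and 5 above set ROS_IP to whichever of the base station's addresses sits on the robot's network. Picking that address out of a `hostname -I` style list can be scripted; a small sketch (hypothetical helper, assuming the 10.0.4.x subnet used in the steps above):

```python
def pick_ros_ip(hostname_i_output, subnet_prefix="10.0.4."):
    """Return the first address on the robot's subnet, or None if absent.

    hostname_i_output is the space-separated address list that `hostname -I`
    prints; subnet_prefix is the robot LAN assumed in this deployment.
    """
    for addr in hostname_i_output.split():
        if addr.startswith(subnet_prefix):
            return addr
    return None

# The base station has two interfaces; only one is on the robot LAN.
print(pick_ros_ip("192.168.1.20 10.0.4.130"))  # 10.0.4.130
```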

Post processing steps

  1. From now on, no need for any SSH connection to the robot. You should have the artifacts.csv and the competition.bag files on the base station computer
  2. Open artifacts.csv in your favorite spreadsheet software.
  3. In a terminal, run $ rosbag play competition.bag -p. If the rosbag lasts more than 10 minutes, you can speed it up by running $ rosbag play competition.bag -p -r 1.5, where 1.5 is the relative speedup (adapt it to how long the recording is)
  4. Run $ rqt_image_view. Select the topic /artefact_mapping/image_debug.
  5. Visible on your screen should be:
    1. The *csv file
    2. rqt_image_view
    3. The terminal window with the rosbag running
  6. Click on the terminal window. Press the space bar to launch the rosbag
  7. Many false positives are to be expected in the detections. When a true positive occurs, highlight it in the csv file. Make sure the timestamp corresponds to the timestamp shown in the terminal where the rosbag is playing. If things are going too quickly to appropriately highlight the true positives, feel free to pause the rosbag by pressing the space bar
  8. Once the rosbag has fully played, delete all non-highlighted detections. Hand over this *csv file to the judges.
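The manual filtering in steps 7 and 8 can also be done programmatically once the true-positive timestamps are known. A sketch, assuming a hypothetical column layout with the timestamp in the first column (the real artifacts.csv header may differ):

```python
import csv
import io

def keep_true_positives(csv_text, confirmed_timestamps):
    """Keep the header plus only the rows whose timestamp was confirmed.

    csv_text is the raw artifacts.csv contents; confirmed_timestamps is the
    set of timestamp strings highlighted during rosbag playback.
    """
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, body = rows[0], rows[1:]
    kept = [row for row in body if row[0] in confirmed_timestamps]
    return [header] + kept

# Two detections recorded; only the one at t=130.2 was a true positive.
raw = "stamp,class,x,y,z\n100.5,0,1.0,2.0,0.5\n130.2,0,4.0,1.0,0.3\n"
print(keep_true_positives(raw, {"130.2"}))
# [['stamp', 'class', 'x', 'y', 'z'], ['130.2', '0', '4.0', '1.0', '0.3']]
```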
