Tim-HW / HW-BlueRov2-Sonar-based-SLAM

This project evaluates simultaneous localisation and mapping (SLAM) algorithms that fuse sonar with DVL and IMU data to produce maps for autonomous underwater vehicle (AUV) and underwater ROV navigation.

BlueRov2 SLAM

deprecated

Recent approaches to robot localisation in complex environments have been data-intensive; for example, detailed LIDAR-derived maps have been used in self-driving cars. In the underwater domain, the ability to gather or exploit equivalent data is more limited, due both to the difficulty of access and to the characteristics of the sensors typically deployed. Most of the work done around offshore energy installations uses remotely operated vehicles (ROVs) fitted with video cameras and sonars. The video data is used for close-range navigation over several metres, as well as to inspect a system or product. Sonars are used as a pilot navigation aid over longer ranges where the use of video is not practical. This project will evaluate simultaneous localisation and mapping (SLAM) algorithms for fusing sonar with DVL and IMU to produce maps for autonomous underwater vehicle (AUV) or underwater ROV navigation. Using a combination of real-world and simulated data, the aim is to evaluate the performance of different SLAM algorithms by:

  • Extracting range data from sonar images, DVL and IMU;
  • Building depth maps and converting them into a 2D reconstruction of a broader scene.

The resulting 2D maps from the different algorithms will be assessed with reference to the simulation environment, or against known features in the case of real-world data.
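The "depth maps to 2D reconstruction" step above can be sketched as a minimal occupancy-grid update. The function name, parameters, and grid layout here are illustrative assumptions, not code from this repository:

```python
def update_grid(grid, origin, resolution, points):
    """Mark sonar returns (already expressed in the world frame) as
    occupied cells in a simple 2D occupancy grid.

    grid       -- list of rows, grid[gy][gx], 0 = free/unknown, 1 = occupied
    origin     -- (x, y) world coordinate of cell (0, 0), in metres
    resolution -- cell size in metres
    points     -- iterable of (x, y) world-frame hits, in metres
    """
    for (px, py) in points:
        gx = int((px - origin[0]) / resolution)
        gy = int((py - origin[1]) / resolution)
        if 0 <= gy < len(grid) and 0 <= gx < len(grid[0]):
            grid[gy][gx] = 1
    return grid
```

A real pipeline would also clear the free cells along each beam (e.g. with Bresenham ray tracing) and accumulate log-odds rather than writing hard 0/1 values.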

The entire implementation is done using the ROS Kinetic framework with the UUV Simulator as the environment. The vehicle model is the Desistek SAGA, but you can use your own model as well. We modified the sonar to be the same as the Micron Sonar from Tritech. The sonar-based SLAM is achieved using ICP coupled with a Kalman filter. The purpose was to implement a robust sonar-based SLAM using only the simulator.
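As a rough sketch of the scan-matching half of that pipeline, here is a minimal point-to-point 2D ICP. The function `icp_2d` and its parameters are illustrative, not the repository's actual implementation:

```python
import numpy as np

def icp_2d(src, dst, iterations=30):
    """Minimal 2D point-to-point ICP: estimate the rigid transform
    (R, t) aligning src onto dst via nearest-neighbour matching
    and closed-form SVD (Kabsch) alignment."""
    R_total = np.eye(2)
    t_total = np.zeros(2)
    cur = src.copy()
    for _ in range(iterations):
        # Brute-force nearest-neighbour correspondences (for clarity only;
        # a k-d tree would be used in practice)
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matches = dst[d2.argmin(axis=1)]
        # Closed-form rigid alignment of the matched centroids
        mu_s, mu_d = cur.mean(axis=0), matches.mean(axis=0)
        H = (cur - mu_s).T @ (matches - mu_d)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:      # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_d - R @ mu_s
        cur = cur @ R.T + t
        # Accumulate the incremental transform into the total one
        R_total = R @ R_total
        t_total = R @ t_total + t
    return R_total, t_total
```

The pose increment returned by scan matching is what would then be fed to the Kalman filter as a measurement.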

(more information is available at the end of this page)

Video presentation

Related Packages

Required: Ubuntu 16.04 with ROS Kinetic (http://wiki.ros.org/kinetic). Also, the following packages are required:

To install all the required packages, run the following commands:

$ cd ~/catkin_ws/src
$ git clone https://github.com/uuvsimulator/uuv_simulator
$ git clone https://github.com/uuvsimulator/desistek_saga.git
$ git clone https://github.com/pal-robotics-forks/point_cloud_converter.git
$ git clone https://github.com/fada-catec/amcl3d.git
$ cd ~/catkin_ws
$ catkin_make

Installation

$ cd ~/catkin_ws/src
$ git clone https://github.com/Bluerov2/MASTER.git
$ cd ~/catkin_ws
$ catkin_make # or <catkin build>, if you are using catkin_tools

Add the DVL

The original robot doesn't have a DVL installed, so we need to add one.

Run the following commands:

$ roscd desistek_saga_description/urdf/
$ sudo gedit desistek_saga_sensors.xacro 

then add the following block:

<!-- DVL  -->
<xacro:default_dvl_macro
  namespace="${namespace}"
  parent_link="${namespace}/base_link"
  inertial_reference_frame="${inertial_reference_frame}">
  <origin xyz="0 0 0" rpy="0 ${0.5*pi} 0"/>
</xacro:default_dvl_macro>
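Once the DVL is in place, its body-frame velocity can be dead-reckoned into a planar position using the IMU yaw, which is the basis of the fused `/odom` estimate. A minimal, ROS-free sketch (the function name and tuple interfaces are assumptions for illustration):

```python
import math

def integrate_dvl(pose, v_body, yaw, dt):
    """Dead-reckon a planar (x, y) position: rotate the body-frame DVL
    velocity (vx, vy) into the world frame using the IMU yaw, then
    integrate over the time step dt."""
    x, y = pose
    x += (v_body[0] * math.cos(yaw) - v_body[1] * math.sin(yaw)) * dt
    y += (v_body[0] * math.sin(yaw) + v_body[1] * math.cos(yaw)) * dt
    return (x, y)
```

In the actual system this integration drifts over time, which is why the scan-matching correction is needed.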

Mechanical Sonar

  • You will need to change some URDF code of the ROV to allow 360° vision. To do this, run the following commands:
$ roscd uuv_simulator
$ cd ..
$ cd uuv_sensor_plugins/uuv_sensor_ros_plugins/urdf
$ sudo gedit sonar_snippets.xacro
  • and add the following sonar description:
  <xacro:macro name="micron_sonar" params="namespace parent_link *origin">
      <xacro:multibeam_sonar
        namespace="${namespace}"
        suffix=""
        parent_link="${parent_link}"
        topic="sonar"
        mass="0.02"
        update_rate="15"
        samples="396"
        fov="6.3"
        range_min="0.3"
        range_max="75"
        range_stddev="0.027"
        mesh="">
        <inertia ixx="0.00001" ixy="0.0" ixz="0.0" iyy="0.00001" iyz="0.0" izz="0.00001" />
        <xacro:insert_block name="origin" />
        <visual>
          <geometry>
            <mesh filename="file://$(find uuv_sensor_ros_plugins)/meshes/p900.dae" scale="1 1 1"/>
          </geometry>
        </visual>
      </xacro:multibeam_sonar>
    </xacro:macro>
  • Then we need to change the sonar referenced in the URDF of the desistek_saga:

  • go to: desistek_saga/desistek_saga_description/urdf

  • open: desistek_saga_sensors.xacro

  • remove the original forward sonar and replace it with:

  <!-- MSIS sonar sensor -->
  <xacro:micron_sonar namespace="${namespace}" parent_link="${namespace}/base_link">
    <origin xyz="0.0 0 -0.1" rpy="0 0 0"/>
  </xacro:micron_sonar>
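The 396 beams over the 6.3 rad field of view defined above have to be converted from range/bearing samples into 2D Cartesian points before scan matching. A sketch reusing the xacro parameters as defaults (the function name is illustrative, not from this repository):

```python
import math

def sonar_to_points(ranges, fov=6.3, range_min=0.3, range_max=75.0):
    """Convert one mechanical-sonar sweep (equally spaced beams over
    `fov` radians) into 2D Cartesian points in the sensor frame,
    discarding returns outside the valid range window."""
    n = len(ranges)
    points = []
    for i, r in enumerate(ranges):
        if not (range_min <= r <= range_max):
            continue
        # Beam i's bearing, spanning [-fov/2, +fov/2]
        angle = -fov / 2 + i * fov / (n - 1)
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points
```

Note that a mechanically scanned sonar acquires these beams sequentially, so a full sweep takes time and the vehicle's motion during the sweep ideally should be compensated.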


Launch

To launch the Simulation using UUV simulation, playground and robotmodel:

$ roslaunch sonar_mapping sim.launch

To launch a rosbag of the real sonar in a square tank:

$ roslaunch sonar_mapping bag.launch

List of Tasks

  • IMU & DVL fused (/odom)
  • Scan-matching using ICP
  • Kalman Filter (Localisation)
  • Mapping
  • SLAM
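The "Kalman Filter (Localisation)" step in the list above can be sketched as a linear filter over the planar pose, predicting from the fused DVL/IMU odometry and correcting with the ICP scan-matching estimate. The class name and noise values are illustrative assumptions, not the repository's code:

```python
import numpy as np

class PoseKalmanFilter:
    """Linear Kalman filter over the planar pose [x, y, yaw]:
    prediction from the odometry increment, correction from the
    ICP pose estimate (which observes the full state, so H = I)."""

    def __init__(self, q=1e-3, r=1e-2):
        self.x = np.zeros(3)            # state: x, y, yaw
        self.P = np.eye(3)              # state covariance
        self.Q = q * np.eye(3)          # process noise (odometry drift)
        self.R = r * np.eye(3)          # measurement noise (ICP)

    def predict(self, delta):
        """Apply the odometry increment and grow the uncertainty."""
        self.x = self.x + delta
        self.P = self.P + self.Q

    def update(self, z):
        """Fuse an ICP pose measurement z = [x, y, yaw]."""
        S = self.P + self.R             # innovation covariance (H = I)
        K = self.P @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ (z - self.x)
        self.P = (np.eye(3) - K) @ self.P
        return self.x
```

Because the estimate is a gain-weighted blend, it always lands between the dead-reckoned prediction and the ICP measurement, weighted by their relative uncertainties.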

Thesis

https://github.com/Tim-HW/Tim-HW-BlueRov2_Sonar_based_SLAM-/blob/master/Heriot_Watt_University__HWU__CS_Masters_thesis_Sonar_based_SLAM_for_underwater_ROV.pdf



Languages

Python 93.0%, CMake 7.0%