SLAMPy-Monocular-SLAM-implementation-in-Python

Pythonic implementation of an ORB feature matching based Monocular-vision SLAM.



Table of Contents

  • About The Project
  • Built With
  • Getting Started
  • Prerequisites
  • Installation
  • Usage
  • Contributing
  • License
  • Contact
  • Acknowledgements

About The Project


Simultaneous Localization and Mapping (SLAM) has been around for quite a while, but it has gained a lot of popularity with the recent advent of autonomous navigation and self-driving cars. SLAM is the perception capability that lets a robot or device estimate its position relative to an unknown environment while building a map of that environment. Its applications range from augmented reality and virtual reality to indoor navigation and autonomous vehicles.

Built With

The project is built with the following major frameworks and libraries:

  • OpenCV (ORB feature detection and matching)
  • PyGame (2D visualization)
  • Pangolin (3D visualization)
  • g2o, via its Python wrapper (graph-based non-linear optimization)
  • NumPy

Getting Started

The application begins by calibrating the camera and setting the camera intrinsics used during optimization. It uses OpenCV's ORB feature detector for key-point extraction, and Lowe's ratio test for matching the key-points between frames. Each key-point detected in the frame at time (t-1) is matched against a number of key-points from the frame at time t, and only the candidate with the smallest descriptor distance is kept. Lowe's test then checks that the best and second-best distances are sufficiently different; if they are not, the match is discarded and not used in further calculations.

For 2D video visualization there were several options: OpenCV, SDL2, PyGame, Kivy, Matplotlib, etc. OpenCV's imshow function turned out not to be the best choice. The application was tried with SDL2, Matplotlib and Kivy's video-playing libraries, but PyGame outperformed all of them. PyGame is therefore used to visualize the detected key-points along with other information such as orientation, direction and speed.

(Demo GIF: detected key-points and 2D visualization)
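
To make the matching step above concrete, here is a minimal sketch of ORB extraction followed by Lowe's ratio test on consecutive video frames, using OpenCV's brute-force Hamming matcher; the feature count, the 0.75 ratio threshold and the video filename are illustrative choices, not necessarily the values used in slam.py.

import cv2

# ORB detector and brute-force matcher over binary (Hamming) descriptors.
orb = cv2.ORB_create(nfeatures=3000)
bf = cv2.BFMatcher(cv2.NORM_HAMMING)

cap = cv2.VideoCapture("test.mp4")
ok, prev_frame = cap.read()
prev_kp, prev_des = orb.detectAndCompute(
    cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY), None)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    kp, des = orb.detectAndCompute(gray, None)
    if des is None or prev_des is None:
        prev_kp, prev_des = kp, des
        continue

    # Each key-point from frame (t-1) is matched against its two nearest
    # neighbours in frame (t).
    matches = bf.knnMatch(prev_des, des, k=2)

    # Lowe's ratio test: keep a match only when the best distance is
    # clearly smaller than the second-best distance.
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance]

    print(f"{len(good)} matches between consecutive frames")
    prev_kp, prev_des = kp, des

cap.release()

The surviving matches are what the downstream pose-estimation and visualization steps operate on; cv2.drawMatches is a quick way to inspect them, although the project itself renders the key-points with PyGame as described above.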

For 3D visualization, Pangolin was the best option for several reasons:

  • It supports Python and is open source
  • It uses plain OpenGL at its core
  • It provides modularized 3D visualization

For implementing a graph-based non-linear error function, the project leverages the Python wrapper of the g2o library. g2o is an open-source optimization library that helps reduce the Gaussian noise in non-linear least-squares problems such as SLAM.
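
To make the optimization step concrete, here is a minimal bundle-adjustment-style sketch, assuming the uoip/g2opy Python bindings (one common wrapper of g2o); the camera parameters, the single pose, the single map point and the observed pixel are all illustrative placeholders, not values taken from this repository.

import numpy as np
import g2o

# Levenberg-Marquardt over an SE3 block solver; depending on how g2o was
# built, LinearSolverCholmodSE3 or LinearSolverEigenSE3 may be used instead.
optimizer = g2o.SparseOptimizer()
solver = g2o.BlockSolverSE3(g2o.LinearSolverCSparseSE3())
optimizer.set_algorithm(g2o.OptimizationAlgorithmLevenberg(solver))

# Illustrative pinhole camera (focal length, principal point, baseline = 0).
cam = g2o.CameraParameters(525.0, (320.0, 240.0), 0.0)
cam.set_id(0)
optimizer.add_parameter(cam)

# One camera-pose vertex, held fixed to anchor the graph ...
pose = g2o.VertexSE3Expmap()
pose.set_id(0)
pose.set_estimate(g2o.SE3Quat(np.identity(3), np.zeros(3)))
pose.set_fixed(True)
optimizer.add_vertex(pose)

# ... and one 3D map-point vertex that the optimizer is free to move.
point = g2o.VertexSBAPointXYZ()
point.set_id(1)
point.set_estimate(np.array([0.1, 0.2, 3.0]))
point.set_marginalized(True)
optimizer.add_vertex(point)

# Re-projection edge: its error is the difference between the observed
# pixel and the projection of the map point through the camera pose.
edge = g2o.EdgeProjectXYZ2UV()
edge.set_vertex(0, point)
edge.set_vertex(1, pose)
edge.set_measurement(np.array([330.0, 250.0]))  # observed pixel (u, v)
edge.set_information(np.identity(2))
edge.set_robust_kernel(g2o.RobustKernelHuber())
edge.set_parameter_id(0, 0)
optimizer.add_edge(edge)

optimizer.initialize_optimization()
optimizer.optimize(20)

In a full SLAM graph there is one such pose vertex per key-frame, one point vertex per map point, and one edge per observation.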
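
For the Pangolin viewer listed above, a minimal viewer loop, assuming the uoip/pangolin Python bindings, might look like the following sketch; the window size, virtual-camera parameters and random point cloud are purely illustrative stand-ins for the reconstructed map.

import numpy as np
import OpenGL.GL as gl
import pangolin

pangolin.CreateWindowAndBind('Map Viewer', 640, 480)
gl.glEnable(gl.GL_DEPTH_TEST)

# Virtual camera used to look at the 3D map.
scam = pangolin.OpenGlRenderState(
    pangolin.ProjectionMatrix(640, 480, 420, 420, 320, 240, 0.2, 1000),
    pangolin.ModelViewLookAt(-2, 2, -2, 0, 0, 0, pangolin.AxisDirection.AxisY))
handler = pangolin.Handler3D(scam)

# Interactive 3D view that fills the window.
dcam = pangolin.CreateDisplay()
dcam.SetBounds(0.0, 1.0, 0.0, 1.0, -640.0 / 480.0)
dcam.SetHandler(handler)

# Placeholder point cloud standing in for triangulated map points.
points = np.random.random((1000, 3)) * 10

while not pangolin.ShouldQuit():
    gl.glClear(gl.GL_COLOR_BUFFER_BIT | gl.GL_DEPTH_BUFFER_BIT)
    gl.glClearColor(1.0, 1.0, 1.0, 1.0)
    dcam.Activate(scam)

    gl.glPointSize(2)
    gl.glColor3f(1.0, 0.0, 0.0)
    pangolin.DrawPoints(points)

    pangolin.FinishFrame()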

Prerequisites

The following packages are required and can be installed with pip. The Pangolin and g2o Python bindings described above are also needed.

  • OpenCV 4
pip3 install opencv-python
  • PyGame
python3 -m pip install -U pygame --user
  • NumPy
pip3 install numpy

Installation

  1. Clone the repository:
git clone https://github.com/Akbonline/SLAMPy-Monocular-SLAM-implementation-in-Python.git
  2. Run the algorithm on a video:
python3 slam.py <test-video.mp4>

Usage

There is one test video included in the repo.

  1. To run SLAM on the test video:
python3 slam.py test.mp4

The output should look something like this:

(Demo GIF: SLAM output on the test video)

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

License

Distributed under the MIT License. See LICENSE for more information.

Contact

Project demo video: https://www.youtube.com/embed/JUOY5DrO8R8

Acknowledgements
