Orienfish / shittyrobot

A robot for 2-D map construction using stereo camera.

Shittyrobot

A Remote-Controlled Robot Vehicle for 2-D Map Reconstruction. Built in UCSD CSE237A on a Raspberry Pi 3B+. Collaborator: Michael Liu.

This tutorial walks you through building a robot vehicle from scratch and shares some of our experiences.

Introduction

Indoor map reconstruction is the first step for any location-based service. In this project, we built a robot vehicle that travels around a room under remote control and measures distances with a stereo camera and an ultrasonic sensor. We chose a stereo camera because it captures rich spatial information and is therefore, in theory, well suited to environment reconstruction. The ultrasonic sensor compensates for the cases where the stereo camera works poorly, such as when a wall sits close in front of the robot.

However, to be honest, the final system does not perform well. An important lesson we learned: sensor selection is the first and foremost step in embedded system implementation! Here are the likely reasons, and our suggestions if you plan to work on similar projects:

  • Cheap stereo cameras (e.g. the ELP dual-lens camera) are hard to calibrate and produce poor disparity maps. If your budget allows, we would recommend Intel's RealSense or another dedicated depth camera: the well-developed tooling will not only save you time but also give you better output.
  • Cheap ultrasonic sensors (HC-SR04) also perform poorly. They are fine for detecting a wall within about 1m (the nominal range is 4m, but readings become unstable beyond 1m). However, the signal may bounce around the room, so they do badly in more complex scenes, e.g. when trying to detect a box. Bear in mind, too, that multiple ultrasonic sensors, or multiple emissions from one sensor, may interfere with each other. If you insist on distance-based sensors, we would suggest the more promising LiDAR instead.

Hardware List

  • Adafruit (PID 3244) Mini 3-Layer Round Robot Chassis Kit - 2WD with DC Motors. Link
  • Adafruit DC & Stepper Motor HAT for Raspberry Pi - Mini Kit. Link
  • 4 * 1.5V AA Battery Case Holder. Link
  • MPU-6050 3-Axis Accelerometer and Gyroscope, using I2C/SPI. Link
  • HC-SR04 Ultrasonic Sensor. Link
  • ELP Dual Lens Stereo Camera. ELP-960P2CAM-V90-VC. Link

Software List

  • Qt for Python (qt-dev, vis.py).
  • Stereo camera calibration and depth calculation (stereo).
  • Adafruit motor kit driver (car.py).
  • A path tracking module based on the MPU-6050 (sensor.py).
  • A distance detection module based on the HC-SR04 (sonic.py).

How to assemble a robot vehicle from scratch

  1. Assemble the robot kit and drive the motors. This can be done quickly by following Adafruit's wonderful tutorial. Check our code in car.py.
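With the motors driven, steering the 2WD chassis comes down to mixing a forward speed and a turn rate into per-wheel throttles. Here is a minimal sketch of that mixing, using a hypothetical `mix` helper for illustration (not the actual car.py API):

```python
def mix(forward, turn):
    """Mix a forward speed and a turn rate (both in [-1, 1]) into
    (left, right) wheel throttles for a 2WD differential-drive chassis.
    Positive turn steers right, so the left wheel spins faster."""
    left = forward + turn
    right = forward - turn
    # Rescale so neither throttle leaves the valid [-1, 1] range.
    scale = max(1.0, abs(left), abs(right))
    return left / scale, right / scale

print(mix(1.0, 0.0))  # straight ahead -> (1.0, 1.0)
print(mix(0.5, 0.5))  # sharp right turn -> (1.0, 0.0)
```

The resulting throttles can then be fed to the motor driver.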
  2. Connect the MPU-6050 to the Raspberry Pi over I2C. Then install the `python3-smbus` dependency:
sudo apt install python3-smbus

Then install the mpu6050-raspberrypi package from PyPI:

pip3 install mpu6050-raspberrypi

Finally, you can read the accelerometer and gyroscope data:

from mpu6050 import mpu6050

sensor = mpu6050(0x68)  # 0x68 is the MPU-6050's default I2C address
accelerometer_data = sensor.get_accel_data()
gyroscope_data = sensor.get_gyro_data()

Note:

  • Basically, this package helps you read raw data from the MPU-6050 and set the measurement range. The available accelerometer ranges are ±2g, ±4g, ±8g, and ±16g, while the gyroscope ranges are ±250, ±500, ±1000, and ±2000 degrees per second. It is important to set a proper measurement range.
  • In this repository, we compute travelled distance by integrating the acceleration twice (acceleration → velocity → position). Getting the geometric projection right is the most important part. Check sensor.py.

You can check the tutorial for package installation. The MPU-6050 datasheet can be found here.
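The double integration mentioned above can be sketched in a few lines. This is a simplified 1-D version on synthetic samples; the real sensor.py must additionally project the readings onto the ground plane and remove gravity:

```python
def integrate_distance(samples, dt):
    """Estimate distance from acceleration samples (m/s^2) taken every
    dt seconds, by trapezoidal double integration:
    acceleration -> velocity -> position."""
    velocity = 0.0
    position = 0.0
    prev_a = samples[0]
    for a in samples[1:]:
        new_velocity = velocity + 0.5 * (prev_a + a) * dt
        position += 0.5 * (velocity + new_velocity) * dt
        velocity = new_velocity
        prev_a = a
    return position

# Constant 1 m/s^2 for 1 s at 100 Hz should give ~0.5 m.
print(round(integrate_distance([1.0] * 101, 0.01), 3))  # 0.5
```

Because any accelerometer bias is integrated twice, the error drifts quadratically with time, which is one reason this approach is fragile.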

  3. Connect the ultrasonic sensor to the Raspberry Pi through GPIO. Note that you need extra resistors (a voltage divider) on the echo line, as the sensor outputs 5V but the maximum acceptable voltage on a GPIO pin is 3.3V. You can measure the echo pulse width using two simple loops:
while GPIO.input(ECHO) == 0:  # wait for the echo pulse to begin
    pulse_start = time.time()
while GPIO.input(ECHO) == 1:  # time how long the pulse stays high
    pulse_end = time.time()

This is only the basic framework; some exception handling (e.g. timeouts) is still needed. Check the tutorial and our code in sonic.py.
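Once the pulse width is captured, converting it to a distance only requires the speed of sound (roughly 343 m/s at room temperature), halved because the echo travels to the obstacle and back. A minimal sketch:

```python
SPEED_OF_SOUND_CM_PER_S = 34300  # ~343 m/s at room temperature

def pulse_to_distance_cm(pulse_start, pulse_end):
    """Convert an HC-SR04 echo pulse width (seconds) to distance (cm)."""
    pulse_width = pulse_end - pulse_start
    # The sound covers the robot-to-obstacle gap twice: out and back.
    return pulse_width * SPEED_OF_SOUND_CM_PER_S / 2

print(pulse_to_distance_cm(0.0, 0.001))  # a 1 ms echo is ~17 cm: 17.15
```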

  4. Install OpenCV on the Raspberry Pi and perform image processing as in stereo/stereo_display.py.
    You can follow this tutorial to build an optimized version of OpenCV with NEON and VFPV3 enabled. In our experience, this can improve speed by around 50%!
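To make the depth computation concrete, here is a toy block-matching sketch in plain Python (the actual stereo code relies on OpenCV's calibrated matchers, which are far faster and more robust): for each pixel in the left image, slide a small window along the same row of the right image and keep the horizontal shift with the smallest sum of absolute differences (SAD).

```python
def sad_disparity(left, right, max_disp=8, window=3):
    """Toy block matching over 2-D lists of grayscale values: for each
    pixel, pick the shift d minimizing the sum of absolute differences
    between a window in `left` and the window moved d pixels left in
    `right`. Larger d means a closer object."""
    h, w = len(left), len(left[0])
    half = window // 2

    def pixel(img, y, x):
        # Clamp coordinates to the border (edge padding).
        return img[min(max(y, 0), h - 1)][min(max(x, 0), w - 1)]

    disp = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            best_cost, best_d = float("inf"), 0
            for d in range(min(max_disp, x) + 1):
                cost = sum(
                    abs(pixel(left, y + dy, x + dx)
                        - pixel(right, y + dy, x + dx - d))
                    for dy in range(-half, half + 1)
                    for dx in range(-half, half + 1)
                )
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y][x] = best_d
    return disp

# Synthetic pair: the right view is the left view shifted by 3 pixels.
import random
random.seed(0)
left = [[random.random() for _ in range(16)] for _ in range(8)]
right = [[left[y][min(x + 3, 15)] for x in range(16)] for y in range(8)]
disp = sad_disparity(left, right, max_disp=5)
print(disp[4][8])  # 3
```

With calibrated cameras, disparity then converts to depth as depth = focal_length × baseline / disparity.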

Results

This picture shows the distance measurements from the ultrasonic sensor. As you can see, the ultrasonic sensor works pretty well in the simple desk demo and against a straight wall, but it performs poorly in complex environments.
Currently, one MPU-6050 measurement takes ~6ms, while one ultrasonic reading takes ~30ms because we take a couple of measurements and average them. Before optimization, computing depth for one image took ~1.6s; after enabling NEON and VFPV3, it takes ~0.6s.
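A note on combining multiple ultrasonic readings: instead of the plain average we used, a median is a simple robust alternative, since a single bounced echo cannot drag it off. A hypothetical sketch:

```python
import statistics

def robust_distance(pings):
    """Fuse several ultrasonic readings (cm) with a median so one
    bounced or garbage echo cannot skew the result."""
    return statistics.median(pings)

# Four good readings plus one wild bounce: the mean would be dragged
# to ~122 cm, while the median stays at the true ~52 cm.
print(robust_distance([51.8, 52.1, 52.0, 51.9, 400.0]))  # 52.0
```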

Future Improvements

Many potential improvements remain, including:

  • Using better (and more expensive) sensors
  • Improving the depth computation algorithm and extracting more useful information
  • Implementing automatic environment exploration

Resources

You can check our slides. Welcome to UCSD's CSE237A!
