This README gives the steps needed to reproduce the latest results of the tracker. More precisely, this GitHub repository implements a version of the tracker in which the Jetson Nano sends the information (frames, etc.) to a remote computer, which then performs the processing. Running the GOTURN tracking algorithm directly on the Jetson Nano was tested, but with poor performance (1-3 FPS), which explains the choice of remote processing (6 FPS).
Controlling the speed with the depth information of the stereo camera was not implemented here; the speed is fixed manually to the minimum.
The robot was tested with an Ethernet cable, since the WiFi dongle had driver issues. Adding the Intel AC 8625 WiFi module and the Chaogang 648109 antennas is recommended to benefit from the best processing speed.
This repository provides the raw code, rather than a ROS workspace or package, to avoid dependency issues.
Warning: this code was calibrated for the hardware of the robot, which is not disclosed in this GitHub repository (refer to the report).
The remote computer used for the tests was an ASUS G703VI. The hardware of the computer influences the tracking performance: 6 FPS was reached with this machine.
- Ubuntu 18.04 Bionic.
- ROS Melodic: see the ROS wiki.
- Check that the `cv_bridge` package is installed:
rospack list | grep cv_bridge
If nothing is printed, install the package (a ROS package) with the following command:
sudo apt install ros-melodic-vision-opencv
- Create a ROS workspace: see the ROS tutorial.
- Create the package `remote_tracker`: follow the instructions of the ROS tutorial; when asked to create the package, add the following dependencies:
catkin_create_pkg remote_tracker sensor_msgs cv_bridge rospy std_msgs
- Virtual environment with OpenCV >= 3.4.2: the GOTURN tracking algorithm requires a specific OpenCV version.
virtualenv --version # check whether virtualenv is already installed.
# If you do not have it, install it as follows.
sudo apt-get update
sudo apt-get install python-virtualenv
# Setting up the python environment.
virtualenv --system-site-packages -p python2.7 ~/opencv_py_env
source ~/opencv_py_env/bin/activate
pip install numpy
pip install -U rosinstall msgpack empy defusedxml netifaces
python -V
pip install opencv-contrib-python
deactivate
The interpreter of the `opencv_py_env` is then specified in the first line of the `tracker.py` file.
- In `tracker.py`, change the first line specifying the interpreter of the `opencv_py_env`: you need to replace `arnaud` with your username.
- Place the code inside the folder `remote` into the ROS package `remote_tracker`. Make sure that the Python files are set to executable.
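To confirm which interpreter a script actually runs under after editing the shebang, a generic one-liner (not part of tracker.py) can help:

```python
import sys

# Prints the absolute path of the interpreter running this script.
# When tracker.py's shebang points at the virtualenv, this should
# end in opencv_py_env/bin/python.
print(sys.executable)
```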
Download the model of the GOTURN algorithm: more information
- Download the model here.
- Move the `goturn.caffemodel` and `goturn.prototxt` files to the folder where the `tracker.py` file is located; otherwise the code will not be able to execute.
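Since the tracker expects the model files in its working directory, a small pre-flight check can save a confusing runtime error. This is a hypothetical helper, not part of the repository:

```python
import os

def missing_goturn_files(folder="."):
    """Return the list of GOTURN model files absent from `folder`."""
    required = ["goturn.caffemodel", "goturn.prototxt"]
    return [f for f in required if not os.path.isfile(os.path.join(folder, f))]

missing = missing_goturn_files()
if missing:
    print("Missing model files: %s" % ", ".join(missing))
```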
Make sure that the hardware of the robot is installed as presented in the hardware architecture of the report. If you power the Jetson Nano through micro-USB, set the power mode to 5W:
sudo nvpmodel -m 1
If you power the device through the barrel jack or anything other than micro-USB that provides over 10W:
sudo nvpmodel -m 0
You can check the active power mode with `sudo nvpmodel -q`.
Note: for the setup of the robot you need a keyboard, a screen, and a mouse.
For the software:
- Download the Jetson Nano image (official link).
- ROS Melodic: follow the same steps as presented above, or follow the steps of this tutorial. If you choose the latter, use the official apt ROS key (check this link) rather than the one presented in the tutorial, and install the ZED SDK before starting to install the ZED ROS Wrapper; several of the following steps will then already be completed.
- ROS workspace.
- Create the ROS package `robot_tracker`: follow the same steps as above, just replace the name of the package.
- Place the code inside the `robot` folder into the ROS package `robot_tracker`. Make sure that the files are set to executable.
- Change the Python interpreter to a Python 3 interpreter if the one specified does not work. Interpreters can be found in the `/usr/bin` folder, which contains the compiled applications of the system.
- Install some dependencies to be able to use Python 3 with ROS (for more information):
sudo apt-get install python3-pip python3-yaml
sudo pip3 install rospkg catkin_pkg
- Install the ZED SDK and the ZED ROS Wrapper (cf. documentation)
- Edit the parameters of the ZED camera:
sudo gedit $(rospack find zed_wrapper)/params/common.yaml # open the parameter file.
Change the `resolution` field to 720p or VGA to increase the tracking rate.
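For reference, the relevant section of common.yaml typically looks like the fragment below; the exact key names and resolution codes vary with the zed-ros-wrapper version, so treat this as a sketch:

```yaml
general:
    resolution: 2   # 0: HD2K, 1: HD1080, 2: HD720, 3: VGA
```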
- Install a swapfile. Make sure you have an SD card with a capacity higher than 32GB (64GB recommended).
git clone https://github.com/JetsonHacksNano/installSwapfile
cd installSwapfile
chmod +x installSwapfile.sh
./installSwapfile.sh
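After rebooting, you can confirm the swap is active by reading /proc/meminfo (standard on any Linux system; this snippet is generic, not part of the repository):

```python
# Print the total swap reported by the kernel (Linux only).
with open("/proc/meminfo") as f:
    for line in f:
        if line.startswith("SwapTotal:"):
            print(line.strip())
            break
```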
- Install the driver of the Adafruit PCA9685 shield: a JetsonHacks project did this before. Their repository is cloned and the install script is executed; then the permission for the GPIOs is set.
mkdir ~/adafruit # create a folder to store the cloned files.
cd ~/adafruit
git clone https://github.com/JetsonHacksNano/ServoKit
cd ServoKit
./installServoKit.sh
You need to reboot the Jetson Nano afterwards. For references, please check the project article and the GitHub repository.
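Once installed and after rebooting, driving a servo from Python follows the Adafruit ServoKit API. The channel number and angle below are illustrative and depend on your wiring:

```python
# Requires the adafruit_servokit library installed by the script
# above, plus a PCA9685 board wired to the I2C bus.
from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)   # the PCA9685 shield exposes 16 channels
kit.servo[0].angle = 90       # center the servo on channel 0
```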
- Connect the remote computer to a network through WiFi or Ethernet, and connect the Jetson Nano to the same network through Ethernet or a WiFi dongle that was tested beforehand. The bandwidth of the WiFi dongle directly impacts the tracking rate.
- On each computer, check the IPv4 address on the network with the following command:
ifconfig
Look for the `RUNNING` flag, which indicates the active interfaces: usually two, the localhost and the shared network.
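If the ifconfig output is cluttered, a small generic Python snippet (not part of the repository) can report the address the machine would use on the shared network:

```python
import socket

def local_ip():
    # Connecting a UDP socket sends no packet, but it makes the OS
    # choose the outgoing interface, whose address we then read.
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        return "127.0.0.1"  # no route: fall back to localhost
    finally:
        s.close()

print(local_ip())
```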
- Test the connection:
ping <ip_robot> # on the remote computer
ping <ip_remote_computer> # on the robot
- If each computer can see the other:
You can remove the keyboard, mouse, etc. from the robot. Make sure the batteries are properly installed and that the robot is not plugged into the wall or any other fixed power source. Make sure the ESC is on, otherwise the robot will not move.
On the remote computer:
/!\ At this stage, the robot should be ready for takeoff.
- Launch the ROS master in one bash terminal:
roscore
- Open another terminal:
Go to the ROS package `remote_tracker` folder, then to the folder where the code `tracker.py` is located. Execute the commands:
export ROS_HOME=$(pwd) # sets the current folder as the execution folder for each node.
roslaunch remote_tracker remote.launch
- Reach the robot via SSH and launch the nodes on the robot:
Open another terminal.
ssh -X <user_on_robot>@<ip_robot>
# you will be prompted to enter a password.
# Reach the ROS master.
export ROS_MASTER_URI=http://<ip_remote_computer>:11311/ # connection to master.
export ROS_IP=<ip_robot>
roslaunch robot_tracker robot.launch
Should you want to stop the robot, switch off the ESC manually and then kill the nodes, or enter 0 in the terminal of the robot (the previous one).