Fennee Quadruped Robot

Foundation for Experimental Neural Network Endeavors

Printing video: Printing Fennee

Walking video: Fennee on the Run


Hardware

Check the Hardware & Parts Details docs

Software

Install Fennee Packages

First, install ROS Noetic.

I'd like to switch to ROS 2 once CHAMP finishes porting over.

# Source your ROS installation
source /opt/ros/noetic/setup.bash

# cd to your workspace source directory, e.g. ~/ros_ws/src, and clone the repo
mkdir -p ~/ros_ws/src
cd ~/ros_ws/src
git clone https://github.com/mattwilliamson/fennee.git
vcs import < fennee/fennee_ros/fennee.rosinstall --recursive

# CD to your workspace and build
cd ..
rosdep install --from-paths src --ignore-src -r -y
catkin_make

# Source your build
source devel/setup.bash

Running

TODO: Need to specify dependencies correctly

sudo pip3 install -t /opt/ros/noetic/lib/python3/dist-packages adafruit_blinka adafruit-circuitpython-servokit pyyaml

Simulator

# In one terminal, launch the simulation with RViz
source devel/setup.bash
roslaunch fennee_config bringup.launch rviz:=true

# In a second terminal, launch teleop
source devel/setup.bash
roslaunch champ_teleop teleop.launch joy:=true

With a PS4/DS4 controller:

Input         Output
Left Stick    Walk/Yaw
Right Stick   Pitch/Roll
L1 (hold)     Holonomic (strafing)
R2            Sit
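
If you don't have a gamepad handy, the same motion can be commanded from a script. This is a minimal sketch, assuming the CHAMP-based controller subscribes to the standard cmd_vel topic with geometry_msgs/Twist messages; check rostopic list if the topic name differs on your setup.

#!/usr/bin/env python3
# Minimal sketch: walk forward without a gamepad by publishing velocity commands.
# Assumes the controller listens on the standard cmd_vel topic (geometry_msgs/Twist).
import rospy
from geometry_msgs.msg import Twist

if __name__ == "__main__":
    rospy.init_node("fennee_walk_forward")
    pub = rospy.Publisher("cmd_vel", Twist, queue_size=1)
    rate = rospy.Rate(10)          # publish at 10 Hz
    cmd = Twist()
    cmd.linear.x = 0.2             # forward velocity in m/s
    while not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()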

Jetson Nano

ssh user@jetson
mkdir -p ~/ros_ws/src
cd ~/ros_ws/src

git clone https://github.com/mattwilliamson/fennee.git
cd fennee

# Build the image and run it
sudo docker build -t fennee .
sudo docker run -it --rm --runtime nvidia --network host fennee

# Or mount the source tree into the container for development
sudo docker run -it --rm --runtime nvidia -v $HOME/ros_ws/src/fennee:/fennee_ws/src/fennee --network host fennee

# Inside the container, source your build and bring up the robot
source devel/setup.bash
roslaunch fennee_config bringup.launch hardware_connected:=true

Hardware

# Bring up the robot with the servo hardware connected
roslaunch fennee_config bringup.launch hardware_connected:=true

# In a second terminal, start teleop
roslaunch champ_teleop teleop.launch joy:=true

# Check that joint states are being published
rostopic echo joint_states

Calibration

Start a terminal GUI to set the servo offsets and write them to a YAML file:

roslaunch fennee_control servo_interface.launch calibrate:=true
rostopic echo servo_duty_cycles

Put the resulting duty cycles into the pwm_map in fennee_control/scripts/servo_interface.py.
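
The exact structure of pwm_map depends on the current servo_interface.py, so treat this as a rough sketch: assuming it maps each servo channel to the calibrated duty cycle observed on servo_duty_cycles, an entry might look like the following.

# Hypothetical example only -- match the structure already used in servo_interface.py.
# Channel numbers and duty cycles below are placeholders from a calibration run.
pwm_map = {
    0: 307,  # front-left hip
    1: 312,  # front-left upper leg
    2: 298,  # front-left lower leg
    # ...remaining channels
}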

fennee_config/config/gait/gait.yaml has a com_x_translation parameter to shift the center of balance forward or backward. For example, set it to -0.01 if the robot is leaning backwards a bit, which keeps it from walking properly.

Watching logs

GUI

rqt_console

Credits

  • https://github.com/chvmp
  • https://github.com/michaelkubina/SpotMicroESP32
  • https://spotmicroai.readthedocs.io/en/latest/
  • https://www.thingiverse.com/thing:4937631

TODO

  • Rewrite IMU in C++
  • Make calibration launch file and parse yaml for servo_controller
  • Calibration script docs
  • Update URDF with longer body
  • Package requirements are a mess. pip vs rosdep etc.
  • Charging connector
  • Add clips to battery clips for wires
  • Rear cover hole for voltmeter
  • Oak D Lite Camera mount
  • LED Ring sticking out under lidar?
  • Separate mount for IMU
  • Supports around holes for covers
  • Custom top and bottom cover, speakers, mic, fan
  • Lengthen chassis for more electronics room
  • Removable electronics mount
  • Custom hip for metal servo arm
  • Custom mounting board for electronics
  • Custom nose for camera and LED ring
  • Lengthen body
  • New cover with lidar mount and fan
  • Front camera
  • IMU Publisher
  • Faster baud rate
  • Calibration config file
  • Wiring diagrams
  • 3d print instructions/list
  • Show loading screen while robot is waiting for ROS
  • LED ring
  • LIDAR
  • Relay for servos
  • Some kind of quick release for covers - removing battery is annoying
  • Speakers/mic
  • blacklist packages: ros-noetic-rosserial ros-noetic-rviz ros-noetic-gazebo-plugins

Docker

https://hub.docker.com/_/ros/

Building the image: LOW_MEM is for lower-memory systems, but takes longer to build.

sudo docker build --build-arg LOW_MEM=1 -t fennee .

ROS uses the ~/.ros/ directory for storing logs and debugging info, so mount it into the container:

docker run -v "/home/ubuntu/.ros/:/root/.ros/" -it --rm --network=host fennee
sudo docker run --rm -it -e ROS_IP=192.168.50.83 --privileged -v /dev/:/dev/ --network host fennee roslaunch fennee_config bringup.launch hardware_connected:=true

RViz/Gazebo

To run GUI tools from the container, pass DISPLAY and mount the X11 socket:

sudo docker run --rm -it -v /dev/:/dev/ --network host -e DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix --privileged depthai_ros roslaunch depthai_filters example_seg_overlay.launch

Camera publisher with overlay for object detection

sudo docker run --rm -it -e ROS_IP=192.168.50.83 --privileged -v /dev/:/dev/ --network host fennee roslaunch depthai_filters example_det2d_overlay.launch camera_model:=OAK-D-LITE

Camera publisher with stereo, RGB and IMU

sudo docker run --rm -it -e ROS_IP=192.168.50.83 --privileged -v /dev/:/dev/ --network host fennee roslaunch depthai_examples stereo_inertial_node.launch enableRviz:=false

View Camera

export ROS_MASTER_URI=http://192.168.50.83:11311
rqt_image_view

Record Video to output.avi

ROS_MASTER_URI=http://192.168.50.83:11311 rosrun image_view video_recorder image:=/mobilenet_publisher/color/image
vlc output.avi

Mapping

export ROS_MASTER_URI=http://192.168.50.83:11311
roslaunch fennee_config slam.launch rviz:=true

Save map

roscd fennee_config/maps
rosrun map_server map_saver

Autonomous Navigation

roslaunch fennee_config navigate.launch rviz:=true

roslaunch champ_config slam.launch rviz:=true

roslaunch fennee_config navigate.launch

PS4 Controller

Hold the PS button and Share button to put the controller into pairing mode, then pair it with bluetoothctl:

bluetoothctl
scan on
[NEW] Device A4:AE:12:BA:36:63 Wireless Controller

pair A4:AE:12:BA:36:63
trust A4:AE:12:BA:36:63
roslaunch fennee_config bringup.launch hardware_connected:=true
# roslaunch fennee_config bringup.launch hardware_connected:=true servo_connected:=false
roslaunch champ_teleop teleop.launch joy:=true

IMU

http://docs.ros.org/en/melodic/api/robot_localization/html/preparing_sensor_data.html

sensor_msgs/Imu - The IMU message is currently subject to some ambiguity, though this is being addressed by the ROS community. Most IMUs natively report orientation data in a world-fixed frame whose axes are defined by the vectors pointing to magnetic north and the center of the earth, respectively, with the Y axis facing east (90 degrees offset from the magnetic north vector). This frame is often referred to as NED (North, East, Down). However, REP-103 specifies an ENU (East, North, Up) coordinate frame for outdoor navigation. As of this writing, robot_localization assumes an ENU frame for all IMU data, and does not work with NED frame data. This may change in the future, but for now, users should ensure that data is transformed to the ENU frame before using it with any node in robot_localization.

With the sensor level and at rest, it should measure +9.81 meters per second squared on the Z axis. If the sensor is rolled +90 degrees (left side up), the acceleration should be +9.81 meters per second squared on the Y axis. If the sensor is pitched +90 degrees (front side down), it should read -9.81 meters per second squared on the X axis.

The IMU may also be oriented on the robot in a position other than its “neutral” position. For example, the user may mount the IMU on its side, or rotate it so that it faces a direction other than the front of the robot. This offset is typically specified by a static transform from the base_link_frame parameter to the IMU message’s frame_id. The state estimation nodes in robot_localization will automatically correct for the orientation of the sensor so that its data aligns with the frame specified by the base_link_frame parameter.
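
A minimal sketch to spot-check these sign conventions on the robot, assuming the IMU driver publishes sensor_msgs/Imu on an /imu/data topic (adjust the topic name to whatever rostopic list shows):

#!/usr/bin/env python3
# Print raw accelerometer readings so the ENU sign conventions above can be verified.
# Assumes the IMU publishes sensor_msgs/Imu on /imu/data; change the topic if needed.
import rospy
from sensor_msgs.msg import Imu

def on_imu(msg):
    a = msg.linear_acceleration
    rospy.loginfo("accel x=%.2f y=%.2f z=%.2f (expect z ~ +9.81 when level and at rest)",
                  a.x, a.y, a.z)

if __name__ == "__main__":
    rospy.init_node("imu_check")
    rospy.Subscriber("/imu/data", Imu, on_imu)
    rospy.spin()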

About

Fennee quadruped hobby robot based on Champ

