A ROS package containing the simulation scripts and launch files for the CONCERT modular robot.
A ready-to-use Docker container is provided; it can be started with

```
.docker/run-docker.bash
```

On first execution, a large amount of data may be downloaded. The container can be used to follow the rest of this README.
To update the image to the latest version:

```
docker pull arturolaurenzi/concert_description
```
To build the image locally:

```
.docker/build-docker.bash [--no-cache]
```
Dependencies:

- ROS (desktop-full is recommended, plus `moveit-core`)
- XBot2 binaries (see here for instructions)
- The `modular` Python 3 package (will be installed by forest)
As an alternative to Docker, you can set up concert_description using forest.
- Install forest:

```
[sudo] pip3 install hhcm-forest
```
- Create a forest workspace. We are going to call it `concert_ws` for the sake of this example:

```
mkdir concert_ws && cd concert_ws
```
- Initialize the forest workspace and add recipes:

```
forest init
source setup.bash
echo "source $PWD/setup.bash" >> /home/USER/.bashrc
forest add-recipes git@github.com:advrhumanoids/multidof_recipes.git --tag master
```

where you should substitute USER with your username.
Optional: if you don't have an SSH key set up on your system, also run:

```
export HHCM_FOREST_CLONE_DEFAULT_PROTO=https
```

and consider adding it to your `.bashrc`.
- Finally, just run:

```
forest grow concert_description
```

which will clone this repo and install the `modular` package.
If you have the XBot2 binaries installed you are ready to simulate the CONCERT robot!
P.S. If you want to also run this IK example, remember to run:

```
forest grow centauro_cartesio -j 4
```
To launch the simulation (optionally with RViz):

```
mon launch concert_gazebo concert.launch [rviz:=true]
```
Note: the launch file also accepts a series of additional arguments for selecting whether to simulate sensors. For example, to run a simulation that also loads the Gazebo plugins for the Realsense cameras and the Velodyne lidars, run:

```
mon launch concert_gazebo concert.launch realsense:=true velodyne:=true
```
You'll need the proper dependencies installed in your setup for sensor simulation to work; see the forest recipe for this package.
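As a quick sanity check once the simulation is up, you can list the sensor topics with the standard ROS CLI. This is a sketch: the exact topic names depend on the loaded Gazebo plugins and are not spelled out in this README.

```shell
# List active topics and filter for the simulated sensors
# (topic names depend on the loaded sensor plugins)
rostopic list | grep -i -e camera -e velodyne
```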
To open XBot2's monitoring GUI, run:

```
xbot2-gui
```
To start the homing module, run:

```
rosservice call /xbotcore/homing/switch 1
```

or click Start on the GUI, next to the homing label.
To enable the ros_ctrl module, run:

```
rosservice call /xbotcore/ros_ctrl/switch 1
```

or click Start on the GUI, next to the ros_ctrl label. NOTE: you must not be publishing messages on the `/xbotcore/command` topic when starting this module!
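Putting the two modules together, a typical startup sequence looks like the following sketch. Passing 0 to stop a module is an assumption based on the switch semantics above; verify it against your XBot2 setup.

```shell
# Start the homing module to reach a safe, non-singular configuration
rosservice call /xbotcore/homing/switch 1
# ...wait for the motion to complete, then stop it (0 = stop, assumed)
rosservice call /xbotcore/homing/switch 0
# Hand control over to ROS; do not publish on /xbotcore/command before this
rosservice call /xbotcore/ros_ctrl/switch 1
```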
Messages published on the `/xbotcore/command` topic are now forwarded to the simulator. This can also be done (for debugging purposes) via the GUI's sliders.
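Before publishing, you can inspect the command interface with the standard ROS CLI. This is a sketch; the reported message type comes from XBot2 and is not spelled out in this README.

```shell
# Show the message type and connections of the command topic
rostopic info /xbotcore/command
# Print the expected message structure once the type is known
rostopic type /xbotcore/command | xargs rosmsg show
```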
First, make sure that the ros_ctrl module is enabled and that the robot arm is not in a singular configuration (e.g., run the homing module once). Then, invoke the following launch file:

```
mon launch concert_cartesio concert.launch xbot:=true gui:=true
```
Then, right-click on the interactive marker, and select Continuous Ctrl. Move the marker around, and see the resulting motion in Gazebo.
Note that this last part requires additional dependencies (see also `setup-docker.bash`), which can be installed via the hhcm-forest tool. Follow the instructions from here and then invoke:

```
forest grow centauro_cartesio
```
Note: to control the base in velocity mode (i.e., via `geometry_msgs/TwistStamped` messages), you must first invoke the following ROS service:

```
rosservice call /cartesian/base_link/set_control_mode velocity
```
Upon successful return, you can move the base by continuously sending velocity commands to the topic `/cartesian/base_link/velocity_reference`; note that the `msg.header.frame_id` field of the published messages can be usefully set to `base_link` in order to have the commanded twist interpreted w.r.t. the local frame.
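For instance, a minimal velocity command can be streamed from the command line as follows. This is a sketch; the 0.1 m/s forward speed and the 10 Hz rate are arbitrary example values.

```shell
# Stream a forward twist at 10 Hz, interpreted in the robot's local frame
rostopic pub -r 10 /cartesian/base_link/velocity_reference \
  geometry_msgs/TwistStamped \
  "{header: {frame_id: base_link}, twist: {linear: {x: 0.1}}}"
```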
Useful links:

- The robot API: https://advrhumanoids.github.io/XBotInterface/
- XBot2: https://advrhumanoids.github.io/xbot2/ , https://github.com/ADVRHumanoids/xbot2_examples
- CartesIO: https://advrhumanoids.github.io/CartesianInterface/