tutorial-RMUS-EP.md


The ROS topics and navigation system

1. Platform introduction

1.1 RoboMaster EP

RoboMasterEP

The RoboMaster EP is an autonomous vehicle platform equipped with a 4-axis palletizing robot arm and an encircling gripper for flexible gripping, inspired by DJI's annual RoboMaster robotics competition.

It provides Mecanum-wheel omnidirectional movement, a fast wireless data link with a real-time video stream, and an open SDK for further development and programming.

To match the sim2real theme of the competition, the organizing committee equipped the RoboMaster EP with an external computing platform and additional sensors that communicate with the EP through the RoboMaster EP SDK. This setup realizes the robot's movement and grasping, completes the corresponding competition tasks, and is dedicated to connecting the simulation environment with real applications.

chassis-parts-annotation

The components of the RMUS EP

hardware-diagram-RM-EP

The hardware diagram of the RMUS EP

In addition to the original RoboMaster EP design, for the RMUS sim2real challenge we equipped the EP robot with a dedicated onboard lidar and an RGBD camera to gather further environment information, and with a NUC PC to extend the onboard computation capacity.

| Item | Spec |
| --- | --- |
| Type | NUC11PAHI7 |
| CPU | i7-1165G7 2.8GHz × 8 |
| RAM | 8GB |
| SSD | 256G |

1.2 Sensors

| Sensor | Type | Freq (Hz) | Others |
| --- | --- | --- | --- |
| Lidar | Slamtec RPLIDAR A2 | 10 | 8000 sps, 0.2m–16m range |
| RGB Camera | Intel RealSense D435i | 30 | 640×480 pixels, FOV 87° × 58° |
| Depth Camera | Intel RealSense D435i | 30 | 640×480 pixels, FOV 87° × 58° |
| IMU | HiPNUC Hi226 6-axis IMU/VRU | 30 | Static roll and pitch error: 0.8°; dynamic roll and pitch error bound: 2.5°; gyroscope bias error: < 10°/h; heading angle error when moving (6-axis mode): < 10° |
| Odometer | RoboMaster SDK | 10 | - |

1.3 Actuators

| Actuator | Recommended Range |
| --- | --- |
| chassis velocity control | $0.1m/s \leq \lvert v_x \rvert \leq 0.5m/s$<br>$0.1m/s \leq \lvert v_y \rvert \leq 0.5m/s$<br>$0.01rad/s \leq \lvert v_{th} \rvert \leq 0.5rad/s$ |
| chassis position control | $\lvert x \rvert \geq 0.1m$<br>$\lvert y \rvert \geq 0.1m$<br>$\lvert \theta \rvert \geq 0.1rad$ |
| arm end position control | while $0.09 \leq x \leq 0.18$, keep $y \geq 0.08$<br>while $x > 0.18$, keep $y \geq -0.02$ |
| gripper control | $x=1$ close gripper<br>$x=0$ open gripper |

1.4 Robot Model

To build the static TF tree according to the specific location of each sensor, the description file is located here:

~/ep_ws/src/ep_description/urdf/ep_description.xacro

The user should first test the development in the client image,

# static tf
$ roslaunch ep_description ep_description.launch

or include the part below in the launch file:

<launch>
...
  <!-- ep_description -->
  <include file="$(find ep_description)/launch/ep_description.launch"/>
...
</launch>

EP-Robot-tf_tree

EP Robot tf_tree

EP-Robot-tf_tree

Robot frame Axis

| Coordinate system | Description | Origin relative to /base_link (m) |
| --- | --- | --- |
| /base_link | The robot center coordinate system, located at the center of the four Mecanum wheels, at the height of the plane through the wheel centers | X Front, Y Left, Z Up |
| /laser_link | Lidar coordinate system | (0.12, 0, 0.1) |
| /imu_link | IMU coordinate system | (0, 0, 0.08) |
| /arm_link | The base coordinate system of the robot arm | (0, 0, 0.1) |
| /camera_bottom_screw_frame | RealSense camera base screw positioning hole coordinate system, the reference for the camera's other coordinate systems | (0.15, 0, 0.0125) |
| /camera_link | RealSense camera coordinate system | (0.16, 0.0175, 0.025) |
| /camera_depth_frame | Depth coordinate system of the RealSense camera, sharing the same origin as /camera_link | (0.16, 0.0175, 0.025) |
| /camera_color_frame | RGB coordinate system of the RealSense camera | (0.16, 0.0325, 0.025) |
| /camera_aligned_depth_to_color_frame | Coordinate system of the RealSense depth image aligned to the RGB frame, sharing the same origin as /camera_color_frame | (0.16, 0.0325, 0.025) |

/base_link

Robot center coordinate system, located at the center point of the four Mecanum wheels.

/imu_link

IMU coordinate system, with the same orientation as base_link.

/laser_link

2D LiDAR coordinate system, located at the center front of the vehicle.
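Because all of these transforms are static, a point measured in a sensor frame can be moved into /base_link by adding the frame origin from the table above. The sketch below is a minimal pure-Python illustration (the `FRAME_ORIGINS` dict and `to_base_link` helper are ours, and rotation is ignored for simplicity); on the robot, query tf2 instead.

```python
# Translate points from a sensor frame into /base_link using the static
# translations from the frame table. This sketch assumes the sensor frames
# are axis-aligned with /base_link (no rotation applied); use tf2 lookups
# on the real robot for the exact transform.

FRAME_ORIGINS = {              # origin of each frame relative to /base_link (m)
    "laser_link": (0.12, 0.0, 0.1),
    "imu_link": (0.0, 0.0, 0.08),
    "arm_link": (0.0, 0.0, 0.1),
    "camera_link": (0.16, 0.0175, 0.025),
}

def to_base_link(frame, point):
    """Translate an (x, y, z) point from `frame` into /base_link."""
    ox, oy, oz = FRAME_ORIGINS[frame]
    x, y, z = point
    return (ox + x, oy + y, oz + z)

# A lidar return 1 m straight ahead of the laser is ~1.12 m ahead of base_link:
print(to_base_link("laser_link", (1.0, 0.0, 0.0)))
```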

2. ROS interface

The RoboMaster EP platform environment is identical to the docker image provided to players, sharing the same ROS interface.

Enter the docker environment set up in the Install guide, then use the rostopic command from the CLI:

sim2real@Ubuntu:~$ rostopic list -v

All published/subscribed topics are printed:

Published topics:

  • /rosout_agg [rosgraph_msgs/Log] 1 publisher
  • /rosout [rosgraph_msgs/Log] 7 publishers
  • /cmd_vel [geometry_msgs/Twist] 1 publisher
  • /arm_position [geometry_msgs/Pose] 1 publisher
  • /arm_gripper [geometry_msgs/Point] 1 publisher
  • /image_view_rgb/parameter_descriptions [dynamic_reconfigure/ConfigDescription] 1 publisher
  • /image_view_third/parameter_descriptions [dynamic_reconfigure/ConfigDescription] 1 publisher
  • /image_view_depth/parameter_descriptions [dynamic_reconfigure/ConfigDescription] 1 publisher
  • /image_view_rgb/parameter_updates [dynamic_reconfigure/Config] 1 publisher
  • /image_view_third/parameter_updates [dynamic_reconfigure/Config] 1 publisher
  • /image_view_depth/parameter_updates [dynamic_reconfigure/Config] 1 publisher
  • /image_view_depth/output [sensor_msgs/Image] 1 publisher
  • /image_view_third/output [sensor_msgs/Image] 1 publisher
  • /image_view_rgb/output [sensor_msgs/Image] 1 publisher
  • /camera/color/image_raw [sensor_msgs/Image] 1 publisher
  • /third_rgb [sensor_msgs/Image] 1 publisher
  • /camera/color/camera_info [sensor_msgs/CameraInfo] 1 publisher
  • /camera/aligned_depth_to_color/image_raw [sensor_msgs/Image] 1 publisher
  • /camera/aligned_depth_to_color/camera_info [sensor_msgs/CameraInfo] 1 publisher
  • /pointgoal_with_gps_compass [ros_x_habitat/PointGoalWithGPSCompass] 1 publisher
  • /gps [ros_x_habitat/PointGoalWithGPSCompass] 1 publisher
  • /imu/data_raw [sensor_msgs/Imu] 1 publisher
  • /tf [tf2_msgs/TFMessage] 1 publisher
  • /rplidar/scan [sensor_msgs/LaserScan] 1 publisher
  • /ep/odom [nav_msgs/Odometry] 1 publisher
  • /gripper_state [geometry_msgs/Point] 1 publisher
  • /pose/cube_1 [geometry_msgs/Pose] 1 publisher
  • /pose/cube_2 [geometry_msgs/Pose] 1 publisher
  • /pose/cube_3 [geometry_msgs/Pose] 1 publisher
  • /pose/cube_4 [geometry_msgs/Pose] 1 publisher
  • /pose/cube_5 [geometry_msgs/Pose] 1 publisher
  • /position/target_1 [geometry_msgs/Point] 1 publisher
  • /position/target_2 [geometry_msgs/Point] 1 publisher
  • /position/target_3 [geometry_msgs/Point] 1 publisher
  • /pose/ep_world [geometry_msgs/Pose] 1 publisher
  • /judgement/exchange_markers [std_msgs/String] 1 publisher
  • /judgement/markers_time [std_msgs/String] 1 publisher

Subscribed topics:

  • /rosout [rosgraph_msgs/Log] 1 subscriber
  • /camera/aligned_depth_to_color/image_raw [sensor_msgs/Image] 1 subscriber
  • /third_rgb [sensor_msgs/Image] 1 subscriber
  • /camera/color/image_raw [sensor_msgs/Image] 1 subscriber
  • /pointgoal_with_gps_compass [ros_x_habitat/PointGoalWithGPSCompass] 1 subscriber
  • /cmd_vel [geometry_msgs/Twist] 1 subscriber
  • /arm_gripper [geometry_msgs/Point] 1 subscriber
  • /arm_position [geometry_msgs/Pose] 1 subscriber
  • /cmd_position [geometry_msgs/Twist] 1 subscriber
  • /gps/goal [move_base_msgs/MoveBaseActionGoal] 1 subscriber

2.1 Sensor topics

/ep/odom (nav_msgs/Odometry) default: 30hz

Odometry data, including robot pose and velocity information, obtained from the DJI master controller.

/rplidar/scan (type: sensor_msgs/LaserScan, frame_id:laser_link) default: 10hz

Two-dimensional lidar scan data containing scene scanning information. Because the robot body occludes part of the view, the effective scanning range covers the 270° in front of the robot.

/imu/data_raw (sensor_msgs/Imu) default: 30hz

IMU sensor data, including orientation, angular velocity, and acceleration information, collected by the IMU.

/camera/color/camera_info type: [sensor_msgs/CameraInfo], frame_id: camera_color_optical_frame

RGB color image camera intrinsic parameter information.

/camera/color/image_raw type: [sensor_msgs/Image], frame_id: camera_color_optical_frame

RGB color image data, acquired by Realsense.

/camera/aligned_depth_to_color/camera_info type: [sensor_msgs/CameraInfo], frame_id: camera_color_optical_frame

Depth camera intrinsic parameter information.

/camera/aligned_depth_to_color/image_raw type: [sensor_msgs/Image], frame_id: camera_color_optical_frame

Depth image data, acquired by Realsense and aligned to RGB color images.

Note: The Realsense camera provides more images to choose from. We chose the depth data aligned with the color image as the default depth data provided in the habitat server environment.
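When working with the aligned depth image, note that the RealSense driver typically publishes depth as 16-bit integers in millimeters (16UC1 encoding), with 0 meaning no valid return; verify the encoding on your own setup before relying on it. A minimal conversion sketch (the `depth_to_meters` helper is ours, not part of any driver API):

```python
# Convert raw aligned-depth pixel values to meters, treating 0 as "no
# valid return". Assumes the common 16-bit millimeter depth encoding.

def depth_to_meters(raw_value):
    """Map one raw depth pixel to meters, or None when invalid."""
    if raw_value == 0:
        return None
    return raw_value / 1000.0

# a hypothetical 2x2 patch of raw depth values
patch = [[500, 1200], [0, 16000]]
meters = [[depth_to_meters(v) for v in row] for row in patch]
print(meters)  # [[0.5, 1.2], [None, 16.0]]
```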

2.2 Actuator topics

/cmd_vel [geometry_msgs/Twist]

Velocity command for the EP chassis, recommended range:

$$0.1m/s\leq|v_x|\leq0.35m/s$$ $$0.1m/s\leq|v_y|\leq0.35m/s$$ $$0.1rad/s\leq|v_{th}|\leq0.5rad/s$$
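Before publishing to /cmd_vel, it can help to force each component into the recommended band. The sketch below is a pure-Python illustration of one possible policy (the `clamp_speed` helper is ours, not part of the SDK): commands below the minimum magnitude are zeroed, commands above the maximum are saturated.

```python
# Clamp a commanded chassis speed component into the recommended band
# [min_mag, max_mag], preserving sign. Commands smaller than the minimum
# magnitude are zeroed rather than amplified; the default thresholds
# follow the recommended v_x / v_y range above.

def clamp_speed(v, min_mag=0.1, max_mag=0.35):
    mag = abs(v)
    if mag < min_mag:
        return 0.0                        # too small to execute reliably
    if mag > max_mag:
        return max_mag if v > 0 else -max_mag
    return v

print(clamp_speed(0.5))    # 0.35
print(clamp_speed(0.05))   # 0.0
print(clamp_speed(-0.2))   # -0.2
```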

/cmd_position [geometry_msgs/Twist]

Position command for the EP chassis, recommended range:

$$0.1m\leq|x|$$ $$0.1m\leq|y|$$ $$0.1rad\leq|\theta|$$

# Move EP forward for 0.1m
$ rostopic pub -1 /cmd_position geometry_msgs/Twist "linear:
  x: 0.1
  y: 0.0
  z: 0.0
angular:
  x: 0.0
  y: 0.0
  z: 0.0"

/arm_position [geometry_msgs/Pose]

Position control command for the robotic arm, available range:

  • While $0.09\leq x \leq 0.18$, recommended $y\ge 0.08$
  • While $x > 0.18$, recommended $y\ge -0.02$

Take care of interference between the lidar's scanning range and the robotic arm.

The command corresponds to the ee_link of the robotic arm; the offset of the gripping position relative to ee_link is (0.11, -0.04).

# Move the end of the robotic arm to x=0.09, y=0.1
$ rostopic pub -1 /arm_position geometry_msgs/Pose "position:
  x: 0.09
  y: 0.1
  z: 0.0
orientation:
  x: 0.0
  y: 0.0
  z: 0.0
  w: 0.0"
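The workspace limits above can be checked before publishing an arm command. A minimal pure-Python sketch (the `arm_target_ok` helper is ours, not part of the SDK):

```python
# Validate an arm end-position target (x, y) against the recommended
# workspace from section 2.2: within 0.09 <= x <= 0.18 keep y >= 0.08,
# and for x > 0.18 keep y >= -0.02. Targets with x < 0.09 are rejected.

def arm_target_ok(x, y):
    if 0.09 <= x <= 0.18:
        return y >= 0.08
    if x > 0.18:
        return y >= -0.02
    return False

print(arm_target_ok(0.09, 0.1))   # True  (the example command above)
print(arm_target_ok(0.12, 0.0))   # False (y too low for this x)
print(arm_target_ok(0.20, 0.0))   # True
```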

/arm_gripper [geometry_msgs/Point]

The commands for gripper:

  • $x = 1$ to close
  • $x = 0$ to open
# the command issued now is to close the gripper
$ rostopic pub -1 /arm_gripper geometry_msgs/Point "x: 1.0
y: 0.0
z: 0.0" 

2.3 Others

2.3.1 Optional remote control

Both keyboard and game controller support are provided in carto_navigation. In principle, any XBOX 360 compatible controller can be used. To start the functionality:

# Keyboard
$ rosrun teleop_twist_keyboard teleop_twist_keyboard.py
# Game controller
$ roslaunch carto_navigation joy_stick.launch

or include the part below in the launch file:

<!-- Game controller -->
  <include file="$(find carto_navigation)/launch/joy_stick.launch"/>
<!-- Keyboard -->
  <node pkg="teleop_twist_keyboard" type="teleop_twist_keyboard.py" name="teleop_twist_keyboard_node" output="screen"/>

The keyboard functionality is described below:

  • Press i/j/l/, to control forward/backward/rotation movement.
  • Press I/J/L/< to control horizontal movement.
  • Press k to stop the robot.
  • Press 1 to move the robotic arm up.
  • Press 2 to move the robotic arm down.
  • Press 3 to close the gripper.
  • Press 4 to open the gripper.

The game controller functionality is described below:

  • Press a to move the robotic arm up.
  • Press b to move the robotic arm down.
  • Press c to close the gripper.
  • Press d to open the gripper.

2.3.2 The position of ore

/pose/cube_N [geometry_msgs/Pose]

The mineral ores are numbered with N from 1 to 5; e.g. /pose/cube_1 gives the pose information of ore 1:

$ rostopic echo /pose/cube_1

# the output:
# position: 
#   x: 1.0000563859939575
#   y: 0.09999138861894608
#   z: 3.400020122528076
# orientation: 
#   x: 0.00039166660280898213
#   y: -2.329996641492471e-05
#   z: -0.0011298519093543291
#   w: 0.9999992847442627
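The orientation above is a quaternion; for planar navigation it is often reduced to a yaw angle with the standard quaternion-to-Euler conversion. A minimal sketch (the `quat_to_yaw` helper is ours):

```python
import math

# Extract the yaw (rotation about Z) from a quaternion (x, y, z, w),
# using the standard ZYX Euler conversion formula.
def quat_to_yaw(x, y, z, w):
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

# identity quaternion -> zero yaw
print(quat_to_yaw(0.0, 0.0, 0.0, 1.0))  # 0.0
# 90-degree rotation about Z -> pi/2
print(quat_to_yaw(0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4)))
```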

2.3.3 The location of Exchange Station

/position/target_N [geometry_msgs/Point]

The Exchange Stations are numbered with N, ordered from left to right as 1/2/3; e.g. /position/target_1 gives the location of the leftmost Exchange Station:

$ rostopic echo /position/target_1

# the output:
# x: 2.325000047683716
# y: 0.085
# z: 1.8200000524520874

3. The Challenge field

challenge-field

The field includes the simulation coordinate system (world), the map coordinate system (map), the odometer coordinate system (odom), and the initial pose of the robot in simulation (initial pose); the specific initial location is (4.2, 0, 3.5).

4.0 Enter the docker

4.0.1 Docker image for habitat simulator

# start Habitat simulator docker image for the first time
$ cd docker_habitat
$ ./create_container.sh
$ ./exec.sh

To start the container again:

Note: Do NOT execute the create_container.sh script a second time, otherwise all modifications inside the container will be lost:

# Firstly
$ sudo docker ps -a #get the current [container ID]
$ sudo docker start [container_ID]
# after the container started
$ cd docker_habitat
$ ./exec.sh

4.0.2 Algorithm environment for users

# start the algorithm environment docker image for the first time
$ cd docker_sim2real
$ ./create_container_algo.sh
$ ./exec_algo.sh

To start the container again:

Note: Do NOT execute the create_container_algo.sh script a second time, otherwise all modifications inside the container will be lost:

# Firstly
$ sudo docker ps -a #get the current [container ID]
$ sudo docker start [container_ID]
# after the container started
$ cd docker_sim2real
$ ./exec_algo.sh

4.1 Map based on cartographer

$ roslaunch carto_navigation mapping_sim.launch

Config files are located at ~/ep_ws/src/carto_navigation/param/ as cartographer.lua and cartographer_localization.lua. When mapping is finished, save the map by running:

$ ~/ep_ws/src/carto_navigation/launch/map_writer.sh

By default, the map is saved to: ~/ep_ws/src/carto_navigation/maps/

4.2 Lidar navigation based on cartographer and move_base

$ roslaunch carto_navigation navigation_sim.launch

The planner in use (dwa/teb) can be modified in move_base.launch in the ~/ep_ws/src/carto_navigation/launch/ directory; the corresponding configuration files are located in ~/ep_ws/src/carto_navigation/param/ under the matching name.

The complete tf tree of lidar navigation based on cartographer is shown in the figure below:

cartographer-tf-tree

It contains the simulation environment coordinate system world, the map coordinate system map, and the odometer coordinate system odom; the robot sensor coordinate systems (laser_link and imu_link) are hidden in the figure.

navi_axis

5. Others

5.1 Environment variables related to ENV_ROBOT_MODE in launch files

Players should not change this part; the docker image already distinguishes between simulation and reality.

5.2 Remap for the right topics

In order to make each topic express its meaning more clearly, while avoiding conflicts between the same type of data from different sensors, topics such as /rplidar/scan and /ep/odom are used instead of the common /scan and /odom. Please use remap tags in your launch files to remap topics.
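For example, a node written against the generic /scan and /odom names can be pointed at the EP topics with remap tags (the package and node names below are placeholders, not part of the provided image):

```xml
<launch>
  <node pkg="my_nav_pkg" type="my_nav_node" name="my_nav_node" output="screen">
    <!-- point the generic names this node expects at the EP-specific topics -->
    <remap from="scan" to="/rplidar/scan"/>
    <remap from="odom" to="/ep/odom"/>
  </node>
</launch>
```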