The RoboMaster EP is an autonomous vehicle platform equipped with a 4-axis palletizing robot arm and an encircling gripper for flexible gripping, inspired by DJI's annual RoboMaster robotics competition.
It provides Mecanum-wheel omnidirectional movement, a fast wireless data link with a real-time video stream, and an open SDK for further development and programming.
To match the sim2real theme of the competition, the organizing committee equipped the RoboMaster EP with an external computing platform and additional sensors. The external platform communicates with the EP through the RoboMaster EP SDK to control the robot's movement and grasping, complete the corresponding competition tasks, and bridge the simulation environment with real-world applications.
The components of the RMUS EP
The hardware diagram of the RMUS EP
In addition to the original RoboMaster EP design, for the RMUS sim2real challenge we equipped the EP robot with a dedicated onboard lidar and an RGB-D camera to gather richer environment information, and a NUC PC to extend the onboard computing capacity.
| Type | NUC11PAHI7 |
|---|---|
| CPU | i7-1165G7 2.8 GHz × 8 |
| RAM | 8 GB |
| SSD | 256 GB |
| Sensor | Type | Freq (Hz) | Others |
|---|---|---|---|
| Lidar | SlamTec RPLIDAR A2 | 10 | 8000 samples/s, range 0.2 m–16 m |
| RGB Camera | Intel RealSense D435i | 30 | 640×480 pixels, FOV 87°×58° |
| Depth Camera | Intel RealSense D435i | 30 | 640×480 pixels, FOV 87°×58° |
| IMU | HiPNUC Hi226 6-axis IMU/VRU | 30 | Static roll/pitch error: 0.8°; static roll/pitch error bound: 2.5°; gyroscope bias error: < 10°/h; heading error while moving (6-axis mode): < 10° |
| Odometer | RoboMaster SDK | 10 | - |
| Actuator | Recommended Range |
|---|---|
| chassis velocity control | |
| chassis position control | |
| arm end position control | $y \ge 0.08$ while $0.09 \leq x \leq 0.18$; $y \ge -0.02$ while $x > 0.18$ |
| gripper control | $x = 1$ to close, $x = 0$ to open |
To build the static TF tree according to the specific location of each sensor, the description file is located here:
~/ep_ws/src/ep_description/urdf/ep_description.xacro
Users should first test their development in the client image:
# static tf
$ roslaunch ep_description ep_description.launch
or include the part below in the launch file:
<launch>
...
<!-- ep_description -->
<include file="$(find ep_description)/launch/ep_description.launch"/>
...
</launch>
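To check that the static transforms are actually being broadcast, the current tf tree can be inspected with rqt_tf_tree, which ships with standard ROS desktop installs:
# visualize the live tf tree
$ rosrun rqt_tf_tree rqt_tf_tree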
EP Robot tf_tree
Robot frame Axis
| Coordinate system | Description | Origin relative to /base_link (m) |
|---|---|---|
| /base_link | Robot center coordinate system, located at the center of the four Mecanum wheels, at the height of the plane through the wheel centers. | X front, Y left, Z up |
| /laser_link | Lidar coordinate system | (0.12, 0, 0.1) |
| /imu_link | IMU coordinate system | (0, 0, 0.08) |
| /arm_link | Base coordinate system of the robot arm | (0, 0, 0.1) |
| /camera_bottom_screw_frame | RealSense camera base screw positioning hole coordinate system, used as the reference for the camera's other coordinate systems | (0.15, 0, 0.0125) |
| /camera_link | RealSense camera coordinate system | (0.16, 0.0175, 0.025) |
| /camera_depth_frame | Depth coordinate system of the RealSense camera, coincident with /camera_link | (0.16, 0.0175, 0.025) |
| /camera_color_frame | RGB coordinate system of the RealSense camera | (0.16, 0.0325, 0.025) |
| /camera_aligned_depth_to_color_frame | Coordinate system of the RealSense depth image aligned to the RGB frame, coincident with /camera_color_frame | (0.16, 0.0325, 0.025) |
- /base_link: robot center coordinate system, located at the center point of the 4 Mecanum wheels.
- /imu_link: IMU coordinate system, same as /base_link.
- /laser_link: 2D LiDAR coordinate system, located at the front center of the vehicle.
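A single transform from the table above can also be queried directly; for example, the lidar mounting offset, whose expected translation per the table is [0.120, 0.000, 0.100]:
# print the transform from base_link to laser_link
$ rosrun tf tf_echo base_link laser_link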
The RoboMaster EP platform environment is the same as the docker image provided to players, sharing the same ROS interface.
Enter the docker setup from the Install guide and use the rostopic command from the CLI:
sim2real@Ubuntu:~$ rostopic list -v
All published and subscribed topics are printed:
Published topics:
- /rosout_agg [rosgraph_msgs/Log] 1 publisher
- /rosout [rosgraph_msgs/Log] 7 publishers
- /cmd_vel [geometry_msgs/Twist] 1 publisher
- /arm_position [geometry_msgs/Pose] 1 publisher
- /arm_gripper [geometry_msgs/Point] 1 publisher
- /image_view_rgb/parameter_descriptions [dynamic_reconfigure/ConfigDescription] 1 publisher
- /image_view_third/parameter_descriptions [dynamic_reconfigure/ConfigDescription] 1 publisher
- /image_view_depth/parameter_descriptions [dynamic_reconfigure/ConfigDescription] 1 publisher
- /image_view_rgb/parameter_updates [dynamic_reconfigure/Config] 1 publisher
- /image_view_third/parameter_updates [dynamic_reconfigure/Config] 1 publisher
- /image_view_depth/parameter_updates [dynamic_reconfigure/Config] 1 publisher
- /image_view_depth/output [sensor_msgs/Image] 1 publisher
- /image_view_third/output [sensor_msgs/Image] 1 publisher
- /image_view_rgb/output [sensor_msgs/Image] 1 publisher
- /camera/color/image_raw [sensor_msgs/Image] 1 publisher
- /third_rgb [sensor_msgs/Image] 1 publisher
- /camera/color/camera_info [sensor_msgs/CameraInfo] 1 publisher
- /camera/aligned_depth_to_color/image_raw [sensor_msgs/Image] 1 publisher
- /camera/aligned_depth_to_color/camera_info [sensor_msgs/CameraInfo] 1 publisher
- /pointgoal_with_gps_compass [ros_x_habitat/PointGoalWithGPSCompass] 1 publisher
- /gps [ros_x_habitat/PointGoalWithGPSCompass] 1 publisher
- /imu/data_raw [sensor_msgs/Imu] 1 publisher
- /tf [tf2_msgs/TFMessage] 1 publisher
- /rplidar/scan [sensor_msgs/LaserScan] 1 publisher
- /ep/odom [nav_msgs/Odometry] 1 publisher
- /gripper_state [geometry_msgs/Point] 1 publisher
- /pose/cube_1 [geometry_msgs/Pose] 1 publisher
- /pose/cube_2 [geometry_msgs/Pose] 1 publisher
- /pose/cube_3 [geometry_msgs/Pose] 1 publisher
- /pose/cube_4 [geometry_msgs/Pose] 1 publisher
- /pose/cube_5 [geometry_msgs/Pose] 1 publisher
- /position/target_1 [geometry_msgs/Point] 1 publisher
- /position/target_2 [geometry_msgs/Point] 1 publisher
- /position/target_3 [geometry_msgs/Point] 1 publisher
- /pose/ep_world [geometry_msgs/Pose] 1 publisher
- /judgement/exchange_markers [std_msgs/String] 1 publisher
- /judgement/markers_time [std_msgs/String] 1 publisher
Subscribed topics:
- /rosout [rosgraph_msgs/Log] 1 subscriber
- /camera/aligned_depth_to_color/image_raw [sensor_msgs/Image] 1 subscriber
- /third_rgb [sensor_msgs/Image] 1 subscriber
- /camera/color/image_raw [sensor_msgs/Image] 1 subscriber
- /pointgoal_with_gps_compass [ros_x_habitat/PointGoalWithGPSCompass] 1 subscriber
- /cmd_vel [geometry_msgs/Twist] 1 subscriber
- /arm_gripper [geometry_msgs/Point] 1 subscriber
- /arm_position [geometry_msgs/Pose] 1 subscriber
- /cmd_position [geometry_msgs/Twist] 1 subscriber
- /gps/goal [move_base_msgs/MoveBaseActionGoal] 1 subscriber
/ep/odom
(nav_msgs/Odometry) default: 30 Hz
Odometer data, including robot pose and velocity, obtained from the DJI master controller.
/rplidar/scan
(type: sensor_msgs/LaserScan, frame_id: laser_link) default: 10 Hz
Two-dimensional lidar scan data of the scene. Because the robot body occludes part of the view, the usable scan covers the 270° in front of the robot.
/imu/data_raw
(sensor_msgs/Imu) default: 30 Hz
IMU sensor data, including orientation, angular velocity, and acceleration, collected by the IMU.
/camera/color/camera_info
type: [sensor_msgs/CameraInfo], frame_id: camera_color_optical_frame
Intrinsic parameters of the RGB color camera.
/camera/color/image_raw
type: [sensor_msgs/Image], frame_id: camera_color_optical_frame
RGB color image data, acquired by Realsense.
/camera/aligned_depth_to_color/camera_info
type: [sensor_msgs/CameraInfo], frame_id: camera_color_optical_frame
Intrinsic parameters of the depth camera (aligned to the color image).
/camera/aligned_depth_to_color/image_raw
type: [sensor_msgs/Image], frame_id: camera_color_optical_frame
Depth image data, acquired by Realsense and aligned to RGB color images.
Note: The Realsense camera provides more images to choose from. We chose the depth data aligned with the color image as the default depth data provided in the habitat server environment.
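For a quick visual check of these streams, the stock image_view node can be pointed at either topic (assuming the image_view package is installed in the image):
# view the RGB stream
$ rosrun image_view image_view image:=/camera/color/image_raw
# view the aligned depth stream
$ rosrun image_view image_view image:=/camera/aligned_depth_to_color/image_raw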
/cmd_vel
[geometry_msgs/Twist]
Velocity command for the EP chassis, recommended range:
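As with the /cmd_position example below, the command can be tested from the CLI. The 0.2 m/s used here is only an illustrative, conservative value, not the official recommended range; /cmd_vel is normally published continuously, so -r 10 republishes at 10 Hz until Ctrl-C:
# Drive the EP forward at 0.2 m/s (Ctrl-C to stop)
$ rostopic pub -r 10 /cmd_vel geometry_msgs/Twist "linear:
  x: 0.2
  y: 0.0
  z: 0.0
angular:
  x: 0.0
  y: 0.0
  z: 0.0"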
/cmd_position
[geometry_msgs/Twist]
Position command for the EP chassis, recommended range:
# Move the EP forward by 0.1 m
$ rostopic pub -1 /cmd_position geometry_msgs/Twist "linear:
  x: 0.1
  y: 0.0
  z: 0.0
angular:
  x: 0.0
  y: 0.0
  z: 0.0"
/arm_position
[geometry_msgs/Pose]
Position control command for the robotic arm, available range:

- While $0.09 \leq x \leq 0.18$, recommended $y \ge 0.08$
- While $x > 0.18$, recommended $y \ge -0.02$

Take care of interference between the lidar's scan range and the robotic arm.
The command corresponds to the ee_link of the robotic arm; the offset from ee_link to the gripping position is (0.11, -0.04).
# Move the end of the robotic arm to x=0.09, y=0.1
$ rostopic pub -1 /arm_position geometry_msgs/Pose "position:
  x: 0.09
  y: 0.1
  z: 0.0
orientation:
  x: 0.0
  y: 0.0
  z: 0.0
  w: 0.0"
/arm_gripper
[geometry_msgs/Point]
Commands for the gripper:

- $x = 1$ to close
- $x = 0$ to open

# close the gripper
$ rostopic pub -1 /arm_gripper geometry_msgs/Point "x: 1.0
y: 0.0
z: 0.0"
Both the keyboard and the game controller support provided in carto_navigation are available. In principle, any XBOX 360-compatible controller can be used. To start the functionality:
# Keyboard
$ rosrun teleop_twist_keyboard teleop_twist_keyboard.py
# Game controller
$ roslaunch carto_navigation joy_stick.launch
or include the part below in the launch file:
<!-- Game controller -->
<include file="$(find carto_navigation)/launch/joy_stick.launch"/>
<!-- Keyboard -->
<node pkg="teleop_twist_keyboard" type="teleop_twist_keyboard.py" name="teleop_twist_keyboard_node" output="screen"/>
The keyboard functionality is described below:

- Press `i`, `j`, `,`, `l` to control forward/backward/rotation.
- Press `I`, `J`, `<`, `L` to control horizontal movement.
- Press `k` to stop the robot.
- Press `1` to move the robotic arm up.
- Press `2` to move the robotic arm down.
- Press `3` to close the gripper.
- Press `4` to open the gripper.
The game controller functionality is described below:

- Press `a` to move the robotic arm up.
- Press `b` to move the robotic arm down.
- Press `c` to close the gripper.
- Press `d` to open the gripper.
/pose/cube_N
[geometry_msgs/Pose]
The pose of the mineral ore numbered N, from 1 to 5; e.g. /pose/cube_1 carries the pose information of ore 1:
$ rostopic echo /pose/cube_1
# the output:
# position:
# x: 1.0000563859939575
# y: 0.09999138861894608
# z: 3.400020122528076
# orientation:
# x: 0.00039166660280898213
# y: -2.329996641492471e-05
# z: -0.0011298519093543291
# w: 0.9999992847442627
/position/target_N
[geometry_msgs/Point]
The Exchange Station numbered N, ordered from left to right as 1/2/3; e.g. /position/target_1 carries the position of the leftmost Exchange Station:
$ rostopic echo /position/target_1
# the output:
# x: 2.325000047683716
# y: 0.085
# z: 1.8200000524520874
This includes the simulation coordinate system (world), the map coordinate system (map), the odometer coordinate system (odom), and the initial pose of the robot in the simulation (initial pose), whose specific location is (4.2, 0, 3.5).
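The robot's pose in the simulation world frame is published on /pose/ep_world (listed in the topics above), so the pose can be checked directly:
# print a single message with the robot pose in the world frame
$ rostopic echo -n 1 /pose/ep_world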
# start Habitat simulator docker image for the first time
$ cd docker_habitat
$ ./create_container.sh
$ ./exec.sh
Start the image again.
Note: do NOT execute the create_container.sh script again, otherwise all modifications to the container will be lost:
# First
$ sudo docker ps -a  # get the current [container_ID]
$ sudo docker start [container_ID]
# after the container has started
$ cd docker_habitat
$ ./exec.sh
# start the sim2real algorithm docker image for the first time
$ cd docker_sim2real
$ ./create_container_algo.sh
$ ./exec_algo.sh
Start the image again.
Note: do NOT execute the create_container_algo.sh script again, otherwise all modifications to the container will be lost:
# First
$ sudo docker ps -a  # get the current [container_ID]
$ sudo docker start [container_ID]
# after the container has started
$ cd docker_sim2real
$ ./exec_algo.sh
$ roslaunch carto_navigation mapping_sim.launch
Config files are located in ~/ep_ws/src/carto_navigation/param/ as cartographer.lua and cartographer_localization.lua. Once mapping is finished, save the map by running:
$ ~/ep_ws/src/carto_navigation/launch/map_writer.sh
By default, the map is saved to ~/ep_ws/src/carto_navigation/maps/
$ roslaunch carto_navigation navigation_sim.launch
The planner used (dwa/teb) can be changed in move_base.launch in the ~/ep_ws/src/carto_navigation/launch/ directory; the corresponding configuration files are located in ~/ep_ws/src/carto_navigation/param/ under the matching name.
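The /gps/goal topic in the subscribed list above accepts move_base goals. A sketch for sending a goal from the CLI, assuming /gps/goal is wired to move_base as its message type suggests (the coordinates are placeholders):
# send a navigation goal in the map frame; unspecified fields default to zero
$ rostopic pub -1 /gps/goal move_base_msgs/MoveBaseActionGoal '{goal: {target_pose: {header: {frame_id: "map"}, pose: {position: {x: 1.0, y: 0.5, z: 0.0}, orientation: {w: 1.0}}}}}'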
The complete tf tree for lidar navigation based on cartographer is shown in the figure below. It contains the simulation environment coordinate system world, the map coordinate system map, and the odometer coordinate system odom, and hides the robot sensor coordinate systems (laser_link and imu_link).
Players should not change this part; the docker image already distinguishes between simulation and reality.
To make each topic's meaning clearer, and to avoid conflicts between the same types of data from different sensors, topics such as /rplidar/scan and /ep/odom are used instead of the common /scan and /odom; please use remap tags in your launch files, as shown below.
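For example, a node that expects the conventional topic names can be remapped in its launch file (the package and node names here are placeholders):
<node pkg="your_pkg" type="your_node" name="your_node" output="screen">
  <!-- map the conventional names onto the EP topics -->
  <remap from="scan" to="/rplidar/scan"/>
  <remap from="odom" to="/ep/odom"/>
</node>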