
ROS Package of Door State Estimation and Parameter Identification for Mobile Manipulation


A ROS package for door and handle detection, tracking, and state estimation using YOLOv5 and python-pcl.

Maintainer: Yifei Dong

Affiliation: Robotic Systems Lab, ETH Zurich (Master thesis project)

Contact: [email protected]

Table of Contents

Setup Instructions
Build Instructions
Running Instructions
Acknowledgment

Setup Instructions

First, clone the project repository into the src directory of your catkin workspace and switch to the detection branch:

git clone https://github.com/YvesDong/door_estimation_tracking.git
cd door_estimation_tracking
git checkout yolov5_door_detection

To set up the Python dependencies, run the provided bash script:

./install.sh

Note: Please follow the upstream installation guides for the other required dependencies: TensorRT (with CUDA 10.2), python-pcl, vision_opencv, and geometry2.
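Once install.sh and the dependencies above are installed, a quick import check helps confirm they are visible to Python 3. This is a minimal sketch, assuming the standard module names of these libraries (python-pcl imports as pcl, TensorRT as tensorrt); numpy, cv2, and torch are included on the assumption that the detector uses them.

import importlib

# Quick sanity check: try to import the Python-level dependencies and report versions.
for name in ("pcl", "tensorrt", "numpy", "cv2", "torch"):
    try:
        module = importlib.import_module(name)
        print(f"{name}: OK ({getattr(module, '__version__', 'version unknown')})")
    except ImportError as err:
        print(f"{name}: MISSING ({err})")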

Build Instructions

The packages geometry2 (tf), python-pcl, and vision_opencv (cv_bridge) default to Python 2. To make them compatible with Python 3.6, pass the flags below to catkin_make. (The example paths are for a Jetson Xavier with an ARM64 architecture; modify them if needed.)

cd /PATH/TO/catkin_ws
catkin_make -DPYTHON_EXECUTABLE:FILEPATH=/usr/bin/python3 -DPYTHON_INCLUDE_DIR=/usr/include/python3.6m -DPYTHON_LIBRARY=/usr/lib/aarch64-linux-gnu/libpython3.6m.so
source devel/setup.bash
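After sourcing the workspace, cv_bridge and tf2_ros should resolve to the Python 3 builds. The following is a minimal check, not part of this package; the image round-trip simply exercises the compiled cv_bridge bindings, where Python 2/3 mismatches usually surface.

import numpy as np
import cv_bridge
import tf2_ros

print("cv_bridge loaded from:", cv_bridge.__file__)
print("tf2_ros loaded from:", tf2_ros.__file__)

# Round-trip a small empty image through the compiled cv_bridge bindings.
bridge = cv_bridge.CvBridge()
msg = bridge.cv2_to_imgmsg(np.zeros((4, 4, 3), dtype=np.uint8), encoding="bgr8")
img = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
print("cv_bridge round-trip OK, image shape:", img.shape)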

Running Instructions

To obtain the best inference results, please switch detection models in ... according to the size of your input images (see the illustrative sketch below).

Also change the other config entries to match the names of your dataset.
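For reference, this is roughly how different YOLOv5 model sizes are selected when loading weights through torch.hub. It is an illustration only: the weight names are the standard ultralytics releases rather than files from this repository, the image path is hypothetical, and this package may instead load its detector through TensorRT engines.

import torch

# Standard ultralytics YOLOv5 variants, ordered roughly by capacity and cost.
# Small inputs usually work well with "yolov5s"; larger images tend to
# benefit from "yolov5m" or "yolov5l".
model = torch.hub.load("ultralytics/yolov5", "yolov5s")
model.conf = 0.4  # confidence threshold

results = model("door_image.jpg")  # hypothetical image; numpy arrays also work
results.print()                    # summary of detected classes
detections = results.xyxy[0]       # (N, 6) tensor: x1, y1, x2, y2, confidence, class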

Please download bag files from the link.

For an example run without changing the config, please download this one first.

After downloading, start a ROS master in one console:

roscore

Then replay the bag in another console:

rosbag play <name>.bag

In a third console, run the estimator while the bag is replaying:

cd src/alma_handle_detection
python3 estimate.py

Pause or resume the bag replay according to the prompts in the console (pressing the space bar in the rosbag console toggles pausing).

When the interactive window pops up, manually select an RoI (the target door) to initialize the estimator.
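The selection step typically looks like OpenCV's built-in RoI selector. The snippet below is a sketch only, assuming the estimator uses something akin to cv2.selectROI; the window name and image file are illustrative, not taken from this package.

import cv2

# Illustrative only: let the user drag a box around the target door in one frame.
image = cv2.imread("first_frame.png")  # hypothetical frame grabbed from the camera topic
x, y, w, h = cv2.selectROI("Select target door", image, showCrosshair=True, fromCenter=False)
cv2.destroyWindow("Select target door")

if w > 0 and h > 0:
    print(f"Initial RoI: x={x}, y={y}, width={w}, height={h}")
else:
    print("Selection cancelled")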

Visualize the estimation results in RViz (door_estimation/viz/rviz.rviz), rqt_image_view, or rqt_multiplot (door_estimation/viz/plot.xml):

rviz -d door_estimation/viz/rviz.rviz
rqt_multiplot
rqt_image_view

Results in RViz simulation

Acknowledgment

This repository builds on code from the following projects:
