Commit
Updating documents.
mktk1117 committed Dec 5, 2023
1 parent 702188c commit befa1d7
Showing 11 changed files with 328 additions and 240 deletions.
Binary file added docs/media/main_mem.png
Binary file added docs/media/overview.png
45 changes: 30 additions & 15 deletions docs/source/documentation.rst
@@ -14,41 +14,56 @@ Welcome to elevation mapping documentation
Index
---------------

| :doc:`getting_started/introduction` - What is elevation mapping cupy
| :doc:`getting_started/installation` - How to install the elevation map
| :doc:`getting_started/tutorial` - How to launch the first elevation map



This is a ROS package for elevation mapping on GPU. The elevation mapping code is written in Python and uses CuPy for GPU computation. The
plane segmentation is done independently and runs on the CPU. Once the plane segmentation is generated, local convex approximations of the
terrain can be generated efficiently.

.. image:: ../media/main_repo.png
:alt: Elevation map examples
.. image:: ../media/main_mem.png
:alt: Overview of the project


Citing
---------------
If you use elevation mapping cupy, please cite the following papers:

.. hint::

   Elevation Mapping for Locomotion and Navigation using GPU `Link <https://arxiv.org/abs/2204.12876>`_

   Takahiro Miki, Lorenz Wellhausen, Ruben Grandia, Fabian Jenelten, Timon Homberger, Marco Hutter

.. code-block:: none

   @misc{mikielevation2022,
      doi = {10.48550/ARXIV.2204.12876},
      author = {Miki, Takahiro and Wellhausen, Lorenz and Grandia, Ruben and Jenelten, Fabian and Homberger, Timon and Hutter, Marco},
      keywords = {Robotics (cs.RO), FOS: Computer and information sciences, FOS: Computer and information sciences},
      title = {Elevation Mapping for Locomotion and Navigation using GPU},
      publisher = {International Conference on Intelligent Robots and Systems (IROS)},
      year = {2022},
   }
If you use the multi-modal elevation mapping for color or semantic layers, please also cite:

.. hint::

   MEM: Multi-Modal Elevation Mapping for Robotics and Learning `Link <https://arxiv.org/abs/2309.16818v1>`_

   Gian Erni, Jonas Frey, Takahiro Miki, Matias Mattamala, Marco Hutter

.. code-block:: none

   @misc{Erni2023-bs,
      title = "{MEM}: {Multi-Modal} Elevation Mapping for Robotics and Learning",
      author = "Erni, Gian and Frey, Jonas and Miki, Takahiro and Mattamala, Matias and Hutter, Marco",
      publisher = {International Conference on Intelligent Robots and Systems (IROS)},
      year = {2023},
   }
71 changes: 0 additions & 71 deletions docs/source/getting_started/index.rst

This file was deleted.

61 changes: 49 additions & 12 deletions docs/source/getting_started/installation.rst
@@ -1,32 +1,55 @@
.. _installation:


Installation
******************************************************************

This section provides instructions for installing the necessary dependencies for the project. The installation covers setting up CUDA and cuDNN, installing the Python dependencies, and configuring CuPy.
Follow the instructions carefully to avoid installation issues.


Docker
==================================================================
We provide a Docker setup for the project.
To build the Docker image, run:

.. code-block:: bash

   cd <project_root>/docker
   ./build.sh

To run the Docker image, run:

.. code-block:: bash

   cd <project_root>/docker
   ./run.sh

This starts the Docker container and mounts the home directory of the host machine into the container.
After you clone the project repository into your catkin_ws, you can build the packages inside the container.
To build the packages inside the container, follow the instructions in the `Build section <#build>`_ of this document.


On Desktop or Laptop with NVIDIA GPU
==================================================================

CUDA & cuDNN
------------------------------------------------------------------

If you do not have CUDA and cuDNN installed, please install them first.
The tested versions are CUDA 10.2 and 11.6.

`CUDA <https://docs.nvidia.com/cuda/cuda-installation-guide-linux/index.html#ubuntu-installation>`_
`cuDNN <https://docs.nvidia.com/deeplearning/sdk/cudnn-install/index.html#install-linux>`_

You can check how to install them :ref:`here<cuda_installation>`.

Python dependencies
------------------------------------------------------------------

You will need

@@ -235,4 +258,18 @@ Detectron

.. code-block:: bash

   python3 -m pip install 'git+https://github.com/facebookresearch/detectron2.git'
Build
==================================================================
After installing all the dependencies, you can build the packages.
Clone the project repository into your catkin_ws/src directory.
Then, build the packages with catkin.

.. code-block:: bash

   cd <your_catkin_ws>
   catkin build elevation_mapping_cupy  # The core package
   catkin build convex_plane_decomposition_ros  # If you want to use plane segmentation
   catkin build semantic_sensor  # If you want to use semantic sensors
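
After the build, you can sanity-check the GPU array backend from Python. The snippet below is an illustrative check and not part of the package: it prefers CuPy and falls back to NumPy, so it also runs on machines without a GPU.

```python
# Sanity check for the GPU stack (illustrative, not part of the package):
# prefer CuPy, fall back to NumPy so the check runs without a GPU too.
try:
    import cupy as xp  # GPU-backed arrays, used by elevation_mapping_cupy
    backend = "cupy"
except ImportError:
    import numpy as xp  # CPU fallback with the same array API
    backend = "numpy"

# A tiny elevation-grid-like computation that exercises the array backend.
heights = xp.zeros((4, 4), dtype=xp.float32)
heights[1:3, 1:3] = 0.5          # a small raised patch
mean_height = float(heights.mean())

print(backend, mean_height)      # e.g. "numpy 0.125" without a GPU
```

If this prints `cupy`, the GPU path is available; otherwise revisit the CUDA/cuDNN and CuPy steps above.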
132 changes: 132 additions & 0 deletions docs/source/getting_started/introduction.rst
@@ -0,0 +1,132 @@
.. _introduction:



Introduction
******************************************************************
The Elevation Mapping CuPy software package represents an advancement in robotic navigation and locomotion.
Integrating with the Robot Operating System (ROS) and utilizing GPU acceleration, this framework enhances point cloud registration and ray casting,
crucial for efficient and accurate robotic movement, particularly in legged robots.

Used for Various Real-World Applications
-------------------------------------------------------------------
This software package has been rigorously tested in challenging environments, including the DARPA Subterranean Challenge,
demonstrating its effectiveness in complex, real-world scenarios.
It supports a wide range of applications, from underground exploration to advanced research in legged locomotion and autonomous navigation.

* **DARPA Subterranean Challenge**: This package was used by Team CERBERUS in the DARPA Subterranean Challenge.

`Team Cerberus <https://www.subt-cerberus.org/>`_

`CERBERUS in the DARPA Subterranean Challenge (Science Robotics) <https://www.science.org/doi/10.1126/scirobotics.abp9742>`_

* **ESA / ESRIC Space Resources Challenge**: This package was used for the Space Resources Challenge.

`Scientific exploration of challenging planetary analog environments with a team of legged robots (Science Robotics) <https://www.science.org/doi/full/10.1126/scirobotics.ade9548>`_




Key Features
-------------------------------------------------------------------
* **Height Drift Compensation**: Tackles state estimation drifts that can create mapping artifacts, ensuring more accurate terrain representation.

* **Visibility Cleanup and Artifact Removal**: Raycasting methods and an exclusion zone feature are designed to remove virtual artifacts and correctly interpret overhanging obstacles, preventing misidentification as walls.

* **Learning-based Traversability Filter**: Assesses terrain traversability using local geometry, improving path planning and navigation.

* **Versatile Locomotion Tools**: Incorporates smoothing filters and plane segmentation, optimizing movement across various terrains.

* **Multi-Modal Elevation Map (MEM) Framework**: Allows seamless integration of diverse data like geometry, semantics, and RGB information, enhancing multi-modal robotic perception.

* **GPU-Enhanced Efficiency**: Facilitates rapid processing of large data structures, crucial for real-time applications.

Overview
-------------------------------------------------------------------

.. image:: ../../media/overview.png
:alt: Overview of multi-modal elevation map structure

Overview of our multi-modal elevation map structure. The framework takes multi-modal images (purple) and multi-modal point clouds (blue) as
input. This data is entered into the elevation map by first associating it to the cells and then fusing it with different fusion algorithms into the various
layers of the map. Finally, the map can be post-processed with various custom plugins to generate new layers (e.g. traversability) or to process layers for
external components (e.g. line detection).
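
The cell-association and fusion step described above can be sketched in a few lines. This is a simplified illustration with an assumed grid size and resolution, not the package's actual implementation: points are binned into grid cells and fused with a running mean as a stand-in for the map's fusion algorithms.

```python
import numpy as np

# Illustrative sketch (not the package's implementation): associate 3D
# points to grid cells, then fuse heights per cell with a running mean.
resolution = 0.1                  # cell size in meters (assumed value)
grid = np.full((8, 8), np.nan)    # elevation layer, NaN = no data yet
counts = np.zeros((8, 8))         # number of points fused per cell

points = np.array([[0.05, 0.05, 1.0],   # x, y, z in the map frame [m]
                   [0.06, 0.04, 2.0],
                   [0.35, 0.35, 0.5]])

for x, y, z in points:
    i, j = int(x / resolution), int(y / resolution)  # cell association
    if np.isnan(grid[i, j]):
        grid[i, j], counts[i, j] = z, 1
    else:
        counts[i, j] += 1
        grid[i, j] += (z - grid[i, j]) / counts[i, j]  # running mean

print(grid[0, 0], grid[3, 3])   # → 1.5 0.5
```

The real map keeps many such layers (variance, traversability, semantics) and updates them on the GPU with different fusion rules per layer.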

Subscribed Topics
-------------------------------------------------------------------
The subscribed topics are specified under the **subscribers** parameter.

An example setup is in **elevation_mapping_cupy/config/core/example_setup.yaml**.

* **/<point_cloud_topic>** ([sensor_msgs/PointCloud2])

The point cloud topic. It can have additional channels for RGB, intensity, etc.

* **/<image_topic>** ([sensor_msgs/Image])

The image topic. It can have additional channels for RGB, semantic probabilities, image features, etc.

* **/<camera_info>** ([sensor_msgs/CameraInfo])

The camera info topic. It is used to project the point cloud into the image plane.

* **/<channel_info>** ([elevation_map_msgs/ChannelInfo])

If this topic is configured, the node will subscribe to it and use the information to associate the image channels to the elevation map layers.

* **/tf** ([tf/tfMessage])

The transformation tree.

* The plane segmentation node subscribes to an elevation map topic ([grid_map_msg/GridMap]). This can be configured in
**convex_plane_decomposition_ros/config/core/parameters.yaml**
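
For illustration, the projection enabled by the camera info topic is the standard pinhole model: a point in the camera frame is mapped to pixel coordinates with the intrinsic matrix K from sensor_msgs/CameraInfo. The intrinsic values below are assumed, and this sketch is not the package's code.

```python
import numpy as np

# Minimal pinhole projection sketch (illustrative, not the package's code):
# project a 3D point in the camera frame into pixel coordinates using the
# intrinsic matrix K carried by a sensor_msgs/CameraInfo message.
K = np.array([[500.0,   0.0, 320.0],    # fx,  0, cx  (assumed values)
              [  0.0, 500.0, 240.0],    #  0, fy, cy
              [  0.0,   0.0,   1.0]])

point_cam = np.array([0.2, -0.1, 2.0])  # x, y, z in the camera frame [m]

uvw = K @ point_cam                     # homogeneous pixel coordinates
u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]
print(u, v)   # → 370.0 215.0
```

Each point that lands inside the image bounds can then pick up the image channels (RGB, semantics, features) for fusion into the map layers.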

Published Topics
-------------------------------------------------------------------
For elevation_mapping_cupy, topics are published as set in the rosparam.
You can specify which layers to publish at which fps.

Under **publishers**, you can specify the topic_name, layers, basic_layers and fps.

.. code-block:: yaml

   publishers:
     your_topic_name:
       layers: [ 'list_of_layer_names', 'layer1', 'layer2' ]  # Choose from 'elevation', 'variance', 'traversability', 'time' + plugin layers
       basic_layers: [ 'list of basic layers', 'layer1' ]  # basic_layers for valid cell computation (e.g. Rviz): Choose a subset of `layers`.
       fps: 5.0  # Publish rate. Use a smaller value than `map_acquire_fps`.

An example setting is in `config/parameters.yaml`.
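
As a sanity check on such a configuration, one can verify that basic_layers is a subset of layers and that the publish rate does not exceed the map acquisition rate. This is an illustrative snippet, not part of the package; `map_acquire_fps` and the publisher entry below are assumed values.

```python
# Illustrative consistency check for a publishers configuration
# (not part of the package). Assumed values throughout.
map_acquire_fps = 10.0

publishers = {
    "elevation_map_raw": {
        "layers": ["elevation", "variance", "traversability", "time"],
        "basic_layers": ["elevation"],
        "fps": 5.0,
    },
}

def validate(publishers, map_acquire_fps):
    for name, cfg in publishers.items():
        # basic_layers must be a subset of layers
        assert set(cfg["basic_layers"]) <= set(cfg["layers"]), name
        # publish rate should not exceed the map acquisition rate
        assert cfg["fps"] <= map_acquire_fps, name
    return True

print(validate(publishers, map_acquire_fps))   # → True
```
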

* **elevation_map_raw** ([grid_map_msg/GridMap])

The entire elevation map.

* **elevation_map_recordable** ([grid_map_msg/GridMap])

The entire elevation map with slower update rate for visualization and logging.

* **elevation_map_filter** ([grid_map_msg/GridMap])

The filtered maps using plugins.

The plane segmentation node publishes the following:

* **planar_terrain** ([convex_plane_decomposition_msgs/PlanarTerrain])

A custom message that contains the full segmentation as a map together with the boundary information.

* **filtered_map** ([grid_map_msg/GridMap])

A grid map message to visualize the segmentation and some intermediate results. This information is also part of **planar_terrain**.

* **boundaries** ([visualization_msgs/MarkerArray])

A set of polygons that trace the boundaries of the segmented region. Holes and boundaries of a single region are published as separate
markers with the same color.

* **insets** ([visualization_msgs/PolygonArray])

A set of polygons that are at a slight inward offset from **boundaries**. There might be more insets than boundaries since the inward
shift can cause a single region to break down into multiple when narrow passages exist.