
Self-Learning Exploration and Mapping for Mobile Robots via Deep Reinforcement Learning

This repository contains code for robot exploration with Deep Reinforcement Learning (DRL). The agent uses the local structure of the environment to predict the robot's optimal sensing action. A demonstration video can be found here.


Dependencies

  • Python 3
  • scikit-image
    pip3 install scikit-image
    
  • tensorboardX
    pip3 install tensorboardX
    
  • TensorFlow (this code was written under TF1.x but has been modified to be compatible with TF2; see the import check after this list)
  • pybind11 (seamless operability between C++11 and Python)
    git clone https://github.com/pybind/pybind11.git
    cd pybind11
    mkdir build && cd build
    cmake ..
    sudo make install
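
After installing the dependencies, a quick import check can confirm they are in place. The tensorflow.compat.v1 shim below is the standard way to run TF1.x-style code under TF2; whether the repo's scripts use exactly this pattern is an assumption:

    # Hedged sketch: verify the dependencies listed above are importable.
    import skimage                     # scikit-image
    import tensorboardX                # logging for the training curves
    import tensorflow.compat.v1 as tf  # TF1.x-style API under TF2 (assumed pattern)
    tf.disable_v2_behavior()

    print("scikit-image:", skimage.__version__)
    print("tensorboardX:", tensorboardX.__version__)
    print("tensorflow:  ", tf.VERSION)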
    

Compile

You can use the following commands to download and compile the package.

git clone https://github.com/RobustFieldAutonomyLab/DRL_robot_exploration.git
cd DRL_robot_exploration
mkdir build && cd build
cmake ..
make
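
The build step presumably compiles the repository's C++ components into pybind11 extension modules that the Python scripts load at runtime. A hedged sketch of importing such a module from the build directory (the module name robot_sim is hypothetical; substitute whatever shared library make actually produces):

    # Hedged sketch: import a pybind11 extension built in ../build.
    # "robot_sim" is a hypothetical name, not necessarily this repo's module.
    import sys
    sys.path.append("../build")  # path relative to DRL_robot_exploration/scripts
    import robot_sim             # the compiled .so from the build step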

How to Run?

  • For the CNN policy:

    cd DRL_robot_exploration/scripts
    python3 tf_policy_cnn.py
    
  • For the RNN policy:

    cd DRL_robot_exploration/scripts
    python3 tf_policy_rnn.py
    
  • To select the running mode, edit the flags at the beginning of the tf_policy scripts:

    # select mode
    TRAIN = False
    PLOT = True
    

    Set TRAIN = False to run the saved policy, or set TRAIN = True to train your own. Set PLOT = True to show visualization plots. A sketch of this flag pattern appears after this list.

  • To show the average reward during training (see the logging sketch after this list):

    cd DRL_robot_exploration
    tensorboard --logdir=log
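
For reference, here is a minimal sketch of the TRAIN/PLOT switch under the TF1-compatible API. The checkpoint path and the stand-in variable are hypothetical; the actual scripts define their own graph and paths:

    # Hedged sketch of the TRAIN/PLOT mode switch (TF1-style API under TF2).
    import tensorflow.compat.v1 as tf
    tf.disable_v2_behavior()

    # select mode
    TRAIN = False
    PLOT = True

    # stand-in for the policy network's variables (hypothetical)
    w = tf.get_variable("policy/w", shape=[1])

    saver = tf.train.Saver()
    with tf.Session() as sess:
        if TRAIN:
            sess.run(tf.global_variables_initializer())
            # ... training loop; periodically saver.save(sess, "model/policy.ckpt") ...
        else:
            # run the saved policy; the checkpoint path is hypothetical
            saver.restore(sess, "model/policy.ckpt")
            # ... evaluation rollout; draw plots when PLOT is True ...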
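
The log directory read by TensorBoard is presumably populated during training via tensorboardX. A minimal sketch of that logging pattern (the tag name "average reward" and the loop are illustrative assumptions):

    # Hedged sketch: how training code typically writes the reward curve
    # that "tensorboard --logdir=log" displays.
    from tensorboardX import SummaryWriter

    writer = SummaryWriter("log")
    for episode in range(3):      # stand-in training loop
        avg_reward = 0.0          # stand-in value
        writer.add_scalar("average reward", avg_reward, episode)
    writer.close()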
    

Cite

Please cite our paper if you use any of this code:

@inproceedings{ExplorDRL2019,
  title={Self-Learning Exploration and Mapping for Mobile Robots via Deep Reinforcement Learning},
  author={Chen, Fanfei and Bai, Shi and Shan, Tixiao and Englot, Brendan},
  booktitle={AIAA SciTech Forum},
  pages={0396},
  year={2019},
}
