This repository hosts the code for ViPER, accepted for CoRL 2024.
ViPER is a neural framework for visibility-based pursuit-evasion, where agents cooperatively search for worst-case evaders. A team of agents methodically explores and clears the entire environment by expanding frontiers, turning all contaminated areas (white) into cleared ones (green).
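To make the frontier idea concrete, here is a minimal sketch of frontier detection on a toy occupancy grid. This is a conceptual illustration only, not the repository's implementation; the cell encoding (0 = cleared, 1 = contaminated, 2 = obstacle) is an assumption chosen for this example.

```python
# Conceptual sketch of frontier detection (NOT ViPER's actual code).
# Cell encoding (assumed for illustration): 0 = cleared, 1 = contaminated, 2 = obstacle.

def find_frontiers(grid):
    """Return (row, col) of cleared cells that border a contaminated cell."""
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 0:  # only cleared cells can be frontiers
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 1:
                    frontiers.append((r, c))
                    break
    return frontiers

grid = [
    [0, 0, 1],
    [0, 2, 1],
    [0, 0, 1],
]
print(find_frontiers(grid))  # prints [(0, 1), (2, 1)]
```

Expanding such frontiers until none remain is what turns every contaminated (white) region into a cleared (green) one, guaranteeing a worst-case evader has nowhere left to hide.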
Use conda and pip to set up the environment:
```shell
conda create -n viper python=3.11 scikit-image imageio tensorboard matplotlib pytorch pytorch-cuda=11.8 -c pytorch -c nvidia -y
conda activate viper
pip install ray wandb opencv-python-headless
bash ./utils/download.sh
```
Set appropriate parameters in `test_parameter.py` and run `test_driver.py` to evaluate.
You can also create your own map by running `viper_demo.py`, which opens a canvas for you to draw on.
- Use Obstacle and Free Space brushes to draw your map. Adjust the brush size with the thickness slider.
- Click Reset to clear the canvas, setting it entirely to obstacles or free space.
- Click Random Map to load a random map from the test map dataset.
- Click Place Agents to place multiple agents in the free space.
- Click Play to observe how ViPER agents plan their paths. The canvas will close and the interactive demo will play automatically.
Alternatively, you can save the map you created:
- Click Start Position to place the starting position of agents.
- Click Save Map before closing the canvas. Your map will be saved as `maps_spec/map.png`.
Make sure you have downloaded the map dataset.
Set appropriate parameters in `parameter.py` and run `driver.py` to train the model.
If you find our work useful, please consider citing our paper:
```bibtex
@inproceedings{wang2024viper,
  title={ViPER: Visibility-based Pursuit-Evasion via Reinforcement Learning},
  author={Wang, Yizhuo and Cao, Yuhong and Chiun, Jimmy and Koley, Subhadeep and Pham, Mandy and Sartoretti, Guillaume},
  booktitle={8th Annual Conference on Robot Learning},
  year={2024}
}
```
Authors: Yizhuo Wang, Yuhong Cao, Jimmy Chiun, Subhadeep Koley, Mandy Pham, Guillaume Sartoretti