This is the code for the 2018 CoRL paper ESIM: an Open Event Camera Simulator by Henri Rebecq, Daniel Gehrig and Davide Scaramuzza:
```bibtex
@Article{Rebecq18corl,
  author  = {Henri Rebecq and Daniel Gehrig and Davide Scaramuzza},
  title   = {{ESIM}: an Open Event Camera Simulator},
  journal = {Conf. on Robot Learning (CoRL)},
  year    = 2018,
  month   = oct
}
```
You can find a PDF of the paper here. If you use any of this code, please cite the publication above.

ESIM offers the following features:
- Accurate event simulation, guaranteed by the tight integration between the rendering engine and the event simulator
- Inertial Measurement Unit (IMU) simulation
- Support for multi-camera systems
- Ground truth camera poses, IMU biases, angular/linear velocities, depth maps, and optic flow maps
- Support for camera distortion (planar and panoramic renderers only)
- Different positive (C+) and negative (C-) contrast thresholds
- Basic noise simulation for event cameras, based on additive Gaussian noise on the contrast threshold (see the sketch after this list)
- Motion blur simulation
- Publish to ROS and/or save data to a rosbag (a read-back sketch is given below the installation notes)
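To make the contrast-threshold and noise items above concrete, here is a minimal sketch of the per-pixel event generation principle: an event is fired whenever the log intensity at a pixel changes by more than a contrast threshold, and the threshold can be perturbed by additive Gaussian noise. This is an illustrative simplification under our own assumptions, not ESIM's actual implementation; the names `EventSimple` and `generateEventsForPixel` are ours.

```cpp
#include <algorithm>
#include <random>
#include <vector>

// Minimal illustrative event type (not ESIM's internal type).
struct EventSimple {
  int x, y;
  double t;       // timestamp in seconds
  bool polarity;  // true = positive contrast change
};

// Generate events for a single pixel between two rendered frames, assuming the
// log intensity changes linearly in between. C_pos / C_neg are the nominal
// positive/negative contrast thresholds; sigma_C models additive Gaussian
// noise on the threshold, as in the feature list above. ref_logI is the log
// intensity at the last event fired by this pixel and is updated in place.
std::vector<EventSimple> generateEventsForPixel(
    int x, int y, double logI_prev, double logI_curr, double t_prev,
    double t_curr, double C_pos, double C_neg, double sigma_C,
    double& ref_logI, std::mt19937& rng) {
  std::vector<EventSimple> events;
  std::normal_distribution<double> noise(0.0, sigma_C);

  const double dlogI = logI_curr - logI_prev;
  const double dt = t_curr - t_prev;
  if (dt <= 0.0 || dlogI == 0.0) return events;

  // Keep firing events while the (noisy) threshold is crossed.
  const bool pos = dlogI > 0.0;
  while (true) {
    const double C = std::max(1e-6, (pos ? C_pos : C_neg) + noise(rng));
    const double target = ref_logI + (pos ? C : -C);
    // Has the interpolated log intensity reached the next threshold level?
    if ((pos && logI_curr < target) || (!pos && logI_curr > target)) break;
    // Interpolate the crossing time between the two frames.
    double alpha = (target - logI_prev) / dlogI;
    if (alpha < 0.0) alpha = 0.0;
    if (alpha > 1.0) alpha = 1.0;
    events.push_back({x, y, t_prev + alpha * dt, pos});
    ref_logI = target;  // new reference level for this pixel
  }
  return events;
}
```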
Installation instructions can be found in our wiki. Instructions for running the simulator with each of the supported rendering engines are also available there.
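Since ESIM can save its output to a rosbag (see the feature list above), here is a minimal sketch of how one might read the simulated events back using the standard `rosbag` C++ API. The topic name `/cam0/events` and the use of `dvs_msgs/EventArray` messages are assumptions based on common event-camera ROS conventions; check `rosbag info` and adjust them to match your configuration.

```cpp
#include <cstdio>
#include <rosbag/bag.h>
#include <rosbag/view.h>
#include <dvs_msgs/EventArray.h>

int main(int argc, char** argv) {
  // Path to a bag produced by the simulator (pass your own file).
  rosbag::Bag bag;
  bag.open(argc > 1 ? argv[1] : "esim_out.bag", rosbag::bagmode::Read);

  // "/cam0/events" is an assumed topic name; verify it with `rosbag info`.
  rosbag::View view(bag, rosbag::TopicQuery("/cam0/events"));

  size_t num_events = 0;
  for (const rosbag::MessageInstance& m : view) {
    dvs_msgs::EventArray::ConstPtr msg = m.instantiate<dvs_msgs::EventArray>();
    if (!msg) continue;
    for (const dvs_msgs::Event& e : msg->events) {
      // Each event carries pixel coordinates (e.x, e.y), a timestamp (e.ts)
      // and a polarity (e.polarity).
      ++num_events;
      (void)e;
    }
  }
  std::printf("Read %zu events\n", num_events);

  bag.close();
  return 0;
}
```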
We thank Raffael Theiler and Dario Brescianini for their contributions to ESIM. This research was supported by the Swiss National Center of Competence in Research (NCCR) Robotics, Qualcomm (through the Qualcomm Innovation Fellowship Award 2018), the SNSF-ERC Starting Grant, and DARPA FLA.
A significant part of ESIM uses components (spline trajectories, inertial measurement unit simulation, various utility functions) from the ze_oss project. ESIM depends on UnrealCV for the photorealistic rendering engine. We also reused some code samples from the excellent Learn OpenGL tutorial in our OpenGL rendering engine. Finally, ESIM depends on the Open Asset Import Library (assimp) to load 3D models and Blender scenes within the OpenGL rendering engine.