HARPLab/DriverSituationalAwareness

Modeling Drivers’ Situational Awareness from Eye Gaze for Driving Assistance

News:
09.05.2024: We released the code to reproduce the results in the paper.

This repository provides code for the following paper:

Modeling Drivers’ Situational Awareness from Eye Gaze for Driving Assistance, CoRL 2024 by Abhijat Biswas, Pranay Gupta, Shreeya Khurana, David Held, Henny Admoni

Content

Download the Dataset

If you want to use our data, follow the steps in this section. If you want to generate your own data, skip to DReyeVR Setup.

The raw data (images, gaze, button presses, etc.) can be downloaded from Box here. The data is split into part zip files, which must be concatenated after download and then extracted:

cat raw_data_test_part_files/part*.zip > raw_data_test.zip
unzip raw_data_test.zip
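If you prefer, the concatenation step can also be done programmatically. A minimal sketch of the same pattern (the directory and file contents below are dummy stand-ins, not the actual dataset files; note that the parts must be joined in sorted name order):

```python
import glob
import tempfile
from pathlib import Path

def reassemble_parts(part_dir: str, pattern: str, out_path: str) -> int:
    """Concatenate split archive parts (in sorted name order) into one file.

    Equivalent to: cat part_dir/pattern > out_path
    Returns the number of bytes written.
    """
    written = 0
    with open(out_path, "wb") as out:
        for part in sorted(glob.glob(str(Path(part_dir) / pattern))):
            written += out.write(Path(part).read_bytes())
    return written

# Demo with dummy part files standing in for raw_data_test_part_files/part*.zip.
tmp = tempfile.mkdtemp()
for i, chunk in enumerate([b"AAA", b"BBB", b"CC"]):
    Path(tmp, f"part{i}.zip").write_bytes(chunk)

n = reassemble_parts(tmp, "part*.zip", str(Path(tmp, "raw_data_test.zip")))
print(n)  # prints 8 (3 + 3 + 2 bytes)
```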

Using the data

You can see examples of how the various data modalities are used via the dataloader: the SituationalAwarenessDataset class in data/dataset_full.py.
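As a hedged illustration of the map-style dataset interface such a class exposes (the class name, fields, and sample keys below are invented stand-ins for this sketch, not the repo's actual API):

```python
# Minimal stand-in illustrating the map-style dataset interface that a
# dataloader class like SituationalAwarenessDataset follows. The sample
# keys ("image", "gaze", "awareness_label") are illustrative assumptions.
class ToyAwarenessDataset:
    def __init__(self, num_frames: int):
        self.num_frames = num_frames

    def __len__(self) -> int:
        return self.num_frames

    def __getitem__(self, idx: int) -> dict:
        if not 0 <= idx < self.num_frames:
            raise IndexError(idx)
        return {
            "image": f"frame_{idx:06d}.png",  # path to an RGB frame
            "gaze": (0.5, 0.5),               # normalized gaze coordinates
            "awareness_label": idx % 2,       # per-frame awareness label
        }

ds = ToyAwarenessDataset(num_frames=4)
sample = ds[0]
```

With the actual SituationalAwarenessDataset, the same indexing pattern applies, and the dataset can be wrapped in a standard PyTorch DataLoader for batching.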

Dataset Generation (Coming Soon)

To generate your own data, you must first set up the DReyeVR simulator.

DReyeVR Setup

Generating your own data involves using the DReyeVR simulator. To learn more about DReyeVR, visit the DReyeVR repo. DReyeVR installation and setup instructions can be found here.

In order to parse the data, you will need to set up DReyeVR in a conda environment; instructions for this are here. Scripts related to data parsing can be found in the DReyeVR-parser repo. In particular, post_process_participant.sh (found here) extracts the awareness data from the recording files, performs a replay to collect the sensor data, calculates any offset (read more here), and produces frames with gaze and button-press overlays.
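To illustrate what a gaze overlay amounts to conceptually (this is not the repo's implementation; the function and argument names are invented for the sketch, and the real pipeline draws on actual image frames):

```python
def overlay_gaze(frame, gaze_xy):
    """Mark a normalized gaze point (x, y in [0, 1]) on a frame.

    `frame` is a height x width grid (list of lists); the cell under the
    gaze point is set to 1. Purely illustrative of the overlay idea.
    """
    h, w = len(frame), len(frame[0])
    x, y = gaze_xy
    col = min(int(x * w), w - 1)
    row = min(int(y * h), h - 1)
    frame[row][col] = 1
    return frame

frame = [[0] * 4 for _ in range(4)]
overlay_gaze(frame, (0.5, 0.5))  # marks the cell at row 2, column 2
```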

Evaluation

Setup

First, set up the conda environment:

git clone https://github.com/HARPLab/DriverSA.git
cd DriverSA
conda env create -f environment.yml
conda activate sit_aw_env

Model weights and baseline evaluation code coming soon.

Citation

@inproceedings{biswasmodeling,
  title = {Modeling Drivers’ Situational Awareness from Eye Gaze for Driving Assistance},
  author = {Biswas, Abhijat and Gupta, Pranay and Khurana, Shreeya and Held, David and Admoni, Henny},
  booktitle = {8th Annual Conference on Robot Learning},
  year = {2024}
}
