
# NuScenes Dataset

We support testing on the daytime sequences of the nuScenes dataset.

*(demo image: nusc_demo)*

## Jsonify the NuScenes Dataset

Check `meta_data/nusc_trainsub/json_from_cfg.ipynb` and modify the data path.

Run through the notebook to export the nuScenes data as JSON. This speeds up start-up and lowers the dataset's memory usage.
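Conceptually, the notebook flattens the nuScenes token tables into one JSON index that the dataloader can read in a single pass. A minimal sketch of that idea; the record fields (`token`, `cam_front`) and file name below are illustrative, not the notebook's actual schema:

```python
import json
import os
import tempfile

def jsonify_samples(samples, out_path):
    """Dump a list of per-frame records to one flat JSON index file.

    Loading one flat JSON at start-up is much faster than walking the
    full nuScenes token tables, and keeps only the fields the
    dataloader needs in memory.
    """
    with open(out_path, "w") as f:
        json.dump(samples, f)
    return out_path

# Hypothetical per-frame records (the real ones come from the nuScenes devkit):
samples = [
    {"token": "sample_0", "cam_front": "samples/CAM_FRONT/img0.jpg"},
    {"token": "sample_1", "cam_front": "samples/CAM_FRONT/img1.jpg"},
]
out = os.path.join(tempfile.gettempdir(), "nusc_trainsub.json")
jsonify_samples(samples, out)

with open(out) as f:
    index = json.load(f)
print(len(index))  # 2
```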

## Dataset Validation / Visualization in ROS (Optional)

Check the `nuscenes_visualize` repo.

## Training Schedule

Baseline:

```bash
## copy the example config
cd config
cp nuscenes_wpose_example nuscenes_wpose.py

## modify the config paths
nano nuscenes_wpose.py
cd ..

## train
./launcher/train.sh config/nuscenes_wpose.py 0 $experiment_name

## evaluate
python3 scripts/test.py config/nuscenes_wpose.py 0 $CHECKPOINT_PATH
```
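The config files here are plain Python modules, so "modify the config path" means editing a few assignments before training. A minimal sketch of the kind of edits involved; every field name below is an illustrative assumption, not the repo's actual schema:

```python
# Sketch of the sort of path edits made in config/nuscenes_wpose.py.
# Field names (data_path, json_path, gpu, max_epochs) are hypothetical.
from types import SimpleNamespace

cfg = SimpleNamespace(
    data=SimpleNamespace(
        data_path="/data/nuscenes",           # root of the raw nuScenes dataset
        json_path="meta_data/nusc_trainsub",  # output of the jsonify notebook
    ),
    trainer=SimpleNamespace(
        gpu=0,            # matches the GPU index passed to train.sh
        max_epochs=30,
    ),
)
print(cfg.data.json_path)
```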

The baseline model alone is sufficient for most projects. After training the baseline, you can further re-train with self-distillation:

```bash
## export the teacher checkpoint
python3 monodepth/transform_teacher.py $Pretrained_checkpoint $output_compressed_checkpoint

## copy the example config
cd config
cp distill_nuscenes_example distill_nuscenes.py

## modify the config path and set the checkpoint path to $output_compressed_checkpoint
nano distill_nuscenes.py
cd ..

## train
./launcher/train.sh config/distill_nuscenes.py 0 $experiment_name
```
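Conceptually, exporting a compressed teacher checkpoint amounts to filtering the training checkpoint down to the weights the distillation step needs. A sketch of that idea with plain dicts; the key prefix and layout are assumptions, and the real `transform_teacher.py` may differ:

```python
def compress_teacher(state_dict, prefix="teacher."):
    """Keep only teacher weights and strip their prefix.

    Conceptual sketch of a checkpoint-export step: drop training-only
    state (optimizer, etc.) and rename the remaining keys so the
    student config can load them directly. Not the repo's actual code.
    """
    return {
        key[len(prefix):]: value
        for key, value in state_dict.items()
        if key.startswith(prefix)
    }

# Hypothetical checkpoint contents:
ckpt = {
    "teacher.backbone.w": [1.0],
    "teacher.head.w": [2.0],
    "optimizer.momentum": [0.9],  # training-only state, dropped on export
}
compressed = compress_teacher(ckpt)
print(sorted(compressed))  # ['backbone.w', 'head.w']
```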

## Visualize with Jupyter Notebook

Check `demos/demo.ipynb` for dataset visualization and simple demos.

## ONNX Export

We support exporting a pretrained model to ONNX; install `onnx` and `onnxruntime` first.

```bash
python3 scripts/onnx_export.py $CONFIG_FILE $CHECKPOINT_PATH $ONNX_PATH
```
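After export, you can sanity-check the ONNX model by running it with `onnxruntime` on a dummy image. A sketch assuming a single NCHW image input; the input resolution, normalization constants, and file name are assumptions, not the exporter's actual settings:

```python
import os
import numpy as np

def preprocess(image, mean=0.45, std=0.225):
    """HWC uint8 image -> normalized NCHW float32 batch (illustrative)."""
    x = image.astype(np.float32) / 255.0
    x = (x - mean) / std
    return np.transpose(x, (2, 0, 1))[None]  # (1, C, H, W)

onnx_path = "model.onnx"  # placeholder for $ONNX_PATH
dummy = np.zeros((384, 640, 3), dtype=np.uint8)
batch = preprocess(dummy)
print(batch.shape)  # (1, 3, 384, 640)

# Only attempt inference if an exported model is actually present:
if os.path.exists(onnx_path):
    import onnxruntime as ort  # pip install onnxruntime
    sess = ort.InferenceSession(onnx_path, providers=["CPUExecutionProvider"])
    input_name = sess.get_inputs()[0].name
    outputs = sess.run(None, {input_name: batch})
    print([o.shape for o in outputs])
```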

## Online ROS Full Demo

1. Launch `nuscenes_visualize` to stream the image topics and the RViz visualization.
2. Launch `monodepth_ros` to run inference on the camera topics.

For nuScenes, we provide an additional node that runs inference on all six camera images in batches. Make sure your machine is powerful enough to infer six images online.
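Batching the six surround-view cameras into one forward pass is what makes this node heavier than the single-camera one. A sketch of the stacking step; the camera ordering and image resolution below are assumptions:

```python
import numpy as np

# The six surround-view cameras on nuScenes (ordering is illustrative):
CAMERAS = ["CAM_FRONT", "CAM_FRONT_LEFT", "CAM_FRONT_RIGHT",
           "CAM_BACK", "CAM_BACK_LEFT", "CAM_BACK_RIGHT"]

def stack_cameras(images):
    """Stack six HWC images into one (6, C, H, W) float32 batch so the
    model runs a single forward pass instead of six separate inferences."""
    assert len(images) == len(CAMERAS)
    batch = np.stack([np.transpose(img, (2, 0, 1)) for img in images])
    return batch.astype(np.float32)

frames = [np.zeros((384, 640, 3), dtype=np.uint8) for _ in CAMERAS]
batch = stack_cameras(frames)
print(batch.shape)  # (6, 3, 384, 640)
```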