SHIFT is a synthetic driving dataset for continuous multi-task domain adaptation. It is maintained by the VIS Group at ETH Zurich.
The main branch works with PyTorch 1.6+.
Please refer to get_started.md for install instructions.
Please refer to dataset_prepare.md for instructions on how to download and prepare the SHIFT dataset.
Please refer to train_test.md for instructions on how to train and test your own model.
Please refer to challenge.md for instructions on how to participate in the challenge, including training, testing, and adaptation.
The challenge is organized for the Workshop on Visual Continual Learning at ICCV 2023. Check out wvcl.vis.xyz/challenges for additional details on this and other challenges.
We will award the top three teams of each challenge with a certificate and a prize of 1000, 500, and 300 USD, respectively. The winners of each challenge will be invited to give a presentation at the workshop. Teams will be selected based on the performance of their methods on the test set.
We will also award one team from each challenge with an innovation award. The innovation award is given to the team that proposes the most innovative method and/or insightful analysis. The winner will receive a certificate and an additional prize of 300 USD.
Please note that this challenge is part of Track B - Continual Test-time Adaptation, together with the challenge on "Continuous Test-time Adaptation for Semantic Segmentation". Since the challenge on "Continuous Test-time Adaptation for Object Detection" constitutes half of Track B, the prizes are half of the amounts mentioned above.
Results and models are available in the model zoo.
Supported Adaptation Methods
Supported Datasets
If you find this project useful in your research, please consider citing:
- SHIFT, the dataset powering this challenge and the continuous adaptation tasks:
@inproceedings{sun2022shift,
title={SHIFT: a synthetic driving dataset for continuous multi-task domain adaptation},
author={Sun, Tao and Segu, Mattia and Postels, Janis and Wang, Yuxuan and Van Gool, Luc and Schiele, Bernt and Tombari, Federico and Yu, Fisher},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
pages={21371--21382},
year={2022}
}
- DARTH, the test-time adaptation method that introduces a detection consistency loss for mean-teacher-based detection adaptation:
@inproceedings{segu2023darth,
title={Darth: holistic test-time adaptation for multiple object tracking},
author={Segu, Mattia and Schiele, Bernt and Yu, Fisher},
booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
pages={9717--9727},
year={2023}
}
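As a rough illustration of the mean-teacher scheme underlying DARTH, the teacher's weights track an exponential moving average (EMA) of the student's weights while the student adapts at test time. The sketch below is a minimal, hypothetical illustration (the function and parameter names are not the repository's API):

```python
def ema_update(teacher_params, student_params, momentum=0.999):
    """Update the teacher's weights as an exponential moving average (EMA)
    of the student's weights, as used in mean-teacher adaptation schemes.

    A high momentum keeps the teacher stable while the student adapts,
    which is what makes the teacher a useful consistency target."""
    return [momentum * t + (1.0 - momentum) * s
            for t, s in zip(teacher_params, student_params)]

# Toy example: with a zero-weight student, each teacher weight
# decays by the momentum factor at every update.
teacher = [1.0, 2.0]
student = [0.0, 0.0]
for _ in range(3):
    teacher = ema_update(teacher, student, momentum=0.9)
print(teacher)  # each weight scaled by roughly 0.9**3
```

In practice the student is updated by the adaptation loss (e.g. a consistency loss between student and teacher detections), and the EMA update is applied after each optimization step.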
This project is released under the MIT License.