Official code for Continual Test-Time Domain Adaptation, published in CVPR 2022.
This repository also includes other continual test-time adaptation methods for classification and segmentation. We provide benchmarking and comparison for the following methods:
- CoTTA
- AdaBN / BN Adapt
- TENT
on the following tasks:
- CIFAR10/100 -> CIFAR10C/100C (standard/gradual)
- ImageNet -> ImageNetC
- Cityscapes -> ACDC (segmentation)
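For orientation, the snippet below is a minimal sketch of the kind of update CoTTA performs at test time (weight-averaged teacher, augmentation-averaged pseudo-labels, and stochastic restore of source weights). It is written for this README as an illustration, not taken from the repository: all names are illustrative and several details (e.g., the confidence-conditioned augmentation) are omitted. Please refer to the code under `cifar/`, `imagenet/`, and the segmentation package for the actual implementations.

```python
# Illustrative sketch of a CoTTA-style test-time update (NOT the repository's API).
import copy
import torch
import torch.nn as nn

@torch.no_grad()
def ema_update(teacher: nn.Module, student: nn.Module, alpha: float = 0.999) -> None:
    # The teacher is an exponential moving average of the student weights.
    for t, s in zip(teacher.parameters(), student.parameters()):
        t.mul_(alpha).add_(s, alpha=1.0 - alpha)

def cotta_step(student, teacher, source_state, x, optimizer, augment,
               n_aug=8, restore_prob=0.01):
    # 1) Augmentation-averaged soft pseudo-label from the weight-averaged teacher.
    with torch.no_grad():
        probs = torch.stack(
            [teacher(augment(x)).softmax(dim=1) for _ in range(n_aug)]
        ).mean(dim=0)
    # 2) Update the student with a cross-entropy loss against the soft pseudo-label.
    loss = -(probs * student(x).log_softmax(dim=1)).sum(dim=1).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # 3) The teacher slowly follows the student.
    ema_update(teacher, student)
    # 4) Stochastic restore: reset a small random fraction of student weights to the
    #    source model to limit forgetting and error accumulation.
    with torch.no_grad():
        for name, p in student.named_parameters():
            mask = (torch.rand_like(p) < restore_prob).float()
            p.copy_(mask * source_state[name].to(p.device) + (1.0 - mask) * p)
    return probs.argmax(dim=1)  # prediction for the current test batch

if __name__ == "__main__":
    # Tiny illustration on random data; in the repository this role is played by the
    # pretrained classifiers and the CIFAR10C / ImageNetC test streams.
    student = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
    teacher = copy.deepcopy(student)
    source_state = {k: v.clone() for k, v in student.state_dict().items()}
    optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
    def augment(x):  # stand-in for the real augmentations
        return x + 0.01 * torch.randn_like(x)
    for _ in range(3):  # a short simulated test stream
        preds = cotta_step(student, teacher, source_state,
                           torch.randn(8, 3, 32, 32), optimizer, augment)
```

For comparison, TENT adapts only the batch-norm affine parameters by entropy minimization, and AdaBN / BN Adapt only re-estimates the batch-norm statistics on the test data.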
To reproduce our results, please create, activate, and use the following conda environment:

```bash
# It may take several minutes for conda to solve the environment
conda update conda
conda env create -f environment.yml
conda activate cotta
```
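Before launching the experiments, you may optionally confirm that the environment provides a CUDA-enabled PyTorch. The quick check below is a suggestion added for convenience; it assumes `environment.yml` installs PyTorch, which the experiments require.

```python
# Optional sanity check: run inside the activated "cotta" environment.
import torch

print("torch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```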
```bash
cd cifar

# CIFAR10 -> CIFAR10C, standard task (tested on RTX2080TI)
# This includes the comparison of all three methods as well as the baseline
bash run_cifar10.sh

# CIFAR10 -> CIFAR10C, gradual task (tested on RTX2080TI)
bash run_cifar10_gradual.sh

# CIFAR100 -> CIFAR100C (tested on RTX3090)
bash run_cifar100.sh
```

```bash
# ImageNet -> ImageNetC (tested on RTX3090)
cd imagenet
bash run.sh
```
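The run scripts above implement a continual evaluation: the model adapts online and is never reset between corruption types. The sketch below outlines that protocol for CIFAR10-to-CIFAR10C; it is an illustration written for this README, not the repository's API, and the two hooks (`adapt_and_predict`, `load_corruption`) as well as the exact corruption order and severity are assumptions to be checked against the scripts and configs.

```python
# Illustrative outline of the continual CIFAR10 -> CIFAR10C protocol (NOT the repo's API).
# The 15 CIFAR10-C corruption types; the order and severity used by the scripts are set in their configs.
CORRUPTIONS = [
    "gaussian_noise", "shot_noise", "impulse_noise", "defocus_blur", "glass_blur",
    "motion_blur", "zoom_blur", "snow", "frost", "fog", "brightness", "contrast",
    "elastic_transform", "pixelate", "jpeg_compression",
]

def continual_eval(adapt_and_predict, load_corruption):
    # adapt_and_predict(x): adapts the model on batch x and returns predictions (hypothetical hook).
    # load_corruption(name): yields (x, y) test batches for one corruption type (hypothetical hook).
    errors = []
    for name in CORRUPTIONS:  # the model is NOT reset between corruptions
        correct, total = 0, 0
        for x, y in load_corruption(name):
            pred = adapt_and_predict(x)
            correct += (pred == y).sum().item()
            total += y.numel()
        errors.append(1.0 - correct / total)
        print(f"{name}: error {errors[-1]:.3f}")
    print(f"mean error over the sequence: {sum(errors) / len(errors):.3f}")
```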
Since April 2022, we also offer the segmentation code, which is based on Segformer. You can download it here.

```bash
## Environment setup: a new conda environment is needed for Segformer
## See https://github.com/qinenergy/cotta/issues/13 if you have problems installing mmcv
conda env create -f environment_segformer.yml
conda activate segformer
pip install -e . --user
```
```bash
## Run
bash run_base.sh
bash run_tent.sh
bash run_cotta.sh
```

Example logs are included in `./example_logs/` (base.log, tent.log, and cotta.log).
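Analogously to the classification benchmarks, these scripts evaluate continual adaptation from Cityscapes to ACDC: the model keeps adapting as the test condition changes, and the condition sequence can be revisited over multiple rounds. The sketch below is only an outline written for this README; the condition order, the number of rounds, and all three hooks are assumptions, and the actual protocol is defined by the run scripts and configs.

```python
# Illustrative outline of the continual Cityscapes -> ACDC protocol (NOT the repository's API).
ACDC_CONDITIONS = ["fog", "night", "rain", "snow"]  # assumed order; see the run scripts/configs

def continual_acdc(adapt_and_predict, load_condition, compute_miou, n_rounds=3):
    # adapt_and_predict(image): adapts the segmentation model online and returns its prediction (hypothetical hook).
    # load_condition(name): yields (image, label) pairs for one ACDC condition (hypothetical hook).
    # compute_miou(preds, labels): returns mean IoU over a list of predictions (hypothetical hook).
    for round_idx in range(n_rounds):  # conditions are revisited; the model is never reset
        for name in ACDC_CONDITIONS:
            preds, labels = [], []
            for image, label in load_condition(name):
                preds.append(adapt_and_predict(image))
                labels.append(label)
            print(f"round {round_idx}, {name}: mIoU {compute_miou(preds, labels):.3f}")
```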
## License for Cityscapes-to-ACDC code
Non-commercial. The code is heavily based on Segformer; please also check Segformer's LICENSE.
Please cite our work if you find it useful.
```bibtex
@inproceedings{wang2022continual,
  title={Continual Test-Time Domain Adaptation},
  author={Wang, Qin and Fink, Olga and Van Gool, Luc and Dai, Dengxin},
  booktitle={Proceedings of Conference on Computer Vision and Pattern Recognition},
  year={2022}
}
```
## Acknowledgements
- TENT: the official code is heavily used.
- KATANA: the official code is used for augmentation.
- Robustbench: the official code is used.
- ImageNet-C: the official release is used for evaluation.
For questions regarding the code, please contact [email protected].