Yibo Yang*, Haobo Yuan*, Xiangtai Li, Jianlong Wu, Lefei Zhang, Zhouchen Lin, Philip H.S. Torr, Dacheng Tao, Bernard Ghanem.
[Optional] To start, you can build the docker environment yourself (in the docker_env directory):
docker build -t ftc --network=host .
Note that we have published a pre-built image, so there is no need to run the above command if your network connection is good.
Then, you can start a new container to run our code:
DATALOC={YOUR DATA LOCATION} LOGLOC={YOUR LOG LOCATION} bash tools/docker.sh
You do not need to prepare the CIFAR datasets.
For the ImageNet dataset, please prepare and organize it as follows:
imagenet
├── train
│   ├── n01440764
│   │   ├── n01440764_18.JPEG
│   │   ├── ...
├── val
│   ├── n01440764
│   │   ├── ILSVRC2012_val_00000293.JPEG
│   │   ├── ...
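If you want to sanity-check the layout before training, a minimal sketch along these lines can help (the root path below is a placeholder for your own data location, not part of our scripts):

```python
import os

# Placeholder ImageNet root; point this at the directory you mount via DATALOC.
root = "/path/to/imagenet"

for split in ("train", "val"):
    split_dir = os.path.join(root, split)
    # Each class should be a sub-directory named by its WordNet ID (e.g. n01440764).
    classes = [d for d in os.listdir(split_dir)
               if os.path.isdir(os.path.join(split_dir, d))]
    n_images = sum(len(files) for _, _, files in os.walk(split_dir))
    print(f"{split}: {len(classes)} classes, {n_images} images")
```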
Please create a docker container and enter it (DATALOC and LOGLOC have default values, but they may not match your environment):
DATALOC=/path/to/data LOGLOC=/path/to/logger bash tools/docker.sh
Let's run the code 🏃‍♀️.
25 steps
bash tools/dist_train.sh configs/cifar/resnet12_cifar_dist_25.py 8 --seed 0 --deterministic --work-dir /opt/logger/cifar100_25t
10 steps (shuffled)
bash tools/dist_train.sh configs/cifar_lt/resnet_cifar_shuffle_10.py 8 --seed 0 --deterministic --work-dir /opt/logger/cifar100_lt_10t_shuffle
25 steps
bash tools/dist_train.sh configs/imagenet/resnet18_imagenet100_25t.py 8 --seed 0 --deterministic --work-dir /opt/logger/i100_25t
10 steps (shuffled)
bash tools/dist_train.sh configs/imagenet_lt/resnet18_imagenet100_shuffle_10t.py 8 --seed 0 --deterministic --work-dir /opt/logger/i100_lt_10t_shuffle
Please refer to our other repository.
To run UniCIL, you need to train the base session first, and then run the incremental sessions from the base-session checkpoint.
Base Session:
bash tools_general/dist.sh train_base configs_general/cifar_general/resnet18_cifar_10.py 8 --seed 0 --deterministic --work-dir /opt/logger/general_cifar_10
Incremental Sessions:
bash tools_general/dist.sh train_inc configs_general/cifar_general/resnet18_cifar_10.py 8 --seed 0 --deterministic --work-dir /opt/logger/general_cifar_10
You can calculate the average of the "[ACC_MEAN]" values across sessions to obtain the average incremental accuracy. Note that "[ACC_MEAN]" is the accuracy after a specific session, not the average incremental accuracy reported in the tables of our paper.
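As a minimal sketch of this averaging (the per-session values below are placeholders, not results from the paper):

```python
# Per-session [ACC_MEAN] values copied from the training logs (placeholder numbers).
acc_mean_per_session = [72.4, 68.1, 65.3, 63.0, 61.2]

# The average incremental accuracy is the mean of the per-session accuracies,
# not the accuracy after the final session.
avg_incremental_acc = sum(acc_mean_per_session) / len(acc_mean_per_session)
print(f"Average incremental accuracy: {avg_incremental_acc:.2f}")
```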
If you find this work helpful in your research, please consider citing:
@article{UniCIL,
author={Yibo Yang and Haobo Yuan and Xiangtai Li and Jianlong Wu and Lefei Zhang and Zhouchen Lin and Philip H.S. Torr and Bernard Ghanem and Dacheng Tao},
title={Neural Collapse Terminus: A Unified Solution for Class Incremental Learning and Its Variants},
journal={arXiv preprint},
year={2023}
}