Isolation and Impartial Aggregation: A Paradigm of Incremental Learning without Interference
Yabin Wang*, Zhiheng Ma*, Zhiwu Huang, Yaowei Wang, Zhou Su, Xiaopeng Hong. In Proceedings of the AAAI Conference on Artificial Intelligence (AAAI 2023).
[Paper]
In this paper, we propose anchor-based energy self-normalization (ESN) for stage classifiers, which keeps the classifiers of the current and the previous stages on a comparable energy scale.
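As a rough illustration of the quantity involved, the energy of a classifier's logits is commonly defined through a log-sum-exp. This is only a hedged sketch of that standard definition, not the exact normalization used in the paper:

```python
import math

def energy(logits):
    """Energy score of a logit vector: E = -log(sum_j exp(z_j)).

    A numerically stable log-sum-exp. ESN's actual formulation may
    differ; this only illustrates the kind of quantity being aligned
    across stage classifiers.
    """
    m = max(logits)  # subtract the max for numerical stability
    return -(m + math.log(sum(math.exp(z - m) for z in logits)))
```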
Create the virtual environment for ESN:

```shell
conda env create -f environment.yaml
```

After this, you will get a new environment named `esn` for running ESN experiments. Activate it with:

```shell
conda activate esn
```
Note that only NVIDIA GPUs are supported for now; we use an NVIDIA RTX 3090.
Please refer to the following links to download the four standard incremental learning benchmark datasets (CIFAR-100 is downloaded automatically).
[CIFAR-100] Auto Download
CORe50
DomainNet
[5-Datasets]
Unzip the downloaded files, and you will get the following folders.

```
core50
└── core50_128x128
    ├── labels.pkl
    ├── LUP.pkl
    ├── paths.pkl
    ├── s1
    ├── s2
    ├── s3
    ...
```
```
domainnet
├── clipart
│   ├── aircraft_carrier
│   ├── airplane
│   ...
├── clipart_test.txt
├── clipart_train.txt
├── infograph
│   ├── aircraft_carrier
│   ├── airplane
│   ...
├── infograph_test.txt
├── infograph_train.txt
├── painting
│   ├── aircraft_carrier
│   ├── airplane
│   ...
...
```
Please change `data_path` in `utils/data.py` to the locations of the datasets.
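For illustration, a helper like the one below could map each benchmark name to its folder under a common root, following the unzipped layout shown above. All names here (`DATA_ROOT`, `dataset_path`) are assumptions for this sketch, not the repo's actual code in `utils/data.py`:

```python
import os

# Hypothetical sketch: build per-dataset locations from one root
# directory, matching the folder layout produced by unzipping.
DATA_ROOT = "/data/incremental"  # change this to your own location

def dataset_path(name: str) -> str:
    # Map each benchmark to its subfolder; unknown names fall back
    # to a folder with the same name as the dataset.
    folders = {
        "core50": os.path.join("core50", "core50_128x128"),
        "domainnet": "domainnet",
    }
    return os.path.join(DATA_ROOT, folders.get(name, name))
```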
```shell
python main.py --dataset cifar100_vit --max_epochs 30 --init_cls 10 --inc_cls 10 --shuffle
python main.py --dataset domainnet --max_epochs 10 --init_cls 20 --inc_cls 20 --shuffle
python main.py --dataset core50 --max_epochs 10 --init_cls 50 --inc_cls 50 --dil True --max_cls 50
python main.py --dataset 5datasets_vit --max_epochs 10 --init_cls 10 --inc_cls 10
```
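The `--init_cls`/`--inc_cls` flags split each benchmark into incremental stages. The stage count follows from simple arithmetic, sketched here with a helper we define only for illustration (it is not part of the repo):

```python
def num_stages(total_cls: int, init_cls: int, inc_cls: int) -> int:
    # The first stage learns init_cls classes; each subsequent stage
    # adds inc_cls more until total_cls classes are covered.
    return 1 + (total_cls - init_cls) // inc_cls
```

For example, CIFAR-100 with `--init_cls 10 --inc_cls 10` yields 10 stages of 10 classes each.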
Pretrained models are available here.
We assume the downloaded weights are located under the `checkpoints` directory; otherwise, you may need to change the corresponding paths in the scripts.
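A quick sanity check that the weights are where the scripts expect. The directory name `checkpoints` comes from above; the helper itself is just a convenience we sketch here, not part of the repo:

```python
from pathlib import Path

def have_checkpoints(ckpt_dir: str = "checkpoints") -> bool:
    # True when the checkpoints directory exists and is non-empty.
    d = Path(ckpt_dir)
    return d.is_dir() and any(d.iterdir())
```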
Please check the MIT license that is listed in this repository.
We thank the following repositories for providing helpful components/functions used in our work.
If you use any content of this repo for your work, please cite the following bib entry:
@inproceedings{wang2022isolation,
title={Isolation and Impartial Aggregation: A Paradigm of Incremental Learning without Interference},
author={Wang, Yabin and Ma, Zhiheng and Huang, Zhiwu and Wang, Yaowei and Su, Zhou and Hong, Xiaopeng},
booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
year={2023}
}