# Denoising Masked Autoencoders Help Robust Classification (ICLR 2023)

This repository is the official implementation of "Denoising Masked Autoencoders Help Robust Classification", built on the official PyTorch implementation of MAE. If you find this work useful, please cite:

```bibtex
@inproceedings{wu2023dmae,
  title={Denoising Masked Autoencoders Help Robust Classification},
  author={Wu, QuanLin and Ye, Hang and Gu, Yuntian and Zhang, Huishuai and Wang, Liwei and He, Di},
  booktitle={The Eleventh International Conference on Learning Representations},
  year={2023}
}
```

## Pre-training

Pre-training instructions are in PRETRAIN.md.

The following table lists the pre-trained checkpoints used in the paper:

| Model | Size | Pre-training epochs | Link |
|-------|------|---------------------|------|
| DMAE-Base | 427 MB | 1100 | download |
| DMAE-Large | 1.23 GB | 1600 | download |

## Fine-tuning

Fine-tuning and evaluation instructions are in FINETUNE.md.

## Results on ImageNet

## Results on CIFAR-10

## License

This project is released under the CC-BY-NC 4.0 license. See LICENSE for details.