CerP

This repository contains the code for our AAAI 2023 paper Poisoning with Cerberus: Stealthy and Colluded Backdoor Attack against Federated Learning.

Installation

Install PyTorch

Reproduce experiments:

  • Use Visdom to monitor the training progress:
python -m visdom.server -p 8098
  • Run experiments for the CIFAR-100 dataset:
python main.py --params utils/X.yaml

where X is one of mkrum, foolsgold, or bulyan.

Parameters can be adjusted in these YAML files to reproduce our experiments.
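To sweep all three defense settings in one go, a minimal shell sketch (assuming main.py and the utils/*.yaml files from this repository are present; remove the echo to actually execute):

```shell
# Dry run: print the command for each aggregation defense config.
for X in mkrum foolsgold bulyan; do
    echo python main.py --params "utils/$X.yaml"
done
```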

Stay tuned for further updates, thanks!

Citation

If you find our work useful in your research, please consider citing:

@inproceedings{DBLP:conf/aaai/LyuHWLWL023,
  author       = {Xiaoting Lyu and
                  Yufei Han and
                  Wei Wang and
                  Jingkai Liu and
                  Bin Wang and
                  Jiqiang Liu and
                  Xiangliang Zhang},
  title        = {Poisoning with Cerberus: Stealthy and Colluded Backdoor Attack against
                  Federated Learning},
  booktitle    = {Thirty-Seventh {AAAI} Conference on Artificial Intelligence, {AAAI}
                  2023, Thirty-Fifth Conference on Innovative Applications of Artificial
                  Intelligence, {IAAI} 2023, Thirteenth Symposium on Educational Advances
                  in Artificial Intelligence, {EAAI} 2023, Washington, DC, USA, February
                  7-14, 2023},
  pages        = {9020--9028},
  publisher    = {{AAAI} Press},
  year         = {2023},
  url          = {https://doi.org/10.1609/aaai.v37i7.26083},
  doi          = {10.1609/AAAI.V37I7.26083},
  timestamp    = {Mon, 04 Sep 2023 16:50:26 +0200},
  biburl       = {https://dblp.org/rec/conf/aaai/LyuHWLWL023.bib},
  bibsource    = {dblp computer science bibliography, https://dblp.org}
}

Acknowledgement
