Towards Generic Image Manipulation Detection with Weakly-Supervised Self-Consistency Learning
Yuanhao Zhai, Tianyu Luan, David Doermann, Junsong Yuan
University at Buffalo
ICCV 2023
Our WSCL (weakly-supervised self-consistency learning) can detect and localize image manipulations, using only image-level binary labels for training.
This repo contains the MIL-FCN version of our WSCL implementation.
03/2024: add demo script! Check here for more details, and check here for the online Gradio demo!
Clone this repo
git clone git@github.com:yhZhai/WSCL.git
Install packages
pip install -r requirements.txt
We provide preprocessed CASIA (v1 and v2), Columbia, and Coverage datasets here. Place them under the `data` folder.
For other datasets, please prepare a JSON datalist file with a structure similar to the existing datalist files in the `data` folder. After that, adjust the `train_datalist` or the `val_datalist` entries in the configuration file `configs/final.yaml`.
Run the following script to train on CASIAv2 and evaluate on CASIAv1, Columbia, and Coverage.
python main.py --load configs/final.yaml
To evaluate a pre-trained checkpoint:
python main.py --load configs/final.yaml --eval --resume checkpoint-path
We provide our pre-trained checkpoint here.
Run our manipulation detection model on your custom data!
Before running, please configure your desired input and output paths in the `demo.py` file.
python demo.py --load configs/final.yaml --resume checkpoint-path
By default, it evaluates all `.jpg` files in the `demo` folder and saves the detection results in `tmp`, with manipulation probabilities appended to the file names.
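If you want to post-process the demo outputs, a small helper like the following could collect files whose appended probability exceeds a threshold. The exact file-name convention (`<name>_<prob>.jpg`) is an assumption; check the names `demo.py` actually writes and adjust the parsing.

```python
from pathlib import Path

# Hypothetical helper: list demo outputs whose appended manipulation
# probability is at least `threshold`. Assumes names like "img_0.92.jpg";
# verify against the files demo.py writes to tmp/.
def flag_suspicious(out_dir: str, threshold: float = 0.5) -> list[str]:
    suspicious = []
    for p in Path(out_dir).glob("*.jpg"):
        try:
            # Take the token after the last underscore as the probability.
            prob = float(p.stem.rsplit("_", 1)[-1])
        except ValueError:
            continue  # file name does not end with a probability
        if prob >= threshold:
            suspicious.append(p.name)
    return suspicious
```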
If you find this project helpful, please consider citing our paper:
@inproceedings{zhai2023towards,
title={Towards Generic Image Manipulation Detection with Weakly-Supervised Self-Consistency Learning},
author={Zhai, Yuanhao and Luan, Tianyu and Doermann, David and Yuan, Junsong},
booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
pages={22390--22400},
year={2023}
}
We would like to thank the following repos for their great work: