Clone the Synchronized-BatchNorm-PyTorch repository:

```bash
cd models/networks/
git clone https://github.com/vacancy/Synchronized-BatchNorm-PyTorch
cp -rf Synchronized-BatchNorm-PyTorch/sync_batchnorm .
cd ../../
```
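After the copy step, the `sync_batchnorm` package should sit alongside the network definitions. A minimal sanity check is sketched below; the helper name and the check itself are my own additions, not part of the repository:

```python
from pathlib import Path


def sync_bn_installed(repo_root: str = ".") -> bool:
    """Check that the copied sync_batchnorm package looks importable.

    Assumes the layout produced by the commands above:
    models/networks/sync_batchnorm/__init__.py
    """
    pkg = Path(repo_root) / "models" / "networks" / "sync_batchnorm"
    return (pkg / "__init__.py").is_file()
```

Running `sync_bn_installed()` from the repository root before training can catch a missed `cp -rf` early.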
Download the VGG model used for computing the loss from here, and move it to models/.
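Moving the downloaded weights into place can be scripted. The sketch below is a hypothetical helper (the download link does not name the weights file, so the filename is left as a parameter):

```python
import shutil
from pathlib import Path


def install_vgg(weights_file: str, repo_root: str = ".") -> Path:
    """Move downloaded VGG weights into models/.

    `weights_file` is whatever file the download link provides;
    the models/ directory is created if it does not exist yet.
    """
    dest_dir = Path(repo_root) / "models"
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / Path(weights_file).name
    shutil.move(weights_file, dest)
    return dest
```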
For the preparation of datasets, please refer to CoCosNet.
The pretrained models can be downloaded from here. Save the pretrained models in checkpoints/.
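Before running inference it is worth confirming the checkpoints were placed correctly. This is a sketch under an assumption: CoCosNet-style code, which this repository builds on, typically looks for a per-experiment subdirectory under checkpoints/, but the exact layout here is not stated:

```python
from pathlib import Path


def checkpoints_ready(name: str, repo_root: str = ".") -> bool:
    """Check that a pretrained model directory exists under checkpoints/
    and contains at least one file.

    The checkpoints/<name>/ layout is an assumption based on
    CoCosNet-style training code, not confirmed by this README.
    """
    ckpt = Path(repo_root) / "checkpoints" / name
    return ckpt.is_dir() and any(ckpt.iterdir())
```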
To test the model, run:

```bash
bash test_ade.sh
```
To train the model, run:

```bash
bash train_ade.sh
```
If you use this code for your research, please cite our paper:
```bibtex
@inproceedings{zhan2021unite,
  title={Unbalanced Feature Transport for Exemplar-based Image Translation},
  author={Zhan, Fangneng and Yu, Yingchen and Cui, Kaiwen and Zhang, Gongjie and Lu, Shijian and Pan, Jianxiong and Zhang, Changgong and Ma, Feiying and Xie, Xuansong and Miao, Chunyan},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={15028--15038},
  year={2021}
}
```
This code borrows heavily from CoCosNet. We also thank SPADE, Synchronized Normalization, and Geometric Loss.