
LFT

PyTorch implementation of "Light Field Image Super-Resolution with Transformers", IEEE SPL 2022. [pdf].

Contributions:

  • We make the first attempt to adapt Transformers to LF image processing, and propose a Transformer-based network for LF image SR.
  • We propose a novel paradigm (i.e., angular and spatial Transformers) to incorporate angular and spatial information in an LF; a minimal sketch of this idea is given after this list.
  • With a small model size and low computational cost, our LFT achieves superior SR performance compared with other state-of-the-art methods.
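
To make the angular/spatial paradigm above more concrete, the following is a minimal, illustrative PyTorch sketch of one angular-then-spatial attention stage. It is not the repository's actual model code: the tensor layout, the class name AngularSpatialBlock, and the default sizes (channels=64, num_heads=4) are assumptions made purely for illustration.

    import torch
    import torch.nn as nn

    class AngularSpatialBlock(nn.Module):
        """One angular-then-spatial attention stage (illustrative sketch only)."""

        def __init__(self, channels=64, num_heads=4):
            super().__init__()
            self.ang_attn = nn.MultiheadAttention(channels, num_heads)
            self.spa_attn = nn.MultiheadAttention(channels, num_heads)
            self.norm_ang = nn.LayerNorm(channels)
            self.norm_spa = nn.LayerNorm(channels)

        def forward(self, x):
            # x: [B, C, N, H, W], where N = A*A sub-aperture views of an A x A light field
            b, c, n, h, w = x.shape
            # Angular Transformer: attend across the N views at every spatial position
            ang = x.permute(2, 0, 3, 4, 1).reshape(n, b * h * w, c)
            q = self.norm_ang(ang)
            ang = ang + self.ang_attn(q, q, q)[0]
            x = ang.reshape(n, b, h, w, c).permute(1, 4, 0, 2, 3)
            # Spatial Transformer: attend across the H*W positions within every view
            spa = x.permute(3, 4, 0, 2, 1).reshape(h * w, b * n, c)
            q = self.norm_spa(spa)
            spa = spa + self.spa_attn(q, q, q)[0]
            return spa.reshape(h, w, b, n, c).permute(2, 4, 3, 0, 1)

    # Example: a 5x5 light field (25 views) with 32x32 feature maps
    # y = AngularSpatialBlock()(torch.randn(1, 64, 25, 32, 32))  # -> [1, 64, 25, 32, 32]

The actual LFT network combines such Transformers with convolutional feature extraction and upsampling modules; please refer to the paper and the model code in this repository for the exact design.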

Code and Models:

Requirements

  • PyTorch 1.3.0 and torchvision 0.4.1. The code has been tested with Python 3.6 and CUDA 9.0 (an example environment setup is sketched after this list).
  • Matlab (For training/test data generation and performance evaluation)
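
For convenience, one possible environment setup is sketched below. The environment name and the use of conda are assumptions, and the PyTorch install command may need to be adapted to your CUDA version:

    $ conda create -n LFT python=3.6
    $ conda activate LFT
    $ pip install torch==1.3.0 torchvision==0.4.1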

Datasets

We used the EPFL, HCInew, HCIold, INRIA and STFgantry datasets for both training and testing. Please first download our dataset via Baidu Drive (key:7nzy) or OneDrive, and place the 5 datasets in the folder ./datasets/.
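
After downloading, the folder is expected to look roughly as follows (the sub-folder names below simply mirror the dataset names listed above; keep whatever names the downloaded archives use):

    ./datasets/
    ├── EPFL/
    ├── HCInew/
    ├── HCIold/
    ├── INRIA/
    └── STFgantry/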

Train

  • Run Generate_Data_for_Training.m to generate training data. The generated data will be saved in ./data_for_train/ (SR_5x5_2x, SR_5x5_4x); a quick way to inspect the generated files is sketched after this list.
  • Run train.py to perform network training. Example for training LFT on 5x5 angular resolution for 4x/2x SR:
    $ python train.py --model_name LFT --angRes 5 --scale_factor 4 --batch_size 4
    $ python train.py --model_name LFT --angRes 5 --scale_factor 2 --batch_size 8
    
  • Checkpoints will be saved to ./log/.
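
To sanity-check the generated training data before launching a run, a small inspection snippet is sketched below. It assumes the Matlab script writes HDF5 (.h5) files under ./data_for_train/; the file name used here is purely hypothetical, and the printed listing will show the actual dataset keys and patch shapes stored in your files.

    import h5py

    # Hypothetical path -- point it at any file produced by Generate_Data_for_Training.m
    path = './data_for_train/SR_5x5_4x/000001.h5'
    with h5py.File(path, 'r') as f:
        # List every dataset stored in the file together with its shape and dtype
        for key in f.keys():
            print(key, f[key].shape, f[key].dtype)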

Test

  • Run Generate_Data_for_Test.m to generate test data. The generated data will be saved in ./data_for_test/ (SR_5x5_2x, SR_5x5_4x).
  • Run test.py to perform network inference. Example for testing LFT on 5x5 angular resolution for 4x/2x SR:
    $ python test.py --model_name LFT --angRes 5 --scale_factor 4 \
    --use_pre_pth True --path_pre_pth './pth/LFT_5x5_4x_epoch_50_model.pth'

    $ python test.py --model_name LFT --angRes 5 --scale_factor 2 \
    --use_pre_pth True --path_pre_pth './pth/LFT_5x5_2x_epoch_50_model.pth'
    
  • The PSNR and SSIM values for each dataset will be saved to ./log/.
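
For reference, a minimal stand-alone sketch of the PSNR computation is given below. This is not the repository's own evaluation code (which also reports SSIM); it is only meant to illustrate the metric:

    import numpy as np

    def psnr(sr, hr, data_range=1.0):
        """Peak signal-to-noise ratio between two images scaled to [0, data_range]."""
        mse = np.mean((sr.astype(np.float64) - hr.astype(np.float64)) ** 2)
        if mse == 0:
            return float('inf')  # identical images
        return 10.0 * np.log10(data_range ** 2 / mse)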

Results:

  • Quantitative Results

  • Efficiency

  • Visual Comparisons

  • Angular Consistency

  • Spatial-Aware Angular Modeling


Citation

If you find this work helpful, please consider citing:

@Article{LFT,
    author    = {Liang, Zhengyu and Wang, Yingqian and Wang, Longguang and Yang, Jungang and Zhou, Shilin},
    title     = {Light Field Image Super-Resolution with Transformers},
    journal   = {IEEE Signal Processing Letters},
    year      = {2022},
}


Contact

Any questions regarding this work can be addressed to [email protected].
