Yoniqueeml/RGT (forked from zhengchen1999/RGT)

PyTorch code for our ICLR 2024 paper "Recursive Generalization Transformer for Image Super-Resolution"


Training (on a VM with 16 vCPUs, 24 GB RAM, Tesla V100)

python -m torch.distributed.launch --nproc_per_node=1 --master_port=4321 basicsr/train.py -opt options/train/train_RGT_x4.yml --launcher pytorch

Use --nproc_per_node=1 for a single NVIDIA A100.

RGT_x4 results:
One epoch trains in roughly 9 minutes.
The big drawback is memory consumption: a x4 upscale of a 600x600 input requires ~31 GB.
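
This memory pressure is what the use_chop option in the test configs (see Testing below) works around. For intuition, here is a minimal tiling sketch, assuming any x4 SR model with a (1, 3, h, w) -> (1, 3, 4h, 4w) interface; the tile and overlap sizes are illustrative, and simply overwriting overlaps is cruder than the repo's chop logic:

# Minimal tiled x4 inference sketch to bound peak GPU memory.
# `model` is any SR network mapping (1, 3, h, w) -> (1, 3, 4h, 4w);
# tile/overlap values are illustrative, not tuned.
import torch

@torch.no_grad()
def tiled_sr(model, lr, scale=4, tile=128, overlap=16):
    _, c, h, w = lr.shape
    out = torch.zeros(1, c, h * scale, w * scale, device=lr.device)
    stride = tile - overlap
    for top in range(0, h, stride):
        for left in range(0, w, stride):
            # Clamp tiles at the borders so every patch is at most tile x tile.
            t = min(top, max(h - tile, 0))
            l = min(left, max(w - tile, 0))
            patch = lr[:, :, t:t + tile, l:l + tile]
            sr = model(patch)
            # Later tiles simply overwrite the overlap region.
            out[:, :, t * scale:(t + tile) * scale,
                      l * scale:(l + tile) * scale] = sr
    return out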

⚙️ Dependencies

  • Python 3.8
  • PyTorch 1.9.0
  • NVIDIA GPU + CUDA
# Clone the repo and enter the default directory 'RGT'.
git clone https://github.com/zhengchen1999/RGT.git
cd RGT
# Create and activate the conda environment.
conda create -n RGT python=3.8
conda activate RGT
# Install dependencies (PyTorch wheels resolved from the official index).
pip install -r requirements.txt -f https://download.pytorch.org/whl/torch_stable.html
pip install six
# Install BasicSR in development mode.
python setup.py develop
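
After installation, a quick sanity check of the environment (a minimal sketch; the expected version follows the dependency list above):

# Sanity-check the installed environment.
import torch

print(torch.__version__)          # expect 1.9.0 per the dependency list
print(torch.cuda.is_available())  # should be True on a CUDA machine
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))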

🖨️ Datasets

The training and testing sets used can be downloaded as follows:

Training Set:
  DIV2K (800 training images, 100 validation images) + Flickr2K (2650 images)
  [complete training dataset DF2K: Google Drive / Baidu Disk]

Testing Set:
  Set5 + Set14 + BSD100 + Urban100 + Manga109
  [complete testing dataset: Google Drive / Baidu Disk]

Visual Results:
  Google Drive / Baidu Disk

Download the training and testing datasets and put them into the corresponding folders of datasets/. See datasets/ for details of the directory structure.
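
A small sketch to confirm the layout before training; the folder names below are assumptions inferred from the dataset names above, so adjust them to whatever the datasets/ description actually specifies:

# Check that the expected dataset folders exist under datasets/
# (folder names are assumptions; align them with the datasets/ docs).
import os

expected = ["DF2K", "Set5", "Set14", "BSD100", "Urban100", "Manga109"]
for name in expected:
    path = os.path.join("datasets", name)
    print(f"{path}: {'ok' if os.path.isdir(path) else 'MISSING'}")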

📦 Models

Method   Params (M)   FLOPs (G)   PSNR (dB)   SSIM     Model Zoo                   Visual Results
RGT-S    10.20        193.08      27.89       0.8347   Google Drive / Baidu Disk   Google Drive / Baidu Disk
RGT      13.37        251.07      27.98       0.8369   Google Drive / Baidu Disk   Google Drive / Baidu Disk

Performance is reported on Urban100 (x4); FLOPs are measured at an output size of 3×512×512.
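
To cross-check a downloaded checkpoint against the Params column, a minimal sketch; the file name is hypothetical, and the 'params' key follows BasicSR's usual checkpoint convention:

# Count parameters in a downloaded checkpoint (file name is hypothetical).
import torch

ckpt = torch.load("experiments/pretrained_models/RGT_x4.pth", map_location="cpu")
state = ckpt.get("params", ckpt)  # BasicSR usually nests weights under 'params'
n_params = sum(v.numel() for v in state.values())
print(f"{n_params / 1e6:.2f} M parameters")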

🔧 Training

  • Download the training (DF2K, already processed) and testing (Set5, Set14, BSD100, Urban100, Manga109, already processed) datasets and place them in datasets/.

  • Run the following scripts. The training configuration is in options/train/.

    # RGT-S, input=64x64, 4 GPUs
    python -m torch.distributed.launch --nproc_per_node=4 --master_port=4321 basicsr/train.py -opt options/train/train_RGT_S_x2.yml --launcher pytorch
    python -m torch.distributed.launch --nproc_per_node=4 --master_port=4321 basicsr/train.py -opt options/train/train_RGT_S_x3.yml --launcher pytorch
    python -m torch.distributed.launch --nproc_per_node=4 --master_port=4321 basicsr/train.py -opt options/train/train_RGT_S_x4.yml --launcher pytorch
    
    # RGT, input=64x64, 4 GPUs
    python -m torch.distributed.launch --nproc_per_node=4 --master_port=4321 basicsr/train.py -opt options/train/train_RGT_x2.yml --launcher pytorch
    python -m torch.distributed.launch --nproc_per_node=4 --master_port=4321 basicsr/train.py -opt options/train/train_RGT_x3.yml --launcher pytorch
    python -m torch.distributed.launch --nproc_per_node=4 --master_port=4321 basicsr/train.py -opt options/train/train_RGT_x4.yml --launcher pytorch
  • Training logs, states, and checkpoints are written to experiments/.
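
To resume an interrupted run, BasicSR-style training reads a state file through the resume_state entry in the training YML. A sketch for inspecting a saved state before resuming; the path is hypothetical and the key names assume BasicSR's usual state format:

# Inspect a saved training state (path is hypothetical; 'epoch'/'iter'
# keys assume BasicSR's state format).
import torch

state = torch.load("experiments/train_RGT_x4/training_states/5000.state",
                   map_location="cpu")
print("epoch:", state.get("epoch"), "iter:", state.get("iter"))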

🔨 Testing

🌗 Test images with HR

  • Download the pre-trained models and place them in experiments/pretrained_models/.

    We provide pre-trained models for image SR: RGT-S and RGT (x2, x3, x4).

  • Download the testing (Set5, Set14, BSD100, Urban100, Manga109) datasets and place them in datasets/.

  • Run the following scripts. The testing configuration is in options/test/ (e.g., test_RGT_x2.yml).

    Note 1: You can set use_chop: True (default: False) in the YML to split the image into chunks during testing and reduce memory use.

    # No self-ensemble
    # RGT-S, reproduces results in Table 2 of the main paper
    python basicsr/test.py -opt options/test/test_RGT_S_x2.yml
    python basicsr/test.py -opt options/test/test_RGT_S_x3.yml
    python basicsr/test.py -opt options/test/test_RGT_S_x4.yml
    
    # RGT, reproduces results in Table 2 of the main paper
    python basicsr/test.py -opt options/test/test_RGT_x2.yml
    python basicsr/test.py -opt options/test/test_RGT_x3.yml
    python basicsr/test.py -opt options/test/test_RGT_x4.yml
  • The output is in results/.
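
For reference, SR benchmarks conventionally report PSNR on the Y channel of YCbCr after cropping a scale-sized border. A minimal re-implementation of that protocol for spot-checking outputs (an illustrative sketch, not the repo's metric code):

# Y-channel PSNR with border cropping, the standard SR evaluation protocol
# (illustrative re-implementation; inputs are float RGB arrays in [0, 255]).
import numpy as np

def rgb_to_y(img):
    # ITU-R BT.601 luma, for RGB scaled to [0, 255].
    return 16.0 + (65.481 * img[..., 0] + 128.553 * img[..., 1]
                   + 24.966 * img[..., 2]) / 255.0

def psnr_y(sr, hr, scale=4):
    sr_y = rgb_to_y(sr)[scale:-scale, scale:-scale]
    hr_y = rgb_to_y(hr)[scale:-scale, scale:-scale]
    mse = np.mean((sr_y - hr_y) ** 2)
    return 10.0 * np.log10(255.0 ** 2 / mse)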

🌓 Test images without HR

  • Download the pre-trained models and place them in experiments/pretrained_models/.

    We provide pre-trained models for image SR: RGT-S and RGT (x2, x3, x4).

  • Put your dataset (single LR images) in datasets/single. Some test images are in this folder.

  • Run the following scripts. The testing configuration is in options/test/ (e.g., test_single_x2.yml).

    Note 1: The default model is RGT. You can use other models like RGT-S by modifying the YML.

    Note 2: You can set use_chop: True (default: False) in the YML to split the image into chunks during testing and reduce memory use.

    # Test on your dataset
    python basicsr/test.py -opt options/test/test_single_x2.yml
    python basicsr/test.py -opt options/test/test_single_x3.yml
    python basicsr/test.py -opt options/test/test_single_x4.yml
  • The output is in results/.
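
If you only have HR images, you can synthesize LR inputs for datasets/single by bicubic downsampling, matching the degradation used for the benchmarks above. A sketch with Pillow; the source folder name is hypothetical:

# Generate x4 LR inputs from HR images via bicubic downsampling.
import os
from PIL import Image

scale = 4
src = "hr_images"  # hypothetical folder of HR images
os.makedirs("datasets/single", exist_ok=True)
for name in os.listdir(src):
    img = Image.open(os.path.join(src, name)).convert("RGB")
    w, h = img.size
    lr = img.resize((w // scale, h // scale), Image.BICUBIC)
    lr.save(os.path.join("datasets/single", name))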

🔎 Results

We achieved state-of-the-art performance. Detailed results can be found in the paper.

Quantitative Comparison

  • results in Table 2 of the main paper

Visual Comparison

  • results in Figure 6 of the main paper

  • results in Figure 4 of the supplementary material

  • results in Figure 5 of the supplementary material

📎 Citation

If you find the code helpful in your research or work, please cite the following paper(s).

@inproceedings{chen2024recursive,
  title={Recursive Generalization Transformer for Image Super-Resolution},
  author={Chen, Zheng and Zhang, Yulun and Gu, Jinjin and Kong, Linghe and Yang, Xiaokang},
  booktitle={ICLR},
  year={2024}
}

💡 Acknowledgements

This code is built on BasicSR.
