
# MFGNet_RGBT_Tracking_PyTorch

The official PyTorch implementation of the MFGNet RGB-T tracker ("Dynamic Modality-Aware Filter Generation for RGB-T Tracking"). [Project] [Paper]

Many RGB-T trackers attempt to attain robust feature representation through an adaptive weighting scheme (or attention mechanism). Different from these works, we propose a new dynamic modality-aware filter generation module (named MFGNet) that boosts message communication between visible and thermal data by adaptively adjusting the convolutional kernels for each input image at tracking time. Our experimental results demonstrate the advantages of the proposed MFGNet for RGB-T tracking.
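For readers unfamiliar with the idea, below is a minimal PyTorch sketch of input-conditioned (dynamic) filter generation. It is not the official MFGNet module: the class `DynamicFilterConv`, the pooled-descriptor conditioning, and all sizes are illustrative assumptions only.

```python
# Minimal sketch of dynamic (input-conditioned) filter generation.
# NOT the official MFGNet module; names and sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicFilterConv(nn.Module):
    """Generates per-sample depthwise conv kernels from a global descriptor of the input."""
    def __init__(self, channels=64, kernel_size=3):
        super().__init__()
        self.channels = channels
        self.kernel_size = kernel_size
        # Map a pooled feature vector to one depthwise kernel per channel.
        self.filter_gen = nn.Linear(channels, channels * kernel_size * kernel_size)

    def forward(self, x):                                     # x: (B, C, H, W)
        b, c = x.shape[0], x.shape[1]
        ctx = F.adaptive_avg_pool2d(x, 1).flatten(1)           # (B, C) global descriptor
        kernels = self.filter_gen(ctx).view(
            b * c, 1, self.kernel_size, self.kernel_size)      # one kernel per sample/channel
        # Grouped conv applies each sample's own kernels (depthwise).
        out = F.conv2d(x.view(1, b * c, *x.shape[2:]), kernels,
                       padding=self.kernel_size // 2, groups=b * c)
        return out.view(b, c, *x.shape[2:])
```

Because the kernels are produced from the input itself, the same module filters an RGB frame and a thermal frame differently, which is the intuition behind modality-aware filter generation.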


## Demo:

(Red: Ours, Blue: Ground Truth, Green: RT-MDNet)

[Demo GIFs: rgbt_car10, rgbt_balancebike, rgbt_flower1, rgbt_kite4]

## Install:

This code was developed with Python 3.7, PyTorch 1.0, CUDA 10.1, and Ubuntu 16.04 on 4× Tesla P100 GPUs. Install any additional packages it warns about.

The RoI align module needs to be compiled first:

```
CUDA_HOME=/usr/local/cuda-10.1 python setup.py build_ext --inplace
```
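As an optional sanity check after building, you can try importing the compiled extension from Python. The module name `roi_align` below is a guess based on typical layouts; use whatever name `setup.py` actually builds in this repo.

```python
# Hypothetical post-build check; adjust the module name to this repo's package.
import torch
print("CUDA available:", torch.cuda.is_available())

import roi_align  # raises ImportError if the build step above failed
print("RoI align extension loaded from:", roi_align.__file__)
```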

## Train and Test:

1. Generate "50.pkl" with `prepro_rgbt.py`; this file is the training data (a quick way to inspect it is sketched after this list).
2. Train the tracker with `train.py`.
3. Train the rgbt_TANet with `train_rgbtTANet.py`.
4. Obtain the attention maps, then run `test.py` for RGB-T tracking.
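The snippet below is a hedged sketch for inspecting the generated "50.pkl" before training. Its exact structure (assumed here to be a dict of per-sequence entries) is defined by `prepro_rgbt.py`, so treat the dict handling as an assumption.

```python
# Quick look at the generated training data; the structure of "50.pkl"
# (assumed: a dict keyed by sequence name) depends on prepro_rgbt.py.
import pickle

with open("50.pkl", "rb") as f:
    data = pickle.load(f)

print(type(data))
if isinstance(data, dict):
    for name in list(data)[:3]:   # peek at the first few sequences
        print(name, type(data[name]))
```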

## Results:

[Result figures]

You can also download our pre-trained models and raw results for comparison: [Pretrained Models] [Raw Results]

## Acknowledgement:

## Citation:

If you use this code for your research, please cite the following paper:

```
@article{wang2020dfgrgbttrack,
  title={Dynamic Modality-Aware Filter Generation for RGB-T Tracking},
  author={Xiao Wang and Xiujun Shu and Shiliang Zhang and Bo Jiang and Yaowei Wang and Yonghong Tian and Feng Wu},
  journal={arXiv preprint},
  year={2020}
}
```

If you have any questions, feel free to contact me via email: [email protected]