DeAda is a new training method for person re-identification (Re-ID) networks, recently published in Neurocomputing as "Decouple Co-adaptation: Classifier Randomization for Person Re-identification". DeAda decouples co-adaptation in Re-ID networks to improve their performance, and it adds no computational cost during either training or testing.
This project implements DeAda on several commonly used baseline networks. Our code is adapted from the open-reid library (https://github.com/Cysu/open-reid).
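The core idea behind DeAda is to randomize (re-initialize) the identity classifier during training, so that the feature extractor cannot co-adapt to one fixed classifier. Below is a minimal PyTorch sketch of that idea only; the toy backbone, the per-epoch randomization schedule, and the initialization scheme are assumptions made for illustration and do not reproduce the exact procedure of the paper or of this repo.

```python
import torch
import torch.nn as nn

class ReIDNet(nn.Module):
    """Toy Re-ID model: a feature extractor followed by an identity classifier."""
    def __init__(self, feat_dim=2048, num_classes=751):
        super().__init__()
        # Stand-in for a ResNet/DenseNet backbone.
        self.backbone = nn.Sequential(nn.Linear(3 * 256 * 128, feat_dim), nn.ReLU())
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        feat = self.backbone(x.flatten(1))
        return self.classifier(feat)

def randomize_classifier(model):
    """Re-initialize the classifier weights (the 'classifier randomization' idea).
    The normal_/zeros_ initialization is an assumption; the paper may use another scheme."""
    nn.init.normal_(model.classifier.weight, std=0.001)
    nn.init.zeros_(model.classifier.bias)

model = ReIDNet()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

for epoch in range(10):
    # Assumed schedule: randomize the classifier at the start of every epoch,
    # so the backbone cannot co-adapt to any single classifier instance.
    randomize_classifier(model)
    for _ in range(5):  # dummy iterations on random data
        images = torch.randn(8, 3, 256, 128)
        labels = torch.randint(0, 751, (8,))
        loss = criterion(model(images), labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```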
- Download the Market-1501 dataset using:
```
wget http://188.138.127.15:81/Datasets/Market-1501-v15.09.15.zip -P <path/to/where/you/want>
unzip <path/to/>/Market-1501-v15.09.15.zip
```
- Download the CUHK03 dataset from here
- Unzip the file and you will get the cuhk03_release dir, which includes cuhk-03.mat
- Download "cuhk03_new_protocol_config_detected.mat" from here and put it next to cuhk-03.mat. We need this new protocol to split the dataset.
NOTICE: You need to change num_classes in the network depending on how many identities are in your training dataset, e.g. 751 for Market-1501.
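For example, assuming a model factory similar to open-reid's models.create (the exact signature may differ in this repo), the number of training identities would be passed as num_classes:

```python
from reid import models

# 751 identities in the Market-1501 training split. For the new CUHK03 protocol
# and DukeMTMC-reID, 767 and 702 are the commonly used values -- verify them
# against your own split files before training.
model = models.create('resnet50', num_classes=751)
```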
The data structure should look like:
```
data/
    bounding_box_train/
    bounding_box_test/
    query/
    train.txt
    val.txt
    query.txt
    gallery.txt
```
Here each *.txt file consists of lines of the format: image file name, person id, camera id. train.txt contains images from bounding_box_train/, val.txt and query.txt contain images from query/, and gallery.txt contains images from bounding_box_test/.
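As a sketch, the split files can be read with a few lines of Python like the following; the file name in the comment is made up, and only the "image file name, person id, camera id" layout comes from the description above.

```python
def read_split(path):
    """Parse a split file whose lines are '<image file name> <person id> <camera id>'."""
    samples = []
    with open(path) as f:
        for line in f:
            fname, pid, cam = line.split()
            samples.append((fname, int(pid), int(cam)))
    return samples

# e.g. read_split('data/train.txt') ->
# [('0002_c1s1_000451_03.jpg', 2, 1), ...]   # file name shown is illustrative
```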
- cudnn 7
- CUDA 9
- PyTorch v0.4.1
- Python 2.7
- torchvision
- scipy
- numpy
- scikit_learn
- ResNet. We choose two configurations: ResNet50 and ResNet152.
- DenseNet. We choose two configurations: DenseNet121 and DenseNet161.
We provide two training methods: plain (traditional SGD optimization) and deada (our proposed DeAda optimization). The training method can be specified via the training_method argument in run.sh.
- Train and evaluate by running:
```
bash run.sh
```
Evaluation metric: mAP (%) and CMC Rank-1 (%).
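For reference, the sketch below shows one generic way to compute mAP and CMC Rank-1 from a query-by-gallery distance matrix; it is not the evaluation code of this repo, and the same-id/same-camera filtering rule is the commonly used Re-ID protocol rather than something specific to DeAda.

```python
import numpy as np

def evaluate(distmat, q_pids, q_cams, g_pids, g_cams):
    """Compute (mAP, Rank-1) from a query x gallery distance matrix.
    Gallery images sharing both the person id and the camera id with the
    query are excluded, following the usual Re-ID evaluation protocol."""
    aps, rank1_hits, valid_q = [], 0.0, 0
    for i in range(distmat.shape[0]):
        order = np.argsort(distmat[i])                 # gallery sorted by distance
        same_pid = g_pids[order] == q_pids[i]
        same_cam = g_cams[order] == q_cams[i]
        keep = ~(same_pid & same_cam)                  # drop same-id/same-camera shots
        matches = same_pid[keep].astype(np.float32)
        if not matches.any():
            continue                                   # no valid gallery match for this query
        valid_q += 1
        rank1_hits += matches[0]
        cum_hits = matches.cumsum()                    # average precision for this query
        precision = cum_hits / (np.arange(len(matches)) + 1)
        aps.append((precision * matches).sum() / matches.sum())
    return np.mean(aps), rank1_hits / valid_q
```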
| Model + Training method | Market-1501 mAP | Market-1501 Rank-1 | CUHK03 mAP | CUHK03 Rank-1 | DukeMTMC-reID mAP | DukeMTMC-reID Rank-1 |
|---|---|---|---|---|---|---|
| ResNet50 + SGD | 70.9 | 86.8 | 41.5 | 41.4 | 62.6 | 79.6 |
| ResNet50 + DeAda | 72.3 | 87.7 | 43.4 | 44.4 | 63.5 | 80.5 |
| DenseNet121 + SGD | 73.2 | 89.1 | 40.5 | 41.1 | 64.7 | 81.0 |
| DenseNet121 + DeAda | 76.9 | 90.5 | 41.8 | 41.9 | 66.9 | 82.0 |
| ResNet152 + SGD | 74.8 | 89.2 | 47.1 | 47.1 | 65.9 | 80.7 |
| ResNet152 + DeAda | 76.0 | 89.4 | 49.4 | 51.7 | 66.2 | 81.6 |
| DenseNet161 + SGD | 75.6 | 90.0 | 45.5 | 45.7 | 66.7 | 81.9 |
| DenseNet161 + DeAda | 78.6 | 92.1 | 46.4 | 46.1 | 68.7 | 84.1 |
Please cite our paper when you use DeAda in your research:
Long Wei, Zhenyong Wei, Zhongming Jin, Qianxiao Wei, Jianqiang Huang, Xian-Sheng Hua, Deng Cai, and Xiaofei He. "Decouple co-adaptation: Classifier randomization for person re-identification." Neurocomputing 383 (2020): 1-9.