PyTorch training code and pretrained models for DSLPT (Dynamic Sparse Local Patch Transformer).
## Installation

Install the system requirements:
`sudo apt-get install python3-dev python3-pip python3-tk libglib2.0-0`

Install the Python dependencies:
`pip3 install -r requirements.txt`
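As a quick, optional check that the dependencies installed correctly, the small snippet below (not part of the repository) prints the PyTorch version and whether CUDA is visible:

```python
# Optional install check (not part of DSLPT): print the PyTorch version and CUDA visibility.
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
```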
## Training

1. Download and process the WFLW dataset
   - Download the WFLW dataset and annotations from Here.
   - Unzip the WFLW dataset and annotations and move the files into the `./Data` directory. Your directory should look like this (a quick layout check is sketched after this list):
     ```
     DSLPT
     └───Data
        │
        └───WFLW
           │
           └───WFLW_annotations
           │   └───list_98pt_rect_attr_train_test
           │   │
           │   └───list_98pt_test
           └───WFLW_images
               └───0--Parade
               │
               └───...
     ```
2. Download the pretrained weights of HRNetW18C
   - Download the pretrained weights of HRNetW18C from Here.
   - Move the file into the `./Config` directory (the sketch after this list also checks that it loads). Your directory should look like this:
     ```
     DSLPT
     └───Config
        │
        └───hrnetv2_w18_imagenet_pretrained.pth
     ```
3. Run `python ./train.py`.
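Before launching training, it can help to confirm that the dataset and the HRNetW18C weights are where the steps above expect them. The snippet below is a minimal sketch (a hypothetical helper, not part of the repository): it only checks the paths shown in the two directory trees above and attempts a `torch.load` of the HRNet checkpoint.

```python
# Minimal pre-training sanity check (hypothetical helper, not part of DSLPT).
# It only verifies the paths shown in the directory trees above.
import os

import torch

expected_dirs = [
    "Data/WFLW/WFLW_annotations/list_98pt_rect_attr_train_test",
    "Data/WFLW/WFLW_annotations/list_98pt_test",
    "Data/WFLW/WFLW_images",
]
for path in expected_dirs:
    print(("ok      " if os.path.isdir(path) else "MISSING ") + path)

# The HRNetW18C file is assumed to load with torch.load; this README does not say
# whether it is a plain state dict or a wrapper dict, so only its type is printed.
ckpt_path = "Config/hrnetv2_w18_imagenet_pretrained.pth"
if os.path.isfile(ckpt_path):
    state = torch.load(ckpt_path, map_location="cpu")
    print(f"Loaded {ckpt_path} ({type(state).__name__})")
else:
    print("MISSING " + ckpt_path)
```

If any path is reported missing, re-check the unzipping step before running `python ./train.py`.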
## Testing

1. Download and process the WFLW dataset
   - Download the WFLW dataset and annotations from Here.
   - Unzip the WFLW dataset and annotations and move the files into the `./dataset` directory. Your directory should look like this:
     ```
     DSLPT
     └───Dataset
        │
        └───WFLW
           │
           └───WFLW_annotations
           │   └───list_98pt_rect_attr_train_test
           │   │
           │   └───list_98pt_test
           └───WFLW_images
               └───0--Parade
               │
               └───...
     ```
2. Download the pretrained model from Google Drive.
   - WFLW

     |   | Model Name      | NME (%) | FR@0.1 (%) | AUC@0.1 | download link |
     |---|-----------------|---------|------------|---------|---------------|
     | 1 | DSLPT-6-layers  | 4.01    | 2.52       | 0.607   | download      |
     | 2 | DSLPT-12-layers | 3.98    | 2.44       | 0.609   | download      |

   Put the model in the `./weights` directory.
3. Test: `python validate.py --checkpoint=<model_name>`

   For example: `python validate.py --checkpoint=DSLPT_WFLW_6_layers.pth`

   Note: if you want to use the model with 12 layers, you need to change `_C.TRANSFORMER.NUM_DECODER` from 6 to 12 in `./Config/default.py` (see the sketch after this list).
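The note above refers to a single configuration value in `./Config/default.py`. The excerpt below is a sketch of that edit, assuming the file defines a yacs-style config node named `_C` (as the attribute path in the note suggests); the surrounding code in the actual file may differ.

```python
# Excerpt of ./Config/default.py (sketch only; assumes a yacs-style CfgNode named _C).
# Default, used with DSLPT_WFLW_6_layers.pth:
#   _C.TRANSFORMER.NUM_DECODER = 6
# For the 12-layer model, change the line to:
_C.TRANSFORMER.NUM_DECODER = 12
```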
## Citation

If you find this work or code helpful in your research, please cite:
@article{DSLPT,
  title={Robust Face Alignment via Inherent Relation Learning and Uncertainty Estimation},
  author={Jiahao Xia and Min Xu and Haimin Zhang and Jianguo Zhang and Wenjian Huang and Hu Cao and Shiping Wen},
  journal={TPAMI},
  year={2023}
}
@inproceedings{SLPT,
  title={Sparse Local Patch Transformer for Robust Face Alignment and Landmarks Inherent Relation Learning},
  author={Jiahao Xia and Weiwei Qu and Wenjian Huang and Jianguo Zhang and Xi Wang and Min Xu},
  booktitle={CVPR},
  year={2022}
}
## License

DSLPT is released under the GPL-2.0 license. Please see the LICENSE file for more information.