# Pose Guided Person Image Generation
TensorFlow implementation of the NIPS 2017 paper "Pose Guided Person Image Generation".

## Requirement
- python 2.7
- tensorflow-gpu (1.4.1)
- numpy (1.14.0)
- Pillow (5.0.0)
- scikit-image (0.13.0)
- scipy (1.0.1)
- matplotlib (2.0.0)
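Assuming a pip-based setup (the exact install route is not stated above), the listed versions can be pinned in one step. Note that `tensorflow-gpu` 1.4.1 additionally needs a matching CUDA 8 / cuDNN 6 installation, which is not covered here.

```shell
# Pin the package versions listed above (Python 2.7 environment assumed).
pip install \
    tensorflow-gpu==1.4.1 \
    numpy==1.14.0 \
    Pillow==5.0.0 \
    scikit-image==0.13.0 \
    scipy==1.0.1 \
    matplotlib==2.0.0
```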
## Resources
- Pretrained models: Market-1501, DeepFashion
- Training data in tf-record format: Market-1501, DeepFashion
- Testing data in tf-record format: Market-1501, DeepFashion
- Raw data: Market-1501, DeepFashion
- Testing results: Market-1501, DeepFashion
## TF-record data preparation
You can skip this data preparation step if you use the tf-record data files directly.

Run
```bash
cd datasets
./run_convert_market.sh
```
to download and convert the original images, poses, attributes and segmentations, or run
```bash
cd datasets
./run_convert_DF.sh
```
to download and convert the original images and poses.

Note: we also provide the convert code for the Market-1501 attributes and the Market-1501 segmentation results from PSPNet. This extra information is provided for further research. In our experiments, the pose masks are obtained from the pose key-points (see the `_getPoseMask` function in the convert `.py` files).
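The repo's `_getPoseMask` has its own details; as a minimal sketch of the idea only (deriving a binary mask from key-points by stroking the skeleton's limb segments with a fixed radius), with hypothetical joint/limb inputs:

```python
import numpy as np

def pose_mask(keypoints, limbs, height, width, radius=4):
    """Rasterize a binary pose mask from 2-D key-points.

    keypoints: list of (x, y) tuples, or None for undetected joints.
    limbs: list of (i, j) joint-index pairs to connect.
    radius: half-width, in pixels, of the stroked limb segments.
    """
    mask = np.zeros((height, width), dtype=bool)
    yy, xx = np.mgrid[0:height, 0:width]
    for i, j in limbs:
        a, b = keypoints[i], keypoints[j]
        if a is None or b is None:  # skip limbs with a missing joint
            continue
        ax, ay = a
        bx, by = b
        abx, aby = bx - ax, by - ay
        denom = float(abx * abx + aby * aby)
        if denom == 0.0:  # degenerate limb: both joints coincide
            t = np.zeros((height, width))
        else:
            # parameter of the closest point on segment a-b for every pixel
            t = np.clip(((xx - ax) * abx + (yy - ay) * aby) / denom, 0.0, 1.0)
        # distance of every pixel to the segment; mark pixels within radius
        dist = np.hypot(xx - (ax + t * abx), yy - (ay + t * aby))
        mask |= dist <= radius
    return mask

# Toy skeleton: two limbs forming an L shape on a 16x16 grid.
m = pose_mask([(2, 2), (12, 2), (12, 12)], [(0, 1), (1, 2)], 16, 16, radius=2)
```

A real implementation would use the dataset's joint ordering and handle occluded joints per its annotation format.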
## Training steps
- Download the tf-record training data.
- Modify `model_dir` in the `run_market_train.sh`/`run_DF_train.sh` scripts.
- Run `run_market_train.sh`/`run_DF_train.sh`.

Note: we use a triplet instead of a real/fake pair for adversarial training, which keeps training more stable.
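The exact composition of the triplet is defined in the training code. Purely as an illustration of the pair-vs-triplet distinction, with hypothetical scalar discriminator logits, one real sample can be set against two negative samples (e.g. the generated image plus an extra negative) instead of a single fake:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def pair_d_loss(real_logit, fake_logit):
    # Standard real/fake discriminator loss (binary cross-entropy).
    return -np.log(sigmoid(real_logit)) - np.log(1.0 - sigmoid(fake_logit))

def triplet_d_loss(real_logit, fake_logit, neg_logit):
    # Triplet variant: two negative samples share the fake label,
    # so each contributes half of the fake-side loss.
    return (-np.log(sigmoid(real_logit))
            - 0.5 * (np.log(1.0 - sigmoid(fake_logit))
                     + np.log(1.0 - sigmoid(neg_logit))))
```

When both negatives carry the same logit, the triplet loss reduces to the pair loss, so the variant only changes behavior through the extra negative sample.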
## Testing steps
- Download the pretrained models and the tf-record testing data.
- Modify `model_dir` in the `run_market_test.sh`/`run_DF_test.sh` scripts.
- Run `run_market_test.sh`/`run_DF_test.sh`.
## PyTorch implementation
Human-Pose-Transfer
## Citation
```bibtex
@inproceedings{ma2017pose,
  title={Pose guided person image generation},
  author={Ma, Liqian and Jia, Xu and Sun, Qianru and Schiele, Bernt and Tuytelaars, Tinne and Van Gool, Luc},
  booktitle={Advances in Neural Information Processing Systems},
  pages={405--415},
  year={2017}
}
```
Sponsored by imgcreator.zmo.ai