This repository adds a novel X-Bridge method to an existing framework for training pix2pix and CycleGAN in PyTorch. X-Bridge was developed as a cross-modal bridge method for img2sketch and sketch2img translation in heterogeneous face recognition tasks within my dissertation work Heterogeneous Face Recognition from Facial Sketches.
X-Bridge is a supervised method (i.e., it needs image pairs for training) based on Generative Adversarial Networks for image-to-image translation. Its main goal is to bridge the differences between two modalities (image, sketch) in the heterogeneous face recognition task. It combines ideas from two older approaches, pix2pix and UNIT: specifically, the L1 loss and conditional discriminator from pix2pix, and the shared-latent-space idea from UNIT. By combining these ideas and adding a reconstruction path, the method achieves very realistic and precise results in image-to-sketch and sketch-to-image translation tasks. In these tasks, X-Bridge generalizes better than pix2pix and generates finer detail than UNIT.
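The combination described above can be sketched as a generator objective: an adversarial term (from the conditional discriminator), a weighted L1 translation term (as in pix2pix), and a weighted L1 reconstruction term for the added reconstruction path. This is a minimal illustrative sketch, not the repository's actual implementation; the function names and lambda weights are assumptions.

```python
def l1(a, b):
    # mean absolute error between two flattened images
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def generator_objective(loss_gan, fake_translation, real_target,
                        reconstruction, real_input,
                        lambda_l1=100.0, lambda_rec=100.0):
    # adversarial term (conditional discriminator, as in pix2pix)
    # + weighted L1 between translated fake and the paired real target
    # + weighted L1 between the reconstruction and the original input
    #   (the extra path X-Bridge adds on top of pix2pix)
    return (loss_gan
            + lambda_l1 * l1(fake_translation, real_target)
            + lambda_rec * l1(reconstruction, real_input))
```

With perfect translation and reconstruction both L1 terms vanish and only the adversarial term remains; in training, the L1 terms anchor the generator to the paired ground truth while the discriminator pushes for realism.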
X-Bridge pipeline. E = encoder, G1, G2 = generators, D1, D2 = discriminators, z = latent space. The dotted line indicates L1 loss. x1_r is the real input from the first domain, x1_f is the reconstructed fake image in the first domain, x2_f is the translated fake image in the second domain, and x2_r is the corresponding real image from the second domain. The translation path is on the left, the reconstruction path on the right.
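The structure of the two paths can be sketched as follows. This is a toy stand-in (plain Python, not the actual networks) whose only purpose is to show that both paths share the single encoder E and latent code z; the names E, G1, G2 follow the figure, and the arithmetic inside each function is purely illustrative.

```python
def E(x):
    # shared encoder: both paths map the input to the same latent space z
    return [v * 0.5 for v in x]  # toy encoding

def G1(z):
    # generator decoding back into the first domain (reconstruction path)
    return [v * 2.0 for v in z]

def G2(z):
    # generator decoding into the second domain (translation path)
    return [v * 2.0 + 1.0 for v in z]

x1_r = [0.2, 0.4, 0.6]  # real input from the first domain
z = E(x1_r)             # one shared latent code feeds both generators
x1_f = G1(z)            # reconstructed fake image, first domain
x2_f = G2(z)            # translated fake image, second domain
```

In the full method, D1 and D2 judge the outputs of the two paths, and the dotted L1 losses compare x1_f against x1_r and x2_f against the paired x2_r.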
Dataset used: CUFSF
Glasses and non-frontal pose translation
Real-world image translation
python train.py --dataroot ./datasets/feret_sketch_highres --name feret_sketch_my --model XBridge --direction AtoB
Switch `AtoB` to `BtoA` to train translation in the opposite direction.
python test.py --dataroot ./datasets/feret_sketch_highres --name feret_sketch_my --model XBridge --direction AtoB
Most of the project code is based on https://github.com/junyanz/pytorch-CycLeGAN-and-pix2pix
[Ivan Gruber]([email protected])