Gated SwitchGAN for Multi-Domain Facial Image Translation

Xiaokang Zhang, Yuanlue Zhu, Wenting Chen, Wenshuang Liu, Linlin Shen

Research output: Journal Publication › Article › peer-review

5 Citations (Scopus)

Abstract

Recent studies on multi-domain facial image translation have achieved impressive results. Existing methods generally equip the discriminator with an auxiliary classifier to enforce domain translation, but they neglect important information regarding domain distribution matching. To address this problem, we propose a switch generative adversarial network (SwitchGAN) with a more adaptive discriminator structure and a matched generator to perform delicate image translation among multiple domains. A feature-switching operation is proposed to achieve feature selection and fusion in our conditional modules, and we demonstrate its effectiveness. Furthermore, our generator offers a new capability: it supports attribute intensity control and extracts content information without tailored training. Experiments on the Morph, RaFD and CelebA databases visually and quantitatively show that our extended SwitchGAN (i.e., Gated SwitchGAN) achieves better translation results than StarGAN, AttGAN and STGAN. The attribute classification accuracy obtained with a trained ResNet-18 model and the FID score computed with an ImageNet-pretrained Inception-v3 model further quantitatively demonstrate the superior performance of our models.
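To make the feature-switching idea concrete, below is a minimal PyTorch-style sketch of how such a conditional module could be structured: one lightweight convolutional branch per target domain, with a one-hot (or soft) domain vector selecting and fusing the branch outputs. The class name, branch design, and the use of a soft vector for intensity control are illustrative assumptions for exposition, not the authors' implementation.

```python
import torch
import torch.nn as nn


class FeatureSwitch(nn.Module):
    """Illustrative feature-switching conditional module (hypothetical sketch).

    Each target domain owns a small conv branch; a one-hot (or soft) domain
    vector switches between and fuses the branch outputs. Feeding a soft
    vector instead of a hard one-hot would correspond to the attribute
    intensity control mentioned in the abstract.
    """

    def __init__(self, channels: int, num_domains: int):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1)
            for _ in range(num_domains)
        )

    def forward(self, x: torch.Tensor, domain: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W) feature map; domain: (B, num_domains) selection weights
        outs = torch.stack([branch(x) for branch in self.branches], dim=1)  # (B, D, C, H, W)
        weights = domain.view(domain.size(0), -1, 1, 1, 1)                  # broadcast over C, H, W
        return (outs * weights).sum(dim=1)                                  # weighted fusion of branches


if __name__ == "__main__":
    module = FeatureSwitch(channels=64, num_domains=3)
    feat = torch.randn(4, 64, 32, 32)
    hard = torch.eye(3)[torch.tensor([2, 0, 1, 2])]   # one-hot: hard switch to a single domain
    soft = torch.tensor([[0.0, 0.3, 0.7]] * 4)        # soft weights: blended / partial attributes
    print(module(feat, hard).shape, module(feat, soft).shape)
```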

Original language: English
Pages (from-to): 1990-2003
Number of pages: 14
Journal: IEEE Transactions on Multimedia
Volume: 24
DOIs
Publication status: Published - 2022
Externally published: Yes

Keywords

  • Attribute intensity control
  • Feature switching
  • GANs
  • Image translation

ASJC Scopus subject areas

  • Signal Processing
  • Media Technology
  • Computer Science Applications
  • Electrical and Electronic Engineering
