TR2016-070

Coupled Generative Adversarial Nets


    •  Liu, M.-Y., Tuzel, O., "Coupled Generative Adversarial Nets", arXiv, June 2016.
      @techreport{MERL_TR2016-070,
        author = {Liu, M.-Y. and Tuzel, O.},
        title = {Coupled Generative Adversarial Nets},
        institution = {MERL - Mitsubishi Electric Research Laboratories},
        address = {Cambridge, MA 02139},
        number = {TR2016-070},
        month = jun,
        year = 2016,
        url = {http://www.merl.com/publications/TR2016-070/}
      }
Research Areas: Computational Photography, Computer Vision


We propose the coupled generative adversarial nets (CoGAN) framework for generating pairs of corresponding images in two different domains. The framework consists of a pair of generative adversarial nets, each responsible for generating images in one domain. We show that by enforcing a simple weight-sharing constraint, the CoGAN learns to generate pairs of corresponding images without any pairs of corresponding images in the two domains appearing in the training set. In other words, the CoGAN learns a joint distribution of images in the two domains from images drawn separately from the marginal distributions of the individual domains. This is in contrast to existing multi-modal generative models, which require corresponding image pairs for training. We apply the CoGAN to several pair-image generation tasks. For each task, the CoGAN learns to generate convincing pairs of corresponding images. We further demonstrate the applications of the CoGAN framework to domain adaptation and cross-domain image generation.
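The core idea of the weight-sharing constraint can be illustrated with a minimal sketch: the two generators share their early layers (which decode high-level semantics common to both domains) and keep separate final layers (which render domain-specific low-level detail), so a single latent code yields a corresponding image in each domain. The layer sizes, initialization, and two-layer depth below are illustrative assumptions, not the architecture from the report:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration (not from the report).
Z, H, X = 16, 32, 64  # latent, shared hidden, per-domain output sizes

# Shared early-layer weights: both generators decode the same
# high-level representation, which ties the two domains together.
W_shared = rng.standard_normal((Z, H)) * 0.1

# Domain-specific final layers: each renders low-level detail
# appropriate to its own domain.
W_a = rng.standard_normal((H, X)) * 0.1
W_b = rng.standard_normal((H, X)) * 0.1

def generate_pair(z):
    """Map one latent code z to a corresponding image in each domain."""
    h = np.tanh(z @ W_shared)  # shared high-level representation
    x_a = np.tanh(h @ W_a)     # domain-A output
    x_b = np.tanh(h @ W_b)     # domain-B output
    return x_a, x_b

z = rng.standard_normal(Z)
x_a, x_b = generate_pair(z)
```

Because `x_a` and `x_b` are decoded from the same shared representation `h`, adversarial training against per-domain discriminators can push the pair toward corresponding content without ever observing paired training images.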