Lipschitz Constrained Generative Adversarial Networks
Spring 2021
Master Semester Project
Project: 00404
In generative adversarial networks, controlling the Lipschitz regularity of the discriminator has been shown to substantially improve generative performance. For example, Wasserstein GAN (WGAN) and Spectral Normalization GAN (SNGAN) achieve this by restricting the discriminator to be 1-Lipschitz. Recently, we developed a framework for learning the activations of deep neural networks, with the motivation of controlling the global Lipschitz constant of the input-output relation.
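To illustrate the constraint that SNGAN enforces: each layer's weight matrix is divided by its largest singular value, so the linear map becomes 1-Lipschitz. PyTorch provides this as `torch.nn.utils.spectral_norm`; the following is a minimal numpy sketch of the underlying power iteration (the function name is illustrative, not from any library):

```python
import numpy as np

def spectral_norm_estimate(W, n_iter=50, seed=0):
    # Power iteration: estimates the largest singular value of W,
    # which equals the Lipschitz constant of the map x -> W x.
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(W.shape[0])
    for _ in range(n_iter):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    # With u = Wv / ||Wv||, this inner product is ||Wv||, i.e. sigma_max.
    return float(u @ W @ v)

W = np.array([[3.0, 0.0],
              [0.0, 1.0]])
sigma = spectral_norm_estimate(W)   # largest singular value, here 3.0
W_sn = W / sigma                    # normalized layer: spectral norm ~ 1
```

Applying this normalization to every discriminator layer bounds the network's global Lipschitz constant by the product of the (now unit) per-layer constants.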
The goal of this project is 1) to investigate the effect of our framework on generative adversarial networks and 2) to develop a better Lipschitz regularization method for training generative models. We have already seen promising results on several toy datasets such as a mixture of Gaussians, the Swiss roll, and (partially) MNIST. We aim to extend this to more complicated datasets such as human faces (the CelebA dataset). The student must be familiar with PyTorch and have a general understanding of the main concepts of deep learning. Prior experience with generative adversarial networks is a plus.
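For reference, a "mixture of Gaussians" toy dataset of the kind mentioned above is commonly built by placing Gaussian modes evenly on a circle; a GAN that suffers mode collapse will miss some of them. A minimal sketch (the function name and parameters are illustrative, not from the project code):

```python
import numpy as np

def sample_gaussian_ring(n, modes=8, radius=2.0, std=0.05, seed=0):
    # Classic 2-D GAN toy dataset: a mixture of `modes` Gaussians
    # whose means are spaced evenly on a circle of the given radius.
    rng = np.random.default_rng(seed)
    angles = 2.0 * np.pi * rng.integers(0, modes, size=n) / modes
    centers = np.stack([radius * np.cos(angles),
                        radius * np.sin(angles)], axis=1)
    return centers + std * rng.standard_normal((n, 2))

X = sample_gaussian_ring(1000)   # shape (1000, 2)
```

Because the target density and its modes are known exactly, such datasets make it easy to inspect how a Lipschitz constraint affects mode coverage before moving to MNIST or CelebA.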
References
Aziznejad, S., Gupta, H., Campos, J., & Unser, M. (2020). Deep neural networks with trainable activations and controlled Lipschitz constant. arXiv preprint arXiv:2001.06263.
- Supervisors
- Jaejun Yoo, jaejun.yoo@epfl.ch, BM 4.141
- Michael Unser, michael.unser@epfl.ch, 021 693 51 75, BM 4.136
- Joaquim Campos