Enforcing constraints for interpolation and extrapolation in Generative Adversarial Networks

Mar 22, 2018

Panos Stinis, Tobias Hagge, Alexandre M. Tartakovsky, Enoch Yeung

Generative Adversarial Networks (GANs) are becoming popular choices for unsupervised learning. At the same time there is a concerted effort in the machine learning community to expand the range of tasks in which learning can be applied as well as to utilize methods from other disciplines to accelerate learning. With this in mind, in the current work we suggest ways to enforce given constraints in the output of a GAN, both for interpolation and extrapolation. The two cases need to be treated differently. For the case of interpolation, the incorporation of constraints is built into the training of the GAN. The incorporation of the constraints respects the primary game-theoretic setup of a GAN, so it can be combined with existing algorithms. However, it can exacerbate the instability during training that is well known for GANs. We suggest adding small noise to the constraints as a simple remedy that has performed well in our numerical experiments. The case of extrapolation (prediction) is more involved. First, we employ a modified interpolation training process that uses noisy data but does not necessarily enforce the constraints during training. Second, the resulting modified interpolator is used for extrapolation, where the constraints are enforced after each step through projection onto the space of constraints.
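The per-step projection described for extrapolation can be sketched in a minimal form. This example assumes linear equality constraints of the form A x = b (the paper's constraints may be more general); the names and the small constraint noise used here are purely illustrative:

```python
import numpy as np

def project_onto_constraints(x, A, b):
    """Euclidean projection of x onto the affine set {x : A x = b}.

    Uses the closed form x - A^T (A A^T)^{-1} (A x - b), valid when
    A has full row rank.
    """
    correction = A.T @ np.linalg.solve(A @ A.T, A @ x - b)
    return x - correction

rng = np.random.default_rng(0)

# Illustrative constraint: the three output components must sum to 1
# (e.g. a conservation law). A and b are assumptions for this sketch.
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])

# Small noise on the constraint target, as suggested for stabilizing training.
b_noisy = b + 1e-3 * rng.standard_normal(b.shape)

# Raw output of a (hypothetical) generator/extrapolator step.
x = rng.standard_normal(3)

# Enforce the constraint after the step by projection.
x_proj = project_onto_constraints(x, A, b)
```

After the projection, `A @ x_proj` equals `b` to machine precision, so each extrapolation step hands a constraint-satisfying state to the next one.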

* 29 pages


Solving differential equations with unknown constitutive relations as recurrent neural networks

Oct 06, 2017

Tobias Hagge, Panos Stinis, Enoch Yeung, Alexandre M. Tartakovsky


* 19 pages, 8 figures
