A Class of Logistic Functions for Approximating State-Inclusive Koopman Operators

Dec 08, 2017

Charles A. Johnson, Enoch Yeung

* 8 pages

A Constructive Approach for One-Shot Training of Neural Networks Using Hypercube-Based Topological Coverings

Jan 09, 2019

W. Brent Daniel, Enoch Yeung

Learning Deep Neural Network Representations for Koopman Operators of Nonlinear Dynamical Systems

Nov 17, 2017

Enoch Yeung, Soumya Kundu, Nathan Hodas

* 16 pages, 5 figures

**Click to Read Paper**

Decomposition of Nonlinear Dynamical Systems Using Koopman Gramians

Oct 04, 2017

Zhiyuan Liu, Soumya Kundu, Lijun Chen, Enoch Yeung

* 8 pages, submitted to IEEE 2018 ACC

Enforcing constraints for interpolation and extrapolation in Generative Adversarial Networks

Mar 22, 2018

Panos Stinis, Tobias Hagge, Alexandre M. Tartakovsky, Enoch Yeung

Generative Adversarial Networks (GANs) are becoming popular choices for unsupervised learning. At the same time there is a concerted effort in the machine learning community to expand the range of tasks in which learning can be applied as well as to utilize methods from other disciplines to accelerate learning. With this in mind, in the current work we suggest ways to enforce given constraints in the output of a GAN both for interpolation and extrapolation. The two cases need to be treated differently. For the case of interpolation, the incorporation of constraints is built into the training of the GAN. The incorporation of the constraints respects the primary game-theoretic setup of a GAN so it can be combined with existing algorithms. However, it can exacerbate the problem of instability during training that is well-known for GANs. We suggest adding small noise to the constraints as a simple remedy that has performed well in our numerical experiments. The case of extrapolation (prediction) is more involved. First, we employ a modified interpolation training process that uses noisy data but does not necessarily enforce the constraints during training. Second, the resulting modified interpolator is used for extrapolation where the constraints are enforced after each step through projection on the space of constraints.
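The extrapolation procedure the abstract describes (enforce constraints by projection after each prediction step, and perturb constraint targets with small noise during training) can be sketched as follows for the simple case of a linear equality constraint. This is a minimal illustration, not the paper's implementation: the constraint matrix `A`, target vector `b`, noise level `sigma`, and helper names are all assumptions for the example.

```python
import numpy as np

def project_onto_affine(x, A, b):
    """Euclidean projection of x onto the constraint set {y : A @ y = b},
    via x - A^T (A A^T)^{-1} (A x - b)."""
    correction = A.T @ np.linalg.solve(A @ A.T, A @ x - b)
    return x - correction

def noisy_targets(b, sigma=1e-3, rng=None):
    """Perturb constraint targets with small Gaussian noise, in the spirit of
    the paper's remedy for training instability (sigma is illustrative)."""
    rng = np.random.default_rng(0) if rng is None else rng
    return b + sigma * rng.standard_normal(b.shape)

# Hypothetical constraint: the two output components must sum to 1.
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

# A raw predictor/generator output that violates the constraint...
x = np.array([0.7, 0.6])
# ...is projected back onto the constraint set after the step.
x_proj = project_onto_affine(x, A, b)   # -> [0.55, 0.45], which sums to 1
```

In an extrapolation loop, each new state produced by the modified interpolator would be passed through `project_onto_affine` before being fed back in, so constraint violations cannot accumulate over successive steps.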

* 29 pages

Solving differential equations with unknown constitutive relations as recurrent neural networks

Oct 06, 2017

Tobias Hagge, Panos Stinis, Enoch Yeung, Alexandre M. Tartakovsky

* 19 pages, 8 figures