An Alternative to EM for Gaussian Mixture Models: Batch and Stochastic Riemannian Optimization

Jun 10, 2017

Reshad Hosseini, Suvrit Sra

Mixtures of conditional Gaussian scale mixtures applied to multiscale image representations

Sep 20, 2011

Lucas Theis, Reshad Hosseini, Matthias Bethge

We present a probabilistic model for natural images based on Gaussian scale mixtures and a simple multiscale representation. In contrast to the dominant approach to modeling whole images, which focuses on Markov random fields, we formulate our model as a directed graphical model. We show that it is able to generate images with interesting higher-order correlations when trained on natural images or on samples from an occlusion-based model. More importantly, the directed model enables us to perform a principled evaluation. While it is easy to generate visually appealing images, we demonstrate that our model also yields the best performance reported to date when evaluated with respect to the cross-entropy rate, a measure tightly linked to the average log-likelihood.
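The building block of this model, a Gaussian scale mixture, multiplies a Gaussian variable by a random scale; mixing over scales is what produces the heavy-tailed marginals typical of natural images. A minimal sketch (the gamma scale prior here is an illustrative assumption, not the paper's choice):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Gaussian scale mixture: x = sqrt(z) * g with g ~ N(0, 1) and a random
# scale z. Mixing over scales yields heavier-than-Gaussian tails, i.e.
# kurtosis above 3 (the Gaussian value).
z = rng.gamma(shape=2.0, scale=1.0, size=n)  # illustrative scale prior
g = rng.standard_normal(n)
x = np.sqrt(z) * g

def kurtosis(v):
    """Sample kurtosis E[v^4] / E[v^2]^2 of a centered sample."""
    v = v - v.mean()
    return np.mean(v**4) / np.mean(v**2) ** 2

print(kurtosis(x))  # noticeably above 3: heavy tails from scale mixing
print(kurtosis(g))  # close to 3 for the plain Gaussian
```

With a Gamma(2, 1) scale prior the population kurtosis works out to 3·E[z²]/E[z]² = 3·6/4 = 4.5, so the mixture is visibly leptokurtic even though each conditional is Gaussian.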

Deep-RBF Networks Revisited: Robust Classification with Rejection

Dec 07, 2018

Pourya Habib Zadeh, Reshad Hosseini, Suvrit Sra

One of the main drawbacks of deep neural networks, like many other classifiers, is their vulnerability to adversarial attacks. An important reason for this vulnerability is that they assign high confidence to regions with few or even no feature points. By feature points, we mean a nonlinear transformation of the input space that extracts a meaningful representation of the input data. Deep-RBF networks, by contrast, assign high confidence only to regions containing enough feature points, but they have been discounted due to the widely held belief that they suffer from the vanishing gradient problem. In this paper, we revisit deep-RBF networks by first giving a general formulation for them and then proposing a family of cost functions inspired by metric learning. In the proposed deep-RBF learning algorithm, the vanishing gradient problem does not occur. We make these networks robust to adversarial attacks by adding a reject option to their output layer. Through several experiments on the MNIST dataset, we demonstrate that our proposed method not only achieves high classification accuracy but is also very resistant to various adversarial attacks.
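The core idea of the reject option can be sketched in a few lines. This is a toy illustration, not the paper's method: it assumes fixed (rather than learned) features and one prototype per class, and the prototypes and threshold below are made-up values. Confidence is highest near prototypes, so inputs far from every prototype — regions with no feature points — are rejected instead of classified:

```python
import numpy as np

# Hypothetical per-class prototypes in a 2-D feature space.
prototypes = np.array([[0.0, 0.0],   # class 0
                       [5.0, 5.0]])  # class 1
REJECT = -1
TAU = 2.0  # distance threshold for rejection (illustrative assumption)

def classify(x):
    """Return the nearest prototype's class, or REJECT if the input
    lies far from all prototypes (a low-density feature region)."""
    d = np.linalg.norm(prototypes - np.asarray(x), axis=1)
    k = int(np.argmin(d))
    return REJECT if d[k] > TAU else k

print(classify([0.2, -0.1]))    # near class 0's prototype -> 0
print(classify([4.8, 5.3]))     # near class 1's prototype -> 1
print(classify([20.0, -20.0]))  # far from all prototypes -> -1 (reject)
```

The rejection mechanism is what blunts adversarial examples: an attack that pushes an input into an empty region of feature space yields a rejection rather than a confident wrong label.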

Exploiting generalization in the subspaces for faster model-based learning

Oct 25, 2017

Maryam Hashemzadeh, Reshad Hosseini, Majid Nili Ahmadabadi

Inference and Mixture Modeling with the Elliptical Gamma Distribution

Dec 20, 2015

Reshad Hosseini, Suvrit Sra, Lucas Theis, Matthias Bethge

We study modeling and inference with the Elliptical Gamma Distribution (EGD). We consider maximum likelihood (ML) estimation for EGD scatter matrices, a task for which we develop new fixed-point algorithms. Our algorithms are efficient and converge to global optima despite nonconvexity. Moreover, they turn out to be much faster than both a well-known iterative algorithm of Kent & Tyler (1991) and sophisticated manifold optimization algorithms. Subsequently, we invoke our ML algorithms as subroutines for estimating parameters of a mixture of EGDs. We illustrate our methods by applying them to model natural image statistics---the proposed EGD mixture model yields the most parsimonious model among several competing approaches.
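To give a feel for the fixed-point style of scatter estimation the abstract compares against, here is a sketch of Tyler's M-estimator iteration (the Kent & Tyler line of work mentioned above) — not the EGD-specific algorithm of the paper, whose update uses a different weight function:

```python
import numpy as np

rng = np.random.default_rng(1)
p, n = 3, 5000
true_scatter = np.array([[2.0, 0.5, 0.0],
                         [0.5, 1.0, 0.3],
                         [0.0, 0.3, 0.5]])
X = rng.multivariate_normal(np.zeros(p), true_scatter, size=n)

# Tyler fixed-point iteration: reweight each sample by 1 / (x^T S^{-1} x)
# and renormalize; the EGD estimator in the paper follows the same
# template with an EGD-specific weight.
S = np.eye(p)
for _ in range(100):
    w = 1.0 / np.einsum('ij,jk,ik->i', X, np.linalg.inv(S), X)
    S_new = (p / n) * (X * w[:, None]).T @ X
    S_new *= p / np.trace(S_new)  # Tyler's estimator fixes shape only,
                                  # so pin the trace to p
    if np.linalg.norm(S_new - S) < 1e-10:
        S = S_new
        break
    S = S_new

# The estimate should recover the true scatter up to scale.
shape_true = true_scatter * p / np.trace(true_scatter)
print(np.round(S, 2))
```

Each sweep is one pass over the data plus a p x p inverse, which is why such fixed-point schemes are attractive baselines against general-purpose manifold optimizers.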