An Alternative to EM for Gaussian Mixture Models: Batch and Stochastic Riemannian Optimization

Jun 10, 2017

Reshad Hosseini, Suvrit Sra

* 21 pages, 6 figures

Mixtures of conditional Gaussian scale mixtures applied to multiscale image representations

Sep 20, 2011

Lucas Theis, Reshad Hosseini, Matthias Bethge

We present a probabilistic model for natural images based on Gaussian scale mixtures and a simple multiscale representation. In contrast to the dominant approach to modeling whole images, which focuses on Markov random fields, we formulate our model as a directed graphical model. We show that it can generate images with interesting higher-order correlations when trained on natural images or on samples from an occlusion-based model. More importantly, the directed model enables us to perform a principled evaluation. While it is easy to generate visually appealing images, we demonstrate that our model also yields the best performance reported to date when evaluated with respect to the cross-entropy rate, a measure tightly linked to the average log-likelihood.
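The evaluation metric mentioned in this abstract, the cross-entropy rate, is the negative average log-likelihood per pixel, typically reported in bits. A minimal sketch, using a toy univariate Gaussian density as the "model" (the function names and the toy model are ours, not the paper's):

```python
import math

def gauss_log2_density(x, mu=0.0, sigma=1.0):
    """log2 of a univariate Gaussian density at x."""
    nats = -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))
    return nats / math.log(2)

def cross_entropy_rate(pixels, log2_density):
    """Negative average log2-likelihood per pixel, in bits/pixel."""
    return -sum(log2_density(x) for x in pixels) / len(pixels)

# For a single pixel at the mode of a standard Gaussian, the rate is
# 0.5 * log2(2*pi), roughly 1.33 bits.
rate = cross_entropy_rate([0.0], gauss_log2_density)
```

A lower cross-entropy rate means the model assigns higher average likelihood to held-out pixels, which is why the abstract treats the two measures as tightly linked.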

Deep-RBF Networks Revisited: Robust Classification with Rejection

Dec 07, 2018

Pourya Habib Zadeh, Reshad Hosseini, Suvrit Sra

Exploiting generalization in the subspaces for faster model-based learning

Oct 25, 2017

Maryam Hashemzadeh, Reshad Hosseini, Majid Nili Ahmadabadi

* 7 pages, 4 figures

Inference and Mixture Modeling with the Elliptical Gamma Distribution

Dec 20, 2015

Reshad Hosseini, Suvrit Sra, Lucas Theis, Matthias Bethge

We study modeling and inference with the Elliptical Gamma Distribution (EGD). We consider maximum likelihood (ML) estimation for EGD scatter matrices, a task for which we develop new fixed-point algorithms. Our algorithms are efficient and converge to global optima despite nonconvexity. Moreover, they turn out to be much faster than both a well-known iterative algorithm of Kent & Tyler (1991) and sophisticated manifold optimization algorithms. Subsequently, we invoke our ML algorithms as subroutines for estimating parameters of a mixture of EGDs. We illustrate our methods by applying them to model natural image statistics: the proposed EGD mixture model yields the most parsimonious model among several competing approaches.
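To illustrate the flavor of the fixed-point schemes discussed here, below is a sketch of the classic Tyler-style iteration for a 2x2 scatter matrix, S &lt;- (d/n) * sum_i x_i x_i^T / (x_i^T S^-1 x_i), with trace normalization to pin down the scale. This is the baseline family the abstract compares against, not the paper's EGD-specific updates, and all function names are ours:

```python
import random

def inv2(S):
    """Inverse of a 2x2 matrix given as nested lists."""
    a, b = S[0]
    c, d = S[1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def tyler_step(S, X):
    """One fixed-point update: S <- (d/n) * sum_i x x^T / (x^T S^-1 x)."""
    Si = inv2(S)
    acc = [[0.0, 0.0], [0.0, 0.0]]
    for x in X:
        # quadratic form x^T S^{-1} x
        q = sum(x[i] * Si[i][j] * x[j] for i in range(2) for j in range(2))
        for i in range(2):
            for j in range(2):
                acc[i][j] += x[i] * x[j] / q
    n, d = len(X), 2
    return [[d * acc[i][j] / n for j in range(2)] for i in range(2)]

def estimate_scatter(X, iters=50):
    """Iterate from the identity; normalize the trace each step,
    since the scatter matrix is only identified up to scale."""
    S = [[1.0, 0.0], [0.0, 1.0]]
    for _ in range(iters):
        S = tyler_step(S, X)
        t = S[0][0] + S[1][1]
        S = [[S[i][j] / t for j in range(2)] for i in range(2)]
    return S
```

Run on samples with a 4:1 variance ratio between coordinates, the normalized estimate recovers roughly that same ratio on its diagonal; the paper's contribution is faster fixed-point algorithms of this general kind, specialized to EGD scatter matrices and proven to reach global optima.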

* Computational Statistics & Data Analysis 2016, Vol. 101, 29-43

* 23 pages, 11 figures
