* 8 pages, double column. IEEE International Conference on Robotics and Automation, 2014

Manifold Gaussian Processes for Regression

Apr 11, 2016

Roberto Calandra, Jan Peters, Carl Edward Rasmussen, Marc Peter Deisenroth

* 8 pages, accepted to IJCNN 2016

Data-Efficient Reinforcement Learning with Probabilistic Model Predictive Control

Feb 22, 2018

Sanket Kamthe, Marc Peter Deisenroth

* Accepted at AISTATS 2018

Expectation Propagation in Gaussian Process Dynamical Systems: Extended Version

Aug 17, 2016

Marc Peter Deisenroth, Shakir Mohamed

* Advances in Neural Information Processing Systems 25 (NIPS), pp. 2609-2617, 2012

A Probabilistic Perspective on Gaussian Filtering and Smoothing

Jun 08, 2011

Marc Peter Deisenroth, Henrik Ohlsson

* 14 pages. Extended version of conference paper (ACC 2011)

Distributed Gaussian Processes

To scale Gaussian processes (GPs) to large data sets, we introduce the robust Bayesian Committee Machine (rBCM), a practical and scalable product-of-experts model for large-scale distributed GP regression. Unlike state-of-the-art sparse GP approximations, the rBCM is conceptually simple and does not rely on inducing or variational parameters. The key idea is to recursively distribute computations to independent computational units and, subsequently, recombine them to form an overall result. Efficient closed-form inference allows for straightforward parallelisation and distributed computations with a small memory footprint. The rBCM is independent of the computational graph and can be used on heterogeneous computing infrastructures, ranging from laptops to clusters. With sufficient computing resources, our distributed GP model can handle arbitrarily large data sets.
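The product-of-experts recombination described in the abstract can be sketched in a few lines. This is a minimal illustration for a single test input, assuming a zero-mean GP prior; the function name is hypothetical, and the weight `beta` follows the common rBCM choice of the difference in differential entropy between the prior and an expert's posterior:

```python
import numpy as np

def rbcm_combine(means, variances, prior_var):
    """Combine independent GP experts' predictions at one test input."""
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    # beta_k weights expert k by how much it reduces the prior uncertainty
    beta = 0.5 * (np.log(prior_var) - np.log(variances))
    # rBCM precision: weighted expert precisions plus a prior correction term,
    # so that uninformative experts do not overcount the shared prior
    precision = np.sum(beta / variances) + (1.0 - np.sum(beta)) / prior_var
    var = 1.0 / precision
    mean = var * np.sum(beta * means / variances)
    return mean, var

# an expert whose posterior equals the prior gets weight beta = 0, so the
# combined prediction falls back to the zero-mean prior
m, v = rbcm_combine([0.5], [2.0], 2.0)
print(m, v)  # -> 0.0 2.0
```

Because the expert weights depend only on per-expert means and variances, each unit can run on separate hardware and only these summary statistics need to be communicated for recombination.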

* JMLR W&CP, vol 37, 2015

* 10 pages, 5 figures. Appears in Proceedings of ICML 2015

Hierarchical Mixture-of-Experts Model for Large-Scale Gaussian Process Regression

Dec 09, 2014

Jun Wei Ng, Marc Peter Deisenroth

We propose a practical and scalable Gaussian process model for large-scale nonlinear probabilistic regression. Our mixture-of-experts model is conceptually simple and hierarchically recombines computations for an overall approximation of a full Gaussian process. Closed-form and distributed computations allow for efficient and massive parallelisation while keeping the memory consumption small. Given sufficient computing resources, our model can handle arbitrarily large data sets, without explicit sparse approximations. We provide strong experimental evidence that our model can be applied to data sets far beyond a million data points in size. Hence, our model has the potential to lay the foundation for general large-scale Gaussian process research.
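The hierarchical recombination of expert computations can be sketched as a recursion over a tree of predictions. This is an illustrative sketch using a plain product-of-Gaussians rule at each inner node (the paper's exact weighting may differ); the function names and the tree encoding are assumptions:

```python
def poe_combine(preds):
    # product of Gaussians: precisions add, means are precision-weighted
    precision = sum(1.0 / v for _, v in preds)
    mean = sum(m / v for m, v in preds) / precision
    return mean, 1.0 / precision

def combine_tree(node):
    # node is either a leaf prediction (mean, var) or a list of child nodes;
    # each inner node recombines its children, mirroring the hierarchical
    # recombination of expert computations described in the abstract
    if isinstance(node, tuple):
        return node
    return poe_combine([combine_tree(child) for child in node])

# two leaf experts recombined at one parent, then with a third expert higher up
tree = [[(1.0, 1.0), (3.0, 1.0)], (2.0, 0.5)]
print(combine_tree(tree))  # -> (2.0, 0.25)
```

Each subtree can be evaluated independently, which is what makes the closed-form recombination easy to distribute across many workers.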

Differentially Private Empirical Risk Minimization with Sparsity-Inducing Norms

May 13, 2019

K S Sesh Kumar, Marc Peter Deisenroth

Meta Reinforcement Learning with Latent Variable Gaussian Processes

Jul 07, 2018

Steindór Sæmundsson, Katja Hofmann, Marc Peter Deisenroth

* 11 pages, 7 figures

Design of Experiments for Model Discrimination Hybridising Analytical and Data-Driven Approaches

May 31, 2018

Simon Olofsson, Marc Peter Deisenroth, Ruth Misener

Maximizing acquisition functions for Bayesian optimization

May 25, 2018

James T. Wilson, Frank Hutter, Marc Peter Deisenroth

Gaussian Processes for Data-Efficient Learning in Robotics and Control

Oct 10, 2017

Marc Peter Deisenroth, Dieter Fox, Carl Edward Rasmussen

* IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 37, issue no 2, pages 408-423, February 2015

* 20 pages, 29 figures; fixed a typo in equation on page 8

Neural Embeddings of Graphs in Hyperbolic Space

May 29, 2017

Benjamin Paul Chamberlain, James Clough, Marc Peter Deisenroth

* 13th international workshop on mining and learning from graphs held in conjunction with KDD, 2017

* 7 pages, 5 figures

Probabilistic Inference of Twitter Users' Age based on What They Follow

Feb 24, 2017

Benjamin Paul Chamberlain, Clive Humby, Marc Peter Deisenroth

* 9 pages, 9 figures

From Pixels to Torques: Policy Learning with Deep Dynamical Models

Jun 18, 2015

Niklas Wahlström, Thomas B. Schön, Marc Peter Deisenroth

* 9 pages

Learning deep dynamical models from image pixels

Oct 28, 2014

Niklas Wahlström, Thomas B. Schön, Marc Peter Deisenroth

* 10 pages, 11 figures

Deep Gaussian Processes with Importance-Weighted Variational Inference

May 14, 2019

Hugh Salimbeni, Vincent Dutordoir, James Hensman, Marc Peter Deisenroth

Deep Gaussian processes (DGPs) can model complex marginal densities as well as complex mappings. Non-Gaussian marginals are essential for modelling real-world data, and can be generated from the DGP by incorporating uncorrelated variables into the model. Previous work on DGP models has introduced noise additively and used variational inference with a combination of sparse Gaussian processes and mean-field Gaussians for the approximate posterior. Additive noise attenuates the signal, and the Gaussian form of the variational distribution may lead to an inaccurate posterior. We instead incorporate noisy variables as latent covariates, and propose a novel importance-weighted objective, which leverages analytic results and provides a mechanism to trade off computation for improved accuracy. Our results demonstrate that the importance-weighted objective works well in practice and consistently outperforms classical variational inference, especially for deeper models.
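The core of an importance-weighted objective of this kind is an average of importance weights taken inside the logarithm, estimated with a log-sum-exp for numerical stability. The sketch below is generic (the function name is hypothetical, and the DGP-specific analytic terms are abstracted into the log weights), but it shows the mechanism for trading extra samples for a tighter bound:

```python
import numpy as np

def iw_bound(log_w):
    # log_w: K log importance weights, log p(y, z_k) - log q(z_k), for
    # samples z_k ~ q; the bound is a numerically stable log-mean-exp
    log_w = np.asarray(log_w, dtype=float)
    m = log_w.max()
    return m + np.log(np.mean(np.exp(log_w - m)))

# with K = 1 this reduces to a single-sample ELBO estimate; averaging more
# samples inside the log yields a bound that is tighter in expectation
print(iw_bound([-1.0]))        # -> -1.0
print(iw_bound([-1.0, -1.0]))  # -> -1.0 (equal weights change nothing)
```

Increasing the number of samples K raises the cost of each objective evaluation but tightens the bound, which is the computation-for-accuracy trade-off the abstract refers to.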

* Appearing in ICML 2019

The reparameterization trick for acquisition functions

Dec 01, 2017

James T. Wilson, Riccardo Moriconi, Frank Hutter, Marc Peter Deisenroth

* Accepted at the NIPS 2017 Workshop on Bayesian Optimization (BayesOpt 2017)

A Brief Survey of Deep Reinforcement Learning

Sep 28, 2017

Kai Arulkumaran, Marc Peter Deisenroth, Miles Brundage, Anil Anthony Bharath

* IEEE Signal Processing Magazine, Special Issue on Deep Learning for Image Understanding (arXiv extended version)

Identification of Gaussian Process State Space Models

Nov 07, 2017

Stefanos Eleftheriadis, Thomas F. W. Nicholson, Marc Peter Deisenroth, James Hensman

The Gaussian process state space model (GPSSM) is a non-linear dynamical system, where unknown transition and/or measurement mappings are described by GPs. Most research in GPSSMs has focussed on the state estimation problem, i.e., computing a posterior of the latent state given the model. However, the key challenge in GPSSMs has not been satisfactorily addressed yet: system identification, i.e., learning the model. To address this challenge, we impose a structured Gaussian variational posterior distribution over the latent states, which is parameterised by a recognition model in the form of a bi-directional recurrent neural network. Inference with this structure allows us to recover a posterior smoothed over sequences of data. We provide a practical algorithm for efficiently computing a lower bound on the marginal likelihood using the reparameterisation trick. This further allows for the use of arbitrary kernels within the GPSSM. We demonstrate that the learnt GPSSM can efficiently generate plausible future trajectories of the identified system after only observing a small number of episodes from the true system.
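The reparameterisation trick mentioned in the abstract turns sampling from the Gaussian variational posterior into a deterministic transform of standard normal noise, so gradients of the lower bound can flow through the posterior parameters. A minimal sketch, assuming the recognition model (in the paper, a bi-directional RNN) has already produced per-timestep means and log standard deviations; the function name and array shapes are illustrative:

```python
import numpy as np

def sample_latent_states(mu, log_sigma, rng):
    # mu, log_sigma: [T, D] Gaussian parameters for the latent state
    # trajectory; sampling eps ~ N(0, I) and transforming it keeps the
    # draw differentiable with respect to mu and log_sigma
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(log_sigma) * eps

rng = np.random.default_rng(0)
mu = np.zeros((5, 2))
x = sample_latent_states(mu, np.full((5, 2), -20.0), rng)
# with a tiny standard deviation the sample collapses onto the mean
print(np.allclose(x, mu, atol=1e-6))  # -> True
```

In the full model such sampled trajectories are plugged into the Monte Carlo estimate of the lower bound on the marginal likelihood, with any kernel usable inside the GP transition model.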
