State Space Gaussian Processes with Non-Gaussian Likelihoods
1 State Space Gaussian Processes with Non-Gaussian Likelihoods
Hannes Nickisch (Philips Research), Arno Solin (Aalto University), Alexander Grigorievskiy (Aalto University, Silo.AI)
ICML 2018, July 13, 2018
2 Outline
- Gaussian processes
- Temporal GPs as stochastic differential equations (SDEs)
- Learning and inference with Gaussian likelihoods
- Speeding up computation of the state space model parameters
- Non-Gaussian likelihoods
- Approximate inference algorithms
- Computational primitives and how to compute them
- Experiments
3 Definition: Gaussian Processes (GPs)
A Gaussian process (GP) is a stochastic process such that for any inputs t the corresponding outputs y are jointly Gaussian, y ~ N(m(t), K(t, t' | θ)). This is denoted f(t) ~ GP(m(t), k(t, t' | θ)).
- Used as a prior over continuous functions in statistical models
- Properties (e.g. smoothness) are determined by the covariance function k(t, t' | θ)
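A minimal sketch of what the definition means in practice: draws from a zero-mean GP prior are obtained by evaluating the covariance function on a grid of inputs and sampling from the resulting multivariate Gaussian. The Matérn-3/2 covariance and all parameter values here are illustrative choices, not taken from the slides.

```python
import numpy as np

def matern32(t1, t2, magn2=1.0, ell=1.0):
    """Matern-3/2 covariance k(t, t') between two sets of 1-D inputs."""
    r = np.abs(t1[:, None] - t2[None, :]) / ell
    return magn2 * (1.0 + np.sqrt(3.0) * r) * np.exp(-np.sqrt(3.0) * r)

def sample_gp_prior(t, kernel, n_samples=3, jitter=1e-8):
    """Draw samples f ~ N(0, K(t, t)) from a zero-mean GP prior."""
    K = kernel(t, t) + jitter * np.eye(len(t))   # jitter for numerical stability
    L = np.linalg.cholesky(K)
    return L @ np.random.randn(len(t), n_samples)

t = np.linspace(0.0, 10.0, 200)
f = sample_gp_prior(t, matern32)   # three sample paths, shape (200, 3)
```

Changing the covariance function (e.g. its lengthscale) changes the smoothness of the sampled paths, which is the sense in which the kernel encodes the properties of the prior.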
4 Temporal Gaussian Processes
- Input data are 1-D, usually time
- Fully probabilistic (Bayesian) approach
- Structural components are conveniently combined by covariance operations
Challenges:
1. Large datasets
2. Non-Gaussian likelihoods
3. Applicability to unevenly sampled data
5 GP as a Stochastic Differential Equation (SDE): addressing challenge 1
Given a 1-D time series {y_i, t_i}_{i=1}^N, the GP model is
  f(t) ~ GP(m(t), k(t, t'))                  (GP prior)
  y | f ~ ∏_{i=1}^N P(y_i | f(t_i))          (likelihood)
The equivalent stochastic differential equation (SDE) [3] is
  d f(t)/dt = F f(t) + L w(t),   f(0) ~ N(0, P∞)
  y | f ~ ∏_{i=1}^N P(y_i | H f(t_i))
where f(t) on the right-hand side denotes the vector-valued state, the latent GP is recovered as f(t) = H f(t), w(t) is a multidimensional white-noise process, and F, L, H, P∞ are determined from the covariance function k [3].
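To make the SDE form concrete, here is a hedged sketch of the state space matrices for one specific covariance, the Matérn-3/2 (a standard worked example in this literature; the slide itself does not fix a kernel). The stationary state covariance P∞ solves the Lyapunov equation F P∞ + P∞ Fᵀ + q L Lᵀ = 0, which the sketch verifies numerically.

```python
import numpy as np

def matern32_to_ss(magn2=1.0, ell=1.0):
    """State space form (F, L, H, Pinf, q) of the Matern-3/2 kernel."""
    lam = np.sqrt(3.0) / ell
    F = np.array([[0.0, 1.0], [-lam**2, -2.0 * lam]])   # feedback matrix
    L = np.array([[0.0], [1.0]])                        # noise effect matrix
    H = np.array([[1.0, 0.0]])                          # measurement matrix
    q = 4.0 * lam**3 * magn2                            # white-noise spectral density
    Pinf = np.array([[magn2, 0.0], [0.0, lam**2 * magn2]])
    return F, L, H, Pinf, q

F, L, H, Pinf, q = matern32_to_ss()
# Stationarity check: F Pinf + Pinf F^T + q L L^T should vanish
residual = F @ Pinf + Pinf @ F.T + q * (L @ L.T)
```

The marginal variance of the latent process is H P∞ Hᵀ, which recovers the kernel magnitude, so the two representations describe the same prior.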
6 Inference and Learning with a Gaussian Likelihood
Gaussian likelihood: P(y_i | f(t_i)) = N(y_i | f(t_i), σ_n²)
Posterior parameters: W = σ_n⁻² I, α = (K + W⁻¹)⁻¹ (y − m)
Evidence:
  log Z_GPR = −½ αᵀ(y − m) − ½ log|I + σ_n⁻² K| − (N/2) log(2π σ_n²)
The naïve approach has O(N³) complexity. Instead, solve the SDE between time points to obtain an equivalent discrete-time model:
  f_i = A_{i−1} f_{i−1} + q_{i−1},   q_{i−1} ~ N(0, Q_{i−1})
  y_i = H f_i + ε_i,   ε_i ~ N(0, σ_n²)
with parameters A_i = A[Δt_i] = e^{Δt_i F} and Q_i = P∞ − A_i P∞ A_iᵀ.
Inference and learning then run in O(N) via the Kalman filter (KF) and the Rauch-Tung-Striebel (RTS) smoother.
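The O(N) claim can be illustrated with a plain Kalman filter over the discretized model above; as a sanity check, its log marginal likelihood should agree exactly with the dense O(N³) GP formula. This is a sketch assuming a Matérn-3/2 state space model with unit hyperparameters and hypothetical data, not the paper's implementation.

```python
import numpy as np
from scipy.linalg import expm

def kalman_loglik(t, y, F, H, Pinf, sn2):
    """O(N) log marginal likelihood of a state space GP with Gaussian noise."""
    d = F.shape[0]
    m, P = np.zeros((d, 1)), Pinf.copy()
    lZ, t_prev = 0.0, None
    for ti, yi in zip(t, y):
        if t_prev is not None:                 # predict: A_i = e^{dt F}
            A = expm((ti - t_prev) * F)
            m = A @ m
            # A P A^T + Q_i, rewritten using Q_i = Pinf - A Pinf A^T
            P = A @ (P - Pinf) @ A.T + Pinf
        v = yi - (H @ m)[0, 0]                 # innovation
        s = (H @ P @ H.T)[0, 0] + sn2          # innovation variance
        lZ += -0.5 * (np.log(2 * np.pi * s) + v * v / s)
        k = P @ H.T / s                        # Kalman gain
        m, P = m + k * v, P - k @ (H @ P)      # update
        t_prev = ti
    return lZ

# Matern-3/2 state space model with unit magnitude and lengthscale
lam = np.sqrt(3.0)
F = np.array([[0.0, 1.0], [-lam**2, -2.0 * lam]])
H = np.array([[1.0, 0.0]])
Pinf = np.array([[1.0, 0.0], [0.0, lam**2]])

rng = np.random.RandomState(0)
t = np.sort(rng.rand(20) * 5.0)     # unevenly sampled inputs
y = rng.randn(20)
lZ = kalman_loglik(t, y, F, H, Pinf, sn2=0.1)
```

Note how uneven sampling is handled for free: each step simply uses its own Δt in the matrix exponential.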
7 Fast Computation of A_i and Q_i by Interpolation
Problem: when there are many distinct time gaps Δt_i, computing the state space parameters A_i = e^{Δt_i F} and Q_i can be slow.
Solution: the mapping ψ: s ↦ e^{sF} is smooth, hence amenable to interpolation (similar to KISS-GP [4]).
- Evaluate ψ on an equispaced grid s_1, s_2, ..., s_K with s_j = s_0 + j Δs
- Use 4-point interpolation: A_Δt ≈ c_1 A_{j−1} + c_2 A_j + c_3 A_{j+1} + c_4 A_{j+2}
- The coefficients {c_i}_{i=1}^4 are efficiently computable
Figure 1: Relative absolute difference in log Z for different approximation grid sizes K (for A_i and Q_i) when solving a GP regression problem; mean ± min/max errors over 20 independent repetitions.
Figure 2: Empirical computation times of GP prediction (GPML toolbox) as a function of the number of training inputs n and the degree of approximation K (naïve vs. state space with K = 2000 and K = 10). For all four methods the maximum absolute error in the predicted means was about 10⁻⁹; results over ten independent runs.
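The grid idea above can be sketched as follows. Cubic Lagrange weights on four neighbouring grid nodes are used here as an illustrative choice of the coefficients c_1, ..., c_4; the paper's exact interpolation scheme may differ.

```python
import numpy as np
from scipy.linalg import expm

def lagrange4_weights(x):
    """Cubic Lagrange weights for the 4 grid nodes at -1, 0, 1, 2 around x in [0, 1)."""
    nodes = np.array([-1.0, 0.0, 1.0, 2.0])
    w = np.ones(4)
    for i in range(4):
        for j in range(4):
            if i != j:
                w[i] *= (x - nodes[j]) / (nodes[i] - nodes[j])
    return w

def interp_expm(dt, s0, ds, A_grid):
    """Approximate e^{dt F} from matrix exponentials precomputed on an equispaced grid."""
    j = int(np.clip(np.floor((dt - s0) / ds), 1, len(A_grid) - 3))
    x = (dt - (s0 + j * ds)) / ds           # fractional offset within the cell
    c = lagrange4_weights(x)
    return sum(ci * Ai for ci, Ai in zip(c, A_grid[j - 1:j + 3]))

# Precompute the grid once (Matern-3/2 feedback matrix as an example)
lam = np.sqrt(3.0)
F = np.array([[0.0, 1.0], [-lam**2, -2.0 * lam]])
s0, ds, K = 0.0, 0.01, 200
A_grid = [expm((s0 + k * ds) * F) for k in range(K)]

dt = 0.4237                                 # an off-grid time gap
err = np.max(np.abs(interp_expm(dt, s0, ds, A_grid) - expm(dt * F)))
```

The expensive matrix exponentials are thus computed K times instead of once per distinct Δt_i, and each query reduces to a 4-term weighted sum.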
8 Non-Gaussian Likelihoods: addressing challenge 2
Posterior as a Gaussian approximation: Q(f | D) = N(f | m + Kα, (K⁻¹ + W)⁻¹)
- Laplace approximation (LA)
- Variational Bayes (VB)
- Direct Kullback-Leibler minimization (KL)
- Assumed density filtering (ADF), a.k.a. single-sweep expectation propagation (EP)
Laplace approximation:
  log P(f | D) = log P(y | f) + log P(f) + const
- Find the mode f̂ of this function by Newton's method
- The negative Hessian at the mode gives the precision: W = −∇∇ log P(y | f̂)
  log Z_LA = −½ [ αᵀ mvm_K(α) + ld_K(W) − 2 Σ_i log P(y_i | f̂_i) ]
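The Newton iteration for the Laplace mode can be sketched on a toy binary classification problem with a logit likelihood (an illustrative likelihood choice; the data, kernel, and hyperparameters are hypothetical). At convergence the stationarity condition ∇ log P(y | f̂) = K⁻¹ f̂ holds, which serves as a convergence check.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def laplace_mode(K, y, n_iter=25):
    """Newton iterations for the Laplace mode with a logit likelihood.
    y in {-1, +1}; returns the posterior mode f_hat and Hessian diagonal W."""
    n = len(y)
    f = np.zeros(n)
    Kinv = np.linalg.inv(K)          # dense inverse for the sketch; use the primitives in practice
    for _ in range(n_iter):
        pi = sigmoid(f)
        g = (y + 1) / 2 - pi         # gradient of sum_i log P(y_i | f_i)
        W = pi * (1 - pi)            # W = -(Hessian diagonal), always >= 0
        # Newton step: f_new = (K^{-1} + W)^{-1} (W f + g)
        f = np.linalg.solve(Kinv + np.diag(W), W * f + g)
    return f, W

# Toy problem: RBF covariance, labels from the sign of a sine wave
t = np.linspace(0.0, 5.0, 15)
K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2) + 1e-4 * np.eye(15)
y = np.where(np.sin(t) >= 0, 1.0, -1.0)
f_hat, W = laplace_mode(K, y)
```

Because the logit likelihood is log-concave and K is positive definite, the objective is strictly concave and Newton's method converges to the unique mode.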
9 Computational Primitives
The following computational primitives allow casting the covariance approximation in more generic terms:
1. Linear system solving: solve_K(W, r) := (K + W⁻¹)⁻¹ r
2. Matrix-vector multiplication: mvm_K(r) := K r
3. Log-determinant: ld_K(W) := log|B| with the well-conditioned matrix B = I + W^{1/2} K W^{1/2}
4. Predictions, which need the latent mean E[f*] and variance V[f*]
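Dense reference implementations make the definitions concrete; the point of the state space formulation is to evaluate the same quantities in O(N), but O(N³) dense versions like the sketch below are useful for testing. The identity log|B| = log|W| + log|K + W⁻¹| connects the well-conditioned form to the naïve one.

```python
import numpy as np

def make_primitives(K, W):
    """Dense reference versions of the first three primitives.
    W is the diagonal of the (diagonal) precision matrix, given as a vector."""
    sqW = np.sqrt(W)
    B = np.eye(len(W)) + sqW[:, None] * K * sqW[None, :]  # I + W^1/2 K W^1/2
    solve_K = lambda r: np.linalg.solve(K + np.diag(1.0 / W), r)
    mvm_K = lambda r: K @ r
    ld_K = np.linalg.slogdet(B)[1]                        # log |B|, well conditioned
    return solve_K, mvm_K, ld_K

rng = np.random.RandomState(0)
A = rng.randn(5, 5)
K = A @ A.T + 5.0 * np.eye(5)     # a valid covariance matrix
W = rng.rand(5) + 0.5             # positive precision diagonal
solve_K, mvm_K, ld_K = make_primitives(K, W)
```

Working with B rather than K + W⁻¹ matters numerically: B stays well conditioned even when some W entries are tiny, which happens routinely with non-Gaussian likelihoods.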
10 Tackling the Computational Primitives
Using the state space form of temporal GPs:
- SpInGP: the first two primitives are computed with the SpInGP [5] approach. The idea: in state space form, the inverse of the covariance matrix turns out to be block-tridiagonal.
- KF and RTS smoothing: the last two primitives are computed by Kalman filtering and RTS smoothing.
- Predictions are computed by primitive 4 and then propagated through the likelihood.
Comments:
- Derivatives of the computational primitives, required for learning, are computed in a similar way.
- SpInGP involves computations with block-tridiagonal matrices; these are similar to KF and RTS smoothing (see the appendix of [1]).
11 Experiments 2-3
The experiments are designed to support the paper's findings and statements:
1. A robust regression study (Student's t likelihood) with n = 34,154 observations
2. Numerical effects in non-Gaussian likelihoods
12 Experiment 4
- A new data set of commercial airline accident dates scraped from Wikipedia [6]
- Accidents over a time span of 100 years, n = 35,959 days
- The accident intensity is modeled as a log-Gaussian Cox process (Poisson likelihood)
- The GP prior is set up as k(t, t') = k_Mat.(t, t') + k_per.(t, t') k_Mat.(t, t')
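The prior above combines a slowly varying Matérn trend with a quasi-periodic component (a periodic kernel damped by a second Matérn). A sketch of the composition with entirely hypothetical hyperparameters, since the slide does not list them:

```python
import numpy as np

def matern32_r(r, ell):
    z = np.sqrt(3.0) * r / ell
    return (1.0 + z) * np.exp(-z)

def periodic_r(r, ell_p, period):
    return np.exp(-2.0 * np.sin(np.pi * r / period) ** 2 / ell_p**2)

def airline_prior(t1, t2, ell_slow=1000.0, ell_p=1.0, period=365.25, ell_fast=100.0):
    """k(t,t') = k_Mat (slow trend) + k_per (yearly) * k_Mat (decaying periodicity).
    All hyperparameter values are hypothetical placeholders."""
    r = np.abs(t1[:, None] - t2[None, :])
    return matern32_r(r, ell_slow) + periodic_r(r, ell_p, period) * matern32_r(r, ell_fast)

tt = np.linspace(0.0, 3000.0, 40)   # inputs in days, hypothetical grid
Kmat = airline_prior(tt, tt)
```

Sums and products of valid kernels are valid kernels, so the composite remains positive semidefinite; for the state space solver, each component maps to its own block of the joint state.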
13 Conclusions
- This paper brings together research on state space GPs and non-Gaussian approximate inference
- We improve stability and provide additional speed-up via fast computation of the state space model parameters
- We provide unifying code for all approaches in the GPML toolbox v4.2 [7]
- Visit our poster: #151
14 References
[1] H. Nickisch, A. Solin, and A. Grigorievskiy (2018). State Space Gaussian Processes with Non-Gaussian Likelihood. In ICML.
[2] C. E. Rasmussen and C. K. I. Williams (2006). Gaussian Processes for Machine Learning. The MIT Press.
[3] J. Hartikainen and S. Särkkä (2010). Kalman filtering and smoothing solutions to temporal Gaussian process regression models. In MLSP.
[4] A. G. Wilson and H. Nickisch (2015). Kernel Interpolation for Scalable Structured Gaussian Processes (KISS-GP). In ICML.
[5] A. Grigorievskiy, N. Lawrence, and S. Särkkä (2017). Parallelizable Sparse Inverse Formulation Gaussian Processes (SpInGP). In MLSP.
[6] Wikipedia (2018). URL: accidents_and_incidents_involving_commercial_aircraft
[7] C. E. Rasmussen and H. Nickisch (2010). Gaussian Processes for Machine Learning (GPML) Toolbox. In JMLR.
Ryan Turner 1 Marc Peter Deisenroth 1, Carl Edward Rasmussen 1,3 1 Department of Engineering, University of Cambridge, Trumpington Street, Cambridge CB 1PZ, UK Department of Computer Science & Engineering,
More informationGaussian Processes for Audio Feature Extraction
Gaussian Processes for Audio Feature Extraction Dr. Richard E. Turner (ret26@cam.ac.uk) Computational and Biological Learning Lab Department of Engineering University of Cambridge Machine hearing pipeline
More informationEfficient Variational Inference in Large-Scale Bayesian Compressed Sensing
Efficient Variational Inference in Large-Scale Bayesian Compressed Sensing George Papandreou and Alan Yuille Department of Statistics University of California, Los Angeles ICCV Workshop on Information
More informationLarge-scale Collaborative Prediction Using a Nonparametric Random Effects Model
Large-scale Collaborative Prediction Using a Nonparametric Random Effects Model Kai Yu Joint work with John Lafferty and Shenghuo Zhu NEC Laboratories America, Carnegie Mellon University First Prev Page
More information1 Bayesian Linear Regression (BLR)
Statistical Techniques in Robotics (STR, S15) Lecture#10 (Wednesday, February 11) Lecturer: Byron Boots Gaussian Properties, Bayesian Linear Regression 1 Bayesian Linear Regression (BLR) In linear regression,
More informationModel Based Clustering of Count Processes Data
Model Based Clustering of Count Processes Data Tin Lok James Ng, Brendan Murphy Insight Centre for Data Analytics School of Mathematics and Statistics May 15, 2017 Tin Lok James Ng, Brendan Murphy (Insight)
More informationGaussian Processes (10/16/13)
STA561: Probabilistic machine learning Gaussian Processes (10/16/13) Lecturer: Barbara Engelhardt Scribes: Changwei Hu, Di Jin, Mengdi Wang 1 Introduction In supervised learning, we observe some inputs
More informationFast Kronecker Inference in Gaussian Processes with non-gaussian Likelihoods
Fast Kronecker Inference in Gaussian Processes with non-gaussian Likelihoods Seth R. Flaxman 1, Andrew Gordon Wilson 1, Daniel B. Neill 1, Hannes Nickisch 2 and Alexander J. Smola 1 1 Carnegie Mellon University,
More informationOptimization of Gaussian Process Hyperparameters using Rprop
Optimization of Gaussian Process Hyperparameters using Rprop Manuel Blum and Martin Riedmiller University of Freiburg - Department of Computer Science Freiburg, Germany Abstract. Gaussian processes are
More informationDistributed Bayesian Learning with Stochastic Natural-gradient EP and the Posterior Server
Distributed Bayesian Learning with Stochastic Natural-gradient EP and the Posterior Server in collaboration with: Minjie Xu, Balaji Lakshminarayanan, Leonard Hasenclever, Thibaut Lienart, Stefan Webb,
More informationAuto-Encoding Variational Bayes
Auto-Encoding Variational Bayes Diederik P Kingma, Max Welling June 18, 2018 Diederik P Kingma, Max Welling Auto-Encoding Variational Bayes June 18, 2018 1 / 39 Outline 1 Introduction 2 Variational Lower
More informationLinear Dynamical Systems
Linear Dynamical Systems Sargur N. srihari@cedar.buffalo.edu Machine Learning Course: http://www.cedar.buffalo.edu/~srihari/cse574/index.html Two Models Described by Same Graph Latent variables Observations
More information
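The SDE reformulation above is what makes linear-time inference possible: for a Gaussian likelihood, exact GP regression reduces to a Kalman filter over the sorted inputs. The sketch below (not part of the original slides) illustrates this for the simplest case, the exponential (Matérn-1/2) covariance k(t, t') = σ² exp(−|t − t'|/ℓ), whose state space is one-dimensional with F = −1/ℓ, H = 1, and P∞ = σ². The function name and interface are illustrative, not from the paper's code.

```python
import numpy as np

def kalman_gp_filter(t, y, ell=1.0, sigma2=1.0, noise=0.1):
    """Sequential GP regression with an exponential (Matern-1/2) kernel.

    The GP prior f ~ GP(0, sigma2 * exp(-|t - t'| / ell)) is equivalent
    to the scalar SDE df/dt = -f/ell + w(t), so exact inference reduces
    to a Kalman filter over the sorted inputs: O(n) instead of O(n^3).
    """
    m, P = 0.0, sigma2                 # stationary prior: f_0 ~ N(0, P_inf)
    means, variances = [], []
    prev_t = t[0]
    for tk, yk in zip(t, y):
        # Predict: discretize the SDE over the gap dt
        dt = tk - prev_t
        A = np.exp(-dt / ell)          # state transition exp(F * dt)
        Q = sigma2 * (1.0 - A * A)     # process noise keeps P_inf stationary
        m, P = A * m, A * A * P + Q
        # Update: condition on y_k ~ N(f(t_k), noise)
        S = P + noise                  # innovation variance
        K = P / S                      # Kalman gain
        m = m + K * (yk - m)
        P = (1.0 - K) * P
        means.append(m)
        variances.append(P)
        prev_t = tk
    return np.array(means), np.array(variances)
```

Higher-order Matérn kernels follow the same pattern with vector-valued states (matrix F, H, P∞); the per-step cost stays O(d³) in the state dimension d, independent of n.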