Inferring Latent Functions with Gaussian Processes in Differential Equations

1 Inferring Latent Functions with Gaussian Processes in Differential Equations Neil Lawrence School of Computer Science University of Manchester Joint work with Magnus Rattray and Guido Sanguinetti 9th December 2006

2 Outline 1. Transcription Factor Concentrations 2. Exact Inference in Linear System (Toy Problem; Biological Problem) 3. Riemann Quadrature (Linear Response with MLP Kernel; Non-linear Responses)

3 Advert! PUMA Project This work is part of the PUMA project for microarray analysis. We will be appointing a new post-doc in the new year!

4 Online Resources All source code and slides are available online. This talk is available from my home page (see the talks link on the side). Scripts are available in the gpsim toolbox. MATLAB commands used for examples are given in typewriter font.

5 Inference about functions Differential equation models of systems often have functional unknowns. We will focus on a protein concentration example. Gaussian processes (GPs) are probabilistic models for functions [O'Hagan, 1978, 1992, Rasmussen and Williams, 2006]. GPs provide a framework for performing inference about these functions in the presence of uncertainty.

6 Overview Transcription Factor Concentrations Transcription Factor Inference Model the dynamics of gene transcription. Infer transcription factor concentration (TFC). Infer parameters of the transcription model (decay rates, sensitivities etc.) Infer these quantities using a set of known target genes.

7 Methodology Transcription Factor Concentrations Latent function Treat TFC as a latent function in a differential equation model. Assume a Gaussian process (GP) prior distribution for the latent function. Derive GP covariance jointly for genes and transcription factor. Maximise likelihood with respect to parameters (mostly physically meaningful).

8 Overview Transcription Factor Concentrations Background Understanding of cellular processes is improving through microarrays, chromatin immunoprecipitation etc. A quantitative description of regulatory mechanisms requires: transcription factor (TF) concentrations; gene-specific constants such as the baseline expression, mRNA decay rate and sensitivity to TF concentrations (TFCs).

9 Overview Transcription Factor Concentrations Background II These quantities are hard to measure directly. We show how they can be inferred using a systems biology model and Gaussian processes (GPs). Our work provides an extensible, principled framework for attacking this problem.

10 Transcription Factor Concentrations GP Advantages GPs allow for inference of continuous profiles, accounting naturally for temporal structure. GPs avoid cumbersome interpolation to estimate mRNA production rates. GPs deal consistently with the uncertainty inherent in the measurements. GPs outstrip MCMC for computational efficiency. GPs have previously been proposed for solving differential equations [Graepel, 2003] and dynamical systems [Murray-Smith and Pearlmutter].

11 Exact Inference in Linear System Toy Problem Biological Problem p53 Inference [Barenco et al., 2006] Data consists of T measurements of mRNA expression level for N different genes. We relate gene expression, $x_j(t)$, to the TFC, $f(t)$, by
$$\frac{dx_j(t)}{dt} = B_j + S_j f(t) - D_j x_j(t). \qquad (1)$$
$B_j$ is the basal transcription rate of gene $j$, $S_j$ is the sensitivity of gene $j$ and $D_j$ is the decay rate of the mRNA. The dependence of the mRNA transcription rate on the TF is linear.

12 Linear Response Solution Exact Inference in Linear System Toy Problem Biological Problem Solve for TFC Equation (1) can be solved to recover
$$x_j(t) = \frac{B_j}{D_j} + S_j \exp(-D_j t) \int_0^t f(u) \exp(D_j u)\, du. \qquad (2)$$
If we model $f(t)$ as a GP then, as (2) only involves linear operations, $x_j(t)$ is also a GP. If we use the RBF covariance everything is analytic.
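Equation (2) can be checked numerically for any chosen latent function. The talk's own scripts are MATLAB (gpsim toolbox); the following is a minimal NumPy sketch in which the parameters B, S, D and the test function f are invented for illustration.

```python
import numpy as np

def linear_response(f, times, B, S, D, n_quad=400):
    """Evaluate equation (2): x(t) = B/D + S exp(-D t) int_0^t f(u) exp(D u) du,
    using a simple Riemann sum for the integral."""
    x = np.zeros_like(times)
    for i, t in enumerate(times):
        u = np.linspace(0.0, t, n_quad)
        du = u[1] - u[0] if t > 0 else 0.0
        x[i] = B / D + S * np.exp(-D * t) * np.sum(f(u) * np.exp(D * u)) * du
    return x

# Illustrative (made-up) parameters and latent TFC.
B, S, D = 0.5, 1.0, 0.8
f = lambda u: np.exp(-((u - 4.0) ** 2))      # a single Gaussian pulse as the TFC
times = np.linspace(0.0, 12.0, 50)
print(np.round(linear_response(f, times, B, S, D), 3))
```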

13 Covariance Functions Visualisation of RBF Covariance Exact Inference in Linear System Toy Problem Biological Problem RBF Kernel Function
$$k(t, t') = \exp\left(-\frac{(t - t')^2}{l^2}\right)$$
The covariance matrix is built using the time inputs to the function. For the example above it was based on Euclidean distance. The covariance function is also known as a kernel.

14 Covariance Samples Exact Inference in Linear System Toy Problem Biological Problem demcovfuncsample (oxford toolbox) Figure: RBF kernel with l = 0.25, α = 1

15 Covariance Samples Exact Inference in Linear System Toy Problem Biological Problem demcovfuncsample (oxford toolbox) Figure: RBF kernel with l = 1, α = 1
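The two figures above are produced by demcovfuncsample in MATLAB; a rough NumPy equivalent, building the RBF covariance on a grid of times and drawing samples for the two lengthscales shown, might look like this (grid range and sample count are arbitrary choices):

```python
import numpy as np

def rbf_covariance(t, lengthscale, alpha=1.0):
    """RBF covariance k(t, t') = alpha * exp(-(t - t')^2 / l^2) on a grid of times."""
    d = t[:, None] - t[None, :]
    return alpha * np.exp(-(d ** 2) / lengthscale ** 2)

rng = np.random.default_rng(0)
t = np.linspace(-1.0, 1.0, 200)
for lengthscale in (0.25, 1.0):                                   # the two settings in the figures
    K = rbf_covariance(t, lengthscale) + 1e-8 * np.eye(t.size)    # jitter for numerical stability
    samples = rng.multivariate_normal(np.zeros(t.size), K, size=3)
    print(f"l = {lengthscale}: drew samples with shape {samples.shape}")
```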

16 Computation of Joint Covariance Exact Inference in Linear System Toy Problem Biological Problem Covariance Function Computation We rewrite equation (2) as
$$x_j(t) = \frac{B_j}{D_j} + L_j[f](t)$$
where
$$L_j[f](t) = S_j \exp(-D_j t) \int_0^t f(u) \exp(D_j u)\, du \qquad (3)$$
is a linear operator.

17 Induced Covariance Exact Inference in Linear System Toy Problem Biological Problem Gene's Covariance The new covariance function is then given by
$$\mathrm{cov}\left(L_j[f](t), L_k[f](t')\right) = L_j L_k [k_{ff}](t, t'),$$
or more explicitly
$$k_{x_j x_k}(t, t') = S_j S_k \exp(-D_j t - D_k t') \int_0^t \exp(D_j u) \int_0^{t'} \exp(D_k u')\, k_{ff}(u, u')\, du'\, du.$$
With the RBF covariance these integrals are tractable.
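Before invoking the closed form on the next slide, the double integral above can be evaluated by brute force for a single pair of inputs; a NumPy sketch with illustrative parameter values (not those used in the talk):

```python
import numpy as np

def rbf(u, v, l=1.0):
    return np.exp(-((u - v) ** 2) / l ** 2)

def k_xx_numeric(t, tprime, S_j, D_j, S_k, D_k, l=1.0, n=400):
    """Riemann-sum approximation of
    k_{x_j x_k}(t, t') = S_j S_k exp(-D_j t - D_k t')
                         * int_0^t int_0^{t'} exp(D_j u) exp(D_k u') k_ff(u, u') du' du."""
    u = np.linspace(0.0, t, n)
    v = np.linspace(0.0, tprime, n)
    du, dv = u[1] - u[0], v[1] - v[0]
    U, V = np.meshgrid(u, v, indexing="ij")
    integral = np.sum(np.exp(D_j * U) * np.exp(D_k * V) * rbf(U, V, l)) * du * dv
    return S_j * S_k * np.exp(-D_j * t - D_k * tprime) * integral

print(k_xx_numeric(2.0, 3.0, S_j=1.0, D_j=0.5, S_k=0.8, D_k=0.7))
```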

18 Covariance Result Exact Inference in Linear System Toy Problem Biological Problem Covariance Result
$$k_{x_j x_k}(t, t') = S_j S_k \frac{\sqrt{\pi}\, l}{2} \left[ h_{kj}(t', t) + h_{jk}(t, t') \right]$$
where
$$h_{kj}(t', t) = \frac{\exp(\gamma_k^2)}{D_j + D_k} \left\{ \exp\left[-D_k (t' - t)\right] \left[ \mathrm{erf}\left(\frac{t' - t}{l} - \gamma_k\right) + \mathrm{erf}\left(\frac{t}{l} + \gamma_k\right) \right] - \exp\left[-(D_k t' + D_j t)\right] \left[ \mathrm{erf}\left(\frac{t'}{l} - \gamma_k\right) + \mathrm{erf}(\gamma_k) \right] \right\}.$$
Here $\gamma_k = \frac{D_k l}{2}$.

19 Cross Covariance Exact Inference in Linear System Toy Problem Biological Problem Correlation of $x_j(t)$ and $f(t')$ We need the cross-covariance terms between $x_j(t)$ and $f(t')$, obtained as
$$k_{x_j f}(t, t') = S_j \exp(-D_j t) \int_0^t \exp(D_j u)\, k_{ff}(u, t')\, du. \qquad (4)$$
For the RBF covariance we have
$$k_{x_j f}(t, t') = \frac{\sqrt{\pi}\, l\, S_j}{2} \exp(\gamma_j^2) \exp\left[-D_j (t - t')\right] \left[ \mathrm{erf}\left(\frac{t - t'}{l} - \gamma_j\right) + \mathrm{erf}\left(\frac{t'}{l} + \gamma_j\right) \right].$$

20 Posterior for f Exact Inference in Linear System Toy Problem Biological Problem Prediction for TFC Standard Gaussian process regression techniques [see e.g. Rasmussen and Williams, 2006] yield
$$\langle \mathbf{f} \rangle_{\mathrm{post}} = K_{fx} K_{xx}^{-1} \mathbf{x}, \qquad K_{ff}^{\mathrm{post}} = K_{ff} - K_{fx} K_{xx}^{-1} K_{xf}.$$
Model parameters $B_j$, $D_j$ and $S_j$ are estimated by type II maximum likelihood, maximising $\log p(\mathbf{x}) = \log N(\mathbf{x} \mid \mathbf{0}, K_{xx})$.
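The posterior above is ordinary Gaussian conditioning on the covariance blocks. A hedged NumPy sketch of that computation (the block matrices here are random placeholders; in the talk they come from the transcription covariance of the previous slides):

```python
import numpy as np

def gp_posterior(K_ff, K_fx, K_xx, x, jitter=1e-8):
    """Posterior over the TFC given gene expression x:
    f_post = K_fx K_xx^{-1} x,  K_ff^post = K_ff - K_fx K_xx^{-1} K_xf."""
    L = np.linalg.cholesky(K_xx + jitter * np.eye(K_xx.shape[0]))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, x))   # K_xx^{-1} x
    V = np.linalg.solve(L, K_fx.T)                        # L^{-1} K_xf
    return K_fx @ alpha, K_ff - V.T @ V

# Random placeholder joint covariance, just to show the call.
n_f, n_x = 30, 10
rng = np.random.default_rng(1)
A = rng.normal(size=(n_f + n_x, n_f + n_x))
K = A @ A.T
f_mean, f_cov = gp_posterior(K[:n_f, :n_f], K[:n_f, n_f:], K[n_f:, n_f:], rng.normal(size=n_x))
print(f_mean.shape, f_cov.shape)
```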

21 Covariance for Transcription Model Exact Inference in Linear System Toy Problem Biological Problem RBF Kernel function for $f(t)$
$$x_i(t) = \frac{B_i}{D_i} + S_i \exp(-D_i t) \int_0^t f(u) \exp(D_i u)\, du.$$
Joint distribution for $x_1(t)$, $x_2(t)$ and $f(t)$. (Figure: joint covariance matrix over $f(t)$, $x_1(t)$ and $x_2(t)$ for particular values of $D_1$, $S_1$, $D_2$, $S_2$.)

22 Joint Sampling of x(t) and f(t) Exact Inference in Linear System Toy Problem Biological Problem gpsimtest Figure: Left: joint samples from the transcription covariance; blue: f(t), cyan: x_1(t), red: x_2(t). Right: numerical solution for f(t) of the differential equation from x_1(t) and x_2(t) (blue and cyan). True f(t) included for comparison.

26 Noise Corruption Exact Inference in Linear System Toy Problem Biological Problem Estimate Underlying Noise Allow the mRNA abundance of each gene at each time point to be corrupted by noise: for observations at $t_i$, $i = 1, \ldots, T$,
$$y_j(t_i) = x_j(t_i) + \epsilon_j(t_i) \qquad (5)$$
with $\epsilon_j(t_i) \sim N(0, \sigma_{ji}^2)$. Estimate the noise level using probe-level processing techniques for Affymetrix microarrays (e.g. mmgMOS, [Liu et al., 2005]). The covariance of the noisy process is then $K_{yy} = \Sigma + K_{xx}$, with $\Sigma = \mathrm{diag}(\sigma_{11}^2, \ldots, \sigma_{1T}^2, \ldots, \sigma_{N1}^2, \ldots, \sigma_{NT}^2)$.
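The noise only adds a diagonal term to the covariance, and the resulting K_yy is what enters the type II maximum likelihood objective of slide 20. A small sketch (the per-observation variances are random placeholders here; mmgMOS would supply them in practice):

```python
import numpy as np

def gaussian_log_marginal(y, K_yy):
    """log N(y | 0, K_yy), the type II maximum likelihood objective."""
    L = np.linalg.cholesky(K_yy)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.log(np.diag(L)).sum() - 0.5 * y.size * np.log(2.0 * np.pi)

N, T = 5, 7                                     # genes, time points
rng = np.random.default_rng(2)
sigma2 = rng.uniform(0.01, 0.1, size=N * T)     # placeholder variances, ordered sigma^2_{11}..sigma^2_{NT}
K_xx = np.eye(N * T)                            # placeholder covariance of the noise-free process
K_yy = np.diag(sigma2) + K_xx
print(gaussian_log_marginal(rng.normal(size=N * T), K_yy))
```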

27 Artificial Data Exact Inference in Linear System Toy Problem Biological Problem Toy Problem Results from an artificial data set. We used a known TFC and derived six mRNA profiles. The known TFC is composed of three Gaussian basis functions. The mRNA profiles were derived analytically. Fourteen subsamples were taken and corrupted by noise. This data was then used to infer a distribution over plausible TFCs.
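A sketch of how such a toy data set can be generated, using three Gaussian basis functions for the TFC and quadrature of equation (2) for the profiles; the basis centres, gene parameters and noise level below are invented, not those of the talk.

```python
import numpy as np

rng = np.random.default_rng(3)

def tfc(t):
    """Known TFC: sum of three Gaussian basis functions (illustrative centres/widths/heights)."""
    basis = [(2.0, 1.0, 1.0), (6.0, 1.5, 0.6), (9.0, 0.8, 0.9)]     # (centre, width, height)
    return sum(h * np.exp(-((t - c) ** 2) / (2.0 * w ** 2)) for c, w, h in basis)

def gene_profile(times, B, S, D, n_quad=300):
    """x(t) = B/D + S exp(-D t) int_0^t f(u) exp(D u) du, by Riemann sum."""
    x = np.zeros_like(times)
    for i, t in enumerate(times):
        u = np.linspace(0.0, t, n_quad)
        du = u[1] - u[0] if t > 0 else 0.0
        x[i] = B / D + S * np.exp(-D * t) * np.sum(tfc(u) * np.exp(D * u)) * du
    return x

params = [(0.3, 1.0, 0.4), (0.5, 0.7, 0.9), (0.2, 1.2, 0.6)]        # made-up (B, S, D) per gene
obs_t = np.linspace(0.0, 12.0, 14)                                  # fourteen subsamples
noisy = [gene_profile(obs_t, *p) + rng.normal(0.0, 0.05, size=obs_t.size) for p in params]
print(np.round(noisy[0], 3))
```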

28 Artificial Data Results Exact Inference in Linear System Toy Problem Biological Problem demtoyproblem Figure: Left: The TFC, f(t), which drives the system. Middle: Five gene mRNA concentration profiles, each obtained using a different parameter set $\{B_i, S_i, D_i\}_{i=1}^{5}$ (lines), along with noise-corrupted data. Right: The inferred TFC (with error bars).

29 Results Exact Inference in Linear System Toy Problem Biological Problem Linear System A recently published biological data set was studied using the linear response model by Barenco et al. [2006]. The study focused on the tumour suppressor protein p53. mRNA abundance was measured for five targets: DDB2, p21, SESN1/hPA26, BIK and TNFRSF10b. They used quadratic interpolation of the mRNA time courses to obtain production-rate gradients, and MCMC sampling to obtain estimates of the model parameters $B_j$, $S_j$, $D_j$ and $f(t)$.

30 Linear response analysis Exact Inference in Linear System Toy Problem Biological Problem Experimental Setup We analysed the data using the linear response model. Raw data was processed using the mmgMOS model of Liu et al. [2005], which provides the variance as well as the expression level. We present the posterior distribution over TFCs and the results of inference on the values of the hyperparameters $B_j$, $S_j$ and $D_j$. Samples from the posterior distribution were obtained using hybrid Monte Carlo [see e.g. Neal, 1996].

31 Linear Response Results Exact Inference in Linear System Toy Problem Biological Problem dembarenco Figure: Predicted protein concentration for p53. Solid line is the mean, dashed lines 95% credibility intervals. The prediction of Barenco et al. [2006] was pointwise and is shown as crosses.

32 Results Transcription Rates Exact Inference in Linear System Toy Problem Biological Problem Estimation of Equation Parameters dembarenco Figure: Basal transcription rates for DDB2, hPA26, TNFRSF10b, p21 and BIK. Our results (black) compared with Barenco et al. [2006] (white).

33 Results Transcription Rates Exact Inference in Linear System Toy Problem Biological Problem Estimation of Equation Parameters dembarenco Figure: Sensitivities for DDB2, hPA26, TNFRSF10b, p21 and BIK. Our results (black) compared with Barenco et al. [2006] (white).

34 Results Transcription Rates Exact Inference in Linear System Toy Problem Biological Problem Estimation of Equation Parameters dembarenco Figure: Decay rates for DDB2, hPA26, TNFRSF10b, p21 and BIK. Our results (black) compared with Barenco et al. [2006] (white).

35 Linear Response Discussion Exact Inference in Linear System Toy Problem Biological Problem GP Results Note the oscillatory behaviour, a possible artifact of the RBF covariance [see Rasmussen and Williams, 2006, page 123]. Results are in good accordance with those obtained by Barenco et al. Differences in estimates of the basal transcription rates are probably due to: the different methods used for probe-level processing of the microarray data; our failure to constrain f(0) = 0. Our results take about 13 minutes to produce; Barenco et al. required 1 million iterations of Monte Carlo.

36 Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses More Realistic Response All the quantities in equation (1) are positive, but direct samples from a GP will not be. Linear models don't account for saturation. Solution: model the response using a positive non-linear function.

37 Formalism Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses Non-linear Response Introduce a non-linearity $g(\cdot)$ parameterised by $\theta_j$:
$$\frac{dx_j}{dt} = B_j + g(f(t), \theta_j) - D_j x_j$$
$$x_j(t) = \frac{B_j}{D_j} + \exp(-D_j t) \int_0^t g(f(u), \theta_j) \exp(D_j u)\, du.$$
The induced distribution of $x_j(t)$ is no longer a GP. We derive the functional gradient and learn a MAP solution for $f(t)$. We also compute the Hessian so we can approximate the marginal likelihood.

38 Implementation Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses Riemann quadrature Implementation requires a discretised time. We compute the gradient and Hessian on a grid and integrate them by approximate Riemann quadrature. We choose a uniform grid $\{t_p\}_{p=1}^{M}$ so that $\Delta = t_p - t_{p-1}$ is constant. The vector $\mathbf{f} = \{f_p\}_{p=1}^{M}$ is the function $f$ at the grid points.
$$I(t) = \int_0^t f(u) \exp(D_j u)\, du, \qquad I(t) \approx \Delta \sum_{p:\, t_p \le t} f(t_p) \exp(D_j t_p).$$
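A minimal sketch of that quadrature step, with the Heaviside restriction to grid points below t; the grid size and parameter values are arbitrary.

```python
import numpy as np

def quadrature_I(f_grid, t_grid, t, D_j):
    """Approximate I(t) = int_0^t f(u) exp(D_j u) du by a Riemann sum on a uniform grid."""
    delta = t_grid[1] - t_grid[0]                 # constant grid spacing
    mask = t_grid <= t                            # Theta(t - t_p): only grid points below t contribute
    return delta * np.sum(f_grid[mask] * np.exp(D_j * t_grid[mask]))

t_grid = np.linspace(0.0, 10.0, 200)
f_grid = np.sin(t_grid) ** 2                      # placeholder values of f at the grid points
print(quadrature_I(f_grid, t_grid, t=4.0, D_j=0.5))
```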

39 Log Likelihood Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses Functional Gradient Given noise-corrupted data $y_j(t_i)$ the log-likelihood is
$$\log p(\mathbf{y} \mid f, \theta) = -\frac{1}{2} \sum_{i=1}^{T} \sum_{j=1}^{N} \left[ \frac{\left(x_j(t_i) - y_j(t_i)\right)^2}{\sigma_{ji}^2} + \log \sigma_{ji}^2 \right] - \frac{NT}{2} \log(2\pi).$$
The functional derivative of the log-likelihood with respect to $f$ is
$$\frac{\delta \log p(\mathbf{y} \mid f)}{\delta f(t)} = -\sum_{i=1}^{T} \sum_{j=1}^{N} \Theta(t_i - t)\, \frac{x_j(t_i) - y_j(t_i)}{\sigma_{ji}^2}\, g'(f(t))\, e^{-D_j (t_i - t)},$$
where $\Theta(x)$ is the Heaviside step function.

40 Log Likelihood Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses Functional Hessian Given noise-corrupted data $y_j(t_i)$ the log-likelihood is
$$\log p(\mathbf{y} \mid f, \theta) = -\frac{1}{2} \sum_{i=1}^{T} \sum_{j=1}^{N} \left[ \frac{\left(x_j(t_i) - y_j(t_i)\right)^2}{\sigma_{ji}^2} + \log \sigma_{ji}^2 \right] - \frac{NT}{2} \log(2\pi).$$
The negative Hessian of the log-likelihood with respect to $f$ is
$$w(t, t') = \sum_{i=1}^{T} \sum_{j=1}^{N} \Theta(t_i - t)\, \delta(t - t')\, \frac{x_j(t_i) - y_j(t_i)}{\sigma_{ji}^2}\, g''(f(t))\, e^{-D_j (t_i - t)} + \sum_{i=1}^{T} \sum_{j=1}^{N} \frac{\Theta(t_i - t)\, \Theta(t_i - t')}{\sigma_{ji}^2}\, g'(f(t))\, g'(f(t'))\, e^{-D_j (2 t_i - t - t')},$$
where $g'(f) = \partial g / \partial f$ and $g''(f) = \partial^2 g / \partial f^2$.

41 Implementation II Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses Combine with Prior Combine these with the prior to compute the gradient and Hessian of the log posterior $\Psi(\mathbf{f}) = \log p(\mathbf{y} \mid \mathbf{f}) + \log p(\mathbf{f})$ [see Rasmussen and Williams, 2006, chapter 3]:
$$\frac{\partial \Psi(\mathbf{f})}{\partial \mathbf{f}} = \frac{\partial \log p(\mathbf{y} \mid \mathbf{f})}{\partial \mathbf{f}} - K^{-1} \mathbf{f}, \qquad \frac{\partial^2 \Psi(\mathbf{f})}{\partial \mathbf{f}^2} = -(W + K^{-1}), \qquad (6)$$
where $K$ is the prior covariance evaluated at the grid points. Use these to find a MAP solution, $\hat{\mathbf{f}}$, via Newton's algorithm. The Laplace approximation is then
$$\log p(\mathbf{y}) \approx \log p(\mathbf{y} \mid \hat{\mathbf{f}}) - \frac{1}{2} \hat{\mathbf{f}}^{\top} K^{-1} \hat{\mathbf{f}} - \frac{1}{2} \log \left| I + K W \right|. \qquad (7)$$
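The Newton/Laplace machinery in equations (6)-(7) is generic once the log-likelihood, its gradient and the negative Hessian W on the grid are available; a sketch, exercised here with a plain Gaussian likelihood rather than the transcription-specific expressions of the previous slides:

```python
import numpy as np

def newton_laplace(K, loglik, grad_loglik, neg_hess_loglik, n_iter=25, jitter=1e-6):
    """Newton iteration for the MAP grid function f_hat, then the Laplace approximation
    log p(y) ~= log p(y | f_hat) - 0.5 f_hat^T K^{-1} f_hat - 0.5 log |I + K W|."""
    M = K.shape[0]
    K_inv = np.linalg.inv(K + jitter * np.eye(M))
    f = np.zeros(M)
    for _ in range(n_iter):
        grad = grad_loglik(f) - K_inv @ f               # gradient of Psi(f), eq. (6)
        hess = -(neg_hess_loglik(f) + K_inv)            # Hessian of Psi(f)
        f = f - np.linalg.solve(hess, grad)             # Newton step
    W = neg_hess_loglik(f)
    _, logdet = np.linalg.slogdet(np.eye(M) + K @ W)
    return f, loglik(f) - 0.5 * f @ K_inv @ f - 0.5 * logdet   # eq. (7)

# Toy check with a Gaussian likelihood (so Newton converges immediately).
M = 20
t = np.linspace(0.0, 1.0, M)
K = np.exp(-((t[:, None] - t[None, :]) ** 2) / 0.02)
y, sig2 = np.sin(4.0 * t), 0.1
f_hat, log_py = newton_laplace(K,
                               lambda f: -0.5 * np.sum((f - y) ** 2) / sig2,
                               lambda f: -(f - y) / sig2,
                               lambda f: np.eye(M) / sig2)
print(np.round(log_py, 2))
```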

42 Example: linear response Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses Using non-RBF kernels Start by taking $g(\cdot)$ to be linear. This provides a sanity check and a smooth (i.e. non-stochastic) covariance function, and avoids the double numerical integral that would normally be required.

43 Oscillatory Behaviour Fix with MLP Covariance Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses MLP Kernel Function
$$k(t, t') = \alpha \arcsin\left(\frac{w t t' + b}{\sqrt{(w t^2 + b + 1)(w t'^2 + b + 1)}}\right)$$
A non-stationary covariance function [Williams, 1997], derived from a multi-layer perceptron (MLP).
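For reference, the MLP covariance in NumPy with the same parameterisation (the argument of arcsin is bounded in [-1, 1] by construction, so no clipping is needed):

```python
import numpy as np

def mlp_covariance(t, w=1.0, b=1.0, alpha=8.0):
    """MLP (arcsin) covariance of Williams [1997]:
    k(t, t') = alpha * arcsin((w t t' + b) / sqrt((w t^2 + b + 1)(w t'^2 + b + 1)))."""
    num = w * np.outer(t, t) + b
    scale = w * t ** 2 + b + 1.0
    return alpha * np.arcsin(num / np.sqrt(np.outer(scale, scale)))

t = np.linspace(-1.0, 1.0, 100)
print(mlp_covariance(t, w=1.0, b=1.0, alpha=8.0).shape)
```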

44 Covariance Samples Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses demcovfuncsample (oxford toolbox) Figure: RBF kernel with l = 0.25, α = 1

45 Covariance Samples Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses demcovfuncsample (oxford toolbox) Figure: RBF kernel with l = 1, α = 1

46 Covariance Samples Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses demcovfuncsample (oxford toolbox) Figure: MLP kernel with α = 8, w = 1 and b = 1

47 Covariance Samples Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses demcovfuncsample (oxford toolbox) Figure: MLP kernel with α = 8, b = 0 and w = 1

48 Response Results Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses dembarencomap1, dembarencomap Figure: Predicted protein concentration for p53 using MAP estimation with the linear response model. Left: RBF prior on f (log likelihood -11.4); Right: MLP prior on f (log likelihood -15.6). Solid line is the mean prediction, dashed lines are 95% credibility intervals. The prediction of Barenco et al. was pointwise and is shown as crosses.

49 Non-linear response analysis Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses Non-linear responses Exponential response model, $\exp(f)$ (constrains protein concentrations positive). $\log(1 + \exp(f))$ response model. Sigmoidal response model, $3/(1 + \exp(-f))$. Inferred MAP solutions for the latent function $f$ are plotted below.

50 exp(f) Response Results Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses dembarencomap3, dembarencomap Figure: Predicted protein concentration for p53 using an exponential response. Left: results using a squared exponential prior covariance on f (log likelihood -1.6); Right: results using an MLP prior covariance on f (log likelihood -16.4). Solid line is the mean prediction, dashed lines show 95% credibility intervals. The results shown are for exp(f), hence the asymmetry of the credibility intervals.

51 log(1 + exp(f)) Response Results dembarencomap5, dembarencomap6 Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses Figure: Predicted protein concentration for p53 using a log(1 + exp(f)) response. Left: results using a squared exponential prior covariance on f (log likelihood -1.9); Right: results using an MLP prior covariance on f (log likelihood -11.0). Solid line is the mean prediction, dashed lines show 95% credibility intervals. The results shown are for exp(f), hence the asymmetry of the credibility intervals.

52 3/(1 + exp(-f)) Response Results Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses dembarencomap7, dembarencomap Figure: Predicted protein concentration for p53 using a sigmoid response. Left: results using a squared exponential prior covariance on f (log likelihood -14.1); Right: results using an MLP prior covariance on f (log likelihood ). Solid line is the mean prediction, dashed lines show 95% credibility intervals. The results shown are for exp(f), hence the asymmetry of the credibility intervals. The prediction of Barenco et al. was pointwise and is shown as crosses.
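The three response functions used in the results above, together with the first and second derivatives needed by the gradient and Hessian of slides 39-40, are simple to write down; a sketch (the grid of f values is arbitrary):

```python
import numpy as np

# Each entry: g(f), g'(f), g''(f) for the three response models.
sigmoid = lambda f: 1.0 / (1.0 + np.exp(-f))
responses = {
    "exp":              (lambda f: np.exp(f),
                         lambda f: np.exp(f),
                         lambda f: np.exp(f)),
    "log(1 + exp(f))":  (lambda f: np.log1p(np.exp(f)),
                         lambda f: sigmoid(f),
                         lambda f: sigmoid(f) * (1.0 - sigmoid(f))),
    "3/(1 + exp(-f))":  (lambda f: 3.0 * sigmoid(f),
                         lambda f: 3.0 * sigmoid(f) * (1.0 - sigmoid(f)),
                         lambda f: 3.0 * sigmoid(f) * (1.0 - sigmoid(f)) * (1.0 - 2.0 * sigmoid(f))),
}

f = np.linspace(-3.0, 3.0, 7)
for name, (g, g1, g2) in responses.items():
    print(name, np.round(g(f), 3))
```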

53 Discussion Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses We have described how GPs can be used to model the dynamics of a simple regulatory network motif. Our approach has advantages over standard parametric approaches: there is no need to restrict the inference to the observed time points, and the temporal continuity of the inferred functions is accounted for naturally. GPs allow us to handle uncertainty in a natural way. MCMC parameter estimation in a discretised model can be computationally expensive; parameter estimation can be achieved easily in our framework by type II maximum likelihood or by using efficient hybrid Monte Carlo sampling techniques. All code is on-line.

54 Future Directions Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses This is still a very simple modelling situation. We are ignoring transcriptional delays. Here we have a single transcription factor: our ultimate goal is to describe regulatory pathways with more genes. All these issues can be dealt with in the general framework we have described, but we need to overcome the greater computational difficulties.

55 Other Relevant Work Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses Simpler genome-wide models [Sabatti and James, 2006, Sanguinetti et al., 2006b,a]. Better MCMC models [Rogers et al., 2006]. Apologies to anyone missed!

56 Open Question Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses Functional Approximations Without sampling, how can we produce a better posterior approximation?

57 Acknowledgements Riemann Quadrature Linear Response with MLP Kernel Non-linear Responses Data and Support We thank Martino Barenco for useful discussions and for providing the data. We gratefully acknowledge support from BBSRC Grant No BBS/B/76X Improved processing of microarray data with probabilistic models.

58 References
M. Barenco, D. Tomescu, D. Brewer, R. Callard, J. Stark, and M. Hubank. Ranked prediction of p53 targets using hidden variable dynamic modeling. Genome Biology, 7(3):R25, 2006.
T. Graepel. Solving noisy linear operator equations by Gaussian processes: Application to ordinary and partial differential equations. In T. Fawcett and N. Mishra, editors, Proceedings of the International Conference in Machine Learning, volume 20. AAAI Press, 2003.
X. Liu, M. Milo, N. D. Lawrence, and M. Rattray. A tractable probabilistic model for Affymetrix probe-level analysis across multiple chips. Bioinformatics, 21(18), 2005.
R. Murray-Smith and B. A. Pearlmutter. Transformations of Gaussian process priors.
R. M. Neal. Bayesian Learning for Neural Networks. Lecture Notes in Statistics 118. Springer, 1996.
A. O'Hagan. Curve fitting and optimal design for prediction. Journal of the Royal Statistical Society, B, 40:1-42, 1978.
A. O'Hagan. Some Bayesian numerical analysis. In J. M. Bernardo, J. O. Berger, A. P. Dawid, and A. F. M. Smith, editors, Bayesian Statistics 4, Valencia, 1992. Oxford University Press.
C. E. Rasmussen and C. K. I. Williams. Gaussian Processes for Machine Learning. MIT Press, Cambridge, MA, 2006.
S. Rogers, R. Khanin, and M. Girolami. Model based identification of transcription factor activity from microarray data. In Probabilistic Modeling and Machine Learning in Structural and Systems Biology, Tuusula, Finland, 17-18th June 2006.
C. Sabatti and G. M. James. Bayesian sparse hidden components analysis for transcription regulation networks. Bioinformatics, 22(6), 2006.
G. Sanguinetti, N. D. Lawrence, and M. Rattray. Probabilistic inference of transcription factor concentrations and gene-specific regulatory activities. Bioinformatics, 2006a. Advance Access.
G. Sanguinetti, M. Rattray, and N. D. Lawrence. A probabilistic dynamical model for quantitative inference of the regulatory mechanism of transcription. Bioinformatics, 22(14), 2006b.
C. K. I. Williams. Computing with infinite networks. In M. C. Mozer, M. I. Jordan, and T. Petsche, editors, Advances in Neural Information Processing Systems, volume 9, Cambridge, MA, 1997. MIT Press.
