Spatial modelling using Gaussian Markov Random Fields: Applied to seabed reconstruction


1 Spatial modelling using Gaussian Markov Random Fields: Applied to seabed reconstruction
Johan Lindström, Finn Lindgren, David Bolin, Håvard Rue, Peter Jonsson
Centre for Mathematical Sciences, Lund University; Engineering Geology, Lund University; Department of Mathematical Sciences, Norwegian University of Science and Technology, Trondheim; Department of Statistics, University of Washington
Seattle, May 4

Overview
- Seabed reconstruction: background, what's done today?
- Spatial statistics: interpolation, computational costs
- Gaussian Markov random fields: basics, approximating Matérn covariances, fast estimation of the field
- A non-stationary extension: the model, estimation
- Results

Background: seabed reconstruction
A data set consists of geometry (x, y, z). The plane coordinates (x, y) are typically provided by GPS; the depth z is measured using a Single Beam Echo Sounder (SBES). Data are collected along transect lines, with dense data along track but no coverage between tracks. Given point measurements of depth:
- How do we reconstruct the seabed?
- Where should the next measurement be?

2 What's done today?
1. Triangulate/grid the intermediate points.
2. Interpolate the values using linear interpolation or splines.
This gives no information about the uncertainties; statistical methods can do better.

Spatial statistics: interpolation
Spatial statistics in general deals with data observed over an area, with dependence between observations taken at different locations. A basic problem in spatial statistics is to use observations Y_i, taken at a number of locations \{s_i\}_{i=1}^{n}, to make inference about the value Y_0 at an unobserved location s_0. Assume the observations come from a Gaussian field,

\begin{bmatrix} Y_0 \\ Y \end{bmatrix} \sim \mathrm{N}\!\left( \begin{bmatrix} \mu_0 \\ \mu \end{bmatrix}, \begin{bmatrix} \Sigma_{00} & \Sigma_{0n} \\ \Sigma_{n0} & \Sigma_{nn} \end{bmatrix} \right).

Given known mean and covariance matrix, the optimal predictor is the conditional mean

\mathrm{E}(Y_0 \mid Y_1, \ldots, Y_n) = \mu_0 + \Sigma_{0n} \Sigma_{nn}^{-1} (Y - \mu).

However, the mean and covariance are generally not known.
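The conditional-mean predictor above (simple kriging) can be sketched in a few lines. This is an illustration only, with a hypothetical exponential covariance function, not code from the talk:

```python
import numpy as np

def krige(y, S, s0, mu, cov):
    """Conditional mean and variance of Y(s0) given observations y at sites S,
    assuming a known constant mean mu and isotropic covariance function cov."""
    d = np.linalg.norm(S[:, None, :] - S[None, :, :], axis=-1)
    Sigma_nn = cov(d)                                  # covariance among observations
    sigma_0n = cov(np.linalg.norm(S - s0, axis=-1))    # covariance to the target site
    w = np.linalg.solve(Sigma_nn, sigma_0n)            # Sigma_nn^{-1} Sigma_n0
    mean = mu + w @ (y - mu)                           # mu_0 + Sigma_0n Sigma_nn^{-1} (Y - mu)
    var = cov(0.0) - w @ sigma_0n                      # prediction variance
    return mean, var

# Hypothetical example: exponential covariance with unit range
cov = lambda h: np.exp(-np.asarray(h, dtype=float))
S = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
y = np.array([1.0, 2.0, 0.5])
m, v = krige(y, S, np.array([0.2, 0.2]), mu=0.0, cov=cov)
```

Predicting at an observed site returns the observation with zero variance, as expected for an interpolator.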

3 Gaussian fields
Assume a parametric form for the covariance and mean, Y \sim \mathrm{N}(\mu_\theta, \Sigma_\theta). The log-likelihood becomes

\log L(\theta \mid Y) = -\tfrac{1}{2} \log|\Sigma_\theta| - \tfrac{1}{2} (Y - \mu_\theta)^{\mathsf T} \Sigma_\theta^{-1} (Y - \mu_\theta).

Estimate the parameters by maximising the log-likelihood; the reconstruction becomes \mathrm{E}(Y_0 \mid Y_1, \ldots, Y_n, \hat\theta). Ideally we would like to integrate out the parameter uncertainty, which is often done using MCMC.

Computational costs
Both optimisation and MCMC require repeated evaluations of the log-likelihood. Evaluating it has two difficulties:
1. The covariance matrix is an n-by-n matrix, which contains n(n + 1)/2 unique elements.
2. Calculating the determinant, inverse, or Cholesky factor of the covariance matrix requires O(n^3) operations.
For the depth data, storing the covariance matrix requires roughly 1 GB, and a single evaluation of the log-likelihood takes about 5 minutes on a Core Duo T8100 laptop.
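The O(n^3) bottleneck is the factorisation inside the log-likelihood. A minimal dense-matrix sketch (not the talk's code) of one evaluation via a Cholesky factor:

```python
import numpy as np

def gauss_loglik(y, mu, Sigma):
    """Gaussian log-likelihood via Sigma = L L'; the O(n^3) Cholesky
    factorisation dominates the cost for dense covariance matrices."""
    L = np.linalg.cholesky(Sigma)                 # O(n^3) factorisation
    r = np.linalg.solve(L, y - mu)                # L^{-1}(y - mu), O(n^2)
    logdet = 2.0 * np.sum(np.log(np.diag(L)))     # log|Sigma| from the factor
    n = len(y)
    return -0.5 * (logdet + r @ r + n * np.log(2.0 * np.pi))
```

Every proposed parameter value in an optimiser or MCMC sampler repeats this factorisation, which is what motivates the sparse GMRF machinery below.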

4 Gaussian Markov random fields (GMRFs)
A Gaussian Markov random field (GMRF) is a Gaussian random field with a Markov property. The neighbours N_i of a point s_i are the points that are, in some sense, close to s_i. The Gaussian field x \sim \mathrm{N}(\mu, Q^{-1}) has a joint distribution that satisfies

p(x_i \mid \{x_j : j \neq i\}) = p(x_i \mid \{x_j : j \in N_i\}),

and, for j \notin N_i,

x_i \perp x_j \mid \{x_k : k \notin \{i, j\}\} \iff Q_{ij} = 0.

The density is

p(x) = \frac{|Q|^{1/2}}{(2\pi)^{n/2}} \exp\!\left( -\tfrac{1}{2} (x - \mu)^{\mathsf T} Q (x - \mu) \right).

Fast algorithms that utilise the sparsity of Q exist (the C package GMRFlib). See Rue and Held (2005) for extensive details on GMRFs.

How do we choose Q?
A GMRF may be computationally effective, but it has been difficult to construct precision matrices that result in reasonable Gaussian fields; various ad hoc methods exist. A common solution is to use a small neighbourhood and let the precision between two points depend on the distance between the points. Rue and Tjelmeland (2002) created GMRFs on rectangular grids that approximate Gaussian fields with a wide class of covariance functions. However, having the field defined only on a regular grid leads to issues with mapping the observations to the grid points.

Matérn covariances
The Matérn covariance family on u \in \mathbb{R}^d:

r(u, v) = \mathrm{C}(x_u, x_v) = \frac{1}{2^{\nu - 1} \Gamma(\nu)} (\kappa \lVert v - u \rVert)^{\nu} K_\nu(\kappa \lVert v - u \rVert),

with scale (inverse range) \kappa > 0 and shape/smoothness \nu > 0. Here K_\nu is a modified Bessel function. Fields with Matérn covariances are solutions to a stochastic partial differential equation (SPDE) (Whittle, 1954),

(\kappa^2 - \Delta)^{\alpha/2} x(u) = \phi \, \mathcal{E}(u),

where \mathcal{E}(u) is spatial white noise and \alpha = \nu + d/2.
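The Matérn family above is easy to evaluate directly. A sketch using SciPy's modified Bessel function, with unit variance (so r(0) = 1):

```python
import numpy as np
from scipy.special import gamma, kv

def matern(h, kappa, nu):
    """Matérn covariance r(h) = 2^(1-nu)/Gamma(nu) * (kappa h)^nu * K_nu(kappa h),
    normalised so that r(0) = 1."""
    h = np.asarray(h, dtype=float)
    r = np.ones_like(h)              # limiting value at h = 0
    m = h > 0
    kh = kappa * h[m]
    r[m] = 2.0 ** (1.0 - nu) / gamma(nu) * kh ** nu * kv(nu, kh)
    return r
```

A convenient sanity check: nu = 1/2 recovers the exponential covariance exp(-kappa h).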

5 Construction of Q (Lindgren and Rue, 2007)
The GMRF approximation is constructed using a finite element method to solve the SPDE. For \alpha = 2 the precision matrix of the approximating GMRF can be written as

Q = (\kappa^2 C + G) C^{-1} C_E C^{-1} (\kappa^2 C + G) = (C^{-1} G + \kappa^2 I)^{\mathsf T} C_E (C^{-1} G + \kappa^2 I).

Here \kappa^2 C + G arises from the finite element approximation of (\kappa^2 - \Delta), and C_E is the precision of the driving spatial white noise; it can be shown that C_E = C. C is tri-diagonal, with dense inverse; to obtain a sparse Q-matrix, Lindgren and Rue use a diagonal approximation \tilde{C}.

Construction of Q: an example
Given a regular grid, and taking \alpha = 2, the finite element approximation of (\kappa^2 - \Delta) has the stencil (with a = 4 + \kappa^2)

     -1
  -1  a  -1
     -1

and the corresponding elements of Q, the stencil convolved with itself, are

              1
         2  -2a   2
     1  -2a  a^2 + 4  -2a   1
         2  -2a   2
              1

For an irregular triangulation things are slightly more complicated.
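The regular-grid example can be reproduced numerically. A sketch assuming the diagonal approximation \tilde{C} = I on a unit grid, so that Q is simply the squared stencil; this mimics the construction rather than reusing the talk's FEM matrices:

```python
import numpy as np
import scipy.sparse as sp

def grid_Q(nx, ny, kappa):
    """Sparse GMRF precision Q = (kappa^2 I + G)^2 for the alpha = 2 SPDE
    on an nx-by-ny grid, taking C ~ I (unit cell area). G is the 5-point
    discrete Laplacian, so each row of K carries the stencil
    [-1; -1, 4 + kappa^2, -1; -1]."""
    Dx = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(nx, nx))
    Dy = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(ny, ny))
    G = -(sp.kron(sp.eye(ny), Dx) + sp.kron(Dy, sp.eye(nx)))  # graph Laplacian
    K = kappa ** 2 * sp.eye(nx * ny) + G                      # kappa^2 C + G with C = I
    return (K @ K).tocsc()                                    # Q = K C^{-1} K
```

Q stays sparse (at most 13 non-zeros per row, the squared stencil), while its inverse, the covariance matrix, is dense.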

6 INLA: fast estimation (Rue and Martino, 2007)
1. Assume an underlying GMRF with possibly non-Gaussian point observations, i.e.

   x \sim \mathrm{N}(\mu_\theta, Q_\theta^{-1}), \qquad p(y \mid x, \theta) = \prod_i p(y_i \mid x_i, \theta).

2. Obtain a Gaussian approximation of the posterior,

   p(x \mid y, \theta) \propto \exp\!\left( -\tfrac{1}{2} (x - \mu)^{\mathsf T} Q (x - \mu) + \sum_i \log p(y_i \mid x_i, \theta) \right),

   through a Taylor expansion of the log-observation density.
3. Use the Gaussian approximation to do numerical optimisation and integration of the log-likelihood.
This provides a fast way of obtaining posteriors; the errors due to the Taylor expansion and numerical integration are usually smaller than the MCMC errors from a reasonable MCMC run.

Results
Posterior densities for the hyperparameters of the model and for the underlying field were estimated using INLA. Total estimation time on a Core Duo laptop: 65 minutes.

Variance as a function of distance
In a stationary model the variance V(X \mid Y) will depend only on the distance to the closest measurement point. If we use the variance to decide where to measure, this implies that we should measure far from existing measurements. The underlying field is most likely non-stationary: we need a non-stationary model and a way of estimating it.
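The Gaussian approximation in step 2 amounts to Newton iteration on the log-posterior. A small dense sketch (not INLA itself), assuming for illustration Poisson observations y_i | x_i ~ Po(exp(x_i)):

```python
import numpy as np

def gaussian_approx(Q, mu, y, n_iter=20):
    """Gaussian approximation of p(x | y) for a GMRF prior x ~ N(mu, Q^{-1})
    and Poisson counts y_i | x_i ~ Po(exp(x_i)): repeated second-order Taylor
    expansion (Newton steps) of the log-posterior around the current x."""
    x = mu.copy()
    for _ in range(n_iter):
        grad = -Q @ (x - mu) + (y - np.exp(x))   # gradient of log p(x, y)
        H = Q + np.diag(np.exp(x))               # negative Hessian = precision
        x = x + np.linalg.solve(H, grad)
    return x, H  # mean and precision of the Gaussian approximation
```

In the real method H keeps the sparsity of Q (the observation term is diagonal), so each Newton step uses the same sparse Cholesky machinery as the Gaussian case.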

7 A non-stationary extension
We now introduce a non-stationary version of the SPDE through two modifications:
1. Drive the SPDE with independent Gaussian noise, but let the variance be a function of the location.
2. Let the range parameter \kappa vary in space.
Taking \alpha = 2 we obtain

(\kappa^2(s) - \Delta) x(s) = \frac{1}{\sqrt{\tau(s)}} \mathcal{E}(s),

where \tau(s) is a spatially varying precision of the driving noise. Introducing diagonal matrices \tau and \kappa^2 with elements \kappa^2_{ii} = \kappa^2(s_i) and \tau_{ii} = \tau(s_i), the non-stationary precision matrix becomes

Q = (C^{-1} G + \kappa^2)^{\mathsf T} C \tau (C^{-1} G + \kappa^2).

The model
Gaussian point observations: Y_j \mid X \sim \mathrm{N}(X(s_j), \sigma^2).
An underlying GMRF: X \sim \mathrm{N}(B_\mu \theta, Q_{\kappa,\tau}^{-1}).
The trend B_\mu \theta is assumed to consist of a constant and a linear trend, with a \mathrm{N}(0, 10^6 I) prior for \theta. The precision \tau and range \kappa are modelled using a set of basis functions (e.g. B-splines),

\tau(s) = \exp(B(s) q_\tau) \quad \text{and} \quad \kappa^2(s) = \exp(B(s) q_\kappa).

Ideally we would like a smoothness prior on \kappa and \tau, e.g. \log \tau(s) \sim \mathrm{N}(0, \alpha_\tau Q_0^{-1}), where the \alpha's are hyper-parameters. This results in the following prior on q_\tau and q_\kappa:

q_\tau \sim \mathrm{N}\!\left( 0, \alpha_\tau (B_q^{\mathsf T} Q_0 B_q)^{-1} \right).

Finally we take hyper-priors on the \alpha's.
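The log-linear basis expansion guarantees positive, smoothly varying parameters. A sketch with hypothetical Gaussian bumps standing in for the B-splines of the talk:

```python
import numpy as np

def basis(s, knots, width=2.0):
    """Evaluate Gaussian-bump basis functions (a stand-in for B-splines)
    at 1-D locations s; one column per knot."""
    s = np.atleast_1d(s)
    return np.exp(-0.5 * ((s[:, None] - knots[None, :]) / width) ** 2)

def kappa2(s, knots, q):
    """kappa^2(s) = exp(B(s) q): positive by construction and smooth in s."""
    return np.exp(basis(s, knots) @ q)

knots = np.linspace(0.0, 10.0, 5)
q = np.array([0.0, 0.5, -0.5, 0.2, 0.0])     # hypothetical basis weights
k2 = kappa2(np.linspace(0.0, 10.0, 21), knots, q)
```

The same construction with weights q_tau gives the noise precision tau(s); estimating q_kappa and q_tau replaces estimating a full function.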

8 Estimation
The posterior density is

p(X, \theta, \kappa, \tau, \alpha, \sigma^2 \mid Y) \propto p(Y \mid X, \sigma^2) \, p(X \mid \theta, \kappa, \tau) \, p(\theta) \, p(\kappa \mid \alpha) \, p(\tau \mid \alpha) \, p(\alpha) \, p(\sigma^2).

(X, \theta \mid \kappa, \tau, \alpha, \sigma^2, Y) is jointly Gaussian, so we can integrate out X and \theta, obtaining p(\kappa, \tau, \alpha, \sigma^2 \mid Y). Further, it is possible to explicitly calculate the derivatives of the log-likelihood \log p(\kappa, \tau, \alpha, \sigma^2 \mid Y). With the derivatives we use an ordinary BFGS algorithm to obtain ML estimates of the parameters. Estimation of the parameters, along with calculation of the conditional posterior expectation and variance, now takes slightly less than one hour on a Core Duo laptop.
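The BFGS step with analytic derivatives can be illustrated on a toy likelihood. A sketch standing in for the full spatial model, fitting the log-standard deviation of i.i.d. Gaussian data with an explicit gradient:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
y = rng.normal(0.0, 2.0, size=1000)     # synthetic data, true sigma = 2

def negloglik(p):
    """Negative Gaussian log-likelihood as a function of log(sigma)."""
    s2 = np.exp(2.0 * p[0])
    return 0.5 * (len(y) * np.log(2.0 * np.pi * s2) + y @ y / s2)

def grad(p):
    """Explicit derivative d/d log(sigma), mirroring the use of analytic
    log-likelihood derivatives in the talk."""
    s2 = np.exp(2.0 * p[0])
    return np.array([len(y) - y @ y / s2])

res = minimize(negloglik, x0=[0.0], jac=grad, method="BFGS")
sigma_hat = np.exp(res.x[0])            # ML estimate, close to the true value
```

Optimising on the log scale keeps the parameter positive without constraints, the same trick used for kappa and tau above.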

9 TODO: unresolved issues
- In the stationary kriging model the variance depended only on the distance to the closest measurement point; this is, however, not the case for the non-stationary field.
- Use the Hessian from the optimisation to run MCMC, to obtain posteriors.
- Measurement errors are probably correlated along the transect lines.
- How do we utilise the estimates to determine where to measure next?
- Priors for the parameters, especially \sigma^2? There is strong dependence between \kappa and \tau.
- How do we select the number of basis functions?

Bibliography
Lindgren, F. and Rue, H. (2007), "Explicit construction of GMRF approximations to generalised Matérn fields on irregular grids", Tech. Rep. 1, Centre for Mathematical Sciences, Lund University, Lund, Sweden.
Rue, H. and Held, L. (2005), Gaussian Markov Random Fields: Theory and Applications, vol. 104 of Monographs on Statistics and Applied Probability, Chapman & Hall/CRC.
Rue, H. and Martino, S. (2007), "Approximate Bayesian inference for hierarchical Gaussian Markov random field models", J. Statist. Plann. and Inference, 137.
Rue, H. and Tjelmeland, H. (2002), "Fitting Gaussian Markov Random Fields to Gaussian Fields", Scand. J. Statist., 29.
Whittle, P. (1954), "On Stationary Processes in the Plane", Biometrika, 41.

This presentation: www.maths.lth.se/matstat/staff/johanl/talks/

10 SPDE issues
Non-uniqueness: if x(u) is a solution to the SPDE for \alpha = 2, so is x(u) + c \exp(\kappa \, e^{\mathsf T} u), for any unit-length vector e and any constant c.
Non-stationarity: on a bounded domain, the SPDE solutions are non-stationary, unless conditioned on suitable boundary distributions.
Practical solution to the non-uniqueness and non-stationarity: zero-normal-derivative (Neumann) boundaries reduce the impact of the null-space solutions. The resulting covariance, for \Omega = [0, L] \subset \mathbb{R}, is

\mathrm{C}(x_u, x_v) \approx r_M(u, v) + r_M(u, -v) + r_M(u, 2L - v) = r_M(0, v - u) + r_M(0, v + u) + r_M(0, 2L - v - u).

Computational costs for GMRFs
The log-likelihood contains a number of different terms; the most costly to compute are \log|Q| and b^{\mathsf T} Q^{-1} b.
1. The Cholesky factor of Q = L L^{\mathsf T} is sparse (possibly after reordering), and can be calculated efficiently.
2. The log-determinant is \log|Q| = 2 \sum_i \log L_{ii}.
3. Q^{-1} b = L^{-\mathsf T} (L^{-1} b), where L^{-1} b can be calculated by solving a sparse triangular equation system.

Computational costs for GMRFs (cont.)
For the derivatives, the additional difficult term is

\frac{\partial}{\partial \theta} \log|Q| = \mathrm{tr}\!\left( Q^{-1} \frac{\partial Q}{\partial \theta} \right).

Due to the sparsity, the trace can be calculated as

\mathrm{tr}\!\left( Q^{-1} \frac{\partial Q}{\partial \theta} \right) = \sum_{i=1}^{n} \sum_{j \in \{i, N_i\}} (Q^{-1})_{ij} \left( \frac{\partial Q}{\partial \theta} \right)_{ji}.

Thus, to calculate the traces we need at most the elements of Q^{-1} that correspond to neighbouring points in the GMRF. Given the sparse Cholesky factor, these elements can be calculated in O(n \log n).

Comparing the stationary and non-stationary model [figure]
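The two costly terms can be read off one sparse factorisation. A sketch using SciPy's SuperLU as a stand-in for a sparse Cholesky (for a symmetric positive definite Q the log-determinants agree, since |det Q| = det Q):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import splu

def gmrf_loglik_terms(Q, b):
    """log|Q| and b' Q^{-1} b for a sparse SPD precision matrix Q, via one
    sparse LU factorisation: L has unit diagonal, so |det Q| is the
    product of |diag(U)|, and the quadratic form uses triangular solves."""
    lu = splu(Q.tocsc())
    logdet = np.sum(np.log(np.abs(lu.U.diagonal())))
    quad = b @ lu.solve(b)              # b' Q^{-1} b without forming Q^{-1}
    return logdet, quad

# Tiny check on a diagonal precision: log|Q| = log 8 and b' Q^{-1} b = 3/4
Q = sp.diags([2.0, 4.0]).tocsc()
logdet, quad = gmrf_loglik_terms(Q, np.array([1.0, 1.0]))
```

Fill-reducing reorderings (done internally by the solver) are what keep the factor, and hence the whole evaluation, far below the dense O(n^3) cost.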


Computer Intensive Methods in Mathematical Statistics Computer Intensive Methods in Mathematical Statistics Department of mathematics johawes@kth.se Lecture 16 Advanced topics in computational statistics 18 May 2017 Computer Intensive Methods (1) Plan of

More information

Spatial smoothing using Gaussian processes

Spatial smoothing using Gaussian processes Spatial smoothing using Gaussian processes Chris Paciorek paciorek@hsph.harvard.edu August 5, 2004 1 OUTLINE Spatial smoothing and Gaussian processes Covariance modelling Nonstationary covariance modelling

More information

Modelling Non-linear and Non-stationary Time Series

Modelling Non-linear and Non-stationary Time Series Modelling Non-linear and Non-stationary Time Series Chapter 2: Non-parametric methods Henrik Madsen Advanced Time Series Analysis September 206 Henrik Madsen (02427 Adv. TS Analysis) Lecture Notes September

More information

Summary STK 4150/9150

Summary STK 4150/9150 STK4150 - Intro 1 Summary STK 4150/9150 Odd Kolbjørnsen May 22 2017 Scope You are expected to know and be able to use basic concepts introduced in the book. You knowledge is expected to be larger than

More information

State Space Representation of Gaussian Processes

State Space Representation of Gaussian Processes State Space Representation of Gaussian Processes Simo Särkkä Department of Biomedical Engineering and Computational Science (BECS) Aalto University, Espoo, Finland June 12th, 2013 Simo Särkkä (Aalto University)

More information

Data Analysis and Uncertainty Part 2: Estimation

Data Analysis and Uncertainty Part 2: Estimation Data Analysis and Uncertainty Part 2: Estimation Instructor: Sargur N. University at Buffalo The State University of New York srihari@cedar.buffalo.edu 1 Topics in Estimation 1. Estimation 2. Desirable

More information

Generative Models and Stochastic Algorithms for Population Average Estimation and Image Analysis

Generative Models and Stochastic Algorithms for Population Average Estimation and Image Analysis Generative Models and Stochastic Algorithms for Population Average Estimation and Image Analysis Stéphanie Allassonnière CIS, JHU July, 15th 28 Context : Computational Anatomy Context and motivations :

More information

COMP 551 Applied Machine Learning Lecture 20: Gaussian processes

COMP 551 Applied Machine Learning Lecture 20: Gaussian processes COMP 55 Applied Machine Learning Lecture 2: Gaussian processes Instructor: Ryan Lowe (ryan.lowe@cs.mcgill.ca) Slides mostly by: (herke.vanhoof@mcgill.ca) Class web page: www.cs.mcgill.ca/~hvanho2/comp55

More information

COM336: Neural Computing

COM336: Neural Computing COM336: Neural Computing http://www.dcs.shef.ac.uk/ sjr/com336/ Lecture 2: Density Estimation Steve Renals Department of Computer Science University of Sheffield Sheffield S1 4DP UK email: s.renals@dcs.shef.ac.uk

More information

arxiv: v1 [stat.co] 16 May 2011

arxiv: v1 [stat.co] 16 May 2011 Fast approximate inference with INLA: the past, the present and the future arxiv:1105.2982v1 [stat.co] 16 May 2011 Daniel Simpson, Finn Lindgren and Håvard Rue Department of Mathematical Sciences Norwegian

More information

ESTIMATING THE MEAN LEVEL OF FINE PARTICULATE MATTER: AN APPLICATION OF SPATIAL STATISTICS

ESTIMATING THE MEAN LEVEL OF FINE PARTICULATE MATTER: AN APPLICATION OF SPATIAL STATISTICS ESTIMATING THE MEAN LEVEL OF FINE PARTICULATE MATTER: AN APPLICATION OF SPATIAL STATISTICS Richard L. Smith Department of Statistics and Operations Research University of North Carolina Chapel Hill, N.C.,

More information

Introduction to Gaussian Processes

Introduction to Gaussian Processes Introduction to Gaussian Processes Iain Murray murray@cs.toronto.edu CSC255, Introduction to Machine Learning, Fall 28 Dept. Computer Science, University of Toronto The problem Learn scalar function of

More information

Kriging models with Gaussian processes - covariance function estimation and impact of spatial sampling

Kriging models with Gaussian processes - covariance function estimation and impact of spatial sampling Kriging models with Gaussian processes - covariance function estimation and impact of spatial sampling François Bachoc former PhD advisor: Josselin Garnier former CEA advisor: Jean-Marc Martinez Department

More information

Scalable kernel methods and their use in black-box optimization

Scalable kernel methods and their use in black-box optimization with derivatives Scalable kernel methods and their use in black-box optimization David Eriksson Center for Applied Mathematics Cornell University dme65@cornell.edu November 9, 2018 1 2 3 4 1/37 with derivatives

More information

Non-stationary Gaussian models with physical barriers

Non-stationary Gaussian models with physical barriers Non-stationary Gaussian models with physical barriers Haakon Bakka; in collaboration with Jarno Vanhatalo, Janine Illian, Daniel Simpson and Håvard Rue King Abdullah University of Science and Technology

More information

Improving posterior marginal approximations in latent Gaussian models

Improving posterior marginal approximations in latent Gaussian models Improving posterior marginal approximations in latent Gaussian models Botond Cseke Tom Heskes Radboud University Nijmegen, Institute for Computing and Information Sciences Nijmegen, The Netherlands {b.cseke,t.heskes}@science.ru.nl

More information

Inference for latent variable models with many hyperparameters

Inference for latent variable models with many hyperparameters Int. Statistical Inst.: Proc. 5th World Statistical Congress, 011, Dublin (Session CPS070) p.1 Inference for latent variable models with many hyperparameters Yoon, Ji Won Statistics Department Lloyd Building,

More information

Physics 403. Segev BenZvi. Parameter Estimation, Correlations, and Error Bars. Department of Physics and Astronomy University of Rochester

Physics 403. Segev BenZvi. Parameter Estimation, Correlations, and Error Bars. Department of Physics and Astronomy University of Rochester Physics 403 Parameter Estimation, Correlations, and Error Bars Segev BenZvi Department of Physics and Astronomy University of Rochester Table of Contents 1 Review of Last Class Best Estimates and Reliability

More information

Gaussian Processes (10/16/13)

Gaussian Processes (10/16/13) STA561: Probabilistic machine learning Gaussian Processes (10/16/13) Lecturer: Barbara Engelhardt Scribes: Changwei Hu, Di Jin, Mengdi Wang 1 Introduction In supervised learning, we observe some inputs

More information

BLIND SEPARATION OF INSTANTANEOUS MIXTURES OF NON STATIONARY SOURCES

BLIND SEPARATION OF INSTANTANEOUS MIXTURES OF NON STATIONARY SOURCES BLIND SEPARATION OF INSTANTANEOUS MIXTURES OF NON STATIONARY SOURCES Dinh-Tuan Pham Laboratoire de Modélisation et Calcul URA 397, CNRS/UJF/INPG BP 53X, 38041 Grenoble cédex, France Dinh-Tuan.Pham@imag.fr

More information

CPSC 540: Machine Learning

CPSC 540: Machine Learning CPSC 540: Machine Learning MCMC and Non-Parametric Bayes Mark Schmidt University of British Columbia Winter 2016 Admin I went through project proposals: Some of you got a message on Piazza. No news is

More information

Fast Direct Methods for Gaussian Processes

Fast Direct Methods for Gaussian Processes Fast Direct Methods for Gaussian Processes Mike O Neil Departments of Mathematics New York University oneil@cims.nyu.edu December 12, 2015 1 Collaborators This is joint work with: Siva Ambikasaran Dan

More information

Bayesian computation using INLA. 5: Spatial Markovian models The SPDE approach

Bayesian computation using INLA. 5: Spatial Markovian models The SPDE approach Bayesian computation using INLA 5: Spatial Markovian models The SPDE approach Outline Spatial models Gaussian random fields Low-dimensional methods Spatial Matérn fields Example: Continuous vs Discrete

More information

COMS 4721: Machine Learning for Data Science Lecture 10, 2/21/2017

COMS 4721: Machine Learning for Data Science Lecture 10, 2/21/2017 COMS 4721: Machine Learning for Data Science Lecture 10, 2/21/2017 Prof. John Paisley Department of Electrical Engineering & Data Science Institute Columbia University FEATURE EXPANSIONS FEATURE EXPANSIONS

More information

Chapter 4 - Fundamentals of spatial processes Lecture notes

Chapter 4 - Fundamentals of spatial processes Lecture notes Chapter 4 - Fundamentals of spatial processes Lecture notes Geir Storvik January 21, 2013 STK4150 - Intro 2 Spatial processes Typically correlation between nearby sites Mostly positive correlation Negative

More information

ECE531 Screencast 5.5: Bayesian Estimation for the Linear Gaussian Model

ECE531 Screencast 5.5: Bayesian Estimation for the Linear Gaussian Model ECE53 Screencast 5.5: Bayesian Estimation for the Linear Gaussian Model D. Richard Brown III Worcester Polytechnic Institute Worcester Polytechnic Institute D. Richard Brown III / 8 Bayesian Estimation

More information

Stat260: Bayesian Modeling and Inference Lecture Date: February 10th, Jeffreys priors. exp 1 ) p 2

Stat260: Bayesian Modeling and Inference Lecture Date: February 10th, Jeffreys priors. exp 1 ) p 2 Stat260: Bayesian Modeling and Inference Lecture Date: February 10th, 2010 Jeffreys priors Lecturer: Michael I. Jordan Scribe: Timothy Hunter 1 Priors for the multivariate Gaussian Consider a multivariate

More information

Lecture 2: From Linear Regression to Kalman Filter and Beyond

Lecture 2: From Linear Regression to Kalman Filter and Beyond Lecture 2: From Linear Regression to Kalman Filter and Beyond Department of Biomedical Engineering and Computational Science Aalto University January 26, 2012 Contents 1 Batch and Recursive Estimation

More information

Bayesian Modeling and Inference for High-Dimensional Spatiotemporal Datasets

Bayesian Modeling and Inference for High-Dimensional Spatiotemporal Datasets Bayesian Modeling and Inference for High-Dimensional Spatiotemporal Datasets Sudipto Banerjee University of California, Los Angeles, USA Based upon projects involving: Abhirup Datta (Johns Hopkins University)

More information

Spatial Statistics with Image Analysis. Lecture L11. Home assignment 3. Lecture 11. Johan Lindström. December 5, 2016.

Spatial Statistics with Image Analysis. Lecture L11. Home assignment 3. Lecture 11. Johan Lindström. December 5, 2016. HA3 MRF:s Simulation Estimation Spatial Statistics with Image Analysis Lecture 11 Johan Lindström December 5, 2016 Lecture L11 Johan Lindström - johanl@maths.lth.se FMSN20/MASM25 L11 1/22 HA3 MRF:s Simulation

More information

Introduction to Bayes and non-bayes spatial statistics

Introduction to Bayes and non-bayes spatial statistics Introduction to Bayes and non-bayes spatial statistics Gabriel Huerta Department of Mathematics and Statistics University of New Mexico http://www.stat.unm.edu/ ghuerta/georcode.txt General Concepts Spatial

More information

Gaussian processes. Chuong B. Do (updated by Honglak Lee) November 22, 2008

Gaussian processes. Chuong B. Do (updated by Honglak Lee) November 22, 2008 Gaussian processes Chuong B Do (updated by Honglak Lee) November 22, 2008 Many of the classical machine learning algorithms that we talked about during the first half of this course fit the following pattern:

More information

Machine Learning - MT & 5. Basis Expansion, Regularization, Validation

Machine Learning - MT & 5. Basis Expansion, Regularization, Validation Machine Learning - MT 2016 4 & 5. Basis Expansion, Regularization, Validation Varun Kanade University of Oxford October 19 & 24, 2016 Outline Basis function expansion to capture non-linear relationships

More information

Gaussian Processes for Machine Learning

Gaussian Processes for Machine Learning Gaussian Processes for Machine Learning Carl Edward Rasmussen Max Planck Institute for Biological Cybernetics Tübingen, Germany carl@tuebingen.mpg.de Carlos III, Madrid, May 2006 The actual science of

More information

GAUSSIAN PROCESS REGRESSION

GAUSSIAN PROCESS REGRESSION GAUSSIAN PROCESS REGRESSION CSE 515T Spring 2015 1. BACKGROUND The kernel trick again... The Kernel Trick Consider again the linear regression model: y(x) = φ(x) w + ε, with prior p(w) = N (w; 0, Σ). The

More information

Model Selection for Gaussian Processes

Model Selection for Gaussian Processes Institute for Adaptive and Neural Computation School of Informatics,, UK December 26 Outline GP basics Model selection: covariance functions and parameterizations Criteria for model selection Marginal

More information

A multi-resolution Gaussian process model for the analysis of large spatial data sets.

A multi-resolution Gaussian process model for the analysis of large spatial data sets. National Science Foundation A multi-resolution Gaussian process model for the analysis of large spatial data sets. Doug Nychka Soutir Bandyopadhyay Dorit Hammerling Finn Lindgren Stephen Sain NCAR/TN-504+STR

More information

Gaussian Processes in Machine Learning

Gaussian Processes in Machine Learning Gaussian Processes in Machine Learning November 17, 2011 CharmGil Hong Agenda Motivation GP : How does it make sense? Prior : Defining a GP More about Mean and Covariance Functions Posterior : Conditioning

More information

Bayesian Inference: Concept and Practice

Bayesian Inference: Concept and Practice Inference: Concept and Practice fundamentals Johan A. Elkink School of Politics & International Relations University College Dublin 5 June 2017 1 2 3 Bayes theorem In order to estimate the parameters of

More information

On Gaussian Process Models for High-Dimensional Geostatistical Datasets

On Gaussian Process Models for High-Dimensional Geostatistical Datasets On Gaussian Process Models for High-Dimensional Geostatistical Datasets Sudipto Banerjee Joint work with Abhirup Datta, Andrew O. Finley and Alan E. Gelfand University of California, Los Angeles, USA May

More information

Frequentist-Bayesian Model Comparisons: A Simple Example

Frequentist-Bayesian Model Comparisons: A Simple Example Frequentist-Bayesian Model Comparisons: A Simple Example Consider data that consist of a signal y with additive noise: Data vector (N elements): D = y + n The additive noise n has zero mean and diagonal

More information

A Framework for Daily Spatio-Temporal Stochastic Weather Simulation

A Framework for Daily Spatio-Temporal Stochastic Weather Simulation A Framework for Daily Spatio-Temporal Stochastic Weather Simulation, Rick Katz, Balaji Rajagopalan Geophysical Statistics Project Institute for Mathematics Applied to Geosciences National Center for Atmospheric

More information

Lecture 2: From Linear Regression to Kalman Filter and Beyond

Lecture 2: From Linear Regression to Kalman Filter and Beyond Lecture 2: From Linear Regression to Kalman Filter and Beyond January 18, 2017 Contents 1 Batch and Recursive Estimation 2 Towards Bayesian Filtering 3 Kalman Filter and Bayesian Filtering and Smoothing

More information

K-Means and Gaussian Mixture Models

K-Means and Gaussian Mixture Models K-Means and Gaussian Mixture Models David Rosenberg New York University October 29, 2016 David Rosenberg (New York University) DS-GA 1003 October 29, 2016 1 / 42 K-Means Clustering K-Means Clustering David

More information

Douglas Nychka, Soutir Bandyopadhyay, Dorit Hammerling, Finn Lindgren, and Stephan Sain. October 10, 2012

Douglas Nychka, Soutir Bandyopadhyay, Dorit Hammerling, Finn Lindgren, and Stephan Sain. October 10, 2012 A multi-resolution Gaussian process model for the analysis of large spatial data sets. Douglas Nychka, Soutir Bandyopadhyay, Dorit Hammerling, Finn Lindgren, and Stephan Sain October 10, 2012 Abstract

More information

Gaussian Markov Random Fields: Theory and Applications

Gaussian Markov Random Fields: Theory and Applications Gaussian Markov Random Fields: Theory and Applications Håvard Rue 1 1 Department of Mathematical Sciences NTNU, Norway September 26, 2008 Part I Definition and basic properties Outline I Introduction Why?

More information

A Process over all Stationary Covariance Kernels

A Process over all Stationary Covariance Kernels A Process over all Stationary Covariance Kernels Andrew Gordon Wilson June 9, 0 Abstract I define a process over all stationary covariance kernels. I show how one might be able to perform inference that

More information

Non-Parametric Bayes

Non-Parametric Bayes Non-Parametric Bayes Mark Schmidt UBC Machine Learning Reading Group January 2016 Current Hot Topics in Machine Learning Bayesian learning includes: Gaussian processes. Approximate inference. Bayesian

More information

Ages of stellar populations from color-magnitude diagrams. Paul Baines. September 30, 2008

Ages of stellar populations from color-magnitude diagrams. Paul Baines. September 30, 2008 Ages of stellar populations from color-magnitude diagrams Paul Baines Department of Statistics Harvard University September 30, 2008 Context & Example Welcome! Today we will look at using hierarchical

More information

Bayesian Regression Linear and Logistic Regression

Bayesian Regression Linear and Logistic Regression When we want more than point estimates Bayesian Regression Linear and Logistic Regression Nicole Beckage Ordinary Least Squares Regression and Lasso Regression return only point estimates But what if we

More information

Computer model calibration with large non-stationary spatial outputs: application to the calibration of a climate model

Computer model calibration with large non-stationary spatial outputs: application to the calibration of a climate model Computer model calibration with large non-stationary spatial outputs: application to the calibration of a climate model Kai-Lan Chang and Serge Guillas University College London, Gower Street, London WC1E

More information

Geostatistical Modeling for Large Data Sets: Low-rank methods

Geostatistical Modeling for Large Data Sets: Low-rank methods Geostatistical Modeling for Large Data Sets: Low-rank methods Whitney Huang, Kelly-Ann Dixon Hamil, and Zizhuang Wu Department of Statistics Purdue University February 22, 2016 Outline Motivation Low-rank

More information

NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET

NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET Investigating posterior contour probabilities using INLA: A case study on recurrence of bladder tumours by Rupali Akerkar PREPRINT STATISTICS NO. 4/2012 NORWEGIAN

More information

Theory of Stochastic Processes 8. Markov chain Monte Carlo

Theory of Stochastic Processes 8. Markov chain Monte Carlo Theory of Stochastic Processes 8. Markov chain Monte Carlo Tomonari Sei sei@mist.i.u-tokyo.ac.jp Department of Mathematical Informatics, University of Tokyo June 8, 2017 http://www.stat.t.u-tokyo.ac.jp/~sei/lec.html

More information