Simulation of a Gaussian random vector: A propagative approach to the Gibbs sampler
1 Simulation of a Gaussian random vector: A propagative approach to the Gibbs sampler
C. Lantuéjoul, N. Desassis and F. Fouedjio, MinesParisTech.
Kriging and Gaussian processes for computer experiments, CHORUS Workshop.
2 Outline
Presentation of the problem: reminder and notation; running the Gibbs sampler on large Gaussian random vectors.
Propagative version of the Gibbs sampler: two remarks; basic algorithm; possible extensions.
Applications: simulation of a nonstationary Gaussian random field; conditional simulation of a Cox process.
3 Presentation of the problem
4 Reminder and notation
Let $Y_S = (Y_s, s \in S)$ be a standardized Gaussian random vector with covariance matrix $C$.
(Simple) kriging: if $s \in S$ and $A \subset S$, the kriging of $Y_s$ on $Y_A$ is the linear estimator $Y_s^A = \sum_{\alpha \in A} \lambda_\alpha Y_\alpha$. The weights $\lambda_\alpha$ are chosen so as to minimize the estimation variance $Var\{Y_s - Y_s^A\}$. They are solutions of the linear system
$$\sum_{\beta \in A} \lambda_\beta C_{\alpha,\beta} = C_{\alpha,s}, \qquad \alpha \in A.$$
The estimation variance is
$$\sigma_s^{2A} = Var\{Y_s - Y_s^A\} = 1 - \sum_{\alpha \in A} \lambda_\alpha C_{\alpha,s}.$$
Conditional distribution: $Y_s \mid Y_A = y_A$ is normally distributed. More precisely, we have $Y_s \mid Y_A = y_A \sim N(y_s^A, \sigma_s^{2A})$.
5 Simulation of a Gaussian random vector using the Gibbs sampler
Problem: simulate the Gaussian vector $Y_S$ iteratively.
Algorithm:
(i) reset $y_S^c$;
(ii) put $y_S^n = y_S^c$;
(iii) select $s \sim U(S)$ and generate $y_s^n \sim N\bigl(y_s^{S \setminus s}, \sigma_s^{2, S \setminus s}\bigr)$;
(iv) put $y_S^c = y_S^n$ and go to (ii).
Problem: when $S$ is large, the kriging matrices cannot be inverted.
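A minimal Python sketch of this sampler (illustrative only: the toy covariance, grid size and scan count below are assumptions, not from the slides), usable when $S$ is small enough for $C$ to be inverted. The conditional mean and variance of each component are read off the precision matrix $Q = C^{-1}$:

```python
import numpy as np

# Minimal sketch of the component-wise Gibbs sampler, assuming a small toy
# covariance (exponential on a 1-D index set) so that C can be inverted.
rng = np.random.default_rng(0)
n = 50
idx = np.arange(n)
C = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 10.0)  # standardized: C_ss = 1
Q = np.linalg.inv(C)

y = np.zeros(n)                          # (i) reset y^c_S
for _ in range(100):                     # 100 scans
    for s in rng.permutation(n):         # (iii) visit components in random order
        mean = y[s] - Q[s].dot(y) / Q[s, s]   # kriging estimate y_s^{S\s}
        var = 1.0 / Q[s, s]                   # kriging variance sigma_s^{2,S\s}
        y[s] = mean + np.sqrt(var) * rng.standard_normal()
```

For large $S$ this explicit inversion is exactly what becomes infeasible, which motivates the rest of the talk.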
6 Simulation of a Gaussian random vector using the Gibbs sampler: approximate algorithm
To illustrate the approach, the components of $S$ are depicted as the nodes of a square grid, which makes it possible to consider kriging neighbourhoods as balls $B_{s,r}$ of radius $r$.
Algorithm:
(i) reset $y_S^c$;
(ii) put $y_S^n = y_S^c$;
(iii) select $s \sim U(S)$ and generate $y_s^n \sim N\bigl(y_s^{B_{s,r} \setminus s}, \sigma_s^{2, B_{s,r} \setminus s}\bigr)$;
(iv) put $y_S^c = y_S^n$ and go to (ii).
7 Example (10000 components), spherical covariance (range 10). Neighbourhood radius of … [figure]
8 Example (10000 components), spherical covariance (range 10). Neighbourhood radius of … [figure]
9 [Figure: extrema of the simulated values as a function of the number of scans.]
10 Gibbs sampler for Gaussian random vectors: a propagative version
11 Two useful remarks
Remark 1: let $Y_S \sim N(0, C)$. The kriging estimate $y_s^{S \setminus s}$ and its estimation variance $\sigma_s^{2, S \setminus s}$ can be expressed using the inverse $C^{-1}$ of the matrix $C$:
$$y_s^{S \setminus s} = -\sum_{t \neq s} \frac{C^{-1}_{st}}{C^{-1}_{ss}}\, y_t, \qquad \sigma_s^{2, S \setminus s} = \frac{1}{C^{-1}_{ss}}.$$
Remark 2 (Galli and Gao, 2001): if $X = C^{-1} Y$, then $X \sim N(0, C^{-1})$.
Consequence: as $(C^{-1})^{-1} = C$, the Gibbs sampler can be run exactly on $X$.
Idea: apply the exact Gibbs sampler to $X$ and transport the results to $Y$ using the relation $Y = CX$.
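Remark 1 can be checked numerically on a small hypothetical covariance (the matrix below is illustrative, not from the slides): the kriging estimate and variance obtained by solving the kriging system directly coincide with the quantities read off $C^{-1}$:

```python
import numpy as np

# Numerical check of Remark 1 on a toy standardized covariance (C_ss = 1).
rng = np.random.default_rng(1)
n = 8
idx = np.arange(n, dtype=float)
C = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 3.0)
Q = np.linalg.inv(C)

i = 3
others = [t for t in range(n) if t != i]
# direct simple kriging of component i on all the other components
lam = np.linalg.solve(C[np.ix_(others, others)], C[others, i])
y = rng.standard_normal(n)
krig_direct = lam.dot(y[others])
var_direct = 1.0 - lam.dot(C[others, i])

# same quantities from the inverse covariance matrix (Remark 1)
krig_Q = -sum(Q[i, t] * y[t] for t in others) / Q[i, i]
var_Q = 1.0 / Q[i, i]
```

Both routes give the same kriging estimate and the same kriging variance, which is what makes the exact Gibbs sampler on $X$ so cheap: the precision matrix of $X$ is $C$ itself.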
12 Running the Gibbs sampler on X
Remark: since $X \sim N(0, C^{-1})$ and $C_{ss} = 1$,
$$X_s^{S \setminus s} = -\sum_{t \neq s} \frac{C_{st}}{C_{ss}}\, X_t = -\sum_{t \neq s} C_{st} X_t = X_s - Y_s, \qquad Var\bigl\{X_s - X_s^{S \setminus s}\bigr\} = \frac{1}{C_{ss}} = 1.$$
Algorithm:
(i) reset $x_S^c$;
(ii) put $x_S^n = x_S^c$;
(iii) select $p \sim U(S)$ and generate $x_p^n \sim N(x_p^c - y_p^c, 1)$;
(iv) put $x_S^c = x_S^n$ and go to (ii).
Definition: the point $p$ is called a pivot.
13 Transporting the Gibbs sampler to Y
If $x_p^n \sim N(x_p^c - y_p^c, 1)$, then $x_p^n = x_p^c - y_p^c + u$ with $u \sim N(0,1)$. Then
$$y_s^n = \sum_t C_{st} x_t^n = \sum_{t \neq p} C_{st} x_t^c + C_{sp} x_p^n = y_s^c - C_{sp} x_p^c + C_{sp}(x_p^c - y_p^c + u) = y_s^c + C_{sp}(u - y_p^c).$$
Taking $s = p$, we obtain $y_p^n = u$, which finally gives
$$y_s^n = y_s^c + C_{sp}(y_p^n - y_p^c), \qquad s \in S.$$
Algorithm:
(i) reset $y_S^c$;
(ii) select $p \sim U(S)$ and generate $y_p^n \sim N(0,1)$;
(iii) put $y_s^n = y_s^c + C_{sp}(y_p^n - y_p^c)$ for each $s \in S$;
(iv) put $y_S^c = y_S^n$ and go to (ii).
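This final algorithm needs no matrix inversion at all: each iteration only reads one column of $C$. A NumPy sketch (the toy covariance, vector size and iteration count are assumptions for illustration, not the slides' example):

```python
import numpy as np

# Sketch of the propagative Gibbs sampler: draw a pivot p, resimulate
# y_p ~ N(0, 1), and propagate the change linearly through column p of C.
rng = np.random.default_rng(2)
n = 200
idx = np.arange(n, dtype=float)
C = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 15.0)  # toy covariance, C_ss = 1

y = np.zeros(n)                          # (i) reset y^c_S (here: start from 0)
for _ in range(50 * n):                  # roughly 50 "scans"
    p = rng.integers(n)                  # (ii) select pivot p ~ U(S)
    y_new_p = rng.standard_normal()      #      and generate y_p^n ~ N(0, 1)
    y = y + C[:, p] * (y_new_p - y[p])   # (iii) propagate to every s
```

Because $C_{pp} = 1$, the update leaves the pivot itself at exactly its freshly drawn value, as the derivation above requires.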
14 Example of a vector with … components; spherical covariance function with range … [figure]
15 Variograms of simulations
[Figure: $\gamma(h)$ versus $h$, spherical covariance (range 10).] Simulation variograms obtained after 1, 2, 3, 5, 7, 10, 15 and 20 scans. In black, the variogram model.
16 Generalizations
Generating the pivot value: $y_p^n$ can be generated as a function of $y_p^c$, e.g. $y_p^n \sim N(r y_p^c, 1 - r^2)$. It is sometimes convenient to draw $y^n$ close to $y^c$.
Blocking strategy:
$$y_s^n = y_s^c + C_{sp}(y_p^n - y_p^c) \iff y_s^n - C_{sp}\, y_p^n = y_s^c - C_{sp}\, y_p^c.$$
Thus, kriging residuals are preserved by the propagative approach of the Gibbs sampler. This remark remains valid when the pivot $p$ is replaced by a family $P$ of pivots:
(i) put $y^c = 0$;
(ii) select $P$ at random in $S$ and generate $y_P^n \sim N(0, C_{PP})$;
(iii) put $y_s^n = y_s^c + y_s^{n,P} - y_s^{c,P}$ for each $s \in S$;
(iv) put $y^c = y^n$ and go to (ii).
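The residual-preservation identity is easy to verify for a single pivot update (toy covariance and pivot index chosen for illustration): one propagative move leaves $y_s - C_{sp} y_p$ unchanged for every $s$:

```python
import numpy as np

# Check that one propagative update with pivot p preserves the kriging
# residuals y_s - C_sp * y_p (toy exponential covariance, C_ss = 1).
rng = np.random.default_rng(3)
n = 30
idx = np.arange(n, dtype=float)
C = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 5.0)

y_c = rng.standard_normal(n)             # current state y^c
p = 7                                    # pivot (arbitrary choice)
y_p_new = rng.standard_normal()          # new pivot value y_p^n
y_n = y_c + C[:, p] * (y_p_new - y_c[p]) # propagative update

res_before = y_c - C[:, p] * y_c[p]      # residuals w.r.t. the old pivot value
res_after = y_n - C[:, p] * y_n[p]       # residuals w.r.t. the new pivot value
```

The two residual vectors agree exactly, which is the algebraic content of the blocking remark.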
17 Application: simulation of a nonstationary Gaussian random field
18 Simulation of a nonstationary Gaussian random field
Let $Z$ be a nonstationary Gaussian random field. It can be written as $m + \sigma Y$, where $Y$ is a (not necessarily stationary) standardized Gaussian random field.
Problem: generate $Z$ in the discrete domain $S$.
Notation: let $C$ be the covariance matrix of $Y_S$.
Algorithm:
(i) generate $y_S \sim N(0, C)$;
(ii) put $z_s = m_s + \sigma_s y_s$ for each $s \in S$.
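A short sketch of the two-step algorithm (the drift $m_s$, local standard deviation $\sigma_s$ and covariance below are hypothetical; for brevity step (i) uses a Cholesky factorization instead of the propagative sampler, which serves the same purpose on this small toy case):

```python
import numpy as np

# Two-step simulation of a nonstationary field: simulate a standardized
# field y_S ~ N(0, C), then rescale and shift it pointwise.
rng = np.random.default_rng(4)
n = 100
idx = np.arange(n, dtype=float)
C = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 15.0)

y = np.linalg.cholesky(C) @ rng.standard_normal(n)  # (i) y_S ~ N(0, C)
m = 0.01 * idx               # hypothetical drift m_s
sigma = 1.0 + 0.005 * idx    # hypothetical local standard deviation sigma_s
z = m + sigma * y            # (ii) z_s = m_s + sigma_s * y_s
```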
19 Example: point-source model
$$C_{s,t} = \exp(-\lambda \|s - t\|)\,\exp\Bigl(-\mu\,\bigl|e^{-\nu \|s\|} - e^{-\nu \|t\|}\bigr|\Bigr),$$
with $m = 0$ and $\sigma = 1$; $\lambda = 0.05$, $\mu = 5$ and $\nu = …$ [Figure: simulation field.]
20 Application: conditional simulation of a Cox process
21 Motivation
A piece of land is covered with trees. It is partitioned into congruent plots. The number of trees is known on a number of plots. We want to predict the number of trees on each plot. This can be done by averaging a set of conditional simulations. The spatial distribution of trees is modelled by a Cox process.
22 Cox process
Definition: a Cox process is a Poisson point process with a random intensity function.
Remark: the random intensity function is also called the potential.
23 Properties of the Cox process
Let $N_S = (N_s, s \in S)$ be the numbers of trees on all plots, and let $Z_S$ be their potential.
Mean value: $E\{N_s\} = E\{Z_s\}$.
Covariance between plots:
$$Cov\{N_s, N_t\} = \begin{cases} Cov\{Z_s, Z_t\} & \text{if } s \neq t \\ Var\{Z_s\} + E\{Z_s\} & \text{if } s = t \end{cases}$$
Conditional distribution: given the potential, the numbers of trees on different plots are conditionally independent:
$$P\{N_S = n_S \mid Z_S = z_S\} = \prod_{s \in S} \exp(-z_s)\,\frac{z_s^{n_s}}{n_s!}.$$
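The moment formulas can be checked by Monte Carlo on a single plot with a toy lognormal potential (the values of $\mu$ and $\sigma$ below are chosen for illustration): given $Z$, the count is Poisson, so $E\{N\} = E\{Z\}$ and $Var\{N\} = Var\{Z\} + E\{Z\}$:

```python
import numpy as np

# Monte-Carlo check of the Cox-process moment formulas on one plot.
rng = np.random.default_rng(5)
n_sim = 200_000
mu, sigma = 0.0, 0.5
z = np.exp(mu + sigma * rng.standard_normal(n_sim))  # lognormal potential Z
n_counts = rng.poisson(z)                            # N | Z ~ Poisson(Z)

mean_th = np.exp(mu + sigma**2 / 2)                      # E{Z} (lognormal mean)
var_th = (np.exp(sigma**2) - 1) * mean_th**2 + mean_th   # Var{Z} + E{Z}
```

The empirical mean and variance of `n_counts` match these theoretical values up to Monte-Carlo error.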
24 Cox-lognormal process
The potential is lognormally distributed, i.e. $Z_S = \exp(\mu + \sigma Y_S)$, where $Y_S$ is a standardized Gaussian random vector (covariance matrix $C$). $\mu = …$ and $\sigma = 0.7$; $C_{s,t} = \exp(-\|s-t\|/15)$. [Figure: simulation field, … unit plots.]
25 Conditional simulation of a Cox-lognormal process
Problem: generate $N_S$ given $N_A = n_A$ for some $A \subset S$.
Idea: at each iteration, candidate values are proposed for $y_S$ using the propagative version of the Gibbs sampler. They are accepted or rejected using the Metropolis-Hastings criterion.
Algorithm:
(i) put $y_S^c = 0$;
(ii) generate $p \sim U(S)$ and $y_p^n \sim N(0,1)$;
(iii) put $z_A^c = e^{\mu + \sigma y_A^c}$, $z_A^n = z_A^c\, e^{\sigma C_{Ap}(y_p^n - y_p^c)}$, and generate $u \sim U(0,1)$;
(iv) if $u > p(n_A \mid z_A^n)/p(n_A \mid z_A^c)$, then go to (ii);
(v) put $y_S^c = y_S^c + C_{Sp}(y_p^n - y_p^c)$;
(vi) go to (ii).
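A hedged sketch of this sampler on a toy 1-D grid (the grid, the data plots `A` and the counts `n_A` are invented for illustration, not the slides' field): propose a pivot move, accept or reject with the Metropolis-Hastings ratio of Poisson likelihoods on the data plots, and propagate only when accepted:

```python
import numpy as np

# Metropolis-Hastings within the propagative Gibbs sampler for a
# Cox-lognormal process conditioned on counts n_A observed on plots A.
rng = np.random.default_rng(6)
n = 50
grid = np.arange(n, dtype=float)
C = np.exp(-np.abs(grid[:, None] - grid[None, :]) / 15.0)
mu, sigma = 0.0, 0.7

A = np.array([5, 20, 35])     # plots with observed counts (hypothetical)
n_A = np.array([1, 3, 0])     # hypothetical data n_A

def log_poisson(n_obs, z):
    # log p(n_A | z_A), dropping the n! terms (they cancel in the MH ratio)
    return np.sum(-z + n_obs * np.log(z))

y = np.zeros(n)                              # (i) y^c_S = 0
for _ in range(2000):
    p = rng.integers(n)                      # (ii) pivot and candidate value
    y_p_new = rng.standard_normal()
    z_c = np.exp(mu + sigma * y[A])          # (iii) current and candidate z_A
    z_n = z_c * np.exp(sigma * C[A, p] * (y_p_new - y[p]))
    log_ratio = log_poisson(n_A, z_n) - log_poisson(n_A, z_c)
    if rng.random() < np.exp(log_ratio):     # (iv) accept...
        y = y + C[:, p] * (y_p_new - y[p])   # (v) ...and propagate
```

Given the resulting $y_S$, the conditional counts $N_S$ are then drawn as independent Poisson variables with intensities $e^{\mu + \sigma y_s}$.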
26 20 conditioning data points. [Figure: reality (TL), conditioning data points (TR), and two conditional simulations (BL and BR).]
27 100 conditioning data points. [Figure: reality (TL), conditioning data points (TR), and two conditional simulations (BL and BR).]
28 500 conditioning data points. [Figure: reality (TL), conditioning data points (TR), and two conditional simulations (BL and BR).]
29 Conclusions
An iterative algorithm, derived from the Gibbs sampler, has been proposed for simulating Gaussian random vectors. At each iteration, a component is selected and randomly assigned a new value. This value is then linearly propagated to the other components. Because it does not require the inversion of the covariance matrix, this algorithm does not incur the dimensionality problem of the standard Gibbs sampler. It can be used to simulate random processes related to Gaussian random fields, even if they are nonstationary or subject to soft constraints.
30 References
Desassis N. and Lantuéjoul C. (2011) - Simulation d'un vecteur gaussien : une approche propagative de l'échantillonneur de Gibbs. 43èmes Journées de Statistique, Tunis (Tunisia).
Lantuéjoul C. and Desassis N. (2012) - Simulation of a Gaussian random vector: A propagative version of the Gibbs sampler. 9th Int. Geostatistics Congress, Oslo (Norway).
Fouedjio F. (2014) - Développement de modèles géostatistiques globaux non-stationnaires. PhD thesis, MinesParisTech (to be defended).
31 Rate of convergence
Let $Y^{(n)}$ be the random vector generated at the $n$-th scan.
Systematic scan: if $Y^{(0)} \sim N(0, C^{(0)})$, then $Y^{(n)} \sim N(0, C^{(n)})$. Besides, we have
$$C^{(n)} - C = B^n \bigl(C^{(0)} - C\bigr)\, ({}^t B)^n \qquad \text{where} \qquad B = -(Id + L)^{-1} U.$$
The matrices $Id$, $L$ and $U$ are respectively the identity matrix and the strictly lower and upper parts of $C$:
$$C = Id + L + U.$$
It follows that the rate of convergence is governed by the square of the spectral radius of $B$.
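A numerical illustration on a toy covariance (assumed for this sketch, not from the slides): split $C$ into $Id + L + U$, form the scan matrix $B$, and check that its spectral radius is below one, so the error $C^{(n)} - C$ contracts geometrically:

```python
import numpy as np

# Spectral radius of the systematic-scan iteration matrix for a toy C.
n = 40
idx = np.arange(n, dtype=float)
C = np.exp(-np.abs(idx[:, None] - idx[None, :]) / 5.0)  # SPD, unit diagonal

L = np.tril(C, k=-1)                        # strictly lower part of C
U = np.triu(C, k=1)                         # strictly upper part of C
Id = np.eye(n)
B = -np.linalg.solve(Id + L, U)             # B = -(Id + L)^{-1} U
rho = np.max(np.abs(np.linalg.eigvals(B)))  # spectral radius of B
```

Since $C$ is symmetric positive definite, this is the classical Gauss-Seidel situation and `rho` is strictly smaller than one, confirming convergence at rate `rho**2` per scan.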
More informationMarkov chain Monte Carlo methods in atmospheric remote sensing
1 / 45 Markov chain Monte Carlo methods in atmospheric remote sensing Johanna Tamminen johanna.tamminen@fmi.fi ESA Summer School on Earth System Monitoring and Modeling July 3 Aug 11, 212, Frascati July,
More informationLog Gaussian Cox Processes. Chi Group Meeting February 23, 2016
Log Gaussian Cox Processes Chi Group Meeting February 23, 2016 Outline Typical motivating application Introduction to LGCP model Brief overview of inference Applications in my work just getting started
More informationBayesian networks: approximate inference
Bayesian networks: approximate inference Machine Intelligence Thomas D. Nielsen September 2008 Approximative inference September 2008 1 / 25 Motivation Because of the (worst-case) intractability of exact
More informationMachine Learning for Data Science (CS4786) Lecture 24
Machine Learning for Data Science (CS4786) Lecture 24 Graphical Models: Approximate Inference Course Webpage : http://www.cs.cornell.edu/courses/cs4786/2016sp/ BELIEF PROPAGATION OR MESSAGE PASSING Each
More informationMultimodal Nested Sampling
Multimodal Nested Sampling Farhan Feroz Astrophysics Group, Cavendish Lab, Cambridge Inverse Problems & Cosmology Most obvious example: standard CMB data analysis pipeline But many others: object detection,
More informationOn the spatial distribution of critical points of Random Plane Waves
On the spatial distribution of critical points of Random Plane Waves Valentina Cammarota Department of Mathematics, King s College London Workshop on Probabilistic Methods in Spectral Geometry and PDE
More informationAn introduction to adaptive MCMC
An introduction to adaptive MCMC Gareth Roberts MIRAW Day on Monte Carlo methods March 2011 Mainly joint work with Jeff Rosenthal. http://www2.warwick.ac.uk/fac/sci/statistics/crism/ Conferences and workshops
More informationLecture 8: The Metropolis-Hastings Algorithm
30.10.2008 What we have seen last time: Gibbs sampler Key idea: Generate a Markov chain by updating the component of (X 1,..., X p ) in turn by drawing from the full conditionals: X (t) j Two drawbacks:
More informationspbayes: An R Package for Univariate and Multivariate Hierarchical Point-referenced Spatial Models
spbayes: An R Package for Univariate and Multivariate Hierarchical Point-referenced Spatial Models Andrew O. Finley 1, Sudipto Banerjee 2, and Bradley P. Carlin 2 1 Michigan State University, Departments
More informationF denotes cumulative density. denotes probability density function; (.)
BAYESIAN ANALYSIS: FOREWORDS Notation. System means the real thing and a model is an assumed mathematical form for the system.. he probability model class M contains the set of the all admissible models
More information