
Simulation of point processes using MCMC
Jakob G. Rasmussen
Department of Mathematics, Aalborg University, Denmark
March 2, 2011
1/14

Today
When do we need MCMC?
Case 1: Fixed number of points
- Metropolis-Hastings, Metropolis, Gibbs sampling
Case 2: Varying number of points
- Birth/death algorithm
- DBC, reversibility, aperiodicity, irreducibility
Practical issues
2/14

When do we need MCMC?
We already know how to simulate:
- Poisson processes
- point processes specified by a conditional intensity
- Cox processes
- cluster processes
But what about Markov processes? Here we need MCMC, where we exploit the fact that Markov processes are usually specified by a density or by the Papangelou conditional intensity.
3/14

MCMC in the case of a fixed number of points
Suppose X has a fixed number of points n(X) = n and density f with respect to the unit rate Poisson process. For example, the Strauss process conditioned on n(X) = n has density
f(x) ∝ β^{n(x)} γ^{s(x)} 1[n(x) = n].
When n(X) = n, the Poisson expansion simplifies to
P(X ∈ F) ∝ ∫_{S^n} 1[{u_1,...,u_n} ∈ F] f({u_1,...,u_n}) du_1 ... du_n.
4/14
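
As a concrete illustration of the conditional Strauss density above, here is a minimal Python sketch (not part of the slides; the function names are illustrative) that evaluates f up to its unknown normalizing constant:

```python
# A minimal sketch (names illustrative): the unnormalized conditional Strauss
# density f(x) = beta^n(x) * gamma^s(x), where s(x) counts unordered pairs of
# points at distance <= R. The normalizing constant is unknown and unneeded.
import numpy as np

def n_close_pairs(x, R):
    """s(x): number of unordered pairs of points in x within distance R."""
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return int(np.sum(d[np.triu_indices(len(x), k=1)] <= R))

def strauss_density_unnorm(x, beta, gamma, R):
    """f(x) up to the unknown normalizing constant; x is an (n, 2) array."""
    return beta ** len(x) * gamma ** n_close_pairs(x, R)
```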

Metropolis-Hastings algorithm (single point update)
Repeat for m = 0, 1, ...: given the current state Y_m = (u_1,...,u_n) (i.e. an ordered point pattern),
1. generate I ~ Unif({1,...,n}),
2. generate V ~ q_I(Y_m, ·),
3. replace u_I with V to obtain the proposal Y_prop = (u_1,...,u_{I-1}, V, u_{I+1},...,u_n),
4. let Y_{m+1} = Y_prop with acceptance probability min{1, r(Y_m, Y_prop)}, where r(Y_m, Y_prop) is the Hastings ratio
r(Y_m, Y_prop) = [f(Y_prop) q_I(Y_prop, u_I)] / [f(Y_m) q_I(Y_m, V)];
otherwise Y_{m+1} = Y_m.
Note: Y_prop and Y_m are treated as unordered point patterns in f.
Initial state: the binomial process is usually a convenient initial state.
5/14
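
A hedged sketch of one such single-point update, under the illustrative assumptions that the window is S = [0,1]^2 and that q_I redraws the chosen point uniformly on S (a symmetric proposal, so the q terms cancel from the Hastings ratio); f_unnorm stands for any density known up to a constant:

```python
# One single-point MH update; assumptions: points live in S = [0,1]^2 and
# the proposal redraws the chosen point uniformly on S (symmetric q).
import numpy as np

rng = np.random.default_rng(0)

def mh_update(y, f_unnorm):
    """One update Y_m -> Y_{m+1}; y is an (n, 2) array of points."""
    i = rng.integers(len(y))                  # 1. I ~ Unif({1,...,n})
    v = rng.uniform(size=2)                   # 2. V ~ q_I(Y_m, .) = Unif(S)
    y_prop = y.copy()
    y_prop[i] = v                             # 3. replace u_I by V
    r = f_unnorm(y_prop) / f_unnorm(y)        # Hastings ratio (symmetric q)
    return y_prop if rng.uniform() < min(1.0, r) else y   # 4. accept/reject
```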

Metropolis algorithm
The MH algorithm with symmetric q_i, i.e.
q_i((u_1,...,u_n), v) = q_i((u_1,...,u_{i-1}, v, u_{i+1},...,u_n), u_i).
Examples of symmetric proposal densities:
q_i((u_1,...,u_n), v) ∝ 1,
q_i((u_1,...,u_n), v) ∝ 1[v ∈ N_{u_i}].
The Hastings ratio then reduces to
r(Y_m, Y_prop) = f(Y_prop) / f(Y_m).
Note that
f(Y_prop) / f(Y_m) = λ*(V, Y_prop \ V) / λ*(u_I, Y_m \ u_I),
so the unknown normalizing constant cancels out, and in the case of a Markov process the computations are local.
6/14
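
For the Strauss process this ratio can be computed locally through the Papangelou conditional intensity λ*(u, x) = β γ^{t(u,x)}, where t(u, x) counts the points of x within distance R of u. A sketch (illustrative names, not from the slides):

```python
# Local Metropolis ratio for the Strauss process:
# f(Y_prop)/f(Y_m) = lambda*(V, x_rest) / lambda*(u_I, x_rest) with
# lambda*(u, x) = beta * gamma^t(u, x); only points within R of u_I and V
# matter, so each update avoids a full density evaluation.
import numpy as np

def papangelou_strauss(u, x, beta, gamma, R):
    """lambda*(u, x) for the Strauss process (x must not contain u)."""
    t = int(np.sum(np.linalg.norm(x - u, axis=1) <= R)) if len(x) else 0
    return beta * gamma ** t

def metropolis_ratio(y, i, v, beta, gamma, R):
    """f(Y_prop)/f(Y_m), where Y_prop replaces point i of y by v."""
    x_rest = np.delete(y, i, axis=0)          # Y_m \ u_i = Y_prop \ V
    return (papangelou_strauss(v, x_rest, beta, gamma, R)
            / papangelou_strauss(y[i], x_rest, beta, gamma, R))
```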

Gibbs sampler
The MH algorithm with
q_i((u_1,...,u_n), v) ∝ f({u_1,...,u_{i-1}, v, u_{i+1},...,u_n}).
Then the Hastings ratio equals one (always accept).
Rejection sampling from the conditional density in the case of a locally stable density, λ*(u, x) ≤ K(u): given Y_m and I = i, for j = 1, 2, ...
1. generate V_j ~ density ∝ K(·),
2. generate R_j ~ Unif([0,1]),
3. return the first V_j for which R_j ≤ λ*(V_j, Y_m \ u_i) / K(V_j).
7/14
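
A sketch of this rejection step, under the simplifying assumption of a constant bound K(u) = K on S = [0,1]^2 (which holds for the Strauss process with γ ≤ 1, where λ*(u, x) ≤ β), so the dominating density proportional to K is just Unif(S):

```python
# Rejection sampling from the conditional density, assuming a constant
# local-stability bound K on S = [0,1]^2.
import numpy as np

rng = np.random.default_rng(1)

def gibbs_draw(x_rest, lam_star, K):
    """Draw V from the density proportional to lambda*(., x_rest)."""
    while True:
        v = rng.uniform(size=2)               # 1. V_j ~ Unif(S)
        r = rng.uniform()                     # 2. R_j ~ Unif([0,1])
        if r <= lam_star(v, x_rest) / K:      # 3. return the first success
            return v
```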

Varying number of points - birth/death algorithm
Proposals: birth or death? Given Y_m = x, p(x) is the probability of proposing a birth and 1 - p(x) the probability of proposing a death.
Birth: a new point is generated from the density q_b(x, ·).
Death: a point is selected from x with probability q_d(x, ·) (note: if Y_m = ∅, then Y_{m+1} = ∅).
Hastings ratios:
Birth: r_b(x, v) = [f(x ∪ v) (1 - p(x ∪ v)) q_d(x ∪ v, v)] / [f(x) p(x) q_b(x, v)]
Death: r_d(x, v) = [f(x \ v) p(x \ v) q_b(x \ v, v)] / [f(x) (1 - p(x)) q_d(x, v)]
Note: the birth/death/move algorithm has all three types of steps.
Initial state: a Poisson process or the empty point pattern is a convenient initial state.
8/14
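
A sketch of one birth/death update under concrete illustrative choices (S = [0,1]^2 so |S| = 1, p(x) = 1/2, uniform birth density q_b ≡ 1, uniform death selection q_d(x, v) = 1/n(x)); with these choices the general ratios above reduce to the expressions in the comments:

```python
# One birth/death step; assumptions: S = [0,1]^2 (|S| = 1), p(x) = 1/2,
# q_b = 1, q_d(x, v) = 1/n(x); f_unnorm is any unnormalized density,
# e.g. the Strauss one sketched earlier.
import numpy as np

rng = np.random.default_rng(2)

def birth_death_update(x, f_unnorm):
    """One birth/death step; x is an (n, 2) array of points in [0,1]^2."""
    n = len(x)
    if rng.uniform() < 0.5:                   # propose a birth
        v = rng.uniform(size=2)
        x_new = np.vstack([x, v])
        # r_b = f(x u v)(1-p)q_d / (f(x) p q_b) = f(x u v) / (f(x)(n+1))
        r = f_unnorm(x_new) / (f_unnorm(x) * (n + 1))
    else:                                     # propose a death
        if n == 0:                            # empty state: stay empty
            return x
        i = rng.integers(n)
        x_new = np.delete(x, i, axis=0)
        # r_d = f(x\v) p q_b / (f(x)(1-p) q_d) = f(x\v) n / f(x)
        r = f_unnorm(x_new) * n / f_unnorm(x)
    return x_new if rng.uniform() < min(1.0, r) else x
```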

Detailed balance condition
Let a_b = min{1, r_b} and a_d = min{1, r_d} denote the acceptance probabilities for a birth and a death.
Detailed balance condition (DBC):
f(x) p(x) q_b(x, v) a_b(x, v) = f(x ∪ v) (1 - p(x ∪ v)) q_d(x ∪ v, v) a_d(x ∪ v, v)
The DBC is satisfied by the birth/death algorithm.
Intuitively, the DBC says that births and deaths are in balance.
9/14
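
As a quick sanity check (not from the slides), one can verify the DBC numerically for a particular x and v under the same illustrative proposals as in the sketch above; both sides agree up to floating-point error:

```python
# Numeric DBC check for the Strauss density on [0,1]^2 with p = 1/2,
# q_b = 1, q_d(x u v, v) = 1/n(x u v); parameters are illustrative.
import numpy as np

beta, gamma, R = 2.0, 0.5, 0.1

def f(x):
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    s = int(np.sum(d[np.triu_indices(len(x), k=1)] <= R))
    return beta ** len(x) * gamma ** s

x = np.array([[0.20, 0.30], [0.25, 0.32], [0.80, 0.80]])
v = np.array([0.22, 0.35])
xv = np.vstack([x, v])

r_b = f(xv) / (f(x) * len(xv))               # birth ratio at x
r_d = f(x) * len(xv) / f(xv)                 # death ratio at x u v
lhs = f(x) * 0.5 * 1.0 * min(1.0, r_b)       # f(x) p q_b a_b
rhs = f(xv) * 0.5 * (1 / len(xv)) * min(1.0, r_d)  # f(x u v)(1-p) q_d a_d
print(np.isclose(lhs, rhs))                  # prints True
```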

Reversibility
Reversibility: if Y_m ~ Π, then
P(Y_m ∈ F, Y_{m+1} ∈ G) = P(Y_m ∈ G, Y_{m+1} ∈ F).
Intuitively, reversibility says that if we take a step in the Markov chain, we are just as likely to take the reverse step.
Let E_n = {x ∈ N_f : n(x) = n, f(x) > 0} and E = {x ∈ N_f : f(x) > 0}.
Reversibility implies that the Markov chain has Π as an invariant distribution.
Proof that the birth/death algorithm is reversible, in three cases:
- rejection: reversibility follows from Y_m = Y_{m+1};
- acceptance of a birth: for F_n ⊆ E_n and G_{n+1} ⊆ E_{n+1}, prove P(Y_m ∈ F_n, Y_{m+1} ∈ G_{n+1}) = P(Y_m ∈ G_{n+1}, Y_{m+1} ∈ F_n);
- acceptance of a death: as for a birth.
10/14

Irreducibility and periodicity
Irreducibility:
Definition: a Markov chain is Ψ-irreducible if Ψ(F) > 0 implies P^m(x, F) > 0 for any x ∈ N_f and some m > 0.
Examples: Ψ(F) = P(X ∈ F) = Π(F), or Ψ(F) = 1[∅ ∈ F].
Note: Ψ-irreducibility implies Π-irreducibility if an invariant distribution Π exists (Proposition 7.2(i), p. 119).
Intuitively, irreducibility says that we can reach any relevant part of N_f.
Periodicity:
Definition: a Markov chain is periodic if N_f = ∪_{j=0}^n D_j with the D_j disjoint, P(x, D_j) = 1 when x ∈ D_{j-1} (j > 1), P(x, D_1) = 1 when x ∈ D_n, and Π(D_0) = 0.
Intuitively, periodicity means that the chain goes in circles (and thus cannot converge to the right distribution).
Roughly speaking, aperiodicity and irreducibility imply that the Markov chain converges to the right distribution (Proposition 7.7(ii), p. 122).
11/14

Aperiodicity and irreducibility of the birth/death chain
Suppose the initial state Y_0 ∈ E, that p(·) < 1, and that for every nonempty x ∈ E there is a v ∈ x with
(1 - p(x)) q_d(x, v) > 0 and f(x \ v) p(x \ v) q_b(x \ v, v) > 0.
Then the birth/death chain is aperiodic and irreducible.
12/14

Birth/death simulation of the Strauss process
Proposal distributions: p(x) = 1/2, q_d(x, v) = 1/n(x) and q_b(x, v) = 1/|S|.
Hastings ratios:
r_b(x, v) = β γ^{∑_{u ∈ x} 1[‖v - u‖ ≤ R]} |S| / n(x ∪ v),
r_d(x, v) = n(x) / (β γ^{∑_{u ∈ x\v} 1[‖v - u‖ ≤ R]} |S|).
What happens if we ignore that γ ≤ 1 is required for the Strauss process to exist?
13/14
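
Putting the slide's ratios into code: a sketch of one birth/death step for the Strauss process on S = [0,1]^2 (so |S| = 1), computing the ratios locally via t(v, x), the number of points of x within distance R of v:

```python
# One birth/death step targeting the Strauss process on [0,1]^2 (|S| = 1),
# with the slide's ratios computed locally; assumes gamma <= 1 for existence.
import numpy as np

rng = np.random.default_rng(3)

def t_close(v, x, R):
    """t(v, x) = sum over u in x of 1[|v - u| <= R]."""
    return int(np.sum(np.linalg.norm(x - v, axis=1) <= R)) if len(x) else 0

def strauss_bd_update(x, beta, gamma, R):
    """One birth/death step; x is an (n, 2) array of points in [0,1]^2."""
    n = len(x)
    if rng.uniform() < 0.5:                   # propose a birth
        v = rng.uniform(size=2)
        r_b = beta * gamma ** t_close(v, x, R) / (n + 1)
        return np.vstack([x, v]) if rng.uniform() < min(1.0, r_b) else x
    if n == 0:                                # death proposed at empty state
        return x
    i = rng.integers(n)
    x_rest = np.delete(x, i, axis=0)
    r_d = n / (beta * gamma ** t_close(x[i], x_rest, R))
    return x_rest if rng.uniform() < min(1.0, r_d) else x
```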

MCMC in practice
Looking at plots of the point patterns Y_m provides little information about whether the chain has approximately converged.
Trace plots of various summary statistics can be useful; for the Strauss process, n(Y_m) and s(Y_m) are convenient.
Comparing trace plots for chains started at different initial states Y_0, e.g. the empty pattern or a Poisson process, can also be useful.
The R package spatstat contains the function rmh for simulating point patterns using the Metropolis-Hastings algorithm.
14/14