Bayesian Modeling of Inhomogeneous Marked Spatial Point Patterns with Location Dependent Mark Distributions


1 Bayesian Modeling of Inhomogeneous Marked Spatial Point Patterns with Location Dependent Mark Distributions
Matt Bognar
Department of Statistics and Actuarial Science, University of Iowa
December 4, 2014

2

3 A spatial point pattern describes the spatial location of events in a region W:
- location of particles in a fluid or gas
- location of trees in a forest
- location of bird nests on an island
A mark (or multiple marks) may be associated with each point in the pattern:
- type of particle
- trunk diameter
- species of bird

4 Ant Nests (Harkness & Isham 1983)
Marked spatial point pattern of n = 62 ant nests in a field in northern Greece (subset of the original dataset): Messor wasmanni and Cataglyphis bicolor.

5 Complete Spatial Randomness (CSR)
Ant nests on the left, CSR on the right: the right plot shows 47 and 15 nest locations randomly (uniformly) generated in W.
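
A CSR pattern like the right-hand plot is easy to generate: conditional on the number of points, locations are iid uniform on W. A minimal sketch, assuming a rectangular window with hypothetical 100 × 100 dimensions (the actual ants window is irregular):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_binomial_process(n, width, height):
    """Place n points uniformly at random on [0, width] x [0, height]:
    CSR conditioned on the number of points."""
    return rng.uniform(low=(0.0, 0.0), high=(width, height), size=(n, 2))

# mimic the right-hand plot: 47 and 15 uniformly placed nests
messor = simulate_binomial_process(47, 100.0, 100.0)
cataglyphis = simulate_binomial_process(15, 100.0, 100.0)
```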

6 Ant Nests: Observations
- Higher intensity in the lower region. Modeling spatial inhomogeneity is no big deal; see Ogata & Katsura (1988), Bognar (2005), etc.
- Within species: fewer close pairs of points than under CSR. This regular spacing is called spatial regularity or spatial inhibition; addressed in various papers.
- Between species: inter-point distances are not unlike CSR; there are close pairs and distant pairs of points.
- Higher proportion of Cataglyphis bicolor in the lower region, so the mark distribution appears to depend upon location. Ogata & Katsura (1988) addressed this for Poisson processes (i.e. assuming no interaction between points), which spurred the current work: can we model interaction between points too?

7

8 Setup
Let W be a bounded subset of $\mathbb{R}^2$ or $\mathbb{R}^3$. A spatial point process X is a finite random subset of W; a realization of X is called a spatial point pattern (SPP). A SPP is denoted by $x = \{x_1, \ldots, x_n\}$, i.e. x consists of n distinct points in W.

9 Density of a Spatial Point Process
The density of a spatial point process X (with respect to the unit rate Poisson process) is
$$f(x) \stackrel{\text{def}}{=} \frac{g(x)}{Z}$$
where g(x) is the unnormalized density (or kernel) and Z is the normalization constant (or partition function)
$$Z = \sum_{n=0}^{\infty} \frac{e^{-|W|}}{n!} \int_W \cdots \int_W g(\{x_1, \ldots, x_n\}) \, dx_1 \cdots dx_n$$
(when n = 0, interpret the integral to equal 1). Z is typically intractable. Assume the unnormalized density g is integrable, i.e. $Z < \infty$. The density f is normalized in the sense that
$$\sum_{n=0}^{\infty} \frac{e^{-|W|}}{n!} \int_W \cdots \int_W f(\{x_1, \ldots, x_n\}) \, dx_1 \cdots dx_n = 1$$

10 Gibbs Point Process
Hammersley-Clifford-Ripley-Kelly Theorem (Ripley & Kelly 1977, Baddeley et al. 2004, 2013): a finite Gibbs point process has probability density (with respect to the unit rate Poisson process) of the form
$$f(x) = \frac{g(x)}{Z} \stackrel{\text{def}}{=} \frac{\exp[-U(x)]}{Z}$$
where the potential function (or energy function) is
$$U(x) = \sum_{i=1}^{n} V^{(1)}(x_i) + \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} V^{(2)}(x_i, x_j) + \cdots$$
$V^{(i)}$ = i-th order potential (maps sub-configurations with i points into $\mathbb{R} \cup \{\infty\}$). The normalizer
$$Z = \sum_{n=0}^{\infty} \frac{e^{-|W|}}{n!} \int_W \cdots \int_W g(x) \, dx_1 \cdots dx_n$$
is typically intractable.

11 Special Cases
Poisson point processes: $V^{(1)}$ is the highest order potential function. Homogeneous: $V^{(1)}(x_i) = -\log(\kappa)$ (where κ is the intensity); inhomogeneous: $V^{(1)}(x_i) = -\log(\kappa(x_i))$. Z is tractable (hence the popularity of the Poisson process), but it cannot model interaction between points.
Pairwise interaction point processes: $V^{(2)}$ is the highest order interaction (Ripley 1977). Z is intractable; can model interaction between points.
Triplets process: $V^{(3)}$ is the highest order interaction (Geyer 1999). Z is intractable; can model interaction between points.
Assume $V^{(i)} \geq 0$ when $i \geq 2$ $\Rightarrow$ integrability (i.e. $Z < \infty$) (Møller et al. 2006). This restricts us to repulsive or inhibitory spatial point processes; better models exist for modeling clustering.

12 Selected Literature Baddeley & Turner (2000), Bognar (2005), Bognar & Cowles (2004), Diggle (2003), Diggle et al. (1987, 1994), Geyer (1999), Harkness & Isham (1983), Heikkinen & Penttinen (1999), Mateu & Montes (2001), Møller & Waagepetersen (2004), Ogata & Tanemura (1981, 1984, 1986, 1989), Penttinen (1984), Stoyan & Stoyan (1998), Takacs (1986) and van Lieshout (2000)

13 Marked

14 Marked
Let M denote the mark space and define $\Gamma = W \times M = \{(\omega_1, \omega_2) : \omega_1 \in W, \omega_2 \in M\}$. Let $(x, m) = (x_1, \ldots, x_n, m_1, \ldots, m_n)$. The density of a marked Gibbs point process has the form
$$f(x, m) = \frac{g(x, m)}{Z} \stackrel{\text{def}}{=} \frac{\exp[-U(x, m)]}{Z}$$
with energy function
$$U(x, m) = \sum_{i=1}^{n} V^{(1)}(x_i, m_i) + \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} V^{(2)}(x_i, x_j, m_i, m_j) + \cdots$$
and partition function
$$Z = \sum_{n=0}^{\infty} \frac{e^{-|W|}}{n!} \int_\Gamma \cdots \int_\Gamma g(x, m) \, d(x_1, m_1) \cdots d(x_n, m_n)$$

15 Goulard et al. (1996)
Homogeneous marked pairwise interacting point processes with mark dependent intensity. Specification:
$V^{(1)}(x_i, m_i) = -\log[\alpha(m_i)]$
$V^{(2)}(x_i, x_j, m_i, m_j) = \phi(x_i, x_j, m_i, m_j)$
$V^{(m)}(\cdot) \equiv 0$ for $m \geq 3$
α(·) > 0 is the mark chemical activity function (the intensity can depend upon the mark). φ(·) ≥ 0 is the mark pair potential function: it describes interaction between marked points, and the interaction can depend upon the marks. The mark distribution does not change with location. The normalization constant Z is intractable; Goulard et al. performed a maximum pseudo-likelihood analysis. Bognar (2008) used a Bayesian framework: computational, with natural point and interval estimation.

16 Ogata & Katsura (1988)
Inhomogeneous marked Poisson point process with location dependent mark distributions. Specification:
$V^{(1)}(x_i, m_i) = -\log[\kappa(x_i) \, p(m_i \mid x_i)]$
$V^{(m)}(\cdot) \equiv 0$ for $m \geq 2$
κ(x) > 0 is the intensity at location x; p(· | x) is the mark distribution at location x, with $\int_M p(m \mid x) \, dm = 1$ (summation in the discrete case). The normalization constant Z reduces to that of the unmarked inhomogeneous case (it is known in closed form). They performed a Bayesian analysis (using splines). The model does not allow for interaction between points.

17 Proposed Model
Inhomogeneous marked pairwise interacting point processes with location dependent mark distribution. Specification:
$V^{(1)}(x_i, m_i) = -\log[\kappa(x_i) \, p(m_i \mid x_i)]$
$V^{(2)}(x_i, x_j, m_i, m_j) = \phi(x_i, x_j, m_i, m_j)$
$V^{(m)}(\cdot) \equiv 0$ for $m \geq 3$
Intractable normalization constant Z.

18 Notes
Maximum pseudo-likelihood techniques are popular since they avoid the intractable Z; see Goulard et al. (1996), Jensen & Møller (1991), Baddeley & Turner (2000). The frequentist framework provides for point estimation, but the complex distributional properties of the estimates make interval estimation difficult; Baddeley et al. (2005) formally explores the use of parametric bootstrap techniques. Selected literature: Baddeley & Møller (1989), Degenhardt (1999), Fiksel (1984), Ogata & Tanemura (1985), or Stoyan & Penttinen (2000). Let's use a Bayesian approach.

19

20 We need to model the:
- inhomogeneous intensity
- location dependent parameters of the mark distribution
A Voronoi tessellation on W partitions the space into tiles; tile i consists of all points in W closer to generating point i than to any other generating point. Associate a constant height with each tile; this yields a (discontinuous) surface over W. Certainly planes, splines, other tessellation techniques, etc. could be used to model the process intensity as well.

21 [Figure: example of a Voronoi tessellation with 4 tiles]

22 V(1): Intensity κ(·)
κ(·) is modeled by a Voronoi tessellation on W:
- generating points $C^\kappa = (C^\kappa_1, \ldots, C^\kappa_{k_\kappa})$ where $C^\kappa_i \in W$ for $i = 1, \ldots, k_\kappa$
- tile heights $H^\kappa = (H^\kappa_1, \ldots, H^\kappa_{k_\kappa})$ where $H^\kappa_i > 0$ for $i = 1, \ldots, k_\kappa$
Define $\gamma^\kappa = (C^\kappa, H^\kappa)$, the $2k_\kappa$ parameters describing κ(·). Let $\kappa_{\gamma^\kappa}(x)$ denote the height of the intensity tile at location x, i.e. if point x resides in tile j, then $\kappa_{\gamma^\kappa}(x) = H^\kappa_j$.
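
Evaluating the tessellation surface at a point just means finding the nearest generating point and returning that tile's height. A minimal sketch (the function and variable names are illustrative, not from the talk):

```python
import numpy as np

def tile_height(x, C, H):
    """Height of the Voronoi tile containing location x.

    x : (2,) location in W
    C : (k, 2) generating points
    H : (k,)   tile heights
    """
    j = np.argmin(np.sum((C - x) ** 2, axis=1))  # index of nearest generator
    return H[j]

# e.g. an intensity surface kappa_{gamma^kappa}(.) with k_kappa = 4 tiles
C_kappa = np.array([[10.0, 10.0], [10.0, 90.0], [90.0, 10.0], [90.0, 90.0]])
H_kappa = np.array([0.004, 0.001, 0.003, 0.001])
kappa_at_x = tile_height(np.array([25.0, 30.0]), C_kappa, H_kappa)
```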

23 V(1): Mark Distribution p(· | x)
Assume p(m | x) is indexed by M parameters $\eta^p = (\eta^p_1, \ldots, \eta^p_M)$, and that each parameter $\eta^p_i$ can spatially vary. Model $\eta^p_i$ with a Voronoi tessellation on W:
- generating points $C^p_i = (C^p_{i1}, \ldots, C^p_{ik_i})$
- tile heights $H^p_i = (H^p_{i1}, \ldots, H^p_{ik_i})$
- $\gamma^p_i = (C^p_i, H^p_i)$ for $i = 1, \ldots, M$
This gives $2\sum_{i=1}^{M} k_i$ parameters. Planes, splines, etc. on W could be used as well.

24 V(1): p(· | x) for Ant Nests
Let $m_i = 0$ for Messor wasmanni and $m_i = 1$ for Cataglyphis bicolor, and let π(x) = probability of Cataglyphis bicolor at location x. Model π(x) with a Voronoi tessellation on W:
- generating points $C^p = (C^p_1, \ldots, C^p_{k_p})$
- tile heights $H^p = (H^p_1, \ldots, H^p_{k_p})$
Let $\gamma^p = (C^p, H^p)$, and let $\pi_{\gamma^p}(x)$ denote the height of the tessellation tile at location x, i.e. if point x resides in tile j, then $\pi_{\gamma^p}(x) = H^p_j$. Location dependent mark distribution: $m \mid x, \gamma^p \sim \text{Bern}(\pi_{\gamma^p}(x))$, with $2k_p$ parameters.
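
With the same nearest-generator lookup, drawing a species label at any location is a single Bernoulli draw; a sketch with illustrative values (C_p and H_p stand in for C^p and H^p):

```python
import numpy as np

rng = np.random.default_rng(1)

def mark_prob(x, C_p, H_p):
    """pi_{gamma^p}(x): the height of the probability tile containing x."""
    j = np.argmin(np.sum((C_p - x) ** 2, axis=1))
    return H_p[j]

C_p = np.array([[50.0, 80.0], [50.0, 20.0]])  # two hypothetical generators
H_p = np.array([0.2, 0.7])                    # P(Cataglyphis) in each tile
x = np.array([40.0, 25.0])
m = rng.binomial(1, mark_prob(x, C_p, H_p))   # m | x ~ Bern(pi(x))
```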

25 V(1) Specification for Ant Nests
V(1) is indexed by $\gamma = (\gamma^\kappa, \gamma^p)$. The first order potential function is
$$V^{(1)}_\gamma(x_i, m_i) = -\log\left[\kappa_{\gamma^\kappa}(x_i) \, \pi_{\gamma^p}(x_i)^{m_i} \, (1 - \pi_{\gamma^p}(x_i))^{1 - m_i}\right]$$
for $i = 1, \ldots, n$.

26 Since spatial interaction will depend upon species, let (generalizing Strauss 1975)
$$V^{(2)}_\psi(x_i, x_j, m_i, m_j) = h_{00} I(m_i = m_j = 0) I(\|x_i - x_j\| < b_{00}) + h_{01} I(m_i \neq m_j) I(\|x_i - x_j\| < b_{01}) + h_{11} I(m_i = m_j = 1) I(\|x_i - x_j\| < b_{11})$$
where $\|x_i - x_j\|$ is the Euclidean distance between $x_i$ and $x_j$, and $\psi = (b_{00}, h_{00}, b_{01}, h_{01}, b_{11}, h_{11})$. The Straussian parameters $h_{\cdot\cdot} \geq 0$ describe the strength of inhibition between pairs of points: $h_{\cdot\cdot} > 0$ indicates spatial regularity, and $h_{\cdot\cdot} = 0$ corresponds to no spatial interaction. The interaction distances $b_{\cdot\cdot} > 0$ describe the distance at which two points cease to interact.
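
This potential is just a sum of indicator penalties over pairs; a direct sketch (the dict-based interface is illustrative; the h values are posterior means reported later in the talk, with the b's fixed at 45 as in the earlier literature):

```python
import numpy as np

def V2(xi, xj, mi, mj, h, b):
    """Generalized Strauss pair potential for one pair of marked points.
    h, b are dicts keyed by the species pair: '00', '01', '11'."""
    key = '01' if mi != mj else ('00' if mi == 0 else '11')
    d = np.linalg.norm(np.asarray(xi) - np.asarray(xj))  # Euclidean distance
    return h[key] if d < b[key] else 0.0

h = {'00': 0.556, '01': 0.285, '11': 2.628}  # inhibition strengths
b = {'00': 45.0, '01': 45.0, '11': 45.0}     # interaction distances
print(V2([0.0, 0.0], [30.0, 0.0], 1, 1, h, b))  # 2.628: close Cataglyphis pair
```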

27 Let $\theta = (\gamma, \psi)$. The density is
$$f(x, m \mid \theta) = \frac{g(x, m \mid \theta)}{Z(\theta)} \stackrel{\text{def}}{=} \frac{\exp[-U(x, m \mid \theta)]}{Z(\theta)}$$
with energy function
$$U(x, m \mid \theta) = \sum_{i=1}^{n} V^{(1)}_\gamma(x_i, m_i) + \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} V^{(2)}_\psi(x_i, x_j, m_i, m_j)$$
The normalizer is an intractable function of the parameters:
$$Z(\theta) = \sum_{n=0}^{\infty} \frac{e^{-|W|}}{n!} \int_\Gamma \cdots \int_\Gamma g(x, m \mid \theta) \, d(x_1, m_1) \cdots d(x_n, m_n)$$
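
Everything the samplers below need is the unnormalized log density log g(x, m | θ) = −U(x, m | θ); Z(θ) never has to be evaluated. A sketch that takes the V(1) and V(2) evaluators as arguments (an illustrative interface, not the talk's C++ code):

```python
import numpy as np

def total_energy(points, marks, kappa_fn, pi_fn, V2_fn):
    """U(x, m | theta): first-order terms plus all pairwise interactions.
    kappa_fn(x) and pi_fn(x) evaluate the tessellation surfaces;
    V2_fn(xi, xj, mi, mj) evaluates the pair potential."""
    U = 0.0
    for xi, mi in zip(points, marks):
        p = pi_fn(xi)
        # V1(x, m) = -log[kappa(x) * pi(x)^m * (1 - pi(x))^(1 - m)]
        U -= np.log(kappa_fn(xi) * p**mi * (1.0 - p)**(1 - mi))
    n = len(points)
    for i in range(n - 1):
        for j in range(i + 1, n):
            U += V2_fn(points[i], points[j], marks[i], marks[j])
    return U

def log_g(points, marks, kappa_fn, pi_fn, V2_fn):
    """log of the unnormalized density: log g(x, m | theta) = -U(x, m | theta)."""
    return -total_energy(points, marks, kappa_fn, pi_fn, V2_fn)
```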

28 Priors
Let $k_\kappa = k_p = 4$ (these need not be equal); the k's are a sort of smoothing parameter.
$V^{(1)}_\gamma$ parameters:
$C^\kappa_1, \ldots, C^\kappa_4 \stackrel{iid}{\sim} \text{Unif}(W)$, $H^\kappa_1, \ldots, H^\kappa_4 \stackrel{iid}{\sim} \text{Unif}(0, 100)$
$C^p_1, \ldots, C^p_4 \stackrel{iid}{\sim} \text{Unif}(W)$, $H^p_1, \ldots, H^p_4 \stackrel{iid}{\sim} \text{Unif}(0, 1)$
$V^{(2)}_\psi$ parameters:
$h_{00}, h_{01}, h_{11} \stackrel{iid}{\sim} \text{Unif}[0, 100)$, $b_{ij} \stackrel{iid}{\sim} \text{Unif}(\min_{i<j} \|x_i - x_j\|, 100)$ for $ij = 00, 01, 11$
Let $p(\theta) = p(\gamma, \psi)$ denote the joint prior. Bayesian inference will be based upon MCMC simulations from the full posterior distribution $p(\theta \mid x, m) \propto f(x, m \mid \theta) \, p(\theta)$.
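
With uniform priors, the log prior is constant on its support and −∞ off of it, so in acceptance ratios only the support checks matter. A sketch with the bounds above (a rectangular window with hypothetical dimensions is assumed):

```python
import numpy as np

def log_prior(C_kappa, H_kappa, C_p, H_p, h, b, min_pair_dist, width, height):
    """log p(theta) for the uniform priors: 0 on the support, -inf off it
    (the uniform normalizing constants cancel in acceptance ratios)."""
    in_W = lambda C: np.all((C >= 0) & (C <= np.array([width, height])))
    ok = (in_W(C_kappa) and in_W(C_p)
          and np.all((H_kappa > 0) & (H_kappa < 100))   # H^kappa ~ Unif(0, 100)
          and np.all((H_p > 0) & (H_p < 1))             # H^p ~ Unif(0, 1)
          and np.all((np.asarray(h) >= 0) & (np.asarray(h) < 100))
          and np.all((np.asarray(b) > min_pair_dist) & (np.asarray(b) < 100)))
    return 0.0 if ok else -np.inf
```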

29 Standard Metropolis-Hastings Update
Suppose the current parameter vector at iteration t is $\theta^{(t)}$. A single iteration of the MH algorithm proceeds as follows:
1. Generate $\theta^*$ from some proposal distribution $q(\theta^* \mid \theta^{(t)})$.
2. Set $\theta^{(t+1)} = \theta^*$ with probability
$$\alpha(\theta^{(t)}, \theta^*) = \min\left[1, \frac{f(x, m \mid \theta^*) \, p(\theta^*) \, q(\theta^{(t)} \mid \theta^*)}{f(x, m \mid \theta^{(t)}) \, p(\theta^{(t)}) \, q(\theta^* \mid \theta^{(t)})}\right] = \min\left[1, \frac{g(x, m \mid \theta^*) \, Z(\theta^{(t)}) \, p(\theta^*) \, q(\theta^{(t)} \mid \theta^*)}{g(x, m \mid \theta^{(t)}) \, Z(\theta^*) \, p(\theta^{(t)}) \, q(\theta^* \mid \theta^{(t)})}\right]$$
otherwise set $\theta^{(t+1)} = \theta^{(t)}$. The acceptance probability contains a ratio of intractable normalizing constants $Z(\theta^{(t)})/Z(\theta^*)$, thus we cannot do a standard MH algorithm.

30 Back when Matt had beautiful, luscious hair...
Estimated the intractable ratio using importance sampling, plugged the estimate into the acceptance probability, and accepted/rejected. Obtaining a new importance sampling estimate within each iteration was computationally costly. The limiting distribution resembles $p(\theta \mid x, m)$ provided the importance sampling estimates are good; however, the limiting distribution is not $p(\theta \mid x, m)$. A similar method is suggested by Liang & Jin (2013).

31 Exchange Algorithm (Murray et al. 2006)
The exchange algorithm can produce samples from the exact posterior, not an approximation of it. Suppose the current parameter vector at iteration t is $\theta^{(t)}$. A single iteration of the exchange algorithm proceeds as follows:
1. Generate $\theta^*$ from some proposal distribution $q(\theta^* \mid \theta^{(t)})$.
2. Generate a marked spatial point pattern, say $(x^w, m^w)$, from $f(\cdot \mid \theta^*)$.
3. Set $\theta^{(t+1)} = \theta^*$ with probability
$$\alpha(\theta^{(t)}, \theta^*) = \min\left[1, \frac{f(x, m \mid \theta^*) \, p(\theta^*) \, q(\theta^{(t)} \mid \theta^*) \, f(x^w, m^w \mid \theta^{(t)})}{f(x, m \mid \theta^{(t)}) \, p(\theta^{(t)}) \, q(\theta^* \mid \theta^{(t)}) \, f(x^w, m^w \mid \theta^*)}\right] = \min\left[1, \frac{g(x, m \mid \theta^*) \, p(\theta^*) \, q(\theta^{(t)} \mid \theta^*) \, g(x^w, m^w \mid \theta^{(t)})}{g(x, m \mid \theta^{(t)}) \, p(\theta^{(t)}) \, q(\theta^* \mid \theta^{(t)}) \, g(x^w, m^w \mid \theta^*)}\right]$$
otherwise set $\theta^{(t+1)} = \theta^{(t)}$. The exchange algorithm can be viewed as a more direct version of the Møller et al. (2006) auxiliary variable algorithm.
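
In log form, one exchange update is only a few lines once log g is computable and an (approximate) draw from f(· | θ*) is available; a sketch under those assumptions, with a symmetric proposal so the q terms cancel (all function arguments are placeholders for the pieces defined above):

```python
import numpy as np

rng = np.random.default_rng(2)

def exchange_update(theta, x, m, log_g, log_prior, propose, simulate_pattern):
    """One iteration of the exchange algorithm (Murray et al. 2006).
    propose must be symmetric; simulate_pattern(theta_star) must return a
    (well burned-in) draw (xw, mw) from f(. | theta_star)."""
    theta_star = propose(theta)
    xw, mw = simulate_pattern(theta_star)      # auxiliary marked point pattern
    log_alpha = (log_g(x, m, theta_star) + log_prior(theta_star)
                 + log_g(xw, mw, theta)
                 - log_g(x, m, theta) - log_prior(theta)
                 - log_g(xw, mw, theta_star))  # the Z(.) terms have cancelled
    if np.log(rng.uniform()) < log_alpha:
        return theta_star                      # accept
    return theta                               # reject
```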

32 Simulating $(x^w, m^w)$
Marked spatial point patterns can be generated using the birth/death algorithm of Geyer & Møller (1994) or the reversible jump MCMC (RJMCMC) algorithm of Green (1995) (the birth/death algorithm, which preceded RJMCMC, is a special case). Neither algorithm needs the normalization constant Z(·): because we are generating spatial point patterns at a fixed $\theta^*$, the intractable normalizers Z(·) cancel in the acceptance probability. Once the algorithm converges, we can obtain a marked spatial point pattern $(x^w, m^w)$ from $f(\cdot \mid \theta^*)$ and use it to perform a single update in the exchange algorithm; doing updates in this way would yield samples from the exact posterior $p(\theta \mid x, m)$, as sketched below.
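
A minimal birth/death sampler in the spirit of Geyer & Møller (1994), not the talk's implementation: it assumes a rectangular window, binary marks proposed uniformly (taken to match a uniform mark reference, so the mark proposal cancels in the ratio), and a callable log_g for the fixed θ*:

```python
import numpy as np

rng = np.random.default_rng(3)

def birth_death(theta, log_g, width, height, n_iter=5000):
    """Birth/death MCMC targeting f(. | theta); log_g(points, marks, theta)
    evaluates the unnormalized log density (Z(theta) is never needed)."""
    area = width * height
    points, marks = [], []                      # start from the empty pattern
    for _ in range(n_iter):
        n = len(points)
        if rng.uniform() < 0.5:                 # propose a birth
            xi = rng.uniform((0.0, 0.0), (width, height))
            mu = int(rng.integers(0, 2))
            new_p, new_m = points + [xi], marks + [mu]
            log_r = (log_g(new_p, new_m, theta) - log_g(points, marks, theta)
                     + np.log(area) - np.log(n + 1))
        elif n > 0:                             # propose a death
            i = int(rng.integers(n))
            new_p = points[:i] + points[i + 1:]
            new_m = marks[:i] + marks[i + 1:]
            log_r = (log_g(new_p, new_m, theta) - log_g(points, marks, theta)
                     + np.log(n) - np.log(area))
        else:
            continue                            # death from empty pattern: reject
        if np.log(rng.uniform()) < log_r:
            points, marks = new_p, new_m        # accept the move
    return points, marks
```

Note that only differences of log g appear in the log ratios, which is exactly the cancellation argument on the slide.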

33 Simulating $(x^w, m^w)$: The Catch
Both the birth/death and RJMCMC algorithms require burn-in. Insufficient burn-in would:
1. yield a marked spatial point pattern $(x^w, m^w)$ whose distribution is not $f(\cdot \mid \theta^*)$,
2. subvert the cancellation of the intractable Z(·)'s,
3. and yield the wrong limiting distribution.

34 Simulating $(x^w, m^w)$: Burn-in Period
Let's use a very conservative (long) burn-in period. [Figure: trace plot of the total energy U (energy vs. iteration) during the burn-in period of the birth/death algorithm for generating $(x^w, m^w)$ from $f(\cdot \mid \theta^*)$, starting from a Poisson spatial point pattern.] Used a 5,000 iteration burn-in period for the birth/death algorithm. Very computational: every iteration of the exchange algorithm requires running a new birth/death (or RJMCMC) algorithm, with sufficient burn-in, to obtain $(x^w, m^w)$ from $f(\cdot \mid \theta^*)$.

35 Simulating $(x^w, m^w)$: Notes
Tracking the total energy U(·) only provides rough guidance on convergence. Insufficient burn-in of the birth/death (RJMCMC) algorithm typically yields an over-dispersed posterior distribution (and incorrect point estimates): a burn-in period of 50 iterations yielded an over-dispersed posterior, while a burn-in period of 500 iterations yielded posterior inferences similar to the 5,000 iteration burn-in period. What about perfect (exact) sampling? It appears much too slow to use within the exchange algorithm, and much more difficult to implement; plus Matt is too lazy to code this. See Propp & Wilson (1996), van Lieshout & Stoica (2006), Møller (2001), Berthelsen & Møller (2002, 2003).

36 Proposals
Within each iteration of the exchange algorithm, a parameter was randomly selected (random scan, one-at-a-time algorithm). Uniform random walk proposals were used; the variance of the proposals was tuned to obtain acceptance rates of 20-50%.

37 Used the UI Neon Cluster (LT node); coding in C++ (delicate, tedious, but readable). The largest computational burden is the $\binom{n}{2}$ pairwise Euclidean distances needed to evaluate $V^{(2)}_\psi$: these must be computed (or updated) for every iteration of the birth/death (or RJMCMC) algorithm that simulates $(x^w, m^w)$ from $f(\cdot \mid \theta^*)$ (one could update only a subset of distances with smart bookkeeping, but that is more prone to coding errors). Used OpenMP and the Intel MKL library to parallelize across 16 Intel Xeon processors and an Intel Phi co-processor (with 240 cores). Parallelization reduced computing time by only 50% or so: lots of inter-processor communication, and the small dataset (n = 62) and small simulated datasets $(x^w, m^w)$ limit speed gains. The exchange algorithm was run for 50,000 iterations (following a 5,000 iteration burn-in period). [Animation]
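
For reference, the distance computation that dominates the talk's runtime looks like this when vectorized in NumPy (a sketch only; the actual code uses C++ with OpenMP/MKL and incremental updates):

```python
import numpy as np

def pairwise_distances(points):
    """All n-choose-2 Euclidean distances for an (n, 2) array of locations."""
    diff = points[:, None, :] - points[None, :, :]  # (n, n, 2) differences
    D = np.sqrt((diff ** 2).sum(axis=-1))           # (n, n) distance matrix
    iu = np.triu_indices(len(points), k=1)          # strictly upper triangle
    return D[iu]                                    # the n(n-1)/2 unique pairs

pts = np.random.default_rng(4).uniform(0.0, 100.0, size=(62, 2))
print(pairwise_distances(pts).shape)                # (1891,) = 62 choose 2
```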

38 Results: $\kappa_{\gamma^\kappa}(x)$
Marginal posterior mean of the intensity $\kappa_{\gamma^\kappa}(x)$ (50 × 50 grid). Unlike a Poisson process, the intensity is not proportional to the density of points. The Voronoi tessellation is crude, yet the posterior mean is surprisingly smooth.

39 Results: $\pi_{\gamma^p}(x)$
Marginal posterior mean of $\pi_{\gamma^p}(x)$ (probability of Cataglyphis bicolor) under the pairwise interaction model.

40 Results: $V^{(2)}_\psi$ Parameters
Posterior mean and 95% credible set for the $V^{(2)}_\psi$ parameters:

  b00: 95% cred. set (37.06, 58.76)    h00: mean 0.556, 95% cred. set (0.256, 0.920)
  b01: 95% cred. set (8.70, 71.93)     h01: mean 0.285, 95% cred. set (0.011, 1.245)
  b11: 95% cred. set (32.04, 49.03)    h11: mean 2.628, 95% cred. set (1.060, 5.914)

Messor wasmanni exhibit moderate regularity ($\hat{h}_{00} = 0.556$); Cataglyphis bicolor exhibit strong repulsion ($\hat{h}_{11} = 2.628$); repulsion between the two species' nest locations is quite weak, not unlike CSR ($\hat{h}_{01} = 0.285$). Other authors fixed $b_{\cdot\cdot} = 45$; the results agree with Takacs & Fiksel (1986) and Särkkä (1993) (who used fixed b's).

41 Big Question: Does Modeling the Interaction Matter?
Marginal posterior mean of $\pi_{\gamma^p}(x)$ (probability of Cataglyphis bicolor) under the non-interaction (Poisson) model ($V^{(i)} \equiv 0$ for $i \geq 2$).

42 Why the difference?
Cataglyphis are strongly inhibitory, while Messor are better able to exist near each other. Thus, the process must push harder to yield a Cataglyphis nest near the bottom of W; hence, the underlying probability of a Cataglyphis nest, $\pi_{\gamma^p}(\cdot)$, must be higher near the bottom of W.

43 Mini Simulation

44 Mini Simulation: The dataset was simulated on a square region. First order potential (by quadrant):

  κ(x) = 0.001, p(m | x) = N(1.5, 0.1)    κ(x) = 0.002, p(m | x) = N(1.5, 0.1)
  κ(x) = 0.001, p(m | x) = N(0.5, 0.1)    κ(x) = 0.002, p(m | x) = N(0.5, 0.1)

45 Mini Simulation: The second order potential function was
$$V^{(2)}_\psi(x_i, x_j, m_i, m_j) = \begin{cases} h & \text{if } \|x_i - x_j\| \, s_{ij} \leq b \\ 0 & \text{otherwise} \end{cases}$$
where $s_{ij} = (0.5(m_i + m_j))^{-1}$ and $\psi = (b, h)$. The interaction distance scales according to the marks:
- for two points with small marks (near 0.5), $s_{ij} \approx 2$, so the interaction distance is $b/2$
- for two points with large marks (near 1.5), $s_{ij} \approx 2/3$, so the interaction distance is $3b/2$
- one large and one small mark give interaction distance $b$
Used b = 20 and h = 1.
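
As a check on the specification above, the mark-scaled potential in code (a sketch; the names are illustrative):

```python
import numpy as np

def V2_sim(xi, xj, mi, mj, b=20.0, h=1.0):
    """Pay h whenever ||xi - xj|| * s_ij <= b, with s_ij = (0.5*(mi + mj))^(-1):
    larger marks shrink s_ij, so they interact over longer distances."""
    s_ij = 1.0 / (0.5 * (mi + mj))
    d = np.linalg.norm(np.asarray(xi, float) - np.asarray(xj, float))
    return h if d * s_ij <= b else 0.0

# two large-marked points interact out to ~3b/2 = 30 ...
print(V2_sim([0.0, 0.0], [25.0, 0.0], 1.5, 1.5))  # 1.0
# ... while two small-marked points only out to ~b/2 = 10
print(V2_sim([0.0, 0.0], [25.0, 0.0], 0.5, 0.5))  # 0.0
```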

46 Simulated Dataset
The simulated dataset had 198 points (circle size is proportional to the mark).

47 Simulation Results: $V^{(1)}_\gamma$
The k = 2 Voronoi tiles were fixed at the true locations; uniform priors, RW proposals, 10,000 updates. [Figure: posterior means and 95% credible intervals by region. The 95% credible interval for $\hat{\mu}$ was (1.475, 1.530) where the truth is p(m | x) = N(1.5, 0.1), and (0.466, 0.498) where the truth is p(m | x) = N(0.5, 0.1); point and interval estimates of $\hat{\kappa}(x)$ were also reported for the κ(x) = 0.001 and κ(x) = 0.002 regions.]

48 Simulation Results: $V^{(2)}_\psi$
Second order potential parameters:

  Parameter   Sim. Val.   95% Cred. Int.
  b           20          (18.76, 21.12)
  h           1           (0.596, 1.284)

49

50 Discussion
It is important to model interaction in marked spatial point patterns to better understand the underlying process. The Bayesian approach:
- provides inference for all parameters, whereas frequentist approaches typically fix the interaction distance(s) (e.g. b00, b01, b11)
- provides natural interval estimates (no bootstrap)
- seems to allow much more complex modeling than frequentist approaches
We can easily get credible sets for the intensity and the parameters of the mark distribution at any location $x \in W$ (not shown). The Gibbs model is more flexible/general than I thought.

51 Open Questions
- Multiple marks.
- Allow the number of tiles k to vary in the MCMC sampler. It is not clear if the exchange algorithm can be modified to handle a dimension change, nor if RJMCMC can be modified to handle an intractable normalizer.
- Use the pseudo-marginal algorithm (Andrieu & Roberts 2009) instead of the exchange algorithm. Needs more work; it mixes poorly in the current setup, though it worked great with diffusions (Stramer & Bognar 2011).
- Edge effects (the literature suggests they are minimal for the ants dataset).
- We assumed the interaction parameters ψ do not change across the space W; we want interaction parameters that depend upon location, i.e. $\psi(x_i, x_j)$.

52 Thank You

53 References

54 Andrieu, C. & Roberts, G. (2009), The pseudo-marginal approach for efficient Monte Carlo computations, The Annals of Statistics 37.
Baddeley, A., Bárány, I., Schneider, R. & Weil, W. (2004), Spatial Point Processes and their Applications, Lecture Notes in Mathematics, Springer.
Baddeley, A. & Møller, J. (1989), Nearest-neighbour Markov point processes and random sets, International Statistical Review 57.
Baddeley, A. & Turner, R. (2000), Practical maximum pseudolikelihood for spatial point patterns, Australian and New Zealand Journal of Statistics 42.
Baddeley, A., Turner, R., Mateu, J. & Bevan, A. (2013), Hybrids of Gibbs point process models and their implementation, Journal of Statistical Software 55.
Baddeley, A., Turner, R., Møller, J. & Hazelton, M. (2005), Residual analysis for spatial point processes, Journal of the Royal Statistical Society, Series B 67.
Berthelsen, K. & Møller, J. (2002), A primer on perfect simulation for spatial point processes, Bulletin of the Brazilian Mathematical Society 33(3).

55 Berthelsen, K. & Møller, J. (2003), Likelihood and non-parametric Bayesian MCMC inference for spatial point processes based on perfect simulation and path sampling, Scandinavian Journal of Statistics 30(3).
Bognar, M. A. (2005), Bayesian inference for spatially inhomogeneous pairwise interacting point processes, Computational Statistics and Data Analysis 49(1).
Bognar, M. A. (2008), Bayesian modeling of continuously marked spatial point patterns, Computational Statistics 23(3).
Bognar, M. A. & Cowles, M. K. (2004), Bayesian inference for pairwise interacting point processes, Statistics and Computing 14.
Degenhardt, A. (1999), Description of tree distribution patterns and their development through marked Gibbs processes, Biometrical Journal.
Diggle, P. J. (2003), Statistical Analysis of Spatial Point Patterns, second edn, Arnold.
Diggle, P. J., Fiksel, T., Grabarnik, P., Ogata, Y., Stoyan, D. & Tanemura, M. (1994), On parameter estimation for pairwise interaction point processes, International Statistical Review 62.

56 Diggle, P. J., Gates, D. J. & Stibbard, A. (1987), A nonparametric estimator for pairwise interaction point processes, Biometrika 74.
Fiksel, T. (1984), Estimation of parameterized pair potentials of marked and unmarked Gibbsian point processes, Elektron. Inform. Kybernet. 20.
Geyer, C. J. (1999), Likelihood inference for spatial point processes, in O. E. Barndorff-Nielsen, W. S. Kendall & M. N. M. van Lieshout, eds, Stochastic Geometry: Likelihood and Computation, Chapman and Hall/CRC, London.
Geyer, C. J. & Møller, J. (1994), Simulation procedures and likelihood inference for spatial point processes, Scandinavian Journal of Statistics 21.
Goulard, M., Särkkä, A. & Grabarnik, P. (1996), Parameter estimation for marked Gibbs point processes through the maximum pseudo-likelihood method, Scandinavian Journal of Statistics 23.
Green, P. J. (1995), Reversible jump Markov chain Monte Carlo computation and Bayesian model determination, Biometrika 82.
Harkness, R. D. & Isham, V. (1983), A bivariate spatial point pattern of ants' nests, Applied Statistics 32.

57 Heikkinen, J. & Penttinen, A. (1999), Bayesian smoothing in the estimation of the pair potential function of Gibbs point patterns, Bernoulli 5.
Jensen, J. L. & Møller, J. (1991), Pseudolikelihood for exponential family models of spatial point processes, Annals of Applied Probability 3.
Liang, F. & Jin, I. (2013), A Monte Carlo Metropolis-Hastings algorithm for sampling from distributions with intractable normalizing constants, Neural Computation 25.
Mateu, J. & Montes, F. (2001), Likelihood inference for Gibbs processes in the analysis of spatial point patterns, International Statistical Review 69.
Møller, J. (2001), A Review of Perfect Simulation in Stochastic Geometry, Vol. 37 of Lecture Notes Monograph Series, Institute of Mathematical Statistics.
Møller, J., Pettitt, A., Berthelsen, K. & Reeves, R. (2006), An efficient Markov chain Monte Carlo method for distributions with intractable normalising constants, Biometrika 93(2).
Møller, J. & Waagepetersen, R. P. (2004), Statistical Inference and Simulation for Spatial Point Processes, Chapman & Hall/CRC, Boca Raton.

58 Murray, I., Ghahramani, Z. & MacKay, D. (2006), MCMC for doubly-intractable distributions, in Proceedings of the 22nd Annual Conference on Uncertainty in Artificial Intelligence (UAI-06).
Ogata, Y. & Katsura, K. (1988), Likelihood analysis of spatial inhomogeneity for marked point patterns, Annals of the Institute of Statistical Mathematics 40(1).
Ogata, Y. & Tanemura, M. (1981), Estimation of interaction potentials of spatial point patterns through the maximum likelihood procedure, Annals of the Institute of Statistical Mathematics 33.
Ogata, Y. & Tanemura, M. (1984), Likelihood analysis of spatial point patterns, Journal of the Royal Statistical Society B 46.
Ogata, Y. & Tanemura, M. (1985), Estimation of interaction potentials of marked spatial point patterns through the maximum likelihood method, Biometrics 41.
Ogata, Y. & Tanemura, M. (1986), Likelihood estimation of interaction potentials and external fields of inhomogeneous spatial point patterns, in I. S. Francis, B. F. J. Manly & F. C. Lam, eds, Proceedings of the Pacific Statistical Congress, Elsevier, Amsterdam.

59 Ogata, Y. & Tanemura, M. (1989), Likelihood estimation of soft-core interaction potentials for Gibbsian point patterns, Annals of the Institute of Statistical Mathematics 41.
Penttinen, A. K. (1984), Modeling interaction in spatial point patterns: parameter estimation by the maximum likelihood method, Jyväskylä Studies in Computer Science, Economics and Statistics 7.
Propp, J. G. & Wilson, D. B. (1996), Exact sampling with coupled Markov chains and applications to statistical mechanics, Random Structures and Algorithms 9.
Ripley, B. D. (1977), Modelling spatial patterns (with discussion), Journal of the Royal Statistical Society B 39.
Ripley, B. D. & Kelly, F. P. (1977), Markov point processes, Journal of the London Mathematical Society 15.
Särkkä, A. (1993), Pseudo-likelihood approach for pair potential estimation of Gibbs processes, Jyväskylä Studies in Computer Science, Economics and Statistics 22.
Stoyan, D. & Penttinen, A. (2000), Recent applications of point process methods in forestry statistics, Statistical Science 15.

60 Stoyan, D. & Stoyan, H. (1998), Non-homogeneous Gibbs process models for forestry: a case study, Biometrical Journal 40.
Stramer, O. & Bognar, M. (2011), Bayesian inference for irreducible diffusion processes using the pseudo-marginal approach, Bayesian Analysis 6(2).
Strauss, D. (1975), A model for clustering, Biometrika 62.
Takacs, R. (1986), Estimator for the pair-potential of a Gibbsian point process, Math. Operationsf. Statist. Ser. Statist. 17.
Takacs, R. & Fiksel, T. (1986), Interaction pair-potentials for a system of ants' nests, Biometrical Journal 28.
van Lieshout, M. N. M. (2000), Markov Point Processes and Their Applications, Imperial College Press, London.
van Lieshout, M. N. M. & Stoica, R. S. (2006), Perfect simulation for marked point processes, Computational Statistics & Data Analysis 51(2).
