Statistical study of spatial dependences in long memory random fields on a lattice, point processes and random geometry.


Statistical study of spatial dependences in long memory random fields on a lattice, point processes and random geometry. Frédéric Lavancier, Laboratoire de Mathématiques Jean Leray, Nantes, December 9, 2011.

Introduction. Long memory, self-similarity in (multivariate) time series: [three simulated sample paths of length 1000 shown]

Introduction. Long memory, self-similarity in images: [example images shown]

Introduction. Attraction, repulsion between points: [example point patterns shown]

Introduction. Dependence between cells of a tessellation: [example tessellations shown]

Introduction. Random sets as a union of interacting balls: [example random sets shown]

Introduction. General motivations: developing mathematical models that
- respect some prescribed features (e.g. self-similarity, long-range dependence, repulsion or attraction between points, ...);
- are flexible enough (through few parameters);
- we can simulate.
From a statistical point of view:
- fitting these models to data (inference problem);
- assessing the theoretical quality of inference (consistency, limiting law, optimality, ...);
- providing some diagnostic tools: adequacy of the model to data, change-point problems.

Outline:
1. Self-similarity, long memory: Vector Fractional Brownian Motion; long memory time series; long memory random fields (images).
2. Estimation of Gibbs point processes; model validation for Gibbs point processes.

Part 1: Self-similarity, long memory. Vector Fractional Brownian Motion.

Fractional Brownian Motion. Some univariate examples: [sample paths of length 1000 shown for $H = 0.2$, $H = 0.5$, $H = 0.8$]. Aim: correlating $p$ fractional Brownian motions: the Vector (or Multivariate) FBM.
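The self-similarity in these examples can be checked directly on the FBM covariance $E\,B(s)B(t) = \tfrac{1}{2}(|s|^{2H} + |t|^{2H} - |t-s|^{2H})$. A minimal numerical sketch (the function name is ours):

```python
def fbm_cov(s, t, H):
    """Covariance E[B(s)B(t)] = 0.5*(|s|^{2H} + |t|^{2H} - |t-s|^{2H})
    of a standard fractional Brownian motion with Hurst index H."""
    return 0.5 * (abs(s) ** (2 * H) + abs(t) ** (2 * H) - abs(t - s) ** (2 * H))

# H-self-similarity: Cov(B(cs), B(ct)) = c^{2H} * Cov(B(s), B(t))
H, c, s, t = 0.7, 3.0, 1.0, 2.5
lhs = fbm_cov(c * s, c * t, H)
rhs = c ** (2 * H) * fbm_cov(s, t, H)
print(abs(lhs - rhs) < 1e-9)   # True
```

For $H = 0.5$ the covariance reduces to $\min(s, t)$: the Brownian motion case.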

Vector FBM (with P.-O. Amblard, J.-F. Coeurjolly, A. Philippe, D. Surgailis). A multivariate process $B(t) = (B_1(t), \dots, B_p(t))$ is a Vector FBM with parameter $H = (H_1, \dots, H_p)$ if $B(0) = 0$ and: it is Gaussian; it is $H$-self-similar, i.e. for all $c > 0$, $(B_1(ct), \dots, B_p(ct))_{t \in \mathbb{R}} \stackrel{\mathcal{L}}{=} (c^{H_1} B_1(t), \dots, c^{H_p} B_p(t))_{t \in \mathbb{R}}$; it has stationary increments.

Lamperti-type result: if there exist a vector process $(Y_1(t), \dots, Y_p(t))_{t \in \mathbb{R}}$ and real functions $a_1, \dots, a_p$ such that $(a_1(n) Y_1(nt), \dots, a_p(n) Y_p(nt))$ converges to $Z(t)$ in the finite-dimensional distribution sense, then the vector process $(Z(t))$ is self-similar.

Convergence of partial sums: any Vector FBM can be obtained as the limit of partial sums of some superlinear processes.

Comprehensive characterisation ($p = 2$).
1. Let $(B_1(t), B_2(t))$ be a $(H_1, H_2)$-VFBM; then (when $H_1 + H_2 \neq 1$): $B_1$ is an $H_1$-FBM, $B_2$ is an $H_2$-FBM, and for $0 \leq s \leq t$ the cross-covariance is
$$E\,B_1(s) B_2(t) = \tfrac{1}{2}\big[(\rho + \eta)\, s^{H_1+H_2} + (\rho - \eta)\, t^{H_1+H_2} - (\rho - \eta)\,(t-s)^{H_1+H_2}\big],$$
with $\rho = \mathrm{corr}(B_1(1), B_2(1))$ and $\eta = \dfrac{\mathrm{corr}(B_1(1), B_2(-1)) - \mathrm{corr}(B_1(-1), B_2(1))}{2 - 2^{H_1+H_2}}$.
2. Conversely, any Gaussian process with the above covariance function is a VFBM iff, for some known $R > 0$,
$$\rho^2 \sin\big(\tfrac{\pi}{2}(H_1+H_2)\big)^2 + \eta^2 \cos\big(\tfrac{\pi}{2}(H_1+H_2)\big)^2 \leq R:$$
it is not possible to set up arbitrarily correlated FBMs.
3. Consider the increments $\Delta B_i(n) = B_i(n+1) - B_i(n)$, for $i = 1, 2$. The two components are either non-correlated or $E\, \Delta B_1(n)\, \Delta B_2(n+h) \sim \kappa\, h^{H_1+H_2-2}$: a very constrained cross-correlation (for instance, $H_1 + H_2 > 1$ implies long-range cross-dependence).
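A quick consistency check of this cross-covariance (as reconstructed here): with $\rho = 1$, $\eta = 0$ and $H_1 = H_2 = H_0$ it must reduce to the covariance of a single $H_0$-FBM. A sketch under that assumption (function name ours):

```python
def vfbm_cross_cov(s, t, H1, H2, rho, eta):
    """Cross-covariance E[B1(s) B2(t)] of a bivariate VFBM for 0 <= s <= t
    (unit-variance components), following the slide's formula."""
    H = H1 + H2
    return 0.5 * ((rho + eta) * s**H + (rho - eta) * t**H
                  - (rho - eta) * (t - s)**H)

# With rho = 1, eta = 0, H1 = H2 = H0 this is the usual FBM covariance
H0, s, t = 0.6, 0.8, 2.0
lhs = vfbm_cross_cov(s, t, H0, H0, 1.0, 0.0)
rhs = 0.5 * (s**(2 * H0) + t**(2 * H0) - (t - s)**(2 * H0))
print(abs(lhs - rhs) < 1e-12)   # True
```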

Some examples: simulations are achieved thanks to a Wood and Chan algorithm. Below, the correlation between any couple of FBMs is $\rho = 0.6$: [sample paths shown for $H \in [0.3, 0.4]$ (decentered), $H \in [0.8, 0.9]$, $H \in [0.4, 0.8]$]

Part 1 (continued): Long memory time series.

Definition: a stationary $L^2$ time series $(X(n))_{n \in \mathbb{Z}}$ exhibits long memory if $\sum_{n \in \mathbb{Z}} r(n) = +\infty$, where $r$ denotes the covariance function of $X$. Typically, for some $0 < d < 0.5$, $r(n) \sim \kappa\, n^{2d-1}$. The parameter $d$ is called the long memory parameter. [Sample paths and empirical autocovariances shown for $d = 0.3$ and $d = 0.45$.]
Examples: if $B$ is an FBM, $X(n) = B(n+1) - B(n)$ with $d = H - 0.5$. An I(d) time series: $X(n) = (1-L)^{-d} \epsilon(n)$, where $\epsilon$ is a white noise and $L$ the lag operator, $L\epsilon_n = \epsilon_{n-1}$.
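For the first example (FBM increments, i.e. fractional Gaussian noise), the autocovariance is explicit, $r(k) = \tfrac{1}{2}(|k+1|^{2H} - 2|k|^{2H} + |k-1|^{2H})$ with $H = d + 0.5$, so both the $\kappa\, n^{2d-1}$ decay and the divergence of $\sum r(n)$ can be checked numerically. A small sketch (names ours):

```python
import numpy as np

def fgn_autocov(k, H):
    """Autocovariance of fractional Gaussian noise X(n) = B(n+1) - B(n)."""
    k = np.abs(k)
    return 0.5 * ((k + 1) ** (2 * H) - 2 * k ** (2 * H) + (k - 1) ** (2 * H))

d = 0.3            # long memory parameter
H = d + 0.5        # Hurst index of the underlying FBM
ks = np.arange(1, 10001)
r = fgn_autocov(ks, H)

# r(k) ~ kappa * k^{2d-1} with kappa = H*(2H-1): the ratio stabilizes near 1
ratio = r[-1] / (H * (2 * H - 1) * ks[-1] ** (2 * d - 1))
print(round(ratio, 3))   # close to 1

# the partial sums of r(k) keep growing (the series diverges): long memory
print(r[:100].sum() < r[:10000].sum())  # True
```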

Change point problem (with R. Leipus, A. Philippe, D. Surgailis). Constant vs non-constant long memory parameter? Example: an I($d_1$) series followed by an I($d_2$) series. [three sample paths of length 1000 shown]

Change point problem (with R. Leipus, A. Philippe, D. Surgailis). Basically, under $H_0$ ($d$ is constant):
$$n^{-d-0.5} \sum_{k=1}^{[nt]} X(k) \xrightarrow{D([0,1])} \kappa\, B_{d+0.5}(t) \quad \text{and} \quad \mathrm{Var}\Big(\sum_{k=1}^{[nt]} X(k)\Big) \sim n^{2d+1}.$$
Under $H_1$ ($d$ increases at some $t_0$):
$$\mathrm{Var}\Big(\sum_{k=1}^{[nt_0]} X(k)\Big) \ll \mathrm{Var}\Big(\sum_{k=[nt_0]}^{n} X(k)\Big).$$
To test $H_0$ against $H_1$, we estimate and compare these variances. Let $S_j = \sum_{k=1}^{j} X(k)$ and $S^*_j = \sum_{k=j+1}^{n} X(k)$. Define:
the forward variance $V_k = \widehat{\mathrm{Var}}(S_1, \dots, S_k)$;
the backward variance $V^*_{n-k} = \widehat{\mathrm{Var}}(S^*_{n-k+1}, \dots, S^*_1)$ (the same quantity computed from the backward sums).
Test statistic, consistency:
$$I_n = \int_0^1 \frac{V^*_{n-[nt]}}{V_{[nt]}}\, dt.$$
Under $H_0$: $I_n \to I(B_{d+0.5})$, a functional of fractional Brownian motion that can be simulated; under $H_1$: $I_n \to +\infty$.
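The idea behind the test can be sketched numerically. The snippet below is a deliberately simplified, hypothetical version of the forward/backward variance comparison (not the exact statistic $I_n$ or its normalization): it averages the ratio of backward to forward empirical variances of partial sums, which stays moderate for a homogeneous series and blows up when the fluctuations grow in the second half.

```python
import numpy as np

rng = np.random.default_rng(0)

def variance_ratio(x):
    """Average ratio of backward to forward empirical variances of partial
    sums -- a simplified stand-in for the statistic I_n (illustration only)."""
    n = len(x)
    S_fwd = np.cumsum(x)          # forward partial sums S_1, ..., S_n
    S_bwd = np.cumsum(x[::-1])    # partial sums of the time-reversed series
    ks = np.arange(n // 10, n)    # skip very short initial segments
    fwd = np.array([S_fwd[:k].var() for k in ks])
    bwd = np.array([S_bwd[:k].var() for k in ks])
    return float(np.mean(bwd / fwd))

x_h0 = rng.standard_normal(2000)                      # homogeneous series
x_h1 = np.concatenate([rng.standard_normal(1000),     # fluctuations grow
                       10 * rng.standard_normal(1000)])
r0, r1 = variance_ratio(x_h0), variance_ratio(x_h1)
print(r0, r1)   # r1 is much larger than r0
```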

Part 1 (continued): Long memory random fields (images).

Definition: a stationary $L^2$ random field $(X(n))_{n \in \mathbb{Z}^d}$ exhibits long memory if $\sum_{n \in \mathbb{Z}^d} r(n) = +\infty$, where $r$ denotes the covariance function of $X$. Main difference with time series: possible occurrence of anisotropy. Examples ($d = 2$), for $0 < \alpha < 1$:
$r(n_1, n_2) \sim (n_1^2 + n_2^2)^{-\alpha}$;
$r(n_1, n_2) \sim |n_1|^{-\alpha}\, |n_2|^{-\alpha}$;
$r(n_1, n_2) \sim |n_1 + n_2|^{-\alpha}$.

Investigations on long memory random fields.
Modelling: long memory random fields appear
- in similar models as for time series (increments of the fractional Brownian sheet; aggregation of short memory random fields; fractional filtering of a white noise);
- in some Gibbs processes on $\mathbb{Z}^d$ in phase transition, e.g. the Ising model at the critical temperature.
Limit theorems, statistical applications:
- for partial sums: non-central limit theorems; testing for the presence of long memory;
- for the empirical process: asymptotic degeneracy; asymptotics for U-statistics;
- for some quadratic forms (with A. Philippe): non-CLTs for $\sum g(i-j)\, X(i) X(j)$, where $(X(i))_{i \in \mathbb{Z}^2}$ is Gaussian; asymptotics of empirical covariance functions.

Partial sums of long memory random fields. Let $X(n) = \sum_{k \in \mathbb{Z}^d} a_k\, \epsilon(n-k)$, where $(a_k) \in \ell^2$ and $\epsilon$ is a Gaussian white noise.
Theorem. Denote $a(x) = \sum_{k \in \mathbb{Z}^d} a_k e^{i k \cdot x}$. If $a \in L^2$ and, for all $\lambda$, $a(\lambda x) = \lambda^{-\alpha} a(x)$ with $0 < \alpha < d$, then, denoting $A_n = \{1, \dots, n\}^d$,
$$\frac{1}{n^{d/2+\alpha}} \sum_{k \in A_{[nt]}} X_k \xrightarrow{D([0,1]^d)} \int_{\mathbb{R}^d} a(x) \prod_{j=1}^{d} \frac{e^{i t_j x_j} - 1}{i x_j}\, dZ(x),$$
where $Z$ is the spectral measure of $\epsilon$. The limit is the fractional Brownian sheet only when $a(x) = \prod_{i=1}^{d} |x_i|^{-H_i}$.
Sketch of proof (for $t = 1$):
$$\frac{1}{n^{d/2+\alpha}} \sum_{k \in A_n} X_k = \frac{1}{n^{d/2+\alpha}} \int_{[-\pi,\pi]^d} a(x) \sum_{k \in A_n} e^{i k \cdot x}\, dZ(x) = \int_{[-n\pi,\, n\pi]^d} a(x) \prod_{j=1}^{d} \frac{e^{i x_j} - 1}{n\,(e^{i x_j / n} - 1)}\, dZ(x) \xrightarrow{L^2} \int_{\mathbb{R}^d} a(x) \prod_{j=1}^{d} \frac{e^{i x_j} - 1}{i x_j}\, dZ(x).$$

Part 2: Estimation of Gibbs point processes; model validation for Gibbs point processes.

Point processes. Notation: denote by $\varphi$ a point pattern on $\mathbb{R}^d$, i.e. $\varphi = \bigcup_{i \in I} x_i$ for $I \subset \mathbb{N}$. A point process $\Phi$ is a random variable on the space $\Omega = \{\varphi\}$. Example: the Poisson point process (independence in locations). How to introduce dependencies between the locations of points?

Gibbs point processes (basic definition). They are absolutely continuous w.r.t. the Poisson point process, with density
$$f_\theta(\varphi) = \frac{1}{c_\theta}\, e^{-V_\theta(\varphi)},$$
for some parameter $\theta \in \mathbb{R}^p$. $V_\theta(\varphi)$ is the energy of $\varphi$, also called the Hamiltonian (it belongs to $\mathbb{R} \cup \{+\infty\}$): $\varphi$ is more likely to occur if $V_\theta(\varphi)$ is small; if $V_\theta(\varphi) = +\infty$, then $\varphi$ is forbidden.
Example: the Strauss process with range of interaction $R = 0.05$ on $[0,1]^2$,
$$V_\theta(\varphi) = \theta \sum_{\{x,y\} \subset \varphi} 1\!\mathrm{I}_{|y-x| < R}, \quad \theta > 0.$$
[Realisations shown for $\theta = 0.35$ and $\theta = 2.3$.]
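As an illustration, the Strauss energy and the unnormalized density $e^{-V_\theta(\varphi)}$ are straightforward to compute for a small pattern (the pattern and helper names below are ours; only the constant $c_\theta$ is out of reach):

```python
import itertools
import numpy as np

def strauss_energy(points, theta, R):
    """Energy V_theta(phi) = theta * #{pairs {x, y} in phi with |y - x| < R}."""
    n_close = sum(
        1 for x, y in itertools.combinations(points, 2)
        if np.linalg.norm(np.asarray(x) - np.asarray(y)) < R
    )
    return theta * n_close

def unnormalized_density(points, theta, R):
    """f_theta(phi) is proportional to exp(-V_theta(phi)); the normalizing
    constant c_theta is intractable and left out."""
    return np.exp(-strauss_energy(points, theta, R))

phi = [(0.10, 0.10), (0.12, 0.11), (0.80, 0.35), (0.50, 0.90)]
R = 0.05
# One close pair -> energy theta; a larger theta makes close pairs rarer
print(strauss_energy(phi, 2.3, R))   # 2.3
print(unnormalized_density(phi, 2.3, R) < unnormalized_density(phi, 0.35, R))  # True
```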

Geometric interactions. For a point pattern $\varphi$, denote by $\mathrm{Vor}(\varphi)$ the associated Voronoï tessellation. Gibbs Voronoï tessellation, one example:
$$V_\theta(\varphi) = \sum_{C \in \mathrm{Vor}(\varphi)} V_1(C) + \theta \sum_{\substack{C, C' \in \mathrm{Vor}(\varphi) \\ C,\, C' \text{ neighbours}}} |\mathrm{vol}(C) - \mathrm{vol}(C')|,$$
where $V_1(C) = +\infty$ if the cell $C$ is too "irregular", and $0$ otherwise. [Realisations shown for $\theta > 0$ and $\theta < 0$.]

Geometric interactions. Quermass model: each point $x \in \varphi$ is associated with a random radius $r$, and
$$V_\theta(\varphi) = \theta_1 P(\Gamma) + \theta_2 A(\Gamma) + \theta_3 E(\Gamma), \quad \text{where } \Gamma = \bigcup_{x \in \varphi} B(x, r),$$
$P$: perimeter; $A$: area; $E$: Euler-Poincaré characteristic (number of connected components minus number of holes). [Realisations shown for $\theta_1 > 0$ ($\theta_2 = \theta_3 = 0$), $\theta_2 > 0$ ($\theta_1 = \theta_3 = 0$), $\theta_3 > 0$ ($\theta_1 = \theta_2 = 0$).]

Statistical issues. Let $\varphi$ be a point pattern (or $\mathrm{Vor}(\varphi)$, or $\Gamma$, ...) observed on a domain $\Lambda$. Assumption: $\varphi$ is the realisation of a Gibbs point process associated to $V_\theta$ for some (unknown) $\theta \in \mathbb{R}^p$.
Caution: the family $(V_\theta)_\theta$ must lead to a well-defined Gibbs process for all $\theta$ (for the geometric interactions above, see D. Dereudre, R. Drouilhet, H.-O. Georgii).
1. How to estimate $\theta$? From $f_\theta(\varphi) = \frac{1}{c_\theta} e^{-V_\theta(\varphi)}$: maximum likelihood procedure. Problem: $c_\theta$ is intractable, and prohibitively time-consuming simulations are required to approximate it.
2. Is the above assumption reasonable?

Takacs-Fiksel method. Campbell equilibrium equation (Georgii, Nguyen, Zessin): let $V_\theta(x|\varphi) = V_\theta(\varphi \cup x) - V_\theta(\varphi)$ (the energy needed to insert $x$ into $\varphi$). Then $\Phi$ is a Gibbs point process with energy $V_\theta$ if and only if, for all functions $h$,
$$E\Big( \int_{\mathbb{R}^d} h(x, \Phi)\, e^{-V_\theta(x|\Phi)}\, dx \Big) = E\Big( \sum_{x \in \Phi} h(x, \Phi \setminus x) \Big).$$
Empirical counterpart, Takacs-Fiksel estimation: let $h_1, \dots, h_K$ be $K$ test functions (to be chosen), and
$$\hat{\theta} = \arg\min_\theta \sum_{k=1}^{K} \Big[ \int_\Lambda h_k(x, \varphi)\, e^{-V_\theta(x|\varphi)}\, dx - \sum_{x \in \varphi} h_k(x, \varphi \setminus x) \Big]^2.$$
TF estimation includes pseudo-likelihood estimation as a particular case.
Contributions (with J.-F. Coeurjolly, D. Dereudre, R. Drouilhet, K. Stankova-Helisova):
- identifiability ($K > \dim(\theta)$), consistency, asymptotic normality;
- extension to a two-step procedure in presence of (possibly non-hereditary) hardcore interactions;
- application to Gibbs Voronoï tessellations;
- application to the Quermass model, where the points are not observed.
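The mechanics of the contrast can be sketched for the Strauss model with the single test function $h = 1$, for which the empirical counterpart is just the number of observed points. Everything below (window, stand-in pattern, Monte Carlo approximation of the integral, grid search) is a simplified hypothetical setup, not the estimator of the papers cited above; for a uniform, non-interacting pattern the fitted interaction lands at $\theta = 0$. No intractable constant $c_\theta$ is involved:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: a pattern observed on the window [0, L]^2
L, R = 10.0, 0.5
pts = rng.uniform(0, L, size=(100, 2))   # stand-in "observed" pattern

def local_energy(x, points, theta):
    """V_theta(x | phi) = theta * #{y in phi : |y - x| < R} (Strauss model)."""
    return theta * np.sum(np.linalg.norm(points - x, axis=1) < R)

def tf_contrast(theta, points, n_mc=4000):
    """Takacs-Fiksel contrast with the single test function h = 1:
    (integral over Lambda of e^{-V_theta(x|phi)} dx  -  #points)^2,
    the integral being approximated by Monte Carlo (a simplified sketch)."""
    u = rng.uniform(0, L, size=(n_mc, 2))
    integral = L**2 * np.mean([np.exp(-local_energy(x, points, theta)) for x in u])
    return (integral - len(points)) ** 2

# Grid search for the minimizer over a range of interaction strengths
thetas = np.linspace(0.0, 3.0, 16)
theta_hat = thetas[int(np.argmin([tf_contrast(t, pts) for t in thetas]))]
print(theta_hat)   # 0.0 for this non-interacting pattern
```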

Model validation (with J.-F. Coeurjolly). Let $\varphi$ be a point pattern, supposed to be the observation on $\Lambda$ of a Gibbs process with parametric potential $V_\theta$, and let $\hat{\theta}$ be an estimate of $\theta$ from $\varphi$. The residuals assess the Campbell equilibrium: for any $h$,
$$R_\Lambda(h) = |\Lambda|^{-1} \Big( \int_\Lambda h(x, \varphi)\, e^{-V_{\hat{\theta}}(x|\varphi)}\, dx - \sum_{x \in \varphi} h(x, \varphi \setminus x) \Big).$$
If the model is well specified, we expect $R_\Lambda(h) \approx 0$ for any $h$. We have proved: as $\Lambda \to \mathbb{R}^d$, $R_\Lambda(h) \to 0$ and $|\Lambda|^{1/2} R_\Lambda(h) \to \mathcal{N}(0, \Sigma)$.
Towards $\chi^2$ goodness-of-fit tests, two frameworks:
- $s$ test functions on one domain: from $(R_\Lambda(h_1), \dots, R_\Lambda(h_s))$, a suitably normalized $\sum_{j=1}^{s} R_\Lambda(\varphi, h_j)^2 \to \chi^2$;
- one test function on $q$ subdomains: from $(R_{\Lambda_1}(h), \dots, R_{\Lambda_q}(h))$, a suitably normalized $\sum_{j=1}^{q} R_{\Lambda_j}(\varphi, h)^2 \to \chi^2$.

Theoretical ingredients. For the estimation and the validation procedures, our asymptotic results rely on: the Campbell equilibrium equation; an ergodic theorem (Nguyen and Zessin); and the central limit theorem below.
CLT for a linear functional $U(\Phi_\Lambda)$, where $\Phi_\Lambda = \Phi \cap \Lambda$. Assume that if $\Lambda = \bigcup_{k=1}^{n} \Delta_k$ for disjoint $\Delta_k$'s, then $U(\Phi_\Lambda) = \sum_{k=1}^{n} U(\Phi_{\Delta_k})$. Basically, if
(i) conditional centering: $E\big( U(\Phi_{\Delta_k}) \mid \Phi_{\Delta_j},\, j \neq k \big) = 0$;
(ii) convergence of empirical covariances: $\frac{1}{n} \sum_{k=1}^{n} \sum_{k'=1}^{n} U(\Phi_{\Delta_k})\, U(\Phi_{\Delta_{k'}}) \xrightarrow{L^1} \Sigma$;
then $\frac{1}{\sqrt{n}} \sum_{k=1}^{n} U(\Phi_{\Delta_k}) \xrightarrow{\mathcal{L}} \mathcal{N}(0, \Sigma)$.
The key assumption (i) allows us to go without mixing assumptions (which typically do not hold for all values of $\theta$ for a Gibbs process).

Alternatives to Gibbs point processes? Gibbs processes introduce interactions in a very natural way, but they can be tedious to simulate, and many challenges remain for inference. Alternatives:
- Cox processes: Poisson processes with a random intensity function; they induce clustered point patterns.
- Determinantal point processes (with J. Møller and E. Rubak): their joint intensities depend on a covariance function $C$; they are repulsive point processes, with pair correlation function $g(h) = 1 - C(h)^2 / C(0)^2$. Appeal: perfect and fast simulation is available; flexible models (just consider a parametric family of covariance functions); inference is feasible by standard methods (maximum likelihood, contrast functions, ...).
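As an illustration of the repulsiveness, take a (hypothetical) Gaussian kernel $C(h) = e^{-h^2/\alpha^2}$: the pair correlation function above then satisfies $g(0) = 0$ (full repulsion at distance zero) and $g \leq 1$ everywhere. A minimal sketch:

```python
import numpy as np

def pcf_dpp(h, alpha=0.1):
    """Pair correlation function g(h) = 1 - C(h)^2 / C(0)^2 of a DPP with a
    (hypothetical) Gaussian kernel C(h) = exp(-h^2 / alpha^2), so C(0) = 1."""
    C = np.exp(-h**2 / alpha**2)
    return 1 - C**2

hs = np.array([0.0, 0.05, 0.1, 0.5])
g = pcf_dpp(hs)
print(g)   # g(0) = 0, and g increases towards 1 at large distances
```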

Thank you.