
Chapter 7

Random point patterns and counting processes

7.1 Random point pattern

A random point pattern (satunnainen pistekuvio) on an interval S ⊂ R is a locally finite¹ random subset of S, defined on some probability space (Ω, P). A random point pattern is hence a map ω ↦ X(ω) from Ω to the family of locally finite subsets of S. For clarity, and following the usual convention in stochastics, the symbol ω is omitted in what follows.

Example 7.1. Let U_1, ..., U_n be independent and uniformly distributed random numbers on the interval² (0, 1). Then the set X = {U_1, ..., U_n} is a random point pattern on (0, 1).

Example 7.2. Let Z be a random integer which follows a Poisson distribution with mean λ > 0. Then the set X = {n ∈ Z_+ : n ≤ Z} is a random point pattern on R_+.

Precisely speaking, in the definition of a random point pattern we need to require that the map X : Ω → N(S) is measurable with respect to the sigma-algebra on N(S) generated by the maps B ↦ |X ∩ B|, B ⊂ S open, where N(S) is the family of all locally finite subsets of S. Such technical details are unimportant in the analysis here, and hence not treated further. For details, see for example the books [Kal02, SW08].

7.2 Counting measure and counting process

The counting measure (laskurimitta) of a random point pattern X on S ⊂ R is the random function

    N(B) = |X ∩ B|,

¹ A subset X of an interval S is locally finite (lokaalisti äärellinen) if X ∩ K is finite whenever K ⊂ S is closed and bounded.
² (a, b) refers to the open interval a < x < b.
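Both examples are easy to simulate. Below is a minimal sketch (the helper names are illustrative, not from these notes); the Poisson variable Z is drawn with Knuth's classical product method.

```python
import math
import random

def uniform_pattern(n, rng=random):
    # Example 7.1: n independent uniform points on the open interval (0, 1).
    return {rng.random() for _ in range(n)}

def poisson_sample(lam, rng=random):
    # Knuth's method: multiply uniforms until the product drops below e^(-lam);
    # the number of factors used, minus one, is Poisson(lam) distributed.
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def integer_pattern(lam, rng=random):
    # Example 7.2: X = {1, ..., Z} with Z ~ Poisson(lam).
    return set(range(1, poisson_sample(lam, rng) + 1))

random.seed(0)
pts = uniform_pattern(5)
ints = integer_pattern(3.0)
print(len(pts))        # 5: the points are almost surely distinct
print(sorted(ints))    # the points 1, ..., Z for the sampled Z
```

Both patterns are finite sets, hence locally finite, so they qualify as point patterns in the sense above.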

which returns the point count of X restricted to the set B ⊂ S.

Example 7.3. The counting measure of the random point pattern X in Example 7.1 can be written as

    N(B) = Σ_(i=1)^n 1(U_i ∈ B),   B ⊂ (0, 1),

where the indicator of the event {U_i ∈ B} is defined by

    1(U_i ∈ B) = { 1, if U_i ∈ B,
                 { 0, else.

Time instants related to a random phenomenon under study can be modeled as random point patterns on R_+. In this case the point count on the interval [0, t] is often briefly denoted by N(t) = N([0, t]), and the random function t ↦ N(t) is called the counting process (laskuriprosessi) of the point pattern X. The definition implies that the point count of X in an interval (s, t] can be expressed as

    |X ∩ (s, t]| = N((s, t]) = N(t) − N(s).

7.3 Independent scattering

A random point pattern X is independently scattered (riippumattomasti sironnut) if the random variables N(A_1), ..., N(A_m) are independent whenever the sets A_1, ..., A_m are disjoint. In this case information about the points of X within a set A is irrelevant when predicting how the point pattern behaves outside A. Independent scattering is indeed a very restrictive assumption, which only few point patterns satisfy.

Example 7.4. Is the point pattern X = {U_1, ..., U_n} of Example 7.1 independently scattered? By dividing the open unit interval into A_1 = (0, 1/2] and A_2 = (1/2, 1), we see that

    P(N(A_1) = 0) = P(U_1 > 1/2, ..., U_n > 1/2) = (1/2)^n.

On the other hand,

    P(N(A_1) = 0 | N(A_2) = n) = 1,

because by definition the equation N(A_1) + N(A_2) = n surely holds. This shows that X is not independently scattered.
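The counting measure and the computation in Example 7.4 can be checked by simulation. In the sketch below (function names are illustrative), the empirical frequency of the event N((0, 1/2]) = 0 is compared with the exact value (1/2)^n.

```python
import random

def count(points, a, b):
    # Counting measure N((a, b]): number of points of the pattern in (a, b].
    return sum(1 for x in points if a < x <= b)

random.seed(1)
n, trials, hits = 4, 100_000, 0
for _ in range(trials):
    points = [random.random() for _ in range(n)]
    if count(points, 0.0, 0.5) == 0:
        hits += 1

freq = hits / trials
print(freq)   # close to (1/2)^4 = 0.0625
```

By contrast, on every run where all n points fall in A_2 = (1/2, 1) the count in A_1 is necessarily zero, which is the conditional probability of one used in Example 7.4.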

The following important result characterizes how independent scattering, an intrinsically algebraic property, automatically yields a quantitative description of the distribution of point counts of the random point pattern. The result also underlines the central role of the Poisson distribution as a universal distribution describing point counts of independently scattered point patterns.

A random point pattern X on R_+ is homogeneous (tasakoosteinen) if its counting measure satisfies³

    N(A + t) =_st N(A)

for all A ⊂ R_+ and all t ≥ 0, where A + t = {a + t : a ∈ A}. The intensity (intensiteetti) of a homogeneous random point pattern is the expected point count E N((0, 1]) on the unit interval (0, 1].

Theorem 7.5. Let X be a homogeneous independently scattered random point pattern on R_+ with intensity 0 < λ < ∞. Then the point count of X in the interval [0, t] is Poisson distributed with mean λt, so that

    P(N(t) = k) = e^(−λt) (λt)^k / k!,   k = 0, 1, 2, ...

Proof. Denote by v(t) = P(N((0, t]) = 0) the probability that there are no points of X in the interval (0, t]. Because N((0, s + t]) = 0 precisely when N((0, s]) = 0 and N((s, s + t]) = 0, we see that

    v(s + t) = P(N((0, s + t]) = 0)
             = P(N((0, s]) = 0, N((s, s + t]) = 0)
             = P(N((0, s]) = 0) P(N((s, s + t]) = 0)
             = P(N((0, s]) = 0) P(N((0, t]) = 0)
             = v(s) v(t).

Because v is a nonincreasing function, this implies (Exercise) that

    v(t) = e^(−αt)                                                  (7.1)

for some α ≥ 0. Moreover, α > 0, because in the case α = 0 the point pattern would be empty with probability one (Exercise ??), which would conflict with the assumption λ = E N((0, 1]) > 0. Analogously we may conclude that α < ∞, because α = ∞ would conflict with the assumption λ = E N((0, 1]) < ∞.

Let us next inspect the probability of the event N(t) = k for some particular t > 0 and integer k ≥ 0. Choose a large number n ≥ k and divide the interval (0, t] into equally sized subintervals

    I_(n,j) = ((j − 1)t/n, jt/n],   j = 1, ..., n.

Denote

    θ_j = 1(N(I_(n,j)) > 0) = { 1, if N(I_(n,j)) > 0,
                              { 0, else.
³ In these lecture notes X =_st Y means that X and Y are equal in distribution, that is, P(X ∈ B) = P(Y ∈ B) for all B.

Then Z_n = θ_1 + ··· + θ_n is the number of subintervals which contain points of X. Denote by Ω_n the event that each subinterval contains at most one point. When the event Ω_n occurs, we have N(t) = Z_n, which implies that

    P(N(t) = k) = P(Z_n = k) + ɛ_n,                                 (7.2)

where ɛ_n = P(N(t) = k, Ω_n^c) − P(Z_n = k, Ω_n^c). Because the indicator variables θ_1, ..., θ_n are independent (due to independent scattering) and each takes the value one with probability q_n = 1 − v(t/n), we find that Z_n follows the binomial Bin(n, q_n) distribution:

    P(Z_n = k) = (n choose k) q_n^k (1 − q_n)^(n−k),   k = 0, ..., n.

By equation (7.1) and l'Hôpital's rule we see that

    n q_n = n(1 − e^(−αt/n)) = (1 − e^(−αt/n)) / (1/n) → αt   as n → ∞.

By the law of small numbers (Lemma 7.6) this allows us to conclude that

    P(Z_n = k) → e^(−αt) (αt)^k / k!   as n → ∞.                    (7.3)

Because |ɛ_n| ≤ 2 P(Ω_n^c) → 0 by Lemma 7.7, and because the probability of the event N(t) = k does not depend on n, we see from (7.2) and (7.3) that

    P(N(t) = k) = e^(−αt) (αt)^k / k!.

Therefore N(t) is Poisson distributed with mean αt. Especially, E(N(t)) = αt, which shows that α = λ = E N((0, 1]).

Lemma 7.6 (Law of small numbers). Let Z_n be a Bin(n, q_n)-distributed random integer, and assume that n q_n → α ∈ (0, ∞) as n → ∞. Then

    lim_(n→∞) P(Z_n = k) = e^(−α) α^k / k!

for all k ≥ 0.

Proof. By the definition of the Bin(n, q_n) distribution we find that

    P(Z_n = k) = (n! / (k! (n − k)!)) q_n^k (1 − q_n)^(n−k)
               = (n! / (n^k (n − k)!)) · (1 / (1 − q_n)^k) · ((n q_n)^k / k!) · (1 − n q_n / n)^n.   (7.4)

Let us analyze the right side of the above equation as n → ∞. The first term on the right side of (7.4) satisfies

    n! / (n^k (n − k)!) = (1/n^k) ∏_(j=0)^(k−1) (n − j) = ∏_(j=0)^(k−1) (1 − j/n) → 1.

Because q_n → 0, the second term on the right side of (7.4) also satisfies

    1 / (1 − q_n)^k → 1.

Furthermore, the assumption n q_n → α implies that the third term on the right of (7.4) scales as

    (n q_n)^k / k! → α^k / k!.

Hence the claim follows after verifying that

    lim_(n→∞) (1 − n q_n / n)^n = e^(−α).                           (7.5)

The limit (7.5) can be justified as follows. Choose a small ɛ > 0 and select n_0 so large that α − ɛ ≤ n q_n ≤ α + ɛ for all n ≥ n_0. Then for all n ≥ n_0,

    (1 − (α + ɛ)/n)^n ≤ (1 − n q_n / n)^n ≤ (1 − (α − ɛ)/n)^n.

By applying the formula (1 + x/n)^n → e^x (which is often taken as the definition of the exponential function), we see that the lower bound above converges to e^(−α−ɛ) and the upper bound to e^(−α+ɛ). Because these limiting bounds are valid for an arbitrarily small ɛ > 0, equation (7.5) follows.

Lemma 7.7. Let X be a random point pattern on an interval S ⊂ R with counting measure N. Let us divide the real axis into intervals I_(n,j) = ((j − 1)/n, j/n] of length 1/n, indexed by j ∈ Z. Then for any interval A ⊂ S such that E(N(A)) < ∞,

    P( N(A ∩ I_(n,j)) ≤ 1 for all j ∈ Z ) → 1   as n → ∞.

Proof. Define the random number D = min{ |x − y| : x, y ∈ X ∩ A, x ≠ y } as the smallest interpoint distance of the point pattern restricted to A. When D > 1/n, any two points of X ∩ A are separated by a gap wider than 1/n, so that every interval I_(n,j) can contain at most one point of X ∩ A. Therefore,

    Z_n := sup_j N(A ∩ I_(n,j)) = sup_j |X ∩ A ∩ I_(n,j)| ≤ 1

on the event D > 1/n. The assumption E(N(A)) < ∞ implies that the set X ∩ A is finite with probability one. Hence D > 0 with probability one, and the above inequality shows that lim_(n→∞) 1(Z_n ≤ 1) = 1 with probability one. Now by applying Lebesgue's dominated convergence theorem to justify interchanging the limit and the expectation below, it follows that

    lim_(n→∞) P(Z_n ≤ 1) = lim_(n→∞) E(1(Z_n ≤ 1)) = E( lim_(n→∞) 1(Z_n ≤ 1) ) = 1.

7.4 Poisson process

A random function N : R_+ → Z_+ is a Poisson process (Poisson-prosessi) with intensity λ if

(i) N(t) − N(s) =_st Poi(λ(t − s)) for all (s, t] ⊂ R_+, and
(ii) N has independent increments, in the sense that N(t_1) − N(s_1), ..., N(t_n) − N(s_n) are independent whenever the intervals (s_1, t_1], ..., (s_n, t_n] ⊂ R_+ are disjoint.

The above random function t ↦ N(t) is hence a continuous-time stochastic process with the countable state space Z_+. Theorem 7.5 can now be rephrased as follows.

Theorem 7.8. The counting process N(t) = N([0, t]) of a homogeneous independently scattered random point pattern is a Poisson process with intensity λ = E N((0, 1]).

7.5 Constructing independently scattered point patterns

Do independently scattered point patterns exist? Let us construct one. Define first the random numbers T_1, T_2, ... by the formula

    T_n = τ_1 + ··· + τ_n,   n ≥ 1,

where τ_1, τ_2, ... are independent and identically distributed positive random numbers. Figure 7.1 shows a point pattern constructed in this way together with a corresponding counting process.

[Figure 7.1: A point pattern simulated using the method in Theorem 7.9 and a corresponding Poisson process path N(t) on the time interval (0, 10].]

Theorem 7.9. If the interpoint distances τ_1, τ_2, ... are exponentially distributed with rate parameter λ, then the point pattern X = {T_1, T_2, ...} is homogeneous and independently scattered, and the corresponding counting process

    N(t) = |X ∩ (0, t]| = |{k ≥ 1 : T_k ≤ t}|

is a Poisson process with intensity λ.

Proof. See [Kal02, Proposition 12.15] for a detailed proof.
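Theorem 7.9 can be checked empirically: over repeated simulations, the count N(t) should have both mean and variance close to λt, as a Poisson(λt) variable must. A minimal sketch (the rate, horizon, and run count are illustrative choices):

```python
import random

def poisson_count(rate, t, rng):
    # N(t) = number of partial sums T_k = tau_1 + ... + tau_k with T_k <= t,
    # where the gaps tau_k are i.i.d. Exp(rate).
    total, k = rng.expovariate(rate), 0
    while total <= t:
        k += 1
        total += rng.expovariate(rate)
    return k

rng = random.Random(42)
rate, t, runs = 1.5, 4.0, 50_000
counts = [poisson_count(rate, t, rng) for _ in range(runs)]
mean = sum(counts) / runs
var = sum((c - mean) ** 2 for c in counts) / runs
print(round(mean, 2), round(var, 2))   # both close to rate * t = 6.0
```

Equality of mean and variance is a quick diagnostic for Poisson counts; for an over- or under-dispersed point pattern the two would differ.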

Bibliography

[Asm03] Søren Asmussen. Applied Probability and Queues. Springer, second edition, 2003.

[BP98] Sergey Brin and Larry Page. The anatomy of a large-scale hypertextual web search engine. In 7th International World-Wide Web Conference (WWW 1998), 1998.

[Kal02] Olav Kallenberg. Foundations of Modern Probability. Springer, second edition, 2002.

[Kul11] Vidyadhar G. Kulkarni. Introduction to Modeling and Analysis of Stochastic Systems. Springer, second edition, 2011.

[LPW08] David A. Levin, Yuval Peres, and Elizabeth L. Wilmer. Markov Chains and Mixing Times. American Mathematical Society, http://pages.uoregon.edu/dlevin/markov/, 2008.

[Ros95] Sheldon M. Ross. Stochastic Processes. Wiley, second edition, 1995.

[SW08] Rolf Schneider and Wolfgang Weil. Stochastic and Integral Geometry. Springer, Berlin, 2008.

[Wil91] David Williams. Probability with Martingales. Cambridge University Press, 1991.