Stochastic Processes: Poisson Process

1 Stochastic Processes: Poisson Process. Giovanni Pistone. Revised June 9.

2 Plan

1. Poisson Process (Formal construction)
2. Wiener Process (Formal construction)
3. Infinitely divisible distributions (Lévy-Khinchin formula)
4. Lévy processes (Generalities)
5. Stochastic analysis of Lévy processes (Generalities)

3 References

Billingsley: Patrick Billingsley, Probability and Measure, 2nd ed., Wiley Series in Probability and Mathematical Statistics, John Wiley & Sons, New York, 1986.
Brémaud: Pierre Brémaud, An Introduction to Probabilistic Modeling, Undergraduate Texts in Mathematics, Springer, New York, 1997.
DD1: Didier Dacunha-Castelle and Marie Duflo, Probabilités et statistiques. Tome 1: Problèmes à temps fixe, Collection Mathématiques Appliquées pour la Maîtrise, Masson, Paris, 1982.
DDG: Didier Dacunha-Castelle, Marie Duflo, and Valentine Genon-Catalot, Exercices de probabilités et statistiques. Tome 2: Problèmes à temps mobile, Collection Mathématiques Appliquées pour la Maîtrise, Masson, 1984.
Pintacuda: Nicolò Pintacuda, Probabilità, Decibel-Zanichelli, Padova, 1994.
R: R Core Team, R: A Language and Environment for Statistical Computing, R Foundation for Statistical Computing, Vienna, Austria, 2014.
Ross: Sheldon M. Ross, Introduction to Probability Models, 10th ed., Academic Press, San Diego, CA, 2010.
Sage: W. A. Stein et al., Sage Mathematics Software (Version 5.9), The Sage Development Team, 2013.
Williams: David Williams, Probability with Martingales, Cambridge Mathematical Textbooks, Cambridge University Press, Cambridge, 1991.

4 Random variables

Random variable. Let $(S, \mathcal{S})$ and $(R, \mathcal{B})$ be measurable spaces, and $X \colon S \to R$ a function. The function $X$ is an $(R, \mathcal{B})$-random variable if $X^{-1}(\mathcal{B}) \subset \mathcal{S}$. If $\mathcal{C} \subset \mathcal{B}$ is a generating set of $\mathcal{B}$ and $X^{-1}(\mathcal{C}) \subset \mathcal{S}$, then $X$ is a random variable. In fact, let $\mathcal{E} = \{B \in \mathcal{B} \mid X^{-1}(B) \in \mathcal{S}\}$. Then $\mathcal{E}$ is a sub-$\sigma$-algebra of $\mathcal{B}$ and $\mathcal{C} \subset \mathcal{E}$, hence $\mathcal{E} = \mathcal{B}$; see [Williams 3.2.b].

The $\sigma$-algebra generated by $X$ is $X^{-1}(\mathcal{B})$. From $\mathcal{C} \subset \mathcal{B}$ we have $X^{-1}(\mathcal{C}) \subset X^{-1}(\mathcal{B})$, hence $\sigma(X^{-1}(\mathcal{C})) \subset X^{-1}(\mathcal{B}) = X^{-1}(\sigma(\mathcal{C}))$. Let $\mathcal{E} = \{B \in \mathcal{B} \mid X^{-1}(B) \in \sigma(X^{-1}(\mathcal{C}))\}$. As $\mathcal{E}$ is a $\sigma$-algebra containing $\mathcal{C}$ and contained in $\mathcal{B}$, then $\mathcal{E} = \mathcal{B}$ and $X^{-1}(\mathcal{B}) = \sigma(X^{-1}(\mathcal{C}))$.

Checking measurability. Let $(S, \mathcal{S})$ be a measurable space and $X \colon S \to R$. If $\mathcal{C}$ is a family of subsets of $R$ and $X^{-1}(\mathcal{C}) \subset \mathcal{S}$, then
1. $X$ is a random variable into $(R, \sigma(\mathcal{C}))$;
2. $\sigma(X) = X^{-1}(\sigma(\mathcal{C})) = \sigma(X^{-1}(\mathcal{C}))$.
In particular, if $S \subset R$ and $X \colon S \ni x \mapsto x \in R$ is the inclusion map, then $X^{-1}(\mathcal{C}) = \mathcal{C} \cap S = \{C \cap S \mid C \in \mathcal{C}\}$.

5 2. Stochastic Process

1. Trajectory: definition. Let $I$ be a real interval, the set of times, and let $D$ be a set of real functions $u \colon I \to \mathbb{R}$, the set of trajectories. Given $t \in I$, the mapping $\Pi_t \colon D \to \mathbb{R}$ defined by $\Pi_t(u) = u(t)$ is the evaluation of the trajectory at $t$. Let $\mathcal{D} = \sigma(\Pi_t : t \in I)$ be the $\sigma$-algebra generated by the evaluations. The measurable space $(D, \mathcal{D})$ is a space of trajectories.

2. Stochastic process: definition. Given a probability space $(\Omega, \mathcal{F}, P)$ and a space of trajectories $(D, \mathcal{D})$, a stochastic process on $(\Omega, \mathcal{F}, P)$ with trajectories in $D$ is a random variable $X \colon \Omega \to D$, i.e. $X^{-1}(\mathcal{D}) \subset \mathcal{F}$. The distribution of $X$ is $P \circ X^{-1}$.

3. Stochastic process: distribution.
1. If $t_1, \dots, t_n \in I$, then $(X_{t_1}, \dots, X_{t_n})$ is a random variable in $\mathbb{R}^n$ whose distribution is called the finite-dimensional distribution at $t_1, \dots, t_n \in I$.
2. The finite-dimensional distributions characterize the distribution of the stochastic process.

6 E1

Check. Use [Williams 3]: Def. 3.1 applies to a general range, not just to real-valued functions. The family of events of the form $\{\omega \in S \mid X_{t_1} \in B_1, \dots, X_{t_n} \in B_n\}$, for $B_1, \dots, B_n \in \mathcal{B}$, is a $\pi$-system that generates $X^{-1}(\mathcal{D})$. Note that it is enough to take the $B$'s in a $\pi$-system for $\mathbb{R}$.

E2

Consider the polynomials $l_0(x) = 1$, $l_1(x) = \sqrt{3}\,(2x-1)$, $l_2(x) = \sqrt{5}\,(6x^2-6x+1)$; check the orthonormality on $[0,1]$. Define $L_i(t) = \int_0^t l_i(x)\,dx$. Given $Z_0, Z_1, Z_2$ iid $N(0,1)$, define the stochastic process $W^{(2)} = L_0 Z_0 + L_1 Z_1 + L_2 Z_2$. Compute $\operatorname{Cov}\big(W^{(2)}_s, W^{(2)}_t\big)$ and the marginal distribution of $\big(W^{(2)}_s, W^{(2)}_t\big)$.

7 Note that $W^{(2)}$ is a stochastic process whose trajectories are polynomials of degree up to 3, and $E\big(W^{(2)}_t\big) = E(L_0(t)Z_0 + L_1(t)Z_1 + L_2(t)Z_2) = 0$. Orthonormality on $[0,1]$:

$\int_0^1 l_0(x) l_0(x)\,dx = \int_0^1 1^2\,dx = 1$
$\int_0^1 l_1(x) l_1(x)\,dx = \int_0^1 \big(\sqrt{3}(2x-1)\big)^2\,dx = 3 \int_0^1 (4x^2 - 4x + 1)\,dx = 1$
$\int_0^1 l_0(x) l_1(x)\,dx = \int_0^1 \sqrt{3}(2x-1)\,dx = 0$
$\int_0^1 l_2(x) l_2(x)\,dx = \int_0^1 \big(\sqrt{5}(6x^2-6x+1)\big)^2\,dx = 5 \int_0^1 (36x^4 - 72x^3 + 48x^2 - 12x + 1)\,dx = 1$
$\int_0^1 l_0(x) l_2(x)\,dx = \int_0^1 \sqrt{5}(6x^2-6x+1)\,dx = 0$
$\int_0^1 l_1(x) l_2(x)\,dx = \int_0^1 \sqrt{3}(2x-1)\,\sqrt{5}(6x^2-6x+1)\,dx = 0$

Hence, with $n = 2$,
$\operatorname{Cov}\big(W^{(2)}_s, W^{(2)}_t\big) = E\big((L_0(s)Z_0 + L_1(s)Z_1 + L_2(s)Z_2)(L_0(t)Z_0 + L_1(t)Z_1 + L_2(t)Z_2)\big) = \sum_{i,j=0}^{n} L_i(s) L_j(t)\, E(Z_i Z_j) = \sum_{i=0}^{n} L_i(s) L_i(t)$
and
$\big(W^{(2)}_s, W^{(2)}_t\big) \sim N\left(0, \begin{bmatrix} \sum_{i=0}^{n} L_i^2(s) & \sum_{i=0}^{n} L_i(s)L_i(t) \\ \sum_{i=0}^{n} L_i(s)L_i(t) & \sum_{i=0}^{n} L_i^2(t) \end{bmatrix}\right).$
To be continued in the chapter on the Wiener process.
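A minimal R sketch of E2 (my addition, not in the original slides; the names L0, L1, L2, W are illustrative): it simulates many trajectories of $W^{(2)}$ at two fixed times and compares the empirical covariance with $\sum_i L_i(s) L_i(t)$.

# Integrated polynomials L_i(t) = int_0^t l_i(x) dx, in closed form
L0 <- function(t) t
L1 <- function(t) sqrt(3) * (t^2 - t)
L2 <- function(t) sqrt(5) * (2*t^3 - 3*t^2 + t)
M <- 10000                            # number of simulated trajectories
Z <- matrix(rnorm(3 * M), ncol = 3)   # iid N(0,1) coefficients Z_0, Z_1, Z_2
W <- function(t) Z[, 1]*L0(t) + Z[, 2]*L1(t) + Z[, 3]*L2(t)
s <- 0.3; t <- 0.7
cov(W(s), W(t))                           # empirical covariance
L0(s)*L0(t) + L1(s)*L1(t) + L2(s)*L2(t)   # theoretical sum_i L_i(s) L_i(t)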

8 Exponential distribution

The exponential distribution $\mathrm{Exp}(\lambda)$ has support $[0, +\infty[$ and is a model for random times. The parameter $\lambda > 0$ is the intensity. The positive random variable $X$ has exponential distribution with intensity $\lambda$ ($\lambda > 0$) if its density is $f_X(x) = \lambda \exp(-\lambda x)\,(x > 0)$. The survival function is 1 if $x < 0$ and, for $x \ge 0$, its value is $R_X(x) = \exp(-\lambda x)$. The distribution function is 0 if $x < 0$ and, for $x \ge 0$, its value is $F_X(x) = 1 - R_X(x) = 1 - \exp(-\lambda x)$. The quantile function, defined by $\{x \mid F_X(x) \ge u\} = \{x \mid Q_X(u) \le x\}$, $0 < u < 1$, is $Q_X(u) = -\frac{1}{\lambda}\ln(1-u)$, $u \in\, ]0,1[$. The expected value is $E(X) = \int_0^{+\infty} R_X(x)\,dx = \int_0^{+\infty} \exp(-\lambda x)\,dx = \frac{1}{\lambda}$. The moment generating function is $M_X(t) = E(\exp(tX)) = \frac{\lambda}{\lambda - t}$ if $t < \lambda$. The variance is $\operatorname{Var}(X) = \frac{1}{\lambda^2}$. If $X_1, \dots, X_n$ are iid $\mathrm{Exp}(\lambda)$, then $T = X_1 + \dots + X_n$ is $\Gamma(n, \lambda)$, with density $f_T(t) = \frac{\lambda^n t^{n-1}}{(n-1)!} e^{-\lambda t}$ $(t > 0)$. [If $p$ is a proposition, then $(p)$ is its truth value, i.e. the indicator function.]
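A quick numerical check of these facts in R (my addition, not in the slides; base R parametrizes the exponential and gamma distributions by rate = $\lambda$):

lambda <- 2
x <- rexp(1e5, rate = lambda)                  # iid sample from Exp(lambda)
c(mean(x), 1/lambda)                           # E(X) = 1/lambda
c(var(x), 1/lambda^2)                          # Var(X) = 1/lambda^2
u <- 0.9
c(qexp(u, rate = lambda), -log(1 - u)/lambda)  # quantile function
n <- 5
Tn <- replicate(1e4, sum(rexp(n, rate = lambda)))
ks.test(Tn, "pgamma", shape = n, rate = lambda)  # T = X_1+...+X_n is Gamma(n, lambda)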

9 The exponential distribution is memory-less

Easy version. Let $X \sim \mathrm{Exp}(\lambda)$. For each $a, t \ge 0$, $P(X > t + a \mid X > a) = P(X > t)$.

More sophisticated version. Let $X \sim \mathrm{Exp}(\lambda)$ and let $\mathcal{G}$ be a $\sigma$-algebra independent of $X$. If $A$ and $B$ are positive $\mathcal{G}$-measurable random variables, then for all $t > 0$, $E(B\,(X > t + A)) = P(X > t)\, E(B\,(X > A))$.

Proofs. Independence implies (see [Williams 8.3-4])
$E(B\,(X > t + A)) = E\left(B \int_0^{+\infty} (x > t + A)\, \lambda e^{-\lambda x}\,dx\right) = E\big(B e^{-\lambda(t+A)}\big) = e^{-\lambda t}\, E\big(B e^{-\lambda A}\big) = P(X > t)\, E(B\,(X > A)).$
The easy version is the case $A = a$, $B = 1$, $\mathcal{G}$ trivial.
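A one-line Monte Carlo check of the easy version (my addition):

lambda <- 1; a <- 0.5; t <- 1.2
x <- rexp(1e6, lambda)
mean(x[x > a] > t + a)   # P(X > t + a | X > a)
mean(x > t)              # P(X > t): equal by memorylessness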

10 E3

Same assumptions: $E(B\,(X > t + A) \mid \mathcal{G}) = B e^{-\lambda(t+A)}$. Use [Williams 9.10].

E4

Compute $E((X > t) \mid \sigma(\{X > a\} : a \le s))$, $s \le t$.

Let $(S, \mathcal{S}, P)$ be the probability space where $X$ is defined and let $X_t = (X \le t)$ be the stochastic process that has a unit jump at the random time $X$. For each time $t$, the $\sigma$-algebra $\sigma(X_t)$ is $\{\emptyset, S, \{X \le t\}, \{X > t\}\}$, with generator $\{X > t\}$. The $\sigma$-algebra $\mathcal{F}_s = \sigma(X_a : a \le s)$ has generators $\mathcal{C} = \{\{X > a\} \mid a \le s\}$, which is a $\pi$-system as $\{X > a_1\} \cap \{X > a_2\} = \{X > \max(a_1, a_2)\}$; see [Williams 1.6]. A bounded random variable $B$ is $\mathcal{F}_s$-measurable if, and only if, $B = g(X)(X \le s) + c\,(X > s)$, with $g$ Borel bounded and $c$ constant. In fact, the class $\mathcal{H} = \{g(X)(X \le s) + c\,(X > s) \mid g \in \mathcal{B}_b,\ c \in \mathbb{R}\}$ is a monotone class containing the indicators of $\mathcal{C}$ because $(X > a) = (a < X \le s) + (X > s)$; see [Williams 3.14].

Let us compute $g$ and $c$ such that $E((X > t) \mid \mathcal{F}_s) = g(X)(X \le s) + c\,(X > s)$; see [Williams 9.2]. For all $a \le s$, $E((X > a)(X > t)) = E\big((X > a)(g(X)(X \le s) + c\,(X > s))\big)$, hence $\exp(-\lambda t) = E((a < X \le s)\, g(X)) + c \exp(-\lambda s)$, $a \le s$. If $a = s$, $\exp(-\lambda t) = c \exp(-\lambda s)$, then $c = \exp(-\lambda(t-s))$. It follows that
$E((a < X \le s)\, g(X)) = \int_a^s g(x)\, \lambda e^{-\lambda x}\,dx = 0, \quad a \le s,$
so that $g = 0$ and $E((X > t) \mid \mathcal{F}_s) = \exp(-\lambda(t-s))\,(X > s)$.

11 E5

Compute $E(X_t - X_s \mid \mathcal{F}_s)$, $s \le t$. From the previous computation,
$E(X_t - X_s \mid \mathcal{F}_s) = 1 - E((X > t) \mid \mathcal{F}_s) - X_s = 1 - \exp(-\lambda(t-s))\,(X > s) - X_s = 1 - \exp(-\lambda(t-s))\,(1 - X_s) - X_s = (\exp(-\lambda(t-s)) - 1)(X_s - 1) \ge 0,$
then $(X_t, \mathcal{F}_t)_{t \ge 0}$ is a sub-martingale; see [Williams 10.3].

E6

Show that $A = \lambda \int_0^{\cdot} (1 - X_u)\,du$ is an absolutely continuous compensator of $X$, that is, a process with absolutely continuous trajectories and such that $M = X - A$ is a martingale [DD1]. Having checked that it is correct to exchange integration with conditional expectation,
$E(A_t - A_s \mid \mathcal{F}_s) = E\left(\lambda \int_s^t (1 - X_u)\,du \,\Big|\, \mathcal{F}_s\right) = \lambda \int_s^t E((X > u) \mid \mathcal{F}_s)\,du = \lambda \int_s^t e^{-\lambda(u-s)}\,(X > s)\,du = \left(\lambda \int_s^t e^{-\lambda(u-s)}\,du\right)(X > s) = \big(1 - e^{-\lambda(t-s)}\big)(X > s) = E(X_t - X_s \mid \mathcal{F}_s),$
so that $E(X_t - A_t \mid \mathcal{F}_s) = X_s - A_s$.
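A sanity check in R (my sketch): since $A_t = \lambda \int_0^t (1 - X_u)\,du = \lambda\,(X \wedge t)$, the martingale property of $M = X - A$ implies $E(X_t) = E(A_t) = 1 - e^{-\lambda t}$ for every $t$.

lambda <- 1.5; t <- 2
x <- rexp(1e6, lambda)
mean(x <= t)                 # E(X_t) = P(X <= t)
lambda * mean(pmin(x, t))    # E(A_t), with A_t = lambda * min(X, t)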

12 E7

Compute $E(\varphi(X_t) \mid \mathcal{F}_s)$ to show that the process $X$ is a Markov process. Compute the transitions. As $\varphi(X_t) = \varphi(0)(X > t) + \varphi(1)(X \le t) = \varphi(1) + (\varphi(0) - \varphi(1))(X > t)$, the conditional expectation is
$E(\varphi(X_t) \mid \mathcal{F}_s) = \varphi(1) + (\varphi(0) - \varphi(1))\, E((X > t) \mid \mathcal{F}_s) = \varphi(1) + (\varphi(0) - \varphi(1))\, e^{-\lambda(t-s)}\,(X > s) = P_{s,t}\varphi(X_s),$
with
$P_{s,t}\varphi(x) = \begin{cases} \varphi(0)\, e^{-\lambda(t-s)} + \varphi(1)\big(1 - e^{-\lambda(t-s)}\big) & \text{if } x = 0, \\ \varphi(1) & \text{if } x = 1. \end{cases}$
The transitions are computed as
$P(X_t = y \mid X_s = x) = \frac{P(X_s = x, X_t = y)}{P(X_s = x)} = \frac{E((X_s = x)(X_t = y))}{P(X_s = x)} = \frac{E\big((X_s = x)\, E((X_t = y) \mid \mathcal{F}_s)\big)}{P(X_s = x)}$
and the previous formula with $\varphi = 1_y$.

E8

If $X_1, X_2, \dots$ is an infinite sequence of independent random variables with common distribution $\mathrm{Exp}(\lambda)$ and $T_0 = 0$, $T_1 = X_1$, $T_2 = X_1 + X_2$, $T_3 = X_1 + X_2 + X_3, \dots$, i.e., $T_0 = 0$ and $T_n = T_{n-1} + X_n$, $n \ge 1$, then $T_n \to +\infty$ for $n \to \infty$ a.s.

13 For all $a > 0$,
$P(T_n \le a) = \int_0^a \lambda\, \frac{(\lambda t)^{n-1}}{(n-1)!}\, e^{-\lambda t}\,dt \longrightarrow 0, \quad n \to \infty,$
as $(\lambda t)^{n-1}/(n-1)!$ is decreasing in $n$ if $n \ge \lambda a$. Cf. [Williams 12.5].
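The statement can be checked numerically, since $T_n \sim \Gamma(n, \lambda)$ (my addition):

lambda <- 1; a <- 10
n <- c(5, 10, 20, 40, 80)
pgamma(a, shape = n, rate = lambda)   # P(T_n <= a) decreases to 0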

14 11. Poisson process: construction

Definition. Let $X_1, X_2, \dots$ be iid $\mathrm{Exp}(\lambda)$ on the probability space $(S, \mathcal{S}, P)$, and let $T_0 = 0$ and $T_n = T_{n-1} + X_n$, $n \ge 1$. Let $(D, \mathcal{D})$ be the space of trajectories on the set of times $[0, +\infty[$ which are 0 at $t = 0$ and take the value $n = 0, 1, 2, \dots$ on the interval $[t_n, t_{n+1}[$, where $t_0 < t_1 < t_2 < \dots$ and $t_n \to \infty$ as $n \to \infty$. The Poisson process with jump times $(T_n)_n$ is defined at each $\omega \in \{\lim_n T_n = +\infty\}$ as the trajectory in $(D, \mathcal{D})$ with jumps at $T_1(\omega), T_2(\omega), \dots$, that is,
$N_t(\omega) = \sum_n n\,(T_n(\omega) \le t < T_{n+1}(\omega)) = \#\{k \ge 1 \mid T_k(\omega) \le t\} = \sum_{n=1}^{\infty} (T_n(\omega) \le t).$
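A direct R transcription of the construction (my sketch; 100 inter-arrival times are more than enough to cover the times considered here):

lambda <- 1
Tn <- cumsum(rexp(100, lambda))   # jump times T_n = X_1 + ... + X_n
N <- function(t) sum(Tn <= t)     # N_t = number of jump times <= t
sapply(c(1, 5, 10), N)            # one trajectory evaluated at t = 1, 5, 10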

15 12. Poisson process: properties from Def (0)

1. Each $N_t$ is Poisson distributed with mean $\lambda t$: $P(N_t = n) = \frac{(\lambda t)^n}{n!} e^{-\lambda t}$.
2. Each increment $N_t - N_s$, $s < t$, is Poisson distributed with mean $\lambda(t-s)$: $P(N_t - N_s = n) = \frac{(\lambda(t-s))^n}{n!} e^{-\lambda(t-s)}$.
3. Disjoint increments are independent: for all natural $n$ and reals $t_1 < t_2 < \dots < t_n$, the random variables $N_{t_j} - N_{t_{j-1}}$, $j = 1, \dots, n$, are independent; hence
4. the sequence $N_{t_1}, N_{t_2}, \dots, N_{t_n}$ has the Markov property:
$E\big(\varphi(N_{t_n}) \mid N_{t_{n-1}}, N_{t_{n-2}}, \dots\big) = \sum_{k=0}^{\infty} \varphi(k + N_{t_{n-1}})\, \frac{(\lambda(t_n - t_{n-1}))^k}{k!}\, e^{-\lambda(t_n - t_{n-1})}.$
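Properties 1-2 can be checked by simulation (my sketch): the increment over $]s, t]$ should have mean and variance both equal to $\lambda(t-s)$.

lambda <- 2; s <- 1; t <- 3
incr <- replicate(1e4, {
  Tn <- cumsum(rexp(50, lambda))   # enough jumps to cover [0, t] w.h.p.
  sum(Tn <= t) - sum(Tn <= s)      # increment N_t - N_s
})
c(mean(incr), var(incr), lambda * (t - s))   # Poisson: mean = variance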

16 Proofs.

1. $P(N_t = n) = E((T_n \le t)(t - T_n < X_{n+1})) = E\big((T_n \le t)\exp(-\lambda(t - T_n))\big) = \int_0^t e^{-\lambda(t-x)}\, \frac{\lambda^n x^{n-1}}{(n-1)!}\, e^{-\lambda x}\,dx = \frac{(\lambda t)^n}{n!}\, e^{-\lambda t}.$

2.a $P(N_s = n, N_t = n) = E((T_n \le s)(t - T_n < X_{n+1})) = E\big((T_n \le s)\exp(-\lambda(t - T_n))\big) = \int_0^s e^{-\lambda(t-x)}\, \frac{\lambda^n x^{n-1}}{(n-1)!}\, e^{-\lambda x}\,dx = \frac{(\lambda s)^n}{n!}\, e^{-\lambda t}$, hence
$P(N_t - N_s = 0) = \sum_{n=0}^{\infty} P(N_s = n, N_t = n) = \sum_{n=0}^{\infty} \frac{(\lambda s)^n}{n!}\, e^{-\lambda t} = e^{-\lambda(t-s)} = \mathrm{Poisson}(0; \lambda(t-s)).$

2.b $P(N_s = n-1, N_t = n) = E\big((T_{n-1} \le s < T_n \le t)(t - T_n < X_{n+1})\big) = E\big((T_{n-1} \le s < T_n \le t)\exp(-\lambda(t - T_n))\big)$
$= e^{-\lambda t}\, E\big((T_{n-1} \le s)\, e^{\lambda T_{n-1}}\,(s - T_{n-1} < X_n \le t - T_{n-1})\, e^{\lambda X_n}\big) = e^{-\lambda t}\, E\left((T_{n-1} \le s)\, e^{\lambda T_{n-1}} \int_{s - T_{n-1}}^{t - T_{n-1}} e^{\lambda x}\, \lambda e^{-\lambda x}\,dx\right)$
$= \lambda(t-s)\, e^{-\lambda t}\, E\big((T_{n-1} \le s)\, e^{\lambda T_{n-1}}\big) = \lambda(t-s)\, e^{-\lambda t} \int_0^s e^{\lambda x}\, \frac{\lambda^{n-1} x^{n-2}}{(n-2)!}\, e^{-\lambda x}\,dx = \lambda(t-s)\, \frac{(\lambda s)^{n-1}}{(n-1)!}\, e^{-\lambda t},$
hence $P(N_t - N_s = 1) = \sum_{n=1}^{\infty} P(N_s = n-1, N_t = n) = \sum_{n=1}^{\infty} \lambda(t-s)\, \frac{(\lambda s)^{n-1}}{(n-1)!}\, e^{-\lambda t} = \lambda(t-s)\, e^{-\lambda(t-s)} = \mathrm{Poisson}(1; \lambda(t-s)).$

2.c $P(N_s = n-k, N_t = n) = E\big((T_{n-k} \le s < T_{n-k+1})(T_n \le t)(t - T_n < X_{n+1})\big) = E\big((T_{n-k} \le s < T_{n-k+1})(T_n \le t)\exp(-\lambda(t - T_n))\big)$
$= e^{-\lambda t}\, E\big((T_{n-k} \le s < T_{n-k+1})(T_n \le t)\, e^{\lambda T_n}\big) = e^{-\lambda t}\, E\big((T_{n-k} \le s < T_{n-k+1})\, e^{\lambda T_{n-k+1}}\,(X_{n-k+2} + \dots + X_n \le t - T_{n-k+1})\, e^{\lambda(X_{n-k+2} + \dots + X_n)}\big)$
$= e^{-\lambda t}\, E\left((T_{n-k} \le s)(s < T_{n-k+1} \le t)\, e^{\lambda T_{n-k+1}} \int_0^{t - T_{n-k+1}} e^{\lambda y}\, \frac{\lambda^{k-1} y^{k-2}}{(k-2)!}\, e^{-\lambda y}\,dy\right)$
$= \frac{\lambda^{k-1}}{(k-1)!}\, e^{-\lambda t}\, E\big((T_{n-k} \le s)(s < T_{n-k+1} \le t)\, e^{\lambda T_{n-k+1}}\,(t - T_{n-k+1})^{k-1}\big).$
Proceeding as in 2.b, one obtains $P(N_s = n-k, N_t = n) = \frac{(\lambda(t-s))^k}{k!}\, \frac{(\lambda s)^{n-k}}{(n-k)!}\, e^{-\lambda t}$, hence $P(N_t - N_s = k) = \mathrm{Poisson}(k; \lambda(t-s))$.

17 14. Poisson process: filtration

Let $N$ be a Poisson process with jump times $T_1, T_2, \dots$. We say that $N$ is the counting process of the sequence $T_1, T_2, \dots$. We have $\{N_t \ge n\} = \{T_n \le t\}$.

$\sigma$-algebras generated by a Poisson process.
1. The $\sigma$-algebra $\sigma(N_t)$ has the generating $\pi$-system $\{\{T_n \le t\} \mid n = 1, 2, \dots\}$.
2. The $\sigma$-algebra $\mathcal{F}_t = \sigma(N_s : s \le t)$ has generating $\pi$-system $\mathcal{C}_t = \{\{T_{n_1} \le s_1, \dots, T_{n_m} \le s_m\} \mid m \in \mathbb{N},\ n_1 \le \dots \le n_m,\ s_1 < \dots < s_m \le t\}$.
3. A random variable $Y$ is $\mathcal{F}_t$-measurable if, and only if, $Y = \sum_{k=0}^{\infty} G_k\,(N_t = k)$, where $G_k$ is a $\sigma(T_1, \dots, T_k)$-measurable random variable, $k \in \mathbb{Z}_+$.

18 15. Proofs for slide 14, items 1, 2

1. The sets $[a, +\infty[$, $a \in \mathbb{R}$, form a $\pi$-system for $(\mathbb{R}, \mathcal{B})$, hence $\{\{N_t \ge a\} \mid a \in \mathbb{R}\}$ is a $\pi$-system of $\sigma(N_t)$. As $N_t$ actually takes values in $\mathbb{Z}_+$, the system reduces to the sets of the form $\{N_t \ge n\} = \{T_n \le t\}$ for $n = 0, 1, 2, \dots$

2. The $\pi$-system for $\mathcal{F}_t$ contains the sets which are finite intersections of elements of the form $\{T_n \le s\}$ for all $n$ and $s \le t$, that is, for example, $\{T_{n_1} \le t_1, T_{n_2} \le t_2, \dots, T_{n_m} \le t_m\}$ for all $m$, $n_1, \dots, n_m \in \mathbb{N}$ and $t_1, \dots, t_m \in \mathbb{R}_+$. If we order the $t_i$'s in increasing order, consider the $\pi$-system property for each time, and the increasing order of the $T_n$'s, we obtain the stated subclass.

19 16. Proof for slide 14, item 3

We use the monotone class theorem [Williams 3.14]. Consider the set of random variables
$\mathcal{H} = \left\{\sum_{k=0}^{\infty} G_k\,(N_t = k) \,\Big|\, G_k \in L^{\infty}(\mathcal{G}_k),\ k \in \mathbb{Z}_+\right\}, \quad \text{with } \mathcal{G}_k = \sigma(T_1, \dots, T_k).$
It is a vector space, it contains the constants, and it is closed under increasing limits. In fact, on each element of the partition $\{N_t = k\}$, $k \in \mathbb{Z}_+$, it is a generic class of bounded random variables. Moreover, it is a ring, as
$\left(\sum_{k=0}^{\infty} G_k\,(N_t = k)\right)\left(\sum_{k=0}^{\infty} G'_k\,(N_t = k)\right) = \sum_{k=0}^{\infty} G_k G'_k\,(N_t = k).$
Each indicator function of the form $(T_n \le s)$, $n \in \mathbb{Z}_+$, $s \le t$, belongs to $\mathcal{H}$. In fact,
$(T_n \le s)(N_t = k) = (T_n \le s, T_k \le t, T_{k+1} > t) = \begin{cases} (T_n \le s)(N_t = k) & \text{if } n \le k, \\ 0 & \text{if } n > k, \end{cases}$
hence $(T_n \le s) = \sum_{k \ge n} (T_n \le s)(N_t = k) = \sum_k G_k\,(N_t = k)$, with $G_k = 0$ for $k < n$ and $G_k = (T_n \le s)$ for $k \ge n$. As $\mathcal{H}$ is closed under products, each element of the $\pi$-system $\mathcal{C}_t$ is included, and the result follows for bounded random variables. The general case is obtained by point-wise limits of bounded random variables.

20 17. Poisson process: conditioning

Conditioning on $\mathcal{F}_s$. Let $N$ be a Poisson process on $(S, \mathcal{S})$ with jump times $T_1, T_2, \dots$, and let a time $s \ge 0$ be given.
1. For each integrable random variable $Y$ there exists a sequence $Y_k = g_k(T_1, \dots, T_k)$, $k = 0, 1, \dots$, such that
$E(Y \mid \mathcal{F}_s) = \sum_{k=0}^{\infty} Y_k\,(N_s = k).$
2. Each $Y_n = g_n(T_1, \dots, T_n)$, $n = 0, 1, \dots$, is characterized by
$E(Y\,(N_s = n)\,G) = e^{-\lambda s}\, E\big(Y_n\,(T_n \le s)\, e^{\lambda T_n}\, G\big), \quad G \in L^{\infty}(T_1, \dots, T_n).$

21 18. Exercise

E9 Prove the previous theorem. See [Williams 9.2]. The conditional expectation $E(Y \mid \mathcal{F}_s)$ is a random variable measurable for $\mathcal{F}_s$, hence of the form $E(Y \mid \mathcal{F}_s) = \sum_{k=0}^{\infty} Y_k\,(N_s = k)$. Note that the $G_n\,(N_s = n)$, with $G_n = g(T_1, \dots, T_n)$ an indicator, $n = 0, 1, \dots$, form a $\pi$-system for $\mathcal{F}_s$. Then we want
$E(Y G_n\,(N_s = n)) = E\big(E(Y \mid \mathcal{F}_s)\, G_n\,(N_s = n)\big) = E\left(\sum_{k=0}^{\infty} Y_k\,(N_s = k)\, G_n\,(N_s = n)\right) = E(Y_n G_n\,(N_s = n)).$
As $(N_s = n) = (T_n \le s < T_{n+1}) = (T_n \le s < T_n + X_{n+1}) = (T_n \le s)(s - T_n < X_{n+1})$, from the independence of $(T_1, \dots, T_n)$ and $X_{n+1}$,
$E(Y_n G_n\,(N_s = n)) = E(Y_n G_n\,(T_n \le s)(s - T_n < X_{n+1})) = E\big(Y_n G_n\,(T_n \le s)\, e^{-\lambda(s - T_n)}\big) = e^{-\lambda s}\, E\big(Y_n G_n\,(T_n \le s)\, e^{\lambda T_n}\big).$

22 19. Markov property, independent increments

Key result.
1. The Poisson process is a Markov process, i.e., for each bounded $\varphi \colon \mathbb{R} \to \mathbb{R}$ and times $s < t$ we have $E(\varphi(N_t) \mid \mathcal{F}_s) = \hat\varphi(N_s)$, with transition
$\hat\varphi(n) = \sum_{k=n}^{\infty} e^{-\lambda(t-s)}\, \frac{(\lambda(t-s))^{k-n}}{(k-n)!}\, \varphi(k) = \sum_{m=0}^{\infty} e^{-\lambda(t-s)}\, \frac{(\lambda(t-s))^m}{m!}\, \varphi(n+m).$
2. It follows that the Poisson process has independent homogeneous increments, i.e., for each bounded $\varphi \colon \mathbb{R} \to \mathbb{R}$ and times $s < t$ we have
$E(\varphi(N_t - N_s) \mid \mathcal{F}_s) = E(\varphi(N_{t-s})).$
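A simulation check of the transition formula (my sketch; phi is an arbitrary test function, and the variable names are illustrative):

lambda <- 1; s <- 1; t <- 2; n <- 1
phi <- function(k) k^2
sim <- replicate(1e5, {
  Tn <- cumsum(rexp(60, lambda))
  c(sum(Tn <= s), sum(Tn <= t))    # (N_s, N_t) along one trajectory
})
keep <- sim[1, ] == n
mean(phi(sim[2, keep]))            # empirical E(phi(N_t) | N_s = n)
k <- n:(n + 50)
sum(phi(k) * dpois(k - n, lambda * (t - s)))   # hat-phi(n) from the formula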

23 Ex.10 Proof of 1.

We apply the formula for conditioning to the r.v. $Y = \varphi(N_t)$, that is, $E(\varphi(N_t) \mid \mathcal{F}_s) = \sum_{k=0}^{\infty} Y_k\,(N_s = k)$, with the characterization $E(\varphi(N_t)(N_s = n)\,G_n) = e^{-\lambda s}\, E\big(Y_n\,(T_n \le s)\, e^{\lambda T_n}\, G_n\big)$, $G_n = g_n(T_1, \dots, T_n)$. The LHS is
$E(\varphi(N_t)(N_s = n)\,G_n) = \sum_{k=0}^{\infty} \varphi(k)\, E((N_t = k)(N_s = n)\,G_n) = \sum_{k=n}^{\infty} \varphi(k)\, E((N_t = k)(N_s = n)\,G_n)$
because $N_s \le N_t$. We compute each term. For $k = n$ we have
$E((N_t = n)(N_s = n)\,G_n) = E((T_n \le s)(t < T_{n+1})\,G_n) = E((T_n \le s)(t - T_n < X_{n+1})\,G_n) = E\big((T_n \le s)\, e^{-\lambda(t - T_n)}\, G_n\big) = e^{-\lambda t}\, E\big((T_n \le s)\, e^{\lambda T_n}\, G_n\big).$
If $k = n+1$,
$E((N_t = n+1)(N_s = n)\,G_n) = E((N_s = n)(T_{n+1} \le t < T_{n+2})\,G_n) = E((N_s = n)(T_{n+1} \le t < T_{n+1} + X_{n+2})\,G_n) = E((N_s = n)(T_{n+1} \le t)(t - T_{n+1} < X_{n+2})\,G_n)$
$= e^{-\lambda t}\, E\big((N_s = n)(T_{n+1} \le t)\, e^{\lambda T_{n+1}}\, G_n\big) = e^{-\lambda t}\, E\big((T_n \le s < T_n + X_{n+1})(T_n + X_{n+1} \le t)\, e^{\lambda(T_n + X_{n+1})}\, G_n\big)$
$= e^{-\lambda t}\, E\big((T_n \le s)\, e^{\lambda T_n}\,(s - T_n < X_{n+1})(T_n \le t)(X_{n+1} \le t - T_n)\, e^{\lambda X_{n+1}}\, G_n\big) = e^{-\lambda t}\, E\big((T_n \le s)\, e^{\lambda T_n}\,(s - T_n < X_{n+1} \le t - T_n)\, e^{\lambda X_{n+1}}\, G_n\big)$
$= \lambda(t-s)\, e^{-\lambda t}\, E\big((T_n \le s)\, e^{\lambda T_n}\, G_n\big).$

24 If $k = n + h$, $h > 1$,
$E((N_t = n+h)(N_s = n)\,G_n) = E\big((N_s = n)(T_{n+h} \le t < T_{n+h+1})\,G_n\big) = E\big((N_s = n)(T_{n+h} \le t < T_{n+h} + X_{n+h+1})\,G_n\big) = E\big((N_s = n)(T_{n+h} \le t)(t - T_{n+h} < X_{n+h+1})\,G_n\big)$
$= e^{-\lambda t}\, E\big((N_s = n)(T_{n+h} \le t)\, e^{\lambda T_{n+h}}\, G_n\big) = e^{-\lambda t}\, E\big((N_s = n)(T_{n+1} + (X_{n+2} + \dots + X_{n+h}) \le t)\, e^{\lambda(T_{n+1} + (X_{n+2} + \dots + X_{n+h}))}\, G_n\big)$
$= e^{-\lambda t}\, E\big((N_s = n)\, e^{\lambda T_{n+1}}\,(T_{n+1} \le t)(X_{n+2} + \dots + X_{n+h} \le t - T_{n+1})\, e^{\lambda(X_{n+2} + \dots + X_{n+h})}\, G_n\big)$
$= e^{-\lambda t}\, E\left((N_s = n)\, e^{\lambda T_{n+1}}\,(T_{n+1} \le t) \left(\int_0^{t - T_{n+1}} e^{\lambda x}\, \frac{\lambda^{h-1} x^{h-2}}{(h-2)!}\, e^{-\lambda x}\,dx\right) G_n\right)$
$= \frac{\lambda^{h-1}}{(h-1)!}\, e^{-\lambda t}\, E\big((N_s = n)\, e^{\lambda T_{n+1}}\,(T_{n+1} \le t)(t - T_{n+1})^{h-1}\, G_n\big)$
$= \frac{\lambda^{h-1}}{(h-1)!}\, e^{-\lambda t}\, E\big((T_n \le s < T_{n+1})\, e^{\lambda T_{n+1}}\,(T_{n+1} \le t)(t - T_{n+1})^{h-1}\, G_n\big)$
$= \frac{\lambda^{h-1}}{(h-1)!}\, e^{-\lambda t}\, E\big((T_n \le s)\, e^{\lambda T_n}\,(s - T_n < X_{n+1} \le t - T_n)\, e^{\lambda X_{n+1}}\,(t - T_n - X_{n+1})^{h-1}\, G_n\big)$
$= \frac{\lambda^{h-1}}{(h-1)!}\, e^{-\lambda t}\, E\left((T_n \le s)\, e^{\lambda T_n} \left(\int_{s - T_n}^{t - T_n} e^{\lambda x}\,(t - T_n - x)^{h-1}\, \lambda e^{-\lambda x}\,dx\right) G_n\right) = \frac{(\lambda(t-s))^h}{h!}\, e^{-\lambda t}\, E\big((T_n \le s)\, e^{\lambda T_n}\, G_n\big).$

25 Collecting all cases,
$E(\varphi(N_t)(N_s = n)\,G_n) = e^{-\lambda t} \sum_{k=n}^{\infty} \varphi(k)\, \frac{(\lambda(t-s))^{k-n}}{(k-n)!}\; E\big((T_n \le s)\, e^{\lambda T_n}\, G_n\big).$
It follows that $Y_n = e^{-\lambda(t-s)} \sum_{k=n}^{\infty} \varphi(k)\, \frac{(\lambda(t-s))^{k-n}}{(k-n)!}$ and
$E(\varphi(N_t) \mid \mathcal{F}_s) = \sum_{n=0}^{\infty} (N_s = n)\, e^{-\lambda(t-s)} \sum_{k=n}^{\infty} \varphi(k)\, \frac{(\lambda(t-s))^{k-n}}{(k-n)!} = \sum_{n=0}^{\infty} \hat\varphi(n)\,(N_s = n) = \hat\varphi(N_s).$

Ex.11 Proof of 2.

26 E12 Telegrapher process. Define $Y_t = \int_0^t (-1)^{N_u}\,du$. Does $Y$ have independent increments? Is it a sub-martingale? A Markov process?
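A trajectory of the telegrapher process can be simulated in R by integrating $(-1)^{N_u}$ on a fine grid (my sketch, not part of the original exercise):

lambda <- 1; tmax <- 10; dt <- 0.01
Tn <- cumsum(rexp(100, lambda))                      # Poisson jump times
tt <- seq(0, tmax, by = dt)
speed <- (-1)^sapply(tt, function(u) sum(Tn <= u))   # (-1)^{N_u}
Y <- cumsum(speed) * dt                              # Riemann sums for Y_t
plot(tt, Y, type = "l", xlab = "t", ylab = "Y")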

27 Poisson process as a Lévy process

Equivalent definition (1). A counting process $N(t)$, $t \ge 0$, is a Poisson process with intensity $\lambda > 0$ if, and only if,
1. $N(0) = 0$;
2. the process has independent increments;
3. the number of events occurring in the time interval $]s, t]$, of length $t - s$, has Poisson distribution with mean $\lambda(t-s)$:
$P(N(t) - N(s) = n) = e^{-\lambda(t-s)}\, \frac{(\lambda(t-s))^n}{n!}, \quad n = 0, 1, 2, \dots$
In particular, such a process has stationary increments and is continuous in probability.

28 Ex.13 Proof that Poisson (1) is equivalent to Poisson (0).

The "only if" part is already proved. Let us compute the joint finite-dimensional distributions of the process $N$. For $t_1 < \dots < t_n$ the random variable $(N_{t_1}, \dots, N_{t_n})$ takes values in $\{k \in \mathbb{Z}_+^n \mid k_1 \le \dots \le k_n\}$ and has (discrete) density
$P(N_{t_1} = k_1, \dots, N_{t_n} = k_n) = P\big(N_{t_1} = k_1, \dots, N_{t_n} - N_{t_{n-1}} = k_n - k_{n-1}\big) = \prod_{j=1}^n P\big(N_{t_j} - N_{t_{j-1}} = k_j - k_{j-1}\big) \quad (k_0 = 0,\ t_0 = 0)$
$= \prod_{j=1}^n e^{-\lambda(t_j - t_{j-1})}\, \frac{(\lambda(t_j - t_{j-1}))^{k_j - k_{j-1}}}{(k_j - k_{j-1})!} = e^{-\lambda t_n} \prod_{j=1}^n \frac{(\lambda(t_j - t_{j-1}))^{k_j - k_{j-1}}}{(k_j - k_{j-1})!}.$
With $t_j = s_1 + \dots + s_j$ and $k_j = h_1 + \dots + h_j$, where $s_j > 0$ and $h_j \ge 0$ for $j = 1, \dots, n$, we have
$P\big(N_{s_1} = h_1, \dots, N_{s_1 + \dots + s_n} = h_1 + \dots + h_n\big) = \prod_{j=1}^n e^{-\lambda s_j}\, \frac{(\lambda s_j)^{h_j}}{h_j!} = e^{-\lambda(s_1 + \dots + s_n)} \prod_{j=1}^n \frac{(\lambda s_j)^{h_j}}{h_j!}.$
Let us compute the joint distribution of the arrival times. The vector $(T_1, \dots, T_n)$ takes values in $\{0 < t_1 < \dots < t_n\}$, which in turn is the image of $\mathbb{R}^n_{>0}$ under the cumulative sum map

29 $\mathrm{sum} \colon x \mapsto (x_1, x_1 + x_2, \dots, x_1 + \dots + x_n)$. As $\{\,]s_1, +\infty[ \times \dots \times\, ]s_n, +\infty[ \mid s_j > 0\,\}$ is a $\pi$-system, its cumulative-sum image $\{\{T_1 > t_1, \dots, T_n > t_n\} \mid 0 < t_1 < \dots < t_n\}$ is a $\pi$-system. Let us compute the probability of each element:
$P(T_1 > t_1, \dots, T_n > t_n) = P\big(N_{t_1} < 1, \dots, N_{t_n} < n\big) = \sum_{k_2 \le \dots \le k_n,\ k_j < j} P\big(N_{t_1} = 0, N_{t_2} = k_2, \dots, N_{t_n} = k_n\big) = \sum_{k_2 \le \dots \le k_n,\ k_j < j} e^{-\lambda t_n} \prod_{j=2}^n \frac{(\lambda(t_j - t_{j-1}))^{k_j - k_{j-1}}}{(k_j - k_{j-1})!}.$
This equation uniquely identifies the joint distribution of $T_1, \dots, T_n$, hence the joint distribution of the inter-arrival times $X_j = T_j - T_{j-1}$, $j = 1, \dots, n$. Assume first $n = 1$:
$P(T_1 > t_1) = P(N_{t_1} < 1) = e^{-\lambda t_1}, \qquad -\frac{\partial}{\partial t_1} P(T_1 > t_1) = \lambda e^{-\lambda t_1}.$
If $n = 2$,
$P(T_1 > t_1, T_2 > t_2) = P(N_{t_1} < 1, N_{t_2} < 2) = P(N_{t_2} = 0) + P(N_{t_1} = 0, N_{t_2} = 1) = e^{-\lambda t_2} + e^{-\lambda t_1}\, e^{-\lambda(t_2 - t_1)}\, \lambda(t_2 - t_1) = e^{-\lambda t_2}\big(1 + \lambda(t_2 - t_1)\big),$
and
$-\frac{\partial}{\partial t_1} P(T_1 > t_1, T_2 > t_2) = \lambda e^{-\lambda t_2}, \qquad (-1)^2 \frac{\partial^2}{\partial t_1 \partial t_2} P(T_1 > t_1, T_2 > t_2) = \lambda^2 e^{-\lambda t_2}.$

30 If $n = 3$ we have a summation of $P(N_{t_1} = k_1, N_{t_2} = k_2, N_{t_3} = k_3)$ over the cases $k_1 \le k_2 \le k_3$, $k_1 < 1$, $k_2 < 2$, $k_3 < 3$, that is, over $(k_1, k_2, k_3) \in \{(0,0,0), (0,0,1), (0,0,2), (0,1,1), (0,1,2)\}$. The time $t_1$ appears in $P(N_{t_1} = k_1, N_{t_2} = k_2, N_{t_3} = k_3)$ only if $k_2 = 1$, hence
$-\frac{\partial}{\partial t_1} P(T_1 > t_1, T_2 > t_2, T_3 > t_3) = -\frac{\partial}{\partial t_1}\, e^{-\lambda t_3}\big(\lambda(t_2 - t_1) + \lambda(t_2 - t_1)\lambda(t_3 - t_2)\big) = -\frac{\partial}{\partial t_1}\, e^{-\lambda t_3}\big(1 + \lambda(t_3 - t_2)\big)\lambda(t_2 - t_1) = \lambda e^{-\lambda t_3}\big(1 + \lambda(t_3 - t_2)\big),$
so that the recursion is apparent and gives
$(-1)^3 \frac{\partial^3}{\partial t_1 \partial t_2 \partial t_3} P(T_1 > t_1, T_2 > t_2, T_3 > t_3) = \lambda^3 e^{-\lambda t_3}.$

Variant definition (2)

31 A counting process $N(t)$, $t \ge 0$, is a Poisson process with intensity $\lambda > 0$ if
1. $N(0) = 0$;
2. the process has independent and stationary increments;
3. $P(N(t) = 1) = \lambda t + o(t)$ as $t \downarrow 0$;
4. $P(N(t) > 1) = o(t)$ as $t \downarrow 0$.
Indeed, $E(N(t)) = \lambda t$ and, for $t \downarrow 0$:
$P(N(t) = 1) = e^{-\lambda t}(\lambda t) = \lambda t + o(t),$
$P(N(t) > 1) = 1 - e^{-\lambda t} - e^{-\lambda t}(\lambda t) = 1 - e^{-\lambda t}(1 + \lambda t) = o(t).$

Ex.14 The two definitions are equivalent.

32 Let us consider the moment generating function of $N(t)$, i.e. $g(t) = E\big(e^{u N(t)}\big)$. Note that $g(0) = 1$, and let us derive a differential equation for $g$:
$g(t+h) = E\big(e^{u N(t+h)}\big) = E\big(e^{u N(t)}\, e^{u(N(t+h) - N(t))}\big) = E\big(e^{u N(t)}\big)\, E\big(e^{u(N(t+h) - N(t))}\big) = g(t)\, g(h),$
using independence and then stationarity of the increments. As $h \downarrow 0$,
$g(h) = \sum_k e^{uk}\, P(N(h) = k) = (1 - \lambda h + o(h)) + e^u(\lambda h + o(h)) + \sum_{k>1} e^{uk}\, o(h) = 1 - \lambda h + e^u \lambda h + o(h).$
By letting $h \downarrow 0$ in the definition of the derivative we obtain the equation
$g'(t) = \lambda(e^u - 1)\, g(t),$
whose solution is $g(t) = \exp\big[\lambda t\,(e^u - 1)\big]$. This is the moment generating function of the Poisson($\lambda t$) distribution.
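A numerical check of the closed form (my addition):

lambda <- 2; t <- 1.5; u <- 0.3
mean(exp(u * rpois(1e6, lambda * t)))   # empirical E(exp(u N(t)))
exp(lambda * t * (exp(u) - 1))          # g(t) = exp(lambda t (e^u - 1))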

33 Ex.15 Check the following: prove the last statement; take a subdivision of the interval $[0, t]$ and use the binomial law to count how many subintervals contain a jump of $N$; avoid the stationarity condition by introducing a new infinitesimal condition.

34 30. Poisson process: simulation

The conditional distribution of a random variable $X$ given the event $B$ is the probability measure $\mu_{X|B}$ on the range of $X$ such that for all integrable $\varphi$ it holds that
$E(\varphi \circ X\; 1_B) = P(B) \int \varphi(x)\, \mu_{X|B}(dx).$
Note that $\mu_{X|B}$ is supported by the image of $B$, and we write $E(\varphi \circ X \mid B) = \int \varphi(x)\, \mu_{X|B}(dx)$. The conditional distribution is identified on a monotone class.

Let $U_1, \dots, U_n$ be iid $U(0,t)$. The sorting map $\tau(t_1, \dots, t_n) = (t_{(1)}, \dots, t_{(n)})$ is almost surely defined and takes values in the open simplex $\Delta(t) = \{s = (s_1, \dots, s_n) \mid 0 < s_1 < \dots < s_n < t\}$. The $\Delta(t)$-valued random variable $(U_{(1)}, \dots, U_{(n)}) = \tau(U_1, \dots, U_n)$ is the order statistics of $(U_1, \dots, U_n)$. The distribution $\nu$ of the order statistics is the image under the sorting map of the uniform distribution:
$\int_{\Delta(t)} \varphi(s)\, \nu(ds) = t^{-n} \int_{[0,t]^n} \varphi \circ \tau(u)\, du = t^{-n} \sum_{\Pi \in \mathcal{P}} \int_{\Pi^{-1}(\Delta(t))} \varphi \circ \tau(u)\, du = n!\, t^{-n} \int_{\Delta(t)} \varphi(s)\, ds,$
where $\mathcal{P}$ is the group of permutation matrices on $\mathbb{R}^n$.

Past jump times are uniform and independent. Let $T_1, T_2, \dots$ be the arrival times of a Poisson process with intensity $\lambda$. The joint distribution of $T_1, \dots, T_n$, conditional on $\{N(t) = n\}$, has the uniform density
$f(t_1, \dots, t_n) = \frac{n!}{t^n}\, (0 < t_1 < \dots < t_n < t).$
Such a density is the joint density of the order statistics of $n$ random variables $U_1, \dots, U_n$ iid uniformly distributed on $]0, t[$.

35 A simulation

lambda <- 1; t <- 10
n <- rpois(1, lambda * t)        # number of jumps in [0, t]
uniform <- runif(n, 0, t)        # conditionally iid uniform positions
x <- sort(uniform)               # order statistics = jump times
plot(c(0, x, t), c(0:n, n), type = "s", xlab = "t", ylab = "N")

(The slide shows the resulting step-function trajectory of $N_t$.)

36 Ex.16 Proof. Consider the monotone class generated by $\{t \mapsto \varphi_1(t_1) \cdots \varphi_n(t_n) \mid \varphi_i \text{ bounded}\}$. Then
$E\big(\varphi_1(T_1) \cdots \varphi_n(T_n)(N_t = n)\big) = E\big(\varphi_1(T_1) \cdots \varphi_n(T_n)(T_n \le t)(t - T_n < X_{n+1})\big) = e^{-\lambda t}\, E\big(\varphi_1(T_1) \cdots \varphi_n(T_n)(T_n \le t)\, e^{\lambda T_n}\big)$
$= e^{-\lambda t}\, E\big(\varphi_1(X_1)\, \varphi_2(X_1 + X_2) \cdots \varphi_n(X_1 + \dots + X_n)\,(X_1 + \dots + X_n \le t)\, e^{\lambda(X_1 + \dots + X_n)}\big)$
$= \lambda^n e^{-\lambda t} \int_{]0,+\infty[^n} \varphi_1(x_1)\, \varphi_2(x_1 + x_2) \cdots \varphi_n(x_1 + \dots + x_n)\,(x_1 + \dots + x_n \le t)\, dx_1 \cdots dx_n$
$= \lambda^n e^{-\lambda t} \int_{\Delta(t)} \varphi_1(s_1) \cdots \varphi_n(s_n)\, ds_1 \cdots ds_n = \frac{(\lambda t)^n}{n!}\, e^{-\lambda t} \int \varphi_1(s_1) \cdots \varphi_n(s_n)\, \frac{n!}{t^n}\, 1_{\Delta(t)}(s)\, ds_1 \cdots ds_n.$
