1. Stochastic Processes and filtrations


A stochastic process $(X_t)_{t \in T}$ is a collection of random variables on $(\Omega, \mathcal{F})$ with values in a measurable space $(S, \mathcal{S})$, i.e., for all $t$, $X_t : \Omega \to S$ is $\mathcal{F}$-$\mathcal{S}$-measurable. In our case:

1. $T = \{0, 1, \dots, N\}$ or $T = \mathbb{N} \cup \{0\}$ (discrete time), or
2. $T = [0, \infty)$ (continuous time).

The state space will be $(S, \mathcal{S}) = (\mathbb{R}, \mathcal{B}(\mathbb{R}))$ (or $(S, \mathcal{S}) = (\mathbb{R}^d, \mathcal{B}(\mathbb{R}^d))$ for some $d \geq 2$).

Example 1.1 (Random walk) Let $\varepsilon_n$, $n \geq 1$, be a sequence of iid random variables with
$$\varepsilon_n = \begin{cases} +1 & \text{with probability } 1/2, \\ -1 & \text{with probability } 1/2. \end{cases}$$
Define $X_0 = 0$ and, for $k = 1, 2, \dots$,
$$X_k = \sum_{i=1}^{k} \varepsilon_i.$$
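The walk of Example 1.1 can be simulated in a few lines; a minimal sketch (the helper name `random_walk` and the seed are illustration choices, not from the notes):

```python
import random

def random_walk(n_steps, seed=0):
    """Simulate X_0, X_1, ..., X_{n_steps} with X_k = eps_1 + ... + eps_k."""
    rng = random.Random(seed)
    path = [0]
    for _ in range(n_steps):
        eps = rng.choice([-1, 1])  # eps_i = +/-1, each with probability 1/2
        path.append(path[-1] + eps)
    return path

path = random_walk(10, seed=42)
```

Fixing the seed reproduces the same path, which is convenient when experimenting with the scaling limit discussed in Section 2.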

For a fixed $\omega \in \Omega$ the function $t \mapsto X_t(\omega)$ is called the (sample) path (or realization) of the stochastic process $X$ associated to $\omega$.

Let $X$ and $Y$ be two stochastic processes on $(\Omega, \mathcal{F})$.

Definition 1.2
(i) $X$ and $Y$ have the same finite-dimensional distributions if, for all $n \geq 1$, all $t_1, \dots, t_n \in [0, \infty)$ and all $A \in \mathcal{B}(\mathbb{R}^n)$:
$$P((X_{t_1}, X_{t_2}, \dots, X_{t_n}) \in A) = P((Y_{t_1}, Y_{t_2}, \dots, Y_{t_n}) \in A).$$
(ii) $X$ and $Y$ are modifications of each other if, for each $t \in [0, \infty)$, $P(X_t = Y_t) = 1$.
(iii) $X$ and $Y$ are indistinguishable if $P(X_t = Y_t \text{ for every } t \in [0, \infty)) = 1$.

Example 1.3

Definition 1.4 A stochastic process $X$ is called continuous (right-continuous) if almost all paths are continuous (right-continuous), i.e.,
$$P(\{\omega : t \mapsto X_t(\omega) \text{ is continuous (right-continuous)}\}) = 1.$$

Proposition 1.5 If $X$ and $Y$ are modifications of each other and both processes are a.s. right-continuous, then $X$ and $Y$ are indistinguishable.

Let $(\Omega, \mathcal{F})$ be given and let $\{\mathcal{F}_t, t \geq 0\}$ be an increasing family of $\sigma$-algebras (i.e., $\mathcal{F}_s \subseteq \mathcal{F}_t$ for $s < t$) such that $\mathcal{F}_t \subseteq \mathcal{F}$ for all $t$. Such a family is called a filtration. For example:

Definition 1.6 The filtration $(\mathcal{F}^X_t)$ generated by a stochastic process $X$ is given by $\mathcal{F}^X_t = \sigma\{X_s, s \leq t\}$.

Definition 1.7 A stochastic process is adapted to a filtration $(\mathcal{F}_t)_{t \geq 0}$ if, for each $t \geq 0$,
$$X_t : (\Omega, \mathcal{F}_t) \to (\mathbb{R}, \mathcal{B}(\mathbb{R}))$$
is measurable. (Short: $X_t$ is $\mathcal{F}_t$-measurable.)

Definition 1.8 (The "usual conditions") Let $\mathcal{F}_{t^+} = \bigcap_{\varepsilon > 0} \mathcal{F}_{t+\varepsilon}$. We say that a filtration $(\mathcal{F}_t)_{t \geq 0}$ satisfies the usual conditions if
1. $(\mathcal{F}_t)_{t \geq 0}$ is right-continuous, that is, $\mathcal{F}_{t^+} = \mathcal{F}_t$ for all $t \geq 0$;
2. $(\mathcal{F}_t)_{t \geq 0}$ is complete, that is, $\mathcal{F}_0$ contains all subsets of $P$-nullsets.

2. Brownian motion as a limit of random walks

Example 2.1 (Scaled random walk) For $t_k = k/N$, $k = 0, 1, 2, \dots$, we set
$$X^N_{t_k} := \frac{1}{\sqrt{N}} X_k = \frac{1}{\sqrt{N}} \sum_{i=1}^{k} \varepsilon_i.$$

Definition 2.2 (Brownian motion) A standard, one-dimensional Brownian motion is a continuous adapted process $W$ on $(\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \geq 0}, P)$ with $W_0 = 0$ a.s. such that, for all $0 \leq s < t$,
1. $W_t - W_s$ is independent of $\mathcal{F}_s$;
2. $W_t - W_s$ is normally distributed with mean $0$ and variance $t - s$.

Recall the scaled random walk: for $t_k = k/N$, $k = 0, 1, 2, \dots$, we set
$$X^N_{t_k} := \frac{1}{\sqrt{N}} X_k = \frac{1}{\sqrt{N}} \sum_{i=1}^{k} \varepsilon_i.$$
By linear interpolation we define a continuous process on $[0, \infty)$:
$$X^N_t = \frac{1}{\sqrt{N}} \left( \sum_{i=1}^{[Nt]} \varepsilon_i + (Nt - [Nt]) \, \varepsilon_{[Nt]+1} \right). \tag{2.1}$$
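Formula (2.1) translates directly into code; a sketch (the function name `scaled_walk` is mine, not from the notes):

```python
import math
import random

def scaled_walk(N, T=1.0, seed=0):
    """Return a function t -> X^N_t following (2.1):
    X^N_t = (sum_{i <= [Nt]} eps_i + (Nt - [Nt]) * eps_{[Nt]+1}) / sqrt(N)."""
    rng = random.Random(seed)
    eps = [rng.choice([-1, 1]) for _ in range(int(N * T) + 1)]  # eps[i] is eps_{i+1}
    partial = [0]
    for e in eps:
        partial.append(partial[-1] + e)  # partial[k] = eps_1 + ... + eps_k = X_k

    def X(t):
        k = int(N * t)        # [Nt]
        frac = N * t - k      # interpolation weight in [0, 1)
        return (partial[k] + frac * eps[k]) / math.sqrt(N)

    return X
```

At grid points $t_k = k/N$ this reproduces $X_k/\sqrt N$ exactly; between grid points the path is linear, so the process is continuous as required by Donsker's theorem.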

Theorem 2.3 (Donsker) Let $(\Omega, \mathcal{F}, P)$ be a probability space with a sequence of iid random variables with mean $0$ and variance $1$. Define $(X^N_t)_{t \geq 0}$ as in (2.1). Then $(X^N_t)_{t \geq 0}$ converges weakly to $(W_t)_{t \geq 0}$.

Clarification of the statement:
1. $\tilde\Omega = C[0, \infty)$.
2. On $\tilde\Omega = C[0, \infty)$ one can construct a probability measure $\tilde P$ (the Wiener measure) such that the canonical process $(W_t)_{t \geq 0}$ is a Brownian motion. Let $\tilde\omega = (\tilde\omega(t))_{t \geq 0} \in C[0, \infty)$. Then the canonical process is given by $W_t(\tilde\omega) := \tilde\omega(t)$.
3. $(X^N_t)_{t \geq 0}$ defines a probability measure $P^N$ on $\tilde\Omega = C[0, \infty)$. Indeed, $X^N : (\Omega, \mathcal{F}) \to (C[0, \infty), \mathcal{B}(C[0, \infty)))$ is measurable; that means we consider the random variable $\omega \mapsto X^N(\omega, \cdot)$, which maps $\omega$ to a continuous function in $t$ (i.e., the path). For $A \in \mathcal{B}(C[0, \infty))$ we set $P^N(A) := P(\{\omega : X^N(\omega, \cdot) \in A\})$.
4. So Donsker's theorem says that on $(\tilde\Omega, \tilde{\mathcal{F}}) = (C[0, \infty), \mathcal{B}(C[0, \infty)))$ we have $P^N \xrightarrow{w} \tilde P$.

Steps of the proof:
1. Show that the finite-dimensional distributions of $X^N$ converge to the finite-dimensional distributions of $W$; that means, show that, for all $n \geq 1$ and all $s_1, \dots, s_n \in [0, \infty)$,
$$(X^N_{s_1}, X^N_{s_2}, \dots, X^N_{s_n}) \xrightarrow{w} (W_{s_1}, W_{s_2}, \dots, W_{s_n}).$$
2. Show that $(P^N)_{N \geq 1}$ is a tight family of probability measures on $(\tilde\Omega, \tilde{\mathcal{F}})$.
3. (1) and (2) imply $P^N \xrightarrow{w} \tilde P$.

Easy partial result of step (1) of the proof: by the CLT, for fixed $t \in [0, \infty)$, as $N \to \infty$,
$$X^N_t \xrightarrow{w} W_t.$$

Theorem 2.4 (CLT) Let $(\xi_n)_{n \geq 1}$ be iid with mean $0$ and variance $\sigma^2$. Then, for each $x \in \mathbb{R}$,
$$\lim_{n \to \infty} P\left( \frac{1}{\sqrt{n}} \sum_{i=1}^{n} \xi_i \leq x \right) = \Phi_\sigma(x),$$
where $\Phi_\sigma$ is the distribution function of $N(0, \sigma^2)$. In other words,
$$\frac{1}{\sqrt{n}} \sum_{i=1}^{n} \xi_i \xrightarrow{w} Z, \quad \text{where } Z \sim N(0, \sigma^2).$$

3. Brownian motion as a Gaussian process

Definition 3.1 A stochastic process $(X_t)_{t \geq 0}$ is called a Gaussian process if for each finite family of time points $\{t_1, \dots, t_n\}$ the vector $(X_{t_1}, \dots, X_{t_n})$ has a multivariate normal distribution. A Gaussian process is called centered if $E[X_t] = 0$ for all $t$. The covariance function is given by $\gamma(s, t) = \mathrm{Cov}(X_s, X_t)$.

Theorem 3.2 A Brownian motion is a centered Gaussian process with $\gamma(s, t) = s \wedge t \ (= \min(s, t))$. Conversely, a centered Gaussian process with continuous paths and covariance function $\gamma(s, t) = s \wedge t$ is a Brownian motion.

4. Stopping times

Given is a filtered probability space $(\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \geq 0}, P)$.

Definition 4.1 A random time $T$ is a measurable function
$$T : (\Omega, \mathcal{F}) \to ([0, \infty], \mathcal{B}([0, \infty])).$$
If $X$ is a stochastic process and $T$ a random time, we define the function $X_T$ on $\{T < \infty\}$ by
$$X_T(\omega) = X_{T(\omega)}(\omega).$$

Definition 4.2 A random time $T$ is a stopping time for $(\mathcal{F}_t)_{t \geq 0}$ if, for all $t \geq 0$,
$$\{T \leq t\} = \{\omega : T(\omega) \leq t\} \in \mathcal{F}_t.$$

Proposition 4.3
1. If $T(\omega) = t$ a.s. for some constant $t \geq 0$, then $T$ is a stopping time.
2. Every stopping time $T$ satisfies, for all $t$,
$$\{T < t\} \in \mathcal{F}_t. \tag{4.1}$$
3. If the filtration is right-continuous and a random time $T$ satisfies (4.1) for all $t$, then $T$ is a stopping time.

Lemma 4.4 Let $S$ and $T$ be stopping times for $(\mathcal{F}_t)_{t \geq 0}$. Then $S \wedge T$, $S \vee T$ and $S + T$ are stopping times for $(\mathcal{F}_t)_{t \geq 0}$.

Definition 4.5 Let $T$ be a stopping time for $(\mathcal{F}_t)_{t \geq 0}$. The $\sigma$-field $\mathcal{F}_T$ of events determined prior to the stopping time $T$ is given by
$$\mathcal{F}_T = \{A \in \mathcal{F} : A \cap \{T \leq t\} \in \mathcal{F}_t \text{ for all } t \geq 0\}.$$

Remark:
1. $\mathcal{F}_T$ is a $\sigma$-field.
2. $T$ is $\mathcal{F}_T$-measurable.
3. If $T(\omega) = t$ a.s., then $\mathcal{F}_T = \mathcal{F}_t$.

Example 4.6 Let $(W_t)_{t \geq 0}$ be a standard Brownian motion on $(\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \geq 0}, P)$. Let $T_a$ be the first passage time of the level $a \in \mathbb{R}$, i.e.,
$$T_a(\omega) = \inf\{t > 0 : W_t(\omega) = a\}. \tag{4.2}$$
Then $T_a$ is a stopping time for $(\mathcal{F}_t)_{t \geq 0}$.
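For a path observed at discrete times, the first passage time can be computed by scanning the path from the start: whether $T_a \leq k$ is decided by the first $k+1$ values alone, which is exactly the stopping-time property. A small sketch (the function name is mine):

```python
def first_passage(path, a):
    """First index k with path[k] == a, the discrete analogue of T_a.
    Returns None if the level is never reached. Note that deciding
    {T_a = k} only requires path[0..k] -- no peeking into the future."""
    for k, x in enumerate(path):
        if x == a:
            return k
    return None
```

By contrast, a "last visit to $a$" time would need the whole future of the path and is not a stopping time.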

Remark: Let $S$ and $T$ be stopping times for $(\mathcal{F}_t)_{t \geq 0}$.
1. For $A \in \mathcal{F}_S$ we have $A \cap \{S \leq T\} \in \mathcal{F}_T$.
2. If, moreover, $S \leq T$, then $\mathcal{F}_S \subseteq \mathcal{F}_T$.

Definition 4.7 A stochastic process $(X_t)_{t \geq 0}$ is called progressively measurable for $(\mathcal{F}_t)_{t \geq 0}$ if, for all $t$ and all $A \in \mathcal{B}(\mathbb{R})$,
$$\{(s, \omega) : 0 \leq s \leq t, \ \omega \in \Omega, \ X_s(\omega) \in A\} \in \mathcal{B}([0, t]) \otimes \mathcal{F}_t,$$
that means $(s, \omega) \mapsto X_s(\omega)$ is $\mathcal{B}([0, t]) \otimes \mathcal{F}_t$-$\mathcal{B}(\mathbb{R})$-measurable.

Example: If $(X_t)_{t \geq 0}$ is adapted and right-continuous, then it is progressively measurable.

Proposition 4.8 Let $X$ be progressively measurable on $(\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \geq 0})$ and let $T$ be a stopping time for $(\mathcal{F}_t)_{t \geq 0}$. Then
1. $X_T$, defined on $\{T < \infty\}$, is an $\mathcal{F}_T$-measurable random variable, and
2. the stopped process $(X_{T \wedge t})_{t \geq 0}$ is progressively measurable.

5. Conditional expected value

Definition 5.1 Let $(\Omega, \mathcal{F}, P)$ be a probability space and $X : \Omega \to \mathbb{R}$ be an $\mathcal{F}$-$\mathcal{B}(\mathbb{R})$-measurable random variable such that $E[|X|] = \int |X| \, dP < \infty$. Let $\mathcal{G} \subseteq \mathcal{F}$ be a (smaller) $\sigma$-algebra. Then there exists a random variable $Y$ with the following properties:
1. $Y$ is $\mathcal{G}$-$\mathcal{B}(\mathbb{R})$-measurable.
2. $Y$ is integrable, i.e., $E[|Y|] < \infty$.
3. For all $A \in \mathcal{G}$: $E[Y \mathbb{1}_A] = E[X \mathbb{1}_A]$.
The $\mathcal{G}$-measurable random variable $Y$ is called the conditional expected value of $X$ given $\mathcal{G}$. We write $Y = E[X \mid \mathcal{G}]$ a.s.

Remark:
1. $E[X \mid \mathcal{G}]$ is unique up to equality a.s.
2. Suppose that, additionally, $E[X^2] < \infty$. Then $E[X \mid \mathcal{G}]$ is the orthogonal projection of $X \in L^2(\Omega, \mathcal{F})$ onto the subspace $L^2(\Omega, \mathcal{G})$.
3. If $\mathcal{G} = \{\emptyset, \Omega\}$, then $E[X \mid \mathcal{G}] = E[X]$ a.s.

Definition 5.2 Let $Z$ be a function on $\Omega$. Then $E[X \mid Z] := E[X \mid \sigma(Z)]$.

Example 5.3

Example 5.4 Let $X : (\Omega, \mathcal{F}) \to (\mathbb{R}, \mathcal{B}(\mathbb{R}))$. Suppose $\Omega = \bigcup_{k=1}^{\infty} \Omega_k$ with $\Omega_i \cap \Omega_j = \emptyset$ for $i \neq j$. Let $\mathcal{G} := \sigma\{\Omega_k, k \geq 1\}$. Then
$$E[X \mid \mathcal{G}] = \sum_{k : P(\Omega_k) > 0} \frac{E[X \mathbb{1}_{\Omega_k}]}{P(\Omega_k)} \, \mathbb{1}_{\Omega_k}.$$
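On a finite probability space the formula of Example 5.4 can be checked mechanically: $E[X \mid \mathcal{G}]$ is constant on each block $\Omega_k$, equal to the block average of $X$. A sketch (helper names are mine, not from the notes):

```python
def cond_exp_partition(probs, X, blocks):
    """E[X | sigma(blocks)] on a finite Omega = {0, ..., n-1}:
    on each block the value is E[X 1_block] / P(block).
    probs: point probabilities; X: values; blocks: disjoint lists covering Omega."""
    Y = [0.0] * len(probs)
    for block in blocks:
        p = sum(probs[w] for w in block)
        if p == 0:
            continue  # on P-null blocks E[X|G] may be chosen arbitrarily
        avg = sum(probs[w] * X[w] for w in block) / p
        for w in block:
            Y[w] = avg
    return Y

Y = cond_exp_partition([0.25] * 4, [1.0, 3.0, 5.0, 7.0], [[0, 1], [2, 3]])
```

One can verify directly that $E[Y \mathbb{1}_A] = E[X \mathbb{1}_A]$ for each block $A$, i.e., property 3 of Definition 5.1.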

Example 5.5 Suppose $(X, Z)$ has a bivariate density $f(x, z)$.

Theorem 5.6 (A list of properties of conditional expectation) In the following let $X, Y, X_n$, $n \geq 1$, be integrable random variables on $(\Omega, \mathcal{F}, P)$. The following holds:
(a) If $X$ is $\mathcal{G}$-measurable, then $E[X \mid \mathcal{G}] = X$ a.s.
(b) Linearity: $E[aX + bY \mid \mathcal{G}] = a E[X \mid \mathcal{G}] + b E[Y \mid \mathcal{G}]$ a.s.
(c) Positivity: If $X \geq 0$ a.s., then $E[X \mid \mathcal{G}] \geq 0$ a.s.
(d) Monotone convergence (MON): If $0 \leq X_n$ and $X_n \uparrow X$ a.s., then $E[X_n \mid \mathcal{G}] \uparrow E[X \mid \mathcal{G}]$ a.s.

(e) Fatou: If $0 \leq X_n$ for all $n$, then
$$E[\liminf_{n \to \infty} X_n \mid \mathcal{G}] \leq \liminf_{n \to \infty} E[X_n \mid \mathcal{G}].$$
(f) Dominated convergence (DOM): If $|X_n| \leq Y$ for all $n \geq 1$, with $Y \in L^1$ (i.e., integrable), and $\lim X_n = X$ a.s., then $\lim_{n \to \infty} E[X_n \mid \mathcal{G}] = E[X \mid \mathcal{G}]$ a.s.
(g) Jensen's inequality: Let $\varphi : \mathbb{R} \to \mathbb{R}$ be a convex function such that $\varphi(X)$ is integrable. Then
$$\varphi(E[X \mid \mathcal{G}]) \leq E[\varphi(X) \mid \mathcal{G}].$$

(h) Tower property: If $\mathcal{G}_1 \subseteq \mathcal{G}_2 \subseteq \mathcal{F}$, then $E[E[X \mid \mathcal{G}_2] \mid \mathcal{G}_1] = E[X \mid \mathcal{G}_1]$ a.s.
(i) Taking out what is known: Let $Y$ be $\mathcal{G}$-measurable and let $E[|XY|] < \infty$. Then $E[XY \mid \mathcal{G}] = Y E[X \mid \mathcal{G}]$.
(j) Role of independence: If $X$ is independent of $\mathcal{G}$, then $E[X \mid \mathcal{G}] = E[X]$.

6. Martingales and the discrete time stochastic integral

Definition 6.1 A stochastic process $(X_t)_{t \geq 0}$ on $(\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \geq 0}, P)$ is a martingale if $X$ is adapted, $E[|X_t|] < \infty$ for all $t \geq 0$, and, for $s < t$,
$$E[X_t \mid \mathcal{F}_s] = X_s \quad \text{a.s.} \tag{6.1}$$
If the $=$ in (6.1) is replaced by $\leq$ ($\geq$), the process is called a supermartingale (submartingale).

Remark:
1. (6.1) is equivalent to $E[X_t - X_s \mid \mathcal{F}_s] = 0$ a.s.
2. Suppose $(X_n)_{n=0}^{\infty}$ is a stochastic process in discrete time adapted to $(\mathcal{F}_n)_{n=0}^{\infty}$. Then the martingale property reads: for all $n \geq 1$,
$$E[X_n \mid \mathcal{F}_{n-1}] = X_{n-1} \quad \text{a.s.}$$

Example 6.2 (Random walk: sum of independent r.v.) Let $\varepsilon_k$, $k \geq 1$, be independent with $E[\varepsilon_k] = 0$. Let $\mathcal{F}_n = \sigma\{\varepsilon_1, \dots, \varepsilon_n\}$, $\mathcal{F}_0 = \{\emptyset, \Omega\}$. Define $X_0 = 0$ and, for $n \geq 1$,
$$X_n = \sum_{k=1}^{n} \varepsilon_k.$$
Then $(X_n)_{n \geq 0}$ is a martingale for $(\mathcal{F}_n)_{n \geq 0}$.

Example 6.3 (Product of independent r.v.) Let $\varepsilon_k$, $k \geq 1$, be non-negative and independent with $E[\varepsilon_k] = 1$. Let $\mathcal{F}_n = \sigma\{\varepsilon_1, \dots, \varepsilon_n\}$, $\mathcal{F}_0 = \{\emptyset, \Omega\}$. Define $X_0 = 1$ and, for $n \geq 1$,
$$X_n = \prod_{k=1}^{n} \varepsilon_k.$$
Then $(X_n)_{n \geq 0}$ is a martingale for $(\mathcal{F}_n)_{n \geq 0}$.

Example 6.4 Let $X$ be a r.v. on $(\Omega, \mathcal{F}, P)$ such that $E[|X|] < \infty$. Let $(\mathcal{F}_t)_{t \geq 0}$ be any filtration with $\mathcal{F}_t \subseteq \mathcal{F}$ for each $t \geq 0$. Define $X_t = E[X \mid \mathcal{F}_t]$. Then $(X_t)_{t \geq 0}$ is a martingale for $(\mathcal{F}_t)_{t \geq 0}$.

7. Stochastic integral in discrete time

Given is a discrete filtered probability space $(\Omega, \mathcal{F}, (\mathcal{F}_n)_{n \geq 0}, P)$.

Definition 7.1 A discrete stochastic process $(H_n)_{n=1}^{\infty}$ is called predictable if
$$H_n : (\Omega, \mathcal{F}_{n-1}) \to (\mathbb{R}, \mathcal{B}(\mathbb{R}))$$
is measurable. (Short: $H_n$ is $\mathcal{F}_{n-1}$-measurable.)

Definition 7.2 Let $(X_n)_{n=0}^{\infty}$ be an adapted stochastic process and $(H_n)_{n=1}^{\infty}$ be a predictable stochastic process on $(\Omega, \mathcal{F}, (\mathcal{F}_n)_{n \geq 0}, P)$. Then the stochastic integral in discrete time at time $n \geq 1$ is given by
$$(H \cdot X)_n = \sum_{k=1}^{n} H_k (X_k - X_{k-1}).$$
We define $(H \cdot X)_0 = 0$. The stochastic integral process is given by $((H \cdot X)_n)_{n=0}^{\infty}$.
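Definition 7.2 is a plain cumulative sum; a sketch (list indices are shifted by one because $H$ starts at $H_1$ while $X$ starts at $X_0$):

```python
def discrete_integral(H, X):
    """(H . X)_n = sum_{k=1}^n H_k (X_k - X_{k-1}).
    X = [X_0, ..., X_N] (adapted), H = [H_1, ..., H_N] (predictable).
    Returns the integral process [(H.X)_0, ..., (H.X)_N]."""
    out = [0.0]
    for k in range(1, len(X)):
        out.append(out[-1] + H[k - 1] * (X[k] - X[k - 1]))
    return out
```

The gambling interpretation: $H_k$ is the stake chosen before round $k$ (hence predictable), $X_k - X_{k-1}$ the outcome of round $k$, and $(H \cdot X)_n$ the accumulated gain. With $H \equiv 1$ the integral simply recovers $X_n - X_0$.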

Theorem 7.3 Let $H$ be bounded and predictable and let $X$ be a martingale on $(\Omega, \mathcal{F}, (\mathcal{F}_n)_{n \geq 0}, P)$. Then the stochastic integral process $((H \cdot X)_n)_{n=0}^{\infty}$ is a martingale.

Theorem 7.4 (Doob's Optional Stopping Theorem) Let $(M_t)_{t \geq 0}$ be a martingale in continuous time on $(\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \geq 0}, P)$. Then, for a bounded stopping time $T$, it holds that $E[M_T] = M_0$. More generally, if $S \leq T$ are bounded stopping times, then $E[M_T \mid \mathcal{F}_S] = M_S$ a.s.

8. Reflection principle, first passage times, running maximum/minimum

Definition 8.1 (Markov property) Brownian motion satisfies the Markov property; that means, for all $t \geq 0$ and $s > 0$:
$$P(W_{t+s} \leq y \mid \mathcal{F}_t) = P(W_{t+s} \leq y \mid W_t) \quad \text{a.s.}$$

Reflection principle: Recall that $T_a = \inf\{t > 0 : W_t = a\}$. For $a > 0$:
$$P(T_a < t) = 2 P(W_t > a). \tag{8.1}$$

In all of the following, $(W_t)_{t \geq 0}$ is a standard Brownian motion on $(\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \geq 0}, P)$.

Theorem 8.2 (Strong Markov property) The BM $(W_t)_{t \geq 0}$ satisfies the strong Markov property: for each stopping time $\tau$ with $P(\tau < \infty) = 1$ it holds that
$$\hat{W}_t = W_{\tau + t} - W_\tau, \quad t \geq 0,$$
is a standard Brownian motion independent of $\mathcal{F}_\tau$. We will see later that $P(T_a < \infty) = 1$ for all $a \in \mathbb{R}$.

Theorem 8.3 (Reflection principle) Let $a \geq 0$ and let $T_a$ be the passage time for $a$. Define
$$\tilde{W}_t = \begin{cases} W_t & \text{for } t \leq T_a, \\ 2W_{T_a} - W_t = 2a - W_t & \text{for } t \geq T_a. \end{cases}$$
Then $(\tilde{W}_t)_{t \geq 0}$ is again a standard Brownian motion.

Lemma 8.4 $P\left(\sup_{t \geq 0} W_t = +\infty \text{ and } \inf_{t \geq 0} W_t = -\infty\right) = 1$.

Theorem 8.5 (Recurrence) $P(T_a < \infty) = 1$ for all $a \in \mathbb{R}$.

Theorem 8.6 (Distribution of the passage time) Let $a > 0$. The density of $T_a$ is given by
$$f_{T_a}(t) = \frac{a}{\sqrt{2\pi}} \, t^{-3/2} \, e^{-\frac{a^2}{2t}},$$
which is the density of an inverse-gamma distribution with parameters $\frac{1}{2}$ and $\frac{a^2}{2}$. In particular, we have $E[T_a] = +\infty$.
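The density can be checked against the reflection identity $P(T_a < t) = 2P(W_t > a) = \operatorname{erfc}(a/\sqrt{2t})$ by numerical integration; a sketch (function names and the midpoint-rule resolution are my choices):

```python
import math

def passage_density(a, t):
    """f_{T_a}(t) = a / sqrt(2 pi) * t^(-3/2) * exp(-a^2 / (2 t)),  a > 0."""
    return a / math.sqrt(2 * math.pi) * t ** (-1.5) * math.exp(-a * a / (2 * t))

def cdf_via_density(a, t, n=50000):
    """P(T_a <= t) by midpoint-rule integration of the density on (0, t]."""
    h = t / n
    return sum(passage_density(a, (i + 0.5) * h) for i in range(n)) * h

def cdf_via_reflection(a, t):
    """P(T_a < t) = 2 P(W_t > a) = erfc(a / sqrt(2 t))."""
    return math.erfc(a / math.sqrt(2 * t))
```

The heavy $t^{-3/2}$ tail of the density is also what makes $E[T_a] = \int_0^\infty t\, f_{T_a}(t)\,dt$ diverge.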

Running maximum and minimum of Brownian motion: let $M_t = \max_{0 \leq s \leq t} W_s$ and $m_t = \min_{0 \leq s \leq t} W_s$.

Theorem 8.7
1. For each $a > 0$: $P(M_t \geq a) = P(T_a \leq t) = P(T_a < t) = 2 P(W_t > a)$.
2. For each $a < 0$: $P(m_t \leq a) = 2 P(W_t < a)$.

Theorem 8.8 (Joint distribution) Let $y \geq \max(x, 0)$. Then
$$P(W_t \leq x, \, M_t \geq y) = P(W_t \geq 2y - x).$$
This implies that the joint distribution of $(W_t, M_t)$ has the following density: for $y \geq 0$, $x \leq y$,
$$f_{W,M}(x, y) = \frac{2(2y - x)}{\sqrt{2\pi} \, t^{3/2}} \, e^{-\frac{(2y - x)^2}{2t}}.$$

Let $a < 0 < b$ and let $\tau$ be the first time to leave the interval $(a, b)$ (exit time), i.e.,
$$\tau = \inf\{t > 0 : W_t \notin (a, b)\}.$$
We have $\tau = \min(T_a, T_b) = T_a \wedge T_b$.

Theorem 8.9 (Exit time) Let $a < 0 < b$ and $\tau = T_a \wedge T_b$ as above. Then $P(\tau < \infty) = 1$ and $E[\tau] = |a| b$. Moreover,
$$P(W_\tau = b) = \frac{|a|}{|a| + b}, \qquad P(W_\tau = a) = \frac{b}{|a| + b}.$$
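For the simple random walk the analogous exit probabilities solve the discrete Dirichlet problem $p(x) = \tfrac{1}{2}(p(x-1) + p(x+1))$ with $p(a) = 0$, $p(b) = 1$, whose solution at $0$ is again $|a|/(|a| + b)$. A sketch that iterates this equation to its fixed point (the iteration scheme is my choice, not from the notes):

```python
def hit_b_before_a(a, b, start=0, tol=1e-12):
    """Probability that a simple symmetric random walk started at `start`
    reaches the integer level b before the integer level a (a < start < b),
    computed by sweeping p(x) = (p(x-1) + p(x+1)) / 2 until convergence."""
    xs = list(range(a, b + 1))
    p = {x: 0.0 for x in xs}
    p[b] = 1.0
    while True:
        delta = 0.0
        for x in xs[1:-1]:
            new = 0.5 * (p[x - 1] + p[x + 1])
            delta = max(delta, abs(new - p[x]))
            p[x] = new
        if delta < tol:
            return p[start]
```

The fixed point is the linear function $p(x) = (x - a)/(b - a)$, matching Theorem 8.9 at $x = 0$.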

9. Quadratic variation and path properties of BM

Definition 9.1 A stochastic process has finite quadratic variation if there exists an a.s. finite stochastic process $([X, X]_t)_{t \geq 0}$ such that for all $t$ and all partitions $\pi_n = \{t^n_0, \dots, t^n_{m_n}\}$ with $0 = t^n_0 < t^n_1 < \dots < t^n_{m_n} = t$ and $\delta_n = \max_{i=1,\dots,m_n} (t^n_i - t^n_{i-1}) \to 0$ we have, as $n \to \infty$,
$$V^2_t(\pi_n) := \sum_{i=1}^{m_n} \left( X_{t^n_i} - X_{t^n_{i-1}} \right)^2 \to [X, X]_t \quad \text{in probability.} \tag{9.1}$$

Theorem 9.2 Let $(W_t)_{t \geq 0}$ be a Brownian motion. Then $V^2_t(\pi_n) \to t$ in $L^2$; that means
$$\lim_{n \to \infty} E\left[ \left( V^2_t(\pi_n) - t \right)^2 \right] = 0.$$
Therefore it follows that $V^2_t(\pi_n) \to t$ in probability. Hence $[W, W]_t = t$ a.s.
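Theorem 9.2 is easy to see numerically: for a fine partition, the squared increments of a simulated path sum to approximately $t$, while the absolute increments (the first variation) blow up like $\sqrt{2nt/\pi}$. A sketch (seed and partition size are illustration choices):

```python
import math
import random

def brownian_increments(n, t=1.0, seed=0):
    """n iid N(0, t/n) increments of a Brownian path on [0, t]."""
    rng = random.Random(seed)
    return [rng.gauss(0.0, math.sqrt(t / n)) for _ in range(n)]

dw = brownian_increments(200000)
quad_var = sum(x * x for x in dw)    # V_t^2(pi_n): concentrates near t = 1
first_var = sum(abs(x) for x in dw)  # first variation: diverges as n grows
```

The divergence of `first_var` is the numerical face of the infinite-variation statement for Brownian paths.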

Corollary 9.3 $(W_t)_{t \geq 0}$ has a.s. paths of infinite variation on each interval.

Corollary 9.4 For almost all $\omega$, the path $t \mapsto W_t(\omega)$ is not monotone on any interval.

Remark 9.5 Almost all paths of $(W_t)_{t \geq 0}$ are not differentiable on any interval.

10. Stochastic integral by Ito

For the whole chapter let $(W_t)_{t \geq 0}$ be a standard Brownian motion on $(\Omega, \mathcal{F}, (\mathcal{F}_t)_{t \geq 0}, P)$. Everything will be based on this filtered probability space.

Recall the Riemann integral:
$$\int_a^b f(t) \, dt = \lim_{\delta_n \to 0} \sum_{i=1}^{n} f(\xi^n_i)(t^n_i - t^n_{i-1}),$$
where $a = t^n_0 < t^n_1 < \dots < t^n_n = b$, $t^n_{i-1} \leq \xi^n_i \leq t^n_i$ and $\delta_n = \max_{i=1,\dots,n}(t^n_i - t^n_{i-1})$.

Riemann-Stieltjes integral: let $f$ be a continuous function and $g$ be a function of bounded variation. Then the following limit exists and is called the R-S integral of $f$ with respect to $g$:
$$\int_a^b f(t) \, dg(t) := \lim_{\delta_n \to 0} \sum_{i=1}^{n} f(t^n_{i-1}) \big( g(t^n_i) - g(t^n_{i-1}) \big),$$
where $a = t^n_0 < t^n_1 < \dots < t^n_n = b$ and $\delta_n = \max_{i=1,\dots,n}(t^n_i - t^n_{i-1})$.

Theorem 10.1 Fix $[0, t]$ and let, for each $n$, $0 = t^n_0 < t^n_1 < \dots < t^n_n = t$. If $g$ is a function such that
$$\lim_{\delta_n \to 0} \sum_{i=1}^{n} f(t^n_{i-1}) \big( g(t^n_i) - g(t^n_{i-1}) \big)$$
exists for each continuous function $f : [0, t] \to \mathbb{R}$, then $g$ is of bounded variation on $[0, t]$.

Remark 10.2 There is no pathwise R-S integral for Brownian motion!

Ito integral with simple integrands:

Definition 10.3 A stochastic process $(X_t)_{0 \leq t \leq T}$ is called a simple predictable integrand if
$$X_t = \xi_0 \mathbb{1}_{[t_0, t_1]}(t) + \sum_{i=1}^{n-1} \xi_i \mathbb{1}_{(t_i, t_{i+1}]}(t),$$
where $0 = t_0 < t_1 < \dots < t_n = T$ and $\xi_0, \dots, \xi_{n-1}$ are random variables such that $\xi_i$ is $\mathcal{F}_{t_i}$-measurable and $E[\xi_i^2] < \infty$, for $i = 0, \dots, n-1$. For each $t \leq T$ the Ito integral at time $t$ for a simple integrand $X$ is given by
$$\int_0^t X_s \, dW_s =: (X \cdot W)_t = \sum_{i=0}^{n-1} \xi_i \left( W_{t_{i+1} \wedge t} - W_{t_i \wedge t} \right).$$

Remark 10.4 If $t < T$ is such that $t_k \leq t < t_{k+1} \leq t_n = T$, this means
$$(X \cdot W)_t = \sum_{i=0}^{k-1} \xi_i (W_{t_{i+1}} - W_{t_i}) + \xi_k (W_t - W_{t_k}).$$
For $T$ this means
$$(X \cdot W)_T = \sum_{i=0}^{n-1} \xi_i (W_{t_{i+1}} - W_{t_i}).$$

Theorem 10.5 (Properties of the Ito integral for simple integrands)
1. Linearity: for $X, Y$ simple and constants $\alpha, \beta$:
$$\int_0^T (\alpha X_t + \beta Y_t) \, dW_t = \alpha \int_0^T X_t \, dW_t + \beta \int_0^T Y_t \, dW_t.$$
2. Let $0 \leq a < b \leq T$. Then $\int_0^T \mathbb{1}_{(a,b]}(t) \, dW_t = W_b - W_a$ and $\int_0^T \mathbb{1}_{(a,b]}(t) X_t \, dW_t = \int_a^b X_t \, dW_t$.
3. Zero mean: $E\left[\int_0^T X_t \, dW_t\right] = 0$.
4. Isometry: $E\left[\left(\int_0^T X_t \, dW_t\right)^2\right] = \int_0^T E[X_t^2] \, dt$.

(More) general integrands:

Lemma 10.6 Let $(X_t)_{0 \leq t \leq T}$ be a bounded and progressively measurable stochastic process. Then there exists a sequence of simple predictable processes $(X^m_t)_{0 \leq t \leq T}$, $m \geq 1$, such that
$$\lim_{m \to \infty} E\left[\int_0^T |X^m_t - X_t|^2 \, dt\right] = 0. \tag{10.1}$$

Lemma 10.7 Let $(X_t)_{0 \leq t \leq T}$ be a progressively measurable stochastic process such that
$$E\left[\int_0^T X_t^2 \, dt\right] < \infty. \tag{10.2}$$
Then there exists a sequence of simple predictable processes $(X^m_t)_{0 \leq t \leq T}$, $m \geq 1$, such that (10.1) holds, i.e.,
$$\lim_{m \to \infty} E\left[\int_0^T |X^m_t - X_t|^2 \, dt\right] = 0.$$

Theorem 10.8 (Ito integral for general integrands with (10.2)) Let $(X_t)_{0 \leq t \leq T}$ be a progressively measurable stochastic process such that (10.2) holds, i.e., $E[\int_0^T X_t^2 \, dt] < \infty$. Then the stochastic integral $\int_0^T X_t \, dW_t$ is defined and has the following properties:
1. Linearity: for $X, Y$ as above,
$$\int_0^T (\alpha X_t + \beta Y_t) \, dW_t = \alpha \int_0^T X_t \, dW_t + \beta \int_0^T Y_t \, dW_t.$$
2. Let $0 \leq a < b \leq T$. Then $\int_0^T \mathbb{1}_{(a,b]}(t) \, dW_t = W_b - W_a$ and $\int_0^T \mathbb{1}_{(a,b]}(t) X_t \, dW_t = \int_a^b X_t \, dW_t$.
3. Zero mean: $E\left[\int_0^T X_t \, dW_t\right] = 0$.
4. Isometry: $E\left[\left(\int_0^T X_t \, dW_t\right)^2\right] = \int_0^T E[X_t^2] \, dt$.

Theorem 10.9 (Ito integral for even more general integrands) Let $(X_t)_{0 \leq t \leq T}$ be a progressively measurable stochastic process such that
$$P\left(\int_0^T X_t^2 \, dt < \infty\right) = 1. \tag{10.3}$$
Then the stochastic integral $\int_0^T X_t \, dW_t$ is defined and satisfies (1) and (2) of Theorem 10.8.

Remark (Attention): in this case, (3) zero mean and (4) isometry are not satisfied in general! If, additionally, the stronger condition (10.2) holds, then we are in the case of Theorem 10.8 and zero mean and isometry hold.

Corollary Let $X$ be continuous and adapted. Then $\int_0^T X_t \, dW_t$ exists. In particular, for any continuous function $f : \mathbb{R} \to \mathbb{R}$ the stochastic integral $\int_0^T f(W_t) \, dW_t$ exists.

Remark The Ito integral is not monotone.

Ito integral process: let $X$ be progressively measurable such that (10.3) holds, i.e., $P(\int_0^T X_s^2 \, ds < \infty) = 1$. Then $\int_0^t X_s \, dW_s$ is defined for all $t \leq T$. This gives a stochastic process $(I_t)_{0 \leq t \leq T}$ with
$$I_t = \int_0^t X_s \, dW_s.$$
There exists a modification with continuous paths: we always take this modification.

Definition A martingale $(M_t)_{0 \leq t \leq T}$ is called square-integrable on $[0, T]$ if
$$\sup_{t \in [0,T]} E[M_t^2] < \infty.$$

Theorem Let $X$ be progressively measurable such that (10.2) holds, i.e., $E[\int_0^T X_s^2 \, ds] < \infty$. Then $(I_t)_{0 \leq t \leq T}$, where $I_t = \int_0^t X_s \, dW_s$, is a square-integrable martingale.

Corollary Let $f : \mathbb{R} \to \mathbb{R}$ be a continuous and bounded function. Then $(I_t)_{0 \leq t \leq T}$, where $I_t = \int_0^t f(W_s) \, dW_s$, is a square-integrable martingale.

Quadratic variation and covariation of Ito integrals:

Theorem Let $X$ be progressively measurable such that (10.3) holds. The quadratic variation of $\int_0^t X_s \, dW_s$ satisfies
$$\left[ \int_0^\cdot X_s \, dW_s, \int_0^\cdot X_s \, dW_s \right]_t = \int_0^t X_s^2 \, ds \quad \text{a.s. for all } 0 \leq t \leq T.$$

Covariation: let $I_t = \int_0^t X_s \, dW_s$ and $\tilde{I}_t = \int_0^t \tilde{X}_s \, dW_s$, for $0 \leq t \leq T$, with progressively measurable processes $X, \tilde{X}$ such that (10.3) holds. The covariation of $I$ and $\tilde{I}$ is given by
$$[I, \tilde{I}]_t := \frac{1}{2} \left( [I + \tilde{I}, I + \tilde{I}]_t - [I, I]_t - [\tilde{I}, \tilde{I}]_t \right).$$

Corollary It holds that $[I, \tilde{I}]_t = \int_0^t X_s \tilde{X}_s \, ds$ a.s.

Remark Analogously to the quadratic variation, the covariation of two stochastic processes $Y$ and $X$ can be defined as follows:
$$[Y, X]_t = \lim_{n \to \infty} \sum_{i=1}^{m_n} (Y_{t^n_i} - Y_{t^n_{i-1}})(X_{t^n_i} - X_{t^n_{i-1}}) \quad \text{in probability},$$
where $0 = t^n_0 < t^n_1 < \dots < t^n_{m_n} = t$ and $\max_{i=1,\dots,m_n}(t^n_i - t^n_{i-1}) \to 0$ as $n \to \infty$.

Integration by parts:

Theorem (Integration by parts) Let $X, Y$ be continuous and adapted processes such that (10.3) holds for both processes. Then the following holds, for each $0 \leq t \leq T$:
$$X_t Y_t = X_0 Y_0 + \int_0^t X_s \, dY_s + \int_0^t Y_s \, dX_s + [X, Y]_t.$$
Notation: $d(X_t Y_t) = X_t \, dY_t + Y_t \, dX_t + d[X, Y]_t$.

11. Ito formula: change of variables

Theorem 11.1 Let $(W_t)_{t \geq 0}$ be a Brownian motion and let $f : \mathbb{R} \to \mathbb{R}$ be a twice continuously differentiable function ($f \in C^2(\mathbb{R})$). Then, for each $t$,
$$f(W_t) - f(W_0) = \int_0^t f'(W_s) \, dW_s + \frac{1}{2} \int_0^t f''(W_s) \, ds. \tag{11.1}$$
In differential notation, (11.1) reads:
$$df(W_t) = f'(W_t) \, dW_t + \frac{1}{2} f''(W_t) \, dt.$$
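For $f(x) = x^2$, (11.1) says $W_t^2 = 2\int_0^t W_s \, dW_s + t$. Behind this sits an exact algebraic identity for left-point sums that holds for every path, with the sum of squared increments playing the role of the $t$ term; a sketch (function names are mine):

```python
def left_point_ito_sum(path):
    """sum_i W_{t_i} (W_{t_{i+1}} - W_{t_i}): the discrete Ito integral of W
    against itself, evaluated at LEFT endpoints as in the Ito construction."""
    return sum(path[i] * (path[i + 1] - path[i]) for i in range(len(path) - 1))

def check_identity(path):
    """Exact identity behind Ito's formula for f(x) = x**2:
        2 * sum W dW = W_T**2 - W_0**2 - (sum of squared increments),
    valid for EVERY path. On a Brownian path the last sum tends to t as the
    mesh shrinks, giving W_t**2 = 2 * int W dW + t."""
    qv = sum((path[i + 1] - path[i]) ** 2 for i in range(len(path) - 1))
    lhs = 2.0 * left_point_ito_sum(path)
    rhs = path[-1] ** 2 - path[0] ** 2 - qv
    return lhs, rhs
```

The quadratic-variation term surviving in the limit is exactly the extra $\frac{1}{2} f''$ contribution that the classical chain rule lacks.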


More information

Stochastic Differential Equations. Introduction to Stochastic Models for Pollutants Dispersion, Epidemic and Finance

Stochastic Differential Equations. Introduction to Stochastic Models for Pollutants Dispersion, Epidemic and Finance Stochastic Differential Equations. Introduction to Stochastic Models for Pollutants Dispersion, Epidemic and Finance 15th March-April 19th, 211 at Lappeenranta University of Technology(LUT)-Finland By

More information

Brownian Motion and Conditional Probability

Brownian Motion and Conditional Probability Math 561: Theory of Probability (Spring 2018) Week 10 Brownian Motion and Conditional Probability 10.1 Standard Brownian Motion (SBM) Brownian motion is a stochastic process with both practical and theoretical

More information

Lecture 12. F o s, (1.1) F t := s>t

Lecture 12. F o s, (1.1) F t := s>t Lecture 12 1 Brownian motion: the Markov property Let C := C(0, ), R) be the space of continuous functions mapping from 0, ) to R, in which a Brownian motion (B t ) t 0 almost surely takes its value. Let

More information

Preliminary Exam: Probability 9:00am 2:00pm, Friday, January 6, 2012

Preliminary Exam: Probability 9:00am 2:00pm, Friday, January 6, 2012 Preliminary Exam: Probability 9:00am 2:00pm, Friday, January 6, 202 The exam lasts from 9:00am until 2:00pm, with a walking break every hour. Your goal on this exam should be to demonstrate mastery of

More information

An Introduction to Stochastic Processes in Continuous Time

An Introduction to Stochastic Processes in Continuous Time An Introduction to Stochastic Processes in Continuous Time Flora Spieksma adaptation of the text by Harry van Zanten to be used at your own expense May 22, 212 Contents 1 Stochastic Processes 1 1.1 Introduction......................................

More information

Stochastic Analysis I S.Kotani April 2006

Stochastic Analysis I S.Kotani April 2006 Stochastic Analysis I S.Kotani April 6 To describe time evolution of randomly developing phenomena such as motion of particles in random media, variation of stock prices and so on, we have to treat stochastic

More information

Jump Processes. Richard F. Bass

Jump Processes. Richard F. Bass Jump Processes Richard F. Bass ii c Copyright 214 Richard F. Bass Contents 1 Poisson processes 1 1.1 Definitions............................. 1 1.2 Stopping times.......................... 3 1.3 Markov

More information

OPTIMAL SOLUTIONS TO STOCHASTIC DIFFERENTIAL INCLUSIONS

OPTIMAL SOLUTIONS TO STOCHASTIC DIFFERENTIAL INCLUSIONS APPLICATIONES MATHEMATICAE 29,4 (22), pp. 387 398 Mariusz Michta (Zielona Góra) OPTIMAL SOLUTIONS TO STOCHASTIC DIFFERENTIAL INCLUSIONS Abstract. A martingale problem approach is used first to analyze

More information

MATH 6605: SUMMARY LECTURE NOTES

MATH 6605: SUMMARY LECTURE NOTES MATH 6605: SUMMARY LECTURE NOTES These notes summarize the lectures on weak convergence of stochastic processes. If you see any typos, please let me know. 1. Construction of Stochastic rocesses A stochastic

More information

Continuous martingales and stochastic calculus

Continuous martingales and stochastic calculus Continuous martingales and stochastic calculus Alison Etheridge January 8, 218 Contents 1 Introduction 3 2 An overview of Gaussian variables and processes 5 2.1 Gaussian variables.........................

More information

Stochastic Processes. Winter Term Paolo Di Tella Technische Universität Dresden Institut für Stochastik

Stochastic Processes. Winter Term Paolo Di Tella Technische Universität Dresden Institut für Stochastik Stochastic Processes Winter Term 2016-2017 Paolo Di Tella Technische Universität Dresden Institut für Stochastik Contents 1 Preliminaries 5 1.1 Uniform integrability.............................. 5 1.2

More information

Survival Analysis: Counting Process and Martingale. Lu Tian and Richard Olshen Stanford University

Survival Analysis: Counting Process and Martingale. Lu Tian and Richard Olshen Stanford University Survival Analysis: Counting Process and Martingale Lu Tian and Richard Olshen Stanford University 1 Lebesgue-Stieltjes Integrals G( ) is a right-continuous step function having jumps at x 1, x 2,.. b f(x)dg(x)

More information

(B(t i+1 ) B(t i )) 2

(B(t i+1 ) B(t i )) 2 ltcc5.tex Week 5 29 October 213 Ch. V. ITÔ (STOCHASTIC) CALCULUS. WEAK CONVERGENCE. 1. Quadratic Variation. A partition π n of [, t] is a finite set of points t ni such that = t n < t n1

More information

Math Probability Theory and Stochastic Process

Math Probability Theory and Stochastic Process Math 288 - Probability Theory and Stochastic Process Taught by Horng-Tzer Yau Notes by Dongryul Kim Spring 217 This course was taught by Horng-Tzer Yau. The lectures were given at MWF 12-1 in Science Center

More information

Advanced Probability

Advanced Probability Advanced Probability Perla Sousi October 10, 2011 Contents 1 Conditional expectation 1 1.1 Discrete case.................................. 3 1.2 Existence and uniqueness............................ 3 1

More information

Generalized Gaussian Bridges of Prediction-Invertible Processes

Generalized Gaussian Bridges of Prediction-Invertible Processes Generalized Gaussian Bridges of Prediction-Invertible Processes Tommi Sottinen 1 and Adil Yazigi University of Vaasa, Finland Modern Stochastics: Theory and Applications III September 1, 212, Kyiv, Ukraine

More information

Lecture 19 L 2 -Stochastic integration

Lecture 19 L 2 -Stochastic integration Lecture 19: L 2 -Stochastic integration 1 of 12 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 19 L 2 -Stochastic integration The stochastic integral for processes

More information

MA50251: Applied SDEs

MA50251: Applied SDEs MA5251: Applied SDEs Alexander Cox February 12, 218 Preliminaries These notes constitute the core theoretical content of the unit MA5251, which is a course on Applied SDEs. Roughly speaking, the aim of

More information

A Concise Course on Stochastic Partial Differential Equations

A Concise Course on Stochastic Partial Differential Equations A Concise Course on Stochastic Partial Differential Equations Michael Röckner Reference: C. Prevot, M. Röckner: Springer LN in Math. 1905, Berlin (2007) And see the references therein for the original

More information

On continuous time contract theory

On continuous time contract theory Ecole Polytechnique, France Journée de rentrée du CMAP, 3 octobre, 218 Outline 1 2 Semimartingale measures on the canonical space Random horizon 2nd order backward SDEs (Static) Principal-Agent Problem

More information

Part III Stochastic Calculus and Applications

Part III Stochastic Calculus and Applications Part III Stochastic Calculus and Applications Based on lectures by R. Bauerschmidt Notes taken by Dexter Chua Lent 218 These notes are not endorsed by the lecturers, and I have modified them often significantly

More information

Lectures 22-23: Conditional Expectations

Lectures 22-23: Conditional Expectations Lectures 22-23: Conditional Expectations 1.) Definitions Let X be an integrable random variable defined on a probability space (Ω, F 0, P ) and let F be a sub-σ-algebra of F 0. Then the conditional expectation

More information

1. Stochastic Process

1. Stochastic Process HETERGENEITY IN QUANTITATIVE MACROECONOMICS @ TSE OCTOBER 17, 216 STOCHASTIC CALCULUS BASICS SANG YOON (TIM) LEE Very simple notes (need to add references). It is NOT meant to be a substitute for a real

More information

Itô s formula. Samy Tindel. Purdue University. Probability Theory 2 - MA 539

Itô s formula. Samy Tindel. Purdue University. Probability Theory 2 - MA 539 Itô s formula Samy Tindel Purdue University Probability Theory 2 - MA 539 Mostly taken from Brownian Motion and Stochastic Calculus by I. Karatzas and S. Shreve Samy T. Itô s formula Probability Theory

More information

Tools of stochastic calculus

Tools of stochastic calculus slides for the course Interest rate theory, University of Ljubljana, 212-13/I, part III József Gáll University of Debrecen Nov. 212 Jan. 213, Ljubljana Itô integral, summary of main facts Notations, basic

More information

Part IA Probability. Theorems. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015

Part IA Probability. Theorems. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015 Part IA Probability Theorems Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.

More information

On the submartingale / supermartingale property of diffusions in natural scale

On the submartingale / supermartingale property of diffusions in natural scale On the submartingale / supermartingale property of diffusions in natural scale Alexander Gushchin Mikhail Urusov Mihail Zervos November 13, 214 Abstract Kotani 5 has characterised the martingale property

More information

Martingale Theory and Applications

Martingale Theory and Applications Martingale Theory and Applications Dr Nic Freeman June 4, 2015 Contents 1 Conditional Expectation 2 1.1 Probability spaces and σ-fields............................ 2 1.2 Random Variables...................................

More information

Stochastic Calculus. Alan Bain

Stochastic Calculus. Alan Bain Stochastic Calculus Alan Bain 1. Introduction The following notes aim to provide a very informal introduction to Stochastic Calculus, and especially to the Itô integral and some of its applications. They

More information

Notes 15 : UI Martingales

Notes 15 : UI Martingales Notes 15 : UI Martingales Math 733 - Fall 2013 Lecturer: Sebastien Roch References: [Wil91, Chapter 13, 14], [Dur10, Section 5.5, 5.6, 5.7]. 1 Uniform Integrability We give a characterization of L 1 convergence.

More information

p 1 ( Y p dp) 1/p ( X p dp) 1 1 p

p 1 ( Y p dp) 1/p ( X p dp) 1 1 p Doob s inequality Let X(t) be a right continuous submartingale with respect to F(t), t 1 P(sup s t X(s) λ) 1 λ {sup s t X(s) λ} X + (t)dp 2 For 1 < p

More information

1.1 Definition of BM and its finite-dimensional distributions

1.1 Definition of BM and its finite-dimensional distributions 1 Brownian motion Brownian motion as a physical phenomenon was discovered by botanist Robert Brown as he observed a chaotic motion of particles suspended in water. The rigorous mathematical model of BM

More information

Lecture 11. Probability Theory: an Overveiw

Lecture 11. Probability Theory: an Overveiw Math 408 - Mathematical Statistics Lecture 11. Probability Theory: an Overveiw February 11, 2013 Konstantin Zuev (USC) Math 408, Lecture 11 February 11, 2013 1 / 24 The starting point in developing the

More information

Functional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals

Functional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals Functional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals Noèlia Viles Cuadros BCAM- Basque Center of Applied Mathematics with Prof. Enrico

More information

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION We will define local time for one-dimensional Brownian motion, and deduce some of its properties. We will then use the generalized Ray-Knight theorem proved in

More information

Exercises: sheet 1. k=1 Y k is called compound Poisson process (X t := 0 if N t = 0).

Exercises: sheet 1. k=1 Y k is called compound Poisson process (X t := 0 if N t = 0). Exercises: sheet 1 1. Prove: Let X be Poisson(s) and Y be Poisson(t) distributed. If X and Y are independent, then X + Y is Poisson(t + s) distributed (t, s > 0). This means that the property of a convolution

More information

Filtrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition

Filtrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition Filtrations, Markov Processes and Martingales Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition David pplebaum Probability and Statistics Department,

More information

STAT 331. Martingale Central Limit Theorem and Related Results

STAT 331. Martingale Central Limit Theorem and Related Results STAT 331 Martingale Central Limit Theorem and Related Results In this unit we discuss a version of the martingale central limit theorem, which states that under certain conditions, a sum of orthogonal

More information

Lecture 12: Diffusion Processes and Stochastic Differential Equations

Lecture 12: Diffusion Processes and Stochastic Differential Equations Lecture 12: Diffusion Processes and Stochastic Differential Equations 1. Diffusion Processes 1.1 Definition of a diffusion process 1.2 Examples 2. Stochastic Differential Equations SDE) 2.1 Stochastic

More information

Branching Processes II: Convergence of critical branching to Feller s CSB

Branching Processes II: Convergence of critical branching to Feller s CSB Chapter 4 Branching Processes II: Convergence of critical branching to Feller s CSB Figure 4.1: Feller 4.1 Birth and Death Processes 4.1.1 Linear birth and death processes Branching processes can be studied

More information

Introduction to stochastic analysis

Introduction to stochastic analysis Introduction to stochastic analysis A. Guionnet 1 2 Department of Mathematics, MIT, 77 Massachusetts Avenue, Cambridge, MA 2139-437, USA. Abstract These lectures notes are notes in progress designed for

More information

Exercises. T 2T. e ita φ(t)dt.

Exercises. T 2T. e ita φ(t)dt. Exercises. Set #. Construct an example of a sequence of probability measures P n on R which converge weakly to a probability measure P but so that the first moments m,n = xdp n do not converge to m = xdp.

More information

A NOTE ON STOCHASTIC INTEGRALS AS L 2 -CURVES

A NOTE ON STOCHASTIC INTEGRALS AS L 2 -CURVES A NOTE ON STOCHASTIC INTEGRALS AS L 2 -CURVES STEFAN TAPPE Abstract. In a work of van Gaans (25a) stochastic integrals are regarded as L 2 -curves. In Filipović and Tappe (28) we have shown the connection

More information

Fundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales

Fundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales Fundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales Prakash Balachandran Department of Mathematics Duke University April 2, 2008 1 Review of Discrete-Time

More information

Lecture 9. d N(0, 1). Now we fix n and think of a SRW on [0,1]. We take the k th step at time k n. and our increments are ± 1

Lecture 9. d N(0, 1). Now we fix n and think of a SRW on [0,1]. We take the k th step at time k n. and our increments are ± 1 Random Walks and Brownian Motion Tel Aviv University Spring 011 Lecture date: May 0, 011 Lecture 9 Instructor: Ron Peled Scribe: Jonathan Hermon In today s lecture we present the Brownian motion (BM).

More information

Risk-Minimality and Orthogonality of Martingales

Risk-Minimality and Orthogonality of Martingales Risk-Minimality and Orthogonality of Martingales Martin Schweizer Universität Bonn Institut für Angewandte Mathematik Wegelerstraße 6 D 53 Bonn 1 (Stochastics and Stochastics Reports 3 (199, 123 131 2

More information

Bernardo D Auria Stochastic Processes /12. Notes. March 29 th, 2012

Bernardo D Auria Stochastic Processes /12. Notes. March 29 th, 2012 1 Stochastic Calculus Notes March 9 th, 1 In 19, Bachelier proposed for the Paris stock exchange a model for the fluctuations affecting the price X(t) of an asset that was given by the Brownian motion.

More information

GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM

GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM STEVEN P. LALLEY 1. GAUSSIAN PROCESSES: DEFINITIONS AND EXAMPLES Definition 1.1. A standard (one-dimensional) Wiener process (also called Brownian motion)

More information

ERRATA: Probabilistic Techniques in Analysis

ERRATA: Probabilistic Techniques in Analysis ERRATA: Probabilistic Techniques in Analysis ERRATA 1 Updated April 25, 26 Page 3, line 13. A 1,..., A n are independent if P(A i1 A ij ) = P(A 1 ) P(A ij ) for every subset {i 1,..., i j } of {1,...,

More information

The concentration of a drug in blood. Exponential decay. Different realizations. Exponential decay with noise. dc(t) dt.

The concentration of a drug in blood. Exponential decay. Different realizations. Exponential decay with noise. dc(t) dt. The concentration of a drug in blood Exponential decay C12 concentration 2 4 6 8 1 C12 concentration 2 4 6 8 1 dc(t) dt = µc(t) C(t) = C()e µt 2 4 6 8 1 12 time in minutes 2 4 6 8 1 12 time in minutes

More information

Advanced Probability

Advanced Probability Advanced Probability University of Cambridge, Part III of the Mathematical Tripos Michaelmas Term 2006 Grégory Miermont 1 1 CNRS & Laboratoire de Mathématique, Equipe Probabilités, Statistique et Modélisation,

More information

6. Brownian Motion. Q(A) = P [ ω : x(, ω) A )

6. Brownian Motion. Q(A) = P [ ω : x(, ω) A ) 6. Brownian Motion. stochastic process can be thought of in one of many equivalent ways. We can begin with an underlying probability space (Ω, Σ, P) and a real valued stochastic process can be defined

More information

Exercises: sheet 1. k=1 Y k is called compound Poisson process (X t := 0 if N t = 0).

Exercises: sheet 1. k=1 Y k is called compound Poisson process (X t := 0 if N t = 0). Exercises: sheet 1 1. Prove: Let X be Poisson(s) and Y be Poisson(t) distributed. If X and Y are independent, then X + Y is Poisson(t + s) distributed (t, s > 0). This means that the property of a convolution

More information

STAT331 Lebesgue-Stieltjes Integrals, Martingales, Counting Processes

STAT331 Lebesgue-Stieltjes Integrals, Martingales, Counting Processes STAT331 Lebesgue-Stieltjes Integrals, Martingales, Counting Processes This section introduces Lebesgue-Stieltjes integrals, and defines two important stochastic processes: a martingale process and a counting

More information

A Central Limit Theorem for Fleming-Viot Particle Systems Application to the Adaptive Multilevel Splitting Algorithm

A Central Limit Theorem for Fleming-Viot Particle Systems Application to the Adaptive Multilevel Splitting Algorithm A Central Limit Theorem for Fleming-Viot Particle Systems Application to the Algorithm F. Cérou 1,2 B. Delyon 2 A. Guyader 3 M. Rousset 1,2 1 Inria Rennes Bretagne Atlantique 2 IRMAR, Université de Rennes

More information

Problem Sheet 1. You may assume that both F and F are σ-fields. (a) Show that F F is not a σ-field. (b) Let X : Ω R be defined by 1 if n = 1

Problem Sheet 1. You may assume that both F and F are σ-fields. (a) Show that F F is not a σ-field. (b) Let X : Ω R be defined by 1 if n = 1 Problem Sheet 1 1. Let Ω = {1, 2, 3}. Let F = {, {1}, {2, 3}, {1, 2, 3}}, F = {, {2}, {1, 3}, {1, 2, 3}}. You may assume that both F and F are σ-fields. (a) Show that F F is not a σ-field. (b) Let X :

More information

Exercises Measure Theoretic Probability

Exercises Measure Theoretic Probability Exercises Measure Theoretic Probability Chapter 1 1. Prove the folloing statements. (a) The intersection of an arbitrary family of d-systems is again a d- system. (b) The intersection of an arbitrary family

More information

Probability Theory I: Syllabus and Exercise

Probability Theory I: Syllabus and Exercise Probability Theory I: Syllabus and Exercise Narn-Rueih Shieh **Copyright Reserved** This course is suitable for those who have taken Basic Probability; some knowledge of Real Analysis is recommended( will

More information