Chapter 5. Brownian Motion

Brownian motion originated as a model proposed by Robert Brown in 1828 for the phenomenon of continual swarming motion of pollen grains suspended in water. In 1900, Bachelier proposed Brownian motion as a continuous-time model of stock price fluctuations, which improves on discrete-time models by allowing the stock price to change at any instant. The mathematical theory of Brownian motion was later developed by Norbert Wiener; for this reason, Brownian motion is also known as the Wiener process.

5.1 Definition of Brownian Motion

Brownian motion is a particular example of a continuous-time stochastic process. To formally define a continuous-time stochastic process {X_t : t ≥ 0}, we require a probability triple (Ω, F, P) such that X_t is F-measurable for all t. As in the discrete case, we shall rarely specify (Ω, F, P) explicitly. On the other hand, the probability space (Ω_B, F_B, P_B) induced by Brownian motion can be defined as follows.

Definition 5.1 (Probability Space of Brownian Motion). The probability space induced by Brownian motion, (Ω_B, F_B, P_B), is given by:
- Ω_B = {all continuous functions ω : [0, ∞) → R};
- F_B is the σ-field generated by the finite-dimensional sets, i.e., F_B = σ({ω : ω(t_i) ∈ B_i, i = 1, ..., n}, n ∈ N), where B_i ∈ B and t_i ∈ [0, ∞);
- P_B is a probability measure on (Ω_B, F_B) such that P_B({ω : ω(0) = 0}) = 1 and

  P_B({ω : ω(t_i) ∈ B_i, i = 1, ..., n}) = ∫_{B_1} ... ∫_{B_n} φ_{t_1,...,t_n}(x_1, ..., x_n) dx_n ... dx_1,   (5.1)

where φ_{t_1,...,t_n}(x_1, x_2, ..., x_n) is the joint p.d.f. of a Gaussian random vector with zero mean and covariance matrix (min(t_i, t_j))_{i,j}.
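The finite-dimensional law (5.1) can be sampled directly: on a uniform grid, a path is a cumulative sum of independent N(0, dt) increments, and the covariance min(s, t) can be checked empirically. A minimal sketch, assuming NumPy; the function name `simulate_bm` and the seed are ours, not from the text:

```python
import numpy as np

def simulate_bm(n_steps, T=1.0, n_paths=1, rng=None):
    """Sample Brownian paths on a uniform grid of [0, T] by summing
    independent N(0, dt) increments, consistent with (5.1)."""
    rng = np.random.default_rng(rng)
    dt = T / n_steps
    increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    paths = np.cumsum(increments, axis=1)
    return np.hstack([np.zeros((n_paths, 1)), paths])   # prepend W_0 = 0

# Empirical check of Cov(W_s, W_t) = min(s, t) at s = 0.3, t = 0.7:
paths = simulate_bm(n_steps=100, T=1.0, n_paths=200_000, rng=0)
cov = np.mean(paths[:, 30] * paths[:, 70])   # should be close to 0.3
```

With 200,000 paths the sample covariance lands within about 0.005 of min(0.3, 0.7) = 0.3.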

With the probability space (Ω_B, F_B, P_B) in place, Brownian motions are defined as the elements of Ω_B with the following properties. To simplify notation, we denote (Ω_B, F_B, P_B) by (Ω, F, P).

Definition 5.2 (Brownian motion). A real-valued stochastic process {W_t : t ≥ 0} is a P-Brownian motion (or a P-Wiener process) if, for some σ ∈ R_+, under P:
1. (Stationary increments) For each s ≥ 0 and t > 0, the random variable W_{t+s} − W_s is normally distributed with mean zero and variance σ²t.
2. (Independent increments) For each n ≥ 1 and any time indexes 0 ≤ t_0 ≤ t_1 ≤ ... ≤ t_n, the random variables {W_{t_r} − W_{t_{r−1}}}_{r=1,...,n} are independent.
3. (Starts at 0) P(W_0 = 0) = 1.

Remark 5.1 (Standard Brownian Motion). The process {W_t : t ≥ 0} with σ² = 1 is called standard Brownian motion. The parameter σ² is known as the variance parameter. For simplicity, unless otherwise stated we shall assume that σ² = 1. Brownian motion started from x can be obtained as {x + W_t : t ≥ 0}.

Example 5.1. For any positive constant c, the process W̃_t = c^{−1/2} W_{ct} is a Brownian motion. To see this, we verify the three conditions of Definition 5.2:
1. (Stationary increments) For each s ≥ 0 and t > 0, W̃_{t+s} − W̃_s = c^{−1/2}(W_{c(t+s)} − W_{cs}), which is distributed as c^{−1/2} N(0, ct) = N(0, t).
2. (Independent increments) For each n ≥ 1 and any times 0 = t_0 < t_1 < ... < t_n, the random variables W̃_{t_r} − W̃_{t_{r−1}} = c^{−1/2}(W_{ct_r} − W_{ct_{r−1}}), r = 1, ..., n, are independent since the W_{ct_r} − W_{ct_{r−1}} are independent.
3. (Starts at 0) W̃_0 = c^{−1/2} W_0 = 0 since W_0 = 0.

Remark 5.2 (Fractal). The above example shows that Brownian motion is a fractal, i.e., a set of points that shows a self-similar pattern. For example, if c = 0.01, the graph of the rescaled process W̃_t = c^{−1/2} W_{ct} = 10 W_{0.01t} on t ∈ [0, 1] is the graph of 10 times the original Brownian motion W_t on t ∈ [0, 0.01]. Nevertheless, the shapes of {W_t}_{t∈[0,1]} and {W̃_t}_{t∈[0,1]} are similar, since they are both standard Brownian motions.
In other words, it does not matter at what scale you examine Brownian motion; it looks just the same.

Example 5.2 (Translational Invariance). For any s ≥ 0, the process W̃_t = W_{t+s} − W_s is a standard Brownian motion. The verification is left to Exercise 5.1.

Combining conditions 1 and 2 of Definition 5.2, elementary algebra yields the transition probabilities of standard Brownian motion.
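The scaling property of Example 5.1 can be checked by simulation: with c = 0.01, the rescaled value 10·W_{0.01} should be a standard normal at time 1. A small sketch, assuming NumPy; grid size and seed are arbitrary choices of ours:

```python
import numpy as np

rng = np.random.default_rng(1)
c, n_paths, n_steps = 0.01, 100_000, 100

# Build samples of W_c = W_{0.01} from 100 independent N(0, c/100) increments:
dW = rng.normal(0.0, np.sqrt(c / n_steps), size=(n_paths, n_steps))
W_c = dW.sum(axis=1)                  # W_{0.01} ~ N(0, 0.01)

W_tilde_1 = c ** -0.5 * W_c           # rescaled process at t = 1: 10 * W_{0.01}
mean_hat, var_hat = W_tilde_1.mean(), W_tilde_1.var()
# Example 5.1 predicts W_tilde_1 ~ N(0, 1)
```

With 100,000 samples, `mean_hat` is within a few thousandths of 0 and `var_hat` within about one percent of 1.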

Definition 5.3 (Transition Probabilities). The transition probability of a standard Brownian motion from x to y in time t is the conditional probability density of W_{t+s} given W_s = x, which is given by

p(t, x, y) := (1/√(2πt)) exp(−(x − y)²/(2t)).

The transition probability is a useful tool for establishing many probabilistic properties of Brownian motion. For example, a simpler formula for the joint density can be deduced (compare with (5.1)).

Proposition 5.1 (Joint Probability Density). For 0 = t_0 < t_1 < t_2 < ... < t_n, with x_0 = 0, the joint probability density function of (W_{t_1}, ..., W_{t_n}) is given explicitly by

f(x_1, ..., x_n) = ∏_{j=1}^n p(t_j − t_{j−1}, x_{j−1}, x_j).

The joint distributions of (W_{t_1}, ..., W_{t_n}) for each n ≥ 1 and all t_1, ..., t_n are known as the finite-dimensional distributions of the process.

Proof. By a simple transformation, the joint density of W_{t_1} = x_1, W_{t_2} = x_2, ..., W_{t_n} = x_n is equivalent to the joint density of W_{t_1} = x_1, W_{t_2} − W_{t_1} = x_2 − x_1, ..., W_{t_n} − W_{t_{n−1}} = x_n − x_{n−1}. By the stationary and independent increments properties, the latter set of random variables are independent and have probability densities equal to the transition probabilities p(t_j − t_{j−1}, x_{j−1}, x_j), j = 1, ..., n, respectively. By independence, the joint density is the product of the transition probabilities, and Proposition 5.1 follows.

The conditional mean and covariance can be computed easily using the stationary and independent increments of Brownian motion (see Exercise 5.6).

Lemma 5.1. For any s, t > 0:
1. E(W_{t+s} − W_s | W_r : 0 ≤ r ≤ s) = 0;
2. Cov(W_s, W_t) = min(s, t).

5.2 Continuity and Non-Differentiability of Brownian Motion

In this section we show that Brownian motion is continuous but non-differentiable. In other words, Brownian motion has a rough path.

Theorem 5.2 (Properties of Brownian Motion).
1. Brownian motion W_t is continuous in probability at any t, i.e., for any ε > 0,

   lim_{u→0} P(|W_{t+u} − W_t| ≥ ε) = 0.

2. Brownian motion W_t is almost surely non-differentiable at any t.

Proof.
1. (Continuity.) From the translation invariance property established in Example 5.2, we have that W_{t+s} − W_s has the same distribution as W_t for any s ≥ 0. Thus it suffices to show the continuity of Brownian motion at t = 0. We need to show that for any ε > 0,

   lim_{t→0} P(|W_t| ≥ ε) = 0,

which holds because

   P(|W_t| ≥ ε) = 2(1 − Φ(ε/√t)) → 2(1 − Φ(∞)) = 0

as t → 0, where Φ(·) is the c.d.f. of the standard normal distribution. In fact, Brownian motion is almost surely continuous, i.e., P(lim_{u→0} |W_{t+u} − W_t| = 0) = 1, but the proof is much more involved.

2. (Non-differentiability.) Again, by the translation invariance property, it suffices to focus on the case t = 0. If W_t were differentiable at 0, then W_t/t would converge as t ↓ 0. We will show that, with probability 1, for any n there exists a t ∈ (0, 1/n⁴] such that |W_t/t| > n, which implies that W_t is not differentiable at 0. Let

   A_n = {|W_t/t| > n for some t ∈ (0, 1/n⁴]}.

Since A_n occurs whenever |W_t/t| > n at t = 1/n⁴, we have

   P(A_n) ≥ P(|W_{1/n⁴}|/(1/n⁴) > n)   (put t = 1/n⁴)
        = P(n² |W_{1/n⁴}| > 1/n)   (simple algebra)
        = P(|W̃_1| > 1/n)   (Example 5.1: W̃_t = n² W_{t/n⁴} is a B.M.)
        → 1, as n → ∞.

Using the fact that {A_n} is a contracting sequence of events, we have

   P(W_t is non-differentiable at 0) ≥ P(∩_{n=1}^∞ A_n) = lim_{n→∞} P(A_n) = 1.
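The lower bound in the proof only samples the difference quotient at the endpoint t = 1/n⁴, and even that probability tends to 1. This can be illustrated numerically (a sketch assuming NumPy; sample sizes and seed are ours):

```python
import numpy as np

rng = np.random.default_rng(4)
for n in (1, 2, 5, 10):
    t = 1.0 / n**4                      # right endpoint of (0, 1/n^4]
    W_t = rng.normal(0.0, np.sqrt(t), size=100_000)   # W_t ~ N(0, t)
    p_hat = np.mean(np.abs(W_t / t) > n)
    print(f"n={n:>2}  P(|W_t/t| > n at t = 1/n^4) ~ {p_hat:.3f}")
# theory: P(|W_t/t| > n) = P(|Z| > 1/n) -> 1 as n -> infinity
```

Already at n = 10 the estimated probability exceeds 0.9, consistent with P(|Z| > 1/10) ≈ 0.92.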

Indeed, whether such a bizarre process actually exists is far from obvious. We devote the next section to showing the existence of Brownian motion by explicitly constructing a stochastic process that satisfies Definition 5.2.

5.3 Lévy's construction of Brownian motion

In this section we present a construction of Brownian motion due to Lévy, via polygonal approximation.

A polygonal approximation

Lévy's idea is to construct a path of Brownian motion as a limit of polygonal interpolations. In particular, by specifying the endpoints of the process and then infilling the values in the middle, we can ensure the almost sure convergence of the approximation. We require the following lemma.

Lemma 5.3. Suppose that {W_t : t ≥ 0} is a Brownian motion started at x_0. Conditional on W_t = x_1, the probability density function of W_{t/2} is

p_{t/2}(x) := √(2/(πt)) exp(−(2/t)(x − (x_0 + x_1)/2)²).

In other words, W_{t/2} | (W_0 = x_0, W_t = x_1) ~ N((x_0 + x_1)/2, t/4).

Proof. By the properties of Brownian motion, (W_{t/2}, W_t) ~ N((x_0, x_0), Σ), where

Σ = [ t/2  t/2 ; t/2  t ].

Direct calculation shows that |Σ| = t²/4 and Σ^{−1} = [ 4/t  −2/t ; −2/t  2/t ]. Thus the joint p.d.f. can be expressed as

f_{W_{t/2}, W_t}(x, y) = (1/(2π√|Σ|)) exp(−(1/2)(x − x_0, y − x_0) Σ^{−1} (x − x_0, y − x_0)ᵀ) = (1/(πt)) exp(−(1/t)((x − x_0)² + (y − x)²)).

Since the marginal distribution of W_t is f_{W_t}(y) = (1/√(2πt)) exp(−(y − x_0)²/(2t)), the conditional distribution of W_{t/2} given W_t = x_1 is

f_{W_{t/2} | W_t = x_1}(x) = f_{W_{t/2}, W_t}(x, x_1)/f_{W_t}(x_1) = (1/√(2π(t/4))) exp(−(1/(2(t/4)))(x − (x_0 + x_1)/2)²),

completing the proof.

Without loss of generality we take the range of t to be [0, 1]. Lévy's construction builds, inductively, a polygonal approximation to Brownian motion from a countable collection of independent standard normal random variables. In particular, in the n-th step of the approximation, a process X_n = X_n(t) on t ∈ [0, 1] is defined on the grid G_n = {k/2^n : k = 0, 1, ..., 2^n} using the i.i.d. normal variables Z(k 2^{−n}). As the grid G_n becomes finer as n → ∞, the Brownian motion is defined as the limit of X_n. The construction is illustrated in Figure 5.1.

Lévy's construction of Brownian motion:
- Initialization. Set X_0(0) = 0, X_0(1) = Z(1) and X_0(t) = tZ(1); i.e., X_0 is a linear function on [0, 1] with vertices on the grid G_0 = {k/2^0 : k = 0, 1}.
- The inductive step. Given X_n with vertices on the grid G_n, we define X_{n+1} with vertices on G_{n+1} by infilling more points in the set G_{n+1} \ G_n. In other words, X_{n+1} takes the same values as X_n on the grid G_n. That is, denoting X_n^k := X_n(k/2^n) and X_{n+1}^k := X_{n+1}(k/2^{n+1}), we set

  X_{n+1}^{2k} = X_n^k for k = 0, 1, ..., 2^n.   (5.2)

  It remains to infill X_{n+1}^{2k−1} for k = 1, ..., 2^n. Using Lemma 5.3 with (x_0, x_1) = (X_{n+1}^{2k−2}, X_{n+1}^{2k}) and t = 1/2^n, the conditional distribution of X_{n+1}^{2k−1} is N((X_{n+1}^{2k−2} + X_{n+1}^{2k})/2, 2^{−n−2}). Therefore, we can generate X_{n+1}^{2k−1} by

  X_{n+1}^{2k−1} = (X_{n+1}^{2k−2} + X_{n+1}^{2k})/2 + 2^{−(n/2+1)} Z_{n+1}^{2k−1}.   (5.3)

  Thus we obtain X_{n+1}(t) on the grid G_{n+1}. Next, define X_{n+1}(t) on t ∈ [0, 1] by joining the points linearly, i.e., for k = 0, ..., 2^{n+1} − 1,

  X_{n+1}(t) = X_{n+1}^k + (2^{n+1} t − k)(X_{n+1}^{k+1} − X_{n+1}^k) for t ∈ [k/2^{n+1}, (k+1)/2^{n+1}].

- Repeat the induction forever (n → ∞).

Remark 5.3. The n-th process, X_n(t), is linear on each interval [(k−1)2^{−n}, k 2^{−n}], is continuous in t, and satisfies X_n(0) = 0. It is thus determined by the values {X_n^k : k = 1, ..., 2^n}. By (5.2) and the linearity in the construction, we have

(X_{n+1}^{2k−2} + X_{n+1}^{2k})/2 = (X_n^{k−1} + X_n^k)/2 = X_n((2k − 1)/2^{n+1}).

Thus (5.3) can be written as

Fig. 5.1 Lévy's sequence of polygonal approximations to Brownian motion.

X_{n+1}^{2k−1} = X_n((2k − 1)/2^{n+1}) + 2^{−(n/2+1)} Z_{n+1}^{2k−1}.   (5.4)

The representation (5.4) will be useful in the next subsection.

Convergence to Brownian motion

To justify the above construction of Brownian motion, it is important to show the existence of lim_{n→∞} X_n(t). Since X_n(t) is a random function, the limit should be understood as a random function X(t) on (Ω_B, F_B). We will show that X_n converges uniformly to X almost surely, i.e.,

P(lim_{n→∞} max_{t∈[0,1]} |X_n(t) − X(t)| = 0) = 1.   (5.5)

Note that the term "uniform" refers to the operation max_{t∈[0,1]}. Moreover, we need to verify that the limit X(t) satisfies Definition 5.2 and is hence a Brownian motion. This is the content of the following theorem.

Theorem 5.4 (Convergence of the Polygonal Construction of Brownian Motion).
1. There exists a random function X(t) on (Ω_B, F_B) satisfying (5.5).
2. The limit X(t) satisfies Definition 5.2.

Proof.
1. (Existence of Limit.) Notice that max_{t∈[0,1]} |X_{n+1}(t) − X_n(t)| is attained at a vertex t ∈ {(2k − 1)2^{−(n+1)} : k = 1, 2, ..., 2^n}. Using (5.4), we have

P(max_{t∈[0,1]} |X_{n+1}(t) − X_n(t)| ≥ 2^{−n/4})
= P(max_{1≤k≤2^n} |Z((2k − 1)2^{−(n+1)})| ≥ 2^{n/4+1})
≤ 2^n P(|Z(1)| ≥ 2^{n/4+1}).

Now using the result of Exercise 5.3 (with t = 1), for x > 0,

P(Z(1) ≥ x) ≤ (1/(x√(2π))) exp(−x²/2),

and combining this with the fact that exp(−2^{n/2+1}) < 2^{−(2n+2)} for n ≥ 4, we obtain that for n ≥ 4,

2^n P(|Z(1)| ≥ 2^{n/4+1}) ≤ 2^n · (2/(2^{n/4+1}√(2π))) exp(−2^{n/2+1}) ≤ 2^n · 2^{−(2n+2)} < 2^{−n}.

Consider now, for k > n ≥ 4,

P(max_{t∈[0,1]} |X_k(t) − X_n(t)| ≥ 2^{−n/4+3})
≤ P(Σ_{j=n}^{k−1} max_{t∈[0,1]} |X_{j+1}(t) − X_j(t)| ≥ 2^{−n/4+3})
≤ P(max_{t∈[0,1]} |X_{j+1}(t) − X_j(t)| ≥ 2^{−j/4} for some j = n, ..., k−1)
≤ Σ_{j=n}^{k−1} 2^{−j} ≤ 2^{−n+1},

where the second inequality follows from the fact that if m_j ≤ 2^{−j/4} for all j = n, ..., k−1, then Σ_{j=n}^{k−1} m_j ≤ Σ_{j=n}^{k−1} 2^{−j/4} ≤ 2^{−n/4+3}. In other words, we have

P(max_{t∈[0,1]} |X_k(t) − X_n(t)| ≥ 2^{−n/4+3}) ≤ 2^{−n+1}

for all k ≥ n (the case k = n is trivial). Since the maximum of |X_k(t) − X_n(t)| can only increase by the addition of a new vertex, the event on the left is increasing with k, hence

P(∪_{k≥n} {max_{t∈[0,1]} |X_k(t) − X_n(t)| ≥ 2^{−n/4+3}})
= lim_{k→∞} P(max_{t∈[0,1]} |X_k(t) − X_n(t)| ≥ 2^{−n/4+3}) ≤ 2^{−n+1}.

In particular, for any ε > 0,

lim_{n→∞} P(∪_{k≥n} {max_{t∈[0,1]} |X_k(t) − X_n(t)| ≥ ε}) = 0,

which implies the existence of the limit. Denote the limit by X(t) = lim_{n→∞} X_n(t).

2. (The Limit is B.M.) In the first step of the construction, X_0(0) = 0 and X_0(1) = Z(1) have the distributional properties of Brownian motion, i.e., W_0 = 0 and W_1 ~ N(0, 1). In each inductive step, a new vertex is constructed using the covariance structure of Brownian motion. Thus Properties 1-3 of Definition 5.2 are automatically satisfied for X_n(t) restricted to the grid G_n = {k 2^{−n} : k = 0, 1, ..., 2^n}. Since we do not change the values of X_k(t) on t ∈ G_n for k > n, the same must be true for the limit X on ∪_{n=1}^∞ G_n.

Before we show Properties 1-3 on the whole interval [0, 1], we show the continuity of X(t): for any ε > 0 and n ≥ 0,

P(|X(t+δ) − X(t)| > ε) ≤ P(|X(t+δ) − X_n(t+δ)| > ε/3) + P(|X_n(t+δ) − X_n(t)| > ε/3) + P(|X_n(t) − X(t)| > ε/3).

Note that the middle term goes to zero as δ → 0 by the continuity of X_n. Also, from (5.5), the first and third terms can be made arbitrarily small because n can be taken arbitrarily large. Thus P(|X(t+δ) − X(t)| > ε) converges to 0 as δ → 0, yielding the continuity of the limit.

Finally, note that any real number in [0, 1] can be arbitrarily closely approximated by elements of ∪_{n=1}^∞ G_n. Thus, by approximating any 0 < t_1 < t_2 < ... < t_n < 1 from ∪_{n=1}^∞ G_n, the continuity of X(t) implies that all the properties hold without restriction for t ∈ [0, 1].

5.4 The Reflection Principle and Hitting Times

Having proved that Brownian motion actually exists, we now explore more properties related to Brownian motion.
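The inductive scheme of the previous section is easy to implement. A minimal sketch, assuming NumPy; the function name `levy_bm` and the seeds are ours. At level n, the midpoint noise has standard deviation 2^{−(n/2+1)}, matching (5.3):

```python
import numpy as np

def levy_bm(n_levels, rng=None):
    """Lévy's polygonal construction on [0, 1]: start with the chord
    X_0(t) = t*Z(1), then repeatedly keep the current grid values (5.2)
    and infill each midpoint with the average plus N(0, 2^(-n-2)) noise (5.3)."""
    rng = np.random.default_rng(rng)
    X = np.array([0.0, rng.normal()])            # values on G_0 = {0, 1}
    for n in range(n_levels):
        noise = rng.normal(size=len(X) - 1)
        mid = 0.5 * (X[:-1] + X[1:]) + 2.0 ** (-n / 2 - 1) * noise
        new = np.empty(2 * len(X) - 1)
        new[0::2] = X                            # even indices: old grid G_n
        new[1::2] = mid                          # odd indices: infilled points
        X = new
    return X                                     # values on the grid {k * 2^(-n_levels)}

X = levy_bm(10, rng=0)                           # 2^10 + 1 grid values on [0, 1]
```

Across many independent runs, the terminal value X(1) is exactly the initial draw Z(1), so its sample variance should be close to 1.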

Continuous time martingales and stopping times

We have introduced martingales and stopping times in the discrete-time context. In fact, similar notions can be defined in continuous time. In this subsection we define continuous-time martingales and stopping times and state some useful results.

Definition 5.4 (Continuous Time Filtration). The family of σ-fields {F_t}_{t≥0} is called a filtration if F_s ⊆ F_t for all s ≤ t.

Definition 5.5 (Continuous Time Stochastic Process). A continuous-time stochastic process is a random function {X_t}_{t≥0} (or {X(t)}_{t≥0}, a function of t) from some abstract measurable space (Ω, F) to the measurable space (C, σ(C)), where C is some space of functions. Commonly used spaces of functions include C^k, the space of k-times differentiable functions, and D, the space of right-continuous functions with left limits.

Definition 5.6 (Natural Filtration). If the stochastic process X_t satisfies X_t ∈ F_t for all t ≥ 0, then X_t is adapted to F_t. If F_t^X = σ(X_s, s ≤ t), then the stochastic process X_t is adapted to F_t^X by construction. Hence, F_t^X is called the natural filtration of X_t.

Example 5.3.
i) If Z_t = ∫_0^t X_s ds, then Z_t is adapted to F_t^X.
ii) If M_t = max_{0≤s≤t} W_s, then M_t is adapted to F_t^W.
iii) If Z_t = W_{t+1}² − W_t², then Z_t is NOT adapted to F_t^W.

Definition 5.7 (Predictable Process). Given a filtration {F_t}_{t≥0}, the stochastic process {X_t}_{t≥0} is F_t-predictable if X_t is F_{t−}-measurable, where F_{t−} = σ(∪_{s<t} F_s).

Definition 5.8 (Continuous Time Martingale). Let (Ω, F, P) be a probability space with filtration {F_t}_{t≥0}. A stochastic process M_t is called a (P, F_t)-martingale if:
i) M_t is adapted to F_t;
ii) E(|M_t|) < ∞;
iii) E(M_t | F_s) = M_s if s < t.

Example 5.4. Let W_t be a standard Brownian motion and F_t^W be the natural filtration of the Brownian motion. Then:
i) W_t is a (P, F_t^W)-martingale;
ii) W_t² − t is a (P, F_t^W)-martingale.

Definition 5.9 (Stopping Time). Given a stochastic process X_t adapted to a filtration F_t, a random variable τ is called a stopping time if {τ ≤ t} ∈ F_t for all t.

In other words, with the information F_t up to time t, we can determine whether or not τ ≤ t.

Example 5.5. Given a standard Brownian motion W_t, consider the first hitting time T_a = inf{t ≥ 0 : W_t = a}. It is easy to see that T_a is a stopping time because

{T_a ≤ t} = {∃ s ∈ [0, t] : W_s = a} ∈ F_t

(the set in the middle depends only on {W_s : 0 ≤ s ≤ t}). Notice also that, by definition, if T_a < ∞, then W_{T_a} = a. Other examples of stopping times include T_{−a} := inf{t ≥ 0 : W_t = −a} and T_{(a,b)} := inf{t ≥ 0 : W_t ∉ (a, b)}. Just as for random walks, an example of a random time that is not a stopping time is the last time that a process hits some level.

From the independent increments property, Brownian motion has no memory. That is, if {W_t : t ≥ 0} is a Brownian motion and s ≥ 0 is any fixed time, then {W_{t+s} − W_s : t ≥ 0} is also a Brownian motion, independent of {W_r : 0 ≤ r ≤ s}. In fact, we have the following non-trivial property of Brownian motion (the proof is out of scope).

Theorem 5.5 (Brownian motion after stopping). If W_t is a Brownian motion and τ is a stopping time adapted to F_t^W, then

W̃_t = W_{t+τ} − W_τ, for t ≥ 0,

is also a Brownian motion.

Similar to the discrete-time case, we have the following theorems about continuous-time martingales and stopping times.

Theorem 5.6 (Stopped Martingale is a Martingale). If X_t is a (P, F_t)-martingale and τ is a stopping time adapted to F_t, then X_{t∧τ} is also a (P, F_t)-martingale.

Theorem 5.7 (Optional Stopping Theorem). Suppose that X_t is a (P, F_t)-martingale and τ is a stopping time adapted to F_t such that the following conditions hold:
1. τ < ∞ a.s.;
2. X_τ is integrable, i.e., E(|X_τ|) < ∞;
3. E(X_t 1_{τ>t}) → 0 as t → ∞.
Then E(X_τ) = E(X_0).
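As an illustration of the optional stopping theorem: for the exit time T_{(−a,b)} of the interval (−a, b), E(W_T) = 0 forces p·b − (1 − p)·a = 0, i.e., P(hit b before −a) = a/(a + b). A Monte Carlo sketch of this consequence (assuming NumPy; the Euler step size and seed are arbitrary choices, and the discrete grid introduces a small overshoot bias):

```python
import numpy as np

rng = np.random.default_rng(5)
a, b, dt, n_paths = 1.0, 2.0, 1e-3, 20_000

W = np.zeros(n_paths)
alive = np.ones(n_paths, dtype=bool)    # paths still inside (-a, b)
hit_b = np.zeros(n_paths, dtype=bool)
while alive.any():
    W[alive] += rng.normal(0.0, np.sqrt(dt), size=int(alive.sum()))
    hit_b |= alive & (W >= b)           # record which boundary was hit first
    alive &= (W > -a) & (W < b)

p_hat = hit_b.mean()                    # estimate of P(hit b before -a)
# optional stopping predicts p = a / (a + b) = 1/3 here
```

With a = 1, b = 2, the estimate lands close to 1/3.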

The reflection principle

Not surprisingly, there is often much to be gained from exploiting the symmetry inherent in Brownian motion. As a warm-up we calculate the distribution function of T_a = inf{t ≥ 0 : W_t = a}.

Lemma 5.8 (Distribution function of first hitting time). Let {W_t : t ≥ 0} be a P-Brownian motion started from W_0 = 0 and let a > 0. Then

P(T_a < t) = 2 P(W_t > a).

Proof. If W_t > a, then by continuity of the Brownian path, T_a < t. Moreover, from Theorem 5.5, {W_{t+T_a} − W_{T_a} : t ≥ 0} is a Brownian motion. So, by symmetry,

P(W_t − W_{T_a} > 0 | T_a < t) = 1/2.

Thus

P(W_t > a) = P(T_a < t, W_t − W_{T_a} > 0) = P(T_a < t) P(W_t − W_{T_a} > 0 | T_a < t) = (1/2) P(T_a < t).

By verifying the definition of Brownian motion, we obtain a more refined version of the above idea.

Lemma 5.9 (The reflection principle). Let {W_t : t ≥ 0} be a standard Brownian motion and let T be a stopping time. Define

W̃_t := W_t, if t ≤ T;  2W_T − W_t, if t > T.

Then {W̃_t : t ≥ 0} is also a standard Brownian motion.

Notice that if T = T_a (the first hitting time of a), then W̃_t reflects the portion of the path after T_a about the line x = a (see Figure 5.2). The reflection principle is the key to proving the following result, which is useful for pricing certain barrier options (see Example 6.12).

Lemma 5.10 (Joint distribution of Brownian motion and its maximum). Let M_t := max_{s∈[0,t]} W_s, the maximum level reached by Brownian motion in the time interval [0, t]. Then for a > 0, a ≥ x and all t ≥ 0,

P(M_t ≥ a, W_t ≤ x) = 1 − Φ((2a − x)/√t),

Fig. 5.2 The reflection principle when T = T_a.

where Φ(x) is the c.d.f. of the standard normal distribution N(0, 1).

Proof. Notice that M_t ≥ 0 is non-decreasing in t and that if, for a > 0, T_a is defined to be the first hitting time of level a, then {M_t ≥ a} = {T_a ≤ t}. On {T_a ≤ t}, we can take T = T_a in Lemma 5.9 (reflection principle) to obtain

{W_t ≤ x} = {2W_{T_a} − W̃_t ≤ x} = {2a − x ≤ W̃_t}.

Thus we have

P(M_t ≥ a, W_t ≤ x) = P(T_a ≤ t, W_t ≤ x)
= P(T_a ≤ t, 2a − x ≤ W̃_t)
= P(2a − x ≤ W̃_t)   (see explanation below)
= 1 − Φ((2a − x)/√t).

Note that the reflection principle is cleverly used in the third equality to facilitate the derivation of the joint density: as we focus on a ≥ x, we have 2a − x ≥ a. If 2a − x ≤ W̃_t, then necessarily W̃_t has hit a before time t (if W̃_t had not hit a, then by definition W̃_t < a ≤ 2a − x). Now, since W̃_t has hit level a, T_a ≤ t must have happened, so we only need to consider the probability of the event {2a − x ≤ W̃_t}.

With the distributional result on the first hitting time T_a in Lemma 5.8, we have the following result.
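Lemma 5.10 can be spot-checked by Monte Carlo. A sketch assuming NumPy; the grid size and seed are ours, and since the discrete running maximum slightly undercounts the true maximum, the match is only approximate:

```python
import numpy as np
from math import erf, sqrt

def Phi(z):
    """Standard normal c.d.f."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

rng = np.random.default_rng(2)
a, x, t = 1.0, 0.5, 1.0                  # requires a > 0 and a >= x
n_paths, n_steps = 50_000, 2_000
dt = t / n_steps

W = np.zeros(n_paths)
M = np.zeros(n_paths)
for _ in range(n_steps):                 # evolve all paths one step at a time
    W += rng.normal(0.0, np.sqrt(dt), size=n_paths)
    np.maximum(M, W, out=M)              # running maximum: discrete proxy for M_t

est = np.mean((M >= a) & (W <= x))
exact = 1.0 - Phi((2 * a - x) / np.sqrt(t))   # Lemma 5.10
```

With these parameters the exact value is about 0.067 and the estimate lands within about 0.005 of it.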

Theorem 5.11 (Divergence and Return). Brownian motion will almost surely eventually hit any and every real value, no matter how large or how negative, i.e.,

P(sup_{s≥0} W_s = ∞) = P(inf_{s≥0} W_s = −∞) = 1.

Also, no matter how far above the axis, the Brownian motion will almost surely be back down to zero at some later time.

Proof. Note the equivalence of the events {T_a ≤ t} = {sup_{s≤t} W_s ≥ a}. Since {sup_{s≤t} W_s ≥ a} is a contracting sequence of events as a increases, and an increasing sequence of events as t increases, we have

P(sup_{s≥0} W_s = ∞)
= lim_{a→∞} P(sup_{s≥0} W_s ≥ a)
= lim_{a→∞} lim_{t→∞} P(sup_{0≤s≤t} W_s ≥ a)
= lim_{a→∞} lim_{t→∞} P(T_a ≤ t)
= lim_{a→∞} lim_{t→∞} 2 P(W_t ≥ a)   (Lemma 5.8)
= lim_{a→∞} lim_{t→∞} 2(1 − Φ(a/√t))
= lim_{a→∞} 2(1 − Φ(0))
= 1.

The equality P(inf_{s≥0} W_s = −∞) = 1 can be shown similarly (Exercise 5.5). Lastly, we show the second statement. Given any s, from Example 5.2, W̃_t := W_{t+s} − W_s is a Brownian motion. As we just showed, W̃_t hits any value, say u, no matter how large or negative; then W̄_t := W_{t+T_u} − W_{T_u} is a Brownian motion. The fact that W̄_t hits any value implies that W_t must return to 0.

Hitting a sloping line

For pricing a perpetual American put option, we shall use the following result.

Proposition 5.12 (Hitting an Upward Sloping Line). Set T_{a,b} := inf{t ≥ 0 : W_t = a + bt}, where T_{a,b} is taken to be infinity if no such time exists. Then for θ > 0, a > 0 and b ≥ 0,

E exp(−θ T_{a,b}) = exp(−a(b + √(b² + 2θ))).

Before we prove Proposition 5.12, we consider the simple case b = 0.

Lemma 5.13. Let W_t be a Brownian motion and let T_a := inf{t ≥ 0 : W_t = a}. Then for θ > 0 and a ≠ 0,

E exp(−θ T_a) = exp(−|a| √(2θ)).

Proof. First assume that a > 0 (the case a < 0 follows by symmetry). We apply the optional stopping theorem to the martingale

M_t = e^{σ W_t − (1/2)σ² t}

and the stopping time T_a. Since M_{T_a ∧ t} is a stopped martingale, we have

1 = E(M_0) = E(M_{T_a ∧ 0}) = E(M_{T_a ∧ n}).   (5.6)

Note from Theorem 5.11 that P(T_a < ∞) = 1. Thus lim_{n→∞} M_{T_a ∧ n} = M_{T_a}. Moreover, M_{T_a ∧ n} = e^{σ W_{T_a ∧ n} − (1/2)σ²(T_a ∧ n)} ≤ e^{σa}, so the Dominated Convergence Theorem (DCT) can be applied. To be specific, taking limits on both sides of (5.6),

1 = lim_{n→∞} E(M_{T_a ∧ n}) = E(lim_{n→∞} M_{T_a ∧ n})   (DCT) = E(M_{T_a}) = e^{σa} E(e^{−(1/2)σ² T_a}).   (5.7)

The proof is completed by taking θ = (1/2)σ².

Proof (of Proposition 5.12). Fix θ > 0 and, for a > 0, b ≥ 0, set

ψ(a, b) := E exp(−θ T_{a,b}).   (5.8)

Note that for any a_1 and a_2 such that a = a_1 + a_2, the first hitting time of a + bt can be decomposed into the sum of two independent first hitting times of the lines a_1 + bt and a_2 + bt, i.e.,

T_{a_1+a_2, b} = T_{a_1, b} + (T_{a_1+a_2, b} − T_{a_1, b}) =: T_{a_1, b} + T̃_{a_2, b},

where T̃_{a_2, b} is independent of T_{a_1, b} and has the same distribution as T_{a_2, b} (see Figure 5.3). In other words,

ψ(a_1 + a_2, b) = ψ(a_1, b) ψ(a_2, b),

and this implies (see Exercise 5.24) that

ψ(a, b) = exp(−k(b) a)

for some function k(b). Since b ≥ 0, the process must hit the level a before it can hit the line x = a + bt. Thus T_{a,b} can be decomposed into two parts: T_{a,b} = T_a + T̃_{bT_a, b}, where T_a = inf{t : W_t = a} (see Figure 5.4). Writing f_{T_a} for the p.d.f. of T_a and conditioning on T_a, we have

ψ(a, b) = E(E(exp(−θ T_{a,b}) | T_a))
= ∫_0^∞ f_{T_a}(t) E(exp(−θ T_{a,b}) | T_a = t) dt
= ∫_0^∞ f_{T_a}(t) exp(−θt) E(exp(−θ T̃_{bt, b})) dt
= ∫_0^∞ f_{T_a}(t) exp(−θt) exp(−k(b)bt) dt
= E exp(−[θ + k(b)b] T_a)
= exp(−a √(2[θ + k(b)b])).

We now have two expressions for ψ(a, b). Equating them gives

k²(b) = 2θ + 2k(b)b.

Note from (5.8) that for θ > 0 we must have ψ(a, b) ≤ 1. Thus k(b) must be positive, and we choose

k(b) = b + √(b² + 2θ),

which completes the proof.

Fig. 5.3 In the notation of Proposition 5.12, T_{a_1+a_2, b} = T_{a_1, b} + T̃_{a_2, b}, where T̃_{a_2, b} has the same distribution as T_{a_2, b}.
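The b = 0 case (Lemma 5.13) can be checked numerically against the first-passage density of T_a, f(t) = (a/√(2πt³)) e^{−a²/(2t)}, which follows from Lemma 5.8 (see Exercise 5.17). A sketch assuming NumPy; the grid bounds and the choice θ = 0.7 are ours:

```python
import numpy as np

a, theta = 1.0, 0.7
# First-passage density of T_a: f(t) = a / sqrt(2*pi*t^3) * exp(-a^2 / (2t))
t = np.linspace(1e-6, 60.0, 600_000)           # the exp(-theta*t) weight kills the heavy tail
f = a / np.sqrt(2 * np.pi * t**3) * np.exp(-a**2 / (2 * t))
g = np.exp(-theta * t) * f
lhs = np.sum(0.5 * (g[1:] + g[:-1]) * np.diff(t))   # trapezoid rule for E exp(-theta*T_a)
rhs = np.exp(-a * np.sqrt(2 * theta))               # Lemma 5.13
```

The two sides agree to several decimal places (both are about 0.306 for these parameters).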

Fig. 5.4 In the notation of Proposition 5.12, T_{a,b} = T_a + T̃_{bT_a, b}, where T̃_{bT_a, b} has the same distribution as T_{bT_a, b}.

Remark 5.4. For a real constant µ, we refer to the process W_t^µ := W_t + µt as a Brownian motion with drift µ. Then, in the notation above, T_{a,b} can be interpreted as the first hitting time of the level a by a Brownian motion with drift −b.

5.5 Variation of Brownian Motions

The notion of variation of a process will be useful in defining stochastic integrals in the next chapter.

Definition (Variation of a function). Let Π = (0 = t_0, t_1, ..., t_{N(Π)} = T) be a partition of the interval [0, T], let N(Π) be the number of intervals in Π, and let δ(Π) be the mesh of Π, i.e., the length of the longest interval in Π:

δ(Π) = max_{i=1,...,N(Π)} |t_i − t_{i−1}|.

Then the p-variation of a function f is defined as

lim_{δ→0} sup_{Π : δ(Π)=δ} Σ_{i=1}^{N(Π)} |f(t_i) − f(t_{i−1})|^p.   (5.9)
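The contrasting behaviour of the 1- and 2-variation of a Brownian path along uniform partitions can be seen numerically before we prove it. A sketch assuming NumPy (seed and partition sizes are ours): as the mesh shrinks, the sum of squared increments stabilizes near T = 1, while the sum of absolute increments blows up:

```python
import numpy as np

rng = np.random.default_rng(3)
T = 1.0
for n_steps in (100, 1_000, 10_000, 100_000):
    dW = rng.normal(0.0, np.sqrt(T / n_steps), size=n_steps)  # increments of one path
    qv = np.sum(dW**2)          # 2-variation along the uniform partition
    fv = np.sum(np.abs(dW))     # 1-variation along the same partition
    print(f"n={n_steps:>6}  sum |dW|^2 ~ {qv:.3f}  sum |dW| ~ {fv:.1f}")
```

The first-variation column grows roughly like √(2 n_steps T / π), previewing Theorem 5.14 and Corollary 5.16 below.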

First we show the convergence of the second variation, or quadratic variation, of Brownian motion.

Theorem 5.14 (Quadratic Variation of Brownian Motion). Let W_t be a Brownian motion. Define the quadratic variation (2-variation) with respect to Π by

S(Π) = Σ_{j=1}^{N(Π)} |W_{t_j} − W_{t_{j−1}}|².   (5.10)

If Π_n is a sequence of partitions of [0, T] such that δ(Π_n) → 0, then

E(|S(Π_n) − T|²) → 0.   (5.11)

Proof. To stress the dependence of the t_j on Π_n, we use t_{n,j} to denote the t_j of Π_n. Since T = Σ_{j=1}^{N(Π_n)} (t_{n,j} − t_{n,j−1}), we have

|S(Π_n) − T|² = |Σ_{j=1}^{N(Π_n)} (|W_{t_{n,j}} − W_{t_{n,j−1}}|² − (t_{n,j} − t_{n,j−1}))|²
= Σ_{j=1}^{N(Π_n)} δ_{n,j}² + 2 Σ_{j<k} δ_{n,j} δ_{n,k},   (5.12)

where

δ_{n,j} = |W_{t_{n,j}} − W_{t_{n,j−1}}|² − (t_{n,j} − t_{n,j−1}).

Since Brownian motion has independent increments, we have

E(δ_{n,j} δ_{n,k}) = E(δ_{n,j}) E(δ_{n,k}) = 0, if j ≠ k.   (5.13)

Also, using the fact that E(X²) = σ² and E(X⁴) = 3σ⁴ for X ~ N(0, σ²) (Exercise 5.11), we have

E(δ_{n,j}²) = E[|W_{t_{n,j}} − W_{t_{n,j−1}}|⁴ − 2|W_{t_{n,j}} − W_{t_{n,j−1}}|²(t_{n,j} − t_{n,j−1}) + (t_{n,j} − t_{n,j−1})²]
= 3(t_{n,j} − t_{n,j−1})² − 2(t_{n,j} − t_{n,j−1})² + (t_{n,j} − t_{n,j−1})²
= 2(t_{n,j} − t_{n,j−1})².   (5.14)

Combining (5.13) and (5.14) and taking expectations on both sides of (5.12) gives

E|S(Π_n) − T|² = Σ_{j=1}^{N(Π_n)} E(δ_{n,j}²) + 2 Σ_{j<k} E(δ_{n,j} δ_{n,k})
= Σ_{j=1}^{N(Π_n)} 2(t_{n,j} − t_{n,j−1})²
≤ 2δ(Π_n) Σ_{j=1}^{N(Π_n)} (t_{n,j} − t_{n,j−1})
= 2δ(Π_n) T → 0,

since T is fixed and δ(Π_n) → 0. Thus the proof is completed.

Motivated by the above theorem, we have the following general definition.

Definition (Quadratic Variation Process). Suppose that M_t is a martingale. The quadratic variation process associated with {M_t}_{t≥0} is the process {[M]_t}_{t≥0} such that, for any sequence of partitions Π_n of [0, T] with δ(Π_n) → 0,

E|Σ_{j=1}^{N(Π_n)} |M_{t_j} − M_{t_{j−1}}|² − [M]_T|² → 0

as n → ∞.

Corollary 5.15. Theorem 5.14 shows that the quadratic variation process of standard Brownian motion W_t is [W]_T = T.

Corollary 5.16 (First Variation of Brownian Motion). The first variation of Brownian motion is infinite on any interval [0, T].

Proof. Suppose on the contrary that the first variation is not infinite. Then there exist some T and some K such that

lim_{δ→0} sup_{Π:δ(Π)=δ} Σ_{j=1}^{N(Π)} |W_{t_j} − W_{t_{j−1}}| = K < ∞,

where Π is a partition of [0, T]. Then

lim_{δ→0} sup_{Π:δ(Π)=δ} Σ_{j=1}^{N(Π)} |W_{t_j} − W_{t_{j−1}}|²
≤ lim_{δ→0} (sup_{Π:δ(Π)=δ} max_{j=1,...,N(Π)} |W_{t_j} − W_{t_{j−1}}|)(sup_{Π:δ(Π)=δ} Σ_{j=1}^{N(Π)} |W_{t_j} − W_{t_{j−1}}|)
≤ K lim_{δ→0} sup_{Π:δ(Π)=δ} max_{j=1,...,N(Π)} |W_{t_j} − W_{t_{j−1}}|
= 0   (by the continuity of W_t),

contradicting Corollary 5.15, which states that the quadratic variation of Brownian motion is T. Thus the first variation of Brownian motion is infinite.

5.6 Exercises

Exercise 5.1. Show that the process W̃_t in Example 5.2 is a standard Brownian motion.

Exercise 5.2. Based on the probability measure in Definition 5.1, can you deduce Properties 1 and 2 of Definition 5.2 and the transition probability in Definition 5.3?

Exercise 5.3. Using integration by parts, prove that if {W_t : t ≥ 0} is a standard Brownian motion under P, then for x > 0,

P(W_t ≥ x) = ∫_x^∞ (1/√(2πt)) exp(−y²/(2t)) dy ≤ (√t/(x√(2π))) exp(−x²/(2t)).

Exercise 5.4. Find the third variation of a standard Brownian motion.

Exercise 5.5. If W_t is a Brownian motion, show that P(inf_{s≥0} W_s = −∞) = 1.

Exercise 5.6. Prove Lemma 5.1.

Exercise 5.7. Let Z be normally distributed with mean zero and variance one under the measure P. What is the distribution of √t Z? Is the process X_t := √t Z a Brownian motion?

Exercise 5.8. Suppose that W_t and W̃_t are independent Brownian motions under the measure P and let ρ ∈ [−1, 1] be a constant. Is the process X_t := ρW_t + √(1 − ρ²) W̃_t a Brownian motion?

Exercise 5.9. Let {W_t : t ≥ 0} be standard Brownian motion under the measure P. Which of the following are P-Brownian motions?
1. {−W_t : t ≥ 0};
2. {cW_{t/c²} : t ≥ 0}, where c is some non-zero constant;
3. {√t W_1 : t ≥ 0};
4. {W_{2t} − W_t : t ≥ 0}.
Justify your answers.

Exercise 5.10. Let {W_t : t ≥ 0} be standard Brownian motion under the measure P, and let F_t, t ≥ 0, be a filtration for this Brownian motion. Show that W_t² − t is a (P, F_t)-martingale.

Exercise 5.11. Suppose that X is normally distributed with mean µ and variance σ². Calculate E exp(θX) and hence evaluate E(X²) and E(X⁴).

Exercise 5.12. Brownian motion is not adequate as a stock market model. First, it has constant mean, whereas company stocks usually grow at some rate. Moreover, it may be too noisy (the variance of the path increments is large) or not noisy enough. We can scale to change the noisiness and we can artificially introduce a drift, but this is still not a good model, since it may take negative values. Suppose that {W_t : t ≥ 0} is standard Brownian motion under P. Define a new process {S_t : t ≥ 0} by S_t := µt + σW_t, where σ > 0 and µ ∈ R are constants. Show that for all values of σ > 0, µ ∈ R and T > 0 there is a positive probability that S_T is negative.

Exercise 5.13. Let W_t be a standard Brownian motion and F_t^W be the natural filtration of the Brownian motion. Show that e^{σW_t − σ²t/2} is a (P, F_t^W)-martingale.

Exercise 5.14. Let {W_t : t ≥ 0} be standard Brownian motion under P. For a > 0, let T_a be the hitting time of level a, i.e., T_a = inf{t ≥ 0 : W_t = a}. Use the fact that E exp(−θT_a) = exp(−a√(2θ)) to calculate:
1. E(T_a);
2. P(T_a < ∞).

Exercise 5.15. Let {W_t : t ≥ 0} be standard Brownian motion under P and define {M_t : t ≥ 0} by M_t := max_{s∈[0,t]} W_s. Suppose that x ≥ a. Calculate:
1. P(M_t ≥ a, W_t ≥ x);
2. P(M_t ≥ a, W_t ≤ x).

Exercise 5.16. Let {W_t : t ≥ 0} be standard Brownian motion under P and define {M_t : t ≥ 0} by M_t := max_{s∈[0,t]} W_s.
1. Show that M_t, |W_t| and M_t − W_t have the same marginal distribution, with density

f_{M_t}(z) = (2/√t) φ(z/√t) 1_{(0,∞)}(z),

where φ(·) is the p.d.f. of N(0, 1).
2. Show that E(M_t) = √(2t/π).

Exercise 5.17. Find the probability density function of the stopping time T_a = inf{t ≥ 0 : W_t = a}.

Exercise 5.18. Let {W_t : t ≥ 0} be standard Brownian motion under P. Derive the joint distribution of W_t and m_t := min_{s∈[0,t]} W_s. (Hint: consider −W_t.)

Exercise 5.19. Let {W_t : t ≥ 0} be standard Brownian motion under P, and s ∈ [0, t). Show that the conditional distribution of W_s given W_t = b is normal, and give its mean and variance.

Exercise 5.20. Let (X, Y) be a pair of random variables with joint density function

f_{X,Y}(x, y) = ((|x| + |y|)/(2√(2π))) exp(−(|x| + |y|)²/2).

Show that X and Y are standard normal random variables and that they are uncorrelated but not independent.

Exercise 5.21. Let {W_t : t ≥ 0} be standard Brownian motion under P. For the Gaussian process Y_t = exp(−αt) W_{exp(2αt)}, find its mean and covariance functions, µ(t) = E(Y_t) and γ_k = Cov(Y_t, Y_{t+k}).

Exercise 5.22. In an attempt to show the almost sure continuity of Brownian motion, Keith provided the following argument:

P(lim_{t→0} W_t = 0)
= P(∩_{n≥1} {lim_{t→0} |W_t| < 1/n})
= lim_{n→∞} P(lim_{t→0} |W_t| < 1/n)
= lim_{n→∞} P(there exists t' such that |W_t| < 1/n for all t ≤ t')
= lim_{n→∞} P(∪_{m=1}^∞ {|W_t| < 1/n for all t ≤ 1/m})
= lim_{n→∞} lim_{m→∞} P(|W_t| < 1/n for all t ≤ 1/m)
≤ lim_{n→∞} lim_{m→∞} P(|W_{1/m}| < 1/n)
= lim_{n→∞} lim_{m→∞} [Φ(√m/n) − Φ(−√m/n)]
= 1.

Is the argument correct? Explain.

Exercise 5.23. In this exercise we describe a second proof of the fact that P(sup_{t≥0} W_t = ∞) = 1. Let Z := sup_{t∈[0,∞)} W_t.
1. Show that for any c > 0, cZ has the same distribution as Z.
2. Using the above result, show that P(Z ∈ [1, n)) = 0 and P(Z ∈ (1/n, 1]) = 0 for all n ≥ 1. Thus show that, with probability one, Z ∈ {0, ∞}.

3. By noting that {Z = 0} ⊂ {W_1 < 0} ∩ {sup_{t≥0} (W_{t+1} − W_1) = 0}, and that W_{t+1} − W_1 is a Brownian motion, argue that
P(Z = 0) ≤ P(W_1 < 0) P(sup_{t≥0} (W_{t+1} − W_1) = 0) = P(W_1 < 0) P(Z = 0). (5.15)
Hence deduce that P(Z = ∞) = 1.
4. Using a similar argument, show that P(inf_{t∈[0,∞)} W_t = −∞) = 1.

Exercise. Show that if a continuous function f(x) satisfies f(a + b) = f(a) f(b) and f(1) ≠ 0, then f(x) = e^{cx} for some constant c.
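The hitting-time exercises can also be checked numerically without simulating whole paths. By the reflection principle, P(T_a ≤ t) = 2P(W_t ≥ a) = P(|Z| ≥ a/√t) for Z ~ N(0,1), so T_a has the same distribution as a²/Z². The sketch below (sample size, seed and θ are arbitrary choices) uses this representation to confirm the Laplace transform E[exp(−θT_a)] = exp(−a√(2θ)), and to illustrate that E[T_a] = ∞ even though P(T_a < ∞) = 1:

```python
import numpy as np

rng = np.random.default_rng(1)
a, theta = 1.0, 0.7

# Exact sampler: T_a has the same law as a**2 / Z**2 with Z ~ N(0,1),
# since P(T_a <= t) = P(|Z| >= a / sqrt(t)) by the reflection principle.
Z = rng.normal(size=1_000_000)
T = a**2 / Z**2

# Laplace transform check: E[exp(-theta * T_a)] = exp(-a * sqrt(2 * theta)).
mc = np.mean(np.exp(-theta * T))
exact = np.exp(-a * np.sqrt(2.0 * theta))
print(mc, exact)

# The density a * t**(-3/2) * exp(-a**2 / (2t)) / sqrt(2*pi) has a t**(-3/2)
# tail, so the sample mean is huge and unstable: E[T_a] is infinite.
print(T.mean())
```

Sampling T_a directly avoids the discretization bias of a path-based simulation, which would systematically miss level crossings between grid points.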


More information