Chapter 1. Poisson processes. 1.1 Definitions


Chapter 1

Poisson processes

1.1 Definitions

Let (Ω, F, P) be a probability space. A filtration is a collection of σ-fields F_t contained in F such that F_s ⊂ F_t whenever s < t. A filtration satisfies the usual conditions if it is complete: N ∈ F_t for all t whenever P(N) = 0; and it is right continuous: F_{t+} = F_t for all t, where F_{t+} = ∩_{ε>0} F_{t+ε}.

Definition 1.1 Let {F_t} be a filtration, not necessarily satisfying the usual conditions. A Poisson process with parameter λ > 0 is a stochastic process X satisfying the following properties:
(1) X_0 = 0, a.s.
(2) The paths of X_t are right continuous with left limits.
(3) If s < t, then X_t − X_s is a Poisson random variable with parameter λ(t − s).
(4) If s < t, then X_t − X_s is independent of F_s.

Define X_{t−} = lim_{s→t, s<t} X_s, the left hand limit at time t, and ΔX_t = X_t − X_{t−}, the size of the jump at time t. We say a function f is increasing if s < t implies f(s) ≤ f(t). We use strictly increasing when s < t implies f(s) < f(t). We have the following proposition.
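Before moving on, Definition 1.1(3)–(4) can be checked numerically. The sketch below (an illustration, not part of the text; the helper name poisson_sample is hypothetical) builds a process on a dyadic grid by summing independent Poisson increments, sampling each increment by the standard exponential-interarrival inversion (made rigorous in Section 1.4), and verifies that X_t − X_s has mean and variance close to λ(t − s), as a Poisson(λ(t − s)) variable must:

```python
import random

random.seed(0)
lam = 3.0

def poisson_sample(mean):
    # Count how many Exp(1) partial sums stay below `mean`; the count is Poisson(mean).
    total, k = random.expovariate(1.0), 0
    while total < mean:
        total += random.expovariate(1.0)
        k += 1
    return k

# Build X on a dyadic grid by summing independent Poisson(lam*dt) increments,
# as prescribed by parts (3) and (4) of Definition 1.1.
steps, dt, n_paths = 32, 2.0 / 32, 3000
diffs = []
for _ in range(n_paths):
    x = x_half = 0
    for j in range(steps):
        x += poisson_sample(lam * dt)
        if (j + 1) * dt <= 0.5:
            x_half = x
    diffs.append(x - x_half)  # a sample of X_2 - X_{0.5}

mean = sum(diffs) / n_paths
var = sum((d - mean) ** 2 for d in diffs) / n_paths
print(round(mean, 2), round(var, 2))  # both should be near lam * 1.5 = 4.5
```

For a Poisson random variable, mean and variance coincide, which is why both empirical moments land near λ(t − s).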

Proposition 1.2 Let X be a Poisson process. With probability one, the paths of X_t are increasing and are constant except for jumps of size 1. There are only finitely many jumps in each finite time interval.

Proof. For any fixed s < t, we have that X_t − X_s has the distribution of a Poisson random variable with parameter λ(t − s), hence is non-negative, a.s.; let N_{s,t} be the null set of ω's where X_t(ω) < X_s(ω). The set of pairs (s, t) with s and t rational is countable, and so N = ∪_{s,t ∈ Q+} N_{s,t} is also a null set, where we write Q+ for the non-negative rationals. For ω ∉ N, X_t ≥ X_s whenever s < t are rational. In view of the right continuity of the paths of X, this shows the paths of X are increasing with probability one.

Similarly, since Poisson random variables only take values in the non-negative integers, X_t is a non-negative integer, a.s. Using this fact for every t rational shows that with probability one, X_t takes values only in the non-negative integers when t is rational, and the right continuity of the paths implies this is also the case for all t.

Since the paths have left limits, there can only be finitely many jumps in finite time. It remains to prove that ΔX_t is either 0 or 1 for all t. Let t_0 > 0. If there were a jump of size 2 or larger at some time t strictly less than t_0, then for each n sufficiently large there exists 0 ≤ k_n ≤ 2^n such that X_{(k_n+1)t_0/2^n} − X_{k_n t_0/2^n} ≥ 2. Therefore

    P(∃ s < t_0 : ΔX_s ≥ 2) ≤ P(∃ k ≤ 2^n : X_{(k+1)t_0/2^n} − X_{k t_0/2^n} ≥ 2)      (1.1)
                            ≤ 2^n sup_{k ≤ 2^n} P(X_{(k+1)t_0/2^n} − X_{k t_0/2^n} ≥ 2)
                            = 2^n P(X_{t_0/2^n} ≥ 2)
                            ≤ 2^n (1 − P(X_{t_0/2^n} = 0) − P(X_{t_0/2^n} = 1))
                            = 2^n (1 − e^{−λt_0/2^n} − (λt_0/2^n) e^{−λt_0/2^n}).

We used Definition 1.1(3) for the two equalities. By l'Hôpital's rule, (1 − e^{−x} − x e^{−x})/x → 0 as x → 0. We apply this with x = λt_0/2^n, and see that the last line of (1.1) tends to 0 as n → ∞. Since the left hand side of (1.1) does not depend on n, it must be 0. This holds for each t_0.
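The rate at which the last line of (1.1) vanishes is easy to see numerically: since 1 − e^{−x} − x e^{−x} behaves like x²/2 for small x, the bound behaves like (λt_0)²/2^{n+1}. A minimal check (illustrative only; λ and t_0 chosen arbitrarily):

```python
import math

# Evaluate the last line of (1.1) with x = lam*t0/2^n for increasing n;
# since 1 - e^{-x} - x e^{-x} ~ x^2/2, the bound is roughly (lam*t0)^2 / 2^{n+1}.
lam, t0 = 2.0, 1.0
bounds = []
for n in range(1, 21):
    x = lam * t0 / 2 ** n
    bounds.append(2 ** n * (1 - math.exp(-x) - x * math.exp(-x)))
print(bounds[0], bounds[-1])  # strictly decreasing toward 0
```

At n = 20 the bound is of order 10⁻⁶, consistent with the (λt_0)²/2^{n+1} ≈ 2·10⁻⁶ prediction.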

1.2 Stopping times

Throughout this section we suppose we have a filtration {F_t} satisfying the usual conditions.

Definition 1.3 A random variable T : Ω → [0, ∞] is a stopping time if for all t, (T < t) ∈ F_t. We say T is a finite stopping time if T < ∞, a.s. We say T is a bounded stopping time if there exists K ∈ [0, ∞) such that T ≤ K, a.s.

Note that T can take the value infinity. Stopping times are also known as optional times.

Given a stochastic process X, we define X_T(ω) to be equal to X(T(ω), ω); that is, for each ω we evaluate t = T(ω) and then look at X(·, ω) at this time.

Proposition 1.4 Suppose {F_t} satisfies the usual conditions. Then
(1) T is a stopping time if and only if (T ≤ t) ∈ F_t for all t.
(2) If T = t, a.s., then T is a stopping time.
(3) If S and T are stopping times, then so are S ∨ T and S ∧ T.
(4) If T_n, n = 1, 2, ..., are stopping times with T_1 ≤ T_2 ≤ ···, then so is sup_n T_n.
(5) If T_n, n = 1, 2, ..., are stopping times with T_1 ≥ T_2 ≥ ···, then so is inf_n T_n.
(6) If s ≥ 0 and S is a stopping time, then so is S + s.

Proof. We will just prove part of (1), leaving the rest as an exercise. Note (T ≤ t) = ∩_{n ≥ N} (T < t + 1/n) ∈ F_{t+1/N} for each N. Thus (T ≤ t) ∈ ∩_N F_{t+1/N} ⊂ F_{t+} = F_t.

It is often useful to be able to approximate stopping times from the right. If T is a finite stopping time, that is, T < ∞, a.s., define

    T_n(ω) = (k + 1)/2^n   if   k/2^n ≤ T(ω) < (k + 1)/2^n.      (1.2)
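In code, the approximation (1.2) is a one-liner. The sketch below (illustrative only; the helper name dyadic_approx is hypothetical) also checks the two properties that make T_n useful: T < T_n ≤ T + 2^{−n}, and the approximations decrease toward T as n grows:

```python
import math

def dyadic_approx(T, n):
    # T_n = (k+1)/2^n on the event {k/2^n <= T < (k+1)/2^n}, as in (1.2):
    # the smallest dyadic point of order n strictly greater than T.
    k = math.floor(T * 2 ** n)
    return (k + 1) / 2 ** n

T = 0.7
approxs = [dyadic_approx(T, n) for n in range(1, 11)]
print(approxs[:3])  # [1.0, 0.75, 0.75]
```

Each T_n takes only countably many values, which is what makes the discretization arguments later in the chapter (Proposition 1.5(4), Theorems 1.7 and 1.14) go through.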

Define

    F_T = {A ∈ F : for each t > 0, A ∩ (T ≤ t) ∈ F_t}.      (1.3)

This definition of F_T, which is supposed to be the collection of events that are known by time T, is not very intuitive. But it turns out that this definition works well in applications.

Proposition 1.5 Suppose {F_t} is a filtration satisfying the usual conditions. Then
(1) F_T is a σ-field.
(2) If S ≤ T, then F_S ⊂ F_T.
(3) If F_{T+} = ∩_{ε>0} F_{T+ε}, then F_{T+} = F_T.
(4) If X_t has right continuous paths, then X_T is F_T-measurable.

Proof. If A ∈ F_T, then A^c ∩ (T ≤ t) = (T ≤ t) \ [A ∩ (T ≤ t)] ∈ F_t, so A^c ∈ F_T. The rest of the proof of (1) is easy.

Suppose A ∈ F_S and S ≤ T. Then A ∩ (T ≤ t) = [A ∩ (S ≤ t)] ∩ (T ≤ t). We have A ∩ (S ≤ t) ∈ F_t because A ∈ F_S, while (T ≤ t) ∈ F_t because T is a stopping time. Therefore A ∩ (T ≤ t) ∈ F_t, which proves (2).

For (3), if A ∈ F_{T+}, then A ∈ F_{T+ε} for every ε, and so A ∩ (T + ε ≤ t) ∈ F_t for all t. Hence A ∩ (T ≤ t − ε) ∈ F_t for all t, or equivalently A ∩ (T ≤ t) ∈ F_{t+ε} for all t. This is true for all ε, so A ∩ (T ≤ t) ∈ F_{t+} = F_t. This says A ∈ F_T.

(4) Define T_n by (1.2). Note

    (X_{T_n} ∈ B) ∩ (T_n = k/2^n) = (X_{k/2^n} ∈ B) ∩ (T_n = k/2^n) ∈ F_{k/2^n}.

Since T_n only takes values in {k/2^n : k ≥ 0}, we conclude (X_{T_n} ∈ B) ∩ (T_n ≤ t) ∈ F_t, and so (X_{T_n} ∈ B) ∈ F_{T_n} ⊂ F_{T+1/2^n}. Hence X_{T_n} is F_{T+1/2^n} measurable. If n ≥ m, then X_{T_n} is measurable with respect to F_{T+1/2^n} ⊂ F_{T+1/2^m}. Since X_{T_n} → X_T, then X_T is F_{T+1/2^m} measurable for each m. Therefore X_T is measurable with respect to F_{T+} = F_T.

1.3 Markov properties

Let us begin with the Markov property.

Theorem 1.6 Let {F_t} be a filtration, not necessarily satisfying the usual conditions, and let P be a Poisson process with respect to {F_t}. If u is a fixed time, then Y_t = P_{t+u} − P_u is a Poisson process independent of F_u.

Proof. Let G_t = F_{t+u}. It is clear that Y has right continuous paths, is zero at time 0, has jumps of size one, and is adapted to {G_t}. Since Y_t − Y_s = P_{t+u} − P_{s+u}, then Y_t − Y_s is a Poisson random variable with mean λ(t − s) that is independent of F_{s+u} = G_s.

The strong Markov property is the Markov property extended by replacing fixed times u by finite stopping times.

Theorem 1.7 Let {F_t} be a filtration, not necessarily satisfying the usual conditions, and let P be a Poisson process adapted to {F_t}. If T is a finite stopping time, then Y_t = P_{T+t} − P_T is a Poisson process independent of F_T.

Proof. We will first show that whenever m ≥ 1, t_1 < ··· < t_m, f is a bounded continuous function on R^m, and A ∈ F_T, then

    E[f(Y_{t_1}, ..., Y_{t_m}); A] = E[f(P_{t_1}, ..., P_{t_m})] P(A).      (1.4)

Once we have done this, we will then show how (1.4) implies our theorem.

To prove (1.4), define T_n by (1.2). We have

    E[f(P_{T_n+t_1} − P_{T_n}, ..., P_{T_n+t_m} − P_{T_n}); A]      (1.5)
        = Σ_{k=1}^∞ E[f(P_{T_n+t_1} − P_{T_n}, ..., P_{T_n+t_m} − P_{T_n}); A, T_n = k/2^n]
        = Σ_{k=1}^∞ E[f(P_{t_1+k/2^n} − P_{k/2^n}, ..., P_{t_m+k/2^n} − P_{k/2^n}); A, T_n = k/2^n].

Following the usual practice in probability that "," means "and", we use E[ · ; A, T_n = k/2^n] as an abbreviation for E[ · ; A ∩ (T_n = k/2^n)].

Since A ∈ F_T, then A ∩ (T_n = k/2^n) = A ∩ ((T < k/2^n) \ (T < (k − 1)/2^n)) ∈ F_{k/2^n}. We use the independent increments property of the Poisson process and

the fact that P_t − P_s has the same law as P_{t−s} to see that the sum in the last line of (1.5) is equal to

    Σ_{k=1}^∞ E[f(P_{t_1+k/2^n} − P_{k/2^n}, ..., P_{t_m+k/2^n} − P_{k/2^n})] P(A, T_n = k/2^n)
        = Σ_{k=1}^∞ E[f(P_{t_1}, ..., P_{t_m})] P(A, T_n = k/2^n)
        = E[f(P_{t_1}, ..., P_{t_m})] P(A),

which is the right hand side of (1.4). Thus

    E[f(P_{T_n+t_1} − P_{T_n}, ..., P_{T_n+t_m} − P_{T_n}); A] = E[f(P_{t_1}, ..., P_{t_m})] P(A).      (1.6)

Now let n → ∞. By the right continuity of the paths of P, the boundedness and continuity of f, and the dominated convergence theorem, the left hand side of (1.6) converges to the left hand side of (1.4).

If we take A = Ω in (1.4), we obtain

    E[f(Y_{t_1}, ..., Y_{t_m})] = E[f(P_{t_1}, ..., P_{t_m})]

whenever m ≥ 1, t_1, ..., t_m ∈ [0, ∞), and f is a bounded continuous function on R^m. This implies that the finite dimensional distributions of Y and P are the same. Since Y has right continuous paths, Y is a Poisson process.

Next take A ∈ F_T. By using a limit argument, (1.4) holds whenever f is the indicator of a Borel subset B of R^m, or in other words,

    P(Y ∈ B, A) = P(Y ∈ B) P(A)      (1.7)

whenever B is a cylindrical set. When we discuss the Skorokhod topology, we will be able to be more precise about the independence argument.

Observe that what was needed for the above proof to work is not that P be a Poisson process, but that the process P have right continuous paths and that P_t − P_s be independent of F_s and have the same distribution as P_{t−s}. We therefore have the following corollary.
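Before stating it, here is a simulation sketch of Theorem 1.7 (illustrative only, with arbitrarily chosen parameters): take T to be the time of the third jump, which is a finite stopping time, and check that P_{T+1} − P_T again has the mean and the variance of a Poisson variable with parameter λ:

```python
import random

random.seed(1)
lam, n_paths = 2.0, 4000
counts = []
for _ in range(n_paths):
    # jump times of a rate-lam Poisson process, generated out past time 20
    t, jumps = 0.0, []
    while t < 20.0:
        t += random.expovariate(lam)
        jumps.append(t)
    T = jumps[2]  # time of the third jump: a finite stopping time
    # Y_1 = P_{T+1} - P_T = number of jumps in (T, T+1]
    counts.append(sum(1 for u in jumps if T < u <= T + 1.0))

mean = sum(counts) / n_paths
var = sum((c - mean) ** 2 for c in counts) / n_paths
print(round(mean, 2), round(var, 2))  # both should be close to lam = 2.0
```

The same experiment with a fixed time u in place of T illustrates Theorem 1.6.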

Corollary 1.8 Let {F_t} be a filtration, not necessarily satisfying the usual conditions, and let X be a process adapted to {F_t}. Suppose X has paths that are right continuous with left limits and suppose X_t − X_s is independent of F_s and has the same law as X_{t−s} whenever s < t. If T is a finite stopping time, then Y_t = X_{T+t} − X_T is a process that is independent of F_T, and X and Y have the same law.

1.4 A characterization

Another characterization of the Poisson process is as follows. Let T_1 = inf{t : ΔX_t = 1}, the time of the first jump. Define T_{i+1} = inf{t > T_i : ΔX_t = 1}, so that T_i is the time of the i-th jump.

Proposition 1.9 The random variables T_1, T_2 − T_1, ..., T_{i+1} − T_i, ... are independent exponential random variables with parameter λ.

Proof. In view of Corollary 1.8 it suffices to show that T_1 is an exponential random variable with parameter λ. If T_1 > t, then the first jump has not occurred by time t, so X_t is still zero. Hence

    P(T_1 > t) = P(X_t = 0) = e^{−λt},

using the fact that X_t is a Poisson random variable with parameter λt.

We can reverse the characterization in Proposition 1.9 to construct a Poisson process. We do one step of the construction, leaving the rest as an exercise. Let U_1, U_2, ... be independent exponential random variables with parameter λ and let T_j = Σ_{i=1}^j U_i. Define

    X_t(ω) = k   if   T_k(ω) ≤ t < T_{k+1}(ω).      (1.8)

An examination of the densities shows that an exponential random variable is a gamma random variable with parameters λ and r = 1, so T_j is a gamma random variable with parameters λ and j. Thus

    P(X_t < k) = P(T_k > t) = ∫_t^∞ λ e^{−λx} (λx)^{k−1} / Γ(k) dx.

Performing the integration by parts repeatedly shows that

    P(X_t < k) = Σ_{i=0}^{k−1} e^{−λt} (λt)^i / i!,

and so X_t is a Poisson random variable with parameter λt.

We will use the following proposition later.

Proposition 1.10 Let {F_t} be a filtration satisfying the usual conditions. Suppose X_0 = 0, a.s., X has paths that are right continuous with left limits, X_t − X_s is independent of F_s if s < t, and X_t − X_s has the same law as X_{t−s} whenever s < t. If the paths of X are piecewise constant, increasing, all the jumps of X are of size 1, and X is not identically 0, then X is a Poisson process.

Proof. Let T_0 = 0 and T_{i+1} = inf{t > T_i : ΔX_t = 1}, i = 0, 1, 2, .... We will show that if we set U_i = T_i − T_{i−1}, then the U_i's are i.i.d. exponential random variables. By Corollary 1.8, the U_i's are independent and have the same law. Hence it suffices to show U_1 is an exponential random variable.

We observe

    P(U_1 > s + t) = P(X_{s+t} = 0) = P(X_{s+t} − X_s = 0, X_s = 0)
                   = P(X_{t+s} − X_s = 0) P(X_s = 0) = P(X_t = 0) P(X_s = 0)
                   = P(U_1 > t) P(U_1 > s).

Setting f(t) = P(U_1 > t), we thus have f(t + s) = f(t) f(s). Since f(t) is decreasing and 0 < f(t) < 1, we conclude P(U_1 > t) = f(t) = e^{−λt} for some λ > 0; that is, U_1 is an exponential random variable.

1.5 Martingales

We define continuous time martingales. Let {F_t} be a filtration, not necessarily satisfying the usual conditions.

Definition 1.11 M_t is a continuous time martingale with respect to the filtration {F_t} and the probability measure P if
(1) E|M_t| < ∞ for each t;
(2) M_t is F_t measurable for each t;
(3) E[M_t | F_s] = M_s, a.s., if s < t.

Part (2) of the definition can be rephrased as saying M_t is adapted to F_t. If in part (3) "=" is replaced by "≥", then M_t is a submartingale, and if it is replaced by "≤", then we have a supermartingale.

Taking expectations in Definition 1.11(3), we see that if s < t, then E M_s ≤ E M_t if M is a submartingale and E M_s ≥ E M_t if M is a supermartingale. Thus submartingales tend to increase, on average, and supermartingales tend to decrease, on average.

If P_t is a Poisson process with parameter λ, then P_t − λt is a continuous time martingale. To see this,

    E[P_t − λt | F_s] = E[P_t − P_s | F_s] − λt + P_s = E[P_t − P_s] − λt + P_s
                      = λ(t − s) − λt + P_s = P_s − λs.

We give another example of a martingale.

Example 1.12 Recall that given a filtration {F_t}, each F_t is contained in F, where (Ω, F, P) is our probability space. Let X be an integrable F measurable random variable, and let M_t = E[X | F_t]. Then

    E[M_t | F_s] = E[E[X | F_t] | F_s] = E[X | F_s] = M_s,

and M is a martingale.

We derive the analogs of Doob's inequalities in the stochastic process context.
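Before doing so, the computation above can be checked by simulation: since σ(P_s) ⊂ F_s, the martingale property implies E[P_t − λt | P_s = k] = k − λs. A sketch (illustrative only, with paths sampled by exponential interarrival times as in the construction (1.8)):

```python
import random

random.seed(3)
lam, s, t, n_paths = 2.0, 1.0, 3.0, 20000
pairs = []
for _ in range(n_paths):
    # record (P_s, P_t) for one path, built from Exp(lam) interarrival times
    u, ps, pt = 0.0, 0, 0
    while True:
        u += random.expovariate(lam)
        if u > t:
            break
        pt += 1
        if u <= s:
            ps += 1
    pairs.append((ps, pt))

# Martingale property: E[P_t - lam*t | P_s = k] should equal k - lam*s
for k in (1, 2, 3):
    vals = [pt - lam * t for ps, pt in pairs if ps == k]
    print(k, round(sum(vals) / len(vals), 2), k - lam * s)
```

Conditioning on P_s = k rather than on all of F_s is enough here because the increment P_t − P_s is independent of F_s.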

Theorem 1.13 Suppose M_t is a martingale or non-negative submartingale with paths that are right continuous with left limits. Then
(1) P(sup_{s ≤ t} M_s ≥ a) ≤ E|M_t| / a.
(2) If 1 < p < ∞, then

    E[sup_{s ≤ t} |M_s|^p] ≤ (p/(p − 1))^p E|M_t|^p.

Proof. We will do the case where M_t is a martingale, the submartingale case being nearly identical. Let D_n = {kt/2^n : 0 ≤ k ≤ 2^n}. If we set N_k^{(n)} = M_{kt/2^n} and G_k^{(n)} = F_{kt/2^n}, it is clear that {N_k^{(n)}} is a discrete time martingale with respect to {G_k^{(n)}}.

Let A_n = {sup_{s ≤ t, s ∈ D_n} M_s > a}. By Doob's inequality for discrete time martingales,

    P(A_n) = P(max_{k ≤ 2^n} N_k^{(n)} > a) ≤ E|N_{2^n}^{(n)}| / a = E|M_t| / a.

Note that the A_n are increasing, and since M_t is right continuous,

    ∪_n A_n = {sup_{s ≤ t} M_s > a}.

Then

    P(sup_{s ≤ t} M_s > a) = P(∪_n A_n) = lim_{n→∞} P(A_n) ≤ E|M_t| / a.

If we apply this with a replaced by a − ε and let ε → 0, we obtain (1).

The proof of (2) is similar. By Doob's inequality for discrete time martingales,

    E[sup_{k ≤ 2^n} |N_k^{(n)}|^p] ≤ (p/(p − 1))^p E|N_{2^n}^{(n)}|^p = (p/(p − 1))^p E|M_t|^p.

Since sup_{k ≤ 2^n} |N_k^{(n)}|^p increases to sup_{s ≤ t} |M_s|^p by the right continuity of M, (2) follows by Fatou's lemma.

Here is an example. If P_t is a Poisson process with parameter λ, then P_t − λt is a martingale, so e^{a(P_t − λt)} is a submartingale for any real number a. For a > 0,

    P(sup_{s ≤ t} (P_s − λs) ≥ A) = P(sup_{s ≤ t} e^{a(P_s − λs)} ≥ e^{aA}) ≤ e^{−aA} E e^{a(P_t − λt)} = e^{−aA} e^{−aλt} E e^{aP_t}.

We know E e^{aP_t} = exp((e^a − 1)λt). We substitute this in the above and then optimize over a.

We will need Doob's optional stopping theorem for continuous time martingales.

Theorem 1.14 Let {F_t} be a filtration satisfying the usual conditions. If M_t is a martingale or non-negative submartingale whose paths are right continuous, sup_t E M_t^2 < ∞, and T is a finite stopping time, then E M_T ≥ E M_0.

Proof. We do the submartingale case, the martingale case being very similar. By Doob's inequality (Theorem 1.13(1)),

    E[sup_{s ≤ t} M_s^2] ≤ 4 E M_t^2.

Letting t → ∞, we have E[sup_t M_t^2] < ∞ by Fatou's lemma.

Let us first suppose that T ≤ K, a.s., for some real number K. Define T_n by (1.2). Let N_k^{(n)} = M_{k/2^n}, G_k^{(n)} = F_{k/2^n}, and S_n = 2^n T_n. By Doob's optional stopping theorem applied to the submartingale N_k^{(n)}, we have

    E M_0 = E N_0^{(n)} ≤ E N_{S_n}^{(n)} = E M_{T_n}.

Since M is right continuous, M_{T_n} → M_T, a.s. The random variables |M_{T_n}| are bounded by 1 + sup_t M_t^2, so by dominated convergence, E M_{T_n} → E M_T.

For general T, we apply the above to the bounded stopping time T ∧ K to get E M_{T∧K} ≥ E M_0. The random variables |M_{T∧K}| are bounded by 1 + sup_t M_t^2, so by dominated convergence, we get E M_T ≥ E M_0 when we let K → ∞.

We present the continuous time version of Doob's martingale convergence theorem. We will see that not only do we get limits as t → ∞, but also a regularity result. Let D_n = {k/2^n : k ≥ 0}, D = ∪_n D_n.

Theorem 1.15 Let {M_t : t ∈ D} be either a martingale, a submartingale, or a supermartingale with respect to {F_t : t ∈ D} and suppose sup_{t∈D} E|M_t| < ∞. Then

(1) lim_{t→∞} M_t exists, a.s.
(2) With probability one M_t has left and right limits along D.

The second conclusion says that except for a null set, if t_0 ∈ [0, ∞), then both lim_{t∈D, t↑t_0} M_t and lim_{t∈D, t↓t_0} M_t exist and are finite. The null set does not depend on t_0.

Proof. Martingales are also submartingales, and if M_t is a supermartingale, then −M_t is a submartingale, so we may without loss of generality restrict our attention to submartingales. By Doob's inequality,

    P(sup_{t ∈ D_n, t ≤ n} |M_t| > λ) ≤ (1/λ) E|M_n|.

Letting n → ∞ and using Fatou's lemma,

    P(sup_{t ∈ D} |M_t| > λ) ≤ (1/λ) sup_t E|M_t|.

This is true for all λ, so with probability one, {|M_t| : t ∈ D} is a bounded set.

Therefore the only way either (1) or (2) can fail is if for some pair of rationals a < b the number of upcrossings of [a, b] by {M_t : t ∈ D} is infinite. Recall that we define upcrossings as follows. Given an interval [a, b] and a submartingale M, if S_1 = inf{t : M_t ≤ a}, T_i = inf{t > S_i : M_t ≥ b}, and S_{i+1} = inf{t > T_i : M_t ≤ a}, then the number of upcrossings up to time u is sup{k : T_k ≤ u}.

Doob's upcrossing lemma tells us that if V_n is the number of upcrossings by {M_t : t ∈ D_n ∩ [0, n]}, then

    E V_n ≤ E|M_n| / (b − a).

Letting n → ∞ and using Fatou's lemma, the number of upcrossings of [a, b] by {M_t : t ∈ D} has finite expectation, hence is finite, a.s. If N_{a,b} is the null set where the number of upcrossings of [a, b] by {M_t : t ∈ D} is infinite and N = ∪_{a<b, a,b∈Q+} N_{a,b}, where Q+ is the collection of non-negative rationals, then P(N) = 0. If ω ∉ N, then (1) and (2) hold.

As a corollary we have

Corollary 1.16 Let {F_t} be a filtration satisfying the usual conditions, and let M_t be a martingale with respect to {F_t}. Then M has a version that is also a martingale and that in addition has paths that are right continuous with left limits.

Proof. Let D be as in the above proof. For each integer N ≥ 1, E|M_t| ≤ E|M_N| < ∞ for t ≤ N, since |M_t| is a submartingale by the conditional expectation form of Jensen's inequality. Therefore M_{t∧N} has left and right limits when taking limits along t ∈ D. Since N is arbitrary, M_t has left and right limits when taking limits along t ∈ D, except for a set of ω's that form a null set. Let

    M̃_t = lim_{u ∈ D, u > t, u → t} M_u.

It is clear that M̃ has paths that are right continuous with left limits. Since F_{t+} = F_t and M̃_t is F_{t+} measurable, then M̃_t is F_t measurable.

Let N be fixed. We will show {M_t : t ≤ N} is a uniformly integrable family of random variables. Let ε > 0. Since M_N is integrable, there exists δ such that if P(A) < δ, then E[|M_N|; A] < ε. If L is large enough, P(|M_t| > L) ≤ E|M_t|/L ≤ E|M_N|/L < δ. Then

    E[|M_t|; |M_t| > L] ≤ E[|M_N|; |M_t| > L] < ε,

since |M_t| is a submartingale and (|M_t| > L) ∈ F_t. Uniform integrability is proved.

Now let t < N. If B ∈ F_t,

    E[M̃_t; B] = lim_{u ∈ D, u > t, u → t} E[M_u; B] = E[M_t; B].

Here we used the Vitali convergence theorem and the fact that M_t is a martingale. Since M̃_t is F_t measurable, this proves that M̃_t = M_t, a.s. Since N was arbitrary, we have this for all t. We thus have found a version of M that has paths that are right continuous with left limits. That M̃_t is a martingale is easy.

The following technical result will be used in the next chapter. A function f is increasing if s < t implies f(s) ≤ f(t). A process A_t has increasing paths if the function t → A_t(ω) is increasing for almost every ω.

Proposition 1.17 Suppose {F_t} is a filtration satisfying the usual conditions and suppose A_t is an adapted process with paths that are increasing, are right continuous with left limits, and A_∞ = lim_{t→∞} A_t exists, a.s. Suppose X is a non-negative integrable random variable, and M_t is a version of the martingale E[X | F_t] which has paths that are right continuous with left limits. Suppose E[X A_∞] < ∞. Then

    E ∫_0^∞ X dA_s = E ∫_0^∞ M_s dA_s.      (1.9)

Proof. First suppose X and A are bounded. Let n ≥ 1 and let us write E ∫_0^∞ X dA_s as

    Σ_{k=1}^∞ E[X (A_{k/2^n} − A_{(k−1)/2^n})].

Conditioning the k-th summand on F_{k/2^n}, this is equal to

    E[ Σ_{k=1}^∞ E[X | F_{k/2^n}] (A_{k/2^n} − A_{(k−1)/2^n}) ].

Given s and n, define s_n to be that value of k/2^n such that (k − 1)/2^n < s ≤ k/2^n. We then have

    E ∫_0^∞ X dA_s = E ∫_0^∞ M_{s_n} dA_s.      (1.10)

For any value of s, s_n ↓ s as n → ∞, and since M has right continuous paths, M_{s_n} → M_s. Since X is bounded, so is M. By dominated convergence, the right hand side of (1.10) converges to

    E ∫_0^∞ M_s dA_s.

This completes the proof when X and A are bounded. We apply this to X ∧ N and A ∧ N, let N → ∞, and use monotone convergence for the general case.

The only reason we assume X is non-negative is so that the integrals make sense. The equation (1.9) can be rewritten as

    E ∫_0^∞ X dA_s = E ∫_0^∞ E[X | F_s] dA_s.      (1.11)

We also have

    E ∫_0^t X dA_s = E ∫_0^t E[X | F_s] dA_s      (1.12)

for each t. This follows either by following the above proof or by applying Proposition 1.17 to A_{s∧t}.
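As a concrete check of (1.12) (illustrative only, with λ chosen arbitrarily), take the Poisson process itself as the integrator, A_s = P_s, and take X = P_1 with t = 1. Then ∫_0^1 X dA_s = P_1², so the left side is E[P_1²] = λ + λ², while M_s = E[P_1 | F_s] = P_s + λ(1 − s) for s ≤ 1 and dA_s places unit mass at each jump time:

```python
import random

random.seed(4)
lam, n_paths = 2.0, 20000
lhs, rhs = 0.0, 0.0
for _ in range(n_paths):
    # jump times of a rate-lam Poisson process on [0, 1]
    u, jumps = 0.0, []
    while True:
        u += random.expovariate(lam)
        if u > 1.0:
            break
        jumps.append(u)
    p1 = len(jumps)
    # Left side of (1.12): int_0^1 X dA_s = P_1 * P_1 with X = P_1, A = P.
    lhs += p1 * p1
    # Right side: sum of M_s = P_s + lam*(1 - s) over the jump times; P is
    # right continuous, so P_s has already jumped to i+1 at the i-th jump.
    rhs += sum((i + 1) + lam * (1.0 - u) for i, u in enumerate(jumps))
print(round(lhs / n_paths, 2), round(rhs / n_paths, 2))  # both should be near lam + lam**2 = 6.0
```

The agreement of the two averages is exactly the content of (1.12): the integrand X may be replaced by its conditional expectation at the time the increment of A occurs.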


Jump Processes. Richard F. Bass

© 2014 Richard F. Bass

More information

Brownian Motion and Stochastic Calculus

Brownian Motion and Stochastic Calculus ETHZ, Spring 17 D-MATH Prof Dr Martin Larsson Coordinator A Sepúlveda Brownian Motion and Stochastic Calculus Exercise sheet 6 Please hand in your solutions during exercise class or in your assistant s

More information

Filtrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition

Filtrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition Filtrations, Markov Processes and Martingales Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition David pplebaum Probability and Statistics Department,

More information

Properties of an infinite dimensional EDS system : the Muller s ratchet

Properties of an infinite dimensional EDS system : the Muller s ratchet Properties of an infinite dimensional EDS system : the Muller s ratchet LATP June 5, 2011 A ratchet source : wikipedia Plan 1 Introduction : The model of Haigh 2 3 Hypothesis (Biological) : The population

More information

An Introduction to Stochastic Processes in Continuous Time

An Introduction to Stochastic Processes in Continuous Time An Introduction to Stochastic Processes in Continuous Time Flora Spieksma adaptation of the text by Harry van Zanten to be used at your own expense May 22, 212 Contents 1 Stochastic Processes 1 1.1 Introduction......................................

More information

4th Preparation Sheet - Solutions

4th Preparation Sheet - Solutions Prof. Dr. Rainer Dahlhaus Probability Theory Summer term 017 4th Preparation Sheet - Solutions Remark: Throughout the exercise sheet we use the two equivalent definitions of separability of a metric space

More information

Martingales, standard filtrations, and stopping times

Martingales, standard filtrations, and stopping times Project 4 Martingales, standard filtrations, and stopping times Throughout this Project the index set T is taken to equal R +, unless explicitly noted otherwise. Some things you might want to explain in

More information

Weak convergence and Brownian Motion. (telegram style notes) P.J.C. Spreij

Weak convergence and Brownian Motion. (telegram style notes) P.J.C. Spreij Weak convergence and Brownian Motion (telegram style notes) P.J.C. Spreij this version: December 8, 2006 1 The space C[0, ) In this section we summarize some facts concerning the space C[0, ) of real

More information

STAT 331. Martingale Central Limit Theorem and Related Results

STAT 331. Martingale Central Limit Theorem and Related Results STAT 331 Martingale Central Limit Theorem and Related Results In this unit we discuss a version of the martingale central limit theorem, which states that under certain conditions, a sum of orthogonal

More information

MATH 56A: STOCHASTIC PROCESSES CHAPTER 6

MATH 56A: STOCHASTIC PROCESSES CHAPTER 6 MATH 56A: STOCHASTIC PROCESSES CHAPTER 6 6. Renewal Mathematically, renewal refers to a continuous time stochastic process with states,, 2,. N t {,, 2, 3, } so that you only have jumps from x to x + and

More information

Branching processes. Chapter Background Basic definitions

Branching processes. Chapter Background Basic definitions Chapter 5 Branching processes Branching processes arise naturally in the study of stochastic processes on trees and locally tree-like graphs. After a review of the basic extinction theory of branching

More information

Notes on uniform convergence

Notes on uniform convergence Notes on uniform convergence Erik Wahlén erik.wahlen@math.lu.se January 17, 2012 1 Numerical sequences We begin by recalling some properties of numerical sequences. By a numerical sequence we simply mean

More information

Chapter 5. Weak convergence

Chapter 5. Weak convergence Chapter 5 Weak convergence We will see later that if the X i are i.i.d. with mean zero and variance one, then S n / p n converges in the sense P(S n / p n 2 [a, b])! P(Z 2 [a, b]), where Z is a standard

More information

Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of. F s F t

Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of. F s F t 2.2 Filtrations Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of σ algebras {F t } such that F t F and F t F t+1 for all t = 0, 1,.... In continuous time, the second condition

More information

Functional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals

Functional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals Functional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals Noèlia Viles Cuadros BCAM- Basque Center of Applied Mathematics with Prof. Enrico

More information

Lecture 21 Representations of Martingales

Lecture 21 Representations of Martingales Lecture 21: Representations of Martingales 1 of 11 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 21 Representations of Martingales Right-continuous inverses Let

More information

7 Convergence in R d and in Metric Spaces

7 Convergence in R d and in Metric Spaces STA 711: Probability & Measure Theory Robert L. Wolpert 7 Convergence in R d and in Metric Spaces A sequence of elements a n of R d converges to a limit a if and only if, for each ǫ > 0, the sequence a

More information

Analysis Finite and Infinite Sets The Real Numbers The Cantor Set

Analysis Finite and Infinite Sets The Real Numbers The Cantor Set Analysis Finite and Infinite Sets Definition. An initial segment is {n N n n 0 }. Definition. A finite set can be put into one-to-one correspondence with an initial segment. The empty set is also considered

More information

4 Expectation & the Lebesgue Theorems

4 Expectation & the Lebesgue Theorems STA 205: Probability & Measure Theory Robert L. Wolpert 4 Expectation & the Lebesgue Theorems Let X and {X n : n N} be random variables on a probability space (Ω,F,P). If X n (ω) X(ω) for each ω Ω, does

More information

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION We will define local time for one-dimensional Brownian motion, and deduce some of its properties. We will then use the generalized Ray-Knight theorem proved in

More information

1 Sequences of events and their limits

1 Sequences of events and their limits O.H. Probability II (MATH 2647 M15 1 Sequences of events and their limits 1.1 Monotone sequences of events Sequences of events arise naturally when a probabilistic experiment is repeated many times. For

More information

Advanced Probability

Advanced Probability Advanced Probability University of Cambridge, Part III of the Mathematical Tripos Michaelmas Term 2006 Grégory Miermont 1 1 CNRS & Laboratoire de Mathématique, Equipe Probabilités, Statistique et Modélisation,

More information

Chapter 6. Markov processes. 6.1 Introduction

Chapter 6. Markov processes. 6.1 Introduction Chapter 6 Markov processes 6.1 Introduction It is not uncommon for a Markov process to be defined as a sextuple (, F, F t,x t, t, P x ), and for additional notation (e.g.,,, S,P t,r,etc.) tobe introduced

More information

X n D X lim n F n (x) = F (x) for all x C F. lim n F n(u) = F (u) for all u C F. (2)

X n D X lim n F n (x) = F (x) for all x C F. lim n F n(u) = F (u) for all u C F. (2) 14:17 11/16/2 TOPIC. Convergence in distribution and related notions. This section studies the notion of the so-called convergence in distribution of real random variables. This is the kind of convergence

More information

g 2 (x) (1/3)M 1 = (1/3)(2/3)M.

g 2 (x) (1/3)M 1 = (1/3)(2/3)M. COMPACTNESS If C R n is closed and bounded, then by B-W it is sequentially compact: any sequence of points in C has a subsequence converging to a point in C Conversely, any sequentially compact C R n is

More information

Stochastic Processes

Stochastic Processes Stochastic Processes A very simple introduction Péter Medvegyev 2009, January Medvegyev (CEU) Stochastic Processes 2009, January 1 / 54 Summary from measure theory De nition (X, A) is a measurable space

More information

CONVERGENCE OF RANDOM SERIES AND MARTINGALES

CONVERGENCE OF RANDOM SERIES AND MARTINGALES CONVERGENCE OF RANDOM SERIES AND MARTINGALES WESLEY LEE Abstract. This paper is an introduction to probability from a measuretheoretic standpoint. After covering probability spaces, it delves into the

More information

Lecture 17 Brownian motion as a Markov process

Lecture 17 Brownian motion as a Markov process Lecture 17: Brownian motion as a Markov process 1 of 14 Course: Theory of Probability II Term: Spring 2015 Instructor: Gordan Zitkovic Lecture 17 Brownian motion as a Markov process Brownian motion is

More information

9 Brownian Motion: Construction

9 Brownian Motion: Construction 9 Brownian Motion: Construction 9.1 Definition and Heuristics The central limit theorem states that the standard Gaussian distribution arises as the weak limit of the rescaled partial sums S n / p n of

More information

On the submartingale / supermartingale property of diffusions in natural scale

On the submartingale / supermartingale property of diffusions in natural scale On the submartingale / supermartingale property of diffusions in natural scale Alexander Gushchin Mikhail Urusov Mihail Zervos November 13, 214 Abstract Kotani 5 has characterised the martingale property

More information

SUMMARY OF RESULTS ON PATH SPACES AND CONVERGENCE IN DISTRIBUTION FOR STOCHASTIC PROCESSES

SUMMARY OF RESULTS ON PATH SPACES AND CONVERGENCE IN DISTRIBUTION FOR STOCHASTIC PROCESSES SUMMARY OF RESULTS ON PATH SPACES AND CONVERGENCE IN DISTRIBUTION FOR STOCHASTIC PROCESSES RUTH J. WILLIAMS October 2, 2017 Department of Mathematics, University of California, San Diego, 9500 Gilman Drive,

More information

Poisson Processes. Particles arriving over time at a particle detector. Several ways to describe most common model.

Poisson Processes. Particles arriving over time at a particle detector. Several ways to describe most common model. Poisson Processes Particles arriving over time at a particle detector. Several ways to describe most common model. Approach 1: a) numbers of particles arriving in an interval has Poisson distribution,

More information

Stochastic Analysis I S.Kotani April 2006

Stochastic Analysis I S.Kotani April 2006 Stochastic Analysis I S.Kotani April 6 To describe time evolution of randomly developing phenomena such as motion of particles in random media, variation of stock prices and so on, we have to treat stochastic

More information

3 Integration and Expectation

3 Integration and Expectation 3 Integration and Expectation 3.1 Construction of the Lebesgue Integral Let (, F, µ) be a measure space (not necessarily a probability space). Our objective will be to define the Lebesgue integral R fdµ

More information

A Concise Course on Stochastic Partial Differential Equations

A Concise Course on Stochastic Partial Differential Equations A Concise Course on Stochastic Partial Differential Equations Michael Röckner Reference: C. Prevot, M. Röckner: Springer LN in Math. 1905, Berlin (2007) And see the references therein for the original

More information

Homework 1 due on Monday September 8, 2008

Homework 1 due on Monday September 8, 2008 Homework 1 due on Monday September 8, 2008 Chapter 1, Problems 4, 5, 7. Also solve the following problems (A) Let E be the collection of intervals in = (, ) of type (r, ) where r is an arbitrary rational

More information

2.2 Some Consequences of the Completeness Axiom

2.2 Some Consequences of the Completeness Axiom 60 CHAPTER 2. IMPORTANT PROPERTIES OF R 2.2 Some Consequences of the Completeness Axiom In this section, we use the fact that R is complete to establish some important results. First, we will prove that

More information

ERRATA: Probabilistic Techniques in Analysis

ERRATA: Probabilistic Techniques in Analysis ERRATA: Probabilistic Techniques in Analysis ERRATA 1 Updated April 25, 26 Page 3, line 13. A 1,..., A n are independent if P(A i1 A ij ) = P(A 1 ) P(A ij ) for every subset {i 1,..., i j } of {1,...,

More information

Useful Probability Theorems

Useful Probability Theorems Useful Probability Theorems Shiu-Tang Li Finished: March 23, 2013 Last updated: November 2, 2013 1 Convergence in distribution Theorem 1.1. TFAE: (i) µ n µ, µ n, µ are probability measures. (ii) F n (x)

More information

THEOREMS, ETC., FOR MATH 515

THEOREMS, ETC., FOR MATH 515 THEOREMS, ETC., FOR MATH 515 Proposition 1 (=comment on page 17). If A is an algebra, then any finite union or finite intersection of sets in A is also in A. Proposition 2 (=Proposition 1.1). For every

More information

The Uniform Integrability of Martingales. On a Question by Alexander Cherny

The Uniform Integrability of Martingales. On a Question by Alexander Cherny The Uniform Integrability of Martingales. On a Question by Alexander Cherny Johannes Ruf Department of Mathematics University College London May 1, 2015 Abstract Let X be a progressively measurable, almost

More information

Solutions to the Exercises in Stochastic Analysis

Solutions to the Exercises in Stochastic Analysis Solutions to the Exercises in Stochastic Analysis Lecturer: Xue-Mei Li 1 Problem Sheet 1 In these solution I avoid using conditional expectations. But do try to give alternative proofs once we learnt conditional

More information

Convergence of Local Supermartingales

Convergence of Local Supermartingales Convergence of Local Supermartingales Martin Larsson Johannes Ruf August 20, 2018 Abstract We characterize the event of convergence of a local supermartingale. Conditions are given in terms of its predictable

More information

Modern Discrete Probability Branching processes

Modern Discrete Probability Branching processes Modern Discrete Probability IV - Branching processes Review Sébastien Roch UW Madison Mathematics November 15, 2014 1 Basic definitions 2 3 4 Galton-Watson branching processes I Definition A Galton-Watson

More information

Manual for SOA Exam MLC.

Manual for SOA Exam MLC. Chapter 10. Poisson processes. Section 10.5. Nonhomogenous Poisson processes Extract from: Arcones Fall 2009 Edition, available at http://www.actexmadriver.com/ 1/14 Nonhomogenous Poisson processes Definition

More information

Random Process Lecture 1. Fundamentals of Probability

Random Process Lecture 1. Fundamentals of Probability Random Process Lecture 1. Fundamentals of Probability Husheng Li Min Kao Department of Electrical Engineering and Computer Science University of Tennessee, Knoxville Spring, 2016 1/43 Outline 2/43 1 Syllabus

More information

1 Independent increments

1 Independent increments Tel Aviv University, 2008 Brownian motion 1 1 Independent increments 1a Three convolution semigroups........... 1 1b Independent increments.............. 2 1c Continuous time................... 3 1d Bad

More information

Brownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539

Brownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539 Brownian motion Samy Tindel Purdue University Probability Theory 2 - MA 539 Mostly taken from Brownian Motion and Stochastic Calculus by I. Karatzas and S. Shreve Samy T. Brownian motion Probability Theory

More information

Limiting Distributions

Limiting Distributions Limiting Distributions We introduce the mode of convergence for a sequence of random variables, and discuss the convergence in probability and in distribution. The concept of convergence leads us to the

More information

ECON 2530b: International Finance

ECON 2530b: International Finance ECON 2530b: International Finance Compiled by Vu T. Chau 1 Non-neutrality of Nominal Exchange Rate 1.1 Mussa s Puzzle(1986) RER can be defined as the relative price of a country s consumption basket in

More information

ADVANCED CALCULUS - MTH433 LECTURE 4 - FINITE AND INFINITE SETS

ADVANCED CALCULUS - MTH433 LECTURE 4 - FINITE AND INFINITE SETS ADVANCED CALCULUS - MTH433 LECTURE 4 - FINITE AND INFINITE SETS 1. Cardinal number of a set The cardinal number (or simply cardinal) of a set is a generalization of the concept of the number of elements

More information

Insert your Booktitle, Subtitle, Edition

Insert your Booktitle, Subtitle, Edition C. Landim Insert your Booktitle, Subtitle, Edition SPIN Springer s internal project number, if known Monograph October 23, 2018 Springer Page: 1 job: book macro: svmono.cls date/time: 23-Oct-2018/15:27

More information

16.1. Signal Process Observation Process The Filtering Problem Change of Measure

16.1. Signal Process Observation Process The Filtering Problem Change of Measure 1. Introduction The following notes aim to provide a very informal introduction to Stochastic Calculus, and especially to the Itô integral. They owe a great deal to Dan Crisan s Stochastic Calculus and

More information

ELEMENTS OF PROBABILITY THEORY

ELEMENTS OF PROBABILITY THEORY ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable

More information

Notes 15 : UI Martingales

Notes 15 : UI Martingales Notes 15 : UI Martingales Math 733 - Fall 2013 Lecturer: Sebastien Roch References: [Wil91, Chapter 13, 14], [Dur10, Section 5.5, 5.6, 5.7]. 1 Uniform Integrability We give a characterization of L 1 convergence.

More information

STOCHASTIC MODELS FOR WEB 2.0

STOCHASTIC MODELS FOR WEB 2.0 STOCHASTIC MODELS FOR WEB 2.0 VIJAY G. SUBRAMANIAN c 2011 by Vijay G. Subramanian. All rights reserved. Permission is hereby given to freely print and circulate copies of these notes so long as the notes

More information

Martingale Theory and Applications

Martingale Theory and Applications Martingale Theory and Applications Dr Nic Freeman June 4, 2015 Contents 1 Conditional Expectation 2 1.1 Probability spaces and σ-fields............................ 2 1.2 Random Variables...................................

More information

18.175: Lecture 8 Weak laws and moment-generating/characteristic functions

18.175: Lecture 8 Weak laws and moment-generating/characteristic functions 18.175: Lecture 8 Weak laws and moment-generating/characteristic functions Scott Sheffield MIT 18.175 Lecture 8 1 Outline Moment generating functions Weak law of large numbers: Markov/Chebyshev approach

More information

STOCHASTIC CALCULUS JASON MILLER AND VITTORIA SILVESTRI

STOCHASTIC CALCULUS JASON MILLER AND VITTORIA SILVESTRI STOCHASTIC CALCULUS JASON MILLER AND VITTORIA SILVESTRI Contents Preface 1 1. Introduction 1 2. Preliminaries 4 3. Local martingales 1 4. The stochastic integral 16 5. Stochastic calculus 36 6. Applications

More information

MATH 6605: SUMMARY LECTURE NOTES

MATH 6605: SUMMARY LECTURE NOTES MATH 6605: SUMMARY LECTURE NOTES These notes summarize the lectures on weak convergence of stochastic processes. If you see any typos, please let me know. 1. Construction of Stochastic rocesses A stochastic

More information

Optimal stopping for non-linear expectations Part I

Optimal stopping for non-linear expectations Part I Stochastic Processes and their Applications 121 (2011) 185 211 www.elsevier.com/locate/spa Optimal stopping for non-linear expectations Part I Erhan Bayraktar, Song Yao Department of Mathematics, University

More information

SMSTC (2007/08) Probability.

SMSTC (2007/08) Probability. SMSTC (27/8) Probability www.smstc.ac.uk Contents 12 Markov chains in continuous time 12 1 12.1 Markov property and the Kolmogorov equations.................... 12 2 12.1.1 Finite state space.................................

More information