Advanced Probability
Andrew Tulloch
Advanced Probability
Trinity College, The University of Cambridge
Contents

1 Conditional Expectation
    1.1 Discrete Case
    1.2 Existence and Uniqueness
    1.3 Conditional Jensen's Inequalities
    1.4 Product Measures and Fubini's Theorem
    1.5 Examples of Conditional Expectation
    1.6 Notation for Example Sheet 1
2 Discrete Time Martingales
    2.1 Optional Stopping
    2.2 Hitting Probabilities for a Simple Symmetric Random Walk
    2.3 Martingale Convergence Theorem
    2.4 Uniform Integrability
    2.5 Backwards Martingales
    Applications of Martingales
    2.6 Martingale proof of the Radon-Nikodym theorem
3 Stochastic Processes in Continuous Time
4 Bibliography
1 Conditional Expectation

Let (Ω, F, P) be a probability space: Ω is a set, F is a σ-algebra on Ω, and P is a probability measure on (Ω, F).

Definition 1.1. F is a σ-algebra on Ω if it satisfies:
(i) Ω ∈ F;
(ii) A ∈ F implies A^c ∈ F;
(iii) if (A_n)_{n ≥ 0} is a collection of sets in F, then ∪_n A_n ∈ F.

Definition 1.2. P is a probability measure on (Ω, F) if P : F → [0, 1] is a set function with P(∅) = 0, P(Ω) = 1, and, for every collection (A_n)_{n ≥ 0} of pairwise disjoint sets in F, P(∪_n A_n) = Σ_n P(A_n).

Definition 1.3. The Borel σ-algebra B(R) is the σ-algebra generated by the open sets of R. Calling O the collection of open subsets of R,

    B(R) = ∩ {ξ : ξ is a σ-algebra containing O}.    (1.1)

Definition 1.4. For A a collection of subsets of Ω, we write

    σ(A) = ∩ {ξ : ξ is a σ-algebra containing A}.

Definition 1.5. X is a random variable on (Ω, F) if X : Ω → R is a function with the property that X⁻¹(V) ∈ F for all open sets V ⊆ R.
Exercise 1.6. If X is a random variable then {B ⊆ R : X⁻¹(B) ∈ F} is a σ-algebra, and it contains B(R).

If (X_i, i ∈ I) is a collection of random variables, then we write

    σ(X_i, i ∈ I) = σ({ω ∈ Ω : X_i(ω) ∈ B}, i ∈ I, B ∈ B(R)),

the smallest σ-algebra that makes all the X_i measurable.

Definition 1.7 (Expectation). First we define it for positive simple random variables: for positive constants c_i and sets A_i ∈ F,

    E(Σ_{i=1}^n c_i 1(A_i)) = Σ_{i=1}^n c_i P(A_i).    (1.2)

We extend this to any positive random variable X ≥ 0 by approximating X as the increasing limit of simple random variables. For a general X, we write X = X⁺ − X⁻ with X⁺ = max(X, 0), X⁻ = max(−X, 0). If at least one of E(X⁺), E(X⁻) is finite, then we define E(X) = E(X⁺) − E(X⁻). We call X integrable if E(|X|) < ∞.

Definition 1.8. Let A, B ∈ F with P(B) > 0. Then

    P(A | B) = P(A ∩ B) / P(B),    E(X | B) = E(X 1(B)) / P(B).

Goal: we want to define E(X | G), a random variable measurable with respect to the σ-algebra G.

1.1 Discrete Case

Suppose G is countably generated: (B_i)_{i ∈ N} is a collection of pairwise disjoint sets in F with ∪_i B_i = Ω, and G = σ(B_i, i ∈ N). It is easy to check that G = {∪_{j ∈ J} B_j : J ⊆ N}. Let X be an integrable random variable. Then set

    X′ = E(X | G) = Σ_{i ∈ N} E(X | B_i) 1(B_i),

with the convention that E(X | B_i) = 0 when P(B_i) = 0.
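The blockwise formula above is easy to check numerically. A minimal sketch on a four-point sample space (the probabilities, values, and partition are illustrative choices, not from the notes):

```python
# Discrete conditional expectation E(X | G) = sum_i E(X | B_i) 1(B_i),
# computed with exact rational arithmetic on a finite sample space.
from fractions import Fraction

P = {w: Fraction(1, 4) for w in range(4)}   # uniform probability on Omega = {0,1,2,3}
X = {0: 1, 1: 3, 2: 5, 3: 7}                # a random variable, given pointwise
partition = [{0, 1}, {2, 3}]                # pairwise disjoint blocks B_i generating G

def cond_exp(X, P, partition):
    """E(X | G): constant on each block B_i, with value E(X 1(B_i)) / P(B_i)."""
    Y = {}
    for B in partition:
        pB = sum(P[w] for w in B)                    # P(B_i)
        eXB = sum(X[w] * P[w] for w in B) / pB       # E(X | B_i)
        for w in B:
            Y[w] = eXB
    return Y

Y = cond_exp(X, P, partition)
# Property (iii): E(Y 1(G)) = E(X 1(G)) for G in G; here with G = Omega.
assert sum(Y[w] * P[w] for w in P) == sum(X[w] * P[w] for w in P)
```

On the first block E(X | B) = 2 and on the second E(X | B) = 6, so Y takes exactly two values, one per block, as G-measurability requires.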
Then:
(i) X′ is G-measurable (check);
(ii) E(|X′|) ≤ E(|X|),    (1.3)
and so X′ is integrable;
(iii) for all G ∈ G,

    E(X′ 1(G)) = E(X 1(G))    (1.4)

(check).

1.2 Existence and Uniqueness

Definition 1.9. A ∈ F happens almost surely (a.s.) if P(A) = 1.

Theorem 1.10 (Monotone Convergence Theorem). If X_n ≥ 0 is a sequence of random variables and X_n ↑ X a.s. as n → ∞, then

    E(X_n) ↑ E(X)    (1.5)

as n → ∞.

Theorem 1.11 (Dominated Convergence Theorem). If (X_n) is a sequence of random variables such that |X_n| ≤ Y for Y an integrable random variable, and X_n → X a.s., then E(X_n) → E(X).

Definition 1.12. For p ∈ [1, ∞) and f a measurable function,

    ||f||_p = E(|f|^p)^{1/p},    (1.6)
    ||f||_∞ = inf{λ : |f| ≤ λ a.e.}.    (1.7)

Definition 1.13. L^p = L^p(Ω, F, P) = {f : ||f||_p < ∞}. Formally, L^p is the collection of equivalence classes where two functions are equivalent if they are equal a.e. We will just represent an element of L^p by a function, but remember that equality is a.e.
Theorem 1.14. The space (L², ||·||₂) is a Hilbert space with ⟨U, V⟩ = E(UV). If H is a closed subspace, then for every f ∈ L² there exists a unique g ∈ H such that ||f − g||₂ = inf{||f − h||₂ : h ∈ H} and ⟨f − g, h⟩ = 0 for all h ∈ H. We call g the orthogonal projection of f onto H.

Theorem 1.15. Let (Ω, F, P) be the underlying probability space, let X be an integrable random variable, and let G ⊆ F be a sub-σ-algebra. Then there exists a random variable Y such that

(a) Y is G-measurable;
(b) Y is integrable and, for all A ∈ G,

    E(X 1(A)) = E(Y 1(A)).    (1.8)

Moreover, if Y′ also satisfies the above properties, then Y = Y′ a.s.

Remark 1.16. Y is called a version of the conditional expectation of X given G, and we write Y = E(X | G); when G = σ(Z) we also write Y = E(X | Z).

Remark 1.17. (b) could be replaced by the following condition: for all G-measurable, bounded random variables Z,

    E(XZ) = E(YZ).    (1.9)

Proof. Uniqueness: let Y′ satisfy (a) and (b), and consider the G-measurable set A = {Y − Y′ > 0}. From (b),

    E((Y − Y′) 1(A)) = E(X 1(A)) − E(X 1(A)) = 0,

and hence P(Y − Y′ > 0) = 0, which implies Y ≤ Y′ a.s. Similarly, Y′ ≤ Y a.s.

Existence: complete the following three steps.

(i) Let X ∈ L²(Ω, F, P), a Hilbert space with ⟨U, V⟩ = E(UV). The space L²(Ω, G, P) is a closed subspace: if X_n → X in L², then X_n → X in probability, so some subsequence X_{n_k} → X a.s., and then X = lim sup_k X_{n_k} is G-measurable whenever the X_n are.    (1.10)
We can write L²(Ω, F, P) = L²(Ω, G, P) ⊕ L²(Ω, G, P)^⊥, so X = Y + Z with Y ∈ L²(Ω, G, P) and Z orthogonal to L²(Ω, G, P). Set Y = E(X | G); Y is G-measurable, and for A ∈ G,

    E(X 1(A)) = E(Y 1(A)) + E(Z 1(A)) = E(Y 1(A)),

since E(Z 1(A)) = 0.

(ii) If X ∈ L² and X ≥ 0, then Y ≥ 0 a.s.: consider A = {Y < 0}, so that

    0 ≤ E(X 1(A)) = E(Y 1(A)) ≤ 0,    (1.11)

thus P(A) = 0 and Y ≥ 0 a.s.

Now let X ≥ 0, not necessarily in L². Set X_n = min(X, n), so 0 ≤ X_n ≤ n and X_n ∈ L² for all n. Write Y_n = E(X_n | G); then Y_n ≥ 0 a.s. and (Y_n) is increasing a.s. Set Y = lim sup_n Y_n, so Y is G-measurable. We will show Y = E(X | G) a.s.: for all A ∈ G, we need to check E(X 1(A)) = E(Y 1(A)). We know that E(X_n 1(A)) = E(Y_n 1(A)), and X_n ↑ X, Y_n ↑ Y a.s. Thus, by the monotone convergence theorem, E(X 1(A)) = E(Y 1(A)). If X is integrable, setting A = Ω shows that Y is integrable.

(iii) For a general random variable X, not necessarily in L² or non-negative, we have X = X⁺ − X⁻, and we define E(X | G) = E(X⁺ | G) − E(X⁻ | G). This satisfies (a), (b).

Remark 1.18. If X ≥ 0, we can always define Y = E(X | G) a.s.; the integrability of Y may fail.

Definition 1.19. Sub-σ-algebras G₀, G₁, … of F are called independent if for all distinct indices i₁, …, i_n and all choices G_{i_j} ∈ G_{i_j},

    P(G_{i_1} ∩ ⋯ ∩ G_{i_n}) = Π_{j=1}^n P(G_{i_j}).    (1.12)

Theorem 1.20.
(i) If X ≥ 0 then E(X | G) ≥ 0.
(ii) E(E(X | G)) = E(X) (take A = Ω).
(iii) If X is G-measurable, then E(X | G) = X a.s.
(iv) If X is independent of G, then E(X | G) = E(X) a.s.

Theorem 1.21 (Fatou's lemma). If X_n ≥ 0 for all n, then

    E(lim inf X_n) ≤ lim inf E(X_n).    (1.13)

Theorem 1.22 (Conditional Monotone Convergence). Let X_n ≥ 0, X_n ↑ X a.s. Then

    E(X_n | G) ↑ E(X | G) a.s.    (1.14)

Proof. Set Y_n = E(X_n | G). Then Y_n ≥ 0 and (Y_n) is increasing a.s. Set Y = lim sup Y_n; then Y is G-measurable, and for A ∈ G the monotone convergence theorem applied to both sides of E(X_n 1(A)) = E(Y_n 1(A)) gives E(X 1(A)) = E(Y 1(A)), so Y = E(X | G) a.s.

Theorem 1.23 (Conditional Fatou's Lemma). If X_n ≥ 0, then

    E(lim inf X_n | G) ≤ lim inf E(X_n | G) a.s.    (1.15)

Proof. Let X denote the limit inferior of the X_n. For every natural number k define pointwise the random variable Y_k = inf_{n ≥ k} X_n. Then the sequence Y₁, Y₂, … is increasing and converges pointwise to X. For k ≤ n, we have Y_k ≤ X_n, so that

    E(Y_k | G) ≤ E(X_n | G) a.s.    (1.16)

by the monotonicity of conditional expectation, hence

    E(Y_k | G) ≤ inf_{n ≥ k} E(X_n | G) a.s.,    (1.17)

because the countable union of the exceptional sets of probability zero is again a null set. Using the definition of X, its representation as the pointwise limit of the Y_k, the monotone convergence theorem for conditional expectations, the last inequality, and the definition of the limit inferior, it follows that almost surely
    E(lim inf_n X_n | G) = E(X | G)    (1.18)
        = E(lim_k Y_k | G)    (1.19)
        = lim_k E(Y_k | G)    (1.20)
        ≤ lim_k inf_{n ≥ k} E(X_n | G)    (1.21)
        = lim inf_n E(X_n | G).    (1.22)

Theorem 1.24 (Conditional dominated convergence). TODO

1.3 Conditional Jensen's Inequalities

Theorem. Let φ be convex and let X be an integrable random variable such that φ(X) is integrable or φ is non-negative. Suppose G ⊆ F is a σ-algebra. Then

    E(φ(X) | G) ≥ φ(E(X | G))    (1.23)

almost surely. In particular, if 1 ≤ p < ∞, then

    ||E(X | G)||_p ≤ ||X||_p.    (1.24)

Proof. Every convex function can be written as φ(x) = sup_{i ∈ N} (a_i x + b_i) with a_i, b_i ∈ R. Then for every i,

    E(φ(X) | G) ≥ a_i E(X | G) + b_i,

and taking the supremum over i,

    E(φ(X) | G) ≥ sup_{i ∈ N} (a_i E(X | G) + b_i) = φ(E(X | G)) a.s.

The second part follows from

    ||E(X | G)||_p^p = E(|E(X | G)|^p) ≤ E(E(|X|^p | G)) = E(|X|^p) = ||X||_p^p.    (1.25)

Proposition 1.25 (Tower Property). Let X ∈ L¹ and let H ⊆ G ⊆ F be
sub-σ-algebras. Then

    E(E(X | G) | H) = E(X | H)    (1.26)

almost surely.

Proof. Clearly E(X | H) is H-measurable. Let A ∈ H. Then, since A ∈ H ⊆ G,

    E(E(X | H) 1(A)) = E(X 1(A)) = E(E(X | G) 1(A)).    (1.27)

Proposition 1.26. Let X ∈ L¹ and let G ⊆ F be a sub-σ-algebra. Suppose that Y is bounded and G-measurable. Then

    E(XY | G) = Y E(X | G)    (1.28)

almost surely.

Proof. Clearly Y E(X | G) is G-measurable. Let A ∈ G. Then, since Y 1(A) is G-measurable and bounded,

    E(Y E(X | G) 1(A)) = E(E(X | G) · Y 1(A)) = E(XY 1(A)).    (1.29)

Definition 1.27. A collection A of subsets of Ω is called a π-system if for all A, B ∈ A we have A ∩ B ∈ A.

Proposition 1.28 (Uniqueness of extension). Suppose that ξ is a σ-algebra on E. Let μ₁, μ₂ be two measures on (E, ξ) that agree on a π-system generating ξ, with μ₁(E) = μ₂(E) < ∞. Then μ₁ = μ₂ everywhere on ξ.

Theorem 1.29. Let X ∈ L¹ and let G, H ⊆ F be two sub-σ-algebras. If σ(X, G) is independent of H, then

    E(X | σ(G, H)) = E(X | G)    (1.30)

almost surely.
Proof. Take A ∈ G, B ∈ H. Using the independence of σ(X, G) and H,

    E(E(X | G) 1(A) 1(B)) = P(B) E(E(X | G) 1(A))
        = P(B) E(X 1(A))
        = E(X 1(A) 1(B))
        = E(E(X | σ(G, H)) 1(A ∩ B)).

Assume X ≥ 0; the general case follows by writing X = X⁺ − X⁻. For F ∈ F, set μ(F) = E(E(X | G) 1(F)) and ν(F) = E(E(X | σ(G, H)) 1(F)), two measures on (Ω, F), and set A = {A ∩ B : A ∈ G, B ∈ H}. Then A is a π-system, μ and ν agree on A by the computation above, and

    μ(Ω) = E(E(X | G)) = E(X) = ν(Ω) < ∞,

since X is integrable. Note that A generates σ(G, H). So, by the uniqueness of extension theorem, μ and ν agree everywhere on σ(G, H).

Remark 1.30. If we only had X independent of H and G independent of H, the conclusion could fail. For example, consider independent coin tosses X, Y taking values 0, 1 with probability 1/2 each, and let Z = 1(X = Y); take G = σ(Y) and H = σ(Z).

1.4 Product Measures and Fubini's Theorem

Definition 1.31. A measure space (E, ξ, μ) is called σ-finite if there exist sets (S_n)_n with ∪_n S_n = E and μ(S_n) < ∞ for all n.

Let (E₁, ξ₁, μ₁) and (E₂, ξ₂, μ₂) be two σ-finite measure spaces, and let A = {A₁ × A₂ : A₁ ∈ ξ₁, A₂ ∈ ξ₂}, a π-system of subsets of E = E₁ × E₂. Define ξ = ξ₁ ⊗ ξ₂ = σ(A).

Definition 1.32 (Product measure). Let (E₁, ξ₁, μ₁) and (E₂, ξ₂, μ₂) be two σ-finite measure spaces. Then there exists a unique measure μ on (E, ξ), written μ = μ₁ ⊗ μ₂, satisfying

    μ(A₁ × A₂) = μ₁(A₁) μ₂(A₂)    (1.31)

for all A₁ ∈ ξ₁, A₂ ∈ ξ₂.
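For finite measure spaces, both the defining property of the product measure on rectangles and the equality of the two iterated sums can be verified directly. A small sketch with illustrative weights (the spaces, measures, and integrand are arbitrary choices):

```python
# Product measure and iterated integrals on finite spaces, with exact
# rational arithmetic; both orders of summation give mu(f).
from fractions import Fraction

E1, E2 = [0, 1, 2], [0, 1]
mu1 = {0: Fraction(1, 2), 1: Fraction(1, 3), 2: Fraction(2, 1)}   # a finite measure
mu2 = {0: Fraction(3, 1), 1: Fraction(1, 4)}                      # another one

f = lambda x1, x2: x1 + 2 * x2        # a non-negative measurable function

# mu(f) via the x1-outer iterated integral ...
outer_1 = sum(sum(f(x1, x2) * mu2[x2] for x2 in E2) * mu1[x1] for x1 in E1)
# ... and via the x2-outer iterated integral.
outer_2 = sum(sum(f(x1, x2) * mu1[x1] for x1 in E1) * mu2[x2] for x2 in E2)
assert outer_1 == outer_2

# Defining property on rectangles: mu(A1 x A2) = mu1(A1) mu2(A2).
A1, A2 = {0, 2}, {1}
rect = sum(mu1[x1] * mu2[x2] for x1 in A1 for x2 in A2)
assert rect == sum(mu1[x] for x in A1) * sum(mu2[x] for x in A2)
```

For finite sums the exchange is of course automatic; the content of the theorem is that σ-finiteness lets the same exchange pass to the limit.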
Theorem 1.33 (Fubini's Theorem). Let (E₁, ξ₁, μ₁) and (E₂, ξ₂, μ₂) be σ-finite measure spaces, and let f ≥ 0 be ξ-measurable. Then

    μ(f) = ∫_{E₁} ( ∫_{E₂} f(x₁, x₂) μ₂(dx₂) ) μ₁(dx₁).    (1.32)

If f is integrable, then x₂ ↦ f(x₁, x₂) is μ₂-integrable for μ₁-almost all x₁. Moreover, x₁ ↦ ∫_{E₂} f(x₁, x₂) μ₂(dx₂) is μ₁-integrable, and μ(f) is given by (1.32).

1.5 Examples of Conditional Expectation

Definition 1.34. A random vector (X₁, X₂, …, X_n) ∈ Rⁿ is called a Gaussian random vector if and only if for all a₁, …, a_n ∈ R,

    a₁X₁ + ⋯ + a_nX_n    (1.33)

is a Gaussian random variable. (X_t)_{t ≥ 0} is called a Gaussian process if for all 0 ≤ t₁ ≤ t₂ ≤ ⋯ ≤ t_n, the vector (X_{t₁}, …, X_{t_n}) is a Gaussian random vector.

Example 1.35 (Gaussian case). Let (X, Y) be a Gaussian vector in R². We want to calculate

    E(X | Y) = E(X | σ(Y)) = X′,    (1.34)

where X′ = f(Y) with f a Borel function. Let us try f linear: X′ = aY + b with a, b ∈ R to be determined. Note that E(X′) = E(X) and E((X′ − X)Y) = 0, i.e. Cov(X′ − X, Y) = 0, by the characterizing properties of conditional expectation. Then we have

    a E(Y) + b = E(X),    Cov(X, Y) = a Var(Y).    (1.35)

TODO: continue inference.

1.6 Notation for Example Sheet 1

(i) G ∨ H = σ(G, H).
(ii) Let X, Y be two random variables taking values in R with joint density f_{X,Y}(x, y), and let h : R → R be a Borel function such that
15 advanced probability 15 h(x) is integrable. We want to calculate E(h(X) Y) = E(h(X) σ(y)) (1.36) Let g be bounded and measurable. Then E(h(X)g(Y)) = h(x)g(y) f X,Y (x, y)dxdy (1.37) with 0/0 = 0 = h(x)g(y) f X,Y(x, y) f f Y (y) Y (y)dxdy (1.38) ( = h(x) f ) X,Y(x, y) dx g(y) f f Y (y) Y (y)dy (1.39) Set φ(y) = h(x) f X,Y(x,y) dx if f f Y (y) Y (y) > 0, and 0 otherwise. Then we have almost surely, and E(h(X) Y) = φ(y) (1.40) E(h(X) Y) = h(x)ν(y, dx) (1.41) with ν(y, dx) = f X,Y(x,y) f Y (y) I( f Y (y) > 0) dx = f X Y (x y)dx. ν(y, dx) is called the conditional distribution of X given Y = y and f X Y (x y) is the conditional density of X given Y = y.
2 Discrete Time Martingales

Let (Ω, F, P) be a probability space and (E, ξ) a measurable space. Usually E = R, R^d, or C; for us, E = R. A sequence X = (X_n)_{n ≥ 0} of random variables taking values in E is called a stochastic process. A filtration is an increasing family (F_n)_{n ≥ 0} of sub-σ-algebras of F, so F_n ⊆ F_{n+1}. Intuitively, F_n is the information available to us at time n. To every stochastic process X we associate a filtration called the natural filtration (F_n^X)_{n ≥ 0},

    F_n^X = σ(X_k, k ≤ n).    (2.1)

A stochastic process X is called adapted to (F_n)_{n ≥ 0} if X_n is F_n-measurable for all n. A stochastic process X is called integrable if X_n is integrable for all n.

Definition 2.1. An adapted integrable process (X_n)_{n ≥ 0} taking values in R is called a

(i) martingale if E(X_n | F_m) = X_m for all n ≥ m;
(ii) super-martingale if E(X_n | F_m) ≤ X_m for all n ≥ m;
(iii) sub-martingale if E(X_n | F_m) ≥ X_m for all n ≥ m.

Remark 2.2. A (sub-, super-)martingale with respect to a filtration (F_n) is also a (sub-, super-)martingale with respect to the natural filtration of X (by the tower property).
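The defining property in Definition 2.1 can be verified exhaustively on a finite horizon. A sketch for the fair ±1 random walk (the horizon n and the intermediate time m are illustrative choices):

```python
# Exhaustive check of E(X_n | F_m) = X_m for X_n = xi_1 + ... + xi_n with
# fair ±1 steps, by conditioning on every prefix (xi_1, ..., xi_m): on the
# atom determined by a prefix, all 2^(n-m) continuations are equally likely.
from fractions import Fraction
from itertools import product

n, m = 4, 2
cond = {}                                     # prefix -> E(X_n | this prefix)
for prefix in product([-1, 1], repeat=m):
    X_m = sum(prefix)
    total = Fraction(0)
    for tail in product([-1, 1], repeat=n - m):
        total += X_m + sum(tail)              # value of X_n along this path
    cond[prefix] = total / 2 ** (n - m)

# Martingale property: the conditional expectation equals X_m on every atom.
assert all(cond[p] == sum(p) for p in cond)
```

Because F_m here is generated by the finitely many prefix atoms, checking the conditional expectation atom by atom is exactly the discrete-case formula from Chapter 1.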
Example 2.3. Suppose (ξ_i) are iid random variables with E(ξ_i) = 0. Set X_n = Σ_{i=1}^n ξ_i. Then (X_n) is a martingale.

Example 2.4. As above, but let (ξ_i) be iid with E(ξ_i) = 1. Then X_n = Π_{i=1}^n ξ_i is a martingale.

Definition 2.5. A random variable T : Ω → Z⁺ ∪ {∞} is called a stopping time if {T ≤ n} ∈ F_n for all n; equivalently, {T = n} ∈ F_n for all n.

Example 2.6.
(i) Constant times are trivial stopping times.
(ii) For A ∈ B(R), define T_A = inf{n ≥ 0 : X_n ∈ A}, with inf ∅ = ∞. Then T_A is a stopping time.

Proposition 2.7. Let S, T, (T_n) be stopping times on the filtered probability space (Ω, F, (F_n), P). Then S ∧ T, S ∨ T, inf_n T_n, lim inf_n T_n, lim sup_n T_n are stopping times.

Notation. For T a stopping time, X_T(ω) = X_{T(ω)}(ω). The stopped process X^T is defined by X_t^T = X_{T ∧ t}, and

    F_T = {A ∈ F : A ∩ {T ≤ t} ∈ F_t for all t}.

Proposition 2.8. Let X = (X_n)_{n ≥ 0} be adapted on (Ω, F, (F_n), P).
(i) If S ≤ T are stopping times, then F_S ⊆ F_T.
(ii) X_T 1(T < ∞) is F_T-measurable.
(iii) If T is a stopping time, then X^T is adapted.
(iv) If X is integrable, then X^T is integrable.

Proof (of (ii)). Let A ∈ ξ. We need to show {X_T 1(T < ∞) ∈ A} ∈ F_T:

    {X_T 1(T < ∞) ∈ A} ∩ {T ≤ t} = ∪_{s ≤ t} ({T = s} ∩ {X_s ∈ A}) ∈ F_t,    (2.2)

since {T = s} ∈ F_s ⊆ F_t and {X_s ∈ A} ∈ F_s ⊆ F_t.

2.1 Optional Stopping

Theorem 2.9. Let X be a martingale.
(i) If T is a stopping time, then X^T is also a martingale. In particular, E(X_{T ∧ t}) = E(X_0) for all t.

Proof. By the tower property, it is sufficient to check E(X_{T ∧ t} | F_{t−1}) = X_{T ∧ (t−1)}:

    E(X_{T ∧ t} | F_{t−1}) = E( Σ_{s=0}^{t−1} X_s 1(T = s) | F_{t−1} ) + E(X_t 1(T > t − 1) | F_{t−1})
        = Σ_{s=0}^{t−1} X_s 1(T = s) + 1(T > t − 1) X_{t−1}
        = X_{T ∧ (t−1)},

since X_s 1(T = s) is F_{t−1}-measurable for s ≤ t − 1, {T > t − 1} ∈ F_{t−1}, and E(X_t | F_{t−1}) = X_{t−1}. Since X^T is a martingale, E(X_{T ∧ t}) = E(X_0).

Theorem 2.10. Let X be a martingale.
(i) If T is a stopping time, then X^T is also a martingale, so in particular

    E(X_{T ∧ t}) = E(X_0).    (2.3)

(ii) If S ≤ T are bounded stopping times, then E(X_T | F_S) = X_S almost surely.
(iii) If P(T < ∞) = 1 and X is bounded, then E(X_T) = E(X_0).
(iv) If E(T) < ∞ and X has bounded increments (|X_{n+1} − X_n| ≤ M for all n, for some constant M), then E(X_T) = E(X_0).

Proof (of (ii)). Let S ≤ T ≤ n. Then

    X_T = (X_T − X_{T−1}) + (X_{T−1} − X_{T−2}) + ⋯ + (X_{S+1} − X_S) + X_S
        = X_S + Σ_{k=0}^{n−1} (X_{k+1} − X_k) 1(S ≤ k < T).

Let A ∈ F_S. Then

    E(X_T 1(A)) = E(X_S 1(A)) + Σ_{k=0}^{n−1} E((X_{k+1} − X_k) 1(S ≤ k < T) 1(A))    (2.4)
        = E(X_S 1(A)),    (2.5)

since A ∩ {S ≤ k} ∩ {T > k} ∈ F_k and E(X_{k+1} − X_k | F_k) = 0.

Remark 2.11. The optional stopping theorem also holds for super/sub-martingales, with the respective inequalities in the statement.
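Part (i) can be checked exactly by enumeration over a short horizon. A sketch with the fair ±1 walk and T the first hitting time of 1 (the horizon t is an arbitrary choice):

```python
# E(X_{T ∧ t}) = E(X_0) = 0 for the stopped fair walk, verified by
# enumerating all 2^t equally likely sign paths; T = inf{n : X_n = 1}.
from fractions import Fraction
from itertools import product

t = 6

def stopped_value(steps):
    """Value of X_{T ∧ t} along one path of signs."""
    x = 0
    for s in steps:
        x += s
        if x == 1:            # T has occurred: the walk is frozen at 1
            return 1
    return x                  # T > t: value of the walk at time t

vals = [stopped_value(steps) for steps in product([-1, 1], repeat=t)]
mean = Fraction(sum(vals), len(vals))
assert mean == 0              # equals E(X_0), exactly
```

Note that the stopped value never exceeds 1 (the walk cannot pass level 1 without being frozen there), yet the average over all paths is still exactly 0: the rare deep negative excursions balance the mass at 1.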
Example 2.12. Suppose that (ξ_i)_i are iid random variables with

    P(ξ_i = 1) = P(ξ_i = −1) = 1/2.    (2.6)

Set X_0 = 0, X_n = Σ_{i=1}^n ξ_i; this is the simple symmetric random walk on Z. Let T = inf{n ≥ 0 : X_n = 1}. Then P(T < ∞) = 1, but T is not bounded, and E(X_T) = 1 ≠ 0 = E(X_0); so the conclusion of optional stopping can fail for unbounded stopping times.

Proposition 2.13. If X is a positive supermartingale and T is a stopping time which is finite almost surely (P(T < ∞) = 1), then

    E(X_T) ≤ E(X_0).    (2.7)

Proof. By Fatou's lemma and the supermartingale property,

    E(X_T) = E(lim inf_t X_{T ∧ t}) ≤ lim inf_t E(X_{T ∧ t}) ≤ E(X_0).    (2.8)

2.2 Hitting Probabilities for a Simple Symmetric Random Walk

Let (ξ_i) be iid, ±1 equally likely. Let X_0 = 0, X_n = Σ_{i=1}^n ξ_i. For all x ∈ Z let

    T_x = inf{n ≥ 0 : X_n = x},    (2.9)

which is a stopping time. We want to find the hitting probabilities P(T_{−a} < T_b) for a, b > 0. Let T = T_{−a} ∧ T_b. If E(T) < ∞, then by (iv) in Theorem 2.10, E(X_T) = E(X_0) = 0, so

    E(X_T) = −a P(T_{−a} < T_b) + b P(T_b < T_{−a}) = 0,    (2.10)

and thus we obtain

    P(T_{−a} < T_b) = b / (a + b).    (2.11)

It remains to check E(T) < ∞. We have P(ξ_1 = 1, …, ξ_{a+b} = 1) = 2^{−(a+b)} > 0, so each block of a + b consecutive steps takes the walk up to b (and hence past time T) with probability at least 2^{−(a+b)}; consequently T is dominated by (a + b) times a geometric random variable, and E(T) < ∞.
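The identity (2.11) is easy to test by simulation; the sample count and seed below are arbitrary choices, and each excursion terminates almost surely since E(T) < ∞:

```python
# Monte Carlo check of P(T_{-a} < T_b) = b / (a + b) for the fair ±1 walk.
import random

random.seed(1)
a, b = 1, 2
n_runs = 20_000
hits_minus_a = 0
for _ in range(n_runs):
    x = 0
    while -a < x < b:                 # run until the walk exits (-a, b)
        x += random.choice((-1, 1))
    if x == -a:
        hits_minus_a += 1

p_hat = hits_minus_a / n_runs
assert abs(p_hat - b / (a + b)) < 0.02    # theory predicts 2/3 here
```

The nearer barrier is hit more often, exactly in proportion to the distance to the opposite barrier, which is what the martingale identity E(X_T) = 0 encodes.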
2.3 Martingale Convergence Theorem

Theorem 2.14. Let X = (X_n)_{n ≥ 0} be a (super-)martingale bounded in L¹, that is, sup_{n ≥ 0} E(|X_n|) < ∞. Then X_n converges as n → ∞ almost surely towards an a.s. finite limit X_∞ ∈ L¹(F_∞), with F_∞ = σ(F_n, n ≥ 0).

To prove it we will use Doob's trick, which counts up-crossings of intervals with rational endpoints.

Corollary 2.15. Let X be a positive supermartingale. Then it converges to an almost surely finite limit as n → ∞.

Proof. E(|X_n|) = E(X_n) ≤ E(X_0) < ∞.    (2.12)

Proof (of Theorem 2.14, setup). Let x = (x_n)_n be a sequence of real numbers, and let a < b be two real numbers. Let T_0(x) = 0 and inductively, for k ≥ 0,

    S_{k+1}(x) = inf{n ≥ T_k(x) : x_n ≤ a},    T_{k+1}(x) = inf{n ≥ S_{k+1}(x) : x_n ≥ b},    (2.13)

with the usual convention that inf ∅ = ∞. Define N_n([a, b], x) = sup{k ≥ 0 : T_k(x) ≤ n}, the number of up-crossings of the interval [a, b] by the sequence x by time n. As n → ∞, we have

    N_n([a, b], x) ↑ N([a, b], x) = sup{k ≥ 0 : T_k(x) < ∞},    (2.14)

the total number of up-crossings of the interval [a, b].

Lemma 2.16. A sequence of reals x = (x_n)_n converges in R̄ = R ∪ {±∞} if and only if N([a, b], x) < ∞ for all rationals a < b.

Proof. Assume x converges. If for some a < b we had that N([a, b], x) = ∞, then lim inf_n x_n ≤ a < b ≤ lim sup_n x_n, which is a contradiction. Conversely, suppose that x does not converge. Then lim inf_n x_n < lim sup_n x_n, and taking rationals a < b between these two numbers gives N([a, b], x) = ∞, as required.
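The up-crossing count from (2.13)-(2.14) can be computed with a direct scan: alternately look for a visit at or below a, then for the next visit at or above b. A sketch (the test sequences are illustrative):

```python
# N_n([a, b], x): count completed up-crossings of [a, b] by a finite sequence.
def upcrossings(x, a, b):
    """Alternately seek x_i <= a (an S_k) and then x_i >= b (a T_k);
    each completed (S_k, T_k) pair is one up-crossing."""
    count, i, n = 0, 0, len(x)
    while True:
        while i < n and x[i] > a:     # search for S_{k+1}: first x_i <= a
            i += 1
        while i < n and x[i] < b:     # search for T_{k+1}: first x_i >= b
            i += 1
        if i >= n:
            return count              # no further T_k within the horizon
        count += 1                    # T_{k+1} found at index i

assert upcrossings([0, 2, 0, 2, 0, 2], 0.5, 1.5) == 3   # oscillating sequence
assert upcrossings([0, 1, 2, 3], 0.5, 1.5) == 1         # monotone: one crossing
```

In the spirit of Lemma 2.16, a sequence oscillating forever between 0 and 2 would have N([0.5, 1.5], x) = ∞, certifying divergence, while any convergent sequence eventually stops crossing every fixed interval.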
Theorem 2.17 (Doob's up-crossing inequality). Let X be a supermartingale and let a < b be two real numbers. Then for all n ≥ 0,

    (b − a) E(N_n([a, b], X)) ≤ E((X_n − a)⁻).    (2.15)

Proof (sketch). For all k, on {T_k < ∞},

    X_{T_k} − X_{S_k} ≥ b − a.    (2.16)

2.4 Uniform Integrability

Theorem 2.18. Suppose X ∈ L¹. Then the collection of random variables

    {E(X | G) : G ⊆ F a sub-σ-algebra}    (2.17)

is uniformly integrable.

Proof. Since X ∈ L¹, for all ε > 0 there exists δ > 0 such that if A ∈ F and P(A) ≤ δ, then E(|X| 1(A)) ≤ ε. Set Y = E(X | G). Then E(|Y|) ≤ E(|X|). Choose λ < ∞ such that E(|X|) ≤ λδ. Then

    P(|Y| ≥ λ) ≤ E(|Y|) / λ ≤ δ    (2.18)

by Markov's inequality. Then, since {|Y| ≥ λ} ∈ G,

    E(|Y| 1(|Y| ≥ λ)) ≤ E(E(|X| | G) 1(|Y| ≥ λ))    (2.19)
        = E(|X| 1(|Y| ≥ λ))    (2.20)
        ≤ ε.    (2.21)

Definition 2.19. A process X = (X_n)_{n ≥ 0} is called a uniformly integrable (UI) martingale if it is a martingale and the collection (X_n) is uniformly integrable.

Theorem 2.20. Let X be a martingale. Then the following are equivalent.
(i) X is a uniformly integrable martingale.
(ii) X converges almost surely and in L¹ to a limit X_∞ as n → ∞.
(iii) There exists a random variable Z ∈ L¹ such that X_n = E(Z | F_n) almost surely for all n ≥ 0.

Theorem 2.21 (Chapter 13 of Williams). Let X_n, X ∈ L¹ for all n ≥ 0 and suppose that X_n → X a.s. as n → ∞. Then X_n converges to X in L¹ if and only if (X_n) is uniformly integrable.

Proof (of Theorem 2.20). We proceed as follows.

(i) ⇒ (ii): Since X is uniformly integrable, it is bounded in L¹, and by the martingale convergence theorem X_n converges almost surely to a finite limit X_∞. Theorem 2.21 then gives L¹ convergence.

(ii) ⇒ (iii): Set Z = X_∞. We need to show that X_n = E(X_∞ | F_n) almost surely for all n ≥ 0. For all m ≥ n, by the martingale property we have

    ||X_n − E(X_∞ | F_n)||₁ = ||E(X_m − X_∞ | F_n)||₁ ≤ ||X_m − X_∞||₁ → 0    (2.22)

as m → ∞.

(iii) ⇒ (i): (E(Z | F_n))_n is a martingale by the tower property of conditional expectation. Uniform integrability follows from Theorem 2.18.

Remark 2.22. If X is UI then X_∞ = E(Z | F_∞) a.s., where F_∞ = σ(F_n, n ≥ 0).

Remark 2.23. If X is a UI super/sub-martingale, then it converges almost surely and in L¹ to a finite limit X_∞ with E(X_∞ | F_n) ≤ (≥) X_n almost surely.

Example 2.24. Let X₁, X₂, … be iid random variables with P(X = 0) = P(X = 2) = 1/2. Set Y_n = X₁ ⋯ X_n. Then (Y_n) is a martingale. As E(Y_n) = 1 for all n, (Y_n) is bounded in L¹, and it converges almost surely to 0. But E(Y_n) = 1 for all n, and hence it does not converge in L¹; in particular (Y_n) is not UI.
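Example 2.24 can be made concrete with exact arithmetic: Y_n equals 2ⁿ with probability 2⁻ⁿ and 0 otherwise, so E(Y_n) = 1 for every n while the mass at 0 tends to 1. A sketch:

```python
# Exact law of Y_n = X_1 ... X_n with X_i in {0, 2} equally likely:
# Y_n > 0 iff all n factors equal 2, so P(Y_n = 2^n) = 2^{-n}.
from fractions import Fraction

def law_Yn(n):
    p_pos = Fraction(1, 2) ** n               # P(Y_n = 2^n)
    return {0: 1 - p_pos, 2 ** n: p_pos}

for n in (1, 5, 20):
    law = law_Yn(n)
    mean = sum(v * p for v, p in law.items())
    assert mean == 1                          # E(Y_n) = 1: bounded in L^1
    assert law[0] == 1 - Fraction(1, 2) ** n  # P(Y_n = 0) -> 1: a.s. limit is 0
```

Since Y_n → 0 a.s. but E|Y_n − 0| = 1 for all n, there is no L¹ convergence, which is exactly the failure of uniform integrability the example illustrates.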
If X is a UI martingale and T is a stopping time, then we can unambiguously define

    X_T = Σ_{n=0}^∞ X_n 1(T = n) + X_∞ 1(T = ∞).    (2.23)

Theorem 2.25 (Optional stopping for UI martingales). Let X be a UI martingale and let S, T be stopping times with S ≤ T. Then

    E(X_T | F_S) = X_S    (2.24)

almost surely.

Proof. We first show that E(X_∞ | F_T) = X_T almost surely for any stopping time T. First, check that X_T ∈ L¹: since |X_n| ≤ E(|X_∞| | F_n), we have

    E(|X_T|) = Σ_{n ∈ Z⁺ ∪ {∞}} E(|X_n| 1(T = n))    (2.25)
        ≤ Σ_{n ∈ Z⁺ ∪ {∞}} E(|X_∞| 1(T = n))    (2.26)
        = E(|X_∞|) < ∞.    (2.27)

Let B ∈ F_T. Then

    E(1(B) X_T) = Σ_{n ∈ Z⁺ ∪ {∞}} E(1(B) 1(T = n) X_n)    (2.28)
        = Σ_{n ∈ Z⁺ ∪ {∞}} E(1(B) 1(T = n) X_∞)    (2.29)
        = E(1(B) X_∞),    (2.30)

where for the second equality we used that E(X_∞ | F_n) = X_n almost surely and B ∩ {T = n} ∈ F_n. Clearly X_T is F_T-measurable, and hence E(X_∞ | F_T) = X_T almost surely. Using the tower property of conditional expectation, we have
for stopping times S ≤ T (as F_S ⊆ F_T),

    E(X_T | F_S) = E(E(X_∞ | F_T) | F_S)    (2.31)
        = E(X_∞ | F_S)    (2.32)
        = X_S    (2.33)

almost surely.

2.5 Backwards Martingales

Let ⋯ ⊆ G₂ ⊆ G₁ ⊆ G₀ be a decreasing sequence of sub-σ-algebras of F.

Applications of Martingales

Theorem 2.26 (Kolmogorov's 0-1 law). Let (X_i)_{i ≥ 1} be a sequence of iid random variables. Let F_n = σ(X_k, k ≥ n) and F_∞ = ∩_{n ≥ 0} F_n. Then F_∞ is trivial; that is, every A ∈ F_∞ has probability P(A) ∈ {0, 1}.

Proof. Let G_n = σ(X_k, k ≤ n) and A ∈ F_∞. Since G_n is independent of F_{n+1}, we have that

    E(1(A) | G_n) = P(A).    (2.34)

The convergence theorem for UI martingales (Theorem 2.20) gives that P(A) = E(1(A) | G_n) converges to E(1(A) | G_∞) as n → ∞, where G_∞ = σ(G_n, n ≥ 0). Since F_∞ ⊆ G_∞, we deduce that E(1(A) | G_∞) = 1(A). Therefore P(A) = 1(A) a.s., and so P(A) ∈ {0, 1}.

Theorem 2.27 (Strong law of large numbers). Let (X_i)_{i ≥ 1} be a sequence of iid random variables in L¹ with μ = E(X_i). Let S_n = Σ_{i=1}^n X_i and S_0 = 0. Then S_n / n → μ as n → ∞, almost surely and in L¹.

Proof. [Fill in proof from lecture notes.]

Theorem 2.28 (Kakutani's product martingale theorem). Let (X_n)_{n ≥ 0} be a sequence of independent non-negative random variables of mean 1. Let M_0 = 1, M_n = Π_{i=1}^n X_i for n ∈ N. Then (M_n)_{n ≥ 0} is a non-negative martingale and M_n → M_∞ a.s. as n → ∞ for some random variable M_∞. Set a_n = E(√X_n); then a_n ∈ (0, 1] by Jensen's inequality. Moreover:

(i) if Π_n a_n > 0, then M_n → M_∞ in L¹ and E(M_∞) = 1;
(ii) if Π_n a_n = 0, then M_∞ = 0 almost surely.
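The dichotomy in a_n = E(√X_n) can be seen numerically for X_n uniform on {1 − c_n, 1 + c_n} (mean 1, non-negative for c_n ≤ 1): a constant c_n drives Π a_n to 0, while rapidly decaying c_n keeps the product positive. The parameter choices below are illustrative.

```python
# Kakutani dichotomy sketch: a_n = E(sqrt(X_n)) for X_n in {1-c, 1+c} w.p. 1/2.
import math

def a(c):
    # a_n = (sqrt(1 - c) + sqrt(1 + c)) / 2, which is <= 1 by concavity of sqrt
    return (math.sqrt(1 - c) + math.sqrt(1 + c)) / 2

N = 2000
prod_const = 1.0        # c_n = 1/2 for all n: infinitely many factors < 1
prod_decay = 1.0        # c_n = 2^{-n}: factors approach 1 fast
for n in range(1, N + 1):
    prod_const *= a(0.5)
    prod_decay *= a(2.0 ** -n)

assert prod_const < 1e-12   # product -> 0: case (ii), M_infty = 0 a.s.
assert prod_decay > 0.9     # product stays positive: case (i), L^1 convergence
```

Each factor a(1/2) ≈ 0.966 is bounded away from 1, so the first product decays geometrically, whereas a(2⁻ⁿ) ≈ 1 − 4⁻ⁿ/8 gives a convergent infinite product.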
Proof. [Fill in; this is somewhat involved.]

2.6 Martingale proof of the Radon-Nikodym theorem

Let P, Q be two probability measures on the measurable space (Ω, F). Assume that F is countably generated; that is, there exists a collection of sets (F_n)_{n ∈ N} such that F = σ(F_n, n ∈ N). Then the following are equivalent.

(i) P(A) = 0 implies Q(A) = 0 for all A ∈ F; that is, Q is absolutely continuous with respect to P, written Q ≪ P.
(ii) For all ε > 0, there exists δ > 0 such that P(A) ≤ δ implies Q(A) ≤ ε.
(iii) There exists a non-negative random variable X such that

    Q(A) = E_P(X 1(A))    (2.35)

for all A ∈ F.

Proof. (i) ⇒ (ii): If (ii) does not hold, then there exists ε > 0 such that for all n ≥ 1 there exists a set A_n with P(A_n) ≤ 2⁻ⁿ and Q(A_n) ≥ ε. By Borel-Cantelli, we get that P(A_n i.o.) = 0. Therefore, from (i), we get that Q(A_n i.o.) = 0. But

    Q(A_n i.o.) = Q(∩_n ∪_{k ≥ n} A_k) = lim_n Q(∪_{k ≥ n} A_k) ≥ ε,    (2.36)

which is a contradiction.

(ii) ⇒ (iii): Consider the filtration F_n = σ(F_k, k ≤ n). Let

    A_n = {H₁ ∩ ⋯ ∩ H_n : H_i = F_i or F_i^c};    (2.37)

then it is easy to see that F_n = σ(A_n). Note also that distinct sets in A_n are disjoint. [Continue proof.]
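The object being built in (ii) ⇒ (iii) can be sketched on a finite space: on the atoms, the candidate density is X(ω) = Q({ω})/P({ω}), and then Q(A) = E_P(X 1(A)) holds for every event A. The measures below are illustrative choices with Q ≪ P.

```python
# Radon-Nikodym on a finite space: X = dQ/dP defined atomwise satisfies
# Q(A) = E_P(X 1(A)) for every event A, checked exhaustively.
from fractions import Fraction
from itertools import chain, combinations

Omega = [0, 1, 2, 3]
P = {0: Fraction(1, 4), 1: Fraction(1, 4), 2: Fraction(1, 4), 3: Fraction(1, 4)}
Q = {0: Fraction(1, 2), 1: Fraction(1, 6), 2: Fraction(1, 6), 3: Fraction(1, 6)}
# Q << P here, since P charges every point.

X = {w: Q[w] / P[w] for w in Omega}          # the candidate density dQ/dP

def all_events(Omega):
    """Every subset of Omega, i.e. the full sigma-algebra."""
    return chain.from_iterable(combinations(Omega, r) for r in range(len(Omega) + 1))

for A in all_events(Omega):
    Q_A = sum(Q[w] for w in A)
    E_P = sum(X[w] * P[w] for w in A)        # E_P(X 1(A))
    assert Q_A == E_P
```

In the martingale proof, the same ratio computed on the atoms of F_n = σ(A_n) plays the role of X_n, and the theorem is obtained by letting the atoms refine.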
3 Stochastic Processes in Continuous Time

Our setting is a probability space (Ω, F, P), with time parameter t ∈ J ⊆ R⁺ = [0, ∞).

Definition 3.1. A filtration on (Ω, F, P) is an increasing collection of σ-algebras (F_t)_{t ∈ J} satisfying F_s ⊆ F_t for s ≤ t. A stochastic process in continuous time is an ordered collection of random variables (X_t)_{t ∈ J} on Ω.
4 Bibliography
More informationProbability and Measure
Probability and Measure Robert L. Wolpert Institute of Statistics and Decision Sciences Duke University, Durham, NC, USA Convergence of Random Variables 1. Convergence Concepts 1.1. Convergence of Real
More informationStochastic integration. P.J.C. Spreij
Stochastic integration P.J.C. Spreij this version: April 22, 29 Contents 1 Stochastic processes 1 1.1 General theory............................... 1 1.2 Stopping times...............................
More informationConditional expectation
Chapter II Conditional expectation II.1 Introduction Let X be a square integrable real-valued random variable. The constant c which minimizes E[(X c) 2 ] is the expectation of X. Indeed, we have, with
More informationECON 2530b: International Finance
ECON 2530b: International Finance Compiled by Vu T. Chau 1 Non-neutrality of Nominal Exchange Rate 1.1 Mussa s Puzzle(1986) RER can be defined as the relative price of a country s consumption basket in
More informationMATH MEASURE THEORY AND FOURIER ANALYSIS. Contents
MATH 3969 - MEASURE THEORY AND FOURIER ANALYSIS ANDREW TULLOCH Contents 1. Measure Theory 2 1.1. Properties of Measures 3 1.2. Constructing σ-algebras and measures 3 1.3. Properties of the Lebesgue measure
More informationNotes 15 : UI Martingales
Notes 15 : UI Martingales Math 733 - Fall 2013 Lecturer: Sebastien Roch References: [Wil91, Chapter 13, 14], [Dur10, Section 5.5, 5.6, 5.7]. 1 Uniform Integrability We give a characterization of L 1 convergence.
More informationStochastic Processes. Winter Term Paolo Di Tella Technische Universität Dresden Institut für Stochastik
Stochastic Processes Winter Term 2016-2017 Paolo Di Tella Technische Universität Dresden Institut für Stochastik Contents 1 Preliminaries 5 1.1 Uniform integrability.............................. 5 1.2
More information3 (Due ). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure?
MA 645-4A (Real Analysis), Dr. Chernov Homework assignment 1 (Due ). Show that the open disk x 2 + y 2 < 1 is a countable union of planar elementary sets. Show that the closed disk x 2 + y 2 1 is a countable
More informationJUSTIN HARTMANN. F n Σ.
BROWNIAN MOTION JUSTIN HARTMANN Abstract. This paper begins to explore a rigorous introduction to probability theory using ideas from algebra, measure theory, and other areas. We start with a basic explanation
More informationFundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales
Fundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales Prakash Balachandran Department of Mathematics Duke University April 2, 2008 1 Review of Discrete-Time
More informationFE 5204 Stochastic Differential Equations
Instructor: Jim Zhu e-mail:zhu@wmich.edu http://homepages.wmich.edu/ zhu/ January 20, 2009 Preliminaries for dealing with continuous random processes. Brownian motions. Our main reference for this lecture
More information1. Probability Measure and Integration Theory in a Nutshell
1. Probability Measure and Integration Theory in a Nutshell 1.1. Measurable Space and Measurable Functions Definition 1.1. A measurable space is a tuple (Ω, F) where Ω is a set and F a σ-algebra on Ω,
More informationErgodic Properties of Markov Processes
Ergodic Properties of Markov Processes March 9, 2006 Martin Hairer Lecture given at The University of Warwick in Spring 2006 1 Introduction Markov processes describe the time-evolution of random systems
More informationLecture 22 Girsanov s Theorem
Lecture 22: Girsanov s Theorem of 8 Course: Theory of Probability II Term: Spring 25 Instructor: Gordan Zitkovic Lecture 22 Girsanov s Theorem An example Consider a finite Gaussian random walk X n = n
More informationA PECULIAR COIN-TOSSING MODEL
A PECULIAR COIN-TOSSING MODEL EDWARD J. GREEN 1. Coin tossing according to de Finetti A coin is drawn at random from a finite set of coins. Each coin generates an i.i.d. sequence of outcomes (heads or
More informationUseful Probability Theorems
Useful Probability Theorems Shiu-Tang Li Finished: March 23, 2013 Last updated: November 2, 2013 1 Convergence in distribution Theorem 1.1. TFAE: (i) µ n µ, µ n, µ are probability measures. (ii) F n (x)
More informationMartingale Theory for Finance
Martingale Theory for Finance Tusheng Zhang October 27, 2015 1 Introduction 2 Probability spaces and σ-fields 3 Integration with respect to a probability measure. 4 Conditional expectation. 5 Martingales.
More information1 Probability space and random variables
1 Probability space and random variables As graduate level, we inevitably need to study probability based on measure theory. It obscures some intuitions in probability, but it also supplements our intuition,
More informationABSTRACT EXPECTATION
ABSTRACT EXPECTATION Abstract. In undergraduate courses, expectation is sometimes defined twice, once for discrete random variables and again for continuous random variables. Here, we will give a definition
More informationIntegration on Measure Spaces
Chapter 3 Integration on Measure Spaces In this chapter we introduce the general notion of a measure on a space X, define the class of measurable functions, and define the integral, first on a class of
More informationInference for Stochastic Processes
Inference for Stochastic Processes Robert L. Wolpert Revised: June 19, 005 Introduction A stochastic process is a family {X t } of real-valued random variables, all defined on the same probability space
More informationLecture Notes for MA 623 Stochastic Processes. Ionut Florescu. Stevens Institute of Technology address:
Lecture Notes for MA 623 Stochastic Processes Ionut Florescu Stevens Institute of Technology E-mail address: ifloresc@stevens.edu 2000 Mathematics Subject Classification. 60Gxx Stochastic Processes Abstract.
More informationLecture 5. If we interpret the index n 0 as time, then a Markov chain simply requires that the future depends only on the present and not on the past.
1 Markov chain: definition Lecture 5 Definition 1.1 Markov chain] A sequence of random variables (X n ) n 0 taking values in a measurable state space (S, S) is called a (discrete time) Markov chain, if
More informationExercises in stochastic analysis
Exercises in stochastic analysis Franco Flandoli, Mario Maurelli, Dario Trevisan The exercises with a P are those which have been done totally or partially) in the previous lectures; the exercises with
More information4 Expectation & the Lebesgue Theorems
STA 205: Probability & Measure Theory Robert L. Wolpert 4 Expectation & the Lebesgue Theorems Let X and {X n : n N} be random variables on a probability space (Ω,F,P). If X n (ω) X(ω) for each ω Ω, does
More informationStochastics Process Note. Xing Wang
Stochastics Process Note Xing Wang April 2014 Contents 0.1 σ-algebra............................................ 3 0.1.1 Monotone Class Theorem............................... 3 0.2 measure and expectation....................................
More information(A n + B n + 1) A n + B n
344 Problem Hints and Solutions Solution for Problem 2.10. To calculate E(M n+1 F n ), first note that M n+1 is equal to (A n +1)/(A n +B n +1) with probability M n = A n /(A n +B n ) and M n+1 equals
More informationLecture 22: Variance and Covariance
EE5110 : Probability Foundations for Electrical Engineers July-November 2015 Lecture 22: Variance and Covariance Lecturer: Dr. Krishna Jagannathan Scribes: R.Ravi Kiran In this lecture we will introduce
More informationHILBERT SPACES AND THE RADON-NIKODYM THEOREM. where the bar in the first equation denotes complex conjugation. In either case, for any x V define
HILBERT SPACES AND THE RADON-NIKODYM THEOREM STEVEN P. LALLEY 1. DEFINITIONS Definition 1. A real inner product space is a real vector space V together with a symmetric, bilinear, positive-definite mapping,
More informationWahrscheinlichkeitstheorie Prof. Schweizer Additions to the Script
Wahrscheinlichkeitstheorie Prof. Schweizer Additions to the Script Thomas Rast WS 06/07 Warning: We are sure there are lots of mistakes in these notes. Use at your own risk! Corrections and other feedback
More informationWiener Measure and Brownian Motion
Chapter 16 Wiener Measure and Brownian Motion Diffusion of particles is a product of their apparently random motion. The density u(t, x) of diffusing particles satisfies the diffusion equation (16.1) u
More informationLecture 9. d N(0, 1). Now we fix n and think of a SRW on [0,1]. We take the k th step at time k n. and our increments are ± 1
Random Walks and Brownian Motion Tel Aviv University Spring 011 Lecture date: May 0, 011 Lecture 9 Instructor: Ron Peled Scribe: Jonathan Hermon In today s lecture we present the Brownian motion (BM).
More informationLecture 4: Conditional expectation and independence
Lecture 4: Conditional expectation and independence In elementry probability, conditional probability P(B A) is defined as P(B A) P(A B)/P(A) for events A and B with P(A) > 0. For two random variables,
More informationBrownian Motion and Stochastic Calculus
ETHZ, Spring 17 D-MATH Prof Dr Martin Larsson Coordinator A Sepúlveda Brownian Motion and Stochastic Calculus Exercise sheet 6 Please hand in your solutions during exercise class or in your assistant s
More information7 Convergence in R d and in Metric Spaces
STA 711: Probability & Measure Theory Robert L. Wolpert 7 Convergence in R d and in Metric Spaces A sequence of elements a n of R d converges to a limit a if and only if, for each ǫ > 0, the sequence a
More informationStochastic Processes in Discrete Time
Stochastic Processes in Discrete Time May 5, 2005 Contents Basic Concepts for Stochastic Processes 3. Conditional Expectation....................................... 3.2 Stochastic Processes, Filtrations
More informationBrownian Motion. 1 Definition Brownian Motion Wiener measure... 3
Brownian Motion Contents 1 Definition 2 1.1 Brownian Motion................................. 2 1.2 Wiener measure.................................. 3 2 Construction 4 2.1 Gaussian process.................................
More informationChapter II Independence, Conditional Expectation
Chapter II Independence, Conditional Expectation The notions of statistical independence, conditional expectation, and conditional probability are the cornerstones of probability theory. Since probabilities
More informationCHAPTER 1. Martingales
CHAPTER 1 Martingales The basic limit theorems of probability, such as the elementary laws of large numbers and central limit theorems, establish that certain averages of independent variables converge
More informationBrownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539
Brownian motion Samy Tindel Purdue University Probability Theory 2 - MA 539 Mostly taken from Brownian Motion and Stochastic Calculus by I. Karatzas and S. Shreve Samy T. Brownian motion Probability Theory
More information1/12/05: sec 3.1 and my article: How good is the Lebesgue measure?, Math. Intelligencer 11(2) (1989),
Real Analysis 2, Math 651, Spring 2005 April 26, 2005 1 Real Analysis 2, Math 651, Spring 2005 Krzysztof Chris Ciesielski 1/12/05: sec 3.1 and my article: How good is the Lebesgue measure?, Math. Intelligencer
More informationNotes 1 : Measure-theoretic foundations I
Notes 1 : Measure-theoretic foundations I Math 733-734: Theory of Probability Lecturer: Sebastien Roch References: [Wil91, Section 1.0-1.8, 2.1-2.3, 3.1-3.11], [Fel68, Sections 7.2, 8.1, 9.6], [Dur10,
More informationProbability and Measure
Chapter 4 Probability and Measure 4.1 Introduction In this chapter we will examine probability theory from the measure theoretic perspective. The realisation that measure theory is the foundation of probability
More informationMartingales, standard filtrations, and stopping times
Project 4 Martingales, standard filtrations, and stopping times Throughout this Project the index set T is taken to equal R +, unless explicitly noted otherwise. Some things you might want to explain in
More informationSTA205 Probability: Week 8 R. Wolpert
INFINITE COIN-TOSS AND THE LAWS OF LARGE NUMBERS The traditional interpretation of the probability of an event E is its asymptotic frequency: the limit as n of the fraction of n repeated, similar, and
More informationMath 635: An Introduction to Brownian Motion and Stochastic Calculus
First Prev Next Go To Go Back Full Screen Close Quit 1 Math 635: An Introduction to Brownian Motion and Stochastic Calculus 1. Introduction and review 2. Notions of convergence and results from measure
More informationMATH/STAT 235A Probability Theory Lecture Notes, Fall 2013
MATH/STAT 235A Probability Theory Lecture Notes, Fall 2013 Dan Romik Department of Mathematics, UC Davis December 30, 2013 Contents Chapter 1: Introduction 6 1.1 What is probability theory?...........................
More informationLecture 5 Theorems of Fubini-Tonelli and Radon-Nikodym
Lecture 5: Fubini-onelli and Radon-Nikodym 1 of 13 Course: heory of Probability I erm: Fall 2013 Instructor: Gordan Zitkovic Lecture 5 heorems of Fubini-onelli and Radon-Nikodym Products of measure spaces
More informationn E(X t T n = lim X s Tn = X s
Stochastic Calculus Example sheet - Lent 15 Michael Tehranchi Problem 1. Let X be a local martingale. Prove that X is a uniformly integrable martingale if and only X is of class D. Solution 1. If If direction:
More informationMath 735: Stochastic Analysis
First Prev Next Go To Go Back Full Screen Close Quit 1 Math 735: Stochastic Analysis 1. Introduction and review 2. Notions of convergence 3. Continuous time stochastic processes 4. Information and conditional
More informationPreliminaries to Stochastic Analysis Autumn 2011 version. Xue-Mei Li The University of Warwick
Preliminaries to Stochastic Analysis Autumn 2011 version Xue-Mei Li The University of Warwick Typset: October 21, 2011 Chapter 1 Preliminaries For completion and easy reference we include some basic concepts
More informationExercises: sheet 1. k=1 Y k is called compound Poisson process (X t := 0 if N t = 0).
Exercises: sheet 1 1. Prove: Let X be Poisson(s) and Y be Poisson(t) distributed. If X and Y are independent, then X + Y is Poisson(t + s) distributed (t, s > 0). This means that the property of a convolution
More informationMeasurable functions are approximately nice, even if look terrible.
Tel Aviv University, 2015 Functions of real variables 74 7 Approximation 7a A terrible integrable function........... 74 7b Approximation of sets................ 76 7c Approximation of functions............
More information2 n k In particular, using Stirling formula, we can calculate the asymptotic of obtaining heads exactly half of the time:
Chapter 1 Random Variables 1.1 Elementary Examples We will start with elementary and intuitive examples of probability. The most well-known example is that of a fair coin: if flipped, the probability of
More informationMeasure Theoretic Probability. P.J.C. Spreij
Measure Theoretic Probability P.J.C. Spreij this version: September 16, 2009 Contents 1 σ-algebras and measures 1 1.1 σ-algebras............................... 1 1.2 Measures...............................
More information1.1. MEASURES AND INTEGRALS
CHAPTER 1: MEASURE THEORY In this chapter we define the notion of measure µ on a space, construct integrals on this space, and establish their basic properties under limits. The measure µ(e) will be defined
More informationLecture 11. Probability Theory: an Overveiw
Math 408 - Mathematical Statistics Lecture 11. Probability Theory: an Overveiw February 11, 2013 Konstantin Zuev (USC) Math 408, Lecture 11 February 11, 2013 1 / 24 The starting point in developing the
More informationLecture 5. 1 Chung-Fuchs Theorem. Tel Aviv University Spring 2011
Random Walks and Brownian Motion Tel Aviv University Spring 20 Instructor: Ron Peled Lecture 5 Lecture date: Feb 28, 20 Scribe: Yishai Kohn In today's lecture we return to the Chung-Fuchs theorem regarding
More informationHarmonic functions on groups
20 10 Harmonic functions on groups 0 DRAFT - updated June 19, 2018-10 Ariel Yadin -20-30 Disclaimer: These notes are preliminary, and may contain errors. Please send me any comments or corrections. -40
More informationMathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio ( )
Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio (2014-2015) Etienne Tanré - Olivier Faugeras INRIA - Team Tosca October 22nd, 2014 E. Tanré (INRIA - Team Tosca) Mathematical
More informationSTOCHASTIC CALCULUS JASON MILLER AND VITTORIA SILVESTRI
STOCHASTIC CALCULUS JASON MILLER AND VITTORIA SILVESTRI Contents Preface 1 1. Introduction 1 2. Preliminaries 4 3. Local martingales 1 4. The stochastic integral 16 5. Stochastic calculus 36 6. Applications
More informationPart IA Probability. Theorems. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Theorems Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More information5 Measure theory II. (or. lim. Prove the proposition. 5. For fixed F A and φ M define the restriction of φ on F by writing.
5 Measure theory II 1. Charges (signed measures). Let (Ω, A) be a σ -algebra. A map φ: A R is called a charge, (or signed measure or σ -additive set function) if φ = φ(a j ) (5.1) A j for any disjoint
More information