Brownian Motion and Conditional Probability
Math 561: Theory of Probability (Spring 2018), Week 10

10.1 Standard Brownian Motion (SBM)

Brownian motion is a stochastic process with both practical and theoretical significance. It originated as a model of the phenomenon observed by Robert Brown in 1828, that pollen grains suspended in water perform a continual swarming motion, and in Bachelier's work of 1900 as a model of the stock market. On the theoretical side, Brownian motion is a Gaussian Markov process with stationary independent increments. We will consider Brownian motion on the interval $[0,1]$.

Definition 10.1. Standard Brownian motion on $[0,1]$ is a real-valued stochastic process $(B(t), t \in [0,1])$ with the following properties:

(a) $B(0) = 0$ a.s.

(b) Almost surely, $t \mapsto B(t)$ is continuous on $[0,1]$.

(c) For $0 \le t_0 < t_1 < \cdots < t_n \le 1$, the increments $B(t_1) - B(t_0), B(t_2) - B(t_1), \ldots, B(t_n) - B(t_{n-1})$ are independent.

(d) For $0 \le t < t + h \le 1$, $B(t+h) - B(t) \sim N(0, h)$.

It is non-trivial to show that such a process exists. We will give two constructions of SBM on $[0,1]$, due to Kolmogorov and Lévy respectively, and prove the following theorem.

Theorem 10.2 (Existence of SBM). SBM on $[0,1]$, $(B(t), t \in [0,1])$, exists. Moreover, $(B(t), t \in [0,1])$ is almost surely $\gamma$-Hölder for any fixed $\gamma \in (0, 1/2)$.

We recall that a function $f : [0,1] \to \mathbb{R}$ is $\gamma$-Hölder if
$$\sup_{0 \le s < t \le 1} \frac{|f(t) - f(s)|}{|t - s|^\gamma} < \infty.$$

10.1.1 Kolmogorov's Construction of SBM

Let $Z_k$, $k \ge 0$, be i.i.d. $N(0,1)$ random variables. Let $e_k(t) := \sqrt{2} \sin \pi k t$, $t \in [0,1]$, $k \ge 1$. Clearly, $\{e_k\}_{k \ge 1}$ is an orthonormal basis for $L^2_0([0,1], \mathcal{B}, \lambda)$ with zero boundary condition. Define, for $n \ge 0$,
$$X_n(t) := \sum_{k=1}^n Z_k \frac{e_k(t)}{\pi k} + Z_0 t, \qquad t \in [0,1].$$
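As a quick illustration (added here; not part of the original notes, and the function and variable names are our own), the partial sums $X_n$ are easy to simulate: draw the coefficients $Z_k$ and evaluate the truncated series. A Monte Carlo check at a fixed time $t_0$ should then approximately reproduce the marginal law $N(0, t_0)$ of Brownian motion.

```python
import numpy as np

rng = np.random.default_rng(0)

def kolmogorov_partial_sum(t, z):
    """Evaluate X_n(t) = Z_0*t + sum_{k=1}^n Z_k * e_k(t)/(pi*k),
    with e_k(t) = sqrt(2)*sin(pi*k*t) and z = (Z_0, ..., Z_n)."""
    t = np.asarray(t, dtype=float)
    n = len(z) - 1
    k = np.arange(1, n + 1)
    basis = np.sqrt(2) * np.sin(np.pi * np.outer(t, k)) / (np.pi * k)
    return z[0] * t + basis @ z[1:]

# Monte Carlo sanity check: for fixed t0, X_n(t0) should be close to
# N(0, t0) for large n (truncation of the series adds a tiny bias).
n, n_paths, t0 = 1000, 5000, 0.5
draws = rng.standard_normal((n_paths, n + 1))
vals = np.array([kolmogorov_partial_sum([t0], z)[0] for z in draws])
print(f"mean {vals.mean():+.3f}, variance {vals.var():.3f} (targets 0 and {t0})")
```

The sine basis vanishes at the endpoints, so $X_n(0) = 0$ exactly, matching property (a) of the definition.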
We have, for $m > n \ge 0$,
$$\|X_n - X_m\|_2^2 = \sum_{k=n+1}^m \frac{Z_k^2}{\pi^2 k^2},$$
where the norm is w.r.t. $L^2([0,1], \mathcal{B}, \lambda)$, and thus
$$\sup_{m \ge n} \|X_n - X_m\|_2^2 = \sum_{k > n} \frac{Z_k^2}{\pi^2 k^2} \to 0 \quad \text{almost surely as } n \to \infty.$$
Thus $(X_n)$ is a.s. Cauchy in $L^2([0,1], \mathcal{B}, \lambda)$ and has an a.s. limit, say $X$. Thus almost surely
$$X(t) = \sum_{k \ge 1} Z_k \frac{e_k(t)}{\pi k} + Z_0 t$$
exists for a.e. $t \in [0,1]$. For fixed $t \in [0,1]$, $X_n(t) \to X(t)$ a.s. implies that $X(t) \sim N(0, t)$, since $t^2 + \sum_{k \ge 1} \frac{2 \sin^2 \pi k t}{(\pi k)^2} = t$. Moreover, for any countable set $C$, we have $P(X_n(t) \to X(t)\ \forall t \in C) = 1$. Yet, it is not necessarily true that $P(X_n(t) \to X(t)\ \forall t \in [0,1]) = 1$.

Definition 10.3. We say that $(\hat Y(t))_{t \in [0,1]}$ is a modification of $(Y(t))_{t \in [0,1]}$ if $P(Y(t) = \hat Y(t)) = 1$ for all $t \in [0,1]$. Furthermore, $(\hat Y(t))_{t \in [0,1]}$ is a continuous modification of $(Y(t))_{t \in [0,1]}$ if, in addition, $t \mapsto \hat Y(t)$ is continuous almost surely.

To construct a continuous modification of $(X(t), t \in [0,1])$, we will use the following theorem.

Theorem 10.4 (Kolmogorov's continuity theorem). Let $(X_t, t \in [0,1])$ be a stochastic process such that $\mathbb{E}|X_t - X_s|^\alpha \le c|t-s|^{1+\delta}$ for all $t, s \in [0,1]$, for some $c, \alpha, \delta > 0$. Then there exists a continuous modification of $(X_t, t \in [0,1])$ which is a.s. $\gamma$-Hölder for $\gamma \in (0, \delta/\alpha)$.

To construct a continuous modification and get SBM on $[0,1]$, we note that for any $0 \le t < s \le 1$ we have a.s.
$$X(u) = u Z_0 + \sum_{k \ge 1} Z_k \frac{e_k(u)}{\pi k} \quad \text{for } u = t, s.$$
Thus
$$X(t) - X(s) = (t-s) Z_0 + \sum_{k \ge 1} Z_k \frac{\sqrt{2}\,(\sin \pi k t - \sin \pi k s)}{\pi k} \sim N(0, \sigma_{t,s}^2),$$
where
$$\sigma_{t,s}^2 := (t-s)^2 + \sum_{k \ge 1} \frac{2 (\sin \pi k t - \sin \pi k s)^2}{(\pi k)^2} \le (t-s)^2 + 2 k_0 (t-s)^2 + \sum_{k = k_0 + 1}^\infty \frac{8}{(\pi k)^2} \le 3 k_0 (t-s)^2 + \frac{8}{\pi^2 k_0}$$
for any $k_0 \ge 1$, where we used the fact $|\sin x - \sin y| \le \min\{2, |x - y|\}$. Taking $k_0 = \lceil 1/(\pi |t-s|) \rceil$, we get $\sigma_{t,s}^2 \le c|t-s|$
for some $c \in (0, \infty)$. In fact, one can show that $\sigma_{t,s}^2 = |t-s|$. Now, if $Y \sim N(0, \sigma^2)$, for any $\alpha > 0$ we have $\mathbb{E}|Y|^\alpha = \sigma^\alpha\, \mathbb{E}|Z|^\alpha$ where $Z \sim N(0,1)$. Thus we have
$$\mathbb{E}|X(t) - X(s)|^\alpha \le c^{\alpha/2}\, \mathbb{E}|Z|^\alpha\, |t-s|^{\alpha/2} \quad \text{for all } 0 \le s < t \le 1.$$
By Kolmogorov's continuity theorem with $\alpha > 2$ and $\delta = (\alpha - 2)/2$, we get a continuous modification of $(X(t), t \in [0,1])$ which is a.s. $\gamma$-Hölder with $0 < \gamma < 1/2 - 1/\alpha$. Since $\alpha > 2$ is arbitrary, this proves Theorem 10.2.

Now we prove Kolmogorov's continuity theorem (Theorem 10.4).

Proof of Theorem 10.4. We consider the set of dyadic rationals in $[0,1)$. Define $D_n = \{k 2^{-n} : k = 0, 1, \ldots, 2^n - 1\}$ for $n \ge 0$ and $D = \{1\} \cup \bigcup_{n \ge 0} D_n$. Clearly, $D$ is a countable dense subset of $[0,1]$. We define
$$\Delta_n X(t) := X(t + 2^{-n}) - X(t), \qquad t \in D_n,\ n \ge 0.$$

Step 1. First we claim that for $\gamma \in (0, \delta/\alpha)$,
$$\sum_{n=0}^\infty \sum_{t \in D_n} P\bigl(|\Delta_n X(t)| \ge 2^{-\gamma n}\bigr) < \infty, \tag{10.1}$$
which implies that $\sum_{n=0}^\infty P\bigl(\sup_{t \in D_n} 2^{\gamma n} |\Delta_n X(t)| \ge 1\bigr) < \infty$, and by the first Borel–Cantelli lemma
$$P\bigl(\sup_{t \in D_n} 2^{\gamma n} |\Delta_n X(t)| \ge 1 \text{ i.o.}\bigr) = 0,$$
and thus the random variable
$$R_\gamma := \sup_{n \ge 0} \sup_{t \in D_n} 2^{\gamma n} |\Delta_n X(t)| < \infty \quad \text{almost surely.} \tag{10.2}$$
Using Markov's inequality and the hypothesis that $\mathbb{E}|X_t - X_s|^\alpha \le c|t-s|^{1+\delta}$ for all $t, s \in [0,1]$, we can bound the LHS of (10.1) by
$$\sum_{n=0}^\infty \sum_{t \in D_n} 2^{\alpha \gamma n}\, \mathbb{E}|\Delta_n X(t)|^\alpha \le c \sum_{n=0}^\infty 2^n \cdot 2^{\alpha \gamma n} \cdot 2^{-n(1+\delta)} = c \sum_{n=0}^\infty 2^{-n(\delta - \alpha \gamma)} < \infty$$
as $\gamma < \delta/\alpha$, and this proves the claim.

Step 2. We claim that
$$\bar R_\gamma := \sup_{s, t \in D,\ s < t} \frac{|X(t) - X(s)|}{|t - s|^\gamma} \le \frac{3 R_\gamma}{1 - 2^{-\gamma}}. \tag{10.3}$$
We can ignore the case $s = 0, t = 1$, as $|X(1) - X(0)| \le R_\gamma$. Otherwise, for $t, s \in D$ with $s < t$, we have $t - s \in [2^{-k}, 2 \cdot 2^{-k})$ for some $k \ge 1$, and there exists $u \in D_k$ such that $u \le s < u + 2^{-k} \le t < u + 3 \cdot 2^{-k}$. If $t \le u + 2 \cdot 2^{-k}$, we take $s_k = u$, $t_k = s_k + 2^{-k}$; otherwise we take $s_k = u + 2^{-k}$, $t_k = s_k + 2^{-k}$. Note that $t_k - s_k = 2^{-k}$ and $s_k \in D_k$, and thus $|X(t_k) - X(s_k)| \le R_\gamma 2^{-\gamma k}$. Moreover, we can
choose sequences $s_i, t_i \in D_i$, $k < i \le n$, such that $s_n = s$, $t_n = t$ and $|s_i - s_{i-1}|, |t_i - t_{i-1}| \in \{0, 2^{-i}\}$ for all $k < i \le n$. Since
$$X(t) = X(t_k) + \sum_{i=k+1}^n \bigl(X(t_i) - X(t_{i-1})\bigr), \qquad X(s) = X(s_k) + \sum_{i=k+1}^n \bigl(X(s_i) - X(s_{i-1})\bigr),$$
we have
$$|X(t) - X(s)| \le R_\gamma \Bigl( 2^{-\gamma k} + 2 \sum_{i=k+1}^n 2^{-\gamma i} \Bigr) \le \frac{3 \cdot 2^{-\gamma k}}{1 - 2^{-\gamma}}\, R_\gamma \le \frac{3 R_\gamma}{1 - 2^{-\gamma}}\, |t - s|^\gamma.$$

Step 3. Fix $\gamma \in (0, \delta/\alpha)$. By equation (10.3), the event $A = \{\bar R_\gamma < \infty\}$ has probability 1. Fix $\omega \in A$. Now we claim that for $t \in [0,1] \setminus D$ and any sequence $t_n \to t$, $t_n \in D$, $\lim_n X(t_n)$ exists and does not depend on the particular choice of the sequence $(t_n)_{n \ge 1}$. The proof follows from the fact that $|X(t_n) - X(t_m)| \le \bar R_\gamma |t_n - t_m|^\gamma$ for all $m, n$; thus the sequence $(X(t_n))$ is Cauchy and has a limit. The uniqueness of the limit follows from the same argument.

Step 4. We define
$$\hat X(t) = \begin{cases} X(t) & \text{if } t \in D, \\ \lim_n X(t_n) & \text{if } t \in [0,1] \setminus D,\ t_n \to t \text{ with } t_n \in D. \end{cases}$$
We claim that $(\hat X(t), t \in [0,1])$ is a.s. continuous and $\gamma$-Hölder for $\gamma \in (0, \delta/\alpha)$, and moreover $\hat X(t) = X(t)$ a.s. for each $t \in [0,1]$. It is easy to see that
$$\sup_{0 \le s < t \le 1} \frac{|\hat X(t) - \hat X(s)|}{|t - s|^\gamma} \le \frac{3 R_\gamma}{1 - 2^{-\gamma}} < \infty \quad \text{a.s.}$$
by following the same argument as in Step 3. In particular, $(\hat X(t), t \in [0,1])$ is a.s. continuous and $\gamma$-Hölder for $\gamma \in (0, \delta/\alpha)$. Moreover, $\hat X(t) = X(t)$ a.s. for $t \in D$, and if $t_n \to t$, $t_n \in D$, then $\hat X(t_n) \to \hat X(t)$ a.s. by definition, while $X(t_n) \to X(t)$ a.s. by the same argument with $D$ replaced by $D \cup \{t\}$; thus $\hat X(t) = X(t)$ a.s. for all $t \in [0,1]$.

10.1.2 Lévy's Construction of SBM

Lévy constructed SBM as a uniform limit of random continuous functions in the space $C([0,1])$ with the norm $\|\cdot\|_\infty$, so that the limiting function is automatically continuous. As before, we consider the set of dyadic rationals in $[0,1]$. Define $D_n = \{k 2^{-n} : k = 0, 1, \ldots, 2^n - 1\}$ for $n \ge 0$ and $D = \{1\} \cup \bigcup_{n \ge 0} D_n$. We will sequentially define the values at the points of $\{1\} \cup D_n$ for $n \ge 0$. Let $(Z_d, d \in D)$ be a sequence of i.i.d. $N(0,1)$ random variables. For $n \ge 1$, $d \in D_n \setminus D_{n-1}$, we define the tent functions
$$\varphi_d(t) := \begin{cases} 1 - 2^n |t - d| & \text{if } |t - d| \le 2^{-n}, \\ 0 & \text{otherwise.} \end{cases}$$
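The tent functions $\varphi_d$ encode the familiar midpoint-refinement picture: level $n$ adds, at each new dyadic point $d \in D_n \setminus D_{n-1}$, an independent Gaussian correction of standard deviation $2^{-(n+1)/2}$ on top of linear interpolation of the current path. The following Python sketch (ours, not part of the notes) implements that recursion directly and checks two defining properties of SBM on the dyadic grid.

```python
import numpy as np

rng = np.random.default_rng(2)

def levy_path(levels, rng):
    """One sample of Levy's construction after `levels` dyadic refinements.

    Returns the path on the grid k/2^levels, k = 0..2^levels. Level n fills
    in the midpoints d in D_n \\ D_{n-1}: linear interpolation plus an
    independent 2^{-(n+1)/2} * Z_d term, Z_d ~ N(0,1)."""
    b = np.array([0.0, rng.standard_normal()])  # B(0) = 0, B(1) ~ N(0,1)
    for n in range(1, levels + 1):
        mid = (b[:-1] + b[1:]) / 2 \
            + 2.0 ** (-(n + 1) / 2) * rng.standard_normal(len(b) - 1)
        new = np.empty(2 * len(b) - 1)
        new[0::2] = b        # keep the values already fixed at coarser levels
        new[1::2] = mid      # insert the refined midpoints
        b = new
    return b

# Monte Carlo check: Var B(1/2) = 1/2, and the two half-increments
# B(1/2) - B(0) and B(1) - B(1/2) are independent N(0, 1/2), so their
# empirical covariance should be near zero.
paths = np.array([levy_path(6, rng) for _ in range(8000)])
mid_idx = paths.shape[1] // 2
var_mid = paths[:, mid_idx].var()
inc1 = paths[:, mid_idx] - paths[:, 0]
inc2 = paths[:, -1] - paths[:, mid_idx]
cov = np.mean(inc1 * inc2) - inc1.mean() * inc2.mean()
print(f"Var B(1/2) = {var_mid:.3f}, Cov of half-increments = {cov:.3f}")
```

Note that values at already-placed dyadic points never change at later levels, which is exactly why the partial sums $X_l$ below agree on $D_l$ once level $l$ is reached.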
Note that, for fixed $n \ge 1$, the functions $\varphi_d$, $d \in D_n \setminus D_{n-1}$, have disjoint supports. Moreover, the family $\{2^{-(n+1)/2} \varphi_d : d \in D_n \setminus D_{n-1},\ n \ge 1\}$ forms an orthonormal basis of the Dirichlet space $\mathcal{D}([0,1], \mathcal{B}, \lambda)$ with zero boundary condition under the Dirichlet inner product $\langle f, g \rangle_{\mathcal{D}} := \int_0^1 f' g'$. Finally we define the random continuous functions
$$X_l(t) := t Z_1 + \sum_{n=1}^l 2^{-(n+1)/2} \sum_{d \in D_n \setminus D_{n-1}} Z_d\, \varphi_d(t), \qquad t \in [0,1],$$
for $l \ge 1$. This is the same as successively defining the value at the midpoints and taking a linear interpolation. Note that, for $m > l \ge 1$, we have
$$X_m(t) - X_l(t) = \sum_{n=l+1}^m 2^{-(n+1)/2} \sum_{d \in D_n \setminus D_{n-1}} Z_d\, \varphi_d(t),$$
and hence, since for each $n$ the $\varphi_d$ have disjoint supports and $0 \le \varphi_d \le 1$,
$$\|X_m - X_l\|_\infty \le \sum_{n=l+1}^m 2^{-(n+1)/2} \max_{d \in D_n \setminus D_{n-1}} |Z_d|.$$
In particular,
$$\sup_{m \ge l} \|X_m - X_l\|_\infty \le \sum_{n > l} 2^{-(n+1)/2} \max_{d \in D_n \setminus D_{n-1}} |Z_d|.$$
We claim that
$$P\Bigl( \max_{d \in D_n \setminus D_{n-1}} |Z_d| \ge c \sqrt{n} \text{ i.o.} \Bigr) = 0 \quad \text{for } c > \sqrt{2 \log 2}.$$
The proof follows from the first Borel–Cantelli lemma and the fact that $P(|Z_d| \ge x) \le e^{-x^2/2}$ for $x > 0$. Thus
$$K := \sup_{n \ge 1} n^{-1/2} \max_{d \in D_n \setminus D_{n-1}} |Z_d| < \infty \quad \text{a.s.}$$
In particular, since $\sum_{n} 2^{-(n+1)/2} \sqrt{n} < \infty$, the sequence $(X_l)_{l \ge 1}$ is a.s. Cauchy in $C([0,1])$ with the norm $\|\cdot\|_\infty$ and has an a.s. uniform limit $(B(t), t \in [0,1])$, which is a.s. continuous. One can check the other properties quite easily.

10.2 Conditional Expectation

10.2.1 Definition, existence and uniqueness

Definition 10.5. Given $\sigma$-fields $\mathcal{G} \subseteq \mathcal{F}$ and a r.v. $X \in L^1(\Omega, \mathcal{F}, P)$, we define $\mathbb{E}(X \mid \mathcal{G})$ as a r.v. $Y$ such that:

1. $Y \in L^1(\Omega, \mathcal{G}, P)$;

2. $\mathbb{E}(X 1_A) = \mathbb{E}(Y 1_A)$ for all $A \in \mathcal{G}$; equivalently, $\mathbb{E}((X - Y) Z) = 0$ for all bounded $\mathcal{G}$-measurable functions $Z$.

Lemma 10.6. Conditional expectation, if it exists, is unique a.s.
Proof. Suppose $Y_1, Y_2$ are both conditional expectations of $X$ given $\mathcal{G}$. Take $W := Y_1 - Y_2$, which is $\mathcal{G}$-measurable and satisfies $\mathbb{E}(W 1_A) = 0$ for all $A \in \mathcal{G}$. Taking $A = \{W > \varepsilon\} \in \mathcal{G}$, we get
$$0 = \mathbb{E}(W 1_{W > \varepsilon}) \ge \varepsilon\, P(W > \varepsilon) \implies P(W > \varepsilon) = 0.$$
Similarly, taking $A = \{W < -\varepsilon\} \in \mathcal{G}$, we get
$$0 = \mathbb{E}(W 1_{W < -\varepsilon}) \le -\varepsilon\, P(W < -\varepsilon) \implies P(W < -\varepsilon) = 0.$$
Therefore $P(W \in [-\varepsilon, \varepsilon]) = 1$ for all $\varepsilon > 0$. Hence $P(W = 0) = 1$ and $Y_1 = Y_2$ a.s.

Theorem 10.7. Conditional expectation exists.

We will prove this later.

10.2.2 Properties of Conditional Expectation

Let $\mathcal{G} \subseteq \mathcal{F}$ be $\sigma$-fields; then $\mathbb{E}(\cdot \mid \mathcal{G}) : L^1(\mathcal{F}) \to L^1(\mathcal{G})$, where $L^p(\mathcal{H}) := L^p(\Omega, \mathcal{H}, P)$, $p \ge 1$, for a sub-$\sigma$-field $\mathcal{H}$ of $\mathcal{F}$.

(i) Positivity: $X \ge 0$ a.s. $\implies \bar X := \mathbb{E}(X \mid \mathcal{G}) \ge 0$ a.s.
Proof: using $\mathbb{E}(X 1_A) = \mathbb{E}(\bar X 1_A)$ for all $A \in \mathcal{G}$ with $A = \{\bar X < 0\}$, we get $\mathbb{E}(\bar X 1_{\bar X < 0}) = \mathbb{E}(X 1_{\bar X < 0}) \ge 0$; since $\bar X 1_{\bar X < 0} \le 0$, this forces $\bar X \ge 0$ a.s.

(ii) Linearity: $\mathbb{E}(X + Y \mid \mathcal{G}) = \mathbb{E}(X \mid \mathcal{G}) + \mathbb{E}(Y \mid \mathcal{G})$ a.s. and $\mathbb{E}(cX \mid \mathcal{G}) = c\,\mathbb{E}(X \mid \mathcal{G})$ a.s. for $c \in \mathbb{R}$.

(iii) Contraction on $L^p$: for $X \in L^p(\mathcal{F})$, $p \ge 1$, we have $\mathbb{E}(X \mid \mathcal{G}) \in L^p(\mathcal{G})$ and $\|\mathbb{E}(X \mid \mathcal{G})\|_p \le \|X\|_p$.
Proof for $p = 1$: let $\bar X = \mathbb{E}(X \mid \mathcal{G})$; we have $\mathbb{E}(X 1_A) = \mathbb{E}(\bar X 1_A)$ for all $A \in \mathcal{G}$. Taking $A_1 = \{\bar X \ge 0\}$, $A_2 = \{\bar X < 0\}$, we have
$$\mathbb{E}|\bar X| = \mathbb{E}(\bar X 1_{\bar X \ge 0}) - \mathbb{E}(\bar X 1_{\bar X < 0}) = \mathbb{E}(X 1_{\bar X \ge 0}) - \mathbb{E}(X 1_{\bar X < 0}) \le \mathbb{E}|X|.$$
For general $p > 1$, use $\mathbb{E}(XZ) = \mathbb{E}(\bar X Z)$ for any bounded $\mathcal{G}$-measurable $Z$ and the fact that $\|\bar X\|_p^p = \mathbb{E}(\bar X\, |\bar X|^{p-1} \operatorname{sgn}(\bar X))$.

(iv) Williams' tower property: let $\mathcal{G} \subseteq \mathcal{H}$. Suppose $\mathbb{E}(\cdot \mid \mathcal{G})$ and $\mathbb{E}(\cdot \mid \mathcal{H})$ are well defined; then $\mathbb{E}(\mathbb{E}(X \mid \mathcal{H}) \mid \mathcal{G}) = \mathbb{E}(X \mid \mathcal{G})$.

(v) If $X$ is $\mathcal{G}$-measurable then $\mathbb{E}(X \mid \mathcal{G}) = X$ a.s.

(vi) Projection: $\mathbb{E}(\mathbb{E}(X \mid \mathcal{G}) \mid \mathcal{G}) = \mathbb{E}(X \mid \mathcal{G})$, which follows from the tower property.

(vii) Monotonicity: $X \le Y$ a.s. $\implies \mathbb{E}(X \mid \mathcal{G}) \le \mathbb{E}(Y \mid \mathcal{G})$ a.s.

(viii) Conditional MCT: $X_n \ge 0$, $X_n \uparrow X \implies \mathbb{E}(X_n \mid \mathcal{G}) \uparrow \mathbb{E}(X \mid \mathcal{G})$ a.s.

(ix) $\mathbb{E}(X \mid \{\emptyset, \Omega\}) = \mathbb{E} X$.
(x) Jensen's inequality: if $\varphi$ is convex and $\mathbb{E}|\varphi(X)| < \infty$, then $\varphi(\mathbb{E}(X \mid \mathcal{G})) \le \mathbb{E}(\varphi(X) \mid \mathcal{G})$ a.s.

(xi) Cauchy–Schwarz and Hölder inequalities: $|\mathbb{E}(XY \mid \mathcal{G})| \le (\mathbb{E}(|X|^p \mid \mathcal{G}))^{1/p} (\mathbb{E}(|Y|^q \mid \mathcal{G}))^{1/q}$ a.s. for $1/p + 1/q = 1$, $p, q > 1$, and $X \in L^p(\mathcal{F})$, $Y \in L^q(\mathcal{F})$.

(xii) Suppose $\mathcal{F}_0 \subseteq \mathcal{F}_1 \subseteq \mathcal{F}_2 \subseteq \cdots \subseteq \mathcal{F}_\infty = \sigma(\bigcup_{n \ge 0} \mathcal{F}_n) \subseteq \mathcal{F}$, and $X \in L^p(\mathcal{F})$, $p \ge 1$. Then $\mathbb{E}(X \mid \mathcal{F}_n) \to \mathbb{E}(X \mid \mathcal{F}_\infty)$ a.s. and in $L^p$. In particular, if $\mathcal{F}_\infty = \mathcal{F}$ then $\mathbb{E}(X \mid \mathcal{F}_n) \to X$ a.s. and in $L^p$.

(xiii) $(\bar X_n = \mathbb{E}(X \mid \mathcal{F}_n))_{n \ge 0}$ is adapted to $(\mathcal{F}_n)_{n \ge 0}$ and $\mathbb{E}(\bar X_n \mid \mathcal{F}_{n-1}) = \bar X_{n-1}$, i.e., $(\bar X_n)$ is a martingale.

10.2.3 Proof of Theorem 10.7: Construction of $\mathbb{E}(\cdot \mid \mathcal{G})$

Here are a number of proofs of Theorem 10.7.

(Measure-theoretic proof.) We will use the following theorem from measure theory.

Theorem 10.8 (Lebesgue–Radon–Nikodym derivative). Let $\mu$ and $\lambda$ be two finite positive measures on $(\Omega, \mathcal{F})$ such that $\mu \ll \lambda$ ($\mu$ is absolutely continuous w.r.t. $\lambda$, i.e., $\lambda(A) = 0 \implies \mu(A) = 0$). Then there exists a measurable function $f \in L^1(\Omega, \mathcal{F}, \lambda)$, written $f = \frac{d\mu}{d\lambda}$, such that
$$\mu(A) = \int_A f \, d\lambda \quad \text{for all } A \in \mathcal{F}.$$

Assume that $X \ge 0$. Consider Theorem 10.8 with $\lambda := P$ and $\mu(A) := \int_A X \, dP$ for all $A \in \mathcal{G}$; then $\lambda, \mu$ are positive finite measures on $(\Omega, \mathcal{G})$ and $\mu \ll \lambda$. By Theorem 10.8 there exists $f \in L^1(\Omega, \mathcal{G}, P)$ such that
$$\mu(A) = \int_A f \, dP, \quad \text{i.e.,} \quad \mathbb{E}(X 1_A) = \mathbb{E}(f 1_A) \quad \text{for all } A \in \mathcal{G}.$$
So $f$ is the conditional expectation. In general, $X = X^+ - X^-$ and we define $\mathbb{E}(X \mid \mathcal{G}) = \mathbb{E}(X^+ \mid \mathcal{G}) - \mathbb{E}(X^- \mid \mathcal{G})$.

(Functional-analytic proof.) We will use the following lemma on orthogonal projection in Hilbert spaces.

Lemma 10.9. Let $K \subseteq H$ be a closed subspace of a Hilbert space $H$. Then every $X \in H$ has a unique decomposition $X = Y + W$ with $Y \in K$ and $W \perp K$.

Assume $X \in L^2(\Omega, \mathcal{F}, P)$, which is a Hilbert space; we use the shorthand $L^2(\mathcal{G})$ for $L^2(\Omega, \mathcal{G}, P)$. By Lemma 10.9 with $K = L^2(\mathcal{G})$ there exists a unique decomposition $X = Y + W$ with $Y \in L^2(\mathcal{G})$ and $W \perp L^2(\mathcal{G})$. So, for all $Z \in L^2(\mathcal{G})$,
$$\mathbb{E}((X - Y) Z) = 0 \implies Y = \mathbb{E}(X \mid \mathcal{G}).$$
This proof can be generalized to $X \in L^1$.
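On a finite sample space the functional-analytic proof can be checked concretely (an illustration we add; the sample space, partition, and all names are our own choices): projecting $X$ onto the span of the cell indicators by least squares must reproduce the cell averages, and the residual $X - Y$ must be orthogonal to every $\mathcal{G}$-measurable vector.

```python
import numpy as np

rng = np.random.default_rng(5)

# Omega = {0,...,11} with uniform P; G generated by a 4-cell partition
# (an arbitrary illustrative choice, not anything from the notes).
n = 12
x = rng.standard_normal(n)
cells = np.repeat(np.arange(4), n // 4)        # C_i = {3i, 3i+1, 3i+2}
indicators = (cells[:, None] == np.arange(4)[None, :]).astype(float)

# Orthogonal projection of x onto span{1_{C_1},...,1_{C_4}} = L^2(G)
coef, *_ = np.linalg.lstsq(indicators, x, rcond=None)
y_proj = indicators @ coef

# The projection should equal the cell averages (the hands-on formula below)
y_avg = np.array([x[cells == i].mean() for i in range(4)])[cells]

# Orthogonality of the residual: E((X - Y) 1_{C_i}) = 0 for every cell
residual_inner = indicators.T @ (x - y_proj) / n
print(y_proj.round(3), residual_inner.round(12))
```

Since the indicators have disjoint supports, the least-squares system is diagonal, which is why the projection coefficients are exactly the per-cell means.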
(Hands-on proof.)

(i) Assume that $|\mathcal{G}| < \infty$. Then $\mathcal{G} = \sigma(C_1, \ldots, C_k)$, where $\{C_1, C_2, \ldots, C_k\}$ is a disjoint partition of $\Omega$. Now any $\mathcal{G}$-measurable r.v. $Z$ is of the form $Z = \sum_{i=1}^k a_i 1_{C_i}$ for some $a_1, a_2, \ldots, a_k \in \mathbb{R}$. Then the conditional expectation is
$$\mathbb{E}(X \mid \mathcal{G}) = Y := \sum_{i=1}^k \frac{\mathbb{E}(X 1_{C_i})}{\mathbb{E}(1_{C_i})}\, 1_{C_i},$$
since for every such $\mathcal{G}$-measurable r.v. $Z$,
$$\mathbb{E}(YZ) = \sum_{i=1}^k \sum_{j=1}^k a_i \frac{\mathbb{E}(X 1_{C_j})}{\mathbb{E}(1_{C_j})}\, \mathbb{E}(1_{C_j} 1_{C_i}) = \sum_{i=1}^k a_i\, \mathbb{E}(X 1_{C_i}) = \mathbb{E}(XZ).$$

(ii) Suppose $\mathcal{G}_1 \subseteq \mathcal{G}_2 \subseteq \cdots \subseteq \mathcal{G} = \sigma(\bigcup_{i \ge 1} \mathcal{G}_i)$ and $X \in L^2(\mathcal{F})$. Let $\bar X_n = \mathbb{E}(X \mid \mathcal{G}_n)$; we have $\|\bar X_n\|_2 \le \|X\|_2$. Let $\Delta_n = \bar X_n - \bar X_{n-1}$, $n \ge 2$. By Williams' tower property we have $\mathbb{E}(\bar X_n \mid \mathcal{G}_{n-1}) = \bar X_{n-1}$. Thus for any r.v. $Y \in L^2(\mathcal{G}_{n-1})$, $\mathbb{E}((\bar X_n - \bar X_{n-1}) Y) = 0$, so $\{\bar X_1, \Delta_2, \Delta_3, \ldots\}$ are uncorrelated. By definition of $\Delta_n$, we have $\bar X_n = \bar X_1 + \sum_{k=2}^n \Delta_k$. This implies
$$S_n^2 := \|\bar X_n\|_2^2 = \|\bar X_1\|_2^2 + \sum_{k=2}^n \|\Delta_k\|_2^2 \le \|X\|_2^2.$$
Thus $(\bar X_n)$ is $L^2$-Cauchy, since $S_n^2 \uparrow S^2 \le \mathbb{E} X^2$ and $\|\bar X_m - \bar X_n\|_2^2 = S_m^2 - S_n^2$ for $m > n$. Therefore $\bar X_n \xrightarrow{L^2} \bar X$.

Claim: $\bar X$ is $\mathcal{G}$-measurable. Claim: $\mathbb{E}(X 1_A) = \mathbb{E}(\bar X 1_A)$ for all $A \in \bigcup_{n \ge 1} \mathcal{G}_n$. To verify the claim, note that there is $n$ such that $A \in \mathcal{G}_m$ for all $m \ge n$. Also $\mathbb{E}(X 1_A) = \mathbb{E}(\bar X_m 1_A)$ for $m \ge n$, and as $m \to \infty$ the right-hand side converges to $\mathbb{E}(\bar X 1_A)$. Now, $\{A : \mathbb{E}(X 1_A) = \mathbb{E}(\bar X 1_A)\}$ is a $\lambda$-system and $\bigcup_{n \ge 1} \mathcal{G}_n$ is a $\pi$-system. By the $\pi$–$\lambda$ theorem, $\mathbb{E}(X 1_A) = \mathbb{E}(\bar X 1_A)$ for all $A \in \sigma(\bigcup_{n \ge 1} \mathcal{G}_n) = \mathcal{G}$. Therefore we have $\bar X = \mathbb{E}(X \mid \mathcal{G})$ a.s.

(iii) If $\mathcal{G} = \sigma(C_1, C_2, \ldots)$, then take $\mathcal{G}_n = \sigma(C_1, C_2, \ldots, C_n)$, $n \ge 1$. In general, given $X \in L^2(\mathcal{F})$ and $\mathcal{G} \subseteq \mathcal{F}$, the supremum $S := \sup_{|\mathcal{H}| < \infty,\ \mathcal{H} \subseteq \mathcal{G}} \|\mathbb{E}(X \mid \mathcal{H})\|_2$ exists. Take $\mathcal{H}_i$ such that $\|\mathbb{E}(X \mid \mathcal{H}_i)\|_2 \to S$ and work with $\mathcal{G}_i = \sigma(\mathcal{H}_1 \cup \mathcal{H}_2 \cup \cdots \cup \mathcal{H}_i)$, $i \ge 1$.

(iv) Finally, from $X \in L^2(\mathcal{F})$, one can generalize to $X \in L^1(\mathcal{F})$. See the homework exercise for the complete proof.
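A Monte Carlo illustration of the hands-on formula (added by us; the random variables and partitions are an arbitrary choice, not from the notes): with $\mathcal{H}$ and $\mathcal{G}$ generated by the quartile and half cells of a uniform $U$, the empirical version of $Y = \sum_i [\mathbb{E}(X 1_{C_i})/\mathbb{E}(1_{C_i})]\, 1_{C_i}$ satisfies the defining property on the generators, and coarsening recovers Williams' tower property $\mathbb{E}(\mathbb{E}(X \mid \mathcal{H}) \mid \mathcal{G}) = \mathbb{E}(X \mid \mathcal{G})$.

```python
import numpy as np

rng = np.random.default_rng(3)

# X = U^2 + noise with U ~ Uniform(0,1); purely illustrative choices.
n = 200_000
u = rng.uniform(size=n)
x = u ** 2 + 0.3 * rng.standard_normal(n)

def cond_exp(x, labels, n_cells):
    """Empirical E(X | sigma(partition)): cell averages, constant on each cell,
    i.e. the finite-partition formula with empirical expectations."""
    means = np.array([x[labels == i].mean() for i in range(n_cells)])
    return means[labels]

quart = np.minimum((u * 4).astype(int), 3)   # finer sigma-field H: quartile cells
half = np.minimum((u * 2).astype(int), 1)    # coarser sigma-field G: half cells

y_h = cond_exp(x, quart, 4)                  # E(X | H)
tower = cond_exp(y_h, half, 2)               # E(E(X | H) | G)
direct = cond_exp(x, half, 2)                # E(X | G)

# Defining property on the generators: E(X 1_{C_i}) = E(Y 1_{C_i})
gen_ok = all(
    np.isclose(np.mean(x * (quart == i)), np.mean(y_h * (quart == i)))
    for i in range(4)
)
print(gen_ok, np.max(np.abs(tower - direct)))
```

Because the quartile cells refine the half cells, averaging the cell averages again reproduces the coarser averages exactly (up to floating point), which is the tower property in this finite setting.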
More informationStochastic Processes II/ Wahrscheinlichkeitstheorie III. Lecture Notes
BMS Basic Course Stochastic Processes II/ Wahrscheinlichkeitstheorie III Michael Scheutzow Lecture Notes Technische Universität Berlin Sommersemester 218 preliminary version October 12th 218 Contents
More informationMATH & MATH FUNCTIONS OF A REAL VARIABLE EXERCISES FALL 2015 & SPRING Scientia Imperii Decus et Tutamen 1
MATH 5310.001 & MATH 5320.001 FUNCTIONS OF A REAL VARIABLE EXERCISES FALL 2015 & SPRING 2016 Scientia Imperii Decus et Tutamen 1 Robert R. Kallman University of North Texas Department of Mathematics 1155
More informationChapter 8. General Countably Additive Set Functions. 8.1 Hahn Decomposition Theorem
Chapter 8 General Countably dditive Set Functions In Theorem 5.2.2 the reader saw that if f : X R is integrable on the measure space (X,, µ) then we can define a countably additive set function ν on by
More informationProduct measures, Tonelli s and Fubini s theorems For use in MAT4410, autumn 2017 Nadia S. Larsen. 17 November 2017.
Product measures, Tonelli s and Fubini s theorems For use in MAT4410, autumn 017 Nadia S. Larsen 17 November 017. 1. Construction of the product measure The purpose of these notes is to prove the main
More informationIntegration on Measure Spaces
Chapter 3 Integration on Measure Spaces In this chapter we introduce the general notion of a measure on a space X, define the class of measurable functions, and define the integral, first on a class of
More informationELEMENTS OF PROBABILITY THEORY
ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable
More information1 Inner Product Space
Ch - Hilbert Space 1 4 Hilbert Space 1 Inner Product Space Let E be a complex vector space, a mapping (, ) : E E C is called an inner product on E if i) (x, x) 0 x E and (x, x) = 0 if and only if x = 0;
More informationLecture 21 Representations of Martingales
Lecture 21: Representations of Martingales 1 of 11 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 21 Representations of Martingales Right-continuous inverses Let
More informationMeasurable functions are approximately nice, even if look terrible.
Tel Aviv University, 2015 Functions of real variables 74 7 Approximation 7a A terrible integrable function........... 74 7b Approximation of sets................ 76 7c Approximation of functions............
More informationProblem Set. Problem Set #1. Math 5322, Fall March 4, 2002 ANSWERS
Problem Set Problem Set #1 Math 5322, Fall 2001 March 4, 2002 ANSWRS i All of the problems are from Chapter 3 of the text. Problem 1. [Problem 2, page 88] If ν is a signed measure, is ν-null iff ν () 0.
More informationDynkin (λ-) and π-systems; monotone classes of sets, and of functions with some examples of application (mainly of a probabilistic flavor)
Dynkin (λ-) and π-systems; monotone classes of sets, and of functions with some examples of application (mainly of a probabilistic flavor) Matija Vidmar February 7, 2018 1 Dynkin and π-systems Some basic
More informationV. SUBSPACES AND ORTHOGONAL PROJECTION
V. SUBSPACES AND ORTHOGONAL PROJECTION In this chapter we will discuss the concept of subspace of Hilbert space, introduce a series of subspaces related to Haar wavelet, explore the orthogonal projection
More informationStochastic integration. P.J.C. Spreij
Stochastic integration P.J.C. Spreij this version: April 22, 29 Contents 1 Stochastic processes 1 1.1 General theory............................... 1 1.2 Stopping times...............................
More information1 Math 241A-B Homework Problem List for F2015 and W2016
1 Math 241A-B Homework Problem List for F2015 W2016 1.1 Homework 1. Due Wednesday, October 7, 2015 Notation 1.1 Let U be any set, g be a positive function on U, Y be a normed space. For any f : U Y let
More informationRiesz Representation Theorems
Chapter 6 Riesz Representation Theorems 6.1 Dual Spaces Definition 6.1.1. Let V and W be vector spaces over R. We let L(V, W ) = {T : V W T is linear}. The space L(V, R) is denoted by V and elements of
More informationPROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS
PROBABILITY: LIMIT THEOREMS II, SPRING 218. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please
More informationLecture 22: Variance and Covariance
EE5110 : Probability Foundations for Electrical Engineers July-November 2015 Lecture 22: Variance and Covariance Lecturer: Dr. Krishna Jagannathan Scribes: R.Ravi Kiran In this lecture we will introduce
More informationFE 5204 Stochastic Differential Equations
Instructor: Jim Zhu e-mail:zhu@wmich.edu http://homepages.wmich.edu/ zhu/ January 20, 2009 Preliminaries for dealing with continuous random processes. Brownian motions. Our main reference for this lecture
More informationLecture 21: Expectation of CRVs, Fatou s Lemma and DCT Integration of Continuous Random Variables
EE50: Probability Foundations for Electrical Engineers July-November 205 Lecture 2: Expectation of CRVs, Fatou s Lemma and DCT Lecturer: Krishna Jagannathan Scribe: Jainam Doshi In the present lecture,
More informationReal Analysis Notes. Thomas Goller
Real Analysis Notes Thomas Goller September 4, 2011 Contents 1 Abstract Measure Spaces 2 1.1 Basic Definitions........................... 2 1.2 Measurable Functions........................ 2 1.3 Integration..............................
More information(1) Consider the space S consisting of all continuous real-valued functions on the closed interval [0, 1]. For f, g S, define
Homework, Real Analysis I, Fall, 2010. (1) Consider the space S consisting of all continuous real-valued functions on the closed interval [0, 1]. For f, g S, define ρ(f, g) = 1 0 f(x) g(x) dx. Show that
More informationMTH 404: Measure and Integration
MTH 404: Measure and Integration Semester 2, 2012-2013 Dr. Prahlad Vaidyanathan Contents I. Introduction....................................... 3 1. Motivation................................... 3 2. The
More informationMeasure-theoretic probability
Measure-theoretic probability Koltay L. VEGTMAM144B November 28, 2012 (VEGTMAM144B) Measure-theoretic probability November 28, 2012 1 / 27 The probability space De nition The (Ω, A, P) measure space is
More informationReminder Notes for the Course on Measures on Topological Spaces
Reminder Notes for the Course on Measures on Topological Spaces T. C. Dorlas Dublin Institute for Advanced Studies School of Theoretical Physics 10 Burlington Road, Dublin 4, Ireland. Email: dorlas@stp.dias.ie
More information3 Integration and Expectation
3 Integration and Expectation 3.1 Construction of the Lebesgue Integral Let (, F, µ) be a measure space (not necessarily a probability space). Our objective will be to define the Lebesgue integral R fdµ
More informationApplied Analysis (APPM 5440): Final exam 1:30pm 4:00pm, Dec. 14, Closed books.
Applied Analysis APPM 44: Final exam 1:3pm 4:pm, Dec. 14, 29. Closed books. Problem 1: 2p Set I = [, 1]. Prove that there is a continuous function u on I such that 1 ux 1 x sin ut 2 dt = cosx, x I. Define
More informationBrownian Motion. Chapter Stochastic Process
Chapter 1 Brownian Motion 1.1 Stochastic Process A stochastic process can be thought of in one of many equivalent ways. We can begin with an underlying probability space (Ω, Σ,P and a real valued stochastic
More information