First homework assignment. Due at 12:15 on 22 September

Homework 1. We roll two dice. $X$ is the result of one of them and $Z$ is the sum of the two results. Find $E[X \mid Z]$.

Homework 2. Let $X$ be a r.v. and let $Y$ be another r.v. for which $P(Y=0 \text{ or } Y=1)=1$. Prove that $Y \in \sigma(X)$ iff there exists a Borel measurable function $\varphi:\mathbb{R}\to\mathbb{R}$ such that $Y=\varphi(X)$.

Homework 3. Let $X$ and $Y$ be random variables on the same probability space. Prove that $X$ and $Y$ are independent iff for every bounded measurable function $\varphi:\mathbb{R}\to\mathbb{R}$ we have $E[\varphi(Y)\mid X]=E[\varphi(Y)]$.

Homework 4. Assume $X,Y$ are jointly continuous r.v. with joint density function $f(x,y)$. Prove that
$$E[Y\mid X]=E[Y\mid\sigma(X)]=\frac{\int_{\mathbb{R}} y f(X,y)\,dy}{\int_{\mathbb{R}} f(X,y)\,dy}.$$

Homework 5. Let $Y\in\sigma(\mathcal{G})$ (that is, $Y$ is $\mathcal{G}$-measurable). Prove that
$$E[X\mid\mathcal{G}]\ge Y \ \text{a.s.} \iff E[X\,\mathbf{1}_A]\ge E[Y\,\mathbf{1}_A]\ \text{for all } A\in\mathcal{G}.$$

Homework 6. Let $X,Y\in L^1(\Omega,\mathcal{F},P)$ satisfying $E[X\mid Y]=Y$ and $E[Y\mid X]=X$. Show that $P(X=Y)=1$.

Homework 7. Prove the general version of Bayes's formula: given the probability space $(\Omega,\mathcal{F},P)$, let $\mathcal{G}$ be a sub-$\sigma$-algebra of $\mathcal{F}$ and let $G\in\mathcal{G}$, $A\in\mathcal{F}$. Show that
$$P(G\mid A)=\frac{\int_G P(A\mid\mathcal{G})\,dP}{\int_\Omega P(A\mid\mathcal{G})\,dP}. \qquad (1)$$

Homework 8. Prove the conditional variance formula
$$\mathrm{Var}(X)=E[\mathrm{Var}(X\mid Y)]+\mathrm{Var}(E[X\mid Y]), \qquad (2)$$
where $\mathrm{Var}(X\mid Y)=E[X^2\mid Y]-(E[X\mid Y])^2$.

Homework 9. Let $X_1,X_2,\dots$ be iid r.v. and let $N$ be a non-negative integer valued r.v. which is independent of the $X_i$, $i\ge 1$. Prove that
$$\mathrm{Var}\Big(\sum_{i=1}^N X_i\Big)=E[N]\,\mathrm{Var}(X)+(E[X])^2\,\mathrm{Var}(N). \qquad (3)$$

Second homework assignment. Due at 12:15 on 29 September

Homework 10. Let $X_t$ be a Poisson(1) process, that is, a Poisson process with rate $\lambda=1$. (See Durrett's book p. 139 if you forgot the definition.) Find $E[X_1\mid X_2]$ and $E[X_2\mid X_1]$.

Homework 11. Construct a martingale which is NOT a Markov chain.

Homework 12. For every $i=1,\dots,m$ let $\{M_n^{(i)}\}_{n=1}^\infty$ be a martingale w.r.t. $\{X_n\}_{n=1}^\infty$. Show that $M_n:=\max_{1\le i\le m} M_n^{(i)}$ is a submartingale w.r.t. $\{X_n\}$.
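As a quick sanity check on Homework 1 (not part of the assignment): since the two dice are exchangeable, one expects $E[X\mid Z=z]=z/2$. The sketch below verifies this exactly by enumerating the 36 equally likely outcomes with rational arithmetic; all names are my own.

```python
from fractions import Fraction
from collections import defaultdict

# X is the first die, Z is the sum of the two dice.
num = defaultdict(Fraction)   # sum of X over outcomes with a given Z
cnt = defaultdict(int)        # number of outcomes with a given Z
for x in range(1, 7):
    for y in range(1, 7):
        num[x + y] += x
        cnt[x + y] += 1

# E[X | Z = z] computed exactly for every attainable z = 2, ..., 12
cond_exp = {z: num[z] / cnt[z] for z in cnt}
assert all(cond_exp[z] == Fraction(z, 2) for z in cond_exp)
print(cond_exp[7])  # prints 7/2
```

The enumeration confirms the symmetry argument without giving a proof; the homework still asks for the conditional-expectation computation itself.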

Homework 13. Let $\xi_1,\xi_2,\dots$ be standard normal variables. (Recall that in this case the moment generating function is $M(\theta)=E[e^{\theta\xi_i}]=e^{\theta^2/2}$.) Let $a,b\in\mathbb{R}$ and
$$S_n:=\sum_{k=1}^n \xi_k \quad\text{and}\quad X_n:=e^{aS_n-bn}.$$
Prove that
(a) $X_n\to 0$ a.s. iff $b>0$;
(b) $X_n\to 0$ in $L^r$ iff $r<2b/a^2$.

Homework 14. Let $S_n:=X_1+\cdots+X_n$, where $X_1,X_2,\dots$ are iid with $X_1\sim\mathrm{Exp}(1)$. Verify that
$$M_n:=\frac{n!\,e^{S_n}}{(1+S_n)^{n+1}}$$
is a martingale w.r.t. the natural filtration $\mathcal{F}_n$.

Third homework assignment. Due at 12:15 on 6 October

Homework 15. Prove that the following two definitions of a $\lambda$-system $\mathcal{L}$ are equivalent:

Definition 1. (a) $\Omega\in\mathcal{L}$. (b) If $A,B\in\mathcal{L}$ and $A\subset B$ then $B\setminus A\in\mathcal{L}$. (c) If $A_n\in\mathcal{L}$ and $A_n\uparrow A$ (that is, $A_n\subset A_{n+1}$ and $A=\cup_{n=1}^\infty A_n$) then $A\in\mathcal{L}$.

Definition 2. (i) $\Omega\in\mathcal{L}$. (ii) If $A\in\mathcal{L}$ then $A^c\in\mathcal{L}$. (iii) If $A_i\cap A_j=\emptyset$ for $i\ne j$ and $A_i\in\mathcal{L}$, then $\cup_{i=1}^\infty A_i\in\mathcal{L}$.

Homework 16. There are $n$ white and $n$ black balls in an urn. We pull out all of them one-by-one without replacement. Whenever we pull a black ball we have to pay 1 dollar; whenever we pull a white ball we receive 1 dollar. Let $X_0:=0$ and let $X_i$ be the amount we have gained or lost after the $i$-th ball was pulled. We define
$$Y_i:=\frac{X_i}{2n-i}\ \text{ for } 1\le i\le 2n-1, \qquad Z_i:=\frac{X_i^2-(2n-i)}{(2n-i)(2n-i-1)}\ \text{ for } 1\le i\le 2n-2.$$
(a) Prove that $Y=(Y_i)$ and $Z=(Z_i)$ are martingales. (b) Find $\mathrm{Var}(X_i)$.

Homework 17. Let $X,Y$ be two independent $\mathrm{Exp}(\lambda)$ r.v. and $Z:=X+Y$. Show that for any non-negative measurable $h$ we have
$$E[h(X)\mid Z]=\frac{1}{Z}\int_0^Z h(t)\,dt.$$

Homework 18. Let $X_1,X_2,\dots$ be independent r.v. with
$$P(X_n=n^2)=\frac{1}{n^2} \quad\text{and}\quad P\Big(X_n=-\frac{n^2}{n^2-1}\Big)=1-\frac{1}{n^2}.$$
Let $S_n:=X_1+\cdots+X_n$. Show that
(a) $\lim_{n\to\infty} S_n/n=-1$ a.s.;
(b) $\{S_n\}$ is a martingale which converges to $-\infty$ a.s.

Homework 19. (This was withdrawn this week and is reassigned for next week as Homework 20.)

Fourth homework assignment. Due at 12:15 on 13 October

Homework 20. (Reassigned from last week.) A player's winnings per unit stake on game $n$ are $\xi_n$, where $\{\xi_n\}_{n=1}^\infty$ are i.i.d. r.v. with $P(\xi_n=1)=p$ and $P(\xi_n=-1)=q:=1-p$, where $p\in(1/2,1)$, that is $p>q$. In other words, with probability $q<1/2$ the player loses her stake and with probability $p$ she gets back twice her stake. Let $C_n$ be the player's stake on game $n$. We assume $C$ is previsible, that is $C_{n+1}\in\mathcal{F}_n:=\sigma(\xi_1,\dots,\xi_n)$ for all $n$. Further, we assume $0\le C_n\le Y_{n-1}$, where $Y_{n-1}$ is the fortune of the player at time $n-1$. We call $\alpha:=p\log p+q\log q+\log 2$ the entropy. Prove that
(a) $\log Y_n-n\alpha$ is a supermartingale; this means that the rate of winnings satisfies $E[\log Y_n-\log Y_0]\le n\alpha$;
(b) there exists a strategy for which $\log Y_n-n\alpha$ is a martingale.

Homework 21. Let $X,Z$ be $\mathbb{R}^d$-valued r.v. defined on $(\Omega,\mathcal{F},P)$. Assume that
$$E\big[e^{i\,t\cdot X+i\,s\cdot Z}\big]=E\big[e^{i\,t\cdot X}\big]\,E\big[e^{i\,s\cdot Z}\big], \qquad s,t\in\mathbb{R}^d.$$
Prove that $X$ and $Z$ are independent.

Homework 22. Let $X\sim\mathcal{N}(\mu,\Sigma)$ in $\mathbb{R}^n$. Let $X_1:=(X^{(1)},\dots,X^{(p)})$ and $X_2:=(X^{(p+1)},\dots,X^{(n)})$. Let $\Sigma$, $\Sigma_1$ and $\Sigma_2$ be the covariance matrices of $X$, $X_1$ and $X_2$ respectively. Prove that $X_1$ and $X_2$ are independent if and only if
$$\Sigma=\begin{pmatrix}\Sigma_1 & 0\\ 0 & \Sigma_2\end{pmatrix}.$$

Homework 23. Let $Y\sim\mathcal{N}(\mu,\Sigma)$ and let $B$ be a non-singular matrix. Find the distribution of $X=BY$.

Homework 24. Let $Y\sim\mathcal{N}(\mu,\Sigma)$ in $\mathbb{R}^2$, $Y=(Y_1,Y_2)$, and let $a_1,a_2\in\mathbb{R}$. Find the distribution of $a_1Y_1+a_2Y_2$.

Fifth homework assignment. Due at 12:15 on 20 October

Homework 25. Construct a random vector $(X,Y)$ such that both $X$ and $Y$ are one-dimensional normal distributions but $(X,Y)$ is NOT a bivariate normal distribution.

Homework 26. Let $X=(X_1,\dots,X_d)$ have a standard multivariate normal distribution. Let $\Sigma$ be a $d\times d$ positive semi-definite, symmetric matrix and let $\mu\in\mathbb{R}^d$. Prove that there exists an affine transformation $T:\mathbb{R}^d\to\mathbb{R}^d$ such that $T(X)\sim\mathcal{N}(\mu,\Sigma)$.
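One standard construction in the spirit of Homework 25 (a sketch of one possible answer, not the only one): take $X\sim\mathcal{N}(0,1)$ and an independent random sign $S$, and set $Y:=SX$. Both marginals are standard normal, but $X+Y$ has an atom at $0$, which a genuine bivariate normal vector cannot have unless it is degenerate at a point. A quick Monte Carlo illustration:

```python
import random

random.seed(0)
N = 100_000
atoms = 0
for _ in range(N):
    x = random.gauss(0.0, 1.0)
    s = random.choice([-1.0, 1.0])  # independent random sign
    y = s * x                        # Y = S*X is again N(0,1) by symmetry
    if x + y == 0.0:                 # X + Y = 0 exactly whenever S = -1
        atoms += 1

frac = atoms / N
# If (X, Y) were bivariate normal, X + Y would be normal and hence atomless
# (or a point mass); here P(X + Y = 0) = 1/2 while X + Y also has continuous mass.
assert abs(frac - 0.5) < 0.01
print(frac)
```

The simulation only illustrates the atom; the homework still asks for a proof that $(X,Y)$ fails to be jointly normal.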

Homework 27. Let $X$ be the height of the father and $Y$ the height of the son in a sample of father-son pairs. Then $(X,Y)$ is bivariate normal. Assume $E[X]=68$ (in inches), $E[Y]=69$, $\sigma_X=\sigma_Y=2$, $\rho=0.5$, where $\rho$ is the correlation of $(X,Y)$. Find the conditional distribution of $Y$ given $X=80$ (6 feet 8 inches). (That is, find the parameters $\mu_{Y|X}$ and $\sigma_{Y|X}^2$ of the conditional normal distribution.)

Homework 28 (Extension of part (iii) of Doob's optional stopping theorem). Let $X$ be a supermartingale. Let $T$ be a stopping time with $E[T]<\infty$ (like in part (iii) of Doob's optional stopping theorem). Assume there is a $C$ such that
$$E\big[\,|X_k-X_{k-1}|\ \big|\ \mathcal{F}_{k-1}\big](\omega)\le C \quad\text{for all } k>0 \text{ and for a.e. } \omega.$$
Prove that $E[X_T]\le E[X_0]$.

Homework 29. Let $X_n$ be a discrete time birth-death process with transition probabilities
$$p(i,i+1)=p_i,\qquad p(i,i-1)=1-p_i=:q_i,\qquad p_0=1.$$
We define the following function:
$$g:\mathbb{N}\to\mathbb{R}^+,\qquad g(k):=1+\sum_{j=1}^{k-1}\prod_{i=1}^{j}\frac{q_i}{p_i}.$$
(a) Prove that $Z_n:=g(X_n)$ is a martingale for the natural filtration. (b) Let $0<i<n$ be fixed. Find the probability that a process started from $i$ gets to $n$ earlier than to $0$.

Sixth homework assignment. Due at 12:15 on 3 November

Homework 30. Let $\{\varepsilon_n\}$ be an iid sequence with $P(\varepsilon_n=\pm 1)=\frac12$ and let $\{a_n\}$ be a sequence of real numbers. Show that $\sum_n \varepsilon_n a_n$ converges almost surely iff $\sum_n a_n^2<\infty$.

Homework 31. Let $X=(X_n)$ be an $L^2$ random walk which is a martingale. Let $\sigma^2$ be the variance of the $k$-th increment $Z_k:=X_k-X_{k-1}$ (the same for all $k$). Prove that the quadratic variation is $A_n=n\sigma^2$.

Homework 32. Prove the assertion of Remark 5.6 from File C.

Homework 33. Let $M=(M_n)$ be a martingale with $M_0=0$ and $|M_k-M_{k-1}|<C$ for some $C\in\mathbb{R}$. Let $T$ be a stopping time and assume $E[T]<\infty$. Let
$$U_n:=\sum_{k=1}^{n}(M_k-M_{k-1})^2\,\mathbf{1}_{T\ge k},\qquad V_n:=2\sum_{1\le i<j\le n}(M_i-M_{i-1})(M_j-M_{j-1})\,\mathbf{1}_{T\ge j},$$
$$U:=\sum_{k=1}^{\infty}(M_k-M_{k-1})^2\,\mathbf{1}_{T\ge k},\qquad V:=2\sum_{1\le i<j<\infty}(M_i-M_{i-1})(M_j-M_{j-1})\,\mathbf{1}_{T\ge j}.$$
Prove that
(a) $M_{T\wedge n}^2=U_n+V_n$ and $M_T^2=U+V$;
(b) further, if $E[T^2]<\infty$ then $\lim_{n\to\infty}U_n=U$ a.s., $E[U]<\infty$ and $E[V_n]=E[V]=0$;
(c) conclude that $\lim_{n\to\infty}E[M_{T\wedge n}^2]=E[M_T^2]$.
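For Homework 29, the heart of part (a) is the one-step identity $p_k\,g(k+1)+q_k\,g(k-1)=g(k)$ at interior states (the boundary states need separate care). The sketch below checks this identity with exact rationals for $k\ge 2$; the particular birth probabilities $p_i$ are my own test values, not from the assignment.

```python
from fractions import Fraction

# Hypothetical birth probabilities p_1, p_2, ... chosen only for testing; q_i = 1 - p_i.
p = {i: Fraction(2, 3) if i % 2 else Fraction(3, 5) for i in range(1, 20)}
q = {i: 1 - p[i] for i in p}

def g(k):
    # g(k) = 1 + sum_{j=1}^{k-1} prod_{i=1}^{j} q_i/p_i  (empty sum for k <= 1)
    total = Fraction(1)
    prod = Fraction(1)
    for j in range(1, k):
        prod *= q[j] / p[j]
        total += prod
    return total

# One-step harmonicity p_k*g(k+1) + q_k*g(k-1) == g(k) at interior states k >= 2:
for k in range(2, 15):
    assert p[k] * g(k + 1) + q[k] * g(k - 1) == g(k)
print(g(5))
```

Exact `Fraction` arithmetic avoids any floating-point tolerance, so the check is a genuine identity test rather than an approximation.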
Homework 34 (Wald equalities). Let $Y_1,Y_2,\dots$ be iid r.v. with $Y_i\in L^1$. Let $S_n:=Y_1+\cdots+Y_n$ and write $\mu:=E[Y_i]$. Given a stopping time $T\ge 1$ satisfying $E[T]<\infty$, prove that
(a)
$$E[S_T]=\mu\,E[T]. \qquad (4)$$
(b) Further, assume that the $Y_i$ are bounded ($\exists C\in\mathbb{R}$ s.t. $|Y_i|<C$ for all $i$) and $E[T^2]<\infty$. We write $\sigma^2:=\mathrm{Var}(Y_i)$. Then
$$E\big[(S_T-\mu T)^2\big]=\sigma^2\,E[T]. \qquad (5)$$
Hint: introduce an appropriate martingale and apply the result of the previous exercise.

Seventh homework assignment. Due at 12:15 on 10 November

Homework 35 (Branching processes). You might want to recall what you have learned about branching processes. (See File C of the course "Stochastic Processes".) A branching process $Z=(Z_n)_{n=0}^\infty$ is defined recursively by a given family of $\mathbb{Z}^+$-valued iid r.v. $\{X_k^{(n)}\}_{k,n=1}^\infty$ as follows:
$$Z_0:=1,\qquad Z_{n+1}:=X_1^{(n+1)}+\cdots+X_{Z_n}^{(n+1)},\quad n\ge 0.$$
Let $\mu=E[X_k^{(n)}]$ and $\mathcal{F}_n=\sigma(Z_0,Z_1,\dots,Z_n)$. Further, we write $f(s)$ for the generating function, that is,
$$f(s)=\sum_{l=0}^{\infty}\underbrace{P\big(X_k^{(n)}=l\big)}_{p_l}\,s^l$$
for any $k,n$. Let
$$\{\text{extinction}\}:=\{Z_n\to 0\}=\{\exists n,\ Z_n=0\},\qquad \{\text{explosion}\}:=\{Z_n\to\infty\},$$
and let $q:=P(\text{extinction})$. Recall that we learned that $q$ is the smaller (if there are two) fixed point of $f$; that is, $q$ is the smallest solution of $f(q)=q$. Prove that
(a) $E[Z_n]=\mu^n$. Hint: use induction.
(b) For every $s\ge 0$ we have $E[s^{Z_{n+1}}\mid\mathcal{F}_n]=f(s)^{Z_n}$.
(c) Let $T:=\min\{n:Z_n=0\}$ (with $T=\infty$ if $Z_n>0$ always). Prove that $q^{Z_n}$ is a martingale and explain why it is true that $\lim_{n\to\infty}q^{Z_n}=:Z$ exists a.s.
(d) Prove that $q=E[q^{Z_T}]=E[Z\,\mathbf{1}_{T=\infty}]+E[q^{Z_T}\,\mathbf{1}_{T<\infty}]$.
(e) Prove that $E[Z\,\mathbf{1}_{T=\infty}]=0$.
(f) Conclude that if $T(\omega)=\infty$ then $Z_n(\omega)\to\infty$.
(g) Prove that
$$P(\text{extinction})+P(\text{explosion})=1. \qquad (6)$$

Homework 36 (Branching processes cont.). Here we assume
$$\mu=E\big[X_k^{(n)}\big]<\infty \quad\text{and}\quad 0<\sigma^2:=\mathrm{Var}\big(X_k^{(n)}\big)<\infty.$$
Prove that
(a) $M_n=Z_n/\mu^n$ is a martingale for the natural filtration $\mathcal{F}_n$;
(b) $E[Z_{n+1}^2\mid\mathcal{F}_n]=\mu^2 Z_n^2+\sigma^2 Z_n$. Conclude that $M$ is bounded in $L^2$ iff $\mu>1$.
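The extinction picture in Homeworks 35–36 can be illustrated numerically (not part of the assignment). The sketch below simulates a branching process with Poisson offspring and compares the empirical extinction frequency with the smallest fixed point of $f(s)=e^{\mu(s-1)}$; the Poisson offspring law and the parameter values are my own choices.

```python
import math
import random

random.seed(1)

MU = 1.5  # supercritical offspring mean, chosen for illustration

def poisson(lam):
    # Knuth's algorithm; fine for small lam
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def extinct(max_pop=200):
    # Run one branching process until it dies out or clearly survives
    # (extinction after reaching 200 individuals is astronomically unlikely here).
    z = 1
    while 0 < z < max_pop:
        z = sum(poisson(MU) for _ in range(z))
    return z == 0

trials = 2000
freq = sum(extinct() for _ in range(trials)) / trials

# Smallest fixed point of f(s) = exp(MU*(s-1)), found by iterating s -> f(s) from 0:
q = 0.0
for _ in range(200):
    q = math.exp(MU * (q - 1))

assert abs(freq - q) < 0.05
print(freq, q)
```

Iterating $s\mapsto f(s)$ from $0$ converges monotonically to the smallest fixed point, which is exactly the characterization of $q$ recalled in Homework 35.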

(c) If $\mu>1$ then $M_\infty:=\lim_{n\to\infty}M_n$ exists (in $L^2$ and a.s.) and
$$\mathrm{Var}(M_\infty)=\frac{\sigma^2}{\mu(\mu-1)}.$$

Homework 37 (Branching processes cont.). Assume $q=1$ ($q$ was defined two exercises above as the probability of extinction). Prove that $M_n=Z_n/\mu^n$ is NOT a UI martingale.

Eighth homework assignment. Due at 12:15 on 24 November

Homework 38. Let $X_1,X_2,\dots$ be iid r.v. with continuous distribution function. Let $E_n$ be the event that a record occurs at time $n$, that is, $E_1:=\Omega$ and $E_n:=\{X_n>X_m,\ \forall m<n\}$. Prove that the events $\{E_i\}_{i=1}^\infty$ are independent and $P(E_i)=\frac{1}{i}$.

Homework 39 (Continuation). Let $E_1,E_2,\dots$ be independent with $P(E_i)=1/i$. Let $Y_i:=\mathbf{1}_{E_i}$ and $N_n:=Y_1+\cdots+Y_n$. (In the special case of the previous homework, $N_n$ is the number of records until time $n$.) Prove that
(a) $\sum_{k\ge 2}\frac{Y_k-1/k}{\log k}$ converges almost surely;
(b) using Kronecker's Lemma, conclude that $\lim_{n\to\infty}\frac{N_n}{\log n}=1$ a.s.;
(c) apply this to the situation of the previous exercise to get an estimate on the number of records until time $n$.

Homework 40. Let $\mathcal{C}$ be a class of r.v. on $(\Omega,\mathcal{F},P)$. Prove that the following assertions (1) and (2) are equivalent:
1. $\mathcal{C}$ is UI.
2. Both of the following two conditions hold:
(a) $\mathcal{C}$ is $L^1$-bounded, that is, $A:=\sup\{E[|X|]:X\in\mathcal{C}\}<\infty$; AND
(b) $\forall\varepsilon>0$ $\exists\delta>0$ s.t. $F\in\mathcal{F}$ and $P(F)<\delta$ imply $E[|X|;F]<\varepsilon$ for all $X\in\mathcal{C}$.

Homework 41. Let $\mathcal{C}$ and $\mathcal{D}$ be UI classes of r.v. Prove that $\mathcal{C}+\mathcal{D}:=\{X+Y:X\in\mathcal{C}\text{ and }Y\in\mathcal{D}\}$ is also UI. Hint: use the previous exercise.

Homework 42. Let $\mathcal{C}$ be a UI family of r.v. Define
$$\mathcal{D}:=\{Y:\ \exists X\in\mathcal{C}\ \text{and a sub-}\sigma\text{-algebra }\mathcal{G}\text{ of }\mathcal{F}\ \text{s.t. }Y=E[X\mid\mathcal{G}]\}.$$
Prove that $\mathcal{D}$ is also UI.

Ninth homework assignment. Due at 12:15 on 1 December

Homework 43. Let $X_1,X_2,\dots$ be iid r.v. with $E[X^+]=\infty$ and $E[X^-]<\infty$. (Recall $X=X^+-X^-$ and $X^+,X^-\ge 0$.) Use the SLLN to prove that $S_n/n\to\infty$ a.s., where $S_n:=X_1+\cdots+X_n$. Hint: for an $M>0$ let $X_i^M:=X_i\wedge M$ and $S_n^M:=X_1^M+\cdots+X_n^M$. Explain why $\lim_{n\to\infty}S_n^M/n=E[X_i^M]$ and $\liminf_{n\to\infty}S_n/n\ge\lim_{n\to\infty}S_n^M/n$.

Homework 44. Let $X_1,X_2,\dots$ be iid r.v. with $E[|X_i|]<\infty$. Prove that $E[X_1\mid S_n]=S_n/n$. (This is intuitively trivial from symmetry, but prove it with formulas.)

Homework 45. Let $\mathcal{C}$ be a class of random variables on $(\Omega,\mathcal{F},P)$. Prove that the following condition implies that $\mathcal{C}$ is UI: $\exists p>1$ and $A\in\mathbb{R}$ such that $E[|X|^p]<A$ for all $X\in\mathcal{C}$ ($L^p$-bounded for some $p>1$).
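The record-counting law of Homeworks 38–39 is easy to watch numerically (an illustration, not a proof). The sketch below counts records in iid continuous samples and compares the average count with the harmonic sum $\sum_{i=1}^n 1/i\sim\log n$; the sample sizes are my own choices.

```python
import math
import random

random.seed(2)

def records(n):
    # Count records in an iid Uniform(0,1) sample of length n (the first value
    # always counts as a record, matching E_1 = Omega in Homework 38).
    best = -math.inf
    count = 0
    for _ in range(n):
        x = random.random()
        if x > best:
            best = x
            count += 1
    return count

n = 10_000
trials = 300
avg = sum(records(n) for _ in range(trials)) / trials

# E[N_n] = sum_{i=1}^n 1/i ~ log n, consistent with N_n / log n -> 1 a.s.
harmonic = sum(1 / i for i in range(1, n + 1))
assert abs(avg - harmonic) < 1.0
print(avg, harmonic, math.log(n))
```

With $n=10{,}000$ the harmonic sum is about $9.79$ while $\log n\approx 9.21$; the gap is the Euler constant plus $o(1)$, which disappears in the ratio $N_n/\log n$.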

Homework 46. Let $\mathcal{C}$ be a class of random variables on $(\Omega,\mathcal{F},P)$. Prove that the following condition implies that $\mathcal{C}$ is UI: $\exists Y\in L^1(\Omega,\mathcal{F},P)$ s.t. $\forall X\in\mathcal{C}$ we have $|X(\omega)|\le Y(\omega)$ ($\mathcal{C}$ is dominated by an integrable (non-negative) r.v.).

Homework 47 (Infinite Monkey Theorem). Prove that a monkey typing at random on a typewriter for an infinite time will eventually type the complete works of Shakespeare.

Tenth homework assignment. Due at 12:15 on 8 December

Homework 48 (Azuma-Hoeffding inequality).
(a) Assume $Y$ is a r.v. which takes values from $[-c,c]$ and $E[Y]=0$. Prove that for all $\theta\in\mathbb{R}$ we have
$$E\big[e^{\theta Y}\big]\le\cosh(\theta c)\le\exp\Big(\frac12\theta^2c^2\Big).$$
Hint: let $f(z):=\exp(\theta z)$, $z\in[-c,c]$. Then by the convexity of $f$ we have
$$f(y)\le\frac{c-y}{2c}f(-c)+\frac{c+y}{2c}f(c).$$
(b) Let $M$ be a martingale with $M_0=0$ such that for a sequence of positive numbers $\{c_n\}_{n=1}^\infty$ we have $|M_n-M_{n-1}|\le c_n$ for all $n$. Then the following inequality holds for all $x>0$:
$$P\Big(\sup_{k\le n}M_k\ge x\Big)\le\exp\Big(-\frac12\,x^2\Big/\sum_{k=1}^n c_k^2\Big).$$
This exercise is from Williams's book. Hint: use the submartingale inequality as in the proof of the LIL. Then write $M_n$ (in the exponent) as a telescopic sum and use the orthogonality of martingale increments. Use part (a), then minimize the expression in the exponent over $\theta$.

Homework 49 (Exercise for the Markov chain CLT). Consider the following Markov chain $X=\{X_n\}_{n=0}^\infty$: the state space is $\mathbb{Z}$ and the transition probabilities are as follows: $p(0,1)=p(0,-1)=\frac12$, and for an arbitrary $x\in\mathbb{N}^+:=\mathbb{N}\setminus\{0\}$ we have
$$p(x,x+1)=p(x,0)=\frac12 \qquad\text{and}\qquad p(-x,-x-1)=p(-x,0)=\frac12.$$
(a) Find the stationary measure $\pi$ for $X$.

Definitions. Define the operator $P:L^1(\mathbb{Z},\pi)\to L^1(\mathbb{Z},\pi)$ by $(Pg)(i):=\sum_{j\in\mathbb{Z}}p(i,j)g(j)$ and let $I$ be the identity on $L^1(\mathbb{Z},\pi)$. Basically $P$ is an infinite matrix, $g$ is an infinite column vector, and the action of $P$ on $g$ is the product of this infinite matrix with the infinite column vector, as in $(Pg)(i)$, $i\in\mathbb{Z}$. Let $f:\mathbb{Z}\to\mathbb{R}$ be an arbitrary function satisfying the following conditions: $\forall x\in\mathbb{Z}$, $f(x)=-f(-x)$, and $\exists a<2$ s.t. $|f(x)|<a^{|x|}$ for all $|x|$ large enough. For example: odd polynomials of the form $f(x)=\sum_{i=1}^{n}b_{2i-1}x^{2i-1}$.
(b) Construct a $U\in L^2(\pi)$ such that $((I-P)U)(i)=f(i)$.
(c) From now on we always assume $f(x)=x^3$. Determine
$$\sigma^2:=E_\pi\big[(U(X_1)-U(X_0)+f(X_0))^2\big].$$
(d) Prove that
$$P\big(-3\sigma\sqrt{n}\le f(X_1)+\cdots+f(X_n)\le 3\sigma\sqrt{n}\big)\ge 0.99$$
for sufficiently large $n$.
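The bound of Homework 48(b) can be compared with simulation (an illustration with my own parameter choices, not part of the assignment): for a $\pm 1$ random walk every $c_k=1$, so the inequality reads $P(\sup_{k\le n}M_k\ge x)\le e^{-x^2/(2n)}$.

```python
import math
import random

random.seed(3)

n = 100
x = 25.0
trials = 20_000

def max_partial_sum(n):
    # A simple bounded-increment martingale: a +/-1 random walk, so c_k = 1.
    m, best = 0, 0
    for _ in range(n):
        m += random.choice([-1, 1])
        best = max(best, m)
    return best

hits = sum(max_partial_sum(n) >= x for _ in range(trials))
empirical = hits / trials
bound = math.exp(-0.5 * x * x / n)  # exp(-x^2 / (2 * sum c_k^2)) with c_k = 1

assert empirical <= bound
print(empirical, bound)
```

As expected the empirical tail probability sits well below the Azuma-Hoeffding bound; the bound is not tight for the simple walk, but it holds uniformly over all martingales with the same increment bounds, which is its point.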


More information

Probability Models. 4. What is the definition of the expectation of a discrete random variable?

Probability Models. 4. What is the definition of the expectation of a discrete random variable? 1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions

More information

Multivariate Random Variable

Multivariate Random Variable Multivariate Random Variable Author: Author: Andrés Hincapié and Linyi Cao This Version: August 7, 2016 Multivariate Random Variable 3 Now we consider models with more than one r.v. These are called multivariate

More information

T 1. The value function v(x) is the expected net gain when using the optimal stopping time starting at state x:

T 1. The value function v(x) is the expected net gain when using the optimal stopping time starting at state x: 108 OPTIMAL STOPPING TIME 4.4. Cost functions. The cost function g(x) gives the price you must pay to continue from state x. If T is your stopping time then X T is your stopping state and f(x T ) is your

More information

1 Exercises for lecture 1

1 Exercises for lecture 1 1 Exercises for lecture 1 Exercise 1 a) Show that if F is symmetric with respect to µ, and E( X )

More information

The main results about probability measures are the following two facts:

The main results about probability measures are the following two facts: Chapter 2 Probability measures The main results about probability measures are the following two facts: Theorem 2.1 (extension). If P is a (continuous) probability measure on a field F 0 then it has a

More information

Stochastic Processes. Theory for Applications. Robert G. Gallager CAMBRIDGE UNIVERSITY PRESS

Stochastic Processes. Theory for Applications. Robert G. Gallager CAMBRIDGE UNIVERSITY PRESS Stochastic Processes Theory for Applications Robert G. Gallager CAMBRIDGE UNIVERSITY PRESS Contents Preface page xv Swgg&sfzoMj ybr zmjfr%cforj owf fmdy xix Acknowledgements xxi 1 Introduction and review

More information

Admin and Lecture 1: Recap of Measure Theory

Admin and Lecture 1: Recap of Measure Theory Admin and Lecture 1: Recap of Measure Theory David Aldous January 16, 2018 I don t use bcourses: Read web page (search Aldous 205B) Web page rather unorganized some topics done by Nike in 205A will post

More information

{σ x >t}p x. (σ x >t)=e at.

{σ x >t}p x. (σ x >t)=e at. 3.11. EXERCISES 121 3.11 Exercises Exercise 3.1 Consider the Ornstein Uhlenbeck process in example 3.1.7(B). Show that the defined process is a Markov process which converges in distribution to an N(0,σ

More information

Homework for MATH 4603 (Advanced Calculus I) Fall Homework 13: Due on Tuesday 15 December. Homework 12: Due on Tuesday 8 December

Homework for MATH 4603 (Advanced Calculus I) Fall Homework 13: Due on Tuesday 15 December. Homework 12: Due on Tuesday 8 December Homework for MATH 4603 (Advanced Calculus I) Fall 2015 Homework 13: Due on Tuesday 15 December 49. Let D R, f : D R and S D. Let a S (acc S). Assume that f is differentiable at a. Let g := f S. Show that

More information

CHAPTER 3: LARGE SAMPLE THEORY

CHAPTER 3: LARGE SAMPLE THEORY CHAPTER 3 LARGE SAMPLE THEORY 1 CHAPTER 3: LARGE SAMPLE THEORY CHAPTER 3 LARGE SAMPLE THEORY 2 Introduction CHAPTER 3 LARGE SAMPLE THEORY 3 Why large sample theory studying small sample property is usually

More information

Lecture 5. 1 Chung-Fuchs Theorem. Tel Aviv University Spring 2011

Lecture 5. 1 Chung-Fuchs Theorem. Tel Aviv University Spring 2011 Random Walks and Brownian Motion Tel Aviv University Spring 20 Instructor: Ron Peled Lecture 5 Lecture date: Feb 28, 20 Scribe: Yishai Kohn In today's lecture we return to the Chung-Fuchs theorem regarding

More information

MATH 56A SPRING 2008 STOCHASTIC PROCESSES

MATH 56A SPRING 2008 STOCHASTIC PROCESSES MATH 56A SPRING 008 STOCHASTIC PROCESSES KIYOSHI IGUSA Contents 4. Optimal Stopping Time 95 4.1. Definitions 95 4.. The basic problem 95 4.3. Solutions to basic problem 97 4.4. Cost functions 101 4.5.

More information

CHAPTER 1. Martingales

CHAPTER 1. Martingales CHAPTER 1 Martingales The basic limit theorems of probability, such as the elementary laws of large numbers and central limit theorems, establish that certain averages of independent variables converge

More information

conditional cdf, conditional pdf, total probability theorem?

conditional cdf, conditional pdf, total probability theorem? 6 Multiple Random Variables 6.0 INTRODUCTION scalar vs. random variable cdf, pdf transformation of a random variable conditional cdf, conditional pdf, total probability theorem expectation of a random

More information

Probability and Measure

Probability and Measure Part II Year 2018 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2018 84 Paper 4, Section II 26J Let (X, A) be a measurable space. Let T : X X be a measurable map, and µ a probability

More information

Brownian Motion and Conditional Probability

Brownian Motion and Conditional Probability Math 561: Theory of Probability (Spring 2018) Week 10 Brownian Motion and Conditional Probability 10.1 Standard Brownian Motion (SBM) Brownian motion is a stochastic process with both practical and theoretical

More information

. Find E(V ) and var(v ).

. Find E(V ) and var(v ). Math 6382/6383: Probability Models and Mathematical Statistics Sample Preliminary Exam Questions 1. A person tosses a fair coin until she obtains 2 heads in a row. She then tosses a fair die the same number

More information

Gaussian vectors and central limit theorem

Gaussian vectors and central limit theorem Gaussian vectors and central limit theorem Samy Tindel Purdue University Probability Theory 2 - MA 539 Samy T. Gaussian vectors & CLT Probability Theory 1 / 86 Outline 1 Real Gaussian random variables

More information

Lecture 1: August 28

Lecture 1: August 28 36-705: Intermediate Statistics Fall 2017 Lecturer: Siva Balakrishnan Lecture 1: August 28 Our broad goal for the first few lectures is to try to understand the behaviour of sums of independent random

More information

ECON 2530b: International Finance

ECON 2530b: International Finance ECON 2530b: International Finance Compiled by Vu T. Chau 1 Non-neutrality of Nominal Exchange Rate 1.1 Mussa s Puzzle(1986) RER can be defined as the relative price of a country s consumption basket in

More information

COPYRIGHTED MATERIAL CONTENTS. Preface Preface to the First Edition

COPYRIGHTED MATERIAL CONTENTS. Preface Preface to the First Edition Preface Preface to the First Edition xi xiii 1 Basic Probability Theory 1 1.1 Introduction 1 1.2 Sample Spaces and Events 3 1.3 The Axioms of Probability 7 1.4 Finite Sample Spaces and Combinatorics 15

More information

1. Aufgabenblatt zur Vorlesung Probability Theory

1. Aufgabenblatt zur Vorlesung Probability Theory 24.10.17 1. Aufgabenblatt zur Vorlesung By (Ω, A, P ) we always enote the unerlying probability space, unless state otherwise. 1. Let r > 0, an efine f(x) = 1 [0, [ (x) exp( r x), x R. a) Show that p f

More information

Hints/Solutions for Homework 3

Hints/Solutions for Homework 3 Hints/Solutions for Homework 3 MATH 865 Fall 25 Q Let g : and h : be bounded and non-decreasing functions Prove that, for any rv X, [Hint: consider an independent copy Y of X] ov(g(x), h(x)) Solution:

More information

Notes on Measure, Probability and Stochastic Processes. João Lopes Dias

Notes on Measure, Probability and Stochastic Processes. João Lopes Dias Notes on Measure, Probability and Stochastic Processes João Lopes Dias Departamento de Matemática, ISEG, Universidade de Lisboa, Rua do Quelhas 6, 1200-781 Lisboa, Portugal E-mail address: jldias@iseg.ulisboa.pt

More information

Brownian Motion and Stochastic Calculus

Brownian Motion and Stochastic Calculus ETHZ, Spring 17 D-MATH Prof Dr Martin Larsson Coordinator A Sepúlveda Brownian Motion and Stochastic Calculus Exercise sheet 6 Please hand in your solutions during exercise class or in your assistant s

More information

17. Convergence of Random Variables

17. Convergence of Random Variables 7. Convergence of Random Variables In elementary mathematics courses (such as Calculus) one speaks of the convergence of functions: f n : R R, then lim f n = f if lim f n (x) = f(x) for all x in R. This

More information

Random Process Lecture 1. Fundamentals of Probability

Random Process Lecture 1. Fundamentals of Probability Random Process Lecture 1. Fundamentals of Probability Husheng Li Min Kao Department of Electrical Engineering and Computer Science University of Tennessee, Knoxville Spring, 2016 1/43 Outline 2/43 1 Syllabus

More information

QUESTIONS AND SOLUTIONS

QUESTIONS AND SOLUTIONS UNIVERSITY OF BRISTOL Examination for the Degrees of B.Sci., M.Sci. and M.Res. (Level 3 and Level M Martingale Theory with Applications MATH 36204 and M6204 (Paper Code MATH-36204 and MATH-M6204 January

More information

Applications of Ito s Formula

Applications of Ito s Formula CHAPTER 4 Applications of Ito s Formula In this chapter, we discuss several basic theorems in stochastic analysis. Their proofs are good examples of applications of Itô s formula. 1. Lévy s martingale

More information

Lecture 2: Repetition of probability theory and statistics

Lecture 2: Repetition of probability theory and statistics Algorithms for Uncertainty Quantification SS8, IN2345 Tobias Neckel Scientific Computing in Computer Science TUM Lecture 2: Repetition of probability theory and statistics Concept of Building Block: Prerequisites:

More information

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear

More information

University of Regina. Lecture Notes. Michael Kozdron

University of Regina. Lecture Notes. Michael Kozdron University of Regina Statistics 252 Mathematical Statistics Lecture Notes Winter 2005 Michael Kozdron kozdron@math.uregina.ca www.math.uregina.ca/ kozdron Contents 1 The Basic Idea of Statistics: Estimating

More information

Inference for Stochastic Processes

Inference for Stochastic Processes Inference for Stochastic Processes Robert L. Wolpert Revised: June 19, 005 Introduction A stochastic process is a family {X t } of real-valued random variables, all defined on the same probability space

More information

STAT 418: Probability and Stochastic Processes

STAT 418: Probability and Stochastic Processes STAT 418: Probability and Stochastic Processes Spring 2016; Homework Assignments Latest updated on April 29, 2016 HW1 (Due on Jan. 21) Chapter 1 Problems 1, 8, 9, 10, 11, 18, 19, 26, 28, 30 Theoretical

More information

On Optimal Stopping Problems with Power Function of Lévy Processes

On Optimal Stopping Problems with Power Function of Lévy Processes On Optimal Stopping Problems with Power Function of Lévy Processes Budhi Arta Surya Department of Mathematics University of Utrecht 31 August 2006 This talk is based on the joint paper with A.E. Kyprianou:

More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

Exercises. T 2T. e ita φ(t)dt.

Exercises. T 2T. e ita φ(t)dt. Exercises. Set #. Construct an example of a sequence of probability measures P n on R which converge weakly to a probability measure P but so that the first moments m,n = xdp n do not converge to m = xdp.

More information

Convergence of Random Variables

Convergence of Random Variables 1 / 15 Convergence of Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay March 19, 2014 2 / 15 Motivation Theorem (Weak

More information

Asymptotic Statistics-III. Changliang Zou

Asymptotic Statistics-III. Changliang Zou Asymptotic Statistics-III Changliang Zou The multivariate central limit theorem Theorem (Multivariate CLT for iid case) Let X i be iid random p-vectors with mean µ and and covariance matrix Σ. Then n (

More information

(b) What is the variance of the time until the second customer arrives, starting empty, assuming that we measure time in minutes?

(b) What is the variance of the time until the second customer arrives, starting empty, assuming that we measure time in minutes? IEOR 3106: Introduction to Operations Research: Stochastic Models Fall 2006, Professor Whitt SOLUTIONS to Final Exam Chapters 4-7 and 10 in Ross, Tuesday, December 19, 4:10pm-7:00pm Open Book: but only

More information

Exercises in stochastic analysis

Exercises in stochastic analysis Exercises in stochastic analysis Franco Flandoli, Mario Maurelli, Dario Trevisan The exercises with a P are those which have been done totally or partially) in the previous lectures; the exercises with

More information

Notes 18 : Optional Sampling Theorem

Notes 18 : Optional Sampling Theorem Notes 18 : Optional Sampling Theorem Math 733-734: Theory of Probability Lecturer: Sebastien Roch References: [Wil91, Chapter 14], [Dur10, Section 5.7]. Recall: DEF 18.1 (Uniform Integrability) A collection

More information

STAT 200C: High-dimensional Statistics

STAT 200C: High-dimensional Statistics STAT 200C: High-dimensional Statistics Arash A. Amini April 27, 2018 1 / 80 Classical case: n d. Asymptotic assumption: d is fixed and n. Basic tools: LLN and CLT. High-dimensional setting: n d, e.g. n/d

More information