First homework assignment. Due at 12:15 on 22 September 2016.

Homework 1. We roll two dice. X is the result of one of them and Z is the sum of the two results. Find E[X | Z].

Homework 2. Let X be a r.v. and let Y be another r.v. for which P(Y = 0 or Y = 1) = 1. Prove that Y is σ(X)-measurable iff there exists a Borel measurable function ϕ : R → R such that Y = ϕ(X).

Homework 3. Let X and Y be random variables on the same probability space. Prove that X and Y are independent iff for every bounded measurable function ϕ : R → R we have E[ϕ(Y) | X] = E[ϕ(Y)].

Homework 4. Assume X, Y are jointly continuous r.v. with joint density function f(x, y). Prove that

    E[Y | X] = E[Y | σ(X)] = ( ∫_R y f(X, y) dy ) / ( ∫_R f(X, y) dy ).

Homework 5. Let Y be G-measurable. Prove that Y = E[X | G] iff for every A ∈ G we have E[X 1_A] = E[Y 1_A].

Homework 6. Let X, Y ∈ L¹(Ω, F, P) satisfying E[X | Y] = Y and E[Y | X] = X. Show that P(X = Y) = 1.

Homework 7. Prove the general version of Bayes' formula: given the probability space (Ω, F, P), let G be a sub-σ-algebra of F and let G ∈ G. Show that for A ∈ F

    P(G | A) = ( ∫_G P(A | G) dP ) / ( ∫_Ω P(A | G) dP ).    (1)

Homework 8. Prove the conditional variance formula

    Var(X) = E[Var(X | Y)] + Var(E[X | Y]),    (2)

where Var(X | Y) := E[X² | Y] − (E[X | Y])².

Homework 9. Let X_1, X_2, ... be iid r.v. and let N be a non-negative integer valued r.v. independent of the X_i, i ≥ 1. Prove that

    Var( Σ_{i=1}^{N} X_i ) = E[N] Var(X) + (E[X])² Var(N).    (3)

Second homework assignment. Due at 12:15 on 29 September 2016.

Homework 10. Let X_t be a Poisson(1) process, that is, a Poisson process with rate λ = 1. (See Durrett's book p. 139 if you forgot the definition.) Find E[X_1 | X_2] and E[X_2 | X_1]. (A simulation sketch follows Homework 12.)

Homework 11. Construct a martingale which is NOT a Markov chain.

Homework 12. For every i = 1, ..., m let {M_n^{(i)}}_{n=1}^∞ be a martingale w.r.t. {X_n}_{n=1}^∞. Show that

    M_n := max_{1 ≤ i ≤ m} M_n^{(i)}

is a submartingale w.r.t. {X_n}.
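A minimal Monte Carlo sanity check for Homework 10 (my own sketch, not part of the assignment). Since a rate-1 Poisson process has independent increments, X_2 − X_1 ~ Poisson(1) independently of X_1, which suggests E[X_2 | X_1] = X_1 + 1; and since, given X_2 = k, each of the k arrivals lands in [0, 1] or [1, 2] with probability 1/2, one expects E[X_1 | X_2] = X_2 / 2. The code checks the latter empirically.

```python
# Sanity check for Homework 10 (sketch; not part of the assignment).
# Empirically estimate E[X_1 | X_2 = k] and compare with k/2.
import random
from collections import defaultdict

random.seed(0)

def poisson1() -> int:
    # Poisson(1) sample via Knuth's multiplication method
    k, p = 0, random.random()
    while p > 0.36787944117144233:  # e^{-1}
        k += 1
        p *= random.random()
    return k

acc = defaultdict(lambda: [0.0, 0])  # X_2 -> [running sum of X_1, count]
for _ in range(200_000):
    x1 = poisson1()           # X_1 ~ Poisson(1)
    x2 = x1 + poisson1()      # X_2 = X_1 + independent Poisson(1) increment
    acc[x2][0] += x1
    acc[x2][1] += 1

for k in range(1, 6):
    s, cnt = acc[k]
    print(f"X_2={k}: empirical E[X_1|X_2] = {s/cnt:.3f}, predicted k/2 = {k/2}")
```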

Homework 13. Let ξ_1, ξ_2, ... be standard normal variables. (Recall that in this case the moment generating function is M(θ) = E[e^{θ ξ_i}] = e^{θ²/2}.) Let a, b ∈ R and define

    S_n := Σ_{k=1}^{n} ξ_k  and  X_n := e^{a S_n − b n}.

Prove that
(a) X_n → 0 a.s. iff b > 0;
(b) X_n → 0 in L^r iff r < 2b/a².

Homework 14. Let S_n := X_1 + ... + X_n, where X_1, X_2, ... are iid with X_1 ~ Exp(1). Verify that

    M_n := n! e^{S_n} / (1 + S_n)^{n+1}

is a martingale w.r.t. the natural filtration F_n.

Third homework assignment. Due at 12:15 on 6 October 2016.

Homework 15. Prove that the following two definitions of a λ-system L are equivalent:

Definition 1. (a) Ω ∈ L. (b) If A, B ∈ L and A ⊂ B, then B \ A ∈ L. (c) If A_n ∈ L and A_n ↑ A (that is, A_n ⊂ A_{n+1} and A = ∪_{n=1}^∞ A_n), then A ∈ L.

Definition 2. (i) Ω ∈ L. (ii) If A ∈ L, then A^c ∈ L. (iii) If A_i ∩ A_j = ∅ for i ≠ j and A_i ∈ L, then ∪_{i=1}^∞ A_i ∈ L.

Homework 16. There are n white and n black balls in an urn. We pull out all of them one-by-one without replacement. Whenever we pull a black ball we have to pay 1$, and whenever we pull a white ball we receive 1$. Let X_0 := 0 and let X_i be the amount we have gained or lost after the i-th ball was pulled. We define

    Y_i := X_i / (2n − i)  for 1 ≤ i ≤ 2n − 1,

and

    Z_i := ( X_i² − (2n − i) ) / ( (2n − i)(2n − i − 1) )  for 1 ≤ i ≤ 2n − 2.

(a) Prove that Y = (Y_i) and Z = (Z_i) are martingales.
(b) Find Var(X_i).

Homework 17. Let X, Y be two independent Exp(λ) r.v. and Z := X + Y. Show that for any non-negative measurable h we have

    E[h(X) | Z] = (1/Z) ∫_0^Z h(t) dt.
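The identity in Homework 17 says that, conditionally on Z = X + Y, the variable X is uniform on [0, Z]. Below is a small Monte Carlo illustration of this (my own sketch; the test function h(t) = t², the rate λ = 2, and the crude conditioning window are all arbitrary choices): for h(t) = t² the right-hand side equals Z²/3.

```python
# Illustration for Homework 17 (sketch; parameters are my own choice).
# Conditionally on Z = X + Y, X should be Uniform(0, Z); with h(t) = t^2
# the claimed conditional expectation is Z^2 / 3.
import random

random.seed(1)
lam = 2.0
num, den = 0.0, 0
for _ in range(500_000):
    x = random.expovariate(lam)
    y = random.expovariate(lam)
    z = x + y
    if 0.9 < z < 1.1:          # crude conditioning on Z close to 1
        num += x * x
        den += 1

print(f"empirical E[X^2 | Z ~ 1] = {num/den:.4f}, predicted ~ 1/3 = {1/3:.4f}")
```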

Homework 18. Let X_1, X_2, ... be independent r.v. with

    P(X_n = n²) = 1/n²  and  P(X_n = −n²/(n² − 1)) = 1 − 1/n².

Let S_n := X_1 + ... + X_n. Show that
(a) lim_{n→∞} S_n / n = −1 a.s.;
(b) {S_n} is a martingale which converges to −∞ a.s.

Homework 19. (This homework was withdrawn this week and is reassigned next week; see Homework 20 below for the full statement.)

Fourth homework assignment. Due at 12:15 on 13 October 2016.

Homework 20 (reassigned from last week). A player's winnings per unit stake on game n are ξ_n, where {ξ_n}_{n=1}^∞ are iid r.v. with P(ξ_n = +1) = p and P(ξ_n = −1) = q := 1 − p, where, surprisingly, p ∈ (1/2, 1), that is, p > q. In other words, with probability q < 1/2 the player loses her stake and with probability p she gets back twice her stake. Let C_n be the player's stake on game n, so that the player's fortune satisfies Y_n = Y_{n−1} + C_n ξ_n, where Y_{n−1} is the fortune of the player at time n − 1. We assume C_n is previsible, that is, C_{n+1} is F_n := σ(ξ_1, ..., ξ_n)-measurable for all n, and we assume 0 ≤ C_n ≤ Y_{n−1}. We call α := p log p + q log q + log 2 the entropy. Prove that
(a) log Y_n − nα is a supermartingale; this means that the rate of winnings satisfies E[log Y_n − log Y_0] ≤ nα;
(b) there exists a strategy for which log Y_n − nα is a martingale.

Homework 21. Let X, Z be R^d-valued r.v. defined on (Ω, F, P). Assume that

    E[e^{i t·X + i s·Z}] = E[e^{i t·X}] E[e^{i s·Z}]  for all s, t ∈ R^d.

Prove that X and Z are independent.

Homework 22. Let X ~ N(µ, Σ) in R^n. Let X^{(1)} := (X_1, ..., X_p) and X^{(2)} := (X_{p+1}, ..., X_n), and let Σ, Σ_1 and Σ_2 be the covariance matrices of X, X^{(1)} and X^{(2)} respectively. Prove that X^{(1)} and X^{(2)} are independent if and only if

    Σ = ( Σ_1   0  )
        (  0   Σ_2 ).

Homework 23. Let Y ~ N(µ, Σ) and let B be a non-singular matrix. Find the distribution of X = B Y.

Homework 24. Let Y ~ N(µ, Σ) in R², Y = (Y_1, Y_2), and let a_1, a_2 ∈ R. Find the distribution of a_1 Y_1 + a_2 Y_2.

Fifth homework assignment. Due at 12:15 on 20 October 2016.

Homework 25. Construct a random vector (X, Y) such that both X and Y have one-dimensional normal distributions but (X, Y) is NOT bivariate normal.

Homework 26. Let X = (X_1, ..., X_d) have the standard multivariate normal distribution, let Σ be a d × d positive semi-definite, symmetric matrix, and let µ ∈ R^d. Prove that there exists an affine transformation T : R^d → R^d such that T(X) ~ N(µ, Σ).
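A numerical sketch for Homework 26 (illustration only, with a concrete µ and Σ of my own choosing): when Σ is positive definite, the natural candidate is T(x) = µ + A x with A Aᵀ = Σ, e.g. A the Cholesky factor; the merely semi-definite case needs a square root obtained differently, e.g. from the spectral decomposition.

```python
# Sketch for Homework 26 (illustration; mu and Sigma are my own choices).
# With A A^T = Sigma, the affine map T(x) = mu + A x sends a standard
# normal vector to N(mu, Sigma); we verify mean and covariance empirically.
import numpy as np

rng = np.random.default_rng(2)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])
A = np.linalg.cholesky(Sigma)           # works since this Sigma is positive definite

X = rng.standard_normal((100_000, 2))   # rows: iid standard normal vectors
TX = mu + X @ A.T                       # apply T row-wise

print("sample mean:      ", TX.mean(axis=0))
print("sample covariance:\n", np.cov(TX, rowvar=False))
```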

Homework 27. Let X be the height of the father and Y the height of the son in a sample of father-son pairs; then (X, Y) is bivariate normal. Assume E[X] = 68 (in inches), E[Y] = 69, σ_X = σ_Y = 2 and ρ = 0.5, where ρ is the correlation of (X, Y). Find the conditional distribution of Y given X = 80 (6 feet 8 inches). (That is, find the parameters µ and σ² in (Y | X = 80) ~ N(µ, σ²).) (Worked numbers follow Homework 33.)

Homework 28 (extension of part (iii) of Doob's optional stopping theorem). Let X be a supermartingale and let T be a stopping time with E[T] < ∞ (as in part (iii) of Doob's optional stopping theorem). Assume there is a C such that

    E[ |X_k − X_{k−1}| | F_{k−1} ](ω) ≤ C  for all k > 0 and for a.e. ω.

Prove that E[X_T] ≤ E[X_0].

Homework 29. Let X_n be a discrete time birth-death process with transition probabilities

    p(i, i+1) = p_i,  p(i, i−1) = 1 − p_i =: q_i,  p_0 = 1.

We define the following function:

    g : N → R_+,  g(k) := 1 + Σ_{j=1}^{k−1} Π_{i=1}^{j} (q_i / p_i).

(a) Prove that Z_n := g(X_n) is a martingale for the natural filtration.
(b) Let 0 < i < n be fixed. Find the probability that a process started from i gets to n earlier than to 0.

Sixth homework assignment. Due at 12:15 on 3 November 2016.

Homework 30. Let {ε_n} be an iid sequence of r.v. satisfying P(ε_n = ±1) = 1/2 and let {a_n} be real numbers. Show that Σ_n ε_n a_n converges almost surely iff Σ_n a_n² < ∞.

Homework 31. Let X = (X_n) be an L² random walk which is a martingale, and let σ² be the variance of the k-th increment Z_k := X_k − X_{k−1} (the same for all k). Prove that the quadratic variation is A_n = nσ².

Homework 32. Prove the assertion of Remark 5.6 from File C.

Homework 33. Let M = (M_n) be a martingale with M_0 = 0 and |M_k − M_{k−1}| < C for some C ∈ R. Let T ≥ 0 be a stopping time and assume E[T] < ∞. Let

    U_n := Σ_{k=1}^{n} (M_k − M_{k−1})² 1_{T ≥ k},   V_n := 2 Σ_{1 ≤ i < j ≤ n} (M_i − M_{i−1})(M_j − M_{j−1}) 1_{T ≥ j},

    U := Σ_{k=1}^{∞} (M_k − M_{k−1})² 1_{T ≥ k},   V := 2 Σ_{1 ≤ i < j < ∞} (M_i − M_{i−1})(M_j − M_{j−1}) 1_{T ≥ j}.

Prove that
(a) M²_{T∧n} = U_n + V_n and M²_T = U + V;
(b) further, if E[T²] < ∞, then lim_{n→∞} U_n = U a.s., E[U] < ∞ and E[V_n] = E[V] = 0;
(c) conclude that lim_{n→∞} E[M²_{T∧n}] = E[M²_T].
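Worked numbers for Homework 27 above (a sketch; it simply evaluates the standard conditioning formulas for the bivariate normal, E[Y | X = x] = µ_Y + ρ (σ_Y/σ_X)(x − µ_X) and Var(Y | X = x) = σ_Y² (1 − ρ²)):

```python
# Worked numbers for Homework 27 (sketch; uses the standard bivariate
# normal conditioning formulas stated in the lead-in above).
mu_X, mu_Y = 68.0, 69.0
sigma_X = sigma_Y = 2.0
rho, x = 0.5, 80.0

cond_mean = mu_Y + rho * (sigma_Y / sigma_X) * (x - mu_X)   # 69 + 0.5*1*12
cond_var = sigma_Y**2 * (1 - rho**2)                        # 4 * 0.75
print(f"(Y | X = {x:.0f}) ~ N({cond_mean:.1f}, {cond_var:.1f})")  # N(75.0, 3.0)
```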

Homework 34 (Wald identities). Let Y_1, Y_2, ... be iid r.v. with Y_i ∈ L¹. Let S_n := Y_1 + ... + Y_n and write µ := E[Y_i]. Given a stopping time T ≥ 1 satisfying E[T] < ∞, prove that

(a) E[S_T] = µ E[T].    (4)

(b) Further, assume the Y_i are bounded (there is a C ∈ R s.t. |Y_i| < C) and E[T²] < ∞. Writing σ² := Var(Y_i), then

    E[ (S_T − µT)² ] = σ² E[T].    (5)

Hint: introduce an appropriate martingale and apply the result of the previous exercise.

Seventh homework assignment. Due at 12:15 on 10 November 2016.

Homework 35 (Branching processes). You might want to recall what you have learned about branching processes. (See File C of the course "Stochastic Processes".) A branching process Z = (Z_n)_{n=0}^∞ is defined recursively by a given family of Z_+-valued iid r.v. {X_k^{(n)}}_{k,n=1}^∞ as follows:

    Z_0 := 1,  Z_{n+1} := X_1^{(n+1)} + ... + X_{Z_n}^{(n+1)},  n ≥ 0.

Let µ := E[X_k^{(n)}] and F_n := σ(Z_0, Z_1, ..., Z_n). We write f(s) for the generating function, that is,

    f(s) := Σ_{l=0}^{∞} p_l s^l,  where p_l := P(X_k^{(n)} = l) for any k, n.

Further, let

    {extinction} := {Z_n → 0} = {∃n : Z_n = 0}  and  {explosion} := {Z_n → ∞},

and let q := P(extinction). Recall that we learned that q is the smaller (if there are two) fixed point of f, that is, q is the smallest solution of f(q) = q. Prove that
(a) E[Z_n] = µ^n. Hint: use induction.
(b) For every s ≥ 0 we have E[s^{Z_{n+1}} | F_n] = f(s)^{Z_n}. Explain why it follows that q^{Z_n} is a martingale and that Z_∞ := lim_{n→∞} Z_n exists a.s. (with values in Z_+ ∪ {∞}).
(c) Let T := min{n : Z_n = 0}, with T = ∞ if Z_n > 0 for all n.
(d) Prove that q = E[q^{Z_T}] = E[q^{Z_∞} 1_{T=∞}] + E[q^{Z_T} 1_{T<∞}].
(e) Prove that E[q^{Z_∞} 1_{T=∞}] = 0.
(f) Conclude that if T(ω) = ∞ then Z_∞ = ∞.
(g) Prove that

    P(extinction) + P(explosion) = 1.    (6)
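A simulation sketch for part (a) of Homework 35 (my own illustration; the Poisson(1.5) offspring law is an arbitrary choice): since E[Z_n] = µ^n, the empirical mean of Z_n / µ^n should stay near 1 for every n.

```python
# Simulation sketch for Homework 35(a) (offspring law is my own choice).
# With Poisson(mu) offspring, E[Z_n] = mu^n, so Z_n / mu^n should average ~1.
import math
import random

random.seed(4)

def poisson(lam: float) -> int:
    # Poisson(lam) sample via Knuth's multiplication method
    L = math.exp(-lam)
    k, p = 0, random.random()
    while p > L:
        k += 1
        p *= random.random()
    return k

mu, N, trials = 1.5, 10, 5_000
totals = [0.0] * (N + 1)
for _ in range(trials):
    z = 1                                        # Z_0 = 1
    for n in range(1, N + 1):
        z = sum(poisson(mu) for _ in range(z))   # offspring of each individual
        totals[n] += z / mu**n

for n in (1, 5, 10):
    print(f"n={n:2d}  empirical E[Z_n]/mu^n = {totals[n]/trials:.3f}  (expect ~1)")
```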

Homework 36 (Branching processes cont.). Here we assume

    µ = E[X_k^{(n)}] < ∞  and  0 < σ² := Var(X_k^{(n)}) < ∞.

Prove that
(a) M_n := Z_n / µ^n is a martingale for the natural filtration F_n;
(b) E[Z²_{n+1} | F_n] = µ² Z_n² + σ² Z_n; conclude that M is bounded in L² iff µ > 1;
(c) if µ > 1, then M_∞ := lim_{n→∞} M_n exists (in L² and a.s.) and Var(M_∞) = σ² / (µ(µ − 1)).

Homework 37 (Branching processes cont.). Assume q = 1 (q was defined in Homework 35 as the probability of extinction). Prove that M_n = Z_n / µ^n is NOT a UI martingale.

Eighth homework assignment. Due at 12:15 on 24 November 2016.

Homework 38. Let X_1, X_2, ... be iid r.v. with a continuous distribution function. Let E_n be the event that a record occurs at time n, that is, E_1 := Ω and E_n := {X_n > X_m for all m < n}. Prove that the events {E_i}_{i=1}^∞ are independent and P(E_i) = 1/i.

Homework 39 (Continuation). Let E_1, E_2, ... be independent with P(E_i) = 1/i. Let Y_i := 1_{E_i} and N_n := Y_1 + ... + Y_n. (In the special case of the previous homework, N_n is the number of records until time n.) Prove that
(a) Σ_{k≥2} (Y_k − 1/k) / log k converges almost surely;
(b) using Kronecker's Lemma, conclude that lim_{n→∞} N_n / log n = 1 a.s.;
(c) apply this to the situation of the previous exercise to get an estimate on the number of records until time n. (A simulation sketch follows Homework 45.)

Homework 40. Let C be a class of r.v. on (Ω, F, P). Prove that the following assertions (1) and (2) are equivalent:
1. C is UI.
2. Both of the following two conditions hold:
(a) C is L¹-bounded, that is, A := sup{E[|X|] : X ∈ C} < ∞, AND
(b) for all ε > 0 there is a δ > 0 s.t. F ∈ F and P(F) < δ imply E[|X|; F] < ε for every X ∈ C.

Homework 41. Let C and D be UI classes of r.v. Prove that C + D := {X + Y : X ∈ C and Y ∈ D} is also UI. Hint: use the previous exercise.

Homework 42. Let C be a UI family of r.v. Let us define

    D := {Y : there exist X ∈ C and a sub-σ-algebra G of F s.t. Y = E[X | G]}.

Prove that D is also UI.

Ninth homework assignment. Due at 12:15 on 1 December 2016.

Homework 43. Let X_1, X_2, ... be iid r.v. with E[X^+] = ∞ and E[X^−] < ∞. (Recall X = X^+ − X^− and X^+, X^− ≥ 0.) Use the SLLN to prove that S_n / n → ∞ a.s., where S_n := X_1 + ... + X_n. Hint: for an M > 0 let X_i^M := X_i ∧ M and S_n^M := X_1^M + ... + X_n^M. Explain why lim_{n→∞} S_n^M / n = E[X_i^M] and lim inf_{n→∞} S_n / n ≥ lim_{n→∞} S_n^M / n.

Homework 44. Let X_1, X_2, ... be iid r.v. with E[|X_i|] < ∞. Prove that E[X_1 | S_n] = S_n / n. (This is intuitively trivial by symmetry, but prove it with formulas.)

Homework 45. Let C be a class of random variables on (Ω, F, P). Prove that the following condition implies that C is UI: there exist p > 1 and A ∈ R such that E[|X|^p] < A for all X ∈ C. (That is, C is L^p-bounded for some p > 1.)
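A simulation sketch for Homeworks 38-39 above (my own illustration; the sample size is an arbitrary choice): count the records in an iid uniform sample and compare N_n with log n. Note that E[N_n] = Σ_{i=1}^n 1/i ≈ log n + γ, so for moderate n the ratio N_n / log n is still noticeably above 1, as the output shows.

```python
# Record counting for Homeworks 38-39 (sketch; sample size is my choice).
import math
import random

random.seed(5)
n, trials, total = 10_000, 200, 0
for _ in range(trials):
    best = float("-inf")
    records = 0
    for _ in range(n):
        x = random.random()
        if x > best:            # a record occurs at this time
            best = x
            records += 1
    total += records

print(f"average N_n = {total/trials:.2f},  log n = {math.log(n):.2f}")
```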

Homework 46. Let C be a class of random variables on (Ω, F, P). Prove that the following condition implies that C is UI: there exists Y ∈ L¹(Ω, F, P) s.t. for all X ∈ C we have |X(ω)| ≤ Y(ω). (That is, C is dominated by an integrable non-negative r.v.)

Homework 47 (Infinite Monkey Theorem). Prove that a monkey typing at random on a typewriter for an infinite amount of time will eventually type the complete works of Shakespeare.

Tenth homework assignment. Due at 12:15 on 8 December 2016.

Homework 48 (Azuma-Hoeffding inequality).
(a) Assume Y is a r.v. which takes values in [−c, c] and satisfies E[Y] = 0. Prove that for all θ ∈ R we have

    E[e^{θY}] ≤ cosh(θc) ≤ exp( (1/2) θ² c² ).

Hint: let f(z) := exp(θz), z ∈ [−c, c]. Then by the convexity of f we have

    f(y) ≤ ((c − y)/(2c)) f(−c) + ((c + y)/(2c)) f(c).

(b) Let M be a martingale with M_0 = 0 such that for a sequence of positive numbers {c_n}_{n=1}^∞ we have |M_n − M_{n−1}| ≤ c_n for all n. Then the following inequality holds for all x > 0:

    P( sup_{k ≤ n} M_k ≥ x ) ≤ exp( −(1/2) x² / Σ_{k=1}^{n} c_k² ).

This exercise is from the Williams book. Hint: use the submartingale inequality as in the proof of the LIL. Then write M_n (in the exponent) as a telescoping sum and use the orthogonality of martingale increments. Use part (a), then minimize in θ the expression in the exponent. (A numerical illustration follows Homework 49.)

Homework 49 (Exercise for the Markov chain CLT). Consider the following Markov chain X = {X_n}_{n=0}^∞. The state space is Z and the transition probabilities are as follows: p(0, 1) = p(0, −1) = 1/2, and for an arbitrary x ∈ N^+ := N \ {0} we have

    p(x, x+1) = p(x, 0) = 1/2  and  p(−x, −x−1) = p(−x, 0) = 1/2.

(a) Find the stationary measure π for X.

Definitions. Define the operator P : L¹(Z, π) → L¹(Z, π) by (Pg)(i) := Σ_{j∈Z} p(i, j) g(j), and let I be the identity on L¹(Z, π). Basically, P is an infinite matrix, g is an infinite column vector, and the action of the operator P on g is the product of this infinite matrix with the infinite column vector, read off as (Pg)(i), i ∈ Z. Let f : Z → R be an arbitrary function satisfying the following conditions: f(−x) = −f(x) for all x ∈ Z, and there is an a < 2 s.t. |f(x)| < a^x for all x large enough. (For example: polynomials of the form f(x) = Σ_{i=1}^{n} b_{2i−1} x^{2i−1}.)

(b) Construct a U ∈ L²(π) such that ((I − P) U)(i) = f(i).

(c) From now on we always assume f(x) = x³. Determine

    σ² := E_π[ (U(X_1) − U(X_0) + f(X_0))² ].

(d) Prove that

    P( −3σ √n ≤ f(X_1) + ... + f(X_n) ≤ 3σ √n ) ≥ 0.99

for sufficiently large n.
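A numerical illustration for Homework 48(b) (my own sketch; the fair ±1 walk and all parameters are arbitrary choices): with c_k ≡ 1 the bound reads P(sup_{k≤n} M_k ≥ x) ≤ exp(−x²/(2n)), and the simulation shows the bound holds with room to spare.

```python
# Illustration for Homework 48(b) (sketch; parameters are my own choice).
# For a fair +-1 random walk, c_k = 1 and the bound is exp(-x^2 / (2n)).
import math
import random

random.seed(6)
n, x, trials = 400, 40, 20_000
hits = 0
for _ in range(trials):
    m = 0
    for _ in range(n):
        m += 1 if random.random() < 0.5 else -1
        if m >= x:              # the event {sup_{k<=n} M_k >= x} occurred
            hits += 1
            break

bound = math.exp(-x * x / (2 * n))
print(f"empirical P(sup M_k >= {x}) = {hits/trials:.4f},  Azuma bound = {bound:.4f}")
```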