Selected Exercises on Expectations and Some Probability Inequalities
#1 If E(X^2) = 1 and E|X| ≥ a > 0, then P(|X| ≥ λa) ≥ (1 − λ)^2 a^2 for 0 < λ < 1. [Compare with the Cauchy–Schwarz and Čebyšëv inequalities.]

#2 Show the identity: ∫_{−∞}^{∞} (exp(−exp(−(x + a))) − exp(−exp(−x))) dx = a. [A generalization is sometimes much easier.]

#3 If {X_n} is a sequence of identically distributed random variables with finite mean, then calculate lim_n (1/n) E(max_{j≤n} |X_j|).

#4 If X and Y are independent and, for some p > 0, E|X + Y|^p < ∞, then E|X|^p < ∞ and E|Y|^p < ∞.

#5 For arbitrary events {E_j}, show for each n ∈ ℕ:

P(∪_{j=1}^n E_j) ≥ Σ_{j=1}^n P(E_j) − Σ_{1≤j<k≤n} P(E_j ∩ E_k),   (Bonferroni)

P(∪_{j=1}^n E_j) ≥ (Σ_{j=1}^n P(E_j))^2 / E((Σ_{j=1}^n 1_{E_j})^2).

If for each n the events E_j^{(n)}, j = 1, …, n, are independent and P(∪_{j=1}^n E_j^{(n)}) → 0 as n → ∞, then P(∪_{j=1}^n E_j^{(n)}) ~ Σ_{j=1}^n P(E_j^{(n)}).

If Σ_j P(E_j) = ∞ and there exists c > 0 such that P(E_m ∩ E_n) ≤ c P(E_m)P(E_n) for any m < n, then P(lim sup_n E_n) > 0.

#6 Suppose (B(t)) is a one-dimensional Brownian motion. What is the distribution of ∫_0^t B(s) ds? Is it well defined?

#7 If E(Y^2 | F) = X^2 and E(Y | F) = X for two random variables X, Y, where F is a σ-field, then Y = X a.e.

#8 Suppose S is exponentially distributed, i.e. P(S > t) = e^{−t}, t > 0. Compute E(S | S ∧ t) and E(S | S ∨ t) for each t.

Selected Exercises on Martingales

#9 For integrable random variables, if E(X_{n+1} | X_1, …, X_n) = (1/n)(X_1 + ⋯ + X_n) =: Y_n, then Y_n is a martingale.
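Exercise #2 admits a quick numerical sanity check. The sketch below (grid bounds, step count, and the value a = 1.7 are my own choices, not part of the exercise) integrates the difference of shifted Gumbel distribution functions F(x) = exp(−e^{−x}) by the midpoint rule:

```python
import math

# Numerical sanity check of #2: the integral of F(x + a) - F(x) over the
# real line equals a, where F(x) = exp(-exp(-x)) is the Gumbel c.d.f.
# Truncation to [-40, 40] and the grid size are arbitrary choices.
def gumbel_cdf(x):
    return math.exp(-math.exp(-x))

def shifted_cdf_integral(a, lo=-40.0, hi=40.0, n=200_000):
    """Midpoint-rule approximation of the integral of F(x+a) - F(x)."""
    h = (hi - lo) / n
    return h * sum(gumbel_cdf(lo + (k + 0.5) * h + a)
                   - gumbel_cdf(lo + (k + 0.5) * h)
                   for k in range(n))

print(shifted_cdf_integral(1.7))  # close to 1.7
```

The same computation with any other distribution function of a mean-zero-shifted integrable law illustrates the generalization hinted at in the bracket.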
#10 Suppose X_n is a uniformly integrable submartingale. Then for any stopping time τ, show (i) X_{τ∧n} is a uniformly integrable submartingale, and (ii) EX_1 ≤ EX_τ ≤ sup_n EX_n.

#11 Consider a simple random walk X_0 = 0 and X_n = Σ_{j=1}^n ξ_j for n ≥ 1 with i.i.d. symmetric Bernoulli increments: P(ξ_j = ±1) = 1/2 for j ≥ 1. Define the moment generating function φ(θ) = E(e^{θξ_1}) = cosh θ and a stopping time T_a = inf{k ≥ 0 : X_k = a} of the filtration F_n = σ(X_j, j ≤ n). Show for any θ, (φ(θ))^{−n} e^{θX_n} is a martingale. Then show P(T_1 < ∞) = 1, E(e^{−λT_1}) = e^λ − √(e^{2λ} − 1), and calculate P(T_1 = 2m − 1), m ≥ 1. Moreover, P(T_a < T_{−b}) = b/(a + b) and E(T_a ∧ T_{−b}) = ab for a, b > 0. [What happens if P(ξ_1 = 1) = p = 1 − P(ξ_1 = −1) ≠ 1/2? How about the Brownian motion analogue?]

#12 If {X_n, F_n} is a square-integrable martingale, show lim_n X_n/(1 + f(⟨X⟩_n)) = 0 a.e. on {⟨X⟩_∞ = ∞} for every increasing function f : [0, ∞) → [0, ∞) with ∫_0^∞ du/(1 + f(u))^2 < ∞, where ⟨X⟩_n is the nondecreasing predictable process such that X_n^2 − ⟨X⟩_n becomes a martingale. [Is it true for Brownian motion?]

#13 (Azuma (1967)–Hoeffding (1963) inequality) If {X_n, F_n} is a martingale with increments ξ_n = X_n − X_{n−1} satisfying |ξ_n| ≤ r_n < ∞ a.s. for some positive sequence {r_n}, then show for λ > 0

P(|X_n| ≥ λ) ≤ 2 exp(−λ^2 / (2 Σ_{k=1}^n r_k^2)).

[Hint: Show e^{xy} ≤ cosh y + x sinh y for x ∈ [−1, 1] and fixed y > 0. Take x = ξ_k/r_k and y = t r_k to obtain E(e^{tξ_k} | F_{k−1}) ≤ cosh(t r_k) ≤ e^{t^2 r_k^2/2} for t ≥ 0. Finally, find a bound for E(e^{t Σ ξ_k}) and use P(X_n ≥ λ) ≤ E(e^{t Σ ξ_k})/e^{λt}. For an application to the longest common subsequence problem, see Steele (1997), p. 14.]

#14 Suppose X_1, X_2, … are independent positive non-degenerate random variables such that E(X_j) = 1 for j ≥ 1. Define M_0 = 1, M_n = Π_{j=1}^n X_j. Show M_n is a martingale and M_∞ = lim_n M_n exists in ℝ. [In this context the following are equivalent: (a) EM_∞ = 1; (b) lim_n E|M_n − M_∞| = 0; (c) {M_n} is uniformly integrable; (d) Π_j a_j > 0, where a_j = E√X_j; (e) Σ_j (1 − a_j) < ∞. This is due to Kakutani. An application of this result is the consistency of the likelihood ratio test.]
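The two exit facts of #11 can be checked by simulation. The sketch below (a Monte Carlo sanity check, not a proof; the barriers a = 3, b = 2 and the path count are my own choices) estimates P(T_a < T_{−b}) and E(T_a ∧ T_{−b}) for the symmetric walk:

```python
import random

# Monte Carlo check of the gambler's-ruin facts in #11: for the simple
# symmetric random walk, P(T_a < T_{-b}) = b/(a+b) and E(T_a ∧ T_{-b}) = ab.
random.seed(0)

def one_path(a, b):
    """Run the walk until it leaves (-b, a); return (hit a first?, steps taken)."""
    x, steps = 0, 0
    while -b < x < a:
        x += random.choice((-1, 1))
        steps += 1
    return x == a, steps

a, b, n_paths = 3, 2, 20_000
paths = [one_path(a, b) for _ in range(n_paths)]
p_hat = sum(hit for hit, _ in paths) / n_paths   # theory: b/(a+b) = 0.4
t_hat = sum(s for _, s in paths) / n_paths       # theory: a*b = 6
```

Rerunning with asymmetric increment probabilities p ≠ 1/2 shows how the answers change, which is the bracketed follow-up question.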
#15 Suppose {X_n, F_n} is a square-integrable martingale with X_0 = 0, and so is {X_n^2 − ⟨X⟩_n, F_n}. Here ⟨X⟩_n is the predictable nondecreasing sequence. Show that if E⟨X⟩_∞ < ∞, then E(sup_{n≤τ} |X_n|) ≤ 3 E(⟨X⟩_τ^{1/2}) for any stopping time τ. [This is valid if we replace ⟨X⟩ by [X]_n := Σ_{j=1}^n (X_j − X_{j−1})^2. Indeed X^2 − [X] is a martingale. For this quadratic variation process [X], derive the lower bound for E(sup_{k≤τ} |X_k|) (Davis inequality). Also see the Burkholder–Gundy inequality.]
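A simulation makes the constant in #15 concrete in its simplest instance (my own choice of instance): for the simple symmetric random walk ⟨X⟩_n = n, so the claim reads E(sup_{k≤n} |X_k|) ≤ 3√n. The sketch below estimates the left side by Monte Carlo:

```python
import random

# Numerical illustration (not a proof) of #15 for the simple symmetric
# random walk, where <X>_n = n and the bound becomes 3*sqrt(n).
random.seed(1)
n_steps, n_paths = 100, 5_000
running = 0.0
for _ in range(n_paths):
    x, peak = 0, 0
    for _ in range(n_steps):
        x += random.choice((-1, 1))
        peak = max(peak, abs(x))   # track sup_{k<=n} |X_k| along the path
    running += peak
lhs = running / n_paths            # Monte Carlo estimate of E sup_{k<=n} |X_k|
rhs = 3 * n_steps ** 0.5           # 3 * E(<X>_n^{1/2}) = 3*sqrt(n) = 30
```

The estimated left side sits well below the bound, consistent with the constant 3 not being tight for this walk.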
#16 (Fatou equation) Let X_0, X_1, … be a sequence of random variables adapted to a filtration {F_n}_{n=0}^∞ with E[sup_n |X_n|] < ∞. Denote by S the collection of stopping times of this filtration. Then show

lim sup_{τ∈S} EX_τ := inf_{σ∈S} sup_{τ≥σ, τ∈S} EX_τ = E(lim sup_n X_n).

#17 Suppose W is standard Brownian motion. Consider the dyadic rational partitions t_j^{(n)} = j t 2^{−n} of the interval [0, t]. Show

lim_n Σ_{j=1}^{2^n} (W_{t_j^{(n)}} − W_{t_{j−1}^{(n)}})^2 = t,   lim_n Σ_{j=1}^{2^n} |W_{t_j^{(n)}} − W_{t_{j−1}^{(n)}}| = ∞.

#18 Show that W_t − ∫_0^t (W_s/s) ds, 0 ≤ t ≤ 1, and W_t − ∫_0^t (W_1 − W_s)/(1 − s) ds, 0 ≤ t ≤ 1, are Brownian motions if W is standard Brownian motion.

#19 Let Φ be the cumulative distribution function of a standard normal random variable. Show that Φ(W_t/√(1 − t)), 0 ≤ t < 1, is a martingale, whereas Φ(|W_t|/√(1 − t)), 0 ≤ t < 1, is a submartingale.

#20 For a < 0 < b define T_b := inf{t ≥ 0 : W_t = b} and T_{ab} := inf{t ≥ 0 : W_t ∉ (a, b)} for standard Brownian motion W with W_0 = 0. Calculate P(W_{T_{ab}} = b), E(T_{ab}), P(T_b < ∞), E(T_b). [If W is a Brownian motion with drift, how will these be adjusted?] State the reflection principle. Show that

W̃_t := W_t for 0 ≤ t < T_b,   W̃_t := 2b − W_t for T_b ≤ t < ∞,

is a Brownian motion.
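The two limits of #17 are easy to see numerically. The sketch below (discretization level, seed, and t = 2 are my own choices) samples Brownian increments on a fine dyadic grid: the sum of squared increments concentrates near t, while the sum of absolute increments, of order √(2nt/π), diverges as the grid is refined:

```python
import math
import random

# Discretized sketch of #17: quadratic variation of Brownian motion over
# [0, t] equals t, while the total variation blows up with the mesh.
random.seed(2)
t, n = 2.0, 1 << 14                    # 2^14 dyadic subintervals of [0, t]
sd = math.sqrt(t / n)                  # each increment ~ N(0, t/n)
incs = [random.gauss(0.0, sd) for _ in range(n)]
quad_var = sum(d * d for d in incs)    # should be close to t = 2
total_var = sum(abs(d) for d in incs)  # of order sqrt(2*n*t/pi), large here
```

Doubling n roughly multiplies `total_var` by √2 while `quad_var` only tightens around t, which is the content of the exercise.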
#21 X_1, … i.i.d., EX_1 = µ ∈ (0, ∞). Define N_c = sup{n : S_n ≤ c n^α}, where c > 0 and 0 < α < 1. Then N_c^{1−α}/c → 1/µ a.s. as c → ∞.

#22 X_1, … independent, E(X_n^2) = 1 and EX_n = 0 for each n. Then lim_n Σ_{j=1}^n X_j / (n^{1/2}(log n)^{1/2+ε}) = 0 a.s.

#23 Y_n → 0 in probability if and only if Eg(Y_n) → 0, where g(x) = |x|/(1 + |x|).

#24 If X_n/b_n → 0 as n → ∞ for strictly increasing b_n ↑ ∞, then (1/b_n) max_{j≤n} |X_j| → 0.

#25 X_1, … i.i.d., E|X_1| < ∞, S_n = Σ_{j=1}^n X_j. For a > EX_1, show (i) lim_n P(S_n > na) = 0; (ii) P(S_n > na) P(S_m > ma) ≤ P(S_{n+m} > (m + n)a); (iii) P(S_n ≥ na) ≤ exp(nγ(a)), where γ(a) = lim_n (1/n) log P(S_n > na).

#26 X_i i.i.d., EX_1 = 0, EX_1^2 = 1, EX_1^4 = ν, 0 < ν < ∞. Suppose N(t)/t converges in probability to some positive constant c as t → ∞. Find the limit of Σ_{i,j≤N(t)} X_i X_j / N(t) as t → ∞.

#27 X_i i.i.d., E|X_1|^r < ∞ for some r ≥ 1. Then S_n/n converges to EX_1 almost surely and also in L^1.

#28 Show 2Ef(X_1+X_2)f(X_1+X_3) ≤ Ef^2(X_1+X_2) + E^2 f(X_1+X_2) for any bounded measurable function f and i.i.d. r.v. X_i, i = 1, 2, 3.

#29 f(x_1, x_2) is a bivariate normal density if and only if both conditional densities f(x_1 | x_2) and f(x_2 | x_1) are normal. Prove or disprove.

#30 {X_n, M_n} is a bivariate discrete Markov chain whose transition probabilities are defined by

P(X_{n+1} = j | M_{n+1}, M_n, X_n) = (M_{n+1} choose j) {(1+σ)X_n/(M_n + σX_n)}^j {(M_n − X_n)/(M_n + σX_n)}^{M_{n+1}−j}

for j = 0, …, M_{n+1}, σ > 0, and M_n ↑ ∞ as n → ∞. Show Y := lim_n Y_n, Y_n := X_n/M_n, exists and Y(1 − Y) = 0.

#31 Explain the relationship among (a) X_n → X in probability, (b) X_n → X a.s., (c) Σ_n P(|X_n − X| > ε) < ∞ for every ε > 0, (d) X_n → X in L^1.

#32 Suppose X_1, …, X_n, … i.i.d. Then E|X_1| < ∞ if and only if P(|X_n| > n i.o.) = 0.

#33 X_1, … i.i.d. with common distribution function F. Let F_n be the empirical distribution of X_1, …, X_n. Show that (a) P(|F_n(t) − F(t)| > x) ≤ 2e^{−Cnx^2} for some C > 0 and each t ∈ ℝ; (b) |F_n(t) − F(t)| = O([(log n)/n]^{1/2}) a.s. Is it true uniformly in t?
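For #33, the claimed scale is easy to observe empirically. The sketch below (F is taken to be the Uniform(0,1) distribution function and n = 10,000; both are my own choices) computes the Kolmogorov–Smirnov statistic sup_t |F_n(t) − F(t)| and compares it with the √(log n / n) rate from part (b):

```python
import math
import random

# Empirical illustration of #33 with F the Uniform(0,1) c.d.f.: the
# Kolmogorov-Smirnov statistic comes out on the O(sqrt(log n / n)) scale.
random.seed(3)
n = 10_000
xs = sorted(random.random() for _ in range(n))
# For a sorted sample, sup_t |F_n(t) - t| is attained at the sample points,
# just before or just after each jump of F_n:
ks_stat = max(max((i + 1) / n - x, x - i / n) for i, x in enumerate(xs))
scale = math.sqrt(math.log(n) / n)   # the rate from part (b), about 0.03
```

The observed statistic falls below this scale with overwhelming probability, exactly as the exponential bound in part (a) predicts.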
Some exercises on characteristic functions: in the following, φ_µ(ξ) denotes the characteristic function of a probability measure µ with distribution function F.

#34 Show that (sin(aξ/2)/(aξ/2))^2 and (1 − |ξ|/a) ∨ 0 are characteristic functions. What are the corresponding probability distributions?

#35 Denote by n_δ the density of the normal distribution with mean 0 and variance δ^2 and put f_δ(x) := (f * n_δ)(x) = ∫ f(x − y) n_δ(y) dy for a bounded uniformly continuous function f. Show that as δ → 0, f_δ → f uniformly on ℝ.

#36 Let X_j be independent random variables, each having the standard normal distribution. Find the characteristic function of Σ_{j=1}^n X_j^2.

#37 Let Re(y) be the real part of a complex number y. Show the following:

Re(1 − φ_µ(ξ)) ≥ (1/4) Re(1 − φ_µ(2ξ)),

∫_0^∞ (1 − Re φ_µ(ξ))/ξ^2 dξ = (π/2) ∫ |x| dµ(x).

#38 Show the inversion formula: for x_1 < x_2,

µ((x_1, x_2)) + (1/2) µ({x_1}) + (1/2) µ({x_2}) = lim_{T→∞} (1/2π) ∫_{−T}^{T} ((e^{−iξx_1} − e^{−iξx_2})/(iξ)) φ_µ(ξ) dξ.

#39 Show

∫ (1 − cos T(x − a))/[T(x − a)]^2 dµ(x) = (1/2T^2) ∫_{−T}^{T} (T − |ξ|) e^{−iξa} φ_µ(ξ) dξ for T > 0, a ∈ ℝ,

∫_0^α (F(x + u) − F(x − u)) du = (1/π) ∫ ((1 − cos αξ)/ξ^2) e^{−iξx} φ_µ(ξ) dξ for α > 0.

#40 Show that the distribution function F is continuous if and only if lim_{T→∞} (1/2T) ∫_{−T}^{T} |φ_µ(ξ)|^2 dξ = 0.

#41 Show the central limit theorem: for X_1, … i.i.d. with finite mean E(X_1) = µ and variance Var(X_1) = σ^2, (S_n − µn)/(σ√n) converges in distribution to the standard normal.

#42 φ_µ is a characteristic function if and only if it is positive definite, continuous at zero, and satisfies φ_µ(0) = 1.
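The first claim of #34 can be verified numerically once the corresponding distribution is identified. The sketch below (quadrature parameters and the test point (a, ξ) = (1.5, 2.3) are my own choices) integrates cos(ξx) against the triangular density (a − |x|)/a^2 on [−a, a], the density of a sum of two independent Uniform(−a/2, a/2) variables, and compares with the closed form:

```python
import math

# Check that (sin(a*xi/2)/(a*xi/2))^2 is the characteristic function of the
# triangular density (a - |x|)/a^2 on [-a, a], per the first part of #34.
def triangular_cf(a, xi, n=200_000):
    """Midpoint-rule value of the integral of cos(xi*x)*(a - |x|)/a^2 over [-a, a]."""
    h = 2 * a / n
    total = 0.0
    for k in range(n):
        x = -a + (k + 0.5) * h
        total += math.cos(xi * x) * (a - abs(x)) / (a * a)
    return total * h

a, xi = 1.5, 2.3
closed_form = (math.sin(a * xi / 2) / (a * xi / 2)) ** 2
```

The sine term alone, sin(aξ/2)/(aξ/2), is the characteristic function of Uniform(−a/2, a/2), so the square being a characteristic function is a special case of independence multiplying characteristic functions.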
#43 Let X and Y be independent, identically distributed with mean zero and unit variance. If X + Y and X − Y are independent, then the common distribution of X and Y is normal.

Markov property and Markov chains

#44 Let F_1, F_2, F_3 be Borel fields such that F_1 ∨ F_2 is independent of F_3. Show for each integrable X ∈ F_1, E(X | F_2 ∨ F_3) = E(X | F_2).

#45 Suppose X_1 and X_2 are independent r.v. with probability measures µ and ν. Show for each B ∈ B(ℝ), P(X_1 + X_2 ∈ B | X_1) = ν(B − X_1).

#46 The Markov property P(X_{n+1} ∈ B | X_0, …, X_n) = P(X_{n+1} ∈ B | X_n) for every n = 0, 1, 2, … and each B ∈ B(ℝ) is equivalent to E(Y | X_1, …, X_n) = E(Y | X_n) for any integrable Y ∈ σ(X_{n+1}).

#47 Consider a Markov chain whose state space consists of the integers i = 0, ±1, ±2, … and whose transition probabilities are P_{i,i+1} = P(X_{n+1} = i + 1 | X_n = i) = p = 1 − P(X_{n+1} = i − 1 | X_n = i) = 1 − P_{i,i−1} for each n = 0, 1, …, with X_0 = 0. (a) Show the chain is recurrent when p = 1/2 and transient when p ≠ 1/2. (b) Calculate the probability that the chain ever returns to state 0 when p ≠ 1/2.

#48 Consider a Markov chain with states 0, 1, …, n with P_{0,1} = 1, P_{i,i+1} = p and P_{i,i−1} = q = 1 − p for 1 ≤ i ≤ n. Let N_{0,n} be the number of transitions that it takes the chain to go from state 0 to state n. Calculate E(N_{0,n}) and Var(N_{0,n}) for each p.

#49 A transition probability matrix (P_{ij}) with M + 1 states 0, 1, …, M is said to be doubly stochastic if the sum over each column and row equals one: that is, Σ_{i=0}^{M} P_{ij} = 1 for all j and Σ_{j=0}^{M} P_{ij} = 1 for all i. If such a chain is irreducible and aperiodic, then the limiting probabilities are given by π_j = 1/(M + 1) for j = 0, …, M.

#50 Consider an arbitrary connected graph having a number w_{ij} associated with the arc (i, j) for each arc. If (i, j) is not an arc, assign w_{ij} = 0. A particle moves from node to node with transition probability P_{ij} = w_{ij}/Σ_j w_{ij} if (i, j) is an arc and zero otherwise. Calculate the limiting probabilities π_j.
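The uniform limit in #49 is easy to observe on a concrete chain. The sketch below (the 3-state matrix is my own example) iterates the distribution update dist ← dist P for an irreducible, aperiodic, doubly stochastic chain and watches it converge to π_j = 1/(M + 1) = 1/3:

```python
# Small illustration of #49: a doubly stochastic, irreducible, aperiodic
# chain on M + 1 = 3 states has uniform limiting probabilities 1/3.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.5, 0.3],
     [0.3, 0.2, 0.5]]           # every row and every column sums to 1

dist = [1.0, 0.0, 0.0]          # start deterministically in state 0
for _ in range(200):            # repeated one-step updates: dist <- dist * P
    dist = [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]
```

That the uniform vector is stationary follows directly from the column sums being one; irreducibility and aperiodicity are what make it the limit from every starting distribution.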
P (A G) dp G P (A G)
First homework assignment. Due at 12:15 on 22 September 2016. Homework 1. We roll two dices. X is the result of one of them and Z the sum of the results. Find E [X Z. Homework 2. Let X be a r.v.. Assume
More informationMath 6810 (Probability) Fall Lecture notes
Math 6810 (Probability) Fall 2012 Lecture notes Pieter Allaart University of North Texas September 23, 2012 2 Text: Introduction to Stochastic Calculus with Applications, by Fima C. Klebaner (3rd edition),
More informationTheory and Applications of Stochastic Systems Lecture Exponential Martingale for Random Walk
Instructor: Victor F. Araman December 4, 2003 Theory and Applications of Stochastic Systems Lecture 0 B60.432.0 Exponential Martingale for Random Walk Let (S n : n 0) be a random walk with i.i.d. increments
More information1. Stochastic Processes and filtrations
1. Stochastic Processes and 1. Stoch. pr., A stochastic process (X t ) t T is a collection of random variables on (Ω, F) with values in a measurable space (S, S), i.e., for all t, In our case X t : Ω S
More informationBrownian Motion. 1 Definition Brownian Motion Wiener measure... 3
Brownian Motion Contents 1 Definition 2 1.1 Brownian Motion................................. 2 1.2 Wiener measure.................................. 3 2 Construction 4 2.1 Gaussian process.................................
More informationSTAT 200C: High-dimensional Statistics
STAT 200C: High-dimensional Statistics Arash A. Amini May 30, 2018 1 / 59 Classical case: n d. Asymptotic assumption: d is fixed and n. Basic tools: LLN and CLT. High-dimensional setting: n d, e.g. n/d
More informationPROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS
PROBABILITY: LIMIT THEOREMS II, SPRING 218. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please
More informationUseful Probability Theorems
Useful Probability Theorems Shiu-Tang Li Finished: March 23, 2013 Last updated: November 2, 2013 1 Convergence in distribution Theorem 1.1. TFAE: (i) µ n µ, µ n, µ are probability measures. (ii) F n (x)
More informationLecture 2. We now introduce some fundamental tools in martingale theory, which are useful in controlling the fluctuation of martingales.
Lecture 2 1 Martingales We now introduce some fundamental tools in martingale theory, which are useful in controlling the fluctuation of martingales. 1.1 Doob s inequality We have the following maximal
More informationPreliminary Exam: Probability 9:00am 2:00pm, Friday, January 6, 2012
Preliminary Exam: Probability 9:00am 2:00pm, Friday, January 6, 202 The exam lasts from 9:00am until 2:00pm, with a walking break every hour. Your goal on this exam should be to demonstrate mastery of
More informationStochastic Calculus and Black-Scholes Theory MTH772P Exercises Sheet 1
Stochastic Calculus and Black-Scholes Theory MTH772P Exercises Sheet. For ξ, ξ 2, i.i.d. with P(ξ i = ± = /2 define the discrete-time random walk W =, W n = ξ +... + ξ n. (i Formulate and prove the property
More informationUniversal examples. Chapter The Bernoulli process
Chapter 1 Universal examples 1.1 The Bernoulli process First description: Bernoulli random variables Y i for i = 1, 2, 3,... independent with P [Y i = 1] = p and P [Y i = ] = 1 p. Second description: Binomial
More informationMarch 1, Florida State University. Concentration Inequalities: Martingale. Approach and Entropy Method. Lizhe Sun and Boning Yang.
Florida State University March 1, 2018 Framework 1. (Lizhe) Basic inequalities Chernoff bounding Review for STA 6448 2. (Lizhe) Discrete-time martingales inequalities via martingale approach 3. (Boning)
More informationExercise Exercise Homework #6 Solutions Thursday 6 April 2006
Unless otherwise stated, for the remainder of the solutions, define F m = σy 0,..., Y m We will show EY m = EY 0 using induction. m = 0 is obviously true. For base case m = : EY = EEY Y 0 = EY 0. Now assume
More informationSolutions to the Exercises in Stochastic Analysis
Solutions to the Exercises in Stochastic Analysis Lecturer: Xue-Mei Li 1 Problem Sheet 1 In these solution I avoid using conditional expectations. But do try to give alternative proofs once we learnt conditional
More informationn! (k 1)!(n k)! = F (X) U(0, 1). (x, y) = n(n 1) ( F (y) F (x) ) n 2
Order statistics Ex. 4. (*. Let independent variables X,..., X n have U(0, distribution. Show that for every x (0,, we have P ( X ( < x and P ( X (n > x as n. Ex. 4.2 (**. By using induction or otherwise,
More informationExercises. T 2T. e ita φ(t)dt.
Exercises. Set #. Construct an example of a sequence of probability measures P n on R which converge weakly to a probability measure P but so that the first moments m,n = xdp n do not converge to m = xdp.
More informationSolution for Problem 7.1. We argue by contradiction. If the limit were not infinite, then since τ M (ω) is nondecreasing we would have
362 Problem Hints and Solutions sup g n (ω, t) g(ω, t) sup g(ω, s) g(ω, t) µ n (ω). t T s,t: s t 1/n By the uniform continuity of t g(ω, t) on [, T], one has for each ω that µ n (ω) as n. Two applications
More informationADVANCED PROBABILITY: SOLUTIONS TO SHEET 1
ADVANCED PROBABILITY: SOLUTIONS TO SHEET 1 Last compiled: November 6, 213 1. Conditional expectation Exercise 1.1. To start with, note that P(X Y = P( c R : X > c, Y c or X c, Y > c = P( c Q : X > c, Y
More informationExercises in stochastic analysis
Exercises in stochastic analysis Franco Flandoli, Mario Maurelli, Dario Trevisan The exercises with a P are those which have been done totally or partially) in the previous lectures; the exercises with
More informationModern Discrete Probability Branching processes
Modern Discrete Probability IV - Branching processes Review Sébastien Roch UW Madison Mathematics November 15, 2014 1 Basic definitions 2 3 4 Galton-Watson branching processes I Definition A Galton-Watson
More informationBrownian Motion and Stochastic Calculus
ETHZ, Spring 17 D-MATH Prof Dr Martin Larsson Coordinator A Sepúlveda Brownian Motion and Stochastic Calculus Exercise sheet 6 Please hand in your solutions during exercise class or in your assistant s
More informationMATH 6605: SUMMARY LECTURE NOTES
MATH 6605: SUMMARY LECTURE NOTES These notes summarize the lectures on weak convergence of stochastic processes. If you see any typos, please let me know. 1. Construction of Stochastic rocesses A stochastic
More information4 Sums of Independent Random Variables
4 Sums of Independent Random Variables Standing Assumptions: Assume throughout this section that (,F,P) is a fixed probability space and that X 1, X 2, X 3,... are independent real-valued random variables
More informationn! (k 1)!(n k)! = F (X) U(0, 1). (x, y) = n(n 1) ( F (y) F (x) ) n 2
Order statistics Ex. 4.1 (*. Let independent variables X 1,..., X n have U(0, 1 distribution. Show that for every x (0, 1, we have P ( X (1 < x 1 and P ( X (n > x 1 as n. Ex. 4.2 (**. By using induction
More informationBrownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539
Brownian motion Samy Tindel Purdue University Probability Theory 2 - MA 539 Mostly taken from Brownian Motion and Stochastic Calculus by I. Karatzas and S. Shreve Samy T. Brownian motion Probability Theory
More informationµ X (A) = P ( X 1 (A) )
1 STOCHASTIC PROCESSES This appendix provides a very basic introduction to the language of probability theory and stochastic processes. We assume the reader is familiar with the general measure and integration
More informationA D VA N C E D P R O B A B I L - I T Y
A N D R E W T U L L O C H A D VA N C E D P R O B A B I L - I T Y T R I N I T Y C O L L E G E T H E U N I V E R S I T Y O F C A M B R I D G E Contents 1 Conditional Expectation 5 1.1 Discrete Case 6 1.2
More informationBrownian Motion and Conditional Probability
Math 561: Theory of Probability (Spring 2018) Week 10 Brownian Motion and Conditional Probability 10.1 Standard Brownian Motion (SBM) Brownian motion is a stochastic process with both practical and theoretical
More informationStochastic integration. P.J.C. Spreij
Stochastic integration P.J.C. Spreij this version: April 22, 29 Contents 1 Stochastic processes 1 1.1 General theory............................... 1 1.2 Stopping times...............................
More informationLecture 21 Representations of Martingales
Lecture 21: Representations of Martingales 1 of 11 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 21 Representations of Martingales Right-continuous inverses Let
More informationPROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS
PROBABILITY: LIMIT THEOREMS II, SPRING 15. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please
More informationProblem Sheet 1. You may assume that both F and F are σ-fields. (a) Show that F F is not a σ-field. (b) Let X : Ω R be defined by 1 if n = 1
Problem Sheet 1 1. Let Ω = {1, 2, 3}. Let F = {, {1}, {2, 3}, {1, 2, 3}}, F = {, {2}, {1, 3}, {1, 2, 3}}. You may assume that both F and F are σ-fields. (a) Show that F F is not a σ-field. (b) Let X :
More informationProbability Theory. Richard F. Bass
Probability Theory Richard F. Bass ii c Copyright 2014 Richard F. Bass Contents 1 Basic notions 1 1.1 A few definitions from measure theory............. 1 1.2 Definitions............................. 2
More informationCHAPTER 3: LARGE SAMPLE THEORY
CHAPTER 3 LARGE SAMPLE THEORY 1 CHAPTER 3: LARGE SAMPLE THEORY CHAPTER 3 LARGE SAMPLE THEORY 2 Introduction CHAPTER 3 LARGE SAMPLE THEORY 3 Why large sample theory studying small sample property is usually
More informationLecture 5. 1 Chung-Fuchs Theorem. Tel Aviv University Spring 2011
Random Walks and Brownian Motion Tel Aviv University Spring 20 Instructor: Ron Peled Lecture 5 Lecture date: Feb 28, 20 Scribe: Yishai Kohn In today's lecture we return to the Chung-Fuchs theorem regarding
More informationPart IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More informationPart IA Probability. Theorems. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Theorems Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More information4 Expectation & the Lebesgue Theorems
STA 205: Probability & Measure Theory Robert L. Wolpert 4 Expectation & the Lebesgue Theorems Let X and {X n : n N} be random variables on a probability space (Ω,F,P). If X n (ω) X(ω) for each ω Ω, does
More informationQualifying Exam in Probability and Statistics. https://www.soa.org/files/edu/edu-exam-p-sample-quest.pdf
Part : Sample Problems for the Elementary Section of Qualifying Exam in Probability and Statistics https://www.soa.org/files/edu/edu-exam-p-sample-quest.pdf Part 2: Sample Problems for the Advanced Section
More informationSTA205 Probability: Week 8 R. Wolpert
INFINITE COIN-TOSS AND THE LAWS OF LARGE NUMBERS The traditional interpretation of the probability of an event E is its asymptotic frequency: the limit as n of the fraction of n repeated, similar, and
More informationProbability and Measure
Part II Year 2018 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2018 84 Paper 4, Section II 26J Let (X, A) be a measurable space. Let T : X X be a measurable map, and µ a probability
More informationAlbert N. Shiryaev Steklov Mathematical Institute. On sharp maximal inequalities for stochastic processes
Albert N. Shiryaev Steklov Mathematical Institute On sharp maximal inequalities for stochastic processes joint work with Yaroslav Lyulko, Higher School of Economics email: albertsh@mi.ras.ru 1 TOPIC I:
More information6. Brownian Motion. Q(A) = P [ ω : x(, ω) A )
6. Brownian Motion. stochastic process can be thought of in one of many equivalent ways. We can begin with an underlying probability space (Ω, Σ, P) and a real valued stochastic process can be defined
More informationMASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 7 9/25/2013
MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.65/15.070J Fall 013 Lecture 7 9/5/013 The Reflection Principle. The Distribution of the Maximum. Brownian motion with drift Content. 1. Quick intro to stopping times.
More informationDoléans measures. Appendix C. C.1 Introduction
Appendix C Doléans measures C.1 Introduction Once again all random processes will live on a fixed probability space (Ω, F, P equipped with a filtration {F t : 0 t 1}. We should probably assume the filtration
More informationMS&E 321 Spring Stochastic Systems June 1, 2013 Prof. Peter W. Glynn Page 1 of 10. x n+1 = f(x n ),
MS&E 321 Spring 12-13 Stochastic Systems June 1, 2013 Prof. Peter W. Glynn Page 1 of 10 Section 4: Steady-State Theory Contents 4.1 The Concept of Stochastic Equilibrium.......................... 1 4.2
More informationELEMENTS OF PROBABILITY THEORY
ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable
More informationChapter 1. Poisson processes. 1.1 Definitions
Chapter 1 Poisson processes 1.1 Definitions Let (, F, P) be a probability space. A filtration is a collection of -fields F t contained in F such that F s F t whenever s
More information(A n + B n + 1) A n + B n
344 Problem Hints and Solutions Solution for Problem 2.10. To calculate E(M n+1 F n ), first note that M n+1 is equal to (A n +1)/(A n +B n +1) with probability M n = A n /(A n +B n ) and M n+1 equals
More informationStochastic Calculus. Michael R. Tehranchi
Stochastic Calculus Michael R. Tehranchi Contents Chapter 1. A possible motivation: diffusions 5 1. Markov chains 5 2. Continuous-time Markov processes 6 3. Stochastic differential equations 6 4. Markov
More informationLecture 17 Brownian motion as a Markov process
Lecture 17: Brownian motion as a Markov process 1 of 14 Course: Theory of Probability II Term: Spring 2015 Instructor: Gordan Zitkovic Lecture 17 Brownian motion as a Markov process Brownian motion is
More informationJump Processes. Richard F. Bass
Jump Processes Richard F. Bass ii c Copyright 214 Richard F. Bass Contents 1 Poisson processes 1 1.1 Definitions............................. 1 1.2 Stopping times.......................... 3 1.3 Markov
More informationAdvanced Probability Theory (Math541)
Advanced Probability Theory (Math541) Instructor: Kani Chen (Classic)/Modern Probability Theory (1900-1960) Instructor: Kani Chen (HKUST) Advanced Probability Theory (Math541) 1 / 17 Primitive/Classic
More informationLecture 21: Convergence of transformations and generating a random variable
Lecture 21: Convergence of transformations and generating a random variable If Z n converges to Z in some sense, we often need to check whether h(z n ) converges to h(z ) in the same sense. Continuous
More informationP i [B k ] = lim. n=1 p(n) ii <. n=1. V i :=
2.7. Recurrence and transience Consider a Markov chain {X n : n N 0 } on state space E with transition matrix P. Definition 2.7.1. A state i E is called recurrent if P i [X n = i for infinitely many n]
More informationConcentration inequalities and the entropy method
Concentration inequalities and the entropy method Gábor Lugosi ICREA and Pompeu Fabra University Barcelona what is concentration? We are interested in bounding random fluctuations of functions of many
More informationSTAT331 Lebesgue-Stieltjes Integrals, Martingales, Counting Processes
STAT331 Lebesgue-Stieltjes Integrals, Martingales, Counting Processes This section introduces Lebesgue-Stieltjes integrals, and defines two important stochastic processes: a martingale process and a counting
More informationStochastic Process (ENPC) Monday, 22nd of January 2018 (2h30)
Stochastic Process (NPC) Monday, 22nd of January 208 (2h30) Vocabulary (english/français) : distribution distribution, loi ; positive strictement positif ; 0,) 0,. We write N Z,+ and N N {0}. We use the
More informationECE534, Spring 2018: Solutions for Problem Set #4 Due Friday April 6, 2018
ECE534, Spring 2018: s for Problem Set #4 Due Friday April 6, 2018 1. MMSE Estimation, Data Processing and Innovations The random variables X, Y, Z on a common probability space (Ω, F, P ) are said to
More informationLectures 16-17: Poisson Approximation. Using Lemma (2.4.3) with θ = 1 and then Lemma (2.4.4), which is valid when max m p n,m 1/2, we have
Lectures 16-17: Poisson Approximation 1. The Law of Rare Events Theorem 2.6.1: For each n 1, let X n,m, 1 m n be a collection of independent random variables with PX n,m = 1 = p n,m and PX n,m = = 1 p
More informationLecture 5. If we interpret the index n 0 as time, then a Markov chain simply requires that the future depends only on the present and not on the past.
1 Markov chain: definition Lecture 5 Definition 1.1 Markov chain] A sequence of random variables (X n ) n 0 taking values in a measurable state space (S, S) is called a (discrete time) Markov chain, if
More information1 Stat 605. Homework I. Due Feb. 1, 2011
The first part is homework which you need to turn in. The second part is exercises that will not be graded, but you need to turn it in together with the take-home final exam. 1 Stat 605. Homework I. Due
More informationMS&E 321 Spring Stochastic Systems June 1, 2013 Prof. Peter W. Glynn Page 1 of 7
MS&E 321 Spring 12-13 Stochastic Systems June 1, 213 Prof. Peter W. Glynn Page 1 of 7 Section 9: Renewal Theory Contents 9.1 Renewal Equations..................................... 1 9.2 Solving the Renewal
More informationMathematical Methods for Computer Science
Mathematical Methods for Computer Science Computer Science Tripos, Part IB Michaelmas Term 2016/17 R.J. Gibbens Problem sheets for Probability methods William Gates Building 15 JJ Thomson Avenue Cambridge
More information) ) = γ. and P ( X. B(a, b) = Γ(a)Γ(b) Γ(a + b) ; (x + y, ) I J}. Then, (rx) a 1 (ry) b 1 e (x+y)r r 2 dxdy Γ(a)Γ(b) D
3 Independent Random Variables II: Examples 3.1 Some functions of independent r.v. s. Let X 1, X 2,... be independent r.v. s with the known distributions. Then, one can compute the distribution of a r.v.
More informationlim n C1/n n := ρ. [f(y) f(x)], y x =1 [f(x) f(y)] [g(x) g(y)]. (x,y) E A E(f, f),
1 Part I Exercise 1.1. Let C n denote the number of self-avoiding random walks starting at the origin in Z of length n. 1. Show that (Hint: Use C n+m C n C m.) lim n C1/n n = inf n C1/n n := ρ.. Show that
More informationMartingale Theory for Finance
Martingale Theory for Finance Tusheng Zhang October 27, 2015 1 Introduction 2 Probability spaces and σ-fields 3 Integration with respect to a probability measure. 4 Conditional expectation. 5 Martingales.
More informationJUSTIN HARTMANN. F n Σ.
BROWNIAN MOTION JUSTIN HARTMANN Abstract. This paper begins to explore a rigorous introduction to probability theory using ideas from algebra, measure theory, and other areas. We start with a basic explanation
More informationFormulas for probability theory and linear models SF2941
Formulas for probability theory and linear models SF2941 These pages + Appendix 2 of Gut) are permitted as assistance at the exam. 11 maj 2008 Selected formulae of probability Bivariate probability Transforms
More informationMAT 135B Midterm 1 Solutions
MAT 35B Midterm Solutions Last Name (PRINT): First Name (PRINT): Student ID #: Section: Instructions:. Do not open your test until you are told to begin. 2. Use a pen to print your name in the spaces above.
More informationLecture 12. F o s, (1.1) F t := s>t
Lecture 12 1 Brownian motion: the Markov property Let C := C(0, ), R) be the space of continuous functions mapping from 0, ) to R, in which a Brownian motion (B t ) t 0 almost surely takes its value. Let
More informationFundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales
Fundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales Prakash Balachandran Department of Mathematics Duke University April 2, 2008 1 Review of Discrete-Time
More information18.440: Lecture 28 Lectures Review
18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is
More informationis well-defined, and is distributed according to a Cauchy law If t 0, compute P( X t Y ).
MA 539 - PROBLEM LIST DISCRETE TIME PROCESSES AND BROWNIAN MOTION Problem 1. Let γ a,b be the function: 1. Gaussian vectors and CLT γ a,b (x) = 1 Γ(a)b a xa 1 e x/b 1 {x>}, where a, b > and Γ(x) = e t
More informationI forgot to mention last time: in the Ito formula for two standard processes, putting
I forgot to mention last time: in the Ito formula for two standard processes, putting dx t = a t dt + b t db t dy t = α t dt + β t db t, and taking f(x, y = xy, one has f x = y, f y = x, and f xx = f yy
More informationErgodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R.
Ergodic Theorems Samy Tindel Purdue University Probability Theory 2 - MA 539 Taken from Probability: Theory and examples by R. Durrett Samy T. Ergodic theorems Probability Theory 1 / 92 Outline 1 Definitions
More informationX n D X lim n F n (x) = F (x) for all x C F. lim n F n(u) = F (u) for all u C F. (2)
14:17 11/16/2 TOPIC. Convergence in distribution and related notions. This section studies the notion of the so-called convergence in distribution of real random variables. This is the kind of convergence
More information1. Let X and Y be independent exponential random variables with rate α. Find the densities of the random variables X 3, X Y, min(x, Y 3 )
1 Introduction These problems are meant to be practice problems for you to see if you have understood the material reasonably well. They are neither exhaustive (e.g. Diffusions, continuous time branching
More informationExercises Measure Theoretic Probability
Exercises Measure Theoretic Probability 2002-2003 Week 1 1. Prove the folloing statements. (a) The intersection of an arbitrary family of d-systems is again a d- system. (b) The intersection of an arbitrary
More informationChapter 5. Weak convergence
Chapter 5 Weak convergence We will see later that if the X i are i.i.d. with mean zero and variance one, then S n / p n converges in the sense P(S n / p n 2 [a, b])! P(Z 2 [a, b]), where Z is a standard
More informationBayesian Methods with Monte Carlo Markov Chains II
Bayesian Methods with Monte Carlo Markov Chains II Henry Horng-Shing Lu Institute of Statistics National Chiao Tung University hslu@stat.nctu.edu.tw http://tigpbp.iis.sinica.edu.tw/courses.htm 1 Part 3
More informationLecture 1 Measure concentration
CSE 29: Learning Theory Fall 2006 Lecture Measure concentration Lecturer: Sanjoy Dasgupta Scribe: Nakul Verma, Aaron Arvey, and Paul Ruvolo. Concentration of measure: examples We start with some examples
More informationVerona Course April Lecture 1. Review of probability
Verona Course April 215. Lecture 1. Review of probability Viorel Barbu Al.I. Cuza University of Iaşi and the Romanian Academy A probability space is a triple (Ω, F, P) where Ω is an abstract set, F is
More information18.440: Lecture 28 Lectures Review
18.440: Lecture 28 Lectures 17-27 Review Scott Sheffield MIT 1 Outline Continuous random variables Problems motivated by coin tossing Random variable properties 2 Outline Continuous random variables Problems
More informationOn a class of stochastic differential equations in a financial network model
1 On a class of stochastic differential equations in a financial network model Tomoyuki Ichiba Department of Statistics & Applied Probability, Center for Financial Mathematics and Actuarial Research, University
More informationStochastic Models (Lecture #4)
Stochastic Models (Lecture #4) Thomas Verdebout Université libre de Bruxelles (ULB) Today Today, our goal will be to discuss limits of sequences of rv, and to study famous limiting results. Convergence
More informationLet (Ω, F) be a measureable space. A filtration in discrete time is a sequence of. F s F t
2.2 Filtrations Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of σ algebras {F t } such that F t F and F t F t+1 for all t = 0, 1,.... In continuous time, the second condition
More informationStochastic Processes II/ Wahrscheinlichkeitstheorie III. Lecture Notes
BMS Basic Course Stochastic Processes II/ Wahrscheinlichkeitstheorie III Michael Scheutzow Lecture Notes Technische Universität Berlin Sommersemester 218 preliminary version October 12th 218 Contents
More information1 Exercises for lecture 1
1 Exercises for lecture 1 Exercise 1 a) Show that if F is symmetric with respect to µ, and E( X )
More informationOutline. Martingales. Piotr Wojciechowski 1. 1 Lane Department of Computer Science and Electrical Engineering West Virginia University.
Outline Piotr 1 1 Lane Department of Computer Science and Electrical Engineering West Virginia University 8 April, 01 Outline Outline 1 Tail Inequalities Outline Outline 1 Tail Inequalities General Outline
More informationQualifying Exam in Probability and Statistics. https://www.soa.org/files/edu/edu-exam-p-sample-quest.pdf
Part 1: Sample Problems for the Elementary Section of Qualifying Exam in Probability and Statistics https://www.soa.org/files/edu/edu-exam-p-sample-quest.pdf Part 2: Sample Problems for the Advanced Section
More informationCOLORING A d 3 DIMENSIONAL LATTICE WITH TWO INDEPENDENT RANDOM WALKS. Lee Dicker A THESIS. Mathematics. Philosophy
COLORING A d 3 DIMENSIONAL LATTICE WITH TWO INDEPENDENT RANDOM WALKS Lee Dicker A THESIS in Mathematics Presented to the Faculties of the University of Pennsylvania in Partial Fulfillment of the Requirements
More informationProbability Models. 4. What is the definition of the expectation of a discrete random variable?
1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions
More informationZdzis law Brzeźniak and Tomasz Zastawniak
Basic Stochastic Processes by Zdzis law Brzeźniak and Tomasz Zastawniak Springer-Verlag, London 1999 Corrections in the 2nd printing Version: 21 May 2005 Page and line numbers refer to the 2nd printing
More informationUniformly Uniformly-ergodic Markov chains and BSDEs
Uniformly Uniformly-ergodic Markov chains and BSDEs Samuel N. Cohen Mathematical Institute, University of Oxford (Based on joint work with Ying Hu, Robert Elliott, Lukas Szpruch) Centre Henri Lebesgue,
More informationLecture 22 Girsanov s Theorem
Lecture 22: Girsanov s Theorem of 8 Course: Theory of Probability II Term: Spring 25 Instructor: Gordan Zitkovic Lecture 22 Girsanov s Theorem An example Consider a finite Gaussian random walk X n = n
More informationStability of Stochastic Differential Equations
Lyapunov stability theory for ODEs s Stability of Stochastic Differential Equations Part 1: Introduction Department of Mathematics and Statistics University of Strathclyde Glasgow, G1 1XH December 2010
More informationEE514A Information Theory I Fall 2013
EE514A Information Theory I Fall 2013 K. Mohan, Prof. J. Bilmes University of Washington, Seattle Department of Electrical Engineering Fall Quarter, 2013 http://j.ee.washington.edu/~bilmes/classes/ee514a_fall_2013/
More informationExercises and Answers to Chapter 1
Exercises and Answers to Chapter The continuous type of random variable X has the following density function: a x, if < x < a, f (x), otherwise. Answer the following questions. () Find a. () Obtain mean
More information