Lecture 23. Random walks


18.175: Lecture 23. Random walks. Scott Sheffield, MIT.

Outline: Random walks. Stopping times. Arcsin law, other SRW stories.

Exchangeable events. Start with a measure space (S, S, µ). Let Ω = {(ω_1, ω_2, ...) : ω_i ∈ S}, let F be the product σ-algebra and P the product probability measure. A finite permutation of N is a one-to-one map from N to itself that fixes all but finitely many points. An event A ∈ F is permutable if it is invariant under every finite permutation of the ω_i. Let E be the σ-field of permutable events. This is related to the tail σ-algebra we introduced earlier in the course. Is E bigger or smaller? (Every tail event is permutable, so the tail σ-algebra is contained in E.)

Hewitt-Savage 0-1 law. If X_1, X_2, ... are i.i.d. and A ∈ E then P(A) ∈ {0, 1}. Idea of proof: try to show A is independent of itself, i.e., that P(A) = P(A ∩ A) = P(A)P(A). Start with the measure-theoretic fact that we can approximate A by a set A_n in the σ-algebra generated by X_1, ..., X_n, so that the symmetric difference of A and A_n has very small probability. Let A_n' be the event that A_n holds after X_1, ..., X_n and X_{n+1}, ..., X_{2n} are swapped. Then A_n is independent of A_n', and by permutability the symmetric difference between A and A_n' is also small, so A is independent of itself up to this small error. Then make the error arbitrarily small.

Application of Hewitt-Savage. If the X_i are i.i.d. in R^n then S_n = Σ_{i=1}^n X_i is a random walk on R^n. Theorem: if S_n is a random walk on R then exactly one of the following occurs with probability one: (i) S_n = 0 for all n; (ii) S_n → ∞; (iii) S_n → −∞; (iv) −∞ = lim inf S_n < lim sup S_n = ∞. Idea of proof: Hewitt-Savage implies lim sup S_n and lim inf S_n are almost-sure constants in [−∞, ∞]. If lim sup S_n were a finite constant c, then the walk X_2 + X_3 + ... would have the same lim sup, forcing c = X_1 + c; so unless X_1 = 0 a.s., the lim sup and lim inf must each be ±∞.

Outline: Random walks. Stopping times. Arcsin law, other SRW stories.

Stopping time definition. Say that T is a stopping time if the event {T = n} is in F_n for each n. In finance applications, T might be the time one sells a stock. Then this says that the decision to sell at time n depends only on prices up to time n, not on (as yet unknown) future prices.

Stopping time examples. Let A_1, A_2, ... be i.i.d. random variables equal to 1 with probability .5 and −1 with probability .5, and let X_0 = 0 and X_n = Σ_{i=1}^n A_i for n ≥ 1. Which of the following are stopping times?
1. The smallest T for which X_T = 50.
2. The smallest T for which X_T ∈ {−10, 100}.
3. The smallest T for which X_T = 0.
4. The T at which the X_n sequence achieves the value 17 for the 9th time.
5. The value of T ∈ {0, 1, 2, ..., 100} for which X_T is largest.
6. The largest T ∈ {0, 1, 2, ..., 100} for which X_T = 0.
Answer: the first four, but not the last two (those depend on the future).

Stopping time theorems. Theorem: let X_1, X_2, ... be i.i.d. and N a stopping time with N < ∞ a.s. Conditioned on N < ∞, the law of {X_{N+n}, n ≥ 1} is independent of F_N and is the same as the law of the original sequence. Wald's equation: let the X_i be i.i.d. with E|X_i| < ∞. If N is a stopping time with EN < ∞ then ES_N = EX_1 · EN. Wald's second equation: let the X_i be i.i.d. with EX_i = 0 and EX_i² = σ² < ∞. If N is a stopping time with EN < ∞ then ES_N² = σ² EN.
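Wald's first equation is easy to sanity-check numerically. The sketch below (the helper name `wald_demo` is hypothetical) uses i.i.d. steps equal to 2 or −1 with probability 1/2 each, stopped at the first +2 step, so N is geometric with EN = 2 and Wald predicts ES_N = EX_1 · EN = (1/2) · 2 = 1.

```python
import random

def wald_demo(trials=200_000, seed=0):
    """Monte Carlo sanity check of Wald's equation ES_N = EX_1 * EN.

    Steps X_i are i.i.d., equal to 2 or -1 with probability 1/2 each,
    so EX_1 = 1/2.  We stop at the first +2 step, so N is geometric
    with EN = 2, and Wald predicts ES_N = (1/2) * 2 = 1.
    """
    rng = random.Random(seed)
    total_s = total_n = 0
    for _ in range(trials):
        s = n = 0
        while True:
            x = 2 if rng.random() < 0.5 else -1
            s += x
            n += 1
            if x == 2:  # stopping rule: first time a +2 step occurs
                break
        total_s += s
        total_n += n
    return total_s / trials, total_n / trials
```

Note that S_N = 2 + (N − 1)(−1) = 3 − N here, so the check ES_N = 3 − EN = 1 can also be seen directly.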

Wald applications to SRW. Let S_0 = a ∈ Z, and at each time step let S_j change by ±1, independently, according to a fair coin toss. Fix A ∈ Z with 0 ≤ a ≤ A and let N = inf{k : S_k ∈ {0, A}}. What is ES_N? What is EN? (By Wald's equation applied to the mean-zero increments, ES_N = a, which gives P(S_N = A) = a/A; by Wald's second equation applied to S_N − a, EN = a(A − a).)
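A short simulation makes the Wald predictions concrete (the helper name `srw_hitting` is hypothetical): starting from a, the predictions are ES_N = a and EN = a(A − a).

```python
import random

def srw_hitting(a, A, trials=100_000, seed=1):
    """Simple random walk started at a, stopped on hitting 0 or A.

    Wald's first equation predicts ES_N = a (the increments have mean
    zero), hence P(S_N = A) = a/A; Wald's second equation, applied to
    S_N - a, predicts EN = a * (A - a).
    """
    rng = random.Random(seed)
    total_sn = total_n = 0
    for _ in range(trials):
        s, n = a, 0
        while s != 0 and s != A:
            s += 1 if rng.random() < 0.5 else -1
            n += 1
        total_sn += s
        total_n += n
    return total_sn / trials, total_n / trials
```

For a = 3, A = 10 the predicted values are ES_N = 3 and EN = 3 · 7 = 21.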

Outline: Random walks. Stopping times. Arcsin law, other SRW stories.

Reflection principle. How many walks from (0, x) to (n, y), with x, y > 0, never touch the horizontal axis? Count the walks that do touch it by giving a bijection with walks from (0, −x) to (n, y): reflect the walk across the axis up to its first visit to zero.
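For small n the reflection-principle count can be checked by brute force. Reading "don't cross the axis" as "never touch 0", the bijection gives C(n, (n+y−x)/2) − C(n, (n+y+x)/2), where the subtracted term counts walks from (0, −x) to (n, y). A sketch (the helper name `count_positive_walks` is hypothetical):

```python
from itertools import product
from math import comb

def count_positive_walks(x, y, n):
    """Compare brute force with the reflection-principle formula.

    Counts +-1 walks of length n from height x > 0 to height y > 0
    that never touch 0.  Reflecting a walk across the axis up to its
    first visit to 0 gives a bijection between walks from x that touch
    0 and all walks from -x to y, so the count is
    C(n, (n+y-x)/2) - C(n, (n+y+x)/2)   (n + y - x must be even).
    """
    brute = 0
    for steps in product((1, -1), repeat=n):
        h, ok = x, True
        for step in steps:
            h += step
            if h == 0:  # walk touched the axis: excluded
                ok = False
                break
        if ok and h == y:
            brute += 1
    formula = comb(n, (n + y - x) // 2) - comb(n, (n + y + x) // 2)
    return brute, formula
```

For example, from height 2 back to height 2 in 4 steps there are 6 walks in total, of which 5 avoid the axis, matching C(4, 2) − C(4, 4) = 6 − 1.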

Ballot theorem. Suppose that in an election candidate A gets α votes and candidate B gets β < α votes. What is the probability that A is strictly ahead throughout the counting? Answer: (α − β)/(α + β). This can be proved using the reflection principle.
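For small vote counts the ballot theorem can be verified by exhaustive enumeration of the counting orders. A sketch (the helper name `ballot_probability` is hypothetical):

```python
from itertools import permutations

def ballot_probability(alpha, beta):
    """Exhaustive check of the ballot theorem for small vote counts.

    Counts the distinct orderings of alpha votes for A (+1) and beta
    votes for B (-1) in which A stays strictly ahead throughout; the
    theorem says the fraction is (alpha - beta) / (alpha + beta).
    """
    votes = [1] * alpha + [-1] * beta
    favorable = total = 0
    for order in set(permutations(votes)):  # distinct orderings only
        total += 1
        lead = 0
        for v in order:
            lead += v
            if lead <= 0:  # A not strictly ahead at this point
                break
        else:
            favorable += 1
    return favorable, total
```

With α = 3 and β = 2 there are 10 distinct orderings, of which 2 keep A strictly ahead, giving 2/10 = (3 − 2)/(3 + 2).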

Arcsin theorem. A theorem for the last hitting time of zero, and a theorem for the amount of time the walk is positive. (In both cases the suitably rescaled quantity converges to the arcsin distribution.)

MIT OpenCourseWare, http://ocw.mit.edu
18.175 Theory of Probability, Spring 2014
For information about citing these materials or our Terms of Use, visit http://ocw.mit.edu/terms.