FE 5204 Stochastic Differential Equations


Instructor: Jim Zhu. E-mail: zhu@wmich.edu. Web: http://homepages.wmich.edu/~zhu/. January 20, 2009.

Preliminaries for dealing with continuous random processes; Brownian motion. Our main reference for this lecture is Chapter 2 of the textbook [T].

Finite probability space
In the coin-tossing example with finitely many tosses, Ω represents a finite sample space. The algebra $F_i$ represents the information one has after the $i$th game, and the algebra $F = \bigcup_{i=1}^{n} F_i$ represents the information after all $n$ games. Each subset $A \in F$ represents a particular event whose probability is $P(A)$. The triple $(\Omega, F, P)$ is called a probability space. In our example $F = 2^\Omega$.

σ-algebra
If we toss infinitely many times, Ω becomes an infinite set. We then need $F = \bigcup_i F_i$ to be closed under unions of infinite sequences of sets in $F$. Such an $F$ is called a σ-algebra. Note that by De Morgan's law, $F$ is then also closed under intersections of infinite sequences.

Definition (σ-algebra)
Let Ω be a sample space and let F be a collection of subsets of Ω. We say F is a σ-algebra provided that (i) for any $A \in F$, $A^c = \Omega \setminus A \in F$, and (ii) for any sequence $A_i \in F$, $\bigcup_{i=1}^{\infty} A_i \in F$. This is a mathematical model for information.

Probability measure
Let Ω be a sample space and let F be a σ-algebra of subsets (events) of Ω. We call $(\Omega, F)$ a measurable space. A function $P : F \to [0,1]$ is called a probability measure if $P(\emptyset) = 0$, $P(\Omega) = 1$ and, for any sequence of disjoint sets $A_i \in F$, $$P\Big(\bigcup_{i=1}^{\infty} A_i\Big) = \sum_{i=1}^{\infty} P(A_i).$$ The triple $(\Omega, F, P)$ is called a probability space.

Almost surely
If an event $A \in F$ satisfies $P(A) = 1$, we say that A happens almost surely, or a.s. for short.

σ-algebra generated by a class of sets
Let Ω be a sample space and let U be a collection of subsets of Ω. We use σ(U) to denote the σ-algebra generated by U, i.e., the smallest σ-algebra containing U. This σ-algebra can be constructed as $$\sigma(U) = \bigcap \{A : U \subset A,\ A \text{ is a } \sigma\text{-algebra}\}.$$ (Exercise 2.3)

Examples
Let Ω = {1,2,3}. Then $F = \{\emptyset, \Omega, \{1\}, \{2,3\}\}$ is a σ-algebra, and $G = \{\emptyset, \Omega, \{2\}\}$ is not. In fact, $\sigma(G) = \{\emptyset, \Omega, \{2\}, \{1,3\}\}$. What is $\sigma(\{\{2,3\}\})$?
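
On a small finite Ω these notions can be checked by brute force. The following Python sketch is my own illustration (not from the lecture; function names such as is_sigma_algebra are made up): it verifies the two axioms and computes σ(U) exactly as the intersection formula of Exercise 2.3.

from itertools import combinations

Omega = frozenset({1, 2, 3})

def powerset(S):
    items = sorted(S)
    return [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

def is_sigma_algebra(F):
    # on a finite Omega, closure under complement and pairwise union suffices
    return (Omega in F
            and all(Omega - A in F for A in F)
            and all(A | B in F for A in F for B in F))

def generate(U):
    # sigma(U): intersect all sigma-algebras on Omega that contain U
    subsets = powerset(Omega)
    algebras = []
    for r in range(len(subsets) + 1):
        for fam in combinations(subsets, r):
            F = set(fam)
            if U <= F and is_sigma_algebra(F):
                algebras.append(F)
    return set.intersection(*algebras)

F = {frozenset(), Omega, frozenset({1}), frozenset({2, 3})}
G = {frozenset(), Omega, frozenset({2})}
print(is_sigma_algebra(F))         # True
print(is_sigma_algebra(G))         # False: {1,3}, the complement of {2}, is missing
print(generate({frozenset({2})}))  # {frozenset(), {1,2,3}, {2}, {1,3}}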

Borel σ-algebra
Let U be the collection of all open sets in $R^n$. Then $B := \sigma(U)$ is called the Borel σ-algebra.

Random variable
Let $(\Omega, F, P)$ be a probability space. A mapping $X : \Omega \to R^n$ that is measurable with respect to F, i.e., such that $X^{-1}(A) \in F$ for any open set A in $R^n$, is called a random variable. For a random variable that takes only countably many values $a_i$, $i = 1,2,\ldots$, one needs only to check $X^{-1}(a_i) \in F$ (Exercise 2.1(a)).

Examples
Let Ω = {1,2,3} and $F = \{\emptyset, \Omega, \{1\}, \{2,3\}\}$. Define Y(1) = 1, Y(2) = 0 and Y(3) = 1; then Y is not a random variable (for instance, $Y^{-1}(\{0\}) = \{2\} \notin F$). Define Z(1) = 1 and Z(2) = Z(3) = 0; then Z is a random variable.

σ-algebra generated by a random variable
Let $X : \Omega \to R^n$ be a random variable and let B be the Borel σ-algebra. Then $$\sigma(X) = \{X^{-1}(F) : F \in B\}$$ is a σ-algebra, called the σ-algebra generated by X.

Doob-Dynkin Lemma
The σ-algebra generated by X describes the information contained in X. The Doob-Dynkin Lemma concerns the information relationship between two random variables. Doob-Dynkin Lemma: Let $X, Y : \Omega \to R^n$ be two functions. Then Y is σ(X)-measurable if and only if there exists a Borel measurable function $g : R^n \to R^n$ (i.e., $g^{-1}(A) \in B$ for any $A \in B$) such that $Y = g(X)$.

Simple random variables
Let $(\Omega, F, P)$ be a probability space. A random variable X on Ω is simple if it has only a finite number of outcomes. Equivalently, $$X(\omega) = \sum_{i=1}^{k} x_i \chi_{F_i}(\omega),$$ where the $F_i$ are disjoint sets in F.

Probability integration of simple random variables
Let X be a simple random variable on Ω defined by $X(\omega) = \sum_{i=1}^{k} x_i \chi_{F_i}(\omega)$, where the $F_i$ are disjoint sets in F. Then $$\int_\Omega X(\omega)\,dP(\omega) := \sum_{i=1}^{k} x_i P(F_i).$$
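
For concreteness, here is a one-function numerical version of this definition (an illustrative sketch; the values and probabilities below are made up):

def integrate_simple(values, probs):
    # E[X] = sum_i x_i * P(F_i) for X = sum_i x_i * chi_{F_i}
    assert sum(probs) <= 1 + 1e-12, "the F_i are disjoint, so the P(F_i) sum to at most 1"
    return sum(x * p for x, p in zip(values, probs))

# X takes value 1 on F_1 with P(F_1) = 0.25 and value -2 on F_2 with P(F_2) = 0.75
print(integrate_simple([1.0, -2.0], [0.25, 0.75]))   # -1.25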

Probability integration of general random variables
Let X be a random variable on Ω. Suppose that there exists a sequence of simple random variables $X_j$ such that $X_j \to X$ with probability 1 on Ω and $\int_\Omega X_j(\omega)\,dP(\omega)$ converges. Then we define $$\int_\Omega X(\omega)\,dP(\omega) = \lim_{j\to\infty} \int_\Omega X_j(\omega)\,dP(\omega).$$ (This gives a way of calculating the integral. Use it for Exercise 2.1(b),(c),(d).)
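
A quick sketch of this approximation scheme (my own illustration, under the assumption Ω = [0,1] with P the Lebesgue measure and X(ω) = ω): the dyadic simple random variables $X_j(\omega) = \lfloor 2^j \omega \rfloor / 2^j$ converge to X, and their integrals increase to E[X] = 1/2.

def integral_of_Xj(j):
    # X_j takes the value k/2^j on the dyadic interval [k/2^j, (k+1)/2^j),
    # which has probability 2^{-j}, so the integral is a finite sum
    n = 2 ** j
    return sum((k / n) * (1 / n) for k in range(n))

for j in (1, 2, 4, 8):
    print(j, integral_of_Xj(j))   # 0.25, 0.375, 0.46875, ... -> E[X] = 1/2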

Technicalities
There are usually many sequences of simple random variables $X_j$ such that $X_j \to X$ with probability 1. In order to justify the above definition, one has to show that if $\int_\Omega X_j(\omega)\,dP(\omega)$ converges for one such sequence, then all other sequences lead to integrals converging to the same limit.

Expectation
When $\int_\Omega |X(\omega)|\,dP(\omega) < \infty$, the number $$E[X] = \int_\Omega X(\omega)\,dP(\omega)$$ is called the expectation of X.

Expectation
The expectation can also be calculated by $$E[X] = \int_{R^N} x\,d\mu_X(x),$$ where $\mu_X(B) = P(X^{-1}(B))$ for any Borel set B. This is a generalization of the concept of expectation when Ω is finite. We can also use this formula to deal with Exercise 2.1(b),(c),(d).

Variance
To generalize variance we need the space $$L^2(P) := \Big\{X : \Omega \to R^N : \int_\Omega |X(\omega)|^2\,dP(\omega) < \infty\Big\},$$ with the inner product $\langle X, Y \rangle = E[XY]$, $X, Y \in L^2(P)$. For $X, Y \in L^2(P)$, their covariance is $\mathrm{cov}(X,Y) = E[(X - E[X])(Y - E[Y])]$. For $X \in L^2(P)$, its variance is $\mathrm{Var}(X) = \mathrm{cov}(X,X)$. Again, note that here we need an additional technical assumption to restrict the range in which the concept is applicable.
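
A short Monte Carlo sketch of these definitions (the particular X and Y below are my own illustration), assuming numpy is available:

import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal(10**6)
X = Z              # X ~ N(0,1)
Y = 2 * Z + 1      # Y = 2X + 1, so cov(X,Y) = 2 and Var(Y) = 4

inner = np.mean(X * Y)                   # <X,Y> = E[XY]
cov = inner - np.mean(X) * np.mean(Y)    # cov(X,Y) = E[XY] - E[X]E[Y]
var_Y = np.mean((Y - np.mean(Y)) ** 2)   # Var(Y) = cov(Y,Y)
print(cov, var_Y)                        # ~2.0, ~4.0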

Independence
(Two events) Events $A, B \in F$ are independent if $P(A \cap B) = P(A)P(B)$. (Multiple events) Events $A_i \in F$, $i = 1,2,\ldots$ are independent if, for every finite subfamily, $$P(A_{i_1} \cap A_{i_2} \cap \ldots \cap A_{i_k}) = P(A_{i_1})P(A_{i_2})\cdots P(A_{i_k}).$$ (Multiple collections of events) Collections of events $\mathcal{A}_i \subset F$, $i = 1,2,\ldots$ are independent if the same identity holds for all choices $A_{i_j} \in \mathcal{A}_{i_j}$.

Independence of random variables
Two random variables X and Y are independent if σ(X) and σ(Y) are independent. For $X, Y \in L^2(P)$, this implies that $$E[XY] = E[X]E[Y],$$ or equivalently $\mathrm{cov}(X,Y) = 0$. The condition can be weakened, but we will not pursue the technical details. Random variables $X_i$, $i = 1,2,\ldots$ are independent if any finite subfamily of $\sigma(X_i)$, $i = 1,2,\ldots$ is independent.
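
A cautionary numerical sketch (a standard textbook example, not from the slides): independence gives zero covariance, but zero covariance does not imply independence, as Y = X² with X ~ N(0,1) shows.

import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal(10**6)
Y = X**2                                          # Y is a deterministic function of X
print(np.mean(X * Y) - np.mean(X) * np.mean(Y))   # ~0, since E[X^3] = 0, yet Y depends on X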

A continuous stochastic process
Stochastic process: Let $(\Omega, F, P)$ be a probability space and let [0,T] be an interval. $(X_t)$, $t \in [0,T]$ is a stochastic process if for every t, $X_t$ is a random variable on $(\Omega, F, P)$.

Path
For each fixed $\omega \in \Omega$, the map $t \mapsto X_t(\omega)$ is called a path of the stochastic process $(X_t)$.

As a two-variable function
A stochastic process $(X_t)$ can also be viewed as a function of two variables, $(t, \omega) \mapsto X_t(\omega) = X(t, \omega)$. Usually we will need certain properties of this function. We often assume that every path is continuous, which ensures the joint measurability of $X(t, \omega)$.

General filtration
Definition of filtration: Let $(\Omega, F, P)$ be a probability space and let [0,T] be an interval. $(F_t)$, $t \in [0,T]$ is a filtration if for every t, $F_t \subset F$ is a σ-algebra and, for any $s < t$, $F_s \subset F_t$. Again, $F_t$ represents the information available up to time t.

Adapted stochastic process
Let $(F_t)$, $t \in [0,T]$ be a filtration on a probability space $(\Omega, F, P)$. We say a stochastic process $(X_t)$ is $(F_t)$-adapted provided that, for every t, $X_t$ is $F_t$-measurable. (Note: there is no corresponding concept for "predictable" here.)

Filtration generated by a stochastic process
Let $(X_t)$, $t \in [0,T]$ be a stochastic process on a probability space $(\Omega, F, P)$. Define $$F_t = \sigma(\{X_s^{-1}(F) : s \in [0,t],\ F \in B\}),$$ where B is the collection of all Borel sets in $R^n$. Then $(F_t)$ is a filtration and $(X_t)$ is $F_t$-adapted.

History
1. Named after the Scottish botanist Robert Brown, who in 1828 observed this motion in pollen suspended in liquid.
2. Louis Bachelier used it to model the stock market in 1900, wrongly allowing negative stock prices.
3. Paul Samuelson gave the widely used geometric Brownian motion model for stock price movements in 1965, which does not allow any price jumps.
4. All models are wrong. Some are wronger than others.

Definition of 1-dimensional Brownian motion (Wiener)
A stochastic process $\{B_t : t \in [0,T]\}$ is called a Brownian motion starting from x if
1. $B_0 = x$,
2. for $0 \le t_1 < t_2 < \ldots < t_k \le T$, the random variables $B_{t_2} - B_{t_1}, B_{t_3} - B_{t_2}, \ldots, B_{t_k} - B_{t_{k-1}}$ are independent,
3. for $0 \le s \le t \le T$, $B_t - B_s$ has a Gaussian distribution with mean 0 and variance $t - s$,
4. for ω in a set of probability one, the path $t \mapsto B_t(\omega)$ is continuous.
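
A standard way to visualize this definition is to sample a path on a finite grid from independent Gaussian increments, matching properties 1-4 at the grid points. This Python sketch is an illustration only, not the existence construction discussed below:

import numpy as np

def brownian_path(x0=0.0, T=1.0, n=1000, seed=0):
    # increments B_{t+dt} - B_t are independent N(0, dt) (properties 2 and 3)
    rng = np.random.default_rng(seed)
    dt = T / n
    increments = rng.normal(0.0, np.sqrt(dt), size=n)
    return np.concatenate(([x0], x0 + np.cumsum(increments)))

B = brownian_path()
print(B.shape, B[0], B[-1])   # 1001 grid values, starting at x0 (property 1)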

Definition of multi-dimensional Brownian motion
A vector stochastic process $\{B_t : t \in [0,T]\}$ in $R^n$ is called a Brownian motion starting from $x = (x_1, x_2, \ldots, x_n)$ if $B_t = (B_t^1, B_t^2, \ldots, B_t^n)$, where $B_t^i$, $i = 1,2,\ldots,n$ are independent 1-dimensional Brownian motions starting from $x_i$. If $x = 0$, we say $(B_t)$ is a standard Brownian motion. A Brownian motion starting from x can be viewed as a standard Brownian motion shifted by x.

Existence
Existence of a Brownian motion needs to be justified. By and large, there are two ways to do it:
1. by construction (pioneered by Wiener; see e.g. Steele's book), or
2. by Kolmogorov's extension theorem (if time permits; otherwise see e.g. [T]).

Uniqueness
Brownian motions are not uniquely defined, but their effects are equivalent. We usually pick a convenient version.

Gaussian distribution
Let $v = [v_1, \ldots, v_d]$ be a d-dimensional random vector. We define the mean vector and the covariance matrix by $$\mu = E[v] = [\mu_1, \ldots, \mu_d], \qquad \Sigma = \begin{pmatrix} \sigma_{11} & \sigma_{12} & \ldots & \sigma_{1d} \\ \sigma_{21} & \sigma_{22} & \ldots & \sigma_{2d} \\ \vdots & \vdots & & \vdots \\ \sigma_{d1} & \sigma_{d2} & \ldots & \sigma_{dd} \end{pmatrix},$$ where $\sigma_{ij} = \mathrm{cov}(v_i, v_j)$.

Gaussian distribution
A d-dimensional random vector v is Gaussian with mean µ and covariance matrix Σ if the density of v is given by $$f(x) = (2\pi)^{-d/2} (\det \Sigma)^{-1/2} \exp\Big(-\frac{1}{2}(x - \mu)^\top \Sigma^{-1} (x - \mu)\Big).$$
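
To make the formula concrete, here is a direct numpy evaluation of the density, cross-checked against scipy's implementation (a sanity check of my own, not part of the lecture; the parameter values are made up):

import numpy as np
from scipy.stats import multivariate_normal

def gaussian_pdf(x, mu, Sigma):
    # f(x) = (2 pi)^{-d/2} (det Sigma)^{-1/2} exp(-(x-mu)^T Sigma^{-1} (x-mu) / 2)
    d = len(mu)
    diff = x - mu
    quad = diff @ np.linalg.solve(Sigma, diff)
    return (2 * np.pi) ** (-d / 2) * np.linalg.det(Sigma) ** (-0.5) * np.exp(-quad / 2)

mu = np.array([0.0, 1.0])
Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
x = np.array([0.3, 0.7])
print(gaussian_pdf(x, mu, Sigma), multivariate_normal(mu, Sigma).pdf(x))  # equal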

Two simple and important facts about the Gaussian distribution
A Gaussian random vector is completely determined by its mean and covariance. The components of a Gaussian random vector are independent if and only if Σ is diagonal.

Characteristic functions
Let v be a d-dimensional random vector. Then $$\varphi(\theta) = E[\exp(i\theta^\top v)]$$ is the characteristic function of v, which uniquely determines the distribution of the random vector. By direct computation we have the characteristic function of a Gaussian vector: let v be a d-dimensional Gaussian random vector with mean µ and covariance matrix Σ. Then $$\varphi(\theta) = E[\exp(i\theta^\top v)] = \exp\Big(i\theta^\top \mu - \frac{1}{2}\theta^\top \Sigma \theta\Big).$$
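
A Monte Carlo sketch checking the closed form (the parameter values are my own illustration):

import numpy as np

rng = np.random.default_rng(2)
mu = np.array([1.0, -0.5])
Sigma = np.array([[1.0, 0.3], [0.3, 0.5]])
theta = np.array([0.7, -1.2])

v = rng.multivariate_normal(mu, Sigma, size=10**6)
mc = np.mean(np.exp(1j * v @ theta))                           # E[exp(i theta.v)] by sampling
exact = np.exp(1j * theta @ mu - 0.5 * theta @ Sigma @ theta)  # closed form
print(mc, exact)   # agree to Monte Carlo accuracy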

Gaussian characterization
Let v be a d-dimensional random vector. Then v is Gaussian if and only if for any $\theta \in R^d$, $\theta^\top v$ is a univariate Gaussian random variable.

Gaussian process
Let $(X_t)$ be a stochastic process. If, for any $0 \le t_1 < t_2 < \ldots < t_k$, $(X_{t_1}, X_{t_2}, \ldots, X_{t_k})$ is a k-dimensional Gaussian random vector, then we say $(X_t)$ is a Gaussian process. A Gaussian process is completely determined by its mean and covariance functions $\mu(t) = E[X_t]$ and $f(s,t) = \mathrm{cov}(X_s, X_t)$.

Covariance of a Brownian motion
A standard 1-dimensional Brownian motion $(B_t)$ is a well-behaved Gaussian process. For $s < t$, its covariance is $$\mathrm{cov}(B_s, B_t) = E[(B_t - B_s + B_s)B_s] = E[B_s^2] = s.$$ In general, $$\mathrm{cov}(B_s, B_t) = s \wedge t.$$
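
A quick numerical check of this identity (illustrative, with s = 0.4 and t = 0.9 chosen arbitrarily):

import numpy as np

rng = np.random.default_rng(3)
s, t, n = 0.4, 0.9, 10**6
Bs = rng.normal(0.0, np.sqrt(s), n)            # B_s ~ N(0, s)
Bt = Bs + rng.normal(0.0, np.sqrt(t - s), n)   # add the independent increment B_t - B_s
print(np.mean(Bs * Bt))   # ~0.4 = min(s, t); since E[Bs] = E[Bt] = 0, this is the covariance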

Covariance of a Brownian motion
The converse is also true: let $(X_t)$ be a Gaussian process with continuous paths and $E[X_t] = 0$ for all $t \in [0,T]$. Then $(X_t)$ is a standard Brownian motion if and only if $\mathrm{cov}(X_s, X_t) = s \wedge t$. Proof. We need to show that $X_{t_2} - X_{t_1}, X_{t_3} - X_{t_2}, \ldots, X_{t_k} - X_{t_{k-1}}$ are independent. This can be done by expanding, for $i < j$, $E[(X_{t_i} - X_{t_{i-1}})(X_{t_j} - X_{t_{j-1}})]$ to find $$E[X_{t_i} X_{t_j}] - E[X_{t_i} X_{t_{j-1}}] - E[X_{t_{i-1}} X_{t_j}] + E[X_{t_{i-1}} X_{t_{j-1}}] = t_i - t_i - t_{i-1} + t_{i-1} = 0.$$

Higher moments
For a standard 1-dimensional Brownian motion $(B_t)$, we can also use its density function to directly calculate higher moments: $$E[(B_t - B_s)^3] = 0 \quad \text{and} \quad E[(B_t - B_s)^4] = 3(t - s)^2.$$ Alternatively, one can also use the method in Exercise 2.8.
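
These moment identities are easy to check by sampling the increment, which is N(0, t - s) (an illustrative sketch; t - s = 0.7 is arbitrary):

import numpy as np

rng = np.random.default_rng(4)
dt = 0.7                                    # dt stands for t - s
inc = rng.normal(0.0, np.sqrt(dt), 10**6)   # B_t - B_s ~ N(0, t - s)
print(np.mean(inc**3))                      # ~0
print(np.mean(inc**4), 3 * dt**2)           # both ~1.47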

Multi-dimensional generalizations
Let $(B_t)$ be an n-dimensional standard Brownian motion. Then we have the covariance $$\mathrm{cov}(B_s, B_t) = n(s \wedge t).$$ In particular, $E[|B_t - B_s|^2] = n(t - s)$, and for the higher moments, $$E[|B_t - B_s|^4] = n(n + 2)(t - s)^2.$$

Finite dimensional distribution
For a stochastic process $(X_t)$ and $t_i$, $i = 1,2,\ldots,k$, $$\mu_{t_1,t_2,\ldots,t_k}(F_1 \times F_2 \times \ldots \times F_k) := P(X_{t_1} \in F_1, \ldots, X_{t_k} \in F_k)$$ defines a measure on $R^{nk}$ called a finite dimensional distribution of $(X_t)$. Here $F_1, \ldots, F_k$ are Borel sets in $R^n$.

Kolmogorov's extension theorem: conditions
For all $t_i$, $i = 1,2,\ldots,k$, let $\nu_{t_1,t_2,\ldots,t_k}$ be probability measures on $R^{nk}$ such that $$\nu_{t_{\sigma(1)},t_{\sigma(2)},\ldots,t_{\sigma(k)}}(F_1 \times F_2 \times \ldots \times F_k) = \nu_{t_1,t_2,\ldots,t_k}(F_{\sigma^{-1}(1)} \times F_{\sigma^{-1}(2)} \times \ldots \times F_{\sigma^{-1}(k)}) \quad (1)$$ for all permutations σ on $\{1,2,\ldots,k\}$, and $$\nu_{t_1,t_2,\ldots,t_k}(F_1 \times F_2 \times \ldots \times F_k) = \nu_{t_1,t_2,\ldots,t_k,t_{k+1},\ldots,t_{k+m}}(F_1 \times F_2 \times \ldots \times F_k \times R^n \times \ldots \times R^n). \quad (2)$$

Kolmogorov's extension theorem: conclusions
Then there exists a probability space $(\Omega, F, P)$ and an $R^n$-valued stochastic process $(X_t)$ on $(\Omega, F, P)$ such that $$\nu_{t_1,t_2,\ldots,t_k}(F_1 \times F_2 \times \ldots \times F_k) = P(X_{t_1} \in F_1, \ldots, X_{t_k} \in F_k)$$ for all $t_i$ and all Borel sets $F_i$.

Finite dimensional distribution for a Brownian motion
For $x, y \in R^n$ and $t > 0$, define $$p(t,x,y) = (2\pi t)^{-n/2} \exp\Big(-\frac{|x - y|^2}{2t}\Big),$$ and $p(0,x,y)\,dy = \delta_x(dy)$. For increasing $t_i$, $i = 1,2,\ldots,k$, define $$\nu_{t_1,t_2,\ldots,t_k}(F_1 \times F_2 \times \ldots \times F_k) = \int_{F_1 \times F_2 \times \ldots \times F_k} p(t_1,x,x_1)\,p(t_2 - t_1,x_1,x_2) \cdots p(t_k - t_{k-1},x_{k-1},x_k)\,dx_1 \ldots dx_k.$$ Extend this definition to arbitrary $t_i$, $i = 1,2,\ldots,k$ using the permutation rule in Kolmogorov's theorem. Rule (2) then holds automatically because $\int_{R^n} p(t,x,y)\,dy = 1$.
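
Operationally, this product formula says the marginals at $t_1 < \ldots < t_k$ can be sampled by composing Gaussian transition steps: each $x_i$ is drawn from $p(t_i - t_{i-1}, x_{i-1}, \cdot)$, i.e., $N(x_{i-1}, (t_i - t_{i-1})I)$. A minimal Python sketch (the function name sample_fdd is my own):

import numpy as np

def sample_fdd(x, times, n_samples=1, seed=5):
    # step i draws x_i ~ N(x_{i-1}, (t_i - t_{i-1}) I), reproducing nu_{t_1,...,t_k}
    rng = np.random.default_rng(seed)
    prev_t = 0.0
    prev_x = np.tile(np.asarray(x, dtype=float), (n_samples, 1))
    out = []
    for t in times:
        prev_x = prev_x + rng.normal(0.0, np.sqrt(t - prev_t), prev_x.shape)
        out.append(prev_x.copy())
        prev_t = t
    return out   # one (n_samples, n) array per time t_i

samples = sample_fdd([0.0, 0.0], times=[0.5, 1.0, 2.0], n_samples=4)
print([a.shape for a in samples])   # [(4, 2), (4, 2), (4, 2)]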

Existence of a Brownian motion
Let $x \in R^n$. An n-dimensional Brownian motion starting from x exists. Proof. Apply Kolmogorov's extension theorem to the finite dimensional distributions for a Brownian motion defined above.

Homework is an important part of learning SDE. Homework problems are given as exercises following each lecture, and those marked with * are optional. The homework for this lecture is due on Jan 27 at the beginning of the lecture. Discussions with me or with classmates are encouraged, but the final work should be completed independently. I expect you to submit clear and neatly written work with careful justifications for your conclusions. Problems: [T] 2.1, 2.3, 2.8(a)(b), 2.12, 2.16, 2.18, 2.20.