Verona Course, April 2015. Lecture 1. Review of probability
Viorel Barbu, Al. I. Cuza University of Iaşi and the Romanian Academy

A probability space is a triple (Ω, F, P), where Ω is an abstract set, F is a σ-algebra of subsets of Ω, and P is a probability measure on Ω (P(Ω) = 1). A σ-algebra is a collection F of subsets of Ω with the following properties:
(i) ∅, Ω ∈ F.
(ii) If A ∈ F, then A^c ∈ F.
(iii) If A_i ∈ F, i = 1, 2, ..., then ∪_{i=1}^∞ A_i ∈ F and ∩_{i=1}^∞ A_i ∈ F.
P is said to be a probability measure if P(∅) = 0, P(Ω) = 1, and
  P(∪_{i=1}^∞ A_i) = Σ_{i=1}^∞ P(A_i)  whenever A_i ∩ A_j = ∅ for i ≠ j.
A set A ∈ F is called an event, points ω ∈ Ω are sample points, and P(A) is the probability of the event A. A property which is true except on an event of probability zero is said to hold almost surely (abbreviated a.s.).

Example (Buffon's needle problem). The plane is ruled by parallel lines 2 inches apart, and a 1-inch-long needle is dropped at random on the plane. What is the probability that it hits one of the parallel lines? The first issue is to find an appropriate probability space (Ω, F, P). For this, let
  h = distance from the center of the needle to the nearest line,
  θ = angle (≤ π/2) that the needle makes with the horizontal.
These fully determine the position of the needle, up to translations and reflection. Let us next take
  Ω = [0, π/2) × [0, 1]  (values of θ, values of h),
  F = Borel subsets of Ω,
  P(B) = 2 (area of B)/π  for each B ∈ F.

Example (continued). We denote by A the event that the needle hits a line. We can now check that this happens provided h ≤ (1/2) sin θ. Consequently,
  A = {(θ, h) ∈ Ω : h ≤ (1/2) sin θ},
and so
  P(A) = 2 (area of A)/π = (2/π) ∫_0^{π/2} (1/2) sin θ dθ = 1/π.
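This probability can be checked by simulation. The sketch below is an illustration not in the lecture (the function name and sample size are my own choices): it draws (θ, h) uniformly from Ω and counts how often h ≤ (1/2) sin θ; the frequency should approach 1/π ≈ 0.318.

```python
# Monte Carlo check of Buffon's needle: lines 2 inches apart,
# needle of length 1, so P(hit) = 1/pi (illustrative sketch).
import math
import random

def buffon_estimate(n_trials, seed=0):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        h = rng.uniform(0.0, 1.0)              # center-to-nearest-line distance
        theta = rng.uniform(0.0, math.pi / 2)  # angle with the horizontal
        if h <= 0.5 * math.sin(theta):         # needle crosses a line
            hits += 1
    return hits / n_trials

estimate = buffon_estimate(200_000)
print(estimate, 1 / math.pi)
```

With 200 000 trials the standard error is about 0.001, so the estimate should agree with 1/π to two decimal places.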

A mapping X : Ω → R^n is called an n-dimensional random variable if, for each Borel set B ∈ B, we have
  X^{-1}(B) ∈ F,  (1)
or, in other words, X is F-measurable. Here B is the collection of all Borel subsets of R^n, that is, the smallest σ-algebra of subsets of R^n containing all open sets. If X : Ω → R^n is a random variable, then F(X) = {X^{-1}(B); B ∈ B} is the σ-algebra generated by X. A random variable τ : Ω → [0, ∞) is called a stopping time with respect to a filtration (F_t)_{t≥0} provided
  {ω; τ(ω) ≤ t} ∈ F_t for all t ≥ 0,
that is, the set {ω; τ(ω) ≤ t} is F_t-measurable. A stochastic process is a collection {X(t); t ≥ 0} of random variables; in other words, a stochastic process X assigns to each time t a random variable X(t). In fact, X = X(t, ω), t ∈ I ⊂ R, ω ∈ Ω.

X is called continuous if t ↦ X(t, ω) is continuous with probability 1, i.e., P-a.s. It is called mean square continuous on [0, T] if, for each t_0 ∈ [0, T],
  lim_{t→t_0} E|X(t) − X(t_0)|^2 = 0.
A family (F_t)_{t≥0} ⊂ F is called a filtration if F_s ⊂ F_t for s ≤ t and F_0 contains all A ∈ F with P(A) = 0. A stochastic process X = X(t) is said to be (F_t)_{t≥0}-adapted if, for any Borel set B ∈ B,
  X(t)^{-1}(B) ∈ F_t,
i.e., ω ↦ X(t, ω) is F_t-measurable.

Integration with respect to the measure P. If (Ω, F, P) is a probability space and X = Σ_{i=1}^N a_i χ_{A_i}, where a_i ∈ R, χ_{A_i} is the characteristic function of A_i ∈ F, A_i ∩ A_j = ∅ for i ≠ j, and ∪_{i=1}^N A_i = Ω (X is called a simple random variable), we define the integral
  ∫_Ω X dP = Σ_{i=1}^N a_i P(A_i).  (2)
If X ≥ 0, then, by definition,
  ∫_Ω X dP = sup { ∫_Ω Y dP; Y ≤ X, Y simple }.
If X : Ω → R is a random variable, we define
  ∫_Ω X dP = ∫_Ω X^+ dP − ∫_Ω X^- dP.

Finally, if X : Ω → R^n is an n-dimensional random variable, then X = (X_1, X_2, ..., X_n), where X_i : Ω → R, and we define
  ∫_Ω X dP = ( ∫_Ω X_1 dP, ..., ∫_Ω X_n dP ).  (3)
We call
  E(X) = ∫_Ω X dP  (4)
the expectation or mean value of X. If X : Ω → R^n is a random variable, then its distribution function F_X : R^n → [0, 1] is defined by
  F_X(x) = P(X ≤ x), x ∈ R^n.  (5)
If there is a nonnegative integrable function f : R^n → R such that
  F_X(x) = F_X(x_1, ..., x_n) = ∫_{−∞}^{x_1} ··· ∫_{−∞}^{x_n} f(y_1, ..., y_n) dy_1 ... dy_n,
then f is called the density function of X. We have, for each Borel set B,
  P(X ∈ B) = ∫_B f(x) dx = ∫_B dF_X(x).  (6)

Example. If X : Ω → R has the density
  f(x) = (1/√(2πσ^2)) e^{−(x−m)^2/(2σ^2)}, x ∈ R,
we say that the random variable X is Gaussian, or has normal distribution, with mean m and variance σ^2. X is also called an N(m, σ^2) random variable. If X : Ω → R^n has the density
  f(x) = (1/((2π)^{n/2} (det C)^{1/2})) e^{−(x−m)·C^{-1}(x−m)/2},  (7)
we say that X has a Gaussian or normal distribution with mean m and covariance matrix C. In the case n = 1,
  E(X) = (1/√(2πσ^2)) ∫_R x e^{−(x−m)^2/(2σ^2)} dx = m.
The random variables X_i : Ω → R^n, i = 1, ..., m, are said to be independent if, for all 2 ≤ k ≤ m and all Borel sets B_1, ..., B_k ⊂ R^n,
  P(X_1 ∈ B_1, X_2 ∈ B_2, ..., X_k ∈ B_k) = P(X_1 ∈ B_1) ··· P(X_k ∈ B_k).  (8)
In this case, we also have
  E(X_1 ··· X_m) = E(X_1) E(X_2) ··· E(X_m).  (9)
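As a quick numerical sanity check (illustrative only; the values of m and σ below are arbitrary), one can verify on a fine grid that the N(m, σ^2) density integrates to 1 and that ∫ x f(x) dx = m:

```python
# Grid check of the N(m, sigma^2) density: total mass ~ 1, mean ~ m.
# The parameters m and sigma are arbitrary illustration values.
import numpy as np

m, sigma = 1.5, 0.7
x = np.linspace(m - 10 * sigma, m + 10 * sigma, 200_001)
dx = x[1] - x[0]
f = np.exp(-(x - m) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

total_mass = np.sum(f) * dx       # Riemann sum of f, should be ~1
mean_value = np.sum(x * f) * dx   # Riemann sum of x*f, should be ~m
print(total_mass, mean_value)
```

The truncation at ±10σ and the fine grid make both Riemann sums accurate to far better than 10^{-6}.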

Convergence of random variables
(i) X_n → X a.s. if P( ω ∈ Ω; lim_{n→∞} X_n(ω) = X(ω) ) = 1.
(ii) X_n → X in probability if, for every ε > 0, lim_{n→∞} P{|X_n − X| > ε} = 0.
It turns out that (i) ⟹ (ii).

Conditional expectation. Let X be an integrable random variable and let Y : Ω → R^n be a random variable. Then E(X|Y) is the F(Y)-measurable random variable defined by
  ∫_A X dP = ∫_A E(X|Y) dP for all A ∈ F(Y).  (10)

Martingales. Let X = X(t) be a real-valued stochastic process. Then F(t) = F(X(s); s ≤ t) is the σ-algebra generated by the random variables X(s) for s ≤ t; it is the smallest sub-σ-algebra of F with respect to which all the X(s), s ≤ t, are measurable. (This is the history of the process X until time t.)

Definition. Let X : Ω → R be a stochastic process such that E|X(t)| < ∞ for all t ≥ 0.
(i) If X(s) = E(X(t) | F(s)), P-a.s., for all t ≥ s ≥ 0, then X is called a martingale.
(ii) If X(s) ≤ E(X(t) | F(s)), P-a.s., for all t ≥ s ≥ 0, then X is a submartingale.
Theorem (the martingale inequality). If X = X(t) is a martingale with continuous sample paths and 1 < p < ∞, then
  E( max_{0≤s≤t} |X(s)|^p ) ≤ (p/(p−1))^p E|X(t)|^p.
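The martingale inequality can be illustrated on a discrete martingale. The sketch below (my own example, not from the lecture; path counts are arbitrary) simulates a symmetric random walk S_k, which is a martingale, and checks the p = 2 bound E[max_k S_k^2] ≤ (2/1)^2 E[S_n^2] = 4n:

```python
# Simulation check of Doob's maximal inequality for p = 2 on a
# symmetric random walk (illustrative sketch, fixed seed).
import numpy as np

rng = np.random.default_rng(42)
n_steps, n_paths = 200, 20_000
steps = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))
paths = np.cumsum(steps, axis=1)            # S_1, ..., S_n per path

lhs = np.mean(np.max(paths ** 2, axis=1))   # estimate of E[max_k S_k^2]
rhs = 4 * np.mean(paths[:, -1] ** 2)        # estimate of 4 E[S_n^2] ~ 4n
print(lhs, rhs)
```

Empirically lhs comes out well below rhs, consistent with the inequality (the bound is not tight for this walk).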

Brownian motion (Definition). A real-valued stochastic process W : Ω → R is called a Brownian motion or Wiener process if
(i) W(0) = 0, P-a.s.;
(ii) W(t) − W(s) is Gaussian with mean 0 and variance t − s, for t ≥ s ≥ 0;
(iii) for all times 0 < t_1 < t_2 < ··· < t_n, the random variables
  W(t_1), W(t_2) − W(t_1), ..., W(t_n) − W(t_{n−1})
are independent (independent increments).
In particular, it follows that
  E[W(t)] = 0, E[W^2(t)] = t, t ≥ 0,
  P[a ≤ W(t) ≤ b] = (1/√(2πt)) ∫_a^b e^{−x^2/(2t)} dx.
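A Brownian path can be simulated by summing independent N(0, dt) increments. The sketch below (illustrative; the seed and sample sizes are arbitrary) checks the moment identities E[W(1)] = 0 and E[W^2(1)] = 1 by Monte Carlo:

```python
# Monte Carlo check of E[W(t)] = 0 and E[W(t)^2] = t at t = 1,
# building W(1) as a sum of independent Gaussian increments.
import numpy as np

rng = np.random.default_rng(7)
n_paths, n_steps, t = 50_000, 100, 1.0
dt = t / n_steps
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
w_t = increments.sum(axis=1)    # one sample of W(1) per path

print(w_t.mean(), (w_t ** 2).mean())
```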

An R^n-valued stochastic process W(t) = (W_1(t), ..., W_n(t)) is an n-dimensional (n-D) Wiener process (or Brownian motion) provided
(i) for each k, W_k is a 1-D Wiener process;
(ii) the σ-algebras W_k = F(W_k(t); t ≥ 0), k = 1, ..., n, are independent.
It turns out that, if W is an n-D Wiener process, then for each Borel set A ⊂ R^n,
  P(W(t) ∈ A) = (1/(2πt)^{n/2}) ∫_A e^{−|x|^2/(2t)} dx.  (13)
If W is a Wiener process, then, P-a.s., the function t ↦ W(t) is Hölder continuous with every exponent α < 1/2. Moreover, for P-a.e. ω ∈ Ω, t ↦ W(t, ω) is nowhere differentiable.

Markov processes. If X is a stochastic process, then F(t) = F(X(s); s ≤ t), the σ-algebra generated by {X(s); s ≤ t}, is called the history of the process X up to time t. The R^n-valued stochastic process X is called a Markov process if
  P(X(t) ∈ B | F(s)) = P(X(t) ∈ B | X(s)), P-a.s., 0 ≤ s ≤ t,  (14)
for all Borel sets B ⊂ R^n.

Stochastic integrals. Let W(t) be a 1-D Brownian motion on some probability space (Ω, F, P). The σ-algebra W(t) = F(W(s); s ≤ t) is called the history of W up to time t.
Definition. A family (F_t)_{t≥0} ⊂ F is called a filtration with respect to W(t) if
(i) F_s ⊂ F_t for s ≤ t;
(ii) W(t) ⊂ F_t for all t ≥ 0;
(iii) F_t is independent of W^+(t) = F(W(s) − W(t); s ≥ t).
Definition. The real-valued stochastic process X is called nonanticipating with respect to (F_t)_{t≥0} if, for each t, X(t) is F_t-measurable; one also says that X is (F_t)_{t≥0}-adapted. The process X is called progressively measurable if X : (0, ∞) × Ω → R is measurable and t ↦ X(t) is (F_t)_{t≥0}-adapted.

The process X : [0, T] → R is called a step process if there is a partition {0 = t_0 < t_1 < ··· < t_m = T} such that X(t) = a_k for t_k ≤ t < t_{k+1}. For such a process we define the Itô stochastic integral
  ∫_0^T X dW = Σ_{k=0}^{m−1} a_k (W(t_{k+1}) − W(t_k)).  (15)
It turns out that
  E( ∫_0^T X dW ) = 0,  E( ( ∫_0^T X dW )^2 ) = E ∫_0^T X^2 dt.  (16)
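Formula (16) can be tested by Monte Carlo for a deterministic step process. In the sketch below (the step values a_k, the partition, and the seed are arbitrary choices of mine), the integral Σ_k a_k (W(t_{k+1}) − W(t_k)) should have mean 0 and second moment Σ_k a_k^2 (t_{k+1} − t_k):

```python
# Monte Carlo check of (16) for a deterministic step process on [0, 1].
import numpy as np

rng = np.random.default_rng(1)
a = np.array([1.0, -2.0, 0.5])        # step values a_0, a_1, a_2 (arbitrary)
t = np.array([0.0, 1/3, 2/3, 1.0])    # partition of [0, T], T = 1
dt = np.diff(t)

n_paths = 100_000
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, len(dt)))  # increments
integral = dW @ a                     # sum_k a_k (W(t_{k+1}) - W(t_k))

exact_second_moment = np.sum(a ** 2 * dt)   # Ito isometry value: 1.75
print(integral.mean(), (integral ** 2).mean(), exact_second_moment)
```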

Now, for an arbitrary process X in
  L^2(0, T) = { X : [0, T] → R; E ∫_0^T X^2 dt < ∞ },
one defines the Itô integral
  ∫_0^T X dW = lim_{n→∞} ∫_0^T X_n dW,  (17)
where the limit is taken in L^2(Ω) and (X_n) is a family of step processes such that
  lim_{n→∞} E ∫_0^T |X − X_n|^2 dt = 0.  (18)

For instance, if t ↦ X(t, ω) is continuous, one can choose
  X_n(t) = X(k/n) for k/n ≤ t < (k+1)/n, k = 0, 1, ..., [nT],
and, for a general X ∈ L^2(0, T), one takes
  X_n(t) = ∫_0^t n e^{n(s−t)} X(s) ds.
This definition extends to all processes X which are progressively measurable and satisfy ∫_0^T X^2 dt < ∞, P-a.s. Moreover, I(t) = ∫_0^t X dW is a martingale.
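To see the step-process approximation at work, take X = W itself. The left-endpoint sums Σ_k W(t_k)(W(t_{k+1}) − W(t_k)) approximate ∫_0^1 W dW, which by Itô's formula equals (W(1)^2 − 1)/2. The sketch below (one path; the seed and step count are arbitrary) compares the two:

```python
# Left-endpoint approximation of the Ito integral of W against dW
# on [0, 1], compared with the closed form (W(1)^2 - 1)/2.
import numpy as np

rng = np.random.default_rng(3)
n = 2000
dt = 1.0 / n
dW = rng.normal(0.0, np.sqrt(dt), size=n)
W = np.concatenate(([0.0], np.cumsum(dW)))   # W(t_k), k = 0, ..., n

left_sum = np.sum(W[:-1] * dW)               # step-process approximation
exact = 0.5 * (W[-1] ** 2 - 1.0)             # value from Ito's formula
print(left_sum, exact)
```

The discrepancy is (1 − Σ_k (ΔW_k)^2)/2, which is small because the quadratic variation Σ_k (ΔW_k)^2 concentrates around 1 as n grows.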

Stochastic integrals in n dimensions. Let W = (W_1, ..., W_m) be an m-dimensional Wiener process and let (F_t)_{t≥0} be a filtration such that F_t ⊃ F(W(s); s ≤ t) and F_t is independent of F(W(s) − W(t); t ≤ s < ∞). Consider an n × m matrix-valued stochastic process X = (X_{ij}), i = 1, ..., n, j = 1, ..., m, with X ∈ L^2_{n×m}(0, T). Then ∫_0^T X dW is an R^n-valued random variable whose i-th component is
  Σ_{j=1}^m ∫_0^T X_{ij} dW_j, i = 1, ..., n.

Properties.
  E( ∫_0^T X dW ) = 0,  E( | ∫_0^T X dW |^2 ) = E ∫_0^T |X|^2 dt,
where |G|^2 = Σ_{i=1}^n Σ_{j=1}^m |G_{ij}|^2. Moreover, for G ∈ L^2_{n×m}(0, T),
  I(t) = ∫_0^t G(s) dW(s)
is a martingale and t ↦ I(t) is continuous, P-a.s. (that is, I has continuous sample paths).

Definition. If X = (X_1, X_2, ..., X_n) is an R^n-valued stochastic process such that
  X(t) = X(s) + ∫_s^t F(τ) dτ + ∫_s^t G dW,  (19)
we say that X has the stochastic differential
  dX = F dt + G dW.  (20)
That is, componentwise,
  dX_i = F_i dt + Σ_{j=1}^m G_{ij} dW_j, i = 1, 2, ..., n,  (21)
where X = (X_1, ..., X_n)^T, F = (F_1, ..., F_n)^T, and G = (G_{ij}) is an n × m matrix.

Theorem (Itô's formula). Suppose that dX = F dt + G dW and let ϕ : [0, T] × R^n → R be continuously differentiable in t and twice continuously differentiable in x. Then
  dϕ(t, X(t)) = ϕ_t dt + Σ_{i=1}^n ϕ_{x_i} dX_i + (1/2) Σ_{i,j=1}^n ϕ_{x_i x_j} Σ_{l=1}^m G_{il} G_{jl} dt
    = ϕ_t dt + Σ_{i=1}^n ϕ_{x_i} F_i dt + Σ_{i=1}^n ϕ_{x_i} Σ_{j=1}^m G_{ij} dW_j
      + (1/2) Σ_{i,j=1}^n ϕ_{x_i x_j} Σ_{l=1}^m G_{il} G_{jl} dt.  (22)
In compact form,
  dϕ(t, X(t)) = ϕ_t dt + ∇ϕ(t, X(t)) · F dt + ∇ϕ(t, X(t)) · G dW + (1/2) Σ_{i,j=1}^n ϕ_{x_i x_j} (G G^T)_{ij} dt.  (23)

Examples 1 Let dx = Fdt + GdW, F L 1 (, T), G L 2 (, T), W Brownian motion in 1 D. Then, if ϕ C 1 ([, T] R), 2 ϕ X C([, T] R), 2 we have dϕ(t, X(t)) = ϕ t 2 Let X = W, ϕ = ϕ(t, X). Then dϕ(t, W(t)) = ϕ t (t, W(t))dt+ 1 2 3 dx 1 = F 1 dt + G 1 dw dx 2 = F 2 dt + G 2 dw. Then d(x 1 X 2 ) = X 2 dx 1 + X 1 dx 2 + G 1 Gdt. ϕ (t, X(t))dt + (t, X(t))F(t)dt X + 1 2 G2 (t) 2 ϕ (t, X(t)), t (, T). X2 (24) 2 ϕ ϕ (t, W(t))dt+ (t, W(t))dW(t). X2 X

Remembering formula (22),
  dϕ(t, X) = ϕ_t dt + Σ_{i=1}^n ϕ_{x_i} dX_i + (1/2) Σ_{i,j=1}^n ϕ_{x_i x_j} dX_i dX_j,  (25)
where, in computing dX_i dX_j, we use the symbolic rules
  (dt)^2 = 0, dt dW_k = 0, dW_k dW_l = δ_{kl} dt, k, l = 1, ..., m.

Exercises 1 Calculate dw m, where W is a Wiener process. ( ) 2 Calculate d e λw(t) λ2 t 2. 3 Calculate d(tw). 4 Calculate t W dw. Hint. Set X(t) = t dw(s) and apply Itô s formula to ϕ(x) = 1 2 x2.