MATH 56A SPRING 2008 STOCHASTIC PROCESSES

9.3. Itô's formula. First I stated the theorem. Then I did a simple example to make sure we understand what it says. Then I proved it. The key point is Lévy's theorem on quadratic variation.

9.3.1. statement of the theorem.

Theorem 9.19 (Itô). Suppose that $f(x)$ is a $C^2$ (twice continuously differentiable) function and $W_t$ is standard Brownian motion. Then
$$f(W_t) - f(W_0) = \int_0^t f'(W_s)\,dW_s + \frac12 \int_0^t f''(W_s)\,ds$$
where $\int_0^t f'(W_s)\,dW_s$ is the stochastic integral that we defined last time.

The key point is the unexpected term $\frac12 \int_0^t f''(W_s)\,ds$ in the formula.

Example 9.20. Take $f(x) = ax^2 + bx + c$. Then $f(W_0) = f(0) = c$ and
$$f'(x) = 2ax + b, \qquad f''(x) = 2a.$$
So, the LHS of Itô's formula is
$$f(W_t) - f(W_0) = aW_t^2 + bW_t.$$
The RHS is
$$\int_0^t (2aW_s + b)\,dW_s + \frac12 \int_0^t 2a\,ds = 2a\int_0^t W_s\,dW_s + b\int_0^t dW_s + at = 2a\int_0^t W_s\,dW_s + bW_t + at.$$
If we cancel the $bW_t$ terms we have
$$aW_t^2 = 2a\int_0^t W_s\,dW_s + at.$$
The infinitesimal version of this is (after dividing by $a$):
$$d(W_t^2) = 2W_t\,dW_t + dt, \qquad\text{equivalently}\qquad \int_0^t W_s\,dW_s = \tfrac12 W_t^2 - \tfrac12 t.$$
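A quick numerical sketch of this example, using NumPy (the coefficients, step count, and random seed below are arbitrary choices, not from the notes): simulate one Brownian path, form the left-endpoint stochastic integral, and compare the two sides of Itô's formula.

```python
import numpy as np

# Sketch: check Ito's formula for f(x) = a x^2 + b x + c on one simulated path.
rng = np.random.default_rng(0)
a, b, c = 2.0, 3.0, 1.0
t, n = 1.0, 200_000
dt = t / n

dW = rng.normal(0.0, np.sqrt(dt), size=n)    # increments W_{s+dt} - W_s ~ N(0, dt)
W = np.concatenate(([0.0], np.cumsum(dW)))   # sampled path W_0, W_dt, ..., W_t

f = lambda x: a * x**2 + b * x + c
lhs = f(W[-1]) - f(W[0])                     # f(W_t) - f(W_0)

# RHS: left-endpoint (Ito) sum for int_0^t f'(W_s) dW_s, plus (1/2) int_0^t f''(W_s) ds
stochastic_integral = np.sum((2 * a * W[:-1] + b) * dW)
rhs = stochastic_integral + 0.5 * (2 * a) * t

print(lhs, rhs)   # the two values agree up to discretization error
```

The two printed values agree up to the discretization error, which shrinks as the step count grows.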

9.3.2. proof of Itô's formula. I proved the infinitesimal version of Itô's formula, which says:
$$df(W_t) = f'(W_t)\,dW_t + \frac12 f''(W_t)\,dt.$$
This is the limit as $\delta t \to 0$ of the Taylor formula which we saw earlier in the derivation of the heat equation:
$$\delta f(W_t) = f(W_{t+\delta t}) - f(W_t) \overset{\text{Taylor}}{=} f'(W_t)\,\delta W_t + \frac12 f''(W_t)\,(\delta W_t)^2 + O(\delta W^3).$$
In usual calculus we ignore the second term. But in stochastic calculus we keep the second term and ignore the third term since $O(\delta W^3) = o(\delta t)$. As $\delta t$ goes to $0$,
$$\delta f(W_t) \to df(W_t).$$
So, what we need to prove is that
$$\delta W_t \to dW_t, \qquad (\delta W_t)^2 \to dt$$
with probability one. This will follow from Lévy's theorem:

Theorem 9.21 (Lévy). The quadratic variation of Brownian motion is $\langle W \rangle_t = t$ almost surely.

The first point is: Why does this complete the proof of Itô's formula? To see this we need to write the infinitesimal version of Lévy's theorem:
$$d\langle W \rangle_t = dt.$$
Now, recall the definition of quadratic variation:
$$\langle W \rangle_t := \lim_{\delta t \to 0} \sum (\delta W)^2.$$
So, $\delta \langle W \rangle = (\delta W)^2$ by definition. Or:
$$d\langle W \rangle_t = (dW_t)^2.$$
Therefore, the $(\delta W_t)^2$ in Taylor's formula gives the $dt$ in Itô's formula as $\delta t \to 0$.
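The key limit $\sum (\delta W)^2 \to t$ can also be seen numerically. A minimal sketch with NumPy (mesh sizes and seed are arbitrary choices): the sum of squared increments over $[0,t]$ settles down to $t$ as the mesh shrinks.

```python
import numpy as np

# Sketch: the sum of squared Brownian increments over [0, t] approaches t
# as the mesh size shrinks (Levy's theorem).
rng = np.random.default_rng(1)
t = 2.0
for n in [100, 10_000, 1_000_000]:
    dt = t / n
    dW = rng.normal(0.0, np.sqrt(dt), size=n)
    print(n, np.sum(dW**2))   # tends to t = 2.0 as n grows
```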

Proof of Lévy's Theorem. We know that
$$\delta W_t = W_{t+\delta t} - W_t \sim N(0, \delta t).$$
This implies by an easy calculation that
$$E\bigl((\delta W_t)^2\bigr) = \delta t, \qquad E\bigl((\delta W_t)^4\bigr) = 3(\delta t)^2.$$
So, the variance of $(\delta W_t)^2$ is
$$\mathrm{Var}\bigl((\delta W_t)^2\bigr) = E\bigl((\delta W_t)^4\bigr) - E\bigl((\delta W_t)^2\bigr)^2 = 3(\delta t)^2 - (\delta t)^2 = 2(\delta t)^2.$$
So, the standard deviation of $(\delta W_t)^2$ is $\sqrt{2}\,\delta t$. In other words,
$$(\delta W_t)^2 = \delta t \pm \underbrace{\sqrt{2}\,\delta t}_{\text{error}}.$$
The error term is bigger than the term itself! Lévy's Theorem is saying that the error term $\pm\sqrt{2}\,\delta t$ is negligible when compared to the main term $\delta t$!!!

Now go back to the original statement:
$$\langle W \rangle_t = \lim_{\delta t \to 0} \sum (\delta W)^2,$$
where the number of terms in the sum is $N = t/\delta t$. Since the expected value of each term is $E((\delta W)^2) = \delta t$, we know that
$$E\bigl(\langle W \rangle_t\bigr) = \sum \delta t = \frac{t}{\delta t}\,\delta t = t.$$
This means that $\langle W \rangle_t$ is equal to $t$ on average. (So, the distribution of possible values of $\langle W \rangle_t$ forms a bell-shaped curve centered at $t$. The width of the curve at $t$ is the standard deviation, which is the square root of the variance.)

To prove Lévy's theorem we need to prove that the variance of $\langle W \rangle_t$ is zero. Since $\mathrm{Var}(X+Y) = \mathrm{Var}(X) + \mathrm{Var}(Y)$ for independent $X, Y$, we have:
$$\mathrm{Var}\Bigl(\sum (\delta W)^2\Bigr) = \sum \mathrm{Var}\bigl((\delta W)^2\bigr) = \sum 2(\delta t)^2.$$
But we have $t/\delta t$ terms in this sum, so the sum is
$$= \frac{t}{\delta t}\,2(\delta t)^2 = 2t\,\delta t,$$
which converges to zero as $\delta t \to 0$. This proves:
$$\mathrm{Var}\bigl(\langle W \rangle_t\bigr) = 0.$$
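A sketch of this variance estimate with NumPy (the number of simulated paths and the mesh sizes are arbitrary choices): for each mesh, the empirical variance of $\sum(\delta W)^2$ across many paths is compared with $2t\,\delta t$.

```python
import numpy as np

# Sketch: the variance of sum (dW)^2 over [0, t] is 2*t*dt,
# so it shrinks linearly with the mesh size.
rng = np.random.default_rng(2)
t, n_paths = 1.0, 2_000
for n in [50, 500, 5_000]:
    dt = t / n
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))
    quadratic_variation = np.sum(dW**2, axis=1)       # one sum per simulated path
    print(dt, quadratic_variation.var(), 2 * t * dt)  # empirical variance vs. 2*t*dt
```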

So, $\langle W \rangle_t$ is equal to its expected value $t$ with probability one. (Its probability distribution is a Dirac delta function at $t$.)

Exercise 9.22. Calculate $E(X^n)$ for $X \sim N(0, \sigma^2)$.

By definition:
$$E(X^n) = \int_{-\infty}^{\infty} x^n \,\frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-x^2/2\sigma^2}\,dx.$$
Integrate by parts:
$$u = x^{n-1}, \qquad du = (n-1)\,x^{n-2}\,dx,$$
$$dv = x\,\frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-x^2/2\sigma^2}\,dx, \qquad v = -\sigma^2\,\frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-x^2/2\sigma^2}.$$
The product $uv$ vanishes at both tails. So,
$$E(X^n) = (n-1)\,\sigma^2 \int_{-\infty}^{\infty} x^{n-2}\,\frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-x^2/2\sigma^2}\,dx.$$
So,
$$E(X^n) = (n-1)\,\sigma^2\, E(X^{n-2}).$$
Since $E(X) = \mu = 0$, this formula shows that $E(X^{2n-1}) = 0$ for all $n$. But $E(X^0) = 1$. So,
$$E(X^2) = \sigma^2 \qquad\text{and}\qquad E(X^4) = 3\sigma^2 E(X^2) = 3\sigma^4.$$

Exercise 9.23. Show that if $X_t$ is continuous with bounded variation then $\langle X \rangle_t = 0$.

Definition 9.24. The variation of $X_t$ is the total distance travelled by $X$ from time $0$ to time $t$:
$$\mathrm{variation}(X_t) := \lim_{\delta t \to 0} \sum |\delta X_t|.$$
For a $C^1$ (continuously differentiable) function $f(t)$ this is
$$\mathrm{variation}(f) = \int_0^t |f'(s)|\,ds.$$
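A symbolic check of the recursion $E(X^n) = (n-1)\sigma^2 E(X^{n-2})$, as a sketch using SymPy (the particular moments evaluated are an arbitrary choice, not from the notes): compute the moment integrals directly and compare.

```python
import sympy as sp

# Sketch: verify E(X^n) = (n-1) * sigma^2 * E(X^(n-2)) for X ~ N(0, sigma^2)
# by evaluating the moment integrals symbolically.
x = sp.symbols('x', real=True)
sigma = sp.symbols('sigma', positive=True)
density = sp.exp(-x**2 / (2 * sigma**2)) / sp.sqrt(2 * sp.pi * sigma**2)

def moment(n):
    return sp.simplify(sp.integrate(x**n * density, (x, -sp.oo, sp.oo)))

print(moment(2))                                          # sigma**2
print(moment(4))                                          # 3*sigma**4
print(sp.simplify(moment(6) - 5 * sigma**2 * moment(4)))  # 0, i.e. E(X^6) = 5*sigma^2*E(X^4)
```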

For Exercise 9.23,
$$\langle X \rangle_t = \lim_{\delta t \to 0} \sum (\delta X)^2.$$
Since $X_t$ is continuous, $\delta X \to 0$ as $\delta t \to 0$. So, this is the same as
$$\lim_{\delta x \to 0} \sum (\delta x)^2 = \lim_{\delta x \to 0}\, \underbrace{\Bigl(\sum |\delta x|\Bigr)}_{\text{bounded}}\;\underbrace{\delta x}_{\to\,0} = 0.$$
This uses the Lebesgue-style idea of cutting up the image instead of the domain: instead of cutting up equal time intervals, we cut up equal space intervals. Then we can factor out the $\delta x$.

(The Lebesgue integral is given by taking the limit as $\delta y = y_{i+1} - y_i \to 0$:
$$\int_\Omega g\,d\mu = \lim_{\delta y \to 0} \sum y_i\,\mu\bigl(\{x \mid g(x) \in (y_i, y_{i+1}]\}\bigr),$$
where $\mu$ is the measure on subsets of the domain. If $\mu = P$, this is
$$= \int y\,f(y)\,dy = E(Y),$$
where $f(y)$ is the density function of $Y = g$.)
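A numerical sketch of the contrast, using NumPy (the smooth test path $\sin(2\pi s)$ and the mesh sizes are arbitrary choices): the quadratic variation of a $C^1$ path vanishes as the mesh shrinks, while that of a Brownian path stays near $t$.

```python
import numpy as np

# Sketch: a C^1 (bounded-variation) path has quadratic variation tending to 0;
# a Brownian path does not.
rng = np.random.default_rng(3)
t = 1.0
for n in [100, 10_000, 1_000_000]:
    s = np.linspace(0.0, t, n + 1)
    smooth = np.sin(2 * np.pi * s)                 # C^1 path, bounded variation
    brownian = np.concatenate(([0.0], np.cumsum(rng.normal(0.0, np.sqrt(t / n), n))))
    print(n, np.sum(np.diff(smooth)**2), np.sum(np.diff(brownian)**2))
    # first sum -> 0, second sum stays near t = 1.0
```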