AMS 216 Stochastic Differential Equations Lecture 03 Copyright by Hongyun Wang, UCSC

Review of probability theory (Continued)

We show that the sum of independent normal random variables is a normal random variable.

Characteristic function (CF) of a random variable

Random variable: X(ω)
Density: ρ_X(x)

The characteristic function is defined as

    φ_X(ξ) = E[exp(iξX)] = ∫_{-∞}^{+∞} exp(iξx) ρ_X(x) dx

This is very similar to the Fourier transform (FT).

Fourier transform:

Forward transform:

    f̂(ξ) = ∫_{-∞}^{+∞} exp(−i2πξx) f(x) dx

Backward transform:

    f2(x) = ∫_{-∞}^{+∞} exp(i2πξx) f̂(ξ) dξ

Since f2(x) = f(x), the backward transform is called the inverse transform.

Connection between characteristic function (CF) and Fourier transform (FT):

    φ_X(ξ) = ρ̂_X(ξ1),  evaluated at ξ1 = −ξ/(2π)
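
As a quick numerical sanity check on this connection (a sketch of ours, not part of the notes; the helper names and grid parameters are illustrative), we can integrate both definitions for a standard normal density and compare with the closed-form CF derived below:

```python
# Illustrative check: phi_X(xi) for X ~ N(0,1) versus the Fourier transform
# of its density evaluated at xi1 = -xi/(2*pi).
import numpy as np

x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
rho = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # density of N(0, 1)

def cf(xi):
    # phi_X(xi) = integral of exp(i*xi*x) * rho_X(x) dx (Riemann sum)
    return np.sum(np.exp(1j * xi * x) * rho) * dx

def ft(xi1):
    # rho_hat(xi1) = integral of exp(-i*2*pi*xi1*x) * rho_X(x) dx
    return np.sum(np.exp(-1j * 2 * np.pi * xi1 * x) * rho) * dx

xi = 1.3
print(cf(xi))                   # direct characteristic function
print(ft(-xi / (2 * np.pi)))    # FT at xi1 = -xi/(2*pi); should match
print(np.exp(-xi**2 / 2))       # analytic CF of N(0,1)
```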

If f̂(ξ) = ĝ(ξ) for all ξ, then f(x) = g(x).
If φ_X(ξ) = φ_Y(ξ) for all ξ, then ρ_X(s) = ρ_Y(s).

Characteristic function (CF) of a normal random variable: X ~ N(µ, σ²)

Density:

    ρ_X(x) = 1/√(2πσ²) · exp(−(x−µ)²/(2σ²))

Characteristic function:

    φ_X(ξ) = E[exp(iξX)] = ∫_{-∞}^{+∞} exp(iξx) ρ_X(x) dx

           = ∫_{-∞}^{+∞} 1/√(2πσ²) · exp(−[(x−µ)² − i2σ²ξx] / (2σ²)) dx

Completing the square in the exponent,

    (x−µ)² − i2σ²ξx = (x − µ − iσ²ξ)² + σ⁴ξ² − i2σ²ξµ

so that

    φ_X(ξ) = exp(iµξ − σ²ξ²/2) ∫_{-∞}^{+∞} 1/√(2πσ²) · exp(−(x − µ − iσ²ξ)²/(2σ²)) dx

With the substitution s = x − µ − iσ²ξ, the remaining integral is

    1/√(2πσ²) ∫_{-∞}^{+∞} exp(−s²/(2σ²)) ds = 1

We obtain:

    X ~ N(µ, σ²)  ==>  φ_X(ξ) = exp(iµξ − σ²ξ²/2)
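
This closed form can also be checked empirically. Below is a minimal Monte Carlo sketch (ours, not from the lecture; the parameter values are arbitrary): estimate E[exp(iξX)] by averaging over samples of X ~ N(µ, σ²) and compare with exp(iµξ − σ²ξ²/2).

```python
# Monte Carlo check (illustrative) of phi_X(xi) = exp(i*mu*xi - sigma^2*xi^2/2)
# for X ~ N(mu, sigma^2): average exp(i*xi*X) over many samples.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.5, 0.8
samples = rng.normal(mu, sigma, size=1_000_000)

for xi in (0.5, 1.0, 2.0):
    estimate = np.mean(np.exp(1j * xi * samples))
    exact = np.exp(1j * mu * xi - sigma**2 * xi**2 / 2)
    print(f"xi = {xi}: estimate = {estimate:.4f}, exact = {exact:.4f}")
```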

Since we can map the characteristic function back to the probability density function, we have the theorem below.

Theorem: X ~ N(µ, σ²) if and only if φ_X(ξ) = exp(iµξ − σ²ξ²/2).

Next, we look at the characteristic function of the sum of two independent random variables. It is simply the product of the two individual characteristic functions.

Theorem: If X and Y are independent, then

    φ_{X+Y}(ξ) = φ_X(ξ) φ_Y(ξ)

Proof:

    φ_{X+Y}(ξ) = E[exp(iξ(X + Y))]
               = E[exp(iξX) exp(iξY)]
               = E[exp(iξX)] E[exp(iξY)]    (using the independence)
               = φ_X(ξ) φ_Y(ξ)

Applying this theorem to two independent normal random variables, we conclude:

Theorem: Suppose X ~ N(µ1, σ1²) and Y ~ N(µ2, σ2²), and suppose X and Y are independent. Then

    (X + Y) ~ N(µ1 + µ2, σ1² + σ2²)

That is, the sum of independent normals is normal.

Proof:

    φ_{X+Y}(ξ) = φ_X(ξ) φ_Y(ξ)
               = exp(iµ1ξ − σ1²ξ²/2) · exp(iµ2ξ − σ2²ξ²/2)
               = exp(i(µ1 + µ2)ξ − (σ1² + σ2²)ξ²/2)
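
As an empirical illustration of this theorem (our sketch, not part of the lecture; the parameter values are arbitrary), one can sample independent normals and compare the moments of their sum with the prediction:

```python
# X ~ N(mu1, s1^2) and Y ~ N(mu2, s2^2) independent, so X + Y should match
# N(mu1 + mu2, s1^2 + s2^2). Check mean, variance, and excess kurtosis
# (excess kurtosis of a normal distribution is 0).
import numpy as np

rng = np.random.default_rng(1)
mu1, s1 = 0.5, 1.0
mu2, s2 = -1.0, 2.0
n = 1_000_000
z = rng.normal(mu1, s1, n) + rng.normal(mu2, s2, n)

print(z.mean(), mu1 + mu2)                            # sample mean vs mu1 + mu2
print(z.var(), s1**2 + s2**2)                         # sample var vs s1^2 + s2^2
print(np.mean((z - z.mean())**4) / z.var()**2 - 3)    # excess kurtosis ~ 0
```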

Before we start the discussion of stochastic differential equations, let us look at another example on probability to demonstrate the importance of the framework of repeated experiments.

Monty Hall's game:

Your group and Mike's group will go summer camping together, but each group has a different itinerary in mind. To decide on an itinerary, you and Mike play a game ONCE, with Mike hosting.

Specifications of the game:
1) The host (Mike) puts a card in one of the 3 boxes without you looking (so he knows which box has the card but you don't).
2) You select a box.
3) If the box you pick contains the card, you win (and you will have priority in the itinerary planning). Otherwise, you lose (and Mike will have priority in the itinerary planning).
4) At the end, all boxes are opened to verify that the host is not cheating.

The incident: After you select box #1, Mike says "Let us play it the Monty Hall style." He opens box #2, which is empty, and offers you the option of switching to box #3.

Question: Should you switch?

Answer: The behavior of the host (Mike) is incompletely specified. There are many possible ways the game can be repeated.

Version 1: Monty Hall's game

This is the mathematicians' definition of Monty Hall's game. The real game show hosted by Monty Hall did not actually follow these rules.

*) Upon your initial selection, the host must open a box that you did not pick.
*) The host must open an empty box.
*) The host must offer you the option of switching.

For this game, we have

    Pr(winning | not switching) = 1/3
    Pr(winning | switching) = 2/3

See Appendix A1 for the derivation.

Version 2: (The greedy host)

The greedy host wants to lure you away from the correct box.

*) Upon your initial selection, the host has the option of opening a box that you did not pick. The greedy host will open a box if and only if your initial selection is correct.
*) If the host opens a box, he must open an empty box.
*) If the host opens a box, he must offer you the option of switching.

For this game, we have

    Pr(winning | not switching if offered) = 1/3
    Pr(winning | switching if offered) = 0

See Appendix A2 for the derivation.

Version 3: (The less greedy host)

The less greedy host still wants to lure you away from the correct box, but he wants to avoid this behavior being easily recognized in repeated games.

*) Upon your initial selection, the host has the option of opening a box that you did not pick. The less greedy host adds some randomness to the decision on whether or not to open a box:

    Pr(opening a box | your initial selection is incorrect) = p1
    Pr(opening a box | your initial selection is correct) = p2

*) If the host opens a box, he must open an empty box.
*) If the host opens a box, he must offer you the option of switching.

Version 2 is a special case of Version 3 with p1 = 0 and p2 = 1.

For this game, we have

    Pr(winning | not switching if offered) = 1/3
    Pr(winning | switching if offered) = (1 + 2p1 − p2)/3

For example, for p1 = 0.25 and p2 = 0.75,

    Pr(winning | switching if offered) = 0.25

See Appendix A3 for the derivation.

Key observation: When you encounter an incompletely specified game only once, you have to make a model perceiving how the game is repeated. The model is subjective.
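
The three versions can also be compared numerically. Below is a Monte Carlo sketch of Version 3 under the "always switch if offered" strategy (our illustration, not part of the notes; p1 and p2 are as defined above, and Version 1 and Version 2 are the special cases p1 = p2 = 1 and p1 = 0, p2 = 1):

```python
# Monte Carlo sketch (illustrative) of Version 3 with the "always switch
# if offered" strategy.
import random

def play_always_switch(p1, p2, rng):
    """One round; returns True if you win when switching whenever offered."""
    card = rng.randrange(3)              # box holding the card
    pick = rng.randrange(3)              # your initial selection
    correct = (pick == card)
    if rng.random() < (p2 if correct else p1):
        # Host opens an empty box and offers a switch; switching moves you
        # to the remaining closed box: you win exactly when you were wrong.
        return not correct
    return correct                       # no offer: you keep your box

rng = random.Random(0)
p1, p2 = 0.25, 0.75
n = 1_000_000
wins = sum(play_always_switch(p1, p2, rng) for _ in range(n))
print(wins / n, (1 + 2 * p1 - p2) / 3)   # both should be near 0.25
```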

Stochastic differential equation

    dX(t) = b(X(t), t) dt + a(X(t), t) dW(t)

or, in a more concise form,

    dX = b(X, t) dt + a(X, t) dW

We need to introduce W(t).

The Wiener process (Brownian motion)

Definition 1: The Wiener process, denoted by W(t), satisfies
1) W(0) = 0
2) For t ≥ 0, W(t) ~ N(0, t)
3) For t4 ≥ t3 ≥ t2 ≥ t1 ≥ 0, the increments W(t2) − W(t1) and W(t4) − W(t3) are independent.

Definition 2: The Wiener process W(t) satisfies
1) W(0) = 0
2) For t2 ≥ t1 ≥ 0, W(t2) − W(t1) ~ N(0, t2 − t1)
3) For t4 ≥ t3 ≥ t2 ≥ t1 ≥ 0, the increments W(t2) − W(t1) and W(t4) − W(t3) are independent.

Definition 2 appears to be stronger than Definition 1.

Question: Are these two definitions equivalent?

Answer: Yes.

Theorem: Suppose X and Y are independent. Suppose X ~ N(µ1, σ1²) and (X + Y) ~ N(µ2, σ2²). Then

    Y ~ N(µ2 − µ1, σ2² − σ1²)

Proof: Homework problem.
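
Definition 2 translates directly into a sampling recipe. Below is a minimal sketch (ours, not from the lecture; the step count and seed are arbitrary) that builds a Wiener path from independent N(0, dt) increments and checks property 2 of Definition 1, W(T) ~ N(0, T), across many paths:

```python
# Sampling a Wiener path on [0, T] via Definition 2: the increments
# W(t_{k+1}) - W(t_k) over a uniform grid are i.i.d. N(0, dt).
import numpy as np

rng = np.random.default_rng(2)
T, n_steps = 1.0, 1000
dt = T / n_steps

increments = rng.normal(0.0, np.sqrt(dt), size=n_steps)
W = np.concatenate(([0.0], np.cumsum(increments)))   # one path, W(0) = 0

# Check W(T) ~ N(0, T) over many independent paths (Definition 1, property 2):
WT = rng.normal(0.0, np.sqrt(dt), size=(10_000, n_steps)).sum(axis=1)
print(WT.mean(), WT.var())   # should be near 0 and near T = 1.0
```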

Using this theorem, we show that Definition 1 is as strong as Definition 2. We start with Definition 1 and derive Definition 2.

Definition 1 ==> W(t1) ~ N(0, t1) and W(t2) ~ N(0, t2).

We write W(t2) as a sum:

    W(t2) = W(t1) + (W(t2) − W(t1))

For t2 ≥ t1 ≥ 0, W(t1) and (W(t2) − W(t1)) are independent. Applying the theorem above, we conclude

    W(t2) − W(t1) ~ N(0, t2 − t1)

which is Definition 2.

Appendix A1

Suppose you never switch.

    Pr(winning | not switching) = Pr(your initial selection is correct) = 1/3

Suppose you always switch. Recall that the host must open an empty box. When your initial selection is incorrect and you switch, you will switch to the correct box.

    Pr(winning | switching) = Pr(your initial selection is incorrect) = 2/3

Appendix A2

Suppose you never switch if offered.

    Pr(winning | not switching if offered) = Pr(your initial selection is correct) = 1/3

Suppose you always switch if offered. Let

    A = "your initial selection is correct"
    B = "the host opens an empty box"
    S = "switching if offered"

    W = "winning"

The greedy host opens a box and offers you the option of switching if and only if your initial selection is correct:

    B <==> A

We use the law of total probability:

    Pr(winning | switching if offered)
        = Pr(W | A and S) Pr(A) + Pr(W | A^C) Pr(A^C)
        = 0·(1/3) + 0·(2/3) = 0

Appendix A3

Suppose you never switch if offered.

    Pr(winning | not switching if offered) = Pr(your initial selection is correct) = 1/3

Suppose you always switch if offered. Let

    A = "your initial selection is correct"
    B = "the host opens an empty box"
    S = "switching if offered"
    W = "winning"

The host decides whether or not to open a box with probabilities

    Pr(B | A^C) = p1
    Pr(B | A) = p2

We use the law of total probability:

    Pr(winning | switching if offered)
        = Pr(W | A and B and S) Pr(A and B)
        + Pr(W | A and B^C and S) Pr(A and B^C)
        + Pr(W | A^C and B and S) Pr(A^C and B)
        + Pr(W | A^C and B^C and S) Pr(A^C and B^C)

We first calculate the various terms used in the law of total probability.

    Pr(A and B) = Pr(B | A) Pr(A) = p2·(1/3)
    Pr(A and B^C) = Pr(B^C | A) Pr(A) = (1 − p2)·(1/3)
    Pr(A^C and B) = Pr(B | A^C) Pr(A^C) = p1·(2/3)

    Pr(A^C and B^C) = Pr(B^C | A^C) Pr(A^C) = (1 − p1)·(2/3)

    Pr(W | A and B and S) = 0
    Pr(W | A and B^C and S) = Pr(W | A^C and B and S) = 1
    Pr(W | A^C and B^C and S) = 0

Substituting these terms into the law of total probability, we obtain

    Pr(winning | switching if offered) = 0 + (1 − p2)·(1/3) + p1·(2/3) + 0 = (1 + 2p1 − p2)/3

==> Pr(winning | switching if offered) = (1 + 2p1 − p2)/3
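
As a final sanity check on this algebra (a sketch of ours, not in the notes), exact rational arithmetic reproduces the example value from Version 3:

```python
# Exact check of the Appendix A3 result with rational arithmetic (illustrative).
from fractions import Fraction as F

p1, p2 = F(1, 4), F(3, 4)      # the example values p1 = 0.25, p2 = 0.75
P_A = F(1, 3)                  # Pr(initial selection is correct)

# Only the second and third terms of the total-probability sum survive:
win = (1 - p2) * P_A + p1 * (1 - P_A)
print(win, (1 + 2 * p1 - p2) / 3)   # both equal 1/4
```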