MATH4210 Financial Mathematics (2015-2016) Tutorial 7


Review of some basic Probability:

The triple (Ω, F, P) is called a probability space, where Ω denotes the sample space and F is the set of events (a σ-algebra on Ω). Recall that a random variable X : Ω → R is a real-valued function from Ω to R. The probability P is a function P : F → [0, 1] such that

1. P(∅) = 0 and P(Ω) = 1.

2. If A_1, A_2, ... ∈ F and the A_i are pairwise disjoint, then P(∪_i A_i) = Σ_i P(A_i).

Let X be a random variable, and let p_X(x) be its probability density function; hence p_X(x) ≥ 0 for all values of x and p_X is integrable (that is, ∫_{-∞}^{∞} p_X(x) dx is well-defined).

1. P{X > δ} = ∫_δ^{∞} p_X(x) dx and ∫_{-∞}^{∞} p_X(x) dx = 1.

2. A random variable Z with the standard normal distribution is denoted by Z ~ N(0, 1); such a random variable is said to be Gaussian. The corresponding probability density function of Z is

   p_Z(x) = (1/√(2π)) e^{-x²/2},  x ∈ (-∞, ∞).

3. The expectation of X is

   E[X] = ∫_{-∞}^{∞} x p_X(x) dx,

   and for any real-valued function g,

   E[g(X)] = ∫_{-∞}^{∞} g(x) p_X(x) dx.

4. The variance of X is

   Var(X) = E[(X - E[X])²] = E[X²] - E[X]².

5. Linearity of the expectation and the scaling of the variance:

   E[aX + b] = E[aX] + b = aE[X] + b;  Var(aX + b) = Var(aX) = a² Var(X).
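As a quick numerical sanity check of the density, expectation, and variance formulas above, the standard normal case can be integrated directly. A minimal sketch (the grid size and the truncation interval are arbitrary choices, not from the tutorial):

```python
import math

def pdf_standard_normal(x):
    """Standard normal density p_Z(x) = exp(-x**2 / 2) / sqrt(2*pi)."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def integrate(f, a, b, n=20000):
    """Trapezoidal rule on [a, b] with n panels."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# Truncate the real line to [-10, 10]; the tails beyond contribute ~1e-23.
mass = integrate(pdf_standard_normal, -10.0, 10.0)
mean = integrate(lambda x: x * pdf_standard_normal(x), -10.0, 10.0)
var = integrate(lambda x: x * x * pdf_standard_normal(x), -10.0, 10.0) - mean ** 2

print(round(mass, 4), round(var, 4))  # 1.0 1.0: total mass 1, Var(Z) = 1
```

The computed mean is also numerically indistinguishable from 0, as symmetry of p_Z requires.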

6. The moment generating function:

   E[e^{tX}] = ∫_{-∞}^{∞} e^{tx} p_X(x) dx
             = ∫_{-∞}^{∞} Σ_{k=0}^{∞} (t^k / k!) x^k p_X(x) dx
             = Σ_{k=0}^{∞} (t^k / k!) ∫_{-∞}^{∞} x^k p_X(x) dx
             = Σ_{k=0}^{∞} (t^k / k!) E[X^k].

   Note that E[e^{tX}] is a function of t; by the above equation it contains the information of all the moments of the random variable X. Taking the m-th derivative of both sides at t = 0, we obtain the m-th moment:

   (d^m/dt^m) E[e^{tX}] |_{t=0} = E[X^m].

Example 1: Show that

   E[Z^n] = 0 when n is odd,  and  E[Z^n] = (2k)! / (2^k k!) when n = 2k,

where Z ~ N(0, 1).

Answer: We will prove this by induction on n, using the moment generating function of Z. Since Z ~ N(0, 1), the density function of Z is

   p_Z(x) = (1/√(2π)) e^{-x²/2},  -∞ < x < ∞.

The moment generating function of Z is

   E[e^{tZ}] = (1/√(2π)) ∫_{-∞}^{∞} e^{tx} e^{-x²/2} dx
             = e^{t²/2} (1/√(2π)) ∫_{-∞}^{∞} e^{-(x-t)²/2} dx
             = e^{t²/2}.

For n = 1, 2:

   (d/dt)(e^{t²/2}) |_{t=0} = t e^{t²/2} |_{t=0} = 0,
   (d²/dt²)(e^{t²/2}) |_{t=0} = (e^{t²/2} + t² e^{t²/2}) |_{t=0} = 1.

So the claim is true for n = 1, 2. Assume that it is true for n = 1, ..., m. For n = m + 1, by the Leibniz rule (uv)^{(n)} = Σ_{k=0}^{n} C(n, k) u^{(k)} v^{(n-k)},

   (d^{m+1}/dt^{m+1})(e^{t²/2}) = (d^m/dt^m)(t e^{t²/2})
     = Σ_{r=0}^{m} C(m, r) (d^r/dt^r)(t) · (d^{m-r}/dt^{m-r})(e^{t²/2})
     = t (d^m/dt^m)(e^{t²/2}) + m (d^{m-1}/dt^{m-1})(e^{t²/2}).
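The closed form from Example 1 can be checked in code, together with the recursion E[Z^{m+1}] = m E[Z^{m-1}] that drives the induction. A small sketch (the function name is ours):

```python
import math

def moment_standard_normal(n):
    """E[Z^n] for Z ~ N(0, 1): zero for odd n, (2k)!/(2^k * k!) for n = 2k."""
    if n % 2 == 1:
        return 0
    k = n // 2
    return math.factorial(2 * k) // (2 ** k * math.factorial(k))

# First even moments: the odd-double-factorial pattern 1, 3, 15, 105.
print([moment_standard_normal(n) for n in (2, 4, 6, 8)])  # [1, 3, 15, 105]

# The recursion behind the induction step: E[Z^n] = (n - 1) * E[Z^{n-2}].
assert all(moment_standard_normal(n) == (n - 1) * moment_standard_normal(n - 2)
           for n in range(2, 12))
```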

Therefore

   (d^{m+1}/dt^{m+1})(e^{t²/2}) |_{t=0} = m (d^{m-1}/dt^{m-1})(e^{t²/2}) |_{t=0}.

If m + 1 is odd, then m - 1 is also odd, and

   (d^{m+1}/dt^{m+1})(e^{t²/2}) |_{t=0} = m (d^{m-1}/dt^{m-1})(e^{t²/2}) |_{t=0}
     = m(m - 2) (d^{m-3}/dt^{m-3})(e^{t²/2}) |_{t=0}
     = ... = (m(m - 2) ··· 2) · t e^{t²/2} |_{t=0} = 0.

If m + 1 is even, then m - 1 is also even; let m + 1 = 2k. We have

   (d^{m+1}/dt^{m+1})(e^{t²/2}) |_{t=0} = (2k - 1) (d^{m-1}/dt^{m-1})(e^{t²/2}) |_{t=0}
     = (2k - 1)(2k - 3) (d^{m-3}/dt^{m-3})(e^{t²/2}) |_{t=0}
     = ... = (2k - 1)(2k - 3) ··· 1 · e^{t²/2} |_{t=0}
     = (2k)(2k - 1)(2k - 2)(2k - 3) ··· 2 · 1 / ((2k)(2k - 2) ··· 2)
     = (2k)! / (2^k k!).

Example 2: Probability density function of a transformed variable

Suppose the random variable X has continuous probability density function f and P{α ≤ X ≤ β} = 1. Let Y = g(X), where g is strictly increasing and differentiable on (α, β). Show that Y has probability density function

   f_Y(y) = f(g^{-1}(y)) / g'(g^{-1}(y)).

Answer: For y ∈ (g(α), g(β)),

   P{Y ≤ y} = P{g(X) ≤ y} = P{X ≤ g^{-1}(y)} = ∫_α^{g^{-1}(y)} f(x) dx,

so

   f_Y(y) = (d/dy) P{Y ≤ y} = (d/dy) ∫_α^{g^{-1}(y)} f(x) dx
          = f(g^{-1}(y)) · (g^{-1})'(y) = f(g^{-1}(y)) / g'(g^{-1}(y)).

Example 3: Log-normal distribution

The random variable S is log-normally distributed if

   ln S ~ N(µ, σ²).
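The transformation formula in Example 2 can be tried on a concrete case. The choices of f and g below are illustrative, not from the tutorial: with density f(x) = 2x on (α, β) = (0, 1) and g(x) = x², strictly increasing there, the formula predicts that Y = X² is uniform on (0, 1):

```python
import math

def f(x):
    return 2.0 * x          # density of X on (0, 1)

def g_inv(y):
    return math.sqrt(y)     # inverse of g(x) = x**2 on (0, 1)

def g_prime(x):
    return 2.0 * x          # derivative of g

def f_Y(y):
    """f_Y(y) = f(g^{-1}(y)) / g'(g^{-1}(y)); here 2*sqrt(y)/(2*sqrt(y)) = 1."""
    x = g_inv(y)
    return f(x) / g_prime(x)

print([f_Y(y) for y in (0.1, 0.5, 0.9)])  # [1.0, 1.0, 1.0]
```

The constant density 1 confirms the prediction: squaring a variable with density 2x on (0, 1) yields a uniform variable.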

Find the probability density function p_S(x) of S.

Answer: Clearly p_S(x) = 0 for x ≤ 0, so we only consider the situation x > 0. By the definition of the probability density function,

   P(S ≤ x) = ∫_0^x p_S(t) dt.

Since P(S ≤ x) = P(ln S ≤ ln x), letting Y = ln S with Y ~ N(µ, σ²) gives

   P(S ≤ x) = P(Y ≤ ln x) = (1/(σ√(2π))) ∫_{-∞}^{ln x} e^{-(y-µ)²/(2σ²)} dy.

Since p_S(x) = d(P(S ≤ x))/dx, we get

   p_S(x) = (d/dx) (1/(σ√(2π))) ∫_{-∞}^{ln x} e^{-(y-µ)²/(2σ²)} dy
          = (1/(σ x √(2π))) e^{-(ln x - µ)²/(2σ²)}.

Example 4: Chebyshev's inequality

Let X : Ω → R^n be a random variable such that E[|X|^p] < ∞ for some p, 0 < p < ∞.

1. Prove Chebyshev's inequality:

   P(|X| ≥ λ) ≤ λ^{-p} E[|X|^p]  for all λ ≥ 0.

2. Suppose there exists k > 0 such that M = E[exp(k|X|)] < ∞. Prove that

   P(|X| ≥ λ) ≤ M e^{-kλ}  for all λ ≥ 0.

Answer: Let A = {ω : |X(ω)| ≥ λ}. Then

   E[|X|^p] = ∫_Ω |X(ω)|^p dP ≥ ∫_A |X(ω)|^p dP ≥ λ^p P(A),

which is the first inequality. For the second, apply the first with p = 1 to the random variable exp(k|X|):

   P(|X| ≥ λ) = P(exp(k|X|) ≥ exp(kλ)) ≤ exp(-kλ) E[exp(k|X|)] = M e^{-kλ}.
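A Monte Carlo sketch of Chebyshev's inequality for a scalar standard normal X (illustrative only; the tutorial allows vector-valued X): with p = 2 and E[|X|²] = 1, the bound reads P(|X| ≥ λ) ≤ 1/λ². Sample size, seed, and thresholds are arbitrary choices:

```python
import random

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(100000)]

for lam in (1.0, 2.0, 3.0):
    # Empirical tail probability P(|X| >= lam) from the sample.
    empirical = sum(1 for x in samples if abs(x) >= lam) / len(samples)
    bound = 1.0 / lam ** 2
    assert empirical <= bound  # Chebyshev bound holds
    print(lam, round(empirical, 3), "<=", bound)
```

The bound is loose for small λ (the true tail at λ = 1 is about 0.317 against a bound of 1) and tighter in relative terms than nothing, but the exponential bound from part 2 decays much faster for large λ.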

Stochastic Processes and the Wiener Process

Definition: A stochastic process is a parametrized collection of random variables {X_t}_{t ∈ T} defined on a probability space (Ω, F, P) and assuming values in R^n. The parameter space T is usually the half-line [0, ∞). Note that for each fixed t ∈ T, we have a random variable

   ω ↦ X_t(ω),  ω ∈ Ω.

On the other hand, fixing ω ∈ Ω, we can consider the function

   t ↦ X_t(ω),  t ∈ T,

which is called a path of X_t.

Model for stock prices and the Wiener process

Recall that we have derived the following mathematical representation for stock prices:

   dS(t)/S(t) = µ(t, X_t) dt + σ(t, X_t) dX_t,

where µ and σ are some given functions. The problem is that we need to find a suitable and reasonable dX_t. Because we hope that dX_t will be able to represent random noise, we impose the following assumptions on X_t:

1. X_0(ω) = 0 for all ω ∈ Ω;
2. the paths t ↦ X_t(ω) are continuous functions for every ω ∈ Ω;
3. for every t ≥ 0 and h ≥ 0, the increment X_{t+h} - X_t ~ N(0, h) (Gaussian increments); and
4. X_v - X_u and X_t - X_s are independent for all 0 ≤ u ≤ v ≤ s ≤ t (independent increments).

A stochastic process satisfying 1-4 is called a Wiener process. Up to distribution, it is the only such process with continuous paths, so the terms Brownian motion and Wiener process are usually used interchangeably.

Similarly, we have the following definition for the generalized Wiener process:

Definition: The process Y_t = σ X_t + µ t + ζ is called a generalized Wiener process, starting at ζ, with drift parameter µ and variance rate σ².
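Properties 1-4 translate directly into a simulation: start at 0 and add independent N(0, h) increments. A sketch with arbitrary step size and horizon; by property 3, X_1 ~ N(0, 1), so the sample variance of the endpoints should be close to 1:

```python
import math
import random

def wiener_path(t_max, h, rng):
    """Discretized Wiener path [X_0, X_h, X_2h, ...]:
    X_0 = 0 and independent increments X_{t+h} - X_t ~ N(0, h)."""
    path = [0.0]
    for _ in range(int(round(t_max / h))):
        path.append(path[-1] + rng.gauss(0.0, math.sqrt(h)))
    return path

rng = random.Random(0)

# Average X_1**2 over many paths; since E[X_1] = 0, this estimates Var(X_1).
endpoints = [wiener_path(1.0, 0.01, rng)[-1] for _ in range(20000)]
var_est = sum(x * x for x in endpoints) / len(endpoints)
print(round(var_est, 1))  # close to 1.0, as Var(X_1) = 1
```

The same loop with `sigma * increment + mu * h` per step would simulate the generalized Wiener process Y_t of the last definition.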