MATH40 Financial Mathematics (05-06) Tutorial 7

Review of some basic probability

The triple $(\Omega, \mathcal{F}, P)$ is called a probability space, where $\Omega$ denotes the sample space and $\mathcal{F}$ is the set of events (a $\sigma$-algebra on $\Omega$). Also recall that a random variable $X : \Omega \to \mathbb{R}$ is a real-valued function from $\Omega$ to $\mathbb{R}$. The probability $P$ is a function $P : \mathcal{F} \to [0,1]$ such that:

1. $P(\emptyset) = 0$, $P(\Omega) = 1$.

2. If $A_1, A_2, \ldots \in \mathcal{F}$ and $\{A_i\}_i$ are pairwise disjoint, then $P\left(\bigcup_i A_i\right) = \sum_i P(A_i)$.

Let $X$ be a random variable, and let $p_X(x)$ be its probability density function; hence $p_X(x) \ge 0$ for all values of $x$ and $p_X$ is integrable (that means $\int_{-\infty}^{\infty} p_X(x)\,dx$ is well-defined). Then
\[
P\{X > \delta\} = \int_{\delta}^{\infty} p_X(x)\,dx \qquad \text{and} \qquad \int_{-\infty}^{\infty} p_X(x)\,dx = 1.
\]
A random variable $Z$ with the standard normal distribution is denoted by $Z \sim N(0,1)$. Such a random variable is said to be Gaussian (normally distributed). The corresponding probability density function of $Z$ is given by
\[
p_Z(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}, \qquad x \in (-\infty, \infty).
\]

3. The expectation of $X$ is
\[
E[X] = \int_{-\infty}^{\infty} x\, p_X(x)\,dx,
\]
and for any real-valued function $g$,
\[
E[g(X)] = \int_{-\infty}^{\infty} g(x)\, p_X(x)\,dx.
\]

4. The variance of $X$ is
\[
\mathrm{Var}(X) = E\left[(X - E[X])^2\right] = E[X^2] - (E[X])^2.
\]

5. Linearity of the expectation and scaling of the variance:
\[
E[aX + b] = E[aX] + b = aE[X] + b; \qquad \mathrm{Var}(aX + b) = \mathrm{Var}(aX) = a^2\,\mathrm{Var}(X).
\]
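The identities in item 5 can be illustrated with a short Monte Carlo sketch. Everything in it (the test distribution $N(2, 3^2)$, the constants $a$ and $b$, the seed, and the sample size) is my own illustrative choice, not part of the tutorial:

```python
import random
import statistics

# Monte Carlo illustration of E[aX + b] = aE[X] + b and Var(aX + b) = a^2 Var(X).
# The distribution N(2, 3^2), constants a, b, seed, and sample size are arbitrary.
random.seed(0)
a, b = 2.0, 5.0
xs = [random.gauss(2.0, 3.0) for _ in range(200_000)]
ys = [a * x + b for x in xs]  # the transformed variable Y = aX + b

mean_x, var_x = statistics.fmean(xs), statistics.pvariance(xs)
mean_y, var_y = statistics.fmean(ys), statistics.pvariance(ys)

# Because Y is a deterministic transform of X, the sample mean and sample
# variance satisfy the identities exactly (up to floating-point rounding).
print(mean_y, a * mean_x + b)
print(var_y, a * a * var_x)
```

Note that for the empirical (sample) distribution the identities hold exactly, not just approximately, since each sample is transformed pointwise.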
6. The moment generating function:
\[
E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx}\, p_X(x)\,dx = \sum_{k=0}^{\infty} \frac{t^k}{k!} \int_{-\infty}^{\infty} x^k\, p_X(x)\,dx = \sum_{k=0}^{\infty} \frac{t^k}{k!}\, E[X^k].
\]
Note that $E[e^{tX}]$ is a function of $t$. By the above equation we have $E[e^{tX}] = \sum_{k=0}^{\infty} \frac{t^k}{k!} E[X^k]$, so it contains the information of all the moments of the random variable $X$. Taking the $m$-th derivative of both sides of the above equation at $t = 0$, we obtain the $m$-th moment of $X$:
\[
\left.\frac{d^m}{dt^m}\, E[e^{tX}]\right|_{t=0} = E[X^m].
\]

Example 1: Show that
\[
E[Z^n] =
\begin{cases}
0, & \text{when } n \text{ is odd}, \\[4pt]
\dfrac{(2k)!}{2^k\, k!}, & \text{when } n = 2k,
\end{cases}
\]
where $Z \sim N(0,1)$.

Answer: We will prove this by induction on $n$, using the moment generating function of $Z$. Since $Z \sim N(0,1)$, the density function of $Z$ is
\[
p_Z(x) = \frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}, \qquad -\infty < x < \infty.
\]
The moment generating function of $Z$ is
\[
E[e^{tZ}] = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{tx}\, e^{-x^2/2}\,dx
= e^{t^2/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-(x-t)^2/2}\,dx
= e^{t^2/2}.
\]
For $n = 1, 2$:
\[
\left.\frac{d}{dt}\left(e^{t^2/2}\right)\right|_{t=0} = \left. t\, e^{t^2/2} \right|_{t=0} = 0,
\qquad
\left.\frac{d^2}{dt^2}\left(e^{t^2/2}\right)\right|_{t=0} = \left. \left(e^{t^2/2} + t^2 e^{t^2/2}\right) \right|_{t=0} = 1.
\]
So it is true for $n = 1, 2$. Assume that it is true for $n = 1, \ldots, m$. For $n = m + 1$,
\[
\frac{d^{m+1}}{dt^{m+1}}\left(e^{t^2/2}\right)
= \frac{d^m}{dt^m}\left(t\, e^{t^2/2}\right)
= \sum_{r=0}^{m} \binom{m}{r} \frac{d^r}{dt^r}(t)\, \frac{d^{m-r}}{dt^{m-r}}\left(e^{t^2/2}\right)
= t\,\frac{d^m}{dt^m}\left(e^{t^2/2}\right) + m\,\frac{d^{m-1}}{dt^{m-1}}\left(e^{t^2/2}\right).
\]
\(Leibniz rule: $(uv)^{(n)} = \sum_{k=0}^{n} \binom{n}{k}\, u^{(k)}\, v^{(n-k)}$.\)
Therefore
\[
\left.\frac{d^{m+1}}{dt^{m+1}}\left(e^{t^2/2}\right)\right|_{t=0} = m \left.\frac{d^{m-1}}{dt^{m-1}}\left(e^{t^2/2}\right)\right|_{t=0}.
\]
If $m+1$ is odd, then $m-1$ is also odd, and we have
\[
\left.\frac{d^{m+1}}{dt^{m+1}}\left(e^{t^2/2}\right)\right|_{t=0}
= m \left.\frac{d^{m-1}}{dt^{m-1}}\left(e^{t^2/2}\right)\right|_{t=0}
= m(m-2) \left.\frac{d^{m-3}}{dt^{m-3}}\left(e^{t^2/2}\right)\right|_{t=0}
= \cdots
= \left. m(m-2)\cdots 2 \cdot t\, e^{t^2/2} \right|_{t=0} = 0.
\]
If $m+1$ is even, then $m-1$ is also even; let $m + 1 = 2k$. We have
\[
\left.\frac{d^{m+1}}{dt^{m+1}}\left(e^{t^2/2}\right)\right|_{t=0}
= (2k-1) \left.\frac{d^{2k-2}}{dt^{2k-2}}\left(e^{t^2/2}\right)\right|_{t=0}
= (2k-1)(2k-3) \left.\frac{d^{2k-4}}{dt^{2k-4}}\left(e^{t^2/2}\right)\right|_{t=0}
= \cdots
= (2k-1)(2k-3)\cdots 1 \cdot \left. e^{t^2/2} \right|_{t=0},
\]
and
\[
(2k-1)(2k-3)\cdots 3 \cdot 1
= \frac{(2k)(2k-1)(2k-2)\cdots 2 \cdot 1}{(2k)(2k-2)\cdots 2}
= \frac{(2k)!}{2^k\, k!}.
\]

Example 2: Probability density function of a transformed random variable

Suppose the random variable $X$ has continuous probability density function $f$ and $P\{\alpha \le X \le \beta\} = 1$. Let $Y = g(X)$, where $g$ is strictly increasing and differentiable on $(\alpha, \beta)$. Show that $Y$ has probability density function
\[
f_Y(y) = \frac{f\left(g^{-1}(y)\right)}{g'\left(g^{-1}(y)\right)}.
\]

Answer: For $y \in (g(\alpha), g(\beta))$,
\[
P\{Y \le y\} = P\{g(X) \le y\} = P\left\{X \le g^{-1}(y)\right\} = \int_{\alpha}^{g^{-1}(y)} f(x)\,dx,
\]
so
\[
f_Y(y) = \frac{d}{dy}\, P\{Y \le y\}
= \frac{d}{dy} \int_{\alpha}^{g^{-1}(y)} f(x)\,dx
= f\left(g^{-1}(y)\right) \left(g^{-1}\right)'(y)
= \frac{f\left(g^{-1}(y)\right)}{g'\left(g^{-1}(y)\right)}.
\]

Example 3: Log-normal distribution

The random variable $S$ is log-normally distributed if
\[
\ln S \sim N(\mu, \sigma^2).
\]
Find the probability density function $p_S(x)$ of $S$.

Answer: Clearly $p_S(x) = 0$ for $x \le 0$, so we only consider the situation $x > 0$. By the definition of the probability density function,
\[
P(S \le x) = \int_0^x p_S(t)\,dt.
\]
Since
\[
P(S \le x) = P(\ln S \le \ln x),
\]
let $Y = \ln S$, so that $Y \sim N(\mu, \sigma^2)$; thus
\[
P(S \le x) = P(Y \le \ln x) = \int_{-\infty}^{\ln x} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(y-\mu)^2}{2\sigma^2}}\,dy.
\]
Since
\[
p_S(x) = \frac{d\,P(S \le x)}{dx},
\]
we get
\[
p_S(x) = \frac{d}{dx}\left( \int_{-\infty}^{\ln x} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(y-\mu)^2}{2\sigma^2}}\,dy \right)
= \frac{1}{\sqrt{2\pi}\,\sigma\, x}\, e^{-\frac{(\ln x - \mu)^2}{2\sigma^2}}.
\]

Example 4: Chebyshev's inequality

1. Let $X : \Omega \to \mathbb{R}^n$ be a random variable such that $E[|X|^p] < \infty$ for some $p$, $0 < p < \infty$. Prove Chebyshev's inequality:
\[
P(|X| \ge \lambda) \le \frac{1}{\lambda^p}\, E[|X|^p] \qquad \text{for all } \lambda \ge 0.
\]

2. Suppose there exists $k > 0$ such that $M = E[\exp(k|X|)] < \infty$. Prove that
\[
P(|X| \ge \lambda) \le M e^{-k\lambda} \qquad \text{for all } \lambda \ge 0.
\]

Answer: 1. Let $A = \{\omega : |X(\omega)| \ge \lambda\}$. Then
\[
E[|X|^p] = \int_{\Omega} |X(\omega)|^p\,dP(\omega) \ge \int_{A} |X(\omega)|^p\,dP(\omega) \ge \lambda^p\, P(A),
\]
which gives $P(|X| \ge \lambda) \le \lambda^{-p}\, E[|X|^p]$.

2. From the above (choosing $p = 1$ and applying part 1 to the random variable $\exp(k|X|)$), we have
\[
P(|X| \ge \lambda) = P\left(\exp(k|X|) \ge \exp(k\lambda)\right) \le \exp(-k\lambda)\, E[\exp(k|X|)] = M e^{-k\lambda}.
\]
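Example 1's moment formula can be checked numerically. In the sketch below, the seed and sample size are my own illustrative choices; only the formula $E[Z^{2k}] = (2k)!/(2^k k!)$ comes from the tutorial:

```python
import math
import random

# Monte Carlo check of Example 1: E[Z^(2k)] = (2k)!/(2^k k!) for Z ~ N(0,1).
# Seed and sample size are arbitrary illustrative choices.
random.seed(42)
zs = [random.gauss(0.0, 1.0) for _ in range(500_000)]

results = {}
for k in (1, 2, 3):
    n = 2 * k
    empirical = sum(z ** n for z in zs) / len(zs)
    exact = math.factorial(2 * k) // (2 ** k * math.factorial(k))  # 1, 3, 15
    results[n] = (empirical, exact)
    print(f"E[Z^{n}]: Monte Carlo {empirical:.3f}, exact {exact}")
```

The exact values $1, 3, 15$ are the double factorials $(2k-1)!!$, matching the closed form derived above.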
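Example 3 can likewise be sanity-checked by simulation: sampling $S = e^Y$ with $Y \sim N(\mu, \sigma^2)$ and comparing the empirical CDF against $\Phi\!\left(\frac{\ln x - \mu}{\sigma}\right)$, the integral of the density $p_S$ derived above. The values of $\mu$, $\sigma$, the seed, the sample size, and the evaluation points are illustrative assumptions:

```python
import math
import random

# Check of Example 3: if ln S ~ N(mu, sigma^2), then P(S <= x) = Phi((ln x - mu)/sigma).
# mu, sigma, seed, sample size, and evaluation points are arbitrary choices.
random.seed(3)
mu, sigma = 0.5, 0.8
samples = [math.exp(random.gauss(mu, sigma)) for _ in range(200_000)]

def lognorm_cdf(x: float) -> float:
    """P(S <= x) = Phi((ln x - mu)/sigma) for x > 0, via the error function."""
    return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2.0))))

emp = {}
for x in (0.5, 1.0, 2.0, 4.0):
    emp[x] = sum(1 for s in samples if s <= x) / len(samples)
    print(f"x={x}: empirical {emp[x]:.4f}, exact {lognorm_cdf(x):.4f}")
```

This also exercises Example 2, since $S = g(Y)$ with the strictly increasing map $g(y) = e^y$.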
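Chebyshev's inequality from Example 4 can be illustrated numerically as well. Note that for the empirical distribution the bound holds exactly, because $\mathbf{1}\{|x| \ge \lambda\} \le |x|^p/\lambda^p$ pointwise for every sample. The choice $p = 2$, the distribution $N(0,1)$, the seed, and the sample size are illustrative assumptions:

```python
import random

# Numerical illustration of Chebyshev's inequality (Example 4, part 1):
# P(|X| >= lambda) <= E[|X|^p] / lambda^p, with p = 2 and X ~ N(0,1).
random.seed(1)
xs = [random.gauss(0.0, 1.0) for _ in range(200_000)]
p = 2
moment = sum(abs(x) ** p for x in xs) / len(xs)  # sample estimate of E[|X|^p]

tails, bounds = {}, {}
for lam in (1.0, 2.0, 3.0):
    tails[lam] = sum(1 for x in xs if abs(x) >= lam) / len(xs)
    bounds[lam] = moment / lam ** p
    print(f"lambda={lam}: P(|X|>=lambda) ~ {tails[lam]:.4f} <= bound {bounds[lam]:.4f}")
```

The bound is loose for large $\lambda$ (the true Gaussian tail decays much faster), which is exactly why part 2's exponential bound is worth proving separately.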
Stochastic Processes and the Wiener Process

Definition: A stochastic process is a parametrized collection of random variables $\{X_t\}_{t \in T}$ defined on a probability space $(\Omega, \mathcal{F}, P)$ and assuming values in $\mathbb{R}^n$. The parameter space $T$ is usually the half-line $[0, \infty)$.

Note that for each fixed $t \in T$ we have a random variable
\[
\omega \mapsto X_t(\omega), \qquad \omega \in \Omega.
\]
On the other hand, fixing $\omega \in \Omega$, we can consider the function
\[
t \mapsto X_t(\omega), \qquad t \in T,
\]
which is called a path of $X_t$.

Model for Stock Prices and the Wiener Process

Recall that we have derived the following mathematical representation for stock prices:
\[
\frac{dS(t)}{S(t)} = \mu(t, X_t)\,dt + \sigma(t, X_t)\,dX_t,
\]
where $\mu$ and $\sigma$ are some given functions. The problem is that we need to find a suitable and reasonable $dX_t$. Because we hope that $dX_t$ will be able to represent random noise, the following properties are assumed of $X_t$:

1. $X_0(\omega) = 0$ for all $\omega \in \Omega$;
2. the path $t \mapsto X_t(\omega)$, $[0, \infty) \to \mathbb{R}^n$, is a continuous function for every $\omega \in \Omega$;
3. for every $t$ and $h \ge 0$, the increment $X_{t+h} - X_t \sim N(0, h)$ (Gaussian increments); and
4. the increments $X_v - X_u$ and $X_s - X_t$ are independent for all $0 \le u \le v \le t \le s$ (independent increments).

A stochastic process satisfying 1-4 is called a Wiener process. Note that only the Brownian process can satisfy the continuity property; therefore the terms Brownian motion and Wiener process are usually used interchangeably.

Similarly, we have the following definition for the generalized Wiener process:

Definition: The process $Y_t = \sigma X_t + \mu t + \zeta$ is called a generalized Wiener process, starting at $\zeta$, with drift parameter $\mu$ and variance rate $\sigma^2$.
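Properties 1-4 suggest a direct way to simulate Wiener paths on a grid: start at $0$ and sum independent $N(0, \Delta t)$ increments. A minimal sketch, in which the horizon $[0,1]$, step count, path count, and seed are all illustrative choices:

```python
import random
import statistics

# Simulating paths of a Wiener process W_t on [0, 1] by summing independent
# N(0, dt) increments (properties 1-4). All parameters are illustrative.
random.seed(7)
n_steps, n_paths = 250, 2_000
dt = 1.0 / n_steps

terminal = []
for _ in range(n_paths):
    w = 0.0  # property 1: W_0 = 0
    for _ in range(n_steps):
        w += random.gauss(0.0, dt ** 0.5)  # property 3: increment ~ N(0, dt)
    terminal.append(w)

# By properties 3 and 4, W_1 ~ N(0, 1): sample mean near 0, variance near 1.
print(statistics.fmean(terminal), statistics.pvariance(terminal))
```

A generalized Wiener process path is obtained from the same simulation by the transformation $Y_t = \sigma W_t + \mu t + \zeta$ of each grid value.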