Lecture 4: Processes with independent increments


1. A Wiener process
   1.1 Definition of a Wiener process
   1.2 Reflection principle
   1.3 Exponential Brownian motion
   1.4 Exchange of measure (Girsanov theorem)
   1.5 Multivariate Wiener processes
2. Poisson and related processes
   2.1 A Poisson process
   2.2 Compound Poisson processes
   2.3 Jump-diffusion processes
   2.4 Risk processes
3. LN Problems

1. A Wiener process

1.1 Definition of a Wiener process

Let $X(t), t \geq 0$ be a real-valued process defined on some probability space $\langle \Omega, \mathcal{F}, P \rangle$.

Definition 4.1. A stochastic process $X(t), t \geq 0$ is a Markov process if, for any $0 \leq t_0 < \cdots < t_n < t \leq t+s$, $n \geq 1$,
$$P\{X(t+s) \leq y \mid X(t) = x, X(t_k) = x_k, k = 0, 1, \ldots, n\} = P\{X(t+s) \leq y \mid X(t) = x\} = P(t, x, t+s, y). \quad (1)$$

Definition 4.2. A stochastic process $X(t), t \geq 0$ is a process with independent increments if, for any $0 \leq t_0 < \cdots < t_n \leq t \leq t+s$, $n \geq 1$, the increments $X(t+s) - X(t)$, $X(t) - X(t_n)$, $X(t_n) - X(t_{n-1}), \ldots, X(t_1) - X(t_0)$ and $X(t_0)$ are independent random variables.

Lemma 4.1. Any stochastic process with independent increments is a Markov process.

Indeed,
$$P\{X(t+s) \leq y \mid X(t) = x, X(t_k) = x_k, k = 0, 1, \ldots, n\}$$
$$= P\{X(t+s) - X(t) \leq y - x \mid X(t) - X(t_n) = x - x_n, X(t_n) - X(t_{n-1}) = x_n - x_{n-1}, \ldots, X(t_1) - X(t_0) = x_1 - x_0, X(t_0) = x_0\}$$
$$= P\{X(t+s) - X(t) \leq y - x\} = P(t, t+s, y - x). \quad (2)$$

A process with independent increments $X(t), t \geq 0$ is said to be homogeneous in time if, for any $t, s \geq 0$,
$$P\{X(t+s) - X(t) \leq y\} = P(t, t+s, y) = P(s, y).$$
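The following small simulation sketch (not part of the original notes; it assumes NumPy, and the grid, seed and bin width are arbitrary) illustrates Definition 4.2 and Lemma 4.1 on a Gaussian random walk observed at a few time points: increments over disjoint intervals are uncorrelated, and adding an extra condition on the distant past does not change the conditional behaviour of the next value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate many paths of a process with independent increments:
# X(t_k) is the k-th partial sum of i.i.d. N(0, 1) steps (a Gaussian random walk).
n_paths, n_steps = 100_000, 5
steps = rng.standard_normal((n_paths, n_steps))
X = np.cumsum(steps, axis=1)                 # columns: X(t_1), ..., X(t_5)

# Increments over disjoint intervals should be (nearly) uncorrelated.
inc1 = X[:, 2] - X[:, 1]                     # X(t_3) - X(t_2)
inc2 = X[:, 4] - X[:, 3]                     # X(t_5) - X(t_4)
print("corr of disjoint increments:", np.corrcoef(inc1, inc2)[0, 1])

# Markov property (Lemma 4.1): given X(t_4) in a small bin, the conditional mean of
# X(t_5) should not depend on an additional condition on the more distant past X(t_2).
in_bin = np.abs(X[:, 3] - 1.0) < 0.2         # condition on X(t_4) close to 1
past_hi = X[:, 1] > 0                        # extra conditioning on X(t_2)
print("E[X(t_5) | X(t_4)~1, X(t_2)>0 ] ~", X[in_bin & past_hi, 4].mean())
print("E[X(t_5) | X(t_4)~1, X(t_2)<=0] ~", X[in_bin & ~past_hi, 4].mean())
```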

Figure 1: A trajectory of a Wiener process

Definition 4.3. A real-valued stochastic process $W(t), t \geq 0$ defined on a probability space $\langle \Omega, \mathcal{F}, P \rangle$ is called a standard Wiener process (Brownian motion) if:

A: $W(t), t \geq 0$ is a homogeneous process with independent increments with the initial value $W(0) = 0$.

B: The increment $W(t+s) - W(t)$ has a normal distribution with mean $0$ and variance $s$, for $0 \leq t \leq t+s < \infty$.

C: The process $W(t), t \geq 0$ is continuous, i.e., the trajectory $W(t, \omega), t \geq 0$ is a continuous function for every $\omega \in \Omega$.

Lemma 4.2. A process $W(t), t \geq 0$ is a standard Wiener process if and only if it is a real-valued, continuous, Gaussian process with the initial value $W(0) = 0$, expected values $EW(t) = 0, t \geq 0$, and correlation function $EW(t)W(s) = \min(t, s)$, $t, s \geq 0$.
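As an illustration of Definition 4.3 and Lemma 4.2, the following sketch (not from the notes; it assumes NumPy, and the number of paths, step size and seed are arbitrary) simulates Wiener paths by summing independent $N(0, dt)$ increments and checks the moment formulas $EW(t) = 0$, $\operatorname{Var} W(t) = t$ and $EW(t)W(s) = \min(t, s)$ empirically.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate standard Wiener paths on a grid of step dt by summing independent
# N(0, dt) increments (properties A and B of Definition 4.3).
n_paths, n_steps, dt = 20_000, 200, 0.01
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(increments, axis=1)], axis=1)

t_grid = np.arange(n_steps + 1) * dt
i, j = 70, 150                               # grid indices of the times t and s
t, s = t_grid[i], t_grid[j]

# Lemma 4.2: E W(t) = 0, Var W(t) = t and E W(t)W(s) = min(t, s).
print("E W(t)     ~", W[:, i].mean())
print("Var W(t)   ~", W[:, i].var(), "   vs t =", t)
print("E W(t)W(s) ~", (W[:, i] * W[:, j]).mean(), "   vs min(t, s) =", min(t, s))
```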

The proof of Lemma 4.2 is based on the following observations:

(a) A linear transformation of a Gaussian random vector is also a Gaussian random vector.

(b) Let us take arbitrary $n \geq 1$ and $0 = t_0 < t_1 < \cdots < t_n < \infty$. If $(W(t_1), \ldots, W(t_n))$ is a Gaussian random vector, then $(W(t_1) - W(t_0), \ldots, W(t_n) - W(t_{n-1}))$ is also a Gaussian random vector and vice versa, since these vectors are linear transformations of each other.

(c) If the random variables $W(t_i) - W(t_{i-1})$, $i = 1, \ldots, n$ are independent and $\operatorname{Var}(W(t_i) - W(t_{i-1})) = t_i - t_{i-1}$, $i = 1, \ldots, n$, then we get $EW(t_i)W(t_j) = E(W(t_i) - W(t_0))^2 + E(W(t_i) - W(t_0))(W(t_j) - W(t_i)) = t_i$, for $t_i \leq t_j$.

(d) If $EW(t_i)W(t_j) = t_i$ for $t_i \leq t_j$, then $E(W(t_i) - W(t_{i-1}))^2 = EW(t_i)^2 - 2EW(t_i)W(t_{i-1}) + EW(t_{i-1})^2 = t_i - 2t_{i-1} + t_{i-1} = t_i - t_{i-1}$, for $i = 1, \ldots, n$, and $E(W(t_i) - W(t_{i-1}))(W(t_j) - W(t_{j-1})) = t_i - t_i - t_{i-1} + t_{i-1} = 0$, for $i < j$.

(e) Thus the increments $W(t_i) - W(t_{i-1})$, $i = 1, \ldots, n$ are independent, since a Gaussian random vector has independent components if and only if they are uncorrelated.

(1) The Wiener process possesses the following symmetry property,
$$-W(t), t \geq 0 \stackrel{d}{=} W(t), t \geq 0.$$

(2) Trajectories of a Wiener process are continuous but non-differentiable functions.

(a) Indeed, for any $\Delta > 0$, the random variable
$$\frac{W(t+\Delta) - W(t)}{\Delta} \stackrel{d}{=} \frac{1}{\sqrt{\Delta}}\, N(0,1),$$
and, therefore, taking $\Delta_n = \frac{1}{n^4}$, we get, for any $K > 0$,
$$\sum_{n=1}^{\infty} P\Big\{\frac{|W(t+\Delta_n) - W(t)|}{\Delta_n} < K\Big\} = \sum_{n=1}^{\infty} \frac{1}{\sqrt{2\pi}} \int_{-K/n^2}^{K/n^2} e^{-\frac{u^2}{2}}\, du \leq \sum_{n=1}^{\infty} \frac{2K}{\sqrt{2\pi}\, n^2} < \infty.$$

(b) This relation implies, by the Borel-Cantelli lemma, that for any $K > 0$ only a finite number of events from the sequence $A_{n,K} = \big\{\frac{|W(t+\Delta_n) - W(t)|}{\Delta_n} < K\big\}$, $n = 1, 2, \ldots$ occur and, thus,
$$\frac{|W(t+\Delta_n) - W(t)|}{\Delta_n} \to \infty \quad \text{a.s. as } n \to \infty.$$

(3) Trajectories of a Wiener process have unbounded variation on every interval $[a, b]$, $a < b$. That means that, for the dyadic partition points $t_{k,n} = a + k\Delta_n$, where $\Delta_n = (b-a)2^{-n}$, $k = 0, 1, \ldots, 2^n$, $n \geq 1$, the variation sums
$$M_n = \sum_{k=1}^{2^n} |W(t_{k,n}) - W(t_{k-1,n})| \to \infty \quad \text{a.s. as } n \to \infty.$$

(a) Indeed, the total variation of $W(\cdot)$ on $[a, b]$ is not less than $M_n$ for every $n$.

(b) $EM_n = 2^n \sqrt{\Delta_n}\, E|N(0,1)| = c\, 2^{n/2}$, where $c = \sqrt{b-a}\, E|N(0,1)|$;

(c) $\operatorname{Var} M_n = 2^n \operatorname{Var}|W(t_{1,n}) - W(t_{0,n})| \leq 2^n E|W(t_{1,n}) - W(t_{0,n})|^2 = b - a$.

(d) Let $d_n \to \infty$, $d_n = o(2^{n/2})$. Obviously, $P\{|M_n - EM_n| \geq d_n\} \leq (b-a)/d_n^2 \to 0$.

(e) Hence $M_n \stackrel{P}{\to} \infty$ as $n \to \infty$.

(f) $M_n$, $n = 1, 2, \ldots$ is a monotonic (non-decreasing) sequence and, thus, $M_n \to \infty$ a.s. as $n \to \infty$; since the total variation of $W(\cdot)$ on $[a, b]$ dominates every $M_n$, it is infinite a.s.

(4) The Wiener process has the fractal self-similarity property: for any $c > 0$,
$$\frac{1}{\sqrt{c}}\, W(ct), t \geq 0 \stackrel{d}{=} W(t), t \geq 0.$$
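A numerical illustration of property (3) (a sketch, not from the notes; it assumes NumPy, the interval $[a, b] = [0, 1]$, and the simulated path is itself a discretization, so the finest levels coincide with the simulation grid): the dyadic variation sums $M_n$ grow like $2^{n/2}$, while the sums of squared increments stay close to $b - a$, which is the bound used in step (c).

```python
import numpy as np

rng = np.random.default_rng(2)

# One simulated path of W on [a, b] = [0, 1] on the finest dyadic grid.
n_max = 14
N = 2 ** n_max
dW = rng.normal(0.0, np.sqrt(1.0 / N), size=N)
W = np.concatenate([[0.0], np.cumsum(dW)])

# Property (3): the dyadic variation sums M_n grow like c * 2^(n/2)
# (c = E|N(0,1)| = sqrt(2/pi) for b - a = 1), while the sums of squared
# increments stay close to b - a = 1.
for n in range(6, n_max + 1, 2):
    idx = np.arange(0, N + 1, N // 2 ** n)   # dyadic partition with 2^n intervals
    incr = np.diff(W[idx])
    M_n = np.abs(incr).sum()
    Q_n = (incr ** 2).sum()
    print(f"n={n:2d}  M_n={M_n:8.2f}  M_n/2^(n/2)={M_n / 2 ** (n / 2):.3f}  Q_n={Q_n:.3f}")
```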

A process Z() = x + µ + σw (), 0, where x, µ R 1, σ > 0, is a Wienner process wih he iniial sae x, he drif µ, and he diffusion coefficien σ. (5) Transiion probabiliies for Wienner process have he following form, where P (x, y, ) = P{Z(s + ) y/z(s) = x} = P{x + µ + σw () y} = Φ( y x µ σ ), Φ(u) = 1 2π u e v2 2 dv, u R1. (6) They saisfy he Kolmogorov backward equaion, P (x, y, ) = µ σ2 2 P (x, y, ) + x 2 x2p (x, y, ). (7) The characerisic funcion of a Wienner process Z() has he following form, for 0, Ee izz() = e iz(x+µ) z2 σ 2 2, z R 1. 1.2 Reflecion principle Le consider he minimum funcionals of a sandard Wienner process W (), 0, M() = min W (s), 0. 0 s (8) Process Z x () = x + W (), 0 possesses a srong Markov propery a hiing momens τ y = inf( : Z x () = y) ha means ha σ-algebras of random evens σ[z x (s τ y ), s 0] and σ[z x (τ y + ), 0)] are independen and, moreover, Z x (τ y + ), 0 d = Z y (), 0. 6

Figure 2: Reflection transformation

(9) The strong Markov property (8) implies the following reflection principle, which is expressed by the equality, for $x, y \geq 0$,
$$P\{x + W(t) > y,\ x + M(t) \leq 0\} = P\{x + W(t) < -y\} = \Phi\Big(\frac{-x-y}{\sqrt{t}}\Big) = 1 - \Phi\Big(\frac{x+y}{\sqrt{t}}\Big). \quad (3)$$

Theorem 4.1. The following formula takes place for the standard Wiener process,
$$P\{W(t) > u,\ M(t) > -x\} =
\begin{cases}
\Phi\big(\frac{2x+u}{\sqrt{t}}\big) - \Phi\big(\frac{u}{\sqrt{t}}\big) & \text{for } u \geq -x,\ x \geq 0, \\
\Phi\big(\frac{x}{\sqrt{t}}\big) - \Phi\big(\frac{-x}{\sqrt{t}}\big) & \text{for } u < -x,\ x \geq 0.
\end{cases} \quad (4)$$

(10) $-M(t) \stackrel{d}{=} \sup_{0 \leq s \leq t} W(s) \stackrel{d}{=} |N(0, t)|$.
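A Monte Carlo check of Theorem 4.1 and property (10) (an illustrative sketch, not from the notes; it assumes NumPy, the parameters $t$, $u$, $x$ are arbitrary, and the discretized minimum slightly overestimates $M(t)$). The proof of Theorem 4.1 follows after this sketch.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(3)

def Phi(u):
    # Standard normal distribution function.
    return 0.5 * (1.0 + erf(u / sqrt(2.0)))

# Discretized standard Wiener paths on [0, t]; record W(t) and M(t) = min_{s<=t} W(s).
t, n_steps, n_paths = 1.0, 1000, 20_000
dW = rng.normal(0.0, np.sqrt(t / n_steps), size=(n_paths, n_steps))
W = np.cumsum(dW, axis=1)
W_t = W[:, -1]
M_t = np.minimum(W.min(axis=1), 0.0)         # include the starting value W(0) = 0

# Formula (4) of Theorem 4.1 for a pair (u, x) with u >= -x, x >= 0.
u, x = 0.3, 0.5
mc = np.mean((W_t > u) & (M_t > -x))
exact = Phi((2 * x + u) / sqrt(t)) - Phi(u / sqrt(t))
print("P{W(t)>u, M(t)>-x}: Monte Carlo", round(mc, 4), " vs formula", round(exact, 4))

# Property (10): -M(t) has the same distribution as |N(0, t)|.
print("E[-M(t)] ~", -M_t.mean(), " vs E|N(0, t)| =", sqrt(2 * t / np.pi))
```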

The proof of Theorem 4.1 goes as follows.

(a) Using the reflection equality (3), we get, for $x, y \geq 0$,
$$P\{x + W(t) > y,\ x + M(t) > 0\} = P\{x + W(t) > y\} - P\{x + W(t) > y,\ x + M(t) \leq 0\}$$
$$= 1 - \Phi\Big(\frac{y-x}{\sqrt{t}}\Big) - \Big(1 - \Phi\Big(\frac{x+y}{\sqrt{t}}\Big)\Big) = \Phi\Big(\frac{x+y}{\sqrt{t}}\Big) - \Phi\Big(\frac{y-x}{\sqrt{t}}\Big). \quad (5)$$

(b) Using the change of variables $y - x = u$, $y + x = 2x + u$, we get, for $x, y \geq 0$,
$$P\{W(t) > y - x,\ M(t) > -x\} = P\{W(t) > u,\ M(t) > -x\} = \Phi\Big(\frac{2x+u}{\sqrt{t}}\Big) - \Phi\Big(\frac{u}{\sqrt{t}}\Big), \quad u \geq -x,\ x \geq 0,$$
and
$$P\{W(t) > u,\ M(t) > -x\} = P\{M(t) > -x\} = P\{W(t) > -x,\ M(t) > -x\} = \Phi\Big(\frac{x}{\sqrt{t}}\Big) - \Phi\Big(\frac{-x}{\sqrt{t}}\Big), \quad u < -x,\ x \geq 0.$$

1.3 Exponential Brownian motion

Definition 4.4. An exponential Brownian motion is the process given by the following relation,
$$S(t) = S e^{\mu t + \sigma W(t)}, \quad t \geq 0,$$
where $S, \mu \in \mathbb{R}^1$, $\sigma > 0$.

Example. A European option contract. This is a contract between two parties, a seller and a buyer, in which the buyer pays to the seller the price $C > 0$ at moment $0$ for the right to get from the seller the revenue $e^{-rT}[S(T) - K]_+ = e^{-rT}[S(T) - K]\, I(S(T) \geq K)$. Here, $r > 0$ is the risk-free interest rate, $K$ is the so-called strike price, and $T$ is the maturity of the option contract.

It is known that the fair price $C$ of the European contract is given by the following formula,
$$C = E e^{-rT} [S(T) - K]_+,$$
where the expectation should be computed for the so-called risk-neutral model with parameter $\mu = r - \frac{\sigma^2}{2}$.

Note that, in this case, the process $e^{-rt} S(t)$, $t \geq 0$ possesses the martingale property,
$$E\{e^{-r(t+s)} S(t+s) \mid S(t)\} = E\{e^{-rt} S(t)\, e^{(-r+\mu)s + \sigma(W(t+s) - W(t))} \mid S(t)\} = e^{-rt} S(t)\, e^{(-r+\mu+\frac{\sigma^2}{2})s} = e^{-rt} S(t), \quad t, s \geq 0.$$

1.4 Exchange of measure (Girsanov theorem)

Let $W(t), t \geq 0$ be a standard Wiener process defined on a probability space $\langle \Omega, \mathcal{F}, P \rangle$. Let us introduce the random variables, for $\beta \in \mathbb{R}^1$, $T > 0$,
$$Y_\beta(T) = e^{\beta W(T) - \frac{\beta^2}{2} T}. \quad (6)$$

Note that, by the definition, (a) $Y_\beta(T) > 0$ (for every $\omega \in \Omega$), and (b) $E Y_\beta(T) = 1$.

Let us also define a new probability measure on the $\sigma$-algebra $\mathcal{F}$ using the following $\beta$-transformation,
$$\tilde{P}(A) = E\, I(A)\, Y_\beta(T), \quad A \in \mathcal{F}. \quad (7)$$

Lemma 4.3. The measures $P(A) = E\, I(A)$ and $\tilde{P}(A)$ are equivalent, i.e., $P(A) = 0 \Leftrightarrow \tilde{P}(A) = 0$. This readily follows from the positivity of the random variable $Y_\beta(T)$.
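A numerical illustration of the $\beta$-transformation (6)-(7) (a sketch, not from the notes; it assumes NumPy, and $\beta$, $T$ and the sample size are arbitrary): the weights $Y_\beta(T)$ average to $1$, and weighting events by $Y_\beta(T)$ shifts the distribution of $W(T)$ towards $\beta T$, in line with the Girsanov theorem stated next.

```python
import numpy as np

rng = np.random.default_rng(4)

# The beta-transformation (7): the new measure is obtained by weighting events
# under P with Y_beta(T) = exp(beta*W(T) - beta^2*T/2).
T, beta, n = 1.0, 0.7, 1_000_000
W_T = rng.normal(0.0, np.sqrt(T), size=n)    # W(T) under the original measure P
Y = np.exp(beta * W_T - 0.5 * beta**2 * T)   # the density Y_beta(T)

print("E Y_beta(T) ~", Y.mean(), "  (should be 1)")

# Weighted empirical distribution of W(T) under the new measure:
# P~{W(T) <= y} = E[ I(W(T) <= y) * Y_beta(T) ].
for y in (-1.0, 0.0, beta * T, 2.0):
    print(f"P~(W(T) <= {y:+.1f}) ~ {np.mean((W_T <= y) * Y):.4f}")
# The weighted distribution concentrates around beta*T.
```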

Theorem 4.2 (Girsanov). The process $\tilde{W}(t) = W(t) - \beta t$, $t \geq 0$ is a standard Wiener process under the measure $\tilde{P}(A)$ or, equivalently, $W(t) = \tilde{W}(t) + \beta t$, $t \geq 0$ is a Wiener process with drift $\beta$ and diffusion coefficient $1$ under the measure $\tilde{P}(A)$.

(a) The moment generating function (here $\tilde{E}$ denotes the expectation with respect to $\tilde{P}$),
$$\tilde{m}_t(z) = \tilde{E}\, e^{z(W(t) - \beta t)} = E\, e^{z(W(t) - \beta t)}\, e^{\beta W(T) - \frac{\beta^2}{2}T} \quad (8)$$
$$= e^{-z\beta t - \frac{\beta^2}{2}T}\, E e^{(z+\beta)W(t)}\, E e^{\beta(W(T) - W(t))} = e^{-z\beta t - \frac{\beta^2}{2}T}\, e^{\frac{(z+\beta)^2}{2}t}\, e^{\frac{\beta^2}{2}(T-t)} = e^{\frac{z^2}{2}t} = E e^{zW(t)}, \quad z \in \mathbb{R}^1.$$

(b) A proof for multivariate distributions is similar.

Example. The Black-Scholes formula. Let us illustrate the Girsanov theorem by using it to prove the celebrated Black-Scholes formula for the fair price of a European option.

(a) According to the Girsanov theorem, the $\beta$-transform transforms the process $\mu t + \sigma W(t)$ into the process $\mu t + \sigma(\tilde{W}(t) + \beta t) = (\mu + \sigma\beta)t + \sigma\tilde{W}(t)$, where $\tilde{W}(t)$ is a standard Wiener process under the measure $\tilde{P}(A) = E\, I(A)\, Y_\beta(T)$.

(b) If we choose $\beta = \frac{(r - \frac{\sigma^2}{2}) - \mu}{\sigma}$, then the process $\mu t + \sigma W(t)$ will be transformed into the process $(r - \frac{\sigma^2}{2})t + \sigma\tilde{W}(t)$ under the measure $\tilde{P}(A)$.

(c) In this case, the process $S(t) = S e^{\mu t + \sigma W(t)}$ will be transformed into the process $\tilde{S}(t) = S e^{(r - \frac{\sigma^2}{2})t + \sigma\tilde{W}(t)}$ under the measure $\tilde{P}(A)$.

(d) The fair price of a European option,
$$C = e^{-rT}\, \tilde{E}\big[S e^{(r - \frac{\sigma^2}{2})T + \sigma\tilde{W}(T)} - K\big]_+ = S\, \tilde{E}\, e^{-\frac{\sigma^2}{2}T + \sigma\tilde{W}(T)}\, I\big(S e^{(r - \frac{\sigma^2}{2})T + \sigma\tilde{W}(T)} \geq K\big) - e^{-rT} K\, \tilde{P}\big\{S e^{(r - \frac{\sigma^2}{2})T + \sigma\tilde{W}(T)} \geq K\big\}. \quad (9)$$

(e) The probability in the second term of the above formula,
$$\tilde{P}\big\{S e^{(r - \frac{\sigma^2}{2})T + \sigma\tilde{W}(T)} \geq K\big\} = \tilde{P}\Big\{\sigma\tilde{W}(T) \geq \ln\tfrac{K}{S} - \big(r - \tfrac{\sigma^2}{2}\big)T\Big\} = P\Big\{N(0,1) \geq \frac{\ln\frac{K}{S} - (r - \frac{\sigma^2}{2})T}{\sigma\sqrt{T}}\Big\} = \Phi(\xi - \sigma\sqrt{T}), \quad (10)$$
where
$$\xi = \frac{\ln\frac{S}{K} + (r + \frac{\sigma^2}{2})T}{\sigma\sqrt{T}}.$$

(f) Let us apply the $\beta$-transform with parameter $\beta = \sigma$ to the standard Wiener process $\tilde{W}(t)$ (under the measure $\tilde{P}(A)$), which, in this case, will be transformed into the process $\tilde{\tilde{W}}(t) + \sigma t$, where $\tilde{\tilde{W}}(t)$ is a standard Wiener process under the new measure $\tilde{\tilde{P}}(A) = \tilde{E}\, I(A)\, \tilde{Y}_\sigma(T)$, where $\tilde{Y}_\beta(T) = e^{\beta\tilde{W}(T) - \frac{\beta^2}{2}T}$.

(g) In this case, the expectation,
$$\tilde{E}\, e^{-\frac{\sigma^2}{2}T + \sigma\tilde{W}(T)}\, I\big(S e^{(r - \frac{\sigma^2}{2})T + \sigma\tilde{W}(T)} \geq K\big) = \tilde{\tilde{P}}\big\{S e^{(r - \frac{\sigma^2}{2})T + \sigma(\tilde{\tilde{W}}(T) + \sigma T)} \geq K\big\} = P\Big\{N(0,1) \geq \frac{\ln\frac{K}{S} - (r + \frac{\sigma^2}{2})T}{\sigma\sqrt{T}}\Big\} = \Phi(\xi). \quad (11)$$

(h) Formulas (9)-(11) yield the desired Black-Scholes formula,
$$C = S\,\Phi(\xi) - e^{-rT} K\, \Phi(\xi - \sigma\sqrt{T}). \quad (12)$$
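A hedged numerical sketch of formula (12) (not from the notes; it assumes NumPy, and the chosen values of $S$, $K$, $r$, $\sigma$, $T$ are arbitrary): the closed-form Black-Scholes price is compared with a direct risk-neutral Monte Carlo evaluation of $C = E e^{-rT}[S(T) - K]_+$.

```python
import numpy as np
from math import log, sqrt, exp, erf

def Phi(u):
    return 0.5 * (1.0 + erf(u / sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    # Formula (12): C = S*Phi(xi) - exp(-r*T)*K*Phi(xi - sigma*sqrt(T)).
    xi = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return S * Phi(xi) - exp(-r * T) * K * Phi(xi - sigma * sqrt(T))

# Risk-neutral Monte Carlo: S(T) = S*exp((r - sigma^2/2)*T + sigma*W~(T)),
# C = E exp(-r*T) * max(S(T) - K, 0).
rng = np.random.default_rng(5)
S, K, r, sigma, T, n = 100.0, 105.0, 0.03, 0.25, 1.0, 1_000_000
W_T = rng.normal(0.0, sqrt(T), size=n)
S_T = S * np.exp((r - 0.5 * sigma**2) * T + sigma * W_T)
C_mc = exp(-r * T) * np.maximum(S_T - K, 0.0).mean()

print("Black-Scholes formula:", black_scholes_call(S, K, r, sigma, T))
print("Monte Carlo estimate :", C_mc)
```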

1.5 Multivariate Wiener processes

Definition 4.5. A stochastic process $W(t) = (W_1(t), \ldots, W_k(t))$, $t \geq 0$ with real-valued components, defined on a probability space $\langle \Omega, \mathcal{F}, P \rangle$, is called a standard k-dimensional Wiener process (Brownian motion) if:

A': $W(t), t \geq 0$ is a homogeneous process with independent increments with the initial values of components $W_i(0) = 0$, $i = 1, \ldots, k$.

B': An increment $W(t+s) - W(t)$ has a multivariate normal distribution with means $E(W_i(t+s) - W_i(t)) = 0$, $i = 1, \ldots, k$ and correlation coefficients $E(W_i(t+s) - W_i(t))(W_j(t+s) - W_j(t)) = s\, I(i = j)$, $i, j = 1, \ldots, k$, for $0 \leq t \leq t+s < \infty$.

C': The process $W(t), t \geq 0$ is continuous, i.e., the trajectory $W(t, \omega), t \geq 0$ is a continuous function for every $\omega \in \Omega$.

Definition 4.6. A stochastic process $Z(t) = (Z_1(t), \ldots, Z_k(t))$, $t \geq 0$ is a k-dimensional Wiener process if it is a linear transformation of a standard k-dimensional Wiener process $W(t), t \geq 0$, i.e., it is given by the following formula,
$$Z(t) = x + \mu t + \Lambda W(t), \quad t \geq 0,$$
where $x = (x_1, \ldots, x_k)$, $\mu = (\mu_1, \ldots, \mu_k)$ are k-dimensional vectors with real-valued components, and $\Lambda = \|\lambda_{ij}\|$ is a $k \times k$ matrix with real-valued elements.

Lemma 4.4. The process $Z(t), t \geq 0$ is a continuous process with independent increments, initial state $Z(0) = x$, and k-dimensional Gaussian distribution of an increment $Z(t+s) - Z(t)$ with the mean vector $\mu s$ and the correlation matrix $\Sigma s = \|\sigma_{ij} s\|$, where $\sigma_{ij} = \sum_{r=1}^{k} \lambda_{ir}\lambda_{jr}$, $i, j = 1, \ldots, k$, for $t, s \geq 0$.

Lemma 4.5. A linear transformation $Z'(t) = \Gamma Z(t) = \Gamma x + \Gamma \mu t + \Gamma\Lambda W(t)$, $t \geq 0$ of a k-dimensional Wiener process $Z(t) = x + \mu t + \Lambda W(t)$, $t \geq 0$, where $\Gamma = \|\gamma_{ij}\|$ is an $l \times k$ matrix with real-valued elements, is an l-dimensional Wiener process.

2. Poisson and related processes

2.1 A Poisson process

Definition 4.7. A real-valued stochastic process $N(t), t \geq 0$, defined on a probability space $\langle \Omega, \mathcal{F}, P \rangle$, is called a Poisson process if:

D: $N(t), t \geq 0$ is a homogeneous process with independent increments with the initial value $N(0) = 0$.

E: An increment $N(t+s) - N(t)$ has a Poisson distribution with mean $\lambda s$, where $\lambda > 0$, i.e.,
$$P\{N(t+s) - N(t) = n\} = e^{-\lambda s} \frac{(\lambda s)^n}{n!}, \quad n = 0, 1, \ldots.$$

F: The process $N(t), t \geq 0$ has trajectories continuous from the right.

(1) The increments $N(t+s) - N(t)$, $t, s \geq 0$ take only non-negative integer values and, thus, $N(t)$ is a non-decreasing process.

(2) The process $N(t), t \geq 0$ has stepwise trajectories.

(3) The process $N(t), t \geq 0$ is stochastically continuous.

(4) Let $T_n = \min(t : t > T_{n-1}, N(t) > N(T_{n-1}))$, $n = 1, 2, \ldots$, where $T_0 = 0$. By the definition, $T_n$ is the moment of the n-th jump of the process $N(t)$. The inter-jump times $X_n = T_n - T_{n-1}$, $n = 1, 2, \ldots$ are mutually independent random variables.

(5) The random variables $X_n$ have an exponential distribution with parameter $\lambda$, i.e., $P\{X_n < x\} = 1 - e^{-\lambda x}$, $x \geq 0$, for $n = 1, 2, \ldots$.

(6) The random variables $T_n$ have the Erlang distribution with the pdf $p_n(t) = \lambda \frac{(\lambda t)^{n-1}}{(n-1)!} e^{-\lambda t}$, $t \geq 0$, for $n = 1, 2, \ldots$.

Figure 3: A trajectory of a Poisson process

(7) The random variables $N(T_n)$ satisfy $N(T_n) = n$ with probability $1$, for $n = 0, 1, \ldots$.

The proofs of properties (4)-(7) can be sketched using the following discretization argument:

(a) $N_\delta(t) = N((n+1)\delta)$ if $n\delta \leq t < (n+1)\delta$, $n = 0, 1, \ldots$, for $\delta > 0$.

(b) $N(t) \leq N_\delta(t) \leq N(t+\delta)$ and, thus, $N_\delta(t) \to N(t)$ a.s. as $\delta \to 0$.

(c) $T_{\delta,n} = \min(t : t > T_{\delta,n-1}, N_\delta(t) > N_\delta(T_{\delta,n-1}))$, $n = 1, 2, \ldots$, where $T_{\delta,0} = 0$.

(d) $T_{\delta,n} \leq T_n \leq T_{\delta,n} + \delta$, $n = 0, 1, \ldots$.

(e) $P\{T_{\delta,1} > k\delta\} = e^{-k\delta\lambda} \to e^{-\lambda t}$ as $\delta \to 0$, $k\delta \to t$, for $t > 0$.

(f) $P\{N_\delta(T_{\delta,1}) = 1\} = \sum_{k=0}^{\infty} e^{-k\delta\lambda}\, \delta\lambda e^{-\delta\lambda} = \frac{\delta\lambda e^{-\delta\lambda}}{1 - e^{-\delta\lambda}} \to 1$ as $\delta \to 0$.

(g) etc.

(8) Let $N(t), t \geq 0$ be a Poisson process with parameter $\lambda = 1$ and let $\lambda(t), t \geq 0$ be a non-negative, continuous function. Define the process $\tilde{N}(t) = N(\Lambda(t))$, $t \geq 0$, where $\Lambda(t) = \int_0^t \lambda(u)\, du$. Then $\tilde{N}(t)$ is a process with independent increments such that the increment $\tilde{N}(t+s) - \tilde{N}(t)$ has a Poisson distribution with parameter $\Lambda(t, t+s) = \Lambda(t+s) - \Lambda(t) = \int_t^{t+s} \lambda(u)\, du$, for $t, s \geq 0$.

2.2 Compound Poisson processes

Figure 4: A trajectory of a compound Poisson process

Definition 4.8. A real-valued stochastic process $X(t), t \geq 0$, defined on a probability space $\langle \Omega, \mathcal{F}, P \rangle$, is called a compound Poisson process if it has the following form:
$$X(t) = \sum_{k=1}^{N(t)} X_k, \quad t \geq 0,$$
where (a) $N(t), t \geq 0$ is a Poisson process with parameter $\lambda > 0$; (b) $X_n, n = 1, 2, \ldots$ is a sequence of i.i.d. real-valued random variables with a distribution function $F(x)$; (c) the process $N(t), t \geq 0$ and the random sequence $X_n, n = 1, 2, \ldots$ are independent.
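A small simulation sketch of a compound Poisson process (not part of the notes; it assumes NumPy, the values of $\lambda$, the horizon and the exponential jump distribution are arbitrary, and $N(t)$ is sampled directly from the Poisson distribution rather than via inter-jump times). The sample moments are checked against the standard formulas $EX(t) = \lambda t\, EX_1$ and $\operatorname{Var} X(t) = \lambda t\, EX_1^2$, which also appear as property (5) below.

```python
import numpy as np

rng = np.random.default_rng(6)

def compound_poisson_at_t(lam, t, jump_sampler):
    """Sample X(t) = sum_{k=1}^{N(t)} X_k for a compound Poisson process.

    N(t) is drawn as a Poisson(lam*t) variable; jump_sampler(n) returns
    n i.i.d. jumps X_1, ..., X_n.
    """
    n_jumps = rng.poisson(lam * t)
    return jump_sampler(n_jumps).sum() if n_jumps > 0 else 0.0

# Example: lambda = 2, exponential jumps with mean 0.5, horizon t = 10.
lam, t, mean_jump = 2.0, 10.0, 0.5
samples = np.array([compound_poisson_at_t(lam, t, lambda n: rng.exponential(mean_jump, size=n))
                    for _ in range(20_000)])

# Sanity checks against E X(t) = lam*t*E[X_1] and Var X(t) = lam*t*E[X_1^2].
EX1, EX1sq = mean_jump, 2 * mean_jump**2     # moments of the exponential jump size
print("mean X(t) ~", samples.mean(), " vs ", lam * t * EX1)
print("var  X(t) ~", samples.var(), " vs ", lam * t * EX1sq)
```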

(1) The process $X(t), t \geq 0$ is a homogeneous process with independent increments.

(2) The process $X(t), t \geq 0$ has continuous from the right stepwise trajectories.

(3) The process $X(t), t \geq 0$ is stochastically continuous.

(4) The characteristic function of the compound Poisson process $X(t)$ has the following form, for $t \geq 0$,
$$\Psi_t(z) = E e^{izX(t)} = \sum_{n=0}^{\infty} e^{-\lambda t} \frac{(\lambda t)^n}{n!} \big(E e^{izX_1}\big)^n = \exp\{\lambda t (E e^{izX_1} - 1)\} = \exp\Big\{\lambda t \int_{-\infty}^{\infty} (e^{izx} - 1)\, dF(x)\Big\}, \quad z \in \mathbb{R}^1.$$

(5) $EX(t) = \lambda t \alpha$, $\operatorname{Var} X(t) = \lambda t \beta$, where $\alpha = EX_1$, $\beta = EX_1^2$.

2.3 Jump-diffusion processes

Definition 4.9. A real-valued stochastic process $Y(t), t \geq 0$, defined on a probability space $\langle \Omega, \mathcal{F}, P \rangle$, is called a jump-diffusion process if it has the following form:
$$Y(t) = y + \mu t + \sigma W(t) + \sum_{k=1}^{N(t)} X_k, \quad t \geq 0,$$
where (a) $y, \mu \in \mathbb{R}^1$, $\sigma > 0$; (b) $W(t)$ is a standard Brownian motion; (c) $N(t), t \geq 0$ is a Poisson process with parameter $\lambda > 0$; (d) $X_n, n = 1, 2, \ldots$ is a sequence of i.i.d. real-valued random variables with a distribution function $F(x)$; (e) the processes $W(t), t \geq 0$, $N(t), t \geq 0$ and the sequence of random variables $X_n, n = 1, 2, \ldots$ are independent.
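A Monte Carlo check of this definition against the characteristic function given in property (4) below (an illustrative sketch, not from the notes; it assumes NumPy, normal $N(m, s^2)$ jumps are chosen so that $E e^{izX_1}$ has a closed form, and all parameter values are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate Y(t) = y + mu*t + sigma*W(t) + sum_{k<=N(t)} X_k with normal N(m, s^2) jumps
# and compare the empirical characteristic function with the closed form of property (4).
y, mu, sigma, lam, m, s, t = 0.5, 0.1, 0.3, 2.0, -0.2, 0.4, 1.5
n_samples = 500_000

N_t = rng.poisson(lam * t, size=n_samples)
jump_sums = rng.normal(m * N_t, s * np.sqrt(N_t))     # sum of N_t i.i.d. N(m, s^2) jumps
Y_t = y + mu * t + sigma * rng.normal(0.0, np.sqrt(t), size=n_samples) + jump_sums

z = 1.3
empirical = np.mean(np.exp(1j * z * Y_t))
jump_cf = np.exp(1j * z * m - 0.5 * z**2 * s**2) - 1.0    # E e^{izX_1} - 1 for normal jumps
closed_form = np.exp(1j * z * (y + mu * t) - 0.5 * z**2 * sigma**2 * t + lam * t * jump_cf)
print("empirical  :", empirical)
print("closed form:", closed_form)
```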

Figure 5: A trajectory of a jump-diffusion process

(1) The process $Y(t), t \geq 0$ is a homogeneous process with independent increments.

(2) The process $Y(t), t \geq 0$ has trajectories continuous from the right.

(3) The process $Y(t), t \geq 0$ is stochastically continuous.

(4) The characteristic function of the jump-diffusion process $Y(t)$ has the following form, for $t \geq 0$,
$$\Psi_t(z) = E e^{izY(t)} = \exp\Big\{iz(y + \mu t) - \frac{z^2\sigma^2 t}{2} + \lambda t \int_{-\infty}^{\infty} (e^{izx} - 1)\, dF(x)\Big\}, \quad z \in \mathbb{R}^1.$$

(5) The process $S(t) = s \exp\{\mu t + \sigma W(t) + \sum_{k=1}^{N(t)} X_k\}$, $t \geq 0$ is referred to as an exponential jump-diffusion process.

2.4 Risk processes

Figure 6: A trajectory of a risk process

Definition 4.10. A real-valued stochastic process $X(t), t \geq 0$, defined on a probability space $\langle \Omega, \mathcal{F}, P \rangle$, is called a risk process if it has the following form:
$$X(t) = x + ct - \sum_{k=1}^{N(t)} X_k, \quad t \geq 0,$$
where (a) $x, c > 0$; (b) $N(t), t \geq 0$ is a Poisson process with parameter $\lambda > 0$; (c) $X_1, X_2, \ldots$ is a sequence of i.i.d. non-negative random variables with a distribution function $F(x)$; (d) the process $N(t), t \geq 0$ and the sequence of random variables $X_n, n = 1, 2, \ldots$ are independent.

In insurance applications, $c$ is interpreted as a premium rate, $N(t)$ as a process counting the number of claims received by an insurance company in the interval $[0, t]$, and the $X_n$ as sequential random insurance claims.

A risk process is a particular case of a jump-diffusion process and, thus, has analogous properties.

(1) The process $X(t), t \geq 0$ is a homogeneous process with independent increments.

(2) The process $X(t), t \geq 0$ has trajectories continuous from the right.

(3) The process $X(t), t \geq 0$ is stochastically continuous.

(4) The characteristic function of the risk process $X(t)$ has the following form, for $t \geq 0$,
$$\Psi_t(z) = E e^{izX(t)} = \exp\Big\{iz(x + ct) + \lambda t \int_0^{\infty} (e^{-izu} - 1)\, dF(u)\Big\}, \quad z \in \mathbb{R}^1.$$

3. LN Problems

1. Let $U(t, x) = E f(x + \mu t + \sigma W(t))$, where $f(x)$ is a bounded continuous function. Prove that $U(t, x)$ satisfies the equation,
$$\frac{\partial}{\partial t} U(t, x) = \mu \frac{\partial}{\partial x} U(t, x) + \frac{\sigma^2}{2} \frac{\partial^2}{\partial x^2} U(t, x),$$
with the initial condition $U(0, x) = f(x)$.

2. Let $V(t, x) = E \int_0^t f(x + \mu s + \sigma W(s))\, ds$, where $f(x)$ is a bounded continuous function. Prove that the following formula takes place,
$$V(t, x) = \int_0^t \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi s}\, \sigma} f(x + \mu s + y) \exp\Big\{-\frac{y^2}{2\sigma^2 s}\Big\}\, dy\, ds.$$

3. Let $N(t) = \max(n : X_1 + \cdots + X_n \leq t)$, $t \geq 0$, where $X_n, n = 1, 2, \ldots$ is a sequence of non-negative i.i.d. random variables. Using the formula $P\{N(t) \geq n\} = P\{X_1 + \cdots + X_n \leq t\}$, prove that $N(t)$ has a Poisson distribution if the random variables $X_n, n = 1, 2, \ldots$ have an exponential distribution with parameter $\lambda$.

4. Prove the proposition (8) formulated in Subsection 2.1.

5. Find conditions under which an exponential jump-diffusion process $S(t), t \geq 0$, introduced in Subsection 2.3, possesses the martingale property, i.e., $E\{S(t+s) \mid S(t)\} = S(t)$, $s, t \geq 0$.

6. Find conditions under which a risk process $X(t), t \geq 0$, introduced in Subsection 2.4, possesses the martingale property, i.e., $E\{X(t+s) \mid X(t)\} = X(t)$, $s, t \geq 0$.