Chapter 2 Application to Markov Processes
2.1 Problem

Let $X_t$ be a standard Markov process with the state space $S$. The time interval $[0, \infty)$ is denoted by $T$. Let $a$ be a fixed state and $\sigma_a$ the hitting time for $a$. We impose the following four assumptions.

(A-1) $P_b(\sigma_a < \infty) = 1$;
(A-2) $E_b(\sigma_a \wedge 1) \to 0$ as $b \to a$;
(A-3) $\inf_{b \in U^c} E_b(\sigma_a \wedge 1) > 0$ for every neighborhood $U$ of $a$;
(A-4) $a$ is a discontinuous exit state.

We will explain the meaning of this last condition. $s$ is called an exit time from $a$ for the path $(X_t(\omega))$ if, for every $\varepsilon > 0$,

  $\{t : X_t(\omega) = a\} \cap (s - \varepsilon, s] \neq \emptyset$,

and if, for some $\varepsilon > 0$,

  $\{t : X_t(\omega) = a\} \cap (s, s + \varepsilon) = \emptyset$.

All exit times from $a$ for the path $(X_t(\omega))$ form a countable set depending on $\omega$. An exit time $s$ from $a$ for the path $(X_t(\omega))$ is called a continuous or discontinuous exit time according as $X_s(\omega) = a$ or $X_s(\omega) \neq a$. The state $a$ is called a discontinuous exit state if all exit times from $a$ for the path $(X_t(\omega))$ are discontinuous a.s. with respect to $P_a$.

© The Author(s) 2015. K. Itô, Poisson Point Processes and Their Application to Markov Processes, SpringerBriefs in Probability and Mathematical Statistics, DOI 10.1007/…_2
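Assumptions (A-1)–(A-3) are statements about hitting times. As a toy illustration only (none of this appears in the text), one can estimate $E_b(\sigma_a \wedge 1)$ by Monte Carlo for a random-walk approximation of Brownian motion on $[0, \infty)$ with $a = 0$; the function names, grid size and sample counts below are our own choices.

```python
import random

def hitting_time(b, step=0.02, rng=random):
    """Time for a simple random walk on the grid step*Z, started near b,
    to first hit 0 -- a stand-in for the hitting time sigma_a (a = 0).
    Diffusive scaling: one grid step costs step**2 units of time."""
    x, t = max(1, round(b / step)), 0.0
    while x > 0 and t < 1.0:          # only sigma ^ 1 matters below
        x += 1 if rng.random() < 0.5 else -1
        t += step ** 2
    return t if x == 0 else float("inf")

def mean_capped_hitting_time(b, n=200, seed=1):
    """Monte Carlo estimate of E_b(sigma_0 ^ 1)."""
    rng = random.Random(seed)
    return sum(min(hitting_time(b, rng=rng), 1.0) for _ in range(n)) / n

est_far = mean_capped_hitting_time(0.5)
est_near = mean_capped_hitting_time(0.05)
print(est_near, est_far)   # the estimate shrinks as b -> a, as (A-2) demands
```

For this diffusion (A-1) holds since the walk is recurrent, and the shrinking estimate near $0$ is the empirical face of (A-2).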
Let $X_t^0 = X_{t \wedge \sigma_a}$. Since the hitting time $\sigma_a^0$ of the path $(X_t^0(\omega))$ is the same as $\sigma_a$, the conditions (A-1), (A-2) and (A-3) are equivalent to

(A$^0$-1) $P_b(\sigma_a^0 < \infty) = 1$;
(A$^0$-2) $E_b(\sigma_a^0 \wedge 1) \to 0$ as $b \to a$;
(A$^0$-3) $\inf_{b \in U^c} E_b(\sigma_a^0 \wedge 1) > 0$ for every neighborhood $U$ of $a$.

By the strong Markov property of $(X_t)$, the probability law of the path $(X_t)$ is determined by the probability law of the path $(X_t^0)$ and the probability law of the path $(X_t)$ starting at $a$. Symbolically we have

  p.l. of $(X_t)$ = p.l. of $(X_t^0)$ + p.l. of $(X_t)$ starting at $a$.  (2.1)

Since the path $(X_t)$ starting at $a$ behaves outside of $a$ in the same way as the path $(X_t^0)$, the union relation in (2.1) is no disjoint union. We want to extract some information $I$ from the probability law of the path $(X_t)$ starting at $a$ to obtain a symbolic information relation

  p.l. of $(X_t)$ = p.l. of $(X_t^0)$ + $I$ (disjoint union).  (2.1′)

In the subsequent sections we will prove that $I$ consists of two elements: the jumping-in measure $k(db)$ and the stagnancy rate $m$.

2.2 The Poisson Point Process Attached to a Markov Process at a State $a$

We use the same notation as in Sect. 2.1 and impose the conditions (A-1), (A-2), (A-3) and (A-4). Let $A(t)$ be a local time of $(X_t)$ at $a$. By our assumptions (A-1) and (A-2), $A(t)$ is determined up to a multiplicative constant and we have

  $P_b(A(t) < \infty$ for every $t) = 1$  and  $P_b(A(t) \to \infty$ as $t \to \infty) = 1$.

We refer the reader to Blumenthal and Getoor [1] for the definition and the properties of local times. Let $U$ be the space of all right continuous functions $T \to S$ with left limits. The sample path of $(X_t)$ belongs to $U$ a.s. for every starting point. From now on we will refer to $P_a$ for the probability law of $X_t(\omega)$ unless the contrary is explicitly stated. Let us define a point process $X : T \to U$ by

  $D_X = \{A(s) : s$ moves over all exit times from $a$ for the path$\}$,
  $X_\omega(t) = $ the path $X \circ \theta_{A^{-1}(t-)}$ stopped at its hitting time of $a$, for $t \in D_X$,
where $\theta_t$ is the shift operator and "stopped" refers to the stopping operator $t \mapsto t \wedge \sigma_a$. Note that $D_X$ consists of all values of $A(t)$ corresponding to the flat $t$-intervals of $A(t)$, and that $X_\omega(t)$ is a function $T \to S$ belonging to $U$ for $\omega$ and $t$ fixed; explicitly (Fig. 2.1)

  $X_\omega(t)(s) = X_{s \wedge \sigma_a(\theta_\tau \omega)}(\theta_\tau \omega)$ for $s \in T$, $\tau = A^{-1}(t-)$.

Figure 2.2 is an intuitive picture of $X_t$.

[Fig. 2.1: $X_t$, $A_t$ and $X_t$. Fig. 2.2: An intuitive picture of $X_t$. Figures omitted.]
Theorem and Definition. The point process $X$ defined above is a Poisson point process $T \to U$; it is called the Poisson point process attached to the Markov process $(X_t)$.

Proof. Let $\{B_t\}$ be a family of sub-$\sigma$-algebras of $B$ in the definition of the Markov process $(X_t)$. Since $A^{-1}(t+)$ and $A^{-1}(t-) = \sup_n A^{-1}((t - \tfrac{1}{n})+)$ are both stopping times, $B_{A^{-1}(t-)}$ and $B_{A^{-1}(t+)}$ are well-defined. Let $B_t(X)$ be the $\sigma$-algebra generated by $X \restriction [0, t]$. Then $B_t(X) \subset B_{A^{-1}(t-)} \subset B_{A^{-1}(t+)}$. For $t$ fixed, we have $P(X(A^{-1}(t+)) = a) = 1$. Thus, for $B \in B_t(X)$ and $M \in \mathcal{P}$, we have

  $P_a(B \cap \{\theta_t X \in M\}) = P_a(B)\, P_a(X \in M)$

because of the additivity of $A(t)$, the strong Markov property of $(X_t)$ and the definition of $X$. Thus $X$ has the renewal property. This implies that $X$ is differential and stationary. To prove the $\sigma$-discreteness of $X$ we will introduce a map $h : U \to T$ by

  $h(u) = \inf\{t : u(t) = a\}$.

If $u$ is the path of $(X_t^0(\omega))$, then $h(u)$ will be $\sigma_a^0(\omega)$. Let $U_n$ be the set of all $u$ such that $h(u) > \tfrac{1}{n}$. By (A-4) we have $X = \bigcup_n X \restriction U_n$ a.s. Since $N(X, [0, t] \times U_n) \leq n A^{-1}(t) < \infty$ a.s., $X_\omega \restriction U_n$ is discrete a.s. for each $n$. $X$ is therefore $\sigma$-discrete.
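The construction above cuts the path into excursions away from $a$ and indexes them by local time. As a toy illustration (not from the book), the same cutting can be done for a discrete-time trajectory, where the role of an excursion is played by a maximal run of states different from $a$:

```python
def excursions(path, a):
    """Split a discrete-time trajectory into its excursions away from `a`:
    the maximal index intervals on which the path differs from `a`.  This is
    the discrete analogue of the point process X built from the exit times."""
    out, current = [], None
    for x in path:
        if x == a:
            if current is not None:
                out.append(current)   # excursion ends on return to a
                current = None
        else:
            current = [x] if current is None else current + [x]
    if current is not None:
        out.append(current)           # final, possibly unfinished excursion
    return out

path = [0, 1, 2, 1, 0, 0, 3, 0, 1, 0]
print(excursions(path, 0))   # [[1, 2, 1], [3], [1]]
```

In the discrete picture the local time at $a$ is simply the number of visits to $a$, and each excursion is attached to the visit count at which it begins.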
2.3 The Jumping-In Measure and the Stagnancy Rate

Let us consider a map $e : U \to S$ defined by $e(u) = u(0)$, $u \in U$. Since the path of $X_t$ has no discontinuities of the second kind and since $A^{-1}(t) < \infty$ for $t < \infty$, the distance $\rho(X_t, a)$ between $X_t$ and $a$ can be larger than $\varepsilon\,(>0)$ only a finite number of times during $[0, A^{-1}(t)]$ a.s. for $t < \infty$. This implies that $e \circ X$ is a $\sigma$-discrete point process. By Theorem 1.6.1, we see that $e \circ X$ is a Poisson point process.

Definition. The characteristic measure $k$ of $e \circ X$ is called the jumping-in measure of the Markov process $X_t$ from $a$.

It is obvious that $k = n_X e^{-1}$. Since $n_X$ is concentrated on the paths starting at points in $S \setminus \{a\}$ by (A-4), $k$ is concentrated on $S \setminus \{a\}$. It is obvious that the total measure of $k$ is the same as that of $n_X$. Since $X$ is $\sigma$-discrete, the total measure of $k$ is $\sigma$-finite. Since $A^{-1}(t)$ is known to be an increasing homogeneous Lévy process (= a subordinator), it can be written as

  $A^{-1}(t) = mt + J(t)$, $m \geq 0$,

where $J(t)$ is a pure jump process.

Definition. The coefficient $m$ is called the stagnancy rate of the Markov process $(X_t)$.

The following theorem shows that the characteristic measure $n_X$ is determined by the measure $k$ and the probability law of the path of $(X_t^0)$.

Theorem.

  $n_X(V) = \int_S k(db)\, P_b(X^0 \in V)$,

where $X^0$ denotes the path $(X_t^0(\omega))$, $t \in T$.

Proof. Let $S_i$ denote the set $\{b : \rho(a, b) > 1/i\}$ for $i = 1, 2, \ldots$. Then $\bigcup_i S_i = S \setminus \{a\}$. Let $U_i = \{u \in U : u(0) \in S_i\} = e^{-1}(S_i)$. Then $U_i$ increases with $i$ and the limit $U_\infty$ is the space of all paths in $U$ starting from points in $S \setminus \{a\}$. We have

  $X = X \restriction U_\infty = \bigcup_i X_i$, $X_i = X \restriction U_i$,

by (A-4). The set $A^{-1}(D_{X_i}) \cap [0, A^{-1}(t+)]$ is included in the set of the time points $s \in [0, A^{-1}(t+)]$ for which $\rho(X_{s-}, X_s) > 1/i$. Since the sample path of $(X_t)$ has no
discontinuity points of the second kind, the latter set is finite and so is $A^{-1}(D_{X_i}) \cap [0, A^{-1}(t+)]$. This implies $D_{X_i} \cap [0, t]$ is finite. $X_i$ is therefore a discrete Poisson point process. By Theorem 1.4.1, we have

  $n_{X_i}(V_i) = \lambda_i P_a(X_i(\tau_i) \in V_i)$, $V_i \subset U_i$, $U_i \subset U$,

where $\lambda_i = n_{X_i}(U_i)$ and $\tau_i$ is the smallest element in $D_{X_i}$. By the definition we have

  $X_i(\tau_i) = X(\tau_i) = $ the path $X \circ \theta_{\sigma_i}$ stopped at $a$, $\sigma_i = A^{-1}(\tau_i-)$.

Since $n_{X_i} = n_X \restriction U_i$ and since $\sigma_i$ is a stopping time with respect to $\{B_t\}$, we have, for $V \subset U$,

  $n_X(U_i \cap V) = \lambda_i P_a(X(\tau_i) \in V \cap U_i) = \lambda_i \int_{S_i} P_a(X_{\sigma_i} \in db)\, P_b(X^0 \in V \cap U_i)$.

Set $V = e^{-1}(B_i)$, $B_i \subset S_i$. Then $V \subset e^{-1}(S_i) = U_i$ and so

  $k(B_i) = \lambda_i P_a(X_{\sigma_i} \in B_i)$.

Thus we have

  $n_X(U_i \cap V) = \int_{S_i} k(db)\, P_b(X^0 \in V)$, $V \subset U_i$.

Letting $i \to \infty$, we have

  $n_X(V) = \int_{S \setminus \{a\}} k(db)\, P_b(X^0 \in V) = \int_S k(db)\, P_b(X^0 \in V)$,

which completes the proof.

The jumping-in measure $k$ is not arbitrary. We have:

Theorem. $k$ is concentrated on $S \setminus \{a\}$ and

  $\int_S E_b(\sigma_a^0 \wedge 1)\, k(db) < \infty$.

Proof. $h \circ X$ is also a Poisson point process whose integrated process is the discontinuous part of the increasing homogeneous Lévy process $A^{-1}(t)$. Therefore $h \circ X$ is summable and so

  $\int_0^\infty (1 \wedge t)\, n_{h \circ X}(dt) < \infty$
by virtue of the summability theorem for Poisson point processes in Chap. 1. Since $n_{h \circ X} = n_X h^{-1}$, this can be written

  $\int_0^\infty (1 \wedge t) \int_S k(db)\, P_b(\sigma_a^0 \in dt) < \infty$

by the previous theorem, namely

  $\int_S k(db)\, E_b(\sigma_a^0 \wedge 1) < \infty$.

Remark. By this theorem and the condition (A-3), we have

  $k(U^c) < \infty$

for every neighborhood $U$ of $a$.

If $k(S) < \infty$, then $X$ is discrete. Then the set $\{t : X_t(\omega) = a\}$ is a sequence of disjoint intervals ordered linearly and $A(t, \omega)$ is the sojourn time at the singleton $\{a\}$ up to a multiplicative constant. Thus we have $m > 0$ in the decomposition

  $A^{-1}(t) = mt + J(t)$,

$J(t)$ being the discontinuous part of $A^{-1}(t)$. Therefore we obtain:

Theorem. $m \geq 0$ in general, and $m > 0$ in case $k(S) < \infty$.

$A(t)$ is determined up to a multiplicative constant, and $m$ and $k$ depend on which version of $A(t)$ we take. Let $A_i(t)$, $i = 1, 2$, be two versions of $A(t)$ and write the corresponding $m$ and $k$ as $m_i$ and $k_i$, $i = 1, 2$. Then we have a constant $c > 0$ such that $A_2(t) = c A_1(t)$. Consider the decompositions

  $A_i^{-1}(s) = m_i s + J_i(s)$, $i = 1, 2$.

Then

  $A_2^{-1}(cs) = A_1^{-1}(s)$, $m_2 cs + J_2(cs) = m_1 s + J_1(s)$,

and so $m_2 = \frac{1}{c} m_1$.
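The identity $A_2^{-1}(cs) = A_1^{-1}(s)$ and the resulting scaling $m_2 = m_1/c$ (and, by the same token, the $1/c$ rescaling of the jump structure, hence of $k$) can be checked numerically for a toy inverse local time with finitely many excursions. A minimal sketch under our own naming, not the book's:

```python
def inverse_local_time(m, jumps):
    """Return A^{-1}(s) = m*s + sum of excursion lengths attached to local
    times <= s, for a finite list `jumps` of (local_time, length) pairs."""
    def Ainv(s):
        return m * s + sum(length for (lt, length) in jumps if lt <= s)
    return Ainv

m1, jumps1 = 2.0, [(0.5, 1.0), (1.2, 3.0)]
c = 4.0
# If A_2 = c * A_1, local time is rescaled: m_2 = m_1/c and each jump of the
# inverse moves from local time lt to c*lt (the excursion lengths are fixed).
m2, jumps2 = m1 / c, [(c * lt, length) for (lt, length) in jumps1]

A1inv = inverse_local_time(m1, jumps1)
A2inv = inverse_local_time(m2, jumps2)
print(A1inv(1.0), A2inv(c * 1.0))   # equal: 3.0 3.0
```

The matching values illustrate that the two parametrizations trace out the same real-time path, which is exactly why $m$ and $k$ are only determined up to a common multiplicative constant.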
Writing $\#A$ for the number of points in $A$, we have

  $\varepsilon k_2(B) = E_a[\#\{s : s \leq \varepsilon,\ e(X_s) \in B\}]$
  $= E_a[\#\{s : s \leq \varepsilon,\ X(A_2^{-1}(s)) \in B\}]$
  $= E_a[\#\{t : A_2(t) \leq \varepsilon,\ X(t) \in B\}]$
  $= E_a[\#\{t : c A_1(t) \leq \varepsilon,\ X(t) \in B\}]$
  $= E_a[\#\{t : A_1(t) \leq \tfrac{1}{c}\varepsilon,\ X(t) \in B\}]$
  $= \tfrac{1}{c}\varepsilon k_1(B)$,

and so $k_2 = \frac{1}{c} k_1$. Thus we have:

Theorem. If $A_2(t) = c A_1(t)$, then $m_2 = \frac{1}{c} m_1$ and $k_2 = \frac{1}{c} k_1$.

Therefore $m$ and $k$ are determined up to a common multiplicative constant. To have $m$ and $k$ determined uniquely, we have to take a standard version of the local time $A(t)$.

Definition. $A(t)$ is called standard if

  $E_a\left(\int_0^\infty e^{-t}\, dA(t)\right) = 1$,

in which case

  $E_b\left(\int_0^\infty e^{-t}\, dA(t)\right) = E_b(e^{-\sigma_a})$

for every $b$. The $m$ and $k$ that correspond to the standard $A(t)$ are called the standard stagnancy rate and the standard jumping-in measure.

Theorem 2.3.8. The standard stagnancy rate $m$ and the standard jumping-in measure $k$ satisfy the following conditions.

(a) $m \geq 0$ in general and $m > 0$ in case $k(S) < \infty$.
(b) $k$ is concentrated on $S \setminus \{a\}$ and
  (i) $\int k(db)\, E_b(\sigma_a^0 \wedge 1) < \infty$;
  (ii) $m + \int k(db)\, E_b(1 - e^{-\sigma_a^0}) = 1$.

Proof. By the preceding theorems it is enough to prove (b)-(ii). Since $m$ and $k$ are standard, the corresponding $A(t)$ satisfies
  $E_a\left(\int_0^\infty e^{-t}\, dA(t)\right) = 1$.

But the left side is

  $E_a\left(\int_0^\infty e^{-A^{-1}(t)}\, dt\right) = \int_0^\infty E_a(e^{-A^{-1}(t)})\, dt = \int_0^\infty \exp\left[-t\left(m + \int_0^\infty (1 - e^{-s}) \int k(db)\, P_b(\sigma_a^0 \in ds)\right)\right] dt$

(see the proof of the preceding theorem and use the Lévy–Khinchin formula)

  $= \left(m + \int_0^\infty (1 - e^{-s}) \int k(db)\, P_b(\sigma_a^0 \in ds)\right)^{-1} = \left(m + \int k(db)\, E_b(1 - e^{-\sigma_a^0})\right)^{-1}$.

Since this equals 1, (ii) follows.

2.4 The Existence and Uniqueness Theorem

Suppose that $X_t$ is a standard Markov process with the state space $S$ and that $a$ is a fixed state. We assume (A-1), (A-2), (A-3) and (A-4) in Sect. 2.1. Let $X_t^0 = X_{t \wedge \sigma_a}$, $m$ the standard stagnancy rate and $k$ the standard jumping-in measure for $X_t$. Then we have proved:

(i) $X_t^0$ is a standard Markov process which satisfies (A$^0$-1), (A$^0$-2), (A$^0$-3).
(ii) $m$ and $k$ satisfy (a) and (b) in Theorem 2.3.8.

Now we want to construct $X_t$ for $X_t^0$, $m$ and $k$ given.

Theorem 2.4.1. Suppose that $X_t^0$, $m$ and $k$ satisfy (i) and (ii). Then there exists a standard Markov process $X_t$ satisfying (A-1), (A-2) and (A-3) such that $X_{t \wedge \sigma_a}$ is equivalent to $X_t^0$ and that the standard stagnancy rate and the standard jumping-in measure are respectively equal to $m$ and $k$. Such $X_t$ is unique up to equivalence.

Proof of existence. First we will construct the Poisson point process $X$ attached to the Markov process $X_t$ that is to be constructed. Let $U$ be the space of all right continuous functions $T \to S$ with left limits. Define a $\sigma$-finite measure $n$ on $U$ by

  $n(V) = \int k(db)\, P_b(X^0 \in V)$

and construct a Poisson point process $X : T \to U$ with $n_X = n$ by Theorem 1.3.5.
Set

  $\tilde{A}(s) = ms + \sum_{\alpha \leq s,\, \alpha \in D_X} h(X(\alpha))$,

where $h(u) = \inf\{\alpha \in T : u(\alpha) = a\}$. Define $Y(t)$ as follows:

  $Y(t) = X(s)(t - \tilde{A}(s-))$ if $\tilde{A}(s-) \leq t < \tilde{A}(s)$;
  $Y(t) = a$ if $\tilde{A}(s-) = t = \tilde{A}(s)$.

Now define the probability law $P_a$ of the path of $X_t$ starting at $a$ by

  $P_a(X \in V) = P(Y(\cdot) \in V)$

and the probability law $P_b$ of the path of $X_t$ starting at a general state $b$ by

  $P_b(X_{\cdot \wedge \sigma_a} \in V_1,\ X \circ \theta_{\sigma_a} \in V_2) = P_b(X^0 \in V_1)\, P_a(X \in V_2)$.

It is needless to say that the definition of $P_a$ is suggested by Fig. 2.1 and that the definition of $P_b$ is suggested by the strong Markov property. First we will prove that

  $P(\tilde{A}(s) < \infty$ for every $s$ and $\tilde{A}(\infty) = \infty) = 1$,  (2.2)

so that $Y(t)$ is well-defined for every $t$. If $k(S) = 0$, then $m > 0$ and $\tilde{A}(s) = ms < \infty$ and $\tilde{A}(\infty) = \infty$. If $k(S) > 0$, then $h \circ X$ is a Poisson point process with

  $n_{h \circ X} = n h^{-1} = \int k(db)\, P_b(\sigma_a^0 \in \cdot)$.

Since

  $\int (t \wedge 1)\, n_{h \circ X}(dt) = \int k(db)\, E_b(\sigma_a^0 \wedge 1) < \infty$,

$h \circ X$ is summable and so

  $J(s) = \sum_{\alpha \leq s,\, \alpha \in D_X} h(X(\alpha)) < \infty$ for every $s < \infty$,

and $J(s)$ is a homogeneous Lévy process with increasing paths. Since $n_{h \circ X}([0, \infty)) = k(S) > 0$, we have
  $P(J(\infty) = \infty) = 1$.

This proves (2.2). Now we will prove that the process $X_t$ defined above is a standard Markov process with (A-1), (A-2), (A-3) and (A-4).

Case 1. $k(S) < \infty$. In this case we have $m > 0$. Since

  $n_X(U) = \int k(db)\, P_b(X^0 \in U) = k(S) < \infty$,

$X$ is discrete. Set

  $D_X = \{\tau_1 < \tau_1 + \tau_2 < \tau_1 + \tau_2 + \tau_3 < \cdots\}$

and set

  $\xi_i = X(\tau_1 + \tau_2 + \cdots + \tau_i)$, $i = 1, 2, \ldots$.

Then $\tau_1, \tau_2, \ldots, \xi_1, \xi_2, \ldots$ are independent and

  $P(\tau_i > t) = e^{-t k(S)}$, $P(\xi_i \in V) = \frac{1}{k(S)} \int k(db)\, P_b(X^0 \in V)$, $V \subset U$.

In other words the probability law of $\xi_i$ is the probability law of the path of $X^0$ with the initial distribution $k(db)/k(S)$. By the definition of $Y(t)$ we have

  $Y(t) = a$ for $m\tau_1 + h(\xi_1) + \cdots + m\tau_{i-1} + h(\xi_{i-1}) \leq t < m\tau_1 + h(\xi_1) + \cdots + m\tau_{i-1} + h(\xi_{i-1}) + m\tau_i$

and

  $Y(t) = \xi_i(t - m\tau_1 - h(\xi_1) - \cdots - m\tau_{i-1} - h(\xi_{i-1}) - m\tau_i)$

for

  $m\tau_1 + h(\xi_1) + \cdots + h(\xi_{i-1}) + m\tau_i \leq t < m\tau_1 + h(\xi_1) + \cdots + m\tau_i + h(\xi_i)$.

Since $P(m\tau_i > t) = P(\tau_i > t/m) = e^{-t k(S)/m}$, $X(t)$ can be described as follows. If it starts at $a$, it stays at $a$ for an exponential holding time with the parameter $k(S)/m$, then jumps into $db$ with probability
$k(db)/k(S)$ and moves in the same way as $X_t^0$ does until it hits $a$; it will repeat the same motion afterwards independently of its past history. If it starts at $b \neq a$, it performs the same motion as $X_t^0$ until it hits $a$ and then it will act as above. We can verify the strong Markov property of this motion by routine. It is easy to check the other properties of $X_t$ stated above (see Fig. 2.3).

[Fig. 2.3: Markov process from a point process. Figure omitted.]

Case 2. $k(S) = \infty$. Everything can be verified by routine except the fact that the sample path of $Y(t)$ belongs to $U$ a.s. Since it is obvious that $Y(t)$ is right continuous and has left limits as far as it is in $S \setminus \{a\}$, the only fact that needs proof is that the set of $s$ such that $\sigma_\varepsilon(X(s)) < \infty$,

  $\sigma_\varepsilon(u) = \inf\{t : \rho(a, u(t)) \geq \varepsilon\}$,

forms a discrete set a.s. for every $\varepsilon > 0$. Since $X(s)(t) = a$ for $t \geq h(X(s))$ a.s., $\sigma_\varepsilon(X(s)) < \infty$ is equivalent to

  $\sigma_\varepsilon(X(s)) < h(X(s))$

a.s. It is therefore enough to prove that

  $X \restriction V_\varepsilon$, $V_\varepsilon = \{u : \sigma_\varepsilon(u) < h(u)\}$,

is discrete a.s., namely that $n_X(V_\varepsilon) < \infty$. Set

  $\delta = \inf\{E_b(\sigma_a^0 \wedge 1) : \rho(b, a) \geq \varepsilon\}$.

Then $\delta > 0$ by (A$^0$-3).
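The Case 1 recipe (hold at $a$ for an exponential time with parameter $k(S)/m$, jump in with law $k(db)/k(S)$, run an excursion of $X^0$ until it hits $a$, repeat) can be sketched in code. This toy is not from the book: for the excursion law we borrow the deterministic motion of Example 2 in Sect. 2.6, where a path started at $b$ returns to $0$ at unit speed, so $\sigma_0 = b$; all names and parameters are our own.

```python
import random

def sample_path_from_a(m, k_points, t_max, seed=0):
    """Sketch of the Case-1 construction (k(S) < infinity, m > 0), a = 0.
    k_points: dict b -> k({b}) for a finitely supported jumping-in measure.
    Returns the list of (jump_time, jump_target) epochs up to about t_max."""
    rng = random.Random(seed)
    kS = sum(k_points.values())
    states, weights = list(k_points), list(k_points.values())
    t, events = 0.0, []
    while t < t_max:
        t += rng.expovariate(kS / m)          # exponential holding at a
        b = rng.choices(states, weights)[0]   # jump in with law k(db)/k(S)
        events.append((t, b))
        t += b                                # toy excursion: sigma_0 = b
    return events

events = sample_path_from_a(m=1.0, k_points={1.0: 2.0, 2.0: 1.0}, t_max=50)
print(events[:3])
```

With a general $X^0$ one would replace the line `t += b` by a simulated excursion; the alternating holding/excursion structure is exactly the description of $Y(t)$ above.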
Observe that

  $\int_U h(u) \wedge 1\, n_X(du) \geq \int_{V_\varepsilon} h(u) \wedge 1\, n_X(du) \geq \int_{V_\varepsilon} (h(u) - \sigma_\varepsilon(u)) \wedge 1\, n_X(du)$
  $= \int (h(u) - \sigma_\varepsilon(u)) \wedge 1\, \int k(db)\, P_b(X^0 \in du)$
  $= \int k(db)\, E_b[(\sigma_a^0 - \sigma_\varepsilon(X^0)) \wedge 1,\ \sigma_a^0 > \sigma_\varepsilon(X^0)]$
  $= \int k(db)\, E_b[E_{X(\sigma_\varepsilon(X^0))}(\sigma_a^0 \wedge 1),\ \sigma_a^0 > \sigma_\varepsilon(X^0)]$
  $\geq \delta \int k(db)\, P_b(\sigma_a^0 > \sigma_\varepsilon(X^0)) = \delta \int k(db)\, P_b(X^0 \in V_\varepsilon) = \delta\, n_X(V_\varepsilon)$,

and that

  $\int_U h(u) \wedge 1\, n_X(du) = \int k(db)\, E_b(h(X^0) \wedge 1) = \int k(db)\, E_b(\sigma_a^0 \wedge 1) < \infty$.

Thus we have $n_X(V_\varepsilon) < \infty$.

The proof of uniqueness is easy, because the probability law of the path of $(X_t^0)$ and $k$ determine $n_X$ and so the probability law of $X$, which, combined with $m$, determines the probability law of the path of $X_t$.

2.5 The Resolvent Operator and the Generator of the Markov Process Constructed in Sect. 2.4

The generator of a Markov process is defined in many ways which are not always equivalent to each other. We will adopt the following definition due to E.B. Dynkin. Let $X_t$ be a Markov process with right continuous paths. The transition probability $p(t, b, E)$ is defined by

  $p(t, b, E) = P_b(X_t \in E)$,
and the transition operator $p_t$ is defined by

  $p_t f(b) = \int p(t, b, dc)\, f(c) = E_b(f(X_t))$.

$p_t$ carries the space $B(S)$ of all bounded real Borel measurable functions into itself. It has the semigroup property:

  $p_{t+s} = p_t p_s$, $p_0 = I$ (= identity operator).

The resolvent operator (potential operator of order $\alpha$) $R_\alpha$ ($\alpha > 0$) is defined by

  $R_\alpha = \int_0^\infty e^{-\alpha t} p_t\, dt$,

i.e.,

  $R_\alpha f(b) = \int_0^\infty e^{-\alpha t} p_t f(b)\, dt = E_b\left(\int_0^\infty e^{-\alpha t} f(X_t)\, dt\right)$.

It satisfies the resolvent equation:

  $R_\alpha - R_\beta + (\alpha - \beta) R_\alpha R_\beta = 0$.

The Dynkin subspace $L$ of $B(S)$ is defined by

  $L = \{f \in B(S) : \lim_{t \downarrow 0} p_t f(b) = f(b)$ for every $b\}$.

$L$ is a linear subspace of $B(S)$. Because of the right continuity of the path of $X_t$ we have $C(S) \subset L \subset B(S)$, $C(S)$ being the space of all bounded continuous real functions on $S$. It is easy to see that

  $p_t L \subset L$, $R_\alpha L \subset L$.

In view of this fact we will regard $p_t$ and $R_\alpha$ as operators $L \to L$, unless the contrary is stated explicitly. By virtue of the resolvent equation $R = R_\alpha L$ is independent of $\alpha$. $R_\alpha : L \to R$ is one-to-one and so $R_\alpha^{-1}$ is well-defined.
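For a finite-state chain with generator matrix $Q$ the resolvent is the matrix $R_\alpha = (\alpha I - Q)^{-1}$ (a standard fact, not stated in this text), and the resolvent equation can be verified numerically. A minimal sketch with a 2-state chain, using only hand-rolled 2x2 linear algebra; all names are ours:

```python
def mat_inv2(M):
    """Inverse of a 2x2 matrix [[p, q], [r, s]]."""
    (p, q), (r, s) = M
    det = p * s - q * r
    return [[s / det, -q / det], [-r / det, p / det]]

def mat_mul2(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def resolvent2(Q, alpha):
    """R_alpha = (alpha*I - Q)^{-1}, the resolvent of a 2-state chain."""
    return mat_inv2([[alpha * (i == j) - Q[i][j] for j in range(2)]
                     for i in range(2)])

Q = [[-1.0, 1.0], [2.0, -2.0]]   # generator of a 2-state chain
alpha, beta = 0.7, 1.9
Ra, Rb = resolvent2(Q, alpha), resolvent2(Q, beta)
RaRb = mat_mul2(Ra, Rb)
# resolvent equation: R_alpha - R_beta + (alpha - beta) R_alpha R_beta = 0
residual = [[Ra[i][j] - Rb[i][j] + (alpha - beta) * RaRb[i][j]
             for j in range(2)] for i in range(2)]
print(residual)   # all entries vanish up to rounding
```

The vanishing residual is the matrix form of the identity displayed above.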
Definition. The generator $G$ of $(X_t)$ is defined by

  $D(G) = \{f \in L : \tfrac{1}{t}(p_t f - f)$ converges boundedly as $t \downarrow 0$ to a function $\in L\}$

and

  $G f(b) = \lim_{t \downarrow 0} \tfrac{1}{t}(p_t f(b) - f(b))$, $f \in D(G)$.

Theorem. $D(G) = R = R_\alpha L$, $G f = \alpha f - R_\alpha^{-1} f$, $f \in D(G)$.

Let $X_t$ be a standard Markov process and $a$ a fixed state. We assume that (A-1), (A-2) and (A-3) are satisfied. Let $X_t^0 = X_{t \wedge \sigma_a}$. Then $X_t^0$ is also a standard Markov process with (A$^0$-1), (A$^0$-2) and (A$^0$-3). We will denote the transition operator, the resolvent operator and the generator of $X_t$ respectively by $p_t$, $R_\alpha$ and $G$, and the corresponding operators for $X_t^0$ are denoted by $p_t^0$, $R_\alpha^0$ and $G^0$.

Theorem. $D(G) \subset D(G^0)$ and

  $G^0 f(b) = G f(b)$, $b \neq a$; $\quad G^0 f(a) = 0$.

Proof. If $f \in D(G)$, then $f = R_\alpha g$, $g \in L$. By Dynkin's formula we have

  $f(b) = E_b\left(\int_0^\infty e^{-\alpha t} g(X_t)\, dt\right)$
  $= E_b\left(\int_0^{\sigma_a} e^{-\alpha t} g(X_t)\, dt\right) + E_b(e^{-\alpha \sigma_a} f(X_{\sigma_a}))$
  $= E_b\left(\int_0^{\sigma_a} e^{-\alpha t} g(X_t)\, dt\right) + E_b(e^{-\alpha \sigma_a}) f(a)$.

Set

  $g^0(b) = g(b)$ for $b \neq a$; $\quad g^0(a) = \alpha R_\alpha g(a) = \alpha f(a)$.

Then

  $R_\alpha^0 g^0(b) = E_b\left(\int_0^\infty e^{-\alpha t} g^0(X_t^0)\, dt\right) = E_b\left(\int_0^{\sigma_a} e^{-\alpha t} g(X_t)\, dt\right) + E_b(e^{-\alpha \sigma_a}) R_\alpha^0 g^0(a)$.
Since

  $R_\alpha^0 g^0(a) = \int_0^\infty e^{-\alpha t} \alpha f(a)\, dt = f(a)$,

we have

  $f(b) = R_\alpha^0 g^0(b)$.

To complete the proof of $D(G) \subset D(G^0)$, we need only prove that $g^0$ belongs to the Dynkin space $L^0$ of $X_t^0$. Since $X_t^0 = a$ for $t \geq \sigma_a$, we have

  $p_t^0 g^0(a) = g^0(a) \to g^0(a)$ as $t \downarrow 0$.

Suppose $b \neq a$. Then $P_b(\sigma_a > 0) = 1$ and so

  $\lim_{t \downarrow 0} P_b(\sigma_a \leq t) = 0$.

Therefore

  $p_t^0 g^0(b) - g^0(b) = E_b(g^0(X_t^0)) - g^0(b) = E_b(g(X_t),\ t < \sigma_a) + E_b(g^0(a),\ t \geq \sigma_a) - g(b)$
  $= E_b(g(X_t)) - E_b(g(X_t),\ t \geq \sigma_a) + E_b(g^0(a),\ t \geq \sigma_a) - g(b)$
  $\leq |E_b(g(X_t)) - g(b)| + (\|g\| + |g^0(a)|)\, P_b(t \geq \sigma_a) \to 0$,

where $\|g\| = \sup_c |g(c)|$. Since $f = R_\alpha g = R_\alpha^0 g^0$, we have

  $G f = \alpha f - g$, $G^0 f = \alpha f - g^0$

and so

  $G^0 f(b) = G f(b)$ for $b \neq a$ and $G^0 f(a) = \alpha f(a) - g^0(a) = 0$.

Let $X_t$ be the Markov process constructed from $X_t^0$, $m$ and $k$ in Sect. 2.4. The resolvent and the generator for $X_t$ are denoted respectively by $R_\alpha$ and $G$, and the corresponding operators for $X_t^0$ are denoted respectively by $R_\alpha^0$ and $G^0$.
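The proof just given can be mirrored concretely in the finite-state analogue (our toy, not the book's setting): take the generator to be a matrix $Q$, the stopped chain's generator $Q^0$ to be $Q$ with the row of $a$ set to zero, and $R_\alpha = (\alpha I - Q)^{-1}$. Then for $f = R_\alpha g$ and $g^0$ defined as above one checks $f = R_\alpha^0 g^0$, $G^0 f(a) = 0$ and $G^0 f(b) = G f(b)$ for $b \neq a$:

```python
def mat_inv2(M):
    """Inverse of a 2x2 matrix [[p, q], [r, s]]."""
    (p, q), (r, s) = M
    det = p * s - q * r
    return [[s / det, -q / det], [-r / det, p / det]]

def apply2(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

def resolvent2(Q, alpha):
    """R_alpha = (alpha*I - Q)^{-1} for a 2-state generator Q."""
    return mat_inv2([[alpha * (i == j) - Q[i][j] for j in range(2)]
                     for i in range(2)])

alpha = 1.3
Q = [[-2.0, 2.0], [1.0, -1.0]]   # generator G; take a = state 0
Q0 = [[0.0, 0.0], Q[1]]          # stopped chain: row of a vanishes
g = [0.4, -1.1]
f = apply2(resolvent2(Q, alpha), g)     # f = R_alpha g, so f is in D(G)
g0 = [alpha * f[0], g[1]]               # g0 = g off a, g0(a) = alpha*f(a)
f0 = apply2(resolvent2(Q0, alpha), g0)  # claim of the proof: f = R0_alpha g0
Gf, G0f = apply2(Q, f), apply2(Q0, f)
print(f, f0)     # the two vectors coincide
print(G0f[0])    # G0 f(a) = 0
```

Here the matrix identity $(\alpha I - Q^0) f = g^0$ holds row by row exactly as in the displayed computation of $R_\alpha^0 g^0$.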
We will discuss the relation between $(R_\alpha, G)$ and $(R_\alpha^0, G^0)$. Let us make three cases.

Case 1. $k(S) = 0$. In this trivial case $a$ is a trap for $X_t$ and $(X_t)$ is equivalent to $(X_t^0)$, so that $R_\alpha = R_\alpha^0$ and $G = G^0$.

Case 2. $0 < k(S) < \infty$. ($m > 0$ in this case.) $a$ is an exponential holding state with the rate $k(S)/m$.

Theorem. If $0 < k(S) < \infty$, then

  $R_\alpha g(b) = R_\alpha^0 g(b) + E_b(e^{-\alpha \sigma_a}) R_\alpha g(a)$ for $b \neq a$;  (2.3)

  $R_\alpha g(a) = \dfrac{m g(a) + \int k(db)\, R_\alpha^0 g(b)}{\alpha m + \int k(db)\, E_b(1 - e^{-\alpha \sigma_a})}$;  (2.4)

  $G f(b) = G^0 f(b)$ for $b \neq a$;  (2.5)

  $m\, G f(a) = \int k(db)\,(f(b) - f(a))$.  (2.6)

Proof. Equation (2.3) is obvious by Dynkin's formula. To prove (2.4), set $f(a) = R_\alpha g(a)$ and $f^0(a) = R_\alpha^0 g(a)$. The Poisson point process $X$ attached to $(X_t)$ is discrete. Let $\sigma$ be the first point in $D_X$ and $\tau$ be the first exit time from $a$ for $(X_t)$. Let $Y_t$ be the process derived from $X$ in Sect. 2.4. By Dynkin's formula, we have

  $f(a) = E_a\left(\int_0^\infty e^{-\alpha t} g(X_t)\, dt\right) = E_a\left(\int_0^\tau e^{-\alpha t} g(X_t)\, dt\right) + E_a(e^{-\alpha \tau} f(X_\tau))$
  $= E\left(\int_0^{m\sigma} e^{-\alpha t} g(a)\, dt\right) + E[e^{-\alpha m \sigma} f(X_\sigma(0))]$
  $= g(a)\, E\left[\frac{1 - e^{-\alpha m \sigma}}{\alpha}\right] + E[e^{-\alpha m \sigma}]\, E[f(X_\sigma(0))]$.
Observe

  $E(e^{-\alpha m \sigma}) = \int_0^\infty e^{-\alpha m t} e^{-t k(S)} k(S)\, dt = \frac{k(S)}{\alpha m + k(S)}$

and

  $E[f(X_\sigma(0))] = \frac{1}{k(S)} \int k(db)\, f(b)$.

Therefore we have

  $f(a) = \dfrac{m g(a) + \int k(db)\, f(b)}{\alpha m + k(S)}$,  (2.7)

which, combined with (2.3), implies

  $f(a) = \dfrac{m g(a) + \int k(db)\, f^0(b) + \int k(db)\, E_b(e^{-\alpha \sigma_a}) f(a)}{\alpha m + k(S)}$.

Solving this for $f(a)$, we have

  $f(a) = \dfrac{m g(a) + \int k(db)\, f^0(b)}{\alpha m + \int k(db)\, E_b(1 - e^{-\alpha \sigma_a})}$,

which proves (2.4). Equation (2.5) is obvious by the preceding theorem. It follows from (2.7) that

  $m(\alpha f(a) - g(a)) = \int k(db)\,(f(b) - f(a))$,

which proves (2.6).

Case 3. $k(S) = \infty$. $a$ is an instantaneous state for $(X_t)$.

Theorem. The preceding theorem holds also in case $k(S) = \infty$, with

  $\int k(db)\,(f(b) - f(a)) = \lim_{\varepsilon \downarrow 0} \int_{\rho(b,a) > \varepsilon} k(db)\,(f(b) - f(a))$,

and with the following proviso. If $m > 0$, (2.4) holds for $g$ with

  $\lim_{b \to a} g(b) = g(a)$  (2.8)
and (2.6) holds for $f = R_\alpha g$ with $g$ satisfying the same condition.

Proof. (2.3) and (2.5) are obvious. Let $\varepsilon > 0$ and set

  $S_{1,\varepsilon} = \{b : \rho(b, a) \geq \varepsilon\}$, $S_{2,\varepsilon} = S \setminus S_{1,\varepsilon}$,
  $U_{i,\varepsilon} = \{u \in U : u(0) \in S_{i,\varepsilon}\}$, $i = 1, 2$,
  $X^{i,\varepsilon} = X \restriction U_{i,\varepsilon}$, $i = 1, 2$.

Let $Y^{2,\varepsilon}(t)$ be the process derived from $X^{2,\varepsilon}$ in the same way as $Y_t$ was derived from $X$ in Sect. 2.4. Since we fix $\varepsilon$ for the moment, we omit $\varepsilon$ in $S_{i,\varepsilon}$, $U_{i,\varepsilon}$ etc. Let

  $J(t, X) = \sum_{s \leq t,\, s \in D_X} h(X_s)$.

Similarly for $J(t, X^i)$. $X^1$ is discrete. Let $\sigma$ be the first element in $D_{X^1}$. Noticing that $s \in D_X$, $s < \sigma$ implies $s \in D_{X^2}$, we have

  $J(\sigma-, X) = J(\sigma-, X^2)$, $X_\sigma = X_\sigma^1$,

and

  $Y_t = Y_t^2$ for $t < m\sigma + J(\sigma-, X) = m\sigma + J(\sigma-, X^2)$.

Then (writing $Y(t, Z)$ for the process built from a point process $Z$ as in Sect. 2.4)

  $f(a) = R_\alpha g(a) = E\left(\int_0^\infty e^{-\alpha t} g(Y_t)\, dt\right)$
  $= E\left(\int_0^{m\sigma + J(\sigma-,X^2)} e^{-\alpha t} g(Y_t^2)\, dt\right)$
  $\quad + E\left[e^{-\alpha m \sigma - \alpha J(\sigma-,X^2)} \int_0^{h(X_\sigma^1)} e^{-\alpha t} g(X_\sigma^1(t))\, dt\right]$
  $\quad + E\left[e^{-\alpha m \sigma - \alpha J(\sigma-,X^2) - \alpha h(X_\sigma^1)} \int_0^\infty e^{-\alpha t} g(Y(t, \theta_\sigma X))\, dt\right]$
  $= I_1 + I_2 + I_3$.
$X^1$ and $X^2$ are independent. $\sigma$ and $X_\sigma^1$ are $B(X^1)$-measurable and independent of each other. $X^2$, $\sigma$ and $X_\sigma^1$ are therefore independent of each other. Thus we have

  $I_2 = E[e^{-\alpha m \sigma - \alpha J(\sigma-,X^2)}]\, E\left[\int_0^{h(X_\sigma^1)} e^{-\alpha t} g(X_\sigma^1(t))\, dt\right]$
  $= \int_0^\infty P(\sigma \in dt)\, e^{-\alpha m t}\, E[e^{-\alpha J(t,X^2)}] \cdot \frac{1}{k(S_1)} \int_{S_1} k(db)\, E_b\left(\int_0^{\sigma_a^0} e^{-\alpha t} g(X_t^0)\, dt\right)$.

Since $J(t, X^2)$ is an increasing Lévy process with jumps whose Lévy measure is equal to

  $n_{h \circ X^2}(dt) = \int_{S_2} k(db)\, P_b(\sigma_a^0 \in dt)$,

we have

  $E[e^{-\alpha J(t, X^2)}] = e^{-t \int_0^\infty (1 - e^{-\alpha s})\, n_{h \circ X^2}(ds)} = e^{-t \int_{S_2} k(db)\, E_b(1 - e^{-\alpha \sigma_a^0})}$.

It is obvious that

  $P(\sigma \in dt) = e^{-k(S_1) t} k(S_1)\, dt$.

Therefore

  $I_2 = \dfrac{\int_{S_1} k(db)\, E_b\left(\int_0^{\sigma_a^0} e^{-\alpha t} g(X_t^0)\, dt\right)}{\alpha m + k(S_1) + \int_{S_2} k(db)\, E_b(1 - e^{-\alpha \sigma_a^0})}$.

By the strong renewal property of $X$ we have

  $I_3 = E[e^{-\alpha m \sigma - \alpha J(\sigma-,X^2)}]\, E[e^{-\alpha h(X_\sigma^1)}]\, E\left[\int_0^\infty e^{-\alpha t} g(Y_t)\, dt\right] = \dfrac{\int_{S_1} k(db)\, E_b(e^{-\alpha \sigma_a^0})}{\alpha m + k(S_1) + \int_{S_2} k(db)\, E_b(1 - e^{-\alpha \sigma_a^0})}\, f(a)$.

Thus we have

  $f(a) = I_1 + \dfrac{\int_{S_1} k(db)\left[E_b\left(\int_0^{\sigma_a^0} e^{-\alpha t} g(X_t^0)\, dt\right) + E_b(e^{-\alpha \sigma_a^0}) f(a)\right]}{\alpha m + k(S_1) + \int_{S_2} k(db)\, E_b(1 - e^{-\alpha \sigma_a^0})}$.  (2.9)
To evaluate $I_1$, consider

  $f^2(a) := E\left(\int_0^\infty e^{-\alpha t} g(Y_t^2)\, dt\right)$
  $= I_1 + E\left(e^{-\alpha m \sigma - \alpha J(\sigma-,X^2)} \int_0^\infty e^{-\alpha t} g(Y(t, \theta_\sigma X^2))\, dt\right)$
  $= I_1 + \int_0^\infty P(\sigma \in ds)\, E\left(e^{-\alpha m s - \alpha J(s,X^2)} \int_0^\infty e^{-\alpha t} g(Y(t, \theta_s X^2))\, dt\right)$
  $= I_1 + \int_0^\infty P(\sigma \in ds)\, E(e^{-\alpha m s - \alpha J(s,X^2)})\, E\left(\int_0^\infty e^{-\alpha t} g(Y(t, X^2))\, dt\right)$

(by the renewal property of $X^2$ and the independence of $\sigma$ and $X^2$)

  $= I_1 + \int_0^\infty P(\sigma \in ds)\, E(e^{-\alpha m s - \alpha J(s,X^2)})\, f^2(a)$.

This implies

  $f^2(a) = I_1 + E(e^{-\alpha m \sigma - \alpha J(\sigma,X^2)}) f^2(a)$,

so that

  $I_1 = f^2(a)\left[1 - E(e^{-\alpha m \sigma - \alpha J(\sigma,X^2)})\right] = f^2(a)\left[1 - \dfrac{k(S_1)}{\alpha m + k(S_1) + \int_{S_2} k(db)\, E_b(1 - e^{-\alpha \sigma_a^0})}\right]$
  $= f^2(a)\, \dfrac{\alpha m + \int_{S_2} k(db)\, E_b(1 - e^{-\alpha \sigma_a^0})}{\alpha m + k(S_1) + \int_{S_2} k(db)\, E_b(1 - e^{-\alpha \sigma_a^0})}$.

From (2.9) we have

  $f(a) = \dfrac{f^2(a)\left[\alpha m + \int_{S_2} k(db)\, E_b(1 - e^{-\alpha \sigma_a^0})\right] + \int_{S_1} k(db)\left[E_b\left(\int_0^{\sigma_a^0} e^{-\alpha t} g(X_t^0)\, dt\right) + E_b(e^{-\alpha \sigma_a^0}) f(a)\right]}{\alpha m + k(S_1) + \int_{S_2} k(db)\, E_b(1 - e^{-\alpha \sigma_a^0})}$.  (2.10)
Solving this for $f(a)$ we have

  $f(a) = \dfrac{f^2(a)\left[\alpha m + \int_{S_2} k(db)\, E_b(1 - e^{-\alpha \sigma_a^0})\right] + \int_{S_1} k(db)\, E_b\left(\int_0^{\sigma_a^0} e^{-\alpha t} g(X_t^0)\, dt\right)}{\alpha m + \int_S k(db)\, E_b(1 - e^{-\alpha \sigma_a^0})}$.  (2.11)

Let $\varepsilon \downarrow 0$; then

  $\int_{S_1} k(db)\, E_b\left(\int_0^{\sigma_a^0} e^{-\alpha t} g(X_t^0)\, dt\right) \to \int_S k(db)\, E_b\left(\int_0^{\sigma_a^0} e^{-\alpha t} g(X_t^0)\, dt\right)$;

notice that

  $\int_S k(db)\left|E_b\left(\int_0^{\sigma_a^0} e^{-\alpha t} g(X_t^0)\, dt\right)\right| \leq \frac{\|g\|}{\alpha} \int_S k(db)\, E_b(1 - e^{-\alpha \sigma_a^0}) < \infty$, $\|\cdot\| = $ sup norm,

by virtue of $\int k(db)\, E_b(\sigma_a^0 \wedge 1) < \infty$. It is obvious that

  $\int_{S_2} k(db)\, E_b(1 - e^{-\alpha \sigma_a^0}) \to 0$ as $\varepsilon \downarrow 0$.

It follows from (2.10) and (2.3) that

  $f(a) = \dfrac{f^2(a)\left[\alpha m + \int_{S_2} k(db)\, E_b(1 - e^{-\alpha \sigma_a^0})\right] + \int_{S_1} k(db)\, f(b)}{\alpha m + k(S_1) + \int_{S_2} k(db)\, E_b(1 - e^{-\alpha \sigma_a^0})}$,

so that

  $\alpha m (f(a) - f^2(a)) + f(a) \int_{S_2} k(db)\, E_b(1 - e^{-\alpha \sigma_a^0}) = f^2(a) \int_{S_2} k(db)\, E_b(1 - e^{-\alpha \sigma_a^0}) + \int_{S_1} k(db)\,(f(b) - f(a))$.  (2.12)
If $m = 0$, we can derive (2.4) and (2.6) from (2.11) and (2.12), letting $\varepsilon \downarrow 0$ and noticing that

  $|f^2(a)| = |f^{2,\varepsilon}(a)| \leq \|g\|/\alpha$.

If $m > 0$, we need only prove that

  $\lim_{\varepsilon \downarrow 0} f^{2,\varepsilon}(a) = \frac{g(a)}{\alpha}$  (2.13)

in order to derive (2.4) and (2.6) from (2.11) and (2.12). Let $\eta > 0$ and set

  $V_1 = V_{1,\eta} = \{u \in U : \sup_t \rho(u(t), a) \geq \eta\}$, $V_2 = V_{2,\eta} = U \setminus V_1$,
  $Y^i = Y^{i,\varepsilon,\eta} = X^{2,\varepsilon} \restriction V_{i,\eta}$, $i = 1, 2$.

By the argument in the last step of the existence proof of Theorem 2.4.1, we have

  $\lambda = \lambda_{\varepsilon,\eta} := n_{Y^1}(V_1) = n_{X^{2,\varepsilon}}(V_{1,\eta}) \leq \left(\inf_{\rho(b,a) > \eta} E_b(\sigma_a^0 \wedge 1)\right)^{-1} \int_{S_{2,\varepsilon}} k(db)\, E_b(\sigma_a^0 \wedge 1) \to 0$, $\varepsilon \downarrow 0$, for $\eta$ fixed.

$Y^1$ is a discrete Poisson point process. Let $\tau = \tau_{\varepsilon,\eta}$ be the first element in $D_{Y^1}$. Then $\tau$ is exponentially distributed with rate $\lambda_{\varepsilon,\eta}$. Using the same argument as in deriving (2.9), we obtain, with $g^*(b) = g(b) - g(a)$,

  $f^{2,\varepsilon}(a) - \dfrac{g(a)}{\alpha} = E\left(\int_0^{m\tau + J(\tau-,Y^2)} e^{-\alpha t} g^*(Y(t, Y^2))\, dt\right)$
  $\quad + E\left(e^{-\alpha m \tau - \alpha J(\tau-,Y^2)} \int_0^{h(Y_\tau^1)} e^{-\alpha t} g^*(Y_\tau^1(t))\, dt\right)$
  $\quad + E\left(e^{-\alpha m \tau - \alpha J(\tau-,Y^2) - \alpha h(Y_\tau^1)} \int_0^\infty e^{-\alpha t} g^*(Y_t^{2,\varepsilon})\, dt\right)$.

Since $\rho(Y(t, Y^2), a) < \eta$ for $0 < t < m\tau + J(\tau-, Y^2)$, we have

  $\left|f^{2,\varepsilon}(a) - \dfrac{g(a)}{\alpha}\right| \leq \delta(\eta)\, \dfrac{1}{\alpha} + E(e^{-\alpha m \tau})\, \dfrac{\|g^*\|}{\alpha} + E(e^{-\alpha m \tau})\, \dfrac{\|g^*\|}{\alpha}$,
where $\delta(\eta) = \sup\{|g^*(b)| : \rho(b, a) < \eta\} \to 0$ ($\eta \downarrow 0$) by (2.8). Since $\tau$ is exponentially distributed with rate $\lambda_{\varepsilon,\eta}$, we have

  $E(e^{-\alpha m \tau}) = \int_0^\infty e^{-\alpha m t} e^{-\lambda_{\varepsilon,\eta} t} \lambda_{\varepsilon,\eta}\, dt = \dfrac{\lambda_{\varepsilon,\eta}}{\alpha m + \lambda_{\varepsilon,\eta}} \to 0$, $\varepsilon \downarrow 0$,

by $m > 0$. Thus we have

  $\limsup_{\varepsilon \downarrow 0} \left|f^{2,\varepsilon}(a) - \dfrac{g(a)}{\alpha}\right| \leq \delta(\eta)\, \dfrac{1}{\alpha} \to 0$, $\eta \downarrow 0$.

This completes the proof.

2.6 Examples

Example 1. Let $S = [0, \infty)$ and $X^0$ be a diffusion in $S$ stopped at $0$ such that the generator of $X^0$ is

  $G^0 = \dfrac{d}{dm}\dfrac{d}{dx}$.

Let $0$ be an exit or regular (i.e., exit in Feller's new terminology) boundary, i.e.,

  $\int_0^1 m((\xi, 1])\, d\xi < \infty$.

Then $X^0$ satisfies (A$^0$-1), (A$^0$-2) and (A$^0$-3) in Sect. 2.1; notice that

  $\inf_{\rho(b,0) > \varepsilon} E_b(\sigma_0 \wedge 1) = E_\varepsilon(\sigma_0 \wedge 1) > 0$.

We will investigate the condition (i) in Theorem 2.3.8:

  $\int k(db)\, E_b(\sigma_0 \wedge 1) < \infty$.  (2.14)

This is equivalent to

  $\int k(db)\, E_b(1 - e^{-\sigma_0}) < \infty$.
Since $u(b) = E_b(e^{-\sigma_0})$ is a decreasing positive solution of

  $\dfrac{d}{dm}\dfrac{d}{dx} u = u$, $u(0) = 1$,

we have

  $u'(1) - u'(\xi) = \int_\xi^1 u(\eta)\, m(d\eta) \asymp m((\xi, 1])$ $(\xi \downarrow 0)$,
  $u(0) - u(b) \asymp \int_0^b \left(m((\xi, 1]) - u'(1)\right) d\xi$,

where $\alpha(\xi) \asymp \beta(\xi)$ $(\xi \downarrow 0)$ means that we have $c_1, c_2 > 0$ independent of $\xi$ such that $c_1 \beta(\xi) < \alpha(\xi) < c_2 \beta(\xi)$ near $\xi = 0$.

Case 1 (regular case). If $0$ is a regular (i.e., exit and entrance in Feller's new terminology) boundary, i.e., $m((0, 1]) < \infty$, then

  $E_b(1 - e^{-\sigma_0}) = u(0) - u(b) \asymp b$ $(b \downarrow 0)$.

Since $E_b(1 - e^{-\sigma_0}) \to 1$ as $b \to \infty$, $E_b(1 - e^{-\sigma_0}) \asymp b \wedge 1$ in $0 < b < \infty$. Therefore our condition (2.14) turns out to be

  $\int k(db)\,(b \wedge 1) < \infty$.

Case 2 (exit case). If $0$ is an exit (i.e., exit and non-entrance in Feller's new terminology) boundary, i.e., $m((0, 1]) = \infty$, then (2.14) turns out to be

  $\int k(db)\left[\left(\int_0^b m((\xi, 1])\, d\xi\right) \wedge 1\right] < \infty$.

Example 2. Let $S = [0, \infty)$ and $X^0$ be a deterministic motion with constant speed $1$. Then $P_b(\sigma_0 = b) = 1$ and so (2.14) is written as

  $\int k(db)\,(b \wedge 1) < \infty$.

Reference

1. Blumenthal, R., Getoor, R.: Markov Processes and Potential Theory. Academic Press, New York (1968)
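Condition (2.14) can be probed numerically for a concrete family of jumping-in measures. A sketch with our own toy choice, not from the text: take $k(db) = b^{-\beta}\, db$ on $(0, 1]$; then $\int k(db)(b \wedge 1) = \int_0^1 b^{1-\beta}\, db$ is finite exactly when $\beta < 2$, so such a $k$ is admissible in Example 2 only for $\beta < 2$.

```python
def integral_k_b_min1(beta, lo=1e-8, n=400):
    """Approximate the integral of (b ^ 1) * b**(-beta) over (lo, 1] for the
    toy jumping-in density k(db) = b**(-beta) db, using a midpoint rule on a
    geometric grid that refines toward 0."""
    total, left = 0.0, lo
    ratio = (1.0 / lo) ** (1.0 / n)
    for _ in range(n):
        right = left * ratio
        mid = (left + right) / 2
        total += min(mid, 1.0) * mid ** (-beta) * (right - left)
        left = right
    return total

# beta = 1.5: the integral converges (the full integral over (0,1] is 2);
# beta = 2.5: the truncated integral blows up as lo -> 0.
print(integral_k_b_min1(1.5), integral_k_b_min1(2.5))
```

Pushing `lo` toward $0$ leaves the first value essentially unchanged while the second grows without bound, which is the numerical face of the admissibility threshold $\beta < 2$.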
Lecture 1: Semi-Markov Type Processes 1. Semi-Markov processes (SMP) 1.1 Definition of SMP 1.2 Transition probabilities for SMP 1.3 Hitting times and semi-markov renewal equations 2. Processes with semi-markov
More informationSMSTC (2007/08) Probability.
SMSTC (27/8) Probability www.smstc.ac.uk Contents 12 Markov chains in continuous time 12 1 12.1 Markov property and the Kolmogorov equations.................... 12 2 12.1.1 Finite state space.................................
More informationUniformly Uniformly-ergodic Markov chains and BSDEs
Uniformly Uniformly-ergodic Markov chains and BSDEs Samuel N. Cohen Mathematical Institute, University of Oxford (Based on joint work with Ying Hu, Robert Elliott, Lukas Szpruch) Centre Henri Lebesgue,
More informationApplications of Ito s Formula
CHAPTER 4 Applications of Ito s Formula In this chapter, we discuss several basic theorems in stochastic analysis. Their proofs are good examples of applications of Itô s formula. 1. Lévy s martingale
More informationPoisson random measure: motivation
: motivation The Lévy measure provides the expected number of jumps by time unit, i.e. in a time interval of the form: [t, t + 1], and of a certain size Example: ν([1, )) is the expected number of jumps
More informationOn the Multi-Dimensional Controller and Stopper Games
On the Multi-Dimensional Controller and Stopper Games Joint work with Yu-Jui Huang University of Michigan, Ann Arbor June 7, 2012 Outline Introduction 1 Introduction 2 3 4 5 Consider a zero-sum controller-and-stopper
More informationThe strictly 1/2-stable example
The strictly 1/2-stable example 1 Direct approach: building a Lévy pure jump process on R Bert Fristedt provided key mathematical facts for this example. A pure jump Lévy process X is a Lévy process such
More informationNotes 1 : Measure-theoretic foundations I
Notes 1 : Measure-theoretic foundations I Math 733-734: Theory of Probability Lecturer: Sebastien Roch References: [Wil91, Section 1.0-1.8, 2.1-2.3, 3.1-3.11], [Fel68, Sections 7.2, 8.1, 9.6], [Dur10,
More informationProperties of an infinite dimensional EDS system : the Muller s ratchet
Properties of an infinite dimensional EDS system : the Muller s ratchet LATP June 5, 2011 A ratchet source : wikipedia Plan 1 Introduction : The model of Haigh 2 3 Hypothesis (Biological) : The population
More informationAN INTEGRO-DIFFERENTIAL EQUATION FOR A COMPOUND POISSON PROCESS WITH DRIFT AND THE INTEGRAL EQUATION OF H. CRAMER
Watanabe, T. Osaka J. Math. 8 (1971), 377-383 AN INTEGRO-DIFFERENTIAL EQUATION FOR A COMPOUND POISSON PROCESS WITH DRIFT AND THE INTEGRAL EQUATION OF H. CRAMER TAKESI WATANABE (Received April 13, 1971)
More informationWiener Measure and Brownian Motion
Chapter 16 Wiener Measure and Brownian Motion Diffusion of particles is a product of their apparently random motion. The density u(t, x) of diffusing particles satisfies the diffusion equation (16.1) u
More informationStochastic Processes. Winter Term Paolo Di Tella Technische Universität Dresden Institut für Stochastik
Stochastic Processes Winter Term 2016-2017 Paolo Di Tella Technische Universität Dresden Institut für Stochastik Contents 1 Preliminaries 5 1.1 Uniform integrability.............................. 5 1.2
More informationExponential functionals of Lévy processes
Exponential functionals of Lévy processes Víctor Rivero Centro de Investigación en Matemáticas, México. 1/ 28 Outline of the talk Introduction Exponential functionals of spectrally positive Lévy processes
More informationFiltrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition
Filtrations, Markov Processes and Martingales Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition David pplebaum Probability and Statistics Department,
More informationOn Stopping Times and Impulse Control with Constraint
On Stopping Times and Impulse Control with Constraint Jose Luis Menaldi Based on joint papers with M. Robin (216, 217) Department of Mathematics Wayne State University Detroit, Michigan 4822, USA (e-mail:
More informationRegular Variation and Extreme Events for Stochastic Processes
1 Regular Variation and Extreme Events for Stochastic Processes FILIP LINDSKOG Royal Institute of Technology, Stockholm 2005 based on joint work with Henrik Hult www.math.kth.se/ lindskog 2 Extremes for
More informationPartial Differential Equations with Applications to Finance Seminar 1: Proving and applying Dynkin s formula
Partial Differential Equations with Applications to Finance Seminar 1: Proving and applying Dynkin s formula Group 4: Bertan Yilmaz, Richard Oti-Aboagye and Di Liu May, 15 Chapter 1 Proving Dynkin s formula
More informationERRATA: Probabilistic Techniques in Analysis
ERRATA: Probabilistic Techniques in Analysis ERRATA 1 Updated April 25, 26 Page 3, line 13. A 1,..., A n are independent if P(A i1 A ij ) = P(A 1 ) P(A ij ) for every subset {i 1,..., i j } of {1,...,
More informationStochastic integration. P.J.C. Spreij
Stochastic integration P.J.C. Spreij this version: April 22, 29 Contents 1 Stochastic processes 1 1.1 General theory............................... 1 1.2 Stopping times...............................
More informationMulti-dimensional Stochastic Singular Control Via Dynkin Game and Dirichlet Form
Multi-dimensional Stochastic Singular Control Via Dynkin Game and Dirichlet Form Yipeng Yang * Under the supervision of Dr. Michael Taksar Department of Mathematics University of Missouri-Columbia Oct
More informationContinuous Functions on Metric Spaces
Continuous Functions on Metric Spaces Math 201A, Fall 2016 1 Continuous functions Definition 1. Let (X, d X ) and (Y, d Y ) be metric spaces. A function f : X Y is continuous at a X if for every ɛ > 0
More informationWeak convergence and Brownian Motion. (telegram style notes) P.J.C. Spreij
Weak convergence and Brownian Motion (telegram style notes) P.J.C. Spreij this version: December 8, 2006 1 The space C[0, ) In this section we summarize some facts concerning the space C[0, ) of real
More informationA D VA N C E D P R O B A B I L - I T Y
A N D R E W T U L L O C H A D VA N C E D P R O B A B I L - I T Y T R I N I T Y C O L L E G E T H E U N I V E R S I T Y O F C A M B R I D G E Contents 1 Conditional Expectation 5 1.1 Discrete Case 6 1.2
More informationMATH 56A: STOCHASTIC PROCESSES CHAPTER 6
MATH 56A: STOCHASTIC PROCESSES CHAPTER 6 6. Renewal Mathematically, renewal refers to a continuous time stochastic process with states,, 2,. N t {,, 2, 3, } so that you only have jumps from x to x + and
More informationThreshold dividend strategies for a Markov-additive risk model
European Actuarial Journal manuscript No. will be inserted by the editor Threshold dividend strategies for a Markov-additive risk model Lothar Breuer Received: date / Accepted: date Abstract We consider
More informationBranching Processes II: Convergence of critical branching to Feller s CSB
Chapter 4 Branching Processes II: Convergence of critical branching to Feller s CSB Figure 4.1: Feller 4.1 Birth and Death Processes 4.1.1 Linear birth and death processes Branching processes can be studied
More informationLecture 22 Girsanov s Theorem
Lecture 22: Girsanov s Theorem of 8 Course: Theory of Probability II Term: Spring 25 Instructor: Gordan Zitkovic Lecture 22 Girsanov s Theorem An example Consider a finite Gaussian random walk X n = n
More informationA Change of Variable Formula with Local Time-Space for Bounded Variation Lévy Processes with Application to Solving the American Put Option Problem 1
Chapter 3 A Change of Variable Formula with Local Time-Space for Bounded Variation Lévy Processes with Application to Solving the American Put Option Problem 1 Abstract We establish a change of variable
More informationSolving the Poisson Disorder Problem
Advances in Finance and Stochastics: Essays in Honour of Dieter Sondermann, Springer-Verlag, 22, (295-32) Research Report No. 49, 2, Dept. Theoret. Statist. Aarhus Solving the Poisson Disorder Problem
More informationLecture 19 L 2 -Stochastic integration
Lecture 19: L 2 -Stochastic integration 1 of 12 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 19 L 2 -Stochastic integration The stochastic integral for processes
More informationHJB equations. Seminar in Stochastic Modelling in Economics and Finance January 10, 2011
Department of Probability and Mathematical Statistics Faculty of Mathematics and Physics, Charles University in Prague petrasek@karlin.mff.cuni.cz Seminar in Stochastic Modelling in Economics and Finance
More informationJump Processes. Richard F. Bass
Jump Processes Richard F. Bass ii c Copyright 214 Richard F. Bass Contents 1 Poisson processes 1 1.1 Definitions............................. 1 1.2 Stopping times.......................... 3 1.3 Markov
More informationA.Piunovskiy. University of Liverpool Fluid Approximation to Controlled Markov. Chains with Local Transitions. A.Piunovskiy.
University of Liverpool piunov@liv.ac.uk The Markov Decision Process under consideration is defined by the following elements X = {0, 1, 2,...} is the state space; A is the action space (Borel); p(z x,
More informationMATH 6605: SUMMARY LECTURE NOTES
MATH 6605: SUMMARY LECTURE NOTES These notes summarize the lectures on weak convergence of stochastic processes. If you see any typos, please let me know. 1. Construction of Stochastic rocesses A stochastic
More informationMath 564 Homework 1. Solutions.
Math 564 Homework 1. Solutions. Problem 1. Prove Proposition 0.2.2. A guide to this problem: start with the open set S = (a, b), for example. First assume that a >, and show that the number a has the properties
More informationConvergence of Markov Processes. Amanda Turner University of Cambridge
Convergence of Markov Processes Amanda Turner University of Cambridge 1 Contents 1 Introduction 2 2 The Space D E [, 3 2.1 The Skorohod Topology................................ 3 3 Convergence of Probability
More informationLecture 4: Introduction to stochastic processes and stochastic calculus
Lecture 4: Introduction to stochastic processes and stochastic calculus Cédric Archambeau Centre for Computational Statistics and Machine Learning Department of Computer Science University College London
More informationGAUSSIAN PROCESSES AND THE LOCAL TIMES OF SYMMETRIC LÉVY PROCESSES
GAUSSIAN PROCESSES AND THE LOCAL TIMES OF SYMMETRIC LÉVY PROCESSES MICHAEL B. MARCUS AND JAY ROSEN CITY COLLEGE AND COLLEGE OF STATEN ISLAND Abstract. We give a relatively simple proof of the necessary
More informationRichard F. Bass Krzysztof Burdzy University of Washington
ON DOMAIN MONOTONICITY OF THE NEUMANN HEAT KERNEL Richard F. Bass Krzysztof Burdzy University of Washington Abstract. Some examples are given of convex domains for which domain monotonicity of the Neumann
More informationX n D X lim n F n (x) = F (x) for all x C F. lim n F n(u) = F (u) for all u C F. (2)
14:17 11/16/2 TOPIC. Convergence in distribution and related notions. This section studies the notion of the so-called convergence in distribution of real random variables. This is the kind of convergence
More informationKrzysztof Burdzy Robert Ho lyst Peter March
A FLEMING-VIOT PARTICLE REPRESENTATION OF THE DIRICHLET LAPLACIAN Krzysztof Burdzy Robert Ho lyst Peter March Abstract: We consider a model with a large number N of particles which move according to independent
More informationLAN property for sde s with additive fractional noise and continuous time observation
LAN property for sde s with additive fractional noise and continuous time observation Eulalia Nualart (Universitat Pompeu Fabra, Barcelona) joint work with Samy Tindel (Purdue University) Vlad s 6th birthday,
More informationThe first order quasi-linear PDEs
Chapter 2 The first order quasi-linear PDEs The first order quasi-linear PDEs have the following general form: F (x, u, Du) = 0, (2.1) where x = (x 1, x 2,, x 3 ) R n, u = u(x), Du is the gradient of u.
More informationPath Decomposition of Markov Processes. Götz Kersting. University of Frankfurt/Main
Path Decomposition of Markov Processes Götz Kersting University of Frankfurt/Main joint work with Kaya Memisoglu, Jim Pitman 1 A Brownian path with positive drift 50 40 30 20 10 0 0 200 400 600 800 1000-10
More informationSolutions For Stochastic Process Final Exam
Solutions For Stochastic Process Final Exam (a) λ BMW = 20 0% = 2 X BMW Poisson(2) Let N t be the number of BMWs which have passes during [0, t] Then the probability in question is P (N ) = P (N = 0) =
More informationBSDEs and PDEs with discontinuous coecients Applications to homogenization K. Bahlali, A. Elouain, E. Pardoux. Jena, March 2009
BSDEs and PDEs with discontinuous coecients Applications to homogenization K. Bahlali, A. Elouain, E. Pardoux. Jena, 16-20 March 2009 1 1) L p viscosity solution to 2nd order semilinear parabolic PDEs
More informationZdzis law Brzeźniak and Tomasz Zastawniak
Basic Stochastic Processes by Zdzis law Brzeźniak and Tomasz Zastawniak Springer-Verlag, London 1999 Corrections in the 2nd printing Version: 21 May 2005 Page and line numbers refer to the 2nd printing
More informationTwo viewpoints on measure valued processes
Two viewpoints on measure valued processes Olivier Hénard Université Paris-Est, Cermics Contents 1 The classical framework : from no particle to one particle 2 The lookdown framework : many particles.
More informationAn essay on the general theory of stochastic processes
Probability Surveys Vol. 3 (26) 345 412 ISSN: 1549-5787 DOI: 1.1214/1549578614 An essay on the general theory of stochastic processes Ashkan Nikeghbali ETHZ Departement Mathematik, Rämistrasse 11, HG G16
More informationIntroduction and Preliminaries
Chapter 1 Introduction and Preliminaries This chapter serves two purposes. The first purpose is to prepare the readers for the more systematic development in later chapters of methods of real analysis
More informationA MODEL FOR THE LONG-TERM OPTIMAL CAPACITY LEVEL OF AN INVESTMENT PROJECT
A MODEL FOR HE LONG-ERM OPIMAL CAPACIY LEVEL OF AN INVESMEN PROJEC ARNE LØKKA AND MIHAIL ZERVOS Abstract. We consider an investment project that produces a single commodity. he project s operation yields
More informationExercises in stochastic analysis
Exercises in stochastic analysis Franco Flandoli, Mario Maurelli, Dario Trevisan The exercises with a P are those which have been done totally or partially) in the previous lectures; the exercises with
More informationCross-Selling in a Call Center with a Heterogeneous Customer Population. Technical Appendix
Cross-Selling in a Call Center with a Heterogeneous Customer opulation Technical Appendix Itay Gurvich Mor Armony Costis Maglaras This technical appendix is dedicated to the completion of the proof of
More informationUniversal examples. Chapter The Bernoulli process
Chapter 1 Universal examples 1.1 The Bernoulli process First description: Bernoulli random variables Y i for i = 1, 2, 3,... independent with P [Y i = 1] = p and P [Y i = ] = 1 p. Second description: Binomial
More informationfor all subintervals I J. If the same is true for the dyadic subintervals I D J only, we will write ϕ BMO d (J). In fact, the following is true
3 ohn Nirenberg inequality, Part I A function ϕ L () belongs to the space BMO() if sup ϕ(s) ϕ I I I < for all subintervals I If the same is true for the dyadic subintervals I D only, we will write ϕ BMO
More informationSolutions to the Exercises in Stochastic Analysis
Solutions to the Exercises in Stochastic Analysis Lecturer: Xue-Mei Li 1 Problem Sheet 1 In these solution I avoid using conditional expectations. But do try to give alternative proofs once we learnt conditional
More informationThe Lebesgue Integral
The Lebesgue Integral Brent Nelson In these notes we give an introduction to the Lebesgue integral, assuming only a knowledge of metric spaces and the iemann integral. For more details see [1, Chapters
More informationA Central Limit Theorem for Fleming-Viot Particle Systems Application to the Adaptive Multilevel Splitting Algorithm
A Central Limit Theorem for Fleming-Viot Particle Systems Application to the Algorithm F. Cérou 1,2 B. Delyon 2 A. Guyader 3 M. Rousset 1,2 1 Inria Rennes Bretagne Atlantique 2 IRMAR, Université de Rennes
More informationSmall-time asymptotics of stopped Lévy bridges and simulation schemes with controlled bias
Small-time asymptotics of stopped Lévy bridges and simulation schemes with controlled bias José E. Figueroa-López 1 1 Department of Statistics Purdue University Seoul National University & Ajou University
More informationPotentials of stable processes
Potentials of stable processes A. E. Kyprianou A. R. Watson 5th December 23 arxiv:32.222v [math.pr] 4 Dec 23 Abstract. For a stable process, we give an explicit formula for the potential measure of the
More informationWe denote the space of distributions on Ω by D ( Ω) 2.
Sep. 1 0, 008 Distributions Distributions are generalized functions. Some familiarity with the theory of distributions helps understanding of various function spaces which play important roles in the study
More informationThe Structure of a Brownian Bubble
The Structure of a Brownian Bubble Robert C. Dalang 1 and John B. Walsh 2 Abstract In this paper, we examine local geometric properties of level sets of the Brownian sheet, and in particular, we identify
More informationA LITTLE REAL ANALYSIS AND TOPOLOGY
A LITTLE REAL ANALYSIS AND TOPOLOGY 1. NOTATION Before we begin some notational definitions are useful. (1) Z = {, 3, 2, 1, 0, 1, 2, 3, }is the set of integers. (2) Q = { a b : aεz, bεz {0}} is the set
More informationGARCH processes continuous counterparts (Part 2)
GARCH processes continuous counterparts (Part 2) Alexander Lindner Centre of Mathematical Sciences Technical University of Munich D 85747 Garching Germany lindner@ma.tum.de http://www-m1.ma.tum.de/m4/pers/lindner/
More informationHardy-Stein identity and Square functions
Hardy-Stein identity and Square functions Daesung Kim (joint work with Rodrigo Bañuelos) Department of Mathematics Purdue University March 28, 217 Daesung Kim (Purdue) Hardy-Stein identity UIUC 217 1 /
More informationSome Tools From Stochastic Analysis
W H I T E Some Tools From Stochastic Analysis J. Potthoff Lehrstuhl für Mathematik V Universität Mannheim email: potthoff@math.uni-mannheim.de url: http://ls5.math.uni-mannheim.de To close the file, click
More informationOn the submartingale / supermartingale property of diffusions in natural scale
On the submartingale / supermartingale property of diffusions in natural scale Alexander Gushchin Mikhail Urusov Mihail Zervos November 13, 214 Abstract Kotani 5 has characterised the martingale property
More informationTHEOREMS, ETC., FOR MATH 515
THEOREMS, ETC., FOR MATH 515 Proposition 1 (=comment on page 17). If A is an algebra, then any finite union or finite intersection of sets in A is also in A. Proposition 2 (=Proposition 1.1). For every
More informationMarkov Chain Approximation of Pure Jump Processes. Nikola Sandrić (joint work with Ante Mimica and René L. Schilling) University of Zagreb
Markov Chain Approximation of Pure Jump Processes Nikola Sandrić (joint work with Ante Mimica and René L. Schilling) University of Zagreb 8 th International Conference on Lévy processes Angers, July 25-29,
More informationOn pathwise stochastic integration
On pathwise stochastic integration Rafa l Marcin Lochowski Afican Institute for Mathematical Sciences, Warsaw School of Economics UWC seminar Rafa l Marcin Lochowski (AIMS, WSE) On pathwise stochastic
More information