Solving the Poisson Disorder Problem


Advances in Finance and Stochastics: Essays in Honour of Dieter Sondermann, Springer-Verlag, 2002 (295-312)
Research Report No. 49, 2000, Dept. Theoret. Statist. Aarhus

Solving the Poisson Disorder Problem

G. PESKIR* and A. N. SHIRYAEV*

The Poisson disorder problem seeks to determine a stopping time which is as close as possible to the (unknown) time of disorder when the intensity of an observed Poisson process changes from one value to another. Partial answers to this question are known to date only in some special cases, and the main purpose of the present paper is to describe the structure of the solution in the general case. The method of proof consists of reducing the initial (optimal stopping) problem to a free-boundary differential-difference problem. The key point in the solution is reached by specifying when the principle of smooth fit breaks down and gets superseded by the principle of continuous fit. This can be done in probabilistic terms (by describing the sample-path behaviour of the a posteriori probability process) and in analytic terms (via the existence of a singularity point of the free-boundary equation).

1. Introduction

The Poisson disorder problem is less formally stated as follows. Suppose that at time $t = 0$ we begin observing a trajectory of the Poisson process $X = (X_t)_{t \ge 0}$ whose intensity changes from $\lambda_0$ to $\lambda_1$ at some random (unknown) time $\theta$ which is assumed to take the value $0$ with probability $\pi$, and to be exponentially distributed with parameter $\lambda$ given that $\theta > 0$. Based upon the information which is continuously updated through our observation of the trajectory of $X$, our problem is to terminate the observation (and declare the alarm) at a time $\tau_*$ which is as close as possible to $\theta$ (as measured by a cost function with parameter $c > 0$ specified below). The problem above was first studied in [2], where a solution was found in the case $\lambda_0 + c \ge \lambda_1$. This result was extended in [1] to the case $\lambda_0 + \lambda + c \ge \lambda_1$.
Many other authors have also studied the problem from a different standpoint (see e.g. [5]). The main purpose of the present paper is to describe the structure of the solution in the general case. The Wiener process version of the disorder problem (where the drift changes) appeared earlier (see [7]) and is now well understood (we refer to [8, page 28] for historical comments and references). There the method of proof consists of reducing the initial (optimal stopping) problem to a free-boundary differential problem which can be solved explicitly, and the principle of smooth fit plays a key role in this context. In this paper we adopt the same methodology as in the Wiener process case. The discontinuous character of the observed (Poisson) process in the present case, however, forces us to deal with a differential-difference equation forming a free-boundary problem which is more delicate. This in turn leads to a new effect of the breakdown of the smooth-fit principle (and its replacement by the principle of continuous fit), and the key issue in the solution is to understand and specify when

* Centre for Mathematical Physics and Stochastics, supported by the Danish National Research Foundation.
Mathematics Subject Classification 2000. Primary 62M20, 60G40, 34K10. Secondary 62L15, 62C10, 60J75.
Key words and phrases: Disorder (quickest detection, change-point, disruption, disharmony) problem, Poisson process, optimal stopping, a free-boundary differential-difference problem, the principles of continuous and smooth fit, point (counting) (Cox) process, the innovation process, measure of jumps and its compensator, Itô's formula.
goran@imf.au.dk

exactly this happens. This can be done, on the one hand, in terms of the a posteriori probability process (i.e. its jump structure and sample-path behaviour), and on the other hand, in terms of a singularity point of the equation from the free-boundary problem. Moreover, it turns out that the existence of such a singularity point makes explicit computations feasible. The facts on the principles of continuous and smooth fit found here complement and further extend our findings in [6]. Problems of detecting the arrival of disorder are of central importance in quality control and have also found notable industrial and other applications.

2. The Poisson disorder problem

1. The Poisson disorder problem can be formally stated as follows. Let $N^0 = (N^0_t)_{t \ge 0}$, $N^1 = (N^1_t)_{t \ge 0}$ and $L = (L_t)_{t \ge 0}$ be three independent stochastic processes defined on a probability space $(\Omega, F, P_\pi)$ with $\pi \in [0,1]$ such that:

(2.1) $N^0$ is a Poisson process with intensity $\lambda_0 > 0$;
(2.2) $N^1$ is a Poisson process with intensity $\lambda_1 > 0$;
(2.3) $L$ is a continuous Markov chain with two states $0$ and $1$, initial distribution $[1-\pi, \pi]$, and transition-probability matrix $[e^{-\lambda t},\ 1-e^{-\lambda t};\ 0,\ 1]$ for $t > 0$, where $\lambda > 0$.

Thus $P_\pi(L_0 = 0) = 1-\pi$ and $P_\pi(L_0 = 1) = \pi$, and given that $L_0 = 0$, there is a single passage of $L$ from $0$ to $1$ at a random time $\theta$ satisfying $P(\theta > t \mid L_0 = 0) = e^{-\lambda t}$ for all $t > 0$. The process $X = (X_t)_{t \ge 0}$ observed is given by

(2.4) $X_t = \int_0^t I(L_s = 0)\, dN^0_s + \int_0^t I(L_s = 1)\, dN^1_s$

and we set $F^X_t = \sigma(X_s \mid s \le t)$ for $t \ge 0$. Denoting $\theta = \inf\{ t \ge 0 \mid L_t = 1 \}$ we see that $P_\pi(\theta = 0) = \pi$ and $P_\pi(\theta > t \mid \theta > 0) = e^{-\lambda t}$ for all $t > 0$. It is assumed that the time of disorder $\theta$ is unknown (i.e. it cannot be observed directly).
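To make the observation scheme concrete, it is easy to simulate. The sketch below is an illustration of ours (not from the paper): it draws the disorder time $\theta$, generates the jumps of $X$ on a fine grid by thinning, and tracks the a posteriori probability $P_\pi(\theta \le t \mid F^X_t)$ by an Euler step of its filtering equation, which appears as (2.11) below; between jumps of $X$ the posterior drifts with rate $\lambda(1-\pi) - (\lambda_1-\lambda_0)\pi(1-\pi)$, and at a jump it moves to $\lambda_1\pi/(\lambda_0 + (\lambda_1-\lambda_0)\pi)$. The function name and all parameter values are assumptions chosen for illustration only.

```python
import random

def simulate_posterior(lam0=2.0, lam1=4.0, lam=1.0, pi0=0.0,
                       T=10.0, dt=1e-3, seed=1):
    """Euler sketch of the filtering equation (2.11): between jumps of X,
    d(pi) = [lam*(1 - pi) - (lam1 - lam0)*pi*(1 - pi)] dt, while at a jump
    of X the posterior moves to lam1*pi / (lam0 + (lam1 - lam0)*pi)."""
    rng = random.Random(seed)
    # disorder time theta: 0 with probability pi0, otherwise Exp(lam)
    theta = 0.0 if rng.random() < pi0 else rng.expovariate(lam)
    pi, t, path = pi0, 0.0, []
    while t < T:
        intensity = lam1 if t >= theta else lam0   # true current intensity
        if rng.random() < intensity * dt:          # a jump of X on [t, t+dt)
            pi = lam1 * pi / (lam0 + (lam1 - lam0) * pi)
        else:                                      # deterministic drift step
            pi += (lam * (1 - pi) - (lam1 - lam0) * pi * (1 - pi)) * dt
        pi = min(max(pi, 0.0), 1.0)
        path.append((t, pi))
        t += dt
    return theta, path
```

With $\lambda_1 > \lambda_0$ a simulated trajectory typically rises towards $1$ after $\theta$, since every jump of $X$ then pushes the posterior upwards.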
The Poisson disorder problem seeks to find a stopping time $\tau_*$ of $X$ that is as close as possible to $\theta$ as a solution of the following optimal stopping problem:

(2.5) $V(\pi) = \inf_\tau \big( P_\pi(\tau < \theta) + c\, E_\pi(\tau - \theta)^+ \big)$

where $P_\pi(\tau < \theta)$ is interpreted as the probability of a false alarm, $E_\pi(\tau-\theta)^+$ is interpreted as the average delay in detecting the occurrence of disorder correctly, $c > 0$ is a given constant, and the infimum in (2.5) is taken over all stopping times $\tau$ of $X$. [A stopping time of $X$ means a stopping time with respect to the natural filtration $(F^X_t)_{t \ge 0}$ generated by $X$. The same terminology will be used for other processes in the sequel as well.]

2. Introducing the a posteriori probability process

(2.6) $\pi_t = P_\pi(\theta \le t \mid F^X_t)$

for $t \ge 0$, it is easily seen that $P_\pi(\tau < \theta) = E_\pi(1 - \pi_\tau)$ and $E_\pi(\tau - \theta)^+ = E_\pi \int_0^\tau \pi_t\, dt$ for all stopping times $\tau$ of $X$, so that (2.5) can be rewritten as follows:

(2.7) $V(\pi) = \inf_\tau E_\pi\Big( (1-\pi_\tau) + c \int_0^\tau \pi_t\, dt \Big)$

where the infimum is taken over all stopping times $\tau$ of $(\pi_t)_{t \ge 0}$ (as shown following (2.12) below). Defining the likelihood ratio process

(2.8) $\varphi_t = \dfrac{\pi_t}{1-\pi_t}$

it is possible to verify by standard means that the following explicit expression is valid:

(2.9) $\varphi_t = e^{\lambda t}\, e^{X_t \log(\lambda_1/\lambda_0) - (\lambda_1-\lambda_0)t} \Big( \varphi_0 + \lambda \int_0^t e^{-\lambda s}\, e^{-X_s \log(\lambda_1/\lambda_0) + (\lambda_1-\lambda_0)s}\, ds \Big)$

for $t \ge 0$. Hence by Itô's formula (see e.g. [3]) one finds that the processes $(\varphi_t)_{t \ge 0}$ and $(\pi_t)_{t \ge 0}$ solve the following stochastic equations respectively:

(2.10) $d\varphi_t = \lambda(1+\varphi_t)\, dt + \dfrac{\lambda_1-\lambda_0}{\lambda_0}\, \varphi_{t-}\, (dX_t - \lambda_0\, dt)$

(2.11) $d\pi_t = \lambda(1-\pi_t)\, dt + \dfrac{(\lambda_1-\lambda_0)\, \pi_{t-}(1-\pi_{t-})}{\lambda_1 \pi_{t-} + \lambda_0(1-\pi_{t-})}\, \big( dX_t - (\lambda_1 \pi_{t-} + \lambda_0(1-\pi_{t-}))\, dt \big)$

(cf. [2, page 73] or [4, page 37]). It follows that $(\varphi_t)_{t \ge 0}$ and $(\pi_t)_{t \ge 0}$ are time-homogeneous (strong) Markov processes under $P_\pi$ with respect to their natural filtrations, which clearly coincide with $(F^X_t)_{t \ge 0}$ respectively. Thus, the infimum in (2.7) may indeed be viewed as taken over all stopping times of $(\pi_t)_{t \ge 0}$, and the optimal stopping problem (2.7) falls into the class of optimal stopping problems for Markov processes. We thus proceed by finding the infinitesimal operator of the Markov process $(\pi_t)_{t \ge 0}$. By Itô's formula, upon making use of the easily verified fact (see (2.14) below) that the innovation process $\hat{X}_t = X_t - \int_0^t E_\pi(\lambda_1 I(L_s=1) + \lambda_0 I(L_s=0) \mid F^X_s)\, ds = X_t - \int_0^t (\lambda_1 \pi_s + \lambda_0(1-\pi_s))\, ds$ is a martingale under $P_\pi$ with respect to $(F^X_t)_{t \ge 0}$, it follows from (2.11) that the infinitesimal operator of $(\pi_t)_{t \ge 0}$ acts on $f \in C^1[0,1]$ according to the following rule:

(2.12) $(\mathbb{L}f)(\pi) = (\lambda - (\lambda_1-\lambda_0)\pi)(1-\pi)\, f'(\pi) + (\lambda_1 \pi + \lambda_0(1-\pi)) \Big( f\Big( \dfrac{\lambda_1 \pi}{\lambda_1 \pi + \lambda_0(1-\pi)} \Big) - f(\pi) \Big).$

It may be noted that the equations (2.10)-(2.12) for $\lambda = 0$ reduce to the analogous equations in [6].

3. We may assume that for each $r \ge 0$ a probability measure $Q^r$ is defined on $(\Omega, F)$ such that $Q^r(\theta = r) = 1$. Thus, under $Q^r$ the observed process $X = (X_t)_{t \ge 0}$ is given by

(2.13) $X_t = \int_0^t I(s \le r)\, dN^0_s + \int_0^t I(s > r)\, dN^1_s$

for all $t \ge 0$, where $r \ge 0$. It follows that $P_\pi$ admits the following decomposition:

(2.14) $P_\pi = \pi\, Q^0 + (1-\pi) \int_0^\infty \lambda e^{-\lambda r}\, Q^r\, dr$

which appears to be an elegant tool, for instance, to check that the innovation process $(\hat{X}_t)_{t \ge 0}$ defined above is a martingale under $P_\pi$. Moreover, using (2.14) it is straightforwardly verified that the following facts are valid:

(2.15) The map $\pi \mapsto V(\pi)$ is concave (continuous) and decreasing on $[0,1]$;
(2.16) The stopping time $\tau_* = \inf\{ t \ge 0 \mid \pi_t \ge B_* \}$ is optimal in the problem (2.7), where $B_*$ is the smallest $\pi$ from $[0,1]$ satisfying $V(\pi) = 1-\pi$.

Thus $V(\pi) < 1-\pi$ for all $\pi \in [0, B_*)$ and $V(\pi) = 1-\pi$ for all $\pi \in [B_*, 1]$. It should be noted in (2.16) that $\pi_t = \varphi_t/(1+\varphi_t)$, and hence by (2.9) we see that $\pi_t$ is a (path-dependent) functional of the process $X$ observed up to time $t$. Thus, by observing a trajectory of $X$ it is possible to decide when to stop in accordance with the rule $\tau_*$ given in (2.16). The question arises, however, how to determine the optimal threshold $B_*$ in terms of the four parameters $\lambda_0, \lambda_1, \lambda, c$, as well as how to compute the value $V(\pi)$ for $\pi \in [0, B_*)$ (especially for $\pi = 0$). We tackle these questions by forming a free-boundary problem.

3. A free-boundary problem

1. Being aided by the general (optimal stopping) theory of Markov processes (see e.g. [8]), and making use of the preceding facts, we are naturally led to formulate the following free-boundary problem for $\pi \mapsto V(\pi)$ and $B_*$ defined above:

(3.1) $(\mathbb{L}V)(\pi) = -c\pi$ $\quad (0 < \pi < B_*)$
(3.2) $V(\pi) = 1-\pi$ $\quad (B_* \le \pi \le 1)$
(3.3) $V(B_*-) = 1-B_*$ (continuous fit).

In some cases (specified below) the following condition will be satisfied as well:

(3.4) $V'(B_*-) = -1$ (smooth fit).

However, we will also see below that this condition may fail. Finally, it is easily verified by passing to the limit for $\pi \downarrow 0$ that each continuous solution $\pi \mapsto V(\pi)$ of the system (3.1)+(3.2) must necessarily satisfy:

(3.5) $V'(0+) = 0$ (normal entrance)

whenever $V'(0+)$ is finite. This condition proves useful in the case $\lambda_1 < \lambda_0$. For a similar free-boundary differential-difference problem corresponding to the case $\lambda = 0$ above we refer to [6].

2. Solving the free-boundary problem. It turns out that the case $\lambda_1 < \lambda_0$ is much different from the case $\lambda_1 > \lambda_0$. Thus assume first that $\lambda_1 > \lambda_0$ and consider the equation (3.1) on $(0, B]$ for some $0 < B < 1$ given and fixed. Introduce the step function

(3.6) $S(\pi) = \dfrac{\lambda_1 \pi}{\lambda_1 \pi + \lambda_0(1-\pi)}$

for $0 \le \pi \le B$. Observe that $S(\pi) > \pi$ for all $0 < \pi < 1$, and find points $\ldots < B_2 < B_1 < B_0 := B$ such that $S(B_n) = B_{n-1}$ for $n \ge 1$. It is easily verified that

(3.7) $B_n = \dfrac{\lambda_0^n B}{\lambda_0^n B + \lambda_1^n (1-B)}$ $\quad (n = 0, 1, \ldots)$.

Denote $I_n = (B_n, B_{n-1}]$ for $n \ge 1$, and introduce the distance function

(3.8) $d(\pi, B) = 1 + \left[ \dfrac{\log\big( B(1-\pi)/(\pi(1-B)) \big)}{\log(\lambda_1/\lambda_0)} \right]$

for $0 < \pi \le B$, where $[x]$ denotes the integer part of $x$. Observe that $d$ is defined to satisfy:

(3.9) $\pi \in I_n \iff d(\pi, B) = n$

for all $0 < \pi \le B$. Now consider the equation (3.1) first on $I_1$ upon setting $V(\pi) = 1-\pi$ for $\pi \in (B, S(B)]$. This is then a first-order linear differential equation which can be solved explicitly. Imposing a continuity condition at $B$ (which is in agreement with (3.3) above) we obtain a unique solution $\pi \mapsto V(\pi; B)$ on $I_1$. It is possible to verify that the following formula holds:

(3.10) $V(\pi; B) = c_1(B)\, V_g(\pi) + V_{p,1}(\pi; B)$ $\quad (\pi \in I_1)$

where $\pi \mapsto V_{p,1}(\pi; B)$ is a (bounded) particular solution of the non-homogeneous equation in (3.1):

(3.11) $V_{p,1}(\pi; B) = \dfrac{\lambda_0 \lambda_1 + \lambda c}{\lambda_1(\lambda_0+\lambda)} - \dfrac{\lambda_0(\lambda_1 - c)}{\lambda_1(\lambda_0+\lambda)}\, \pi$

and $\pi \mapsto V_g(\pi)$ is a general solution of the homogeneous equation in (3.1):

(3.12) $V_g(\pi) = \begin{cases} (1-\pi)^{\lambda_1/(\lambda_1-\lambda_0-\lambda)}\, \big| \lambda - (\lambda_1-\lambda_0)\pi \big|^{-(\lambda_0+\lambda)/(\lambda_1-\lambda_0-\lambda)}, & \text{if } \lambda \ne \lambda_1-\lambda_0 \\ (1-\pi)\, \exp\big( \lambda_1/((\lambda_1-\lambda_0)(1-\pi)) \big), & \text{if } \lambda = \lambda_1-\lambda_0 \end{cases}$

and the constant $c_1(B)$ is determined by the continuity condition $V(B; B) = 1-B$, leading to

(3.13) $c_1(B) = \dfrac{1}{V_g(B)} \left( \dfrac{\lambda(\lambda_1-c)}{\lambda_1(\lambda_0+\lambda)} - \dfrac{\lambda\lambda_1 + \lambda_0 c}{\lambda_1(\lambda_0+\lambda)}\, B \right)$

where $V_g(B)$ is obtained by replacing $\pi$ in (3.12) by $B$. [We see from (3.10)-(3.13), however, that the continuity condition at $B$ cannot be met when $B$ equals $\hat{B}$ from (3.16) below, unless $\hat{B}$ equals $\lambda(\lambda_1-c)/(\lambda\lambda_1+\lambda_0 c)$ from (4.5) below (the latter being equivalent to $c = \lambda_1-\lambda_0-\lambda$). Thus, if $B = \hat{B} \ne \lambda(\lambda_1-c)/(\lambda\lambda_1+\lambda_0 c)$ then there is no solution $\pi \mapsto V(\pi; B)$ on $I_1$ that satisfies $V(\pi; B) = 1-\pi$ for $\pi \in (B, S(B)]$ and is continuous at $B$. It turns out, however, that this analytic fact has no significant implication for the solution of (2.7).]

Next consider the equation (3.1) on $I_2$ upon using the solution found on $I_1$ and setting $V(\pi) = c_1(B)\, V_g(\pi) + V_{p,1}(\pi; B)$ for $\pi \in (B_1, S(B_1)]$. This is then again a first-order linear differential equation which can be solved explicitly. Imposing a continuity condition over $I_2 \cup I_1$ at $B_1$ (which is in agreement with (2.15) above) we obtain a unique solution $\pi \mapsto V(\pi; B)$ on $I_2$. It turns out, however, that the general solution of this equation cannot be expressed in terms of elementary functions (unless $\lambda = 0$ as shown in [6]); one needs, for instance, the Gauss hypergeometric function. As these expressions are increasingly complex to record, we omit the explicit formulas in the sequel. Continuing the preceding procedure by induction as long as possible (considering the equation (3.1) on $I_n$ upon using the solution found on $I_{n-1}$ and imposing a continuity condition over $I_n \cup I_{n-1}$ at $B_{n-1}$) we obtain a unique solution $\pi \mapsto V(\pi; B)$ on $I_n$ given as

(3.14) $V(\pi; B) = c_n(B)\, V_g(\pi) + V_{p,n}(\pi; B)$ $\quad (\pi \in I_n)$

where $\pi \mapsto V_{p,n}(\pi; B)$ is a (bounded) particular solution, $\pi \mapsto V_g(\pi)$ is a general solution given by (3.12), and $B \mapsto c_n(B)$ is a function of $B$ (and the four parameters). [We will see in Theorem 4.1 below that in the case $B > \hat{B} > 0$ with $\hat{B}$ from (3.16) below, the solution (3.14) exists for $\pi \in (\hat{B}, B]$ but explodes at $\hat{B}$ unless $B = B_*$.]

The key difference in the case $\lambda_1 < \lambda_0$ is that $S(\pi) < \pi$ for all $0 < \pi < 1$, so that we need to deal with points $B := B_0 < B_1 < B_2 < \ldots$ such that $S(B_n) = B_{n-1}$ for $n \ge 1$.
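The inductive construction over the intervals $I_n$ can also be carried out numerically, which avoids the hypergeometric expressions altogether. The sketch below is our illustration (not from the paper; parameter values assumed): it integrates equation (3.1) backwards from $B$ with a simple Euler step, using $V(\pi) = 1-\pi$ above $B$ and already computed values at the shifted point $S(\pi)$, so the continuity conditions at the points $B_n$ hold automatically. Note that for $B$ above $\hat{B} = \lambda/(\lambda_1-\lambda_0)$ the drift coefficient in the denominator vanishes, which is the singularity discussed in Section 3.3.

```python
def solve_V(B, lam0=2.0, lam1=4.0, lam=1.0, c=2.0, n=4000):
    """Backward Euler sketch for the free-boundary equation (3.1):
    (lam - (lam1-lam0)*pi)*(1-pi)*V'(pi)
        + (lam1*pi + lam0*(1-pi))*(V(S(pi)) - V(pi)) = -c*pi,
    with V(pi) = 1 - pi imposed for pi >= B (continuous fit at B)."""
    S = lambda p: lam1 * p / (lam0 + (lam1 - lam0) * p)   # jump map (3.6)
    h = B / n
    grid = [i * h for i in range(n + 1)]                  # grid on [0, B]
    V = {grid[n]: 1.0 - B}                                # continuity at B

    def V_at(p):                                          # lookup of V at S(pi)
        if p >= B:
            return 1.0 - p                                # stopping region
        i = min(int(p / h) + 1, n)
        return V[grid[i]]                                 # already computed

    for i in range(n, 0, -1):
        p = grid[i]
        drift = (lam - (lam1 - lam0) * p) * (1.0 - p)
        jump = (lam1 * p + lam0 * (1.0 - p)) * (V_at(S(p)) - V[grid[i]])
        dV = (-c * p - jump) / drift                      # V'(p) from (3.1)
        V[grid[i - 1]] = V[grid[i]] - h * dV
    return grid, [V[g] for g in grid]
```

For the assumed values $\lambda_1 = 4$, $\lambda_0 = 2$, $\lambda = 1$, $c = 2$ and $B = \lambda/(\lambda+c) = 1/3$, the very first computed slope at $B$ equals $-1$, in line with the smooth-fit condition (3.4).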
Then the facts (3.7)-(3.9) remain preserved provided that we set $I_n = [B_{n-1}, B_n)$ for $n \ge 1$. In order to prescribe the initial condition when considering the equation (3.1) on $I_1$, we can take $B = \varepsilon > 0$ small and make use of (3.5) upon setting $V(\pi) = v$ for all $\pi \in [S(B), B)$, where $v \in (0,1)$ is a given number satisfying $V(B) = v$. Proceeding by induction as earlier (considering the equation (3.1) on $I_n$ upon using the solution found on $I_{n-1}$ and imposing a continuity condition over $I_{n-1} \cup I_n$ at $B_{n-1}$) we obtain a unique solution $\pi \mapsto V(\pi; \varepsilon, v)$ on $I_n$ given as

(3.15) $V(\pi; \varepsilon, v) = c_n(\varepsilon)\, V_g(\pi) + V_{p,n}(\pi; \varepsilon, v)$ $\quad (\pi \in I_n)$

where $\pi \mapsto V_{p,n}(\pi; \varepsilon, v)$ is a particular solution, $\pi \mapsto V_g(\pi)$ is a general solution given by (3.12), and $\varepsilon \mapsto c_n(\varepsilon)$ is a function of $\varepsilon$ (and the four parameters). We shall see in Theorem 4.1 below how these solutions can be used to determine the optimal $\pi \mapsto V(\pi)$ and $B_*$.

3. Two key facts about the solution. Both of these facts hold only in the case $\lambda_1 > \lambda_0$. The first fact to be observed is that

(3.16) $\hat{B} = \dfrac{\lambda}{\lambda_1 - \lambda_0}$

is a singularity point of the equation (3.1) whenever $\lambda < \lambda_1 - \lambda_0$. This is clearly seen from (3.12)

where $V_g(\pi) \to \infty$ for $\pi \to \hat{B}$. The second fact of interest is that

(3.17) $\tilde{B} = \dfrac{\lambda}{\lambda + c}$

is a smooth-fit point of the system (3.1)-(3.3) whenever $\lambda_1 > \lambda_0$ and $c \ne \lambda_1-\lambda_0-\lambda$, i.e. $V'(\tilde{B}; \tilde{B}) = -1$ in the notation of (3.14) above. This can be verified by (3.10) using (3.11)-(3.13). It means that $\tilde{B}$ is the unique point which, in addition to (3.1)-(3.3), has the power of satisfying the smooth-fit condition (3.4). It may also be noted in the verification above that the equation $V'(B; B) = -1$ has no solution when $c = \lambda_1-\lambda_0-\lambda$, as the only candidate $\bar{B} := \tilde{B} = \hat{B}$ satisfies:

(3.18) $V'(\bar{B}; \bar{B}) = -\dfrac{\lambda_0}{\lambda_1}$.

This identity follows readily from (3.10)-(3.13) upon noticing that $c_1(\bar{B}) = 0$. Thus, when $c$ runs from $+\infty$ to $\lambda_1-\lambda_0-\lambda$, the smooth-fit point $\tilde{B}$ runs from $0$ to the singularity point $\hat{B}$, and once $\tilde{B}$ has reached $\hat{B}$ for $c = \lambda_1-\lambda_0-\lambda$, the smooth-fit condition (3.4) breaks down and gets replaced by the condition (3.18) above. We will attest below that in all these cases the smooth-fit point $\tilde{B}$ is actually equal to the optimal-stopping point $B_*$ from (2.16) above. Observe that the equation (3.1) has no singularity points when $\lambda_1 < \lambda_0$. This analytic fact reveals a key difference between the two cases.

4. Conclusions

In parallel to the two analytic properties displayed above we begin this section by stating the relevant probabilistic properties of the a posteriori probability process.

1. Sample-path properties of $(\pi_t)_{t \ge 0}$. First consider the case $\lambda_1 > \lambda_0$. Then from (2.11) we see that $(\pi_t)_{t \ge 0}$ can only jump towards $1$ (at times of the jumps of the process $X$). Moreover, the sign of the drift term $\lambda(1-\pi) - (\lambda_1-\lambda_0)\pi(1-\pi) = (\lambda_1-\lambda_0)(\hat{B}-\pi)(1-\pi)$ is determined by the sign of $\hat{B}-\pi$. Hence we see that $(\pi_t)_{t \ge 0}$ has a positive drift in $[0, \hat{B})$, a negative drift in $(\hat{B}, 1]$, and a zero drift at $\hat{B}$. Thus, if $(\pi_t)_{t \ge 0}$ starts or ends up at $\hat{B}$, it is trapped there until the first jump of the process $X$ occurs. At that time $(\pi_t)_{t \ge 0}$ finally leaves $\hat{B}$ by jumping towards $1$. This also shows that once $(\pi_t)_{t \ge 0}$ leaves $[0, \hat{B})$ it never comes back. The sample-path behaviour of $(\pi_t)_{t \ge 0}$ when $\lambda_1 > \lambda_0$ is depicted in Figure 1 (Part i) below.
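The drift factorization used above is easy to check numerically. The snippet below is our illustration (the values $\lambda_1 = 4$, $\lambda_0 = 2$, $\lambda = 1$ are those used in Figure 2 and are assumptions here): it verifies the sign of the drift of $(\pi_t)_{t \ge 0}$ around $\hat{B} = \lambda/(\lambda_1-\lambda_0)$, and that jumps of $X$ move the posterior towards $1$ when $\lambda_1 > \lambda_0$.

```python
lam0, lam1, lam = 2.0, 4.0, 1.0        # assumed values with lam1 > lam0
B_hat = lam / (lam1 - lam0)            # zero-drift (singularity) point (3.16)

def drift(p):
    # drift of (pi_t) from (2.11); equals (lam1-lam0)*(B_hat - p)*(1 - p)
    return lam * (1 - p) - (lam1 - lam0) * p * (1 - p)

def jump_to(p):
    # post-jump position of (pi_t) from (2.11); equals S(p) of (3.6)
    return lam1 * p / (lam0 + (lam1 - lam0) * p)
```

The factorized form also shows why the smooth-fit point $\tilde{B} = \lambda/(\lambda+c)$ meets $\hat{B}$ exactly at $c = \lambda_1-\lambda_0-\lambda$: with the assumed values both equal $1/2$ when $c = 1$.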
Next consider the case $\lambda_1 < \lambda_0$. Then from (2.11) we see that $(\pi_t)_{t \ge 0}$ can only jump towards $0$ (at times of the jumps of the process $X$). Moreover, the drift term $\lambda(1-\pi) - (\lambda_1-\lambda_0)\pi(1-\pi) = (\lambda + (\lambda_0-\lambda_1)\pi)(1-\pi)$ is always positive. Thus $(\pi_t)_{t \ge 0}$ always moves continuously towards $1$ and can only jump towards $0$. The sample-path behaviour of $(\pi_t)_{t \ge 0}$ when $\lambda_1 < \lambda_0$ is depicted in Figure 1 (Part ii) below.

2. Sample-path behaviour and the principles of smooth and continuous fit. With a view to (2.16), and taking $0 < B < 1$ given and fixed, we shall now examine the manner in which the process $(\pi_t)_{t \ge 0}$ enters $[B, 1]$ if starting at $B-d$, where $d > 0$ is infinitesimally small. Our previous analysis then shows the following (see Figure 1 below).

If $\lambda_1 > \lambda_0$ and $B < \hat{B}$, or $\lambda_1 < \lambda_0$, then $(\pi_t)_{t \ge 0}$ enters $[B, 1]$ by passing through $B$ continuously. If, however, $\lambda_1 > \lambda_0$ and $B > \hat{B}$, then the only way for $(\pi_t)_{t \ge 0}$ to enter $[B, 1]$ is by jumping over $B$. (Jumping exactly at $B$ happens with probability zero.) The case $\lambda_1 > \lambda_0$ and $B = \hat{B}$ is special. If starting outside $[B, 1]$ then $(\pi_t)_{t \ge 0}$ travels towards $\hat{B}$ by either moving continuously or by jumping. However, the closer $(\pi_t)_{t \ge 0}$ gets to $\hat{B}$, the smaller the drift to the right becomes, and if there were no eventual jump over $\hat{B}$, the process $(\pi_t)_{t \ge 0}$ would never reach $\hat{B}$, as the drift to the right tends to zero together with the distance of $(\pi_t)_{t \ge 0}$ to $\hat{B}$. This fact can be formally verified by analysing the explicit representation of $(\varphi_t)_{t \ge 0}$ in (2.9) and using that $\pi_t = \varphi_t/(1+\varphi_t)$ for $t \ge 0$. Thus, in this case as well, the only way for $(\pi_t)_{t \ge 0}$ to enter $[\hat{B}, 1]$ after starting at $\hat{B}-d$ is by jumping over into $(\hat{B}, 1]$.

We will demonstrate below that the sample-path behaviour of the process $(\pi_t)_{t \ge 0}$ during the entrance of $[B_*, 1]$ has a precise analytic counterpart in terms of the free-boundary problem (3.1)-(3.3). If the process $(\pi_t)_{t \ge 0}$ may enter $[B_*, 1]$ by passing through $B_*$ continuously, then the smooth-fit condition (3.4) holds at $B_*$; if, however, the process $(\pi_t)_{t \ge 0}$ enters $[B_*, 1]$ exclusively by jumping over $B_*$, then the smooth-fit condition (3.4) breaks down. In this case the continuous-fit condition (3.3) still holds at $B_*$, and the existence of a singularity point $\hat{B}$ can be used to determine the optimal $B_*$ as shown below.

3. The preceding considerations may now be summarized as follows.

Theorem 4.1. Consider the Poisson disorder problem (2.5) and the equivalent optimal-stopping problem (2.7), where the process $(\pi_t)_{t \ge 0}$ from (2.6) solves (2.11) and $\lambda_0, \lambda_1, \lambda, c > 0$ are given and fixed. Then there exists $B_* \in (0, 1)$ such that the stopping time

(4.1) $\tau_* = \inf\{ t \ge 0 \mid \pi_t \ge B_* \}$

is optimal in (2.5) and (2.7). Moreover, the optimal cost function $\pi \mapsto$
$V(\pi)$ from (2.7) solves the free-boundary problem (3.1)-(3.3), and the optimal threshold $B_*$ is determined as follows.

(i): If $\lambda_1 > \lambda_0$ and $c > \lambda_1-\lambda_0-\lambda$, then the smooth-fit condition (3.4) holds at $B_*$, and the following explicit formula is valid (cf. [2] and [1]):

(4.2) $B_* = \dfrac{\lambda}{\lambda + c}$.

In this case $B_* < \hat{B}$, where $\hat{B}$ is the singularity point of the free-boundary equation (3.1) given in (3.16) above (see Figure 2 below).

(ii): If $\lambda_1 > \lambda_0$ and $c = \lambda_1-\lambda_0-\lambda$, then the smooth-fit condition breaks down at $B_*$ and gets replaced by the condition (3.18) above ($V'(B_*-) = -\lambda_0/\lambda_1$). The optimal threshold $B_*$ is still given by (4.2), and in this case $B_* = \hat{B}$ (see Figure 3 below).

(iii): If $\lambda_1 > \lambda_0$ and $c < \lambda_1-\lambda_0-\lambda$, then the smooth-fit condition does not hold at $B_*$, and the optimal threshold $B_*$ is determined as the unique solution in $(\hat{B}, 1)$ of the following equation:

(4.3) $c_{d(\hat{B}, B_*)}(B_*) = 0$

where the map $B \mapsto d(\hat{B}, B)$ is defined by (3.8), and the map $B \mapsto c_n(B)$ is defined by (3.13) and (3.14) above (see Figure 4 below). In particular, when $c$ satisfies:

(4.4) $\dfrac{\lambda_0 \lambda_1 (\lambda_1-\lambda_0-\lambda)}{\lambda_0 \lambda_1 + (\lambda_1-\lambda_0)(\lambda_0+\lambda)} \le c < \lambda_1-\lambda_0-\lambda$

then the following explicit formula is valid:

(4.5) $B_* = \dfrac{\lambda(\lambda_1 - c)}{\lambda \lambda_1 + \lambda_0 c}$

which in the case $c = \lambda_1-\lambda_0-\lambda$ reduces again to (4.2) above. In the cases (i)-(iii) the optimal cost function $\pi \mapsto V(\pi)$ from (2.7) is given by (3.14) with $B_*$ in place of $B$ for all $0 \le \pi < B_*$ (with $V(0) = V(0+)$), and $V(\pi) = 1-\pi$ for $\pi \ge B_*$.

(iv): If $\lambda_1 < \lambda_0$ then the smooth-fit condition holds at $B_*$, and the optimal threshold $B_*$ can be determined using the normal entrance condition (3.5) as follows (see Figure 5). For $\varepsilon > 0$ small let $v_\varepsilon$ denote the unique number in $(0,1)$ for which the map $\pi \mapsto V(\pi; \varepsilon, v_\varepsilon)$ from (3.15) hits the map $\pi \mapsto 1-\pi$ smoothly at some $B_*^\varepsilon$ from $(0,1)$. Then we have:

(4.6) $B_* = \lim_{\varepsilon \downarrow 0} B_*^\varepsilon$
(4.7) $V(\pi) = \lim_{\varepsilon \downarrow 0} V(\pi; \varepsilon, v_\varepsilon)$

for all $0 \le \pi < B_*$ (with $V(0) = V(0+)$), and $V(\pi) = 1-\pi$ for $\pi \ge B_*$.

Proof. We have already established in (2.16) above that $\tau_*$ from (4.1) is optimal in (2.5) and (2.7) for some $B_* \in [0,1]$ to be found. It thus follows by the strong Markov property of the process $(\pi_t)_{t \ge 0}$ together with (2.15) above that the optimal cost function $\pi \mapsto V(\pi)$ from (2.7) solves the free-boundary problem (3.1)-(3.3). Some of these facts will also be reproved below.

First consider the case $\lambda_1 > \lambda_0$. In Section 3.2 above it was shown that for each given and fixed $B \in (0, \hat{B})$ the problem (3.1)-(3.3) with $B$ in place of $B_*$ has a unique continuous solution given by the formula (3.14). Moreover, this solution is (at least) $C^1$ everywhere but possibly at $B$, where it is (at least) $C^0$. As explained following (3.13) above, these facts also hold for $B = \hat{B}$ when $\hat{B}$ equals $\lambda(\lambda_1-c)/(\lambda\lambda_1+\lambda_0 c)$ from (4.5) above. We will now show how the optimal threshold $B_*$ is determined among all these candidates $B$ when $c \ge \lambda_1-\lambda_0-\lambda$.

(i)+(ii): Since the innovation process $\hat{X}_t = X_t - \int_0^t (\lambda_1 \pi_s + \lambda_0(1-\pi_s))\, ds$ is a martingale under $P_\pi$ with respect to $(F^X_t)_{t \ge 0}$, it follows by (2.11)
that

(4.8) $\pi_t = \pi + \lambda \int_0^t (1-\pi_s)\, ds + M_t$

where $M = (M_t)_{t \ge 0}$ is a martingale under $P_\pi$ with respect to $(F^X_t)_{t \ge 0}$. Hence by the optional sampling theorem we easily find:

(4.9) $E_\pi\Big( (1-\pi_\tau) + c \int_0^\tau \pi_t\, dt \Big) = (1-\pi) + (\lambda+c)\, E_\pi \int_0^\tau \Big( \pi_t - \dfrac{\lambda}{\lambda+c} \Big)\, dt$

for all stopping times $\tau$ of $(\pi_t)_{t \ge 0}$. Recalling the sample-path behaviour of $(\pi_t)_{t \ge 0}$ in the case $\lambda_1 > \lambda_0$ as displayed in Section 4.1 above (cf. Figure 1 (Part i)), and the definition of $V(\pi)$ in (2.7) together with the fact that $\tilde{B} = \lambda/(\lambda+c) \le \hat{B}$ when $c \ge \lambda_1-\lambda_0-\lambda$, we clearly see from (4.9) that it is never optimal to stop $(\pi_t)_{t \ge 0}$ in $[0, \tilde{B})$, as well as that $(\pi_t)_{t \ge 0}$ must be stopped immediately after entering $[\tilde{B}, 1]$, as it will never return to the favourable region $[0, \tilde{B})$ again. This proves that $\tilde{B}$ equals the optimal threshold $B_*$, i.e. that $\tau_*$ from (4.1) with $B_*$ from (4.2) is optimal in (2.5) and (2.7). The claim about the breakdown of the smooth-fit condition (3.4) when $c = \lambda_1-\lambda_0-\lambda$ has already been established in the paragraph containing (3.18) above (cf. Figure 3). The general answer (4.2) has been obtained in [1].

(iii): It was shown in Section 3.2 above that for each given and fixed $B \in (\hat{B}, 1)$ the problem (3.1)-(3.3) with $B$ in place of $B_*$ has a unique continuous solution on $(\hat{B}, 1]$ given by the formula (3.14). We will now show that there exists a unique point $B_* \in (\hat{B}, 1)$ such that $\lim_{\pi \downarrow \hat{B}} V(\pi; B) = \pm\infty$ if $B \in (\hat{B}, B_*) \cup (B_*, 1)$, while $\lim_{\pi \downarrow \hat{B}} V(\pi; B_*)$ is finite. This point is the optimal threshold, i.e. the stopping time $\tau_*$ from (4.1) is optimal in (2.5) and (2.7), and the point $B_*$ can be characterized as the unique solution of the equation (4.3) in $(\hat{B}, 1)$.

In order to verify the preceding claims we will first state the following observation which proves useful. Setting $g(\pi) = 1-\pi$ for $0 < \pi < 1$ we have:

(4.10) $(\mathbb{L}g)(\pi) \ge -c\pi \iff \pi \ge \tilde{B}$

where $\tilde{B}$ is given in (3.17). This is verified straightforwardly using (2.12). Now since $\hat{B}$ is a singularity point of the equation (3.1) (recall our discussion in Section 3.3 above), and moreover $\pi \mapsto V(\pi)$ from (2.7) solves (3.1)-(3.3), we see that the optimal threshold $B_*$ from (2.16) must satisfy (4.3). This is due to the fact that the particular solution $\pi \mapsto V_{p,n}(\pi; B_*)$ for $n = d(\hat{B}, B_*)$ in (3.14) above is taken bounded.
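The observation (4.10) can be spot-checked directly from (2.12): for the stopping payoff $g(\pi) = 1-\pi$ the generator evaluates in closed form to $-\lambda(1-\pi)$, so $(\mathbb{L}g)(\pi) + c\pi$ changes sign exactly at $\tilde{B} = \lambda/(\lambda+c)$. The snippet below is our illustration of this (parameter values assumed).

```python
lam0, lam1, lam, c = 2.0, 4.0, 1.0, 2.0   # assumed parameter values

def S(p):                                  # post-jump position (3.6)
    return lam1 * p / (lam0 + (lam1 - lam0) * p)

def IL_g(p):
    """Generator (2.12) applied to g(pi) = 1 - pi (so g'(pi) = -1)."""
    drift_term = (lam - (lam1 - lam0) * p) * (1.0 - p) * (-1.0)
    jump_term = (lam1 * p + lam0 * (1.0 - p)) * ((1.0 - S(p)) - (1.0 - p))
    return drift_term + jump_term

B_tilde = lam / (lam + c)                  # smooth-fit point (3.17)
```

The two terms of the generator collapse because the jump term contributes exactly $-(\lambda_1-\lambda_0)\pi(1-\pi)$, cancelling the corresponding part of the drift term.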
The key remaining fact to be established is that there cannot be two (or more) points in $(\hat{B}, 1)$ satisfying (4.3). Assume on the contrary that there are two such points $B_1$ and $B_2$. We may assume that both $B_1$ and $B_2$ are larger than $\tilde{B}$, since for $B \in (\hat{B}, \tilde{B}]$ the solution $\pi \mapsto V(\pi; B)$ is ruled out by the fact that $V(\pi; B) > 1-\pi$ for $\pi \in (B-\delta, B)$ with $\delta > 0$ small. This fact is verified directly using (3.10)-(3.13). Thus, each map $\pi \mapsto V(\pi; B_i)$ solves (3.1)-(3.3) on $(0, B_i]$ and is continuous (bounded) at $\hat{B}$ for $i = 1, 2$. Since $S(\pi) > \pi$ for all $0 < \pi < 1$ when $\lambda_1 > \lambda_0$, it follows easily from (2.12) that each solution $\pi \mapsto V(\pi; B_i)$ of (3.1)-(3.3) must also satisfy $0 < V(0+; B_i) < +\infty$ for $i = 1, 2$.

In order to make use of the preceding fact we shall set $h_K(\pi) = (1-\pi) + K(\hat{B}-\pi)$ for $0 \le \pi \le \hat{B}$ and $h_K(\pi) = 1-\pi$ for $\hat{B} \le \pi \le 1$. Since both maps $\pi \mapsto V(\pi; B_i)$ are bounded on $(0, \hat{B})$, we can fix $K > 0$ large enough so that $V(\pi; B_i) \le h_K(\pi)$ for all $\pi < \hat{B}$ and $i = 1, 2$. Consider then the auxiliary optimal stopping problem:

(4.11) $W(\pi) := \inf_\tau E_\pi\Big( h_K(\pi_\tau) + c \int_0^\tau \pi_t\, dt \Big)$

where the infimum is taken over all stopping times $\tau$ of $(\pi_t)_{t \ge 0}$. Extend the map $\pi \mapsto V(\pi; B_i)$ to $[B_i, 1]$ by setting $V(\pi; B_i) = 1-\pi$ for $\pi \ge B_i$, and denote the resulting (continuous) map on $(0,1]$ by $\pi \mapsto V_i(\pi)$ for $i = 1, 2$. Then $\pi \mapsto V_i(\pi)$ satisfies (3.1)-(3.3), and since

$B_i \ge \tilde{B}$, we see by means of (4.10) that the following condition is also satisfied:

(4.12) $(\mathbb{L}V_i)(\pi) \ge -c\pi$ for $\pi \in [B_i, 1]$ and $i = 1, 2$.

We will now show that the preceding two facts have the power of implying that $V_i(\pi) = W(\pi)$ for all $\pi \in (0,1]$ with either $i \in \{1,2\}$ given and fixed. It follows by Itô's formula that

(4.13) $V_i(\pi_t) = V_i(\pi) + \int_0^t (\mathbb{L}V_i)(\pi_s)\, ds + M_t$

where $M = (M_t)_{t \ge 0}$ is a martingale (under $P_\pi$) given by

(4.14) $M_t = \int_0^t \big( V_i(S(\pi_{s-})) - V_i(\pi_{s-}) \big)\, d\hat{X}_s$

and $\hat{X}_t = X_t - \int_0^t (\lambda_1 \pi_s + \lambda_0(1-\pi_s))\, ds$ is the innovation process. By the optional sampling theorem it follows from (4.13), using (4.12) and the fact that $V_i(\pi) \le h_K(\pi)$ for all $\pi \in (0,1]$, that $V_i(\pi) \le W(\pi)$ for all $\pi \in (0,1]$. Moreover, defining $\tau_i = \inf\{ t \ge 0 \mid \pi_t \ge B_i \}$ it is easily seen by (4.8), for instance, that $E_\pi(\tau_i) < \infty$. Using then that $\pi \mapsto V_i(\pi)$ is bounded on $(0,1]$, it follows easily by the optional sampling theorem that $E_\pi(M_{\tau_i}) = 0$. Since moreover $V_i(\pi_{\tau_i}) = h_K(\pi_{\tau_i})$ and $(\mathbb{L}V_i)(\pi_s) = -c\pi_s$ for all $s < \tau_i$, we see from (4.13) that the inequality $V_i(\pi) \le W(\pi)$ derived above is actually an equality for all $\pi \in (0,1]$. This proves that $V(\pi; B_1) = V(\pi; B_2)$ for all $\pi \in (0,1]$, or in other words, that there cannot be more than one point $B_*$ in $(\hat{B}, 1)$ satisfying (4.3). Thus, there is only one solution $\pi \mapsto V(\pi)$ of (3.1)-(3.3) which is finite at $\hat{B}$ (see Figure 4 below), and the proof of the claim is complete.

(iv): It was shown in Section 3.2 above that the map $\pi \mapsto V(\pi; \varepsilon, v)$ from (3.15) is the unique continuous solution of the equation $(\mathbb{L}V)(\pi) = -c\pi$ for $\varepsilon < \pi < 1$ satisfying $V(\pi) = v$ for all $\pi \in [S(\varepsilon), \varepsilon]$. It can be checked using (3.12) that

(4.15) $V_{p,1}(\pi; \varepsilon, v) = v + \dfrac{c(\lambda + \lambda_0 \pi)}{\lambda_1(\lambda_0+\lambda)}$

(4.16) $c_1(\varepsilon) = -\dfrac{1}{V_g(\varepsilon)} \cdot \dfrac{c(\lambda + \lambda_0 \varepsilon)}{\lambda_1(\lambda_0+\lambda)}$

for $\pi \in I_1 = [\varepsilon, B_1)$, where $S(B_1) = \varepsilon$. Moreover, it may be noted directly from (2.12) above that $\mathbb{L}(f + \text{const}) = \mathbb{L}(f)$ for every constant, and thus $V(\pi; \varepsilon, v) = V(\pi; \varepsilon, 0) + v$ for all $\pi \in [S(\varepsilon), 1)$. Consequently, the two maps $\pi \mapsto V(\pi; \varepsilon, v')$ and $\pi \mapsto V(\pi; \varepsilon, v'')$ do not intersect in $[S(\varepsilon), 1)$ when $v'$ and $v''$ are different. Each map $\pi \mapsto V(\pi; \varepsilon, v)$ is concave on $[S(\varepsilon), 1)$.
This fact can be proved by a probabilistic argument using (2.14), upon considering the auxiliary optimal stopping problem (4.11) where the map $\pi \mapsto h_K(\pi)$ is replaced by the concave map $h_v(\pi) = v \wedge (1-\pi)$. [It is a matter of fact that $\pi \mapsto W(\pi)$ from (4.11) is concave on $[0,1]$ whenever $\pi \mapsto h(\pi)$ is so.] Moreover, using (3.12)+(4.15)+(4.16) in (3.15) with $n = 1$ it is possible to see that for $v$ close to $0$ we have $V(\pi; \varepsilon, v) < 0$ for some $\pi > \varepsilon$, and for $v$ close to $1$ we have $V(\pi; \varepsilon, v) > 1-\pi$ for

some $\pi > \varepsilon$ (see Figure 5 below). Thus a simple concavity argument implies the existence of a unique point $B_*^\varepsilon \in (0,1)$ at which $\pi \mapsto V(\pi; \varepsilon, v_\varepsilon)$ for some $v_\varepsilon \in (0,1)$ hits $\pi \mapsto 1-\pi$ smoothly. The key non-trivial point in the verification that $V(\pi; \varepsilon, v_\varepsilon)$ equals the value function $W(\pi)$ of the optimal stopping problem (4.11) with $\pi \mapsto h_{v_\varepsilon}(\pi)$ in place of $\pi \mapsto h_K(\pi)$ is to establish that $(\mathbb{L}V(\,\cdot\,; \varepsilon, v_\varepsilon))(\pi) \ge -c\pi$ for all $\pi \in (B_*^\varepsilon, S(B_*^\varepsilon))$. Since $B_*^\varepsilon$ is a smooth-fit point, however, this can be done using the same method which we applied in part 3 of the proof of Theorem 2.1 in [6]. Moreover, when $\varepsilon \downarrow 0$ then clearly (4.6) and (4.7) are valid (recall (2.15) and (3.15) above), and the proof of the theorem is complete.

Concluding the paper we would like to mention that the fixed false-alarm formulation of the Poisson disorder problem (cf. [8, page 25]) raises some new interesting questions not present in the Wiener process version of the same problem.

REFERENCES

[1] DAVIS, M. H. A. (1976). A note on the Poisson disorder problem. Banach Center Publ. 1 (65-72).
[2] GAL'CHUK, L. I. and ROZOVSKII, B. L. (1971). The disorder problem for a Poisson process. Theory Probab. Appl. 16 (712-716).
[3] JACOD, J. and SHIRYAEV, A. N. (1987). Limit Theorems for Stochastic Processes. Springer-Verlag, Berlin Heidelberg.
[4] LIPTSER, R. S. and SHIRYAYEV, A. N. (1978). Statistics of Random Processes II. Springer-Verlag, New York.
[5] MARCELLUS, R. L. (1990). A Markov renewal approach to the Poisson disorder problem. Comm. Statist. Stochastic Models 6 (213-228).
[6] PESKIR, G. and SHIRYAEV, A. N. (1998). Sequential testing problems for Poisson processes. Research Report No. 4, Dept. Theoret. Statist. Aarhus. Ann. Statist. 28 (2000) (837-859).
[7] SHIRYAEV, A. N. (1967). Two problems of sequential analysis. Cybernetics 3 (63-69).
[8] SHIRYAEV, A. N. (1978). Optimal Stopping Rules. Springer-Verlag, New York.
Goran Peskir
Department of Mathematical Sciences
University of Aarhus, Denmark
Ny Munkegade, DK-8000 Aarhus
home.imf.au.dk/goran
goran@imf.au.dk

Albert N. Shiryaev
Steklov Mathematical Institute
Gubkina str.
Moscow, Russia
shiryaev@mi.ras.ru

Figure 1. Sample-path properties of the a posteriori probability process $(\pi_t)_{t \ge 0}$ from (2.6)+(2.11): (i) the case $\lambda_1 > \lambda_0$; (ii) the case $\lambda_1 < \lambda_0$. The point $\hat{B}$ is the singularity point (3.16) of the free-boundary equation (3.1).

Figure 2. A computer drawing of the maps $\pi \mapsto V(\pi; B)$ from (3.14) for different $B$ from $(0,1)$ in the case $\lambda_1 = 4$, $\lambda_0 = 2$, $\lambda = 1$, $c = 2$. The singularity point $\hat{B}$ from (3.16) equals $1/2$, and the smooth-fit point $\tilde{B}$ from (3.17) equals $1/3$. The optimal threshold $B_*$ coincides with the smooth-fit point $\tilde{B}$. The optimal cost function $\pi \mapsto V(\pi)$ from (2.7) equals $\pi \mapsto V(\pi; B_*)$ for $0 \le \pi \le B_*$ and $1-\pi$ for $\pi \ge B_*$. (This is presented in part (i) above.) The solutions $\pi \mapsto V(\pi; B)$ for $B > B_*$ are ruled out since they fail to satisfy $V(\pi) \le 1-\pi$ for all $\pi \in [0,1]$. (This is shown in part (ii) above.) The general case $\lambda_1 > \lambda_0$ with $c > \lambda_1-\lambda_0-\lambda$ looks very much the same.

Figure 3. A computer drawing of the optimal cost functions $\pi \mapsto V(\pi)$ from (2.7) in the case $\lambda_1 = 4$, $\lambda_0 = 2$, $\lambda = 1$ and $c = 1.4,\ 1.3,\ 1.2,\ 1.1,\ 1,\ 0.9,\ 0.8,\ 0.7,\ 0.6$. The given $V(\pi)$ equals $V(\pi; B_*)$ from (3.14) for all $0 \le \pi < B_*$, where $B_*$ as a function of $c$ is given by (4.2) and (4.5). The smooth-fit condition (3.4) holds in the cases $c = 1.4, 1.3, 1.2, 1.1$ (i.e. $c > \lambda_1-\lambda_0-\lambda$). The point $c = 1 = \lambda_1-\lambda_0-\lambda$ is a breakdown point where the optimal threshold $B_*$ equals the singularity point $\hat{B}$ from (3.16), and the smooth-fit condition gets replaced by the condition (3.18) with $\bar{B} = B_* = \hat{B} = 0.5$ in this case. For $c = 0.9, 0.8, 0.7, 0.6$ (i.e. $c < \lambda_1-\lambda_0-\lambda$) the smooth-fit condition (3.4) does not hold; in these cases the continuous-fit condition (3.3) is satisfied. Moreover, numerical computations suggest that the mapping $B_* \mapsto V'(B_*-; B_*)$, which equals $-1$ for $0 < B_* < \hat{B}$ and jumps to $-\lambda_0/\lambda_1 = -0.5$ for $B_* = \hat{B}$, is decreasing on $[\hat{B}, 1)$ and tends to a value slightly larger than $-0.6$ when $B_* \uparrow 1$, that is, $c \downarrow 0$. The general case $\lambda_1 > \lambda_0$ looks very much the same.

Figure 4. A computer drawing of the maps $\pi \mapsto V(\pi; B)$ from (3.14) for different $B$ from $(0,1)$ in the case $\lambda_1 = 4$, $\lambda_0 = 2$, $\lambda = 1$, $c = 2/5$. The singularity point $\hat{B}$ from (3.16) equals $1/2$. The optimal threshold $B_*$ can be determined from the fact that all solutions $\pi \mapsto V(\pi; B)$ for $B > B_*$ hit zero at some $\pi > \hat{B}$, and all solutions $\pi \mapsto V(\pi; B)$ for $B < B_*$ hit $\pi \mapsto 1-\pi$ at some $\pi > \hat{B}$. (This is shown in part (i) above.) A simple numerical method based on the preceding fact suggests the estimates $0.751 < B_* < 0.752$. The optimal cost function $\pi \mapsto V(\pi)$ from (2.7) equals $\pi \mapsto V(\pi; B_*)$ for $0 \le \pi \le B_*$ and $1-\pi$ for $\pi \ge B_*$. The solutions $\pi \mapsto V(\pi; B)$ for $B \le \hat{B}$ are ruled out since they fail to be concave. (This is shown in part (ii) above.) The general case $\lambda_1 > \lambda_0$ with $c < \lambda_1-\lambda_0-\lambda$ looks very much the same.

Figure 5. A computer drawing of the maps π ↦ V(π; ε, v) from (3.5) for different v from (0, 1) with ε = 0.1 in the case λ1 = 2, λ0 = 4, λ = 1, c = 1. For each ε > 0 there is a unique number v_ε ∈ (0, 1) such that the map π ↦ V(π; ε, v_ε) hits the map π ↦ 1 − π smoothly at some B*_ε ∈ (0, 1). Letting ε ↓ 0 we obtain B*_ε → B* and V(π; ε, v_ε) → V(π) for all π ∈ [0, 1], where B* is the optimal threshold from (2.6) and π ↦ V(π) is the optimal cost function.


More information

On pathwise stochastic integration

On pathwise stochastic integration On pathwise stochastic integration Rafa l Marcin Lochowski Afican Institute for Mathematical Sciences, Warsaw School of Economics UWC seminar Rafa l Marcin Lochowski (AIMS, WSE) On pathwise stochastic

More information

Independence of some multiple Poisson stochastic integrals with variable-sign kernels

Independence of some multiple Poisson stochastic integrals with variable-sign kernels Independence of some multiple Poisson stochastic integrals with variable-sign kernels Nicolas Privault Division of Mathematical Sciences School of Physical and Mathematical Sciences Nanyang Technological

More information

SUMMARY OF RESULTS ON PATH SPACES AND CONVERGENCE IN DISTRIBUTION FOR STOCHASTIC PROCESSES

SUMMARY OF RESULTS ON PATH SPACES AND CONVERGENCE IN DISTRIBUTION FOR STOCHASTIC PROCESSES SUMMARY OF RESULTS ON PATH SPACES AND CONVERGENCE IN DISTRIBUTION FOR STOCHASTIC PROCESSES RUTH J. WILLIAMS October 2, 2017 Department of Mathematics, University of California, San Diego, 9500 Gilman Drive,

More information