Processus de Poisson et méthodes actuarielles (2015-2016)
Exercise sheet 1: Poisson processes

Exercise 1. Let $(\tau_n, n \ge 1)$ be a sequence of IID (independent and identically distributed) non-negative random variables. Set $T_n = \tau_1 + \dots + \tau_n$ for $n \ge 1$ (with $T_0 = 0$) and $N_t = \#\{i \ge 1 : T_i \le t\}$, $t \ge 0$.

1. Give a necessary and sufficient condition for having $P(N \text{ only makes jumps of size } 1) > 0$.
2. Show that under this condition $P(N \text{ only makes jumps of size } 1) = 1$.
3. Is it possible that $(T_n, n \ge 1)$ converges to a finite limit with positive probability?
4. Compute the probability of the event $\{\exists t \ge 0 : N(t) = \infty\}$.

Exercise 2. Let $N$ be a Poisson process with intensity $\lambda > 0$. Prove and give an interpretation of the following properties.

1. $P(N_h = 1) = \lambda h + o(h)$ as $h \to 0$.
2. $P(N_h \ge 2) = o(h)$ as $h \to 0$.
3. $P(N_h = 0) = 1 - \lambda h + o(h)$ as $h \to 0$.
4. For every $t \ge 0$, $P(N \text{ jumps at time } t) = 0$.
5. Compute $\mathrm{Cov}(N_s, N_t)$ for all $s, t \ge 0$.

Exercise 3. Let $N$ be a counting process with stationary and independent increments. Assume that there exists $\lambda > 0$ such that
$$P(N_h = 1) = \lambda h + o(h), \qquad P(N_h \ge 2) = o(h).$$
For $u \in \mathbb{R}$, let $g_t(u) = E[e^{iuN_t}]$.

1. Prove that $g_{t+h}(u) = g_t(u)\, g_h(u)$ for every $t, h \ge 0$.

(Terminology: in English, "positive" means strictly positive; to allow the value zero one says "nonnegative". Likewise, the terms "negative", "bigger" and "smaller" are to be understood in the strict sense.)
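A quick numerical illustration of the small-$h$ behaviour in Exercise 2: the minimal NumPy sketch below (the intensity lam = 2.0 and the window sizes are illustrative choices, not taken from the exercise) estimates $P(N_h = 1)$ and $P(N_h \ge 2)$ by Monte Carlo and compares them with $\lambda h$ and $o(h)$.

```python
import numpy as np

# Minimal sketch: empirical check that P(N_h = 1) ≈ λh and P(N_h >= 2) = o(h)
# for a Poisson process.  The intensity and window sizes are illustrative only.
rng = np.random.default_rng(0)
lam = 2.0                          # assumed intensity, not taken from the sheet
for h in (0.1, 0.01, 0.001):
    counts = rng.poisson(lam * h, size=1_000_000)   # N_h ~ Poisson(λh)
    p1 = np.mean(counts == 1)
    p2 = np.mean(counts >= 2)
    print(f"h={h:7.3f}  P(N_h=1)≈{p1:.5f}  λh={lam*h:.5f}  P(N_h>=2)≈{p2:.6f}")
```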

2. Prove that
$$\frac{d}{dt}\, g_t(u) = \lambda\,(e^{iu} - 1)\, g_t(u), \qquad g_0(u) = 1.$$
3. Conclude.

Exercise 4. Let $N$ be a Poisson process with intensity $\lambda > 0$, modelling the arrival times of the claims for an insurance company. Let $T_1$ denote the arrival time of the first claim. Show that the conditional law of $T_1$ given $N_t = 1$ is uniformly distributed over $[0, t]$.

Exercise 5. Let $(T_n, n \ge 1)$ (with $T_0 = 0$) be a renewal process and $N$ its associated counting process. Assume that $N$ has independent and stationary increments.

1. Show that
$$P(T_1 > s + t) = P(T_1 > t)\, P(T_1 > s), \qquad \forall s, t \ge 0.$$
2. Derive that $N$ is a Poisson process.

Exercise 6.

1. Show that two independent Poisson processes cannot jump simultaneously a.s.
2. Let $N^1$ and $N^2$ be two independent Poisson processes with parameters $\lambda_1 > 0$ and $\lambda_2 > 0$ respectively. Show that the process $N_t = N^1_t + N^2_t$, $t \ge 0$, is a Poisson process and give its intensity.
3. Derive that the sum of $n$ independent Poisson processes with respective intensities $\lambda_1 > 0, \dots, \lambda_n > 0$ is a Poisson process and give its intensity.

Exercise 7. Insects fall into a soup bowl according to a Poisson process $N$ with intensity $\lambda > 0$ (the event $\{N_t = n\}$ means that there are $n$ insects in the bowl at time $t$). Assume that every insect is green with probability $p \in (0, 1)$ and that its colour is independent of the colour of the other insects. Show that the number of green insects that fall into the bowl, as a function of time, is a Poisson process with intensity $p\lambda$.

Exercise 8. Liver transplants arrive at an operating block following a Poisson process $N$ with intensity $\lambda > 0$. Two patients wait for a transplant. The first patient has lifetime $T_1$ (before the transplant) distributed according to an exponential distribution with parameter $\mu_1$. The second one has lifetime $T_2$ (before the transplant) distributed according to an exponential distribution with parameter $\mu_2$. The rule is that the first transplant arriving at the hospital is given to the first patient if still alive, and to the second patient otherwise. Assume that $T_1$, $T_2$ and $N$ are independent.

1. Compute the probability that the second patient is transplanted.
2. Compute the probability that the first patient is transplanted.
3. Let $X$ denote the number of transplants that arrived at the hospital during $[0, T_1]$. Compute the law of $X$.
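The thinning property of Exercise 7 can be checked by simulation. The sketch below (NumPy; lam = 3.0, p = 0.4 and t = 5.0 are illustrative values) draws the total number of insects in $[0, t]$, keeps each one independently with probability $p$, and compares the empirical mean and variance of the green count with those of a Poisson($p\lambda t$) variable.

```python
import numpy as np

# Sketch for Exercise 7 (thinning): the green arrivals should form a Poisson process
# of intensity pλ, so the green count at time t should be Poisson(pλt).
rng = np.random.default_rng(1)
lam, p, t, n_runs = 3.0, 0.4, 5.0, 200_000      # illustrative parameters
totals = rng.poisson(lam * t, size=n_runs)      # N_t for each run
greens = rng.binomial(totals, p)                # thin each arrival with probability p
print("empirical mean / var of green count:", greens.mean(), greens.var())
print("Poisson(pλt) mean / var:            ", p * lam * t, p * lam * t)
```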

Exercise 9. The bus paradox. Buses arrive at a given bus stop according to a Poisson process with intensity $\lambda > 0$. You arrive at the bus stop at time $t$.

1. Give a first guess for the value of the average waiting time before the next bus arrives.
2. Let $A_t = T_{N_t + 1} - t$ be the waiting time before the next bus, and let $B_t = t - T_{N_t}$ denote the elapsed time since the last bus arrival. Compute the joint distribution of $(A_t, B_t)$ (hint: compute first $P(A_t \ge x_1, B_t \ge x_2)$ for $x_1, x_2 \ge 0$).
3. Derive that the random variables $A_t$ and $B_t$ are independent. What are their distributions?
4. In particular, compute $E[A_t]$. Compare with your initial first guess.

Exercise 10. Law of large numbers and central limit theorem.

1. Recall and prove a law of large numbers for a Poisson process with intensity $\lambda > 0$.
2. Prove that $N$ satisfies the following central limit theorem:
$$\frac{N_t - \lambda t}{\sqrt{\lambda t}} \xrightarrow{\text{law}} \mathcal{N}(0, 1) \quad \text{as } t \to \infty,$$
(a) by using characteristic functions;
(b) by showing first that $(N_n - \lambda n)/\sqrt{n}$ converges in distribution as $n \to \infty$ and then that $\max_{t \in [n, n+1)} (N_t - N_n)/\sqrt{n} \to 0$ in probability.

Exercise 11.

1. Give an expression for the density function of the conditional distribution of $(T_1, \dots, T_n)$ given $N_t = n$, when $N$ is a Poisson process with intensity $\lambda$ and $0 < T_1 < \dots < T_n < \dots$ are its jump times.
2. Derive an expression for the density of $T_i$ given $N_t = n$, for $1 \le i \le n$, and similarly for $(T_i, T_j)$ given $N_t = n$, for $1 \le i < j \le n$.
3. Set $U_{i,j} = T_j - T_i$, $1 \le i < j \le n$. Give an expression for the density of $U_{i,j}$ given $N_t = n$. Derive an expression for the density of $T_n - T_{n-1}$ given $N_t = n$.
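The bus paradox of Exercise 9 is easy to explore numerically before solving it. The sketch below (NumPy; lam = 1.0 and t = 50.0 are illustrative choices) simulates arrival times and estimates $E[A_t]$ and $E[B_t]$; the average waiting time should come out close to $1/\lambda$ rather than the naive guess $1/(2\lambda)$.

```python
import numpy as np

# Sketch for Exercise 9 (bus paradox): Monte Carlo estimates of E[A_t] and E[B_t].
rng = np.random.default_rng(2)
lam, t, n_runs = 1.0, 50.0, 100_000              # illustrative parameters
A, B = np.empty(n_runs), np.empty(n_runs)
for k in range(n_runs):
    # arrival times T_1 < T_2 < ... generated well past time t
    arrivals = np.cumsum(rng.exponential(1 / lam, size=int(3 * lam * t) + 50))
    i = np.searchsorted(arrivals, t)             # index of the first arrival after t
    A[k] = arrivals[i] - t                       # waiting time A_t
    B[k] = t - (arrivals[i - 1] if i > 0 else 0.0)   # elapsed time B_t (with T_0 = 0)
print("E[A_t] ≈", A.mean(), "  (compare with 1/λ =", 1 / lam, ")")
print("E[B_t] ≈", B.mean())
```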

Exercise 12. Martingales. Let $X = (X_t, t \ge 0)$ be a continuous-time process and $\mathcal{F} = (\mathcal{F}_t, t \ge 0)$ a filtration, i.e. a nested family of sigma-fields $\mathcal{F}_s \subset \mathcal{F}_t \subset \mathcal{A}$ for all $s \le t$, where $\mathcal{A}$ is the sigma-field on the probability space $(\Omega, \mathcal{A}, P)$ over which $X$ is defined. The process $X$ is a martingale with respect to the filtration $\mathcal{F}$ if $X_t$ is $\mathcal{F}_t$-measurable and integrable for every $t \ge 0$ and
$$E[X_t \mid \mathcal{F}_s] = X_s, \qquad \forall\, 0 \le s \le t.$$
Let $N = (N_t, t \ge 0)$ be a Poisson process with intensity $\lambda > 0$. Show that the three processes

1. $(N_t - \lambda t,\ t \ge 0)$;
2. $((N_t - \lambda t)^2 - \lambda t,\ t \ge 0)$;
3. $(\exp(u N_t + \lambda t (1 - e^{u})),\ t \ge 0)$ (for a given real number $u$);

are martingales with respect to the filtration generated by $N$, i.e. $\mathcal{F}^N_t = \sigma(N_s, s \le t)$.

Exercise 13. Let $N$ be a Poisson process with intensity $\lambda > 0$ and let $0 < T_1 < \dots < T_n < \dots$ denote its jump times.

1. Show that $T_n / n$ converges almost surely as $n \to \infty$ and identify its limit.
2. Show that $\sum_{i \ge 1} T_i^{-2}$ converges almost surely. Let $X$ denote its limit.
3. Show that $X_{N_t} = \sum_{i=1}^{N(t)} T_i^{-2} \to X$ a.s. as $t \to \infty$.
4. Let $(U_i, i \ge 1)$ denote a sequence of independent uniform random variables on $[0, 1]$. We admit the following result:
$$n^{-2} \sum_{i=1}^{n} U_i^{-2} \xrightarrow{\text{law}} Z \quad \text{as } n \to \infty,$$
where $Z$ is a positive random variable whose Laplace transform is given by $E[\exp(-sZ)] = \exp(-c\sqrt{s})$ for all $s \ge 0$, for some $c > 0$. The goal is to show that $X$ and $c_1 Z$ have the same law for some constant $c_1$ that we will explicitly compute. We assume moreover that $(U_i, i \ge 1)$ is independent of $N$.
(a) Show that for every $n \ge 1$ and every $t > 0$, the law of $X_{N_t}$ given $N_t = n$ is the same as the law of $t^{-2} \sum_{i=1}^{n} U_i^{-2}$.
(b) Derive that $X_{N(t)}$ has the same distribution as $t^{-2} \sum_{i=1}^{N(t)} U_i^{-2}$.
(c) Prove that $N(t)^{-2} \sum_{i=1}^{N(t)} U_i^{-2} \xrightarrow{\text{law}} Z$ as $t \to \infty$.
(d) Recall the law of large numbers for Poisson processes and conclude.
5. Derive $E[X] = \infty$.
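Since a martingale keeps a constant expectation, the three processes of Exercise 12 admit a cheap sanity check: their expectations at any time $t$ should equal their values at time $0$, namely $0$, $0$ and $1$. The sketch below (NumPy; lam, t and u are illustrative values) verifies this by Monte Carlo.

```python
import numpy as np

# Sketch for Exercise 12: the three candidate martingales keep a constant expectation
# equal to their value at time 0 (namely 0, 0 and 1).  Parameter values are illustrative.
rng = np.random.default_rng(3)
lam, t, u, n = 2.0, 4.0, 0.3, 1_000_000
N_t = rng.poisson(lam * t, size=n)
M1 = N_t - lam * t                                  # compensated Poisson process
M2 = (N_t - lam * t) ** 2 - lam * t                 # its quadratic counterpart
M3 = np.exp(u * N_t + lam * t * (1 - np.exp(u)))    # exponential martingale
print("E[M1_t] ≈", M1.mean(), "  E[M2_t] ≈", M2.mean(), "  E[M3_t] ≈", M3.mean())
```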

Processus de Poisson et méthodes actuarielles (2015-2016)
Exercise sheet 2: Mixed Poisson process, total claim amount, renewal theory

Exercise 1. Let $\tilde{N}$ be a mixed Poisson process and denote by $0 < \tilde{T}_1 < \dots < \tilde{T}_n < \dots$ its jump times. Prove that the conditional distribution of $(\tilde{T}_1, \dots, \tilde{T}_n)$ given $\tilde{N}(t) = n$ (with $n \in \mathbb{N} \setminus \{0\}$) coincides with the distribution of the order statistics of $n$ independent random variables with common uniform distribution on $[0, t]$.

Exercise 2. A random variable $X$ follows a negative binomial distribution on $\{0, 1, 2, \dots\}$ with parameters $r > 0$ and $p \in (0, 1)$ if
$$P(X = k) = \frac{\Gamma(r + k)}{\Gamma(r)\, k!}\, p^r (1 - p)^k, \qquad \forall k \ge 0.$$
Let $\tilde{N}$ be a mixed Poisson process with mixture distribution $\Gamma(\alpha, \beta)$. What is the distribution of $\tilde{N}(t)$? The process $\tilde{N}$ is called a negative binomial process. The negative binomial law is also called a mixed Poisson or mixed Gamma-Poisson distribution.

Exercise 3. Let $N = (N_t, t \ge 0)$ be a standard Poisson process with intensity $\lambda > 0$. Let $f : [0, \infty) \to [0, \infty)$ be a locally bounded Borel function. Set
$$N(f)_t = \sum_{i \ge 1} f(T_i)\, \mathbf{1}_{\{T_i \le t\}} \quad \text{for } t \ge 0,$$
where the $(T_i)_{i \ge 1}$ are the jump times of $N$.

1. Show that for all $t \ge 0$, we have $N(f)_t < \infty$ almost surely.
2. If $f(s) = \mathbf{1}_{(a,b]}(s)$ where $[a, b] \subset [0, t]$, what is the distribution of $N(\mathbf{1}_{(a,b]})_t$?
3. Show that for $u \ge 0$, we have
$$E\!\left[ e^{-u N(f)_t} \mid N_t = n \right] = \left( \int_0^t e^{-u f(s)}\, \frac{ds}{t} \right)^{\! n}.$$
4. Derive $E\!\left[ e^{-u N(f)_t} \right]$ and recover the result of Question 2.
5. Compute $E[N(f)_t]$ and $\mathrm{Var}[N(f)_t]$.
6. Prove that $N(f)_t - \lambda \int_0^t f(s)\, ds$ is a martingale.
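Exercise 2 identifies a Gamma-mixed Poisson count as negative binomial, which is easy to illustrate numerically. In the sketch below (NumPy), the values alpha, beta and t are illustrative, and the mixing Gamma is taken with shape alpha and rate beta, which is an assumption about the intended parametrisation.

```python
import numpy as np

# Sketch for Exercise 2: a Poisson(Λt) count with Λ ~ Gamma(shape α, rate β) should be
# negative binomial with r = α and p = β/(β + t).  All numerical values are illustrative.
rng = np.random.default_rng(4)
alpha, beta, t, n = 2.5, 1.5, 3.0, 500_000
lam_mix = rng.gamma(alpha, 1 / beta, size=n)       # random intensity Λ
N_mixed = rng.poisson(lam_mix * t)                 # Ñ(t) given Λ is Poisson(Λt)
N_negbin = rng.negative_binomial(alpha, beta / (beta + t), size=n)
print("mixed Poisson  mean / var:", N_mixed.mean(), N_mixed.var())
print("neg. binomial  mean / var:", N_negbin.mean(), N_negbin.var())
```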

Exercise 4. The total claim amount of a portfolio for a year is modelled by
$$X = \sum_{j=1}^{N} C_j,$$
where $N$ is the number of claims in the year and $C_j$ is the cost of the $j$-th claim. Assume that $N$ follows a mixed Poisson distribution with random parameter $\Lambda$, i.e. the conditional distribution of $N$ given $\Lambda = \lambda$ is Poisson($\lambda$). Assume moreover that $\Lambda$ is distributed according to a $\Gamma(b, b)$ distribution, for some $b > 0$. Assume that the costs of the claims $(C_j)_{j \ge 1}$ are independent and identically distributed random variables, independent of $N$.

1. Compute $E(\Lambda)$ and $\mathrm{Var}(\Lambda)$.
2. Compute $E(N)$ and $\mathrm{Var}(N)$.
3. We assume that $C_1 \sim \text{Exponential}(\theta)$ for some $\theta > 0$. Show that the conditional law of $X$ given $N$ is a Gamma distribution and identify its parameters. What is the pure premium?
4. Show that the conditional law of $\Lambda$ given $(X, N)$ is independent of $X$ and that it is a Gamma distribution. Identify its parameters.

Exercise 5. Let $(\xi_i, i \ge 1)$ be a sequence of i.i.d. real-valued random variables with second-order moments, independent of the Poisson process $N = (N_t, t \ge 0)$ with intensity $\lambda > 0$. For $t \ge 0$, we set
$$X_t = \sum_{i=1}^{N_t} \xi_i$$
(putting $X_t = 0$ on $\{N_t = 0\}$).

1. Show that $t^{-1} X_t$ converges almost surely as $t \to \infty$ and identify its limit.
2. We set, for $t \ge 0$,
$$M_t = \frac{X_t - N_t\, \mu}{\sqrt{N_t}}, \qquad \text{where } \mu = E[\xi_1]$$
(putting $M_t = 0$ on $\{N_t = 0\}$).
3. Compute $E\!\left[ e^{iuM_t} \right]$ and derive that $M_t$ converges in distribution as $t \to \infty$. Identify its limit.
5. Show that
$$\sqrt{t}\left( \frac{X_t}{t} - \mu\, \frac{N_t}{t} \right)$$
converges in distribution as $t \to \infty$ and identify its limit.

Exercise 6. Let $(T_n)_{n \ge 1}$ be a renewal process and let $N_t$ denote its counting function. We assume that the common law of the interarrival times admits a density function $f$ on $\mathbb{R}$ (with value $0$ on $(-\infty, 0]$). For $n \ge 1$, we denote by $f_n$ the density function of $T_n$ and by $F_n$ its cumulative distribution function.
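A simulation gives a feel for the limits appearing in Exercise 5. The sketch below (NumPy) takes the $\xi_i$ uniform on $[0, 1]$ purely as an illustrative assumption (so $\mu = 1/2$ and $\sigma^2 = 1/12$), simulates $X_t$ for a large $t$, and compares the empirical variances of $M_t$ and of $\sqrt{t}(X_t/t - \mu N_t/t)$ with $\sigma^2$ and $\lambda\sigma^2$.

```python
import numpy as np

# Sketch for Exercise 5: with ξ_i i.i.d. uniform on [0,1] (illustrative choice, so
# µ = 1/2, σ² = 1/12), M_t = (X_t - N_t µ)/√N_t should be roughly N(0, σ²) for large t,
# and √t (X_t/t - µ N_t/t) roughly N(0, λσ²).
rng = np.random.default_rng(5)
lam, t, n_runs = 2.0, 200.0, 50_000
mu, sigma2 = 0.5, 1.0 / 12.0
M, W = np.empty(n_runs), np.empty(n_runs)
for k in range(n_runs):
    n_jumps = rng.poisson(lam * t)
    X_t = rng.uniform(0.0, 1.0, size=n_jumps).sum()
    M[k] = (X_t - n_jumps * mu) / np.sqrt(n_jumps) if n_jumps > 0 else 0.0
    W[k] = np.sqrt(t) * (X_t / t - mu * n_jumps / t)
print("Var(M_t) ≈", M.var(), "  vs  σ² =", sigma2)
print("Var(√t(X_t/t - µN_t/t)) ≈", W.var(), "  vs  λσ² =", lam * sigma2)
```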

1. Check that $f_1(t) = f(t)$ and show that
$$f_{n+1}(t) = \int_0^t f_n(t - s)\, f(s)\, ds.$$
Define $r(t) = E[N_t]$ if $t \ge 0$ and $r(t) = 0$ for $t < 0$.
2. Show that $r(t) = \sum_{n \ge 1} F_n(t)$ and that for $n \ge 2$, we have
$$P(T_n \le t \mid T_1) = F_{n-1}(t - T_1).$$
3. Show that $r$ satisfies the renewal equation
$$r(t) = F(t) + \int_0^t r(t - s)\, f(s)\, ds, \qquad t \ge 0,$$
where $F$ denotes the cumulative distribution function of the common interarrival times.
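The renewal equation of Question 3 can be solved numerically by a straightforward discretisation. The sketch below (NumPy) does this for exponential(1) interarrival times, an illustrative choice for which the answer is known to be $r(t) = t$ (the counting process is then Poisson), so the output can be checked at a glance.

```python
import numpy as np

# Sketch for Exercise 6.3: solve r(t) = F(t) + ∫_0^t r(t-s) f(s) ds on a grid,
# for exponential(1) interarrival times (illustrative choice, for which r(t) = t).
dt, T = 0.01, 10.0
grid = np.arange(0.0, T + dt, dt)
f = np.exp(-grid)               # density of the interarrival times
F = 1.0 - np.exp(-grid)         # their cumulative distribution function
r = np.zeros_like(grid)
for k in range(1, len(grid)):
    # discretised convolution: r(t_k) ≈ F(t_k) + Δ Σ_{j=1..k} r(t_k - s_j) f(s_j)
    r[k] = F[k] + dt * np.dot(r[k - 1::-1], f[1:k + 1])
print("r(5)  ≈", r[int(5 / dt)], "  (exact for the Poisson case: 5)")
print("r(10) ≈", r[-1], "  (exact for the Poisson case: 10)")
```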

Processus de Poisson et méthodes actuarielles (2015-2016)
Exercise sheet 3: Compound Poisson processes, renewal processes

Exercise 1. Give a necessary and sufficient condition for a compound Poisson process to be a standard Poisson process.

Exercise 2. Find the renewal function and the renewal measure of a renewal process when the interarrival times are distributed according to a Gamma distribution with parameters $(2, \lambda)$, $\lambda > 0$. Recover the renewal theorem and the key renewal theorem in that case.

Exercise 3. An appliance has repeated failures in time and is repaired after each failure. Denote by $(X_i, i \ge 1)$ the successive durations during which the appliance is functioning, and by $(Y_i, i \ge 1)$ the successive durations during which it is being repaired. In other words, the appliance is in order on the time interval $[0, X_1)$, it is being repaired on $[X_1, X_1 + Y_1)$, then it is in order again on $[X_1 + Y_1, X_1 + Y_1 + X_2)$, and so on. Assume that the $X_i$'s are i.i.d. and that their common law has no atom and satisfies $P(X_1 > 0) = 1$ and $E[X_1] < \infty$. Assume that the $Y_i$'s have the same properties and that they are moreover independent of the $X_i$'s. Denote by $p(t)$ the probability that the appliance is functioning at time $t$.

1. Let $Z_1 = X_1 + Y_1$ and let $F$ be the cumulative distribution function of $Z_1$. Show that $p(t)$ satisfies the renewal equation
$$p(t) = P(X_1 > t) + \int_0^t p(t - s)\, dF(s), \qquad \forall t \ge 0.$$
2. Derive
$$\lim_{t \to \infty} p(t) = \frac{E[X_1]}{E[X_1 + Y_1]}.$$

Exercise 4. Let $N$ be a renewal process and let $F(t) = T_{N(t)+1} - t$ denote the time elapsed between $t$ and the time of the $(N(t)+1)$-th renewal, for $t \ge 0$. We denote by $P_T$ the law of the interarrival times and by $F_T$ its cumulative distribution function.

1. Show that for every $x \ge 0$, the function $t \mapsto P(F(t) > x)$ satisfies the renewal equation
$$P(F(t) > x) = 1 - F_T(t + x) + \int_0^t P(F(t - u) > x)\, dP_T(u), \qquad t \ge 0.$$
(Hint: start by decomposing $P(F(t) > x)$ in two terms, according to the event $\{T_1 > t\}$ or $\{T_1 \le t\}$.)
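The limit in Exercise 3 can be probed by simulating the alternating up/repair cycles. The sketch below (NumPy) uses Exp(1) functioning times and Exp(3) repair times, which are illustrative assumptions only, so that $E[X_1]/E[X_1 + Y_1] = 3/4$, and estimates $p(t)$ for a moderately large $t$.

```python
import numpy as np

# Sketch for Exercise 3: estimate p(t) = P(appliance works at time t) and compare it
# with E[X_1]/E[X_1 + Y_1].  Up times ~ Exp(1) and repair times ~ Exp(3) are
# illustrative (means 1 and 1/3, so the limit should be 3/4).
rng = np.random.default_rng(6)
t, n_runs, working = 50.0, 20_000, 0
for _ in range(n_runs):
    clock = 0.0
    while True:
        up = rng.exponential(1.0)                 # functioning duration X_i
        if clock + up > t:
            working += 1                          # time t falls in an "up" period
            break
        clock += up + rng.exponential(1.0 / 3.0)  # add the repair duration Y_i
        if clock > t:
            break                                 # time t fell in the repair period
print("p(t) ≈", working / n_runs, "   E[X_1]/E[X_1+Y_1] =", 0.75)
```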

2. Solve this equation when the interarrival times follow an exponential distribution.

Exercise 5. In the same setting as Exercise 4, we consider a renewal process with interarrival times distributed according to a Pareto distribution:
$$P(\tau > x) = (1 + x)^{-\alpha}, \qquad x \ge 0.$$

1. Let $X$ be a nonnegative random variable. Show that for all $r > 0$,
$$\int_0^{\infty} r\, x^{r-1}\, P(X > x)\, dx = E[X^r].$$
2. Use the previous identity and the renewal equation satisfied by $F(t)$ to show that
$$E[F(t)^2] = \int_0^t \int_0^{\infty} 2x\, (1 + t - u + x)^{-\alpha}\, dx\, dm(u),$$
where $m(t) = E[N(t)]$ is the renewal measure of $N$.
3. Derive that for $\alpha > 3$, we have
$$E[F(t)^2] \to 2 \int_0^{\infty} x\, (1 + x)^{1 - \alpha}\, dx \quad \text{as } t \to \infty,$$
and compute this limit explicitly.
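The limit claimed in Question 3 can be checked numerically. The sketch below (NumPy; $\alpha = 6$ and $t = 30$ are illustrative choices, and numpy's rng.pareto(alpha) samples exactly the tail $(1+x)^{-\alpha}$ used here) estimates $E[F(t)^2]$ by Monte Carlo and evaluates $2\int_0^\infty x(1+x)^{1-\alpha}\,dx$ by a Riemann sum for comparison.

```python
import numpy as np

# Sketch for Exercise 5.3: Monte Carlo estimate of E[F(t)^2] for Pareto interarrivals
# with tail (1+x)^(-α) (this is exactly the law of numpy's rng.pareto(α)), compared
# with 2 ∫_0^∞ x (1+x)^(1-α) dx.  α = 6 and t = 30 are illustrative choices.
rng = np.random.default_rng(7)
alpha, t, n_runs = 6.0, 30.0, 30_000
F2 = np.empty(n_runs)
for k in range(n_runs):
    clock = 0.0
    while clock <= t:
        clock += rng.pareto(alpha)        # next interarrival time
    F2[k] = (clock - t) ** 2              # squared residual life F(t)^2
dx = 1e-3
x = np.arange(dx / 2, 200.0, dx)          # midpoint grid, truncated at 200
target = 2.0 * np.sum(x * (1.0 + x) ** (1.0 - alpha)) * dx
print("E[F(t)^2] ≈", F2.mean(), "   2∫ x(1+x)^(1-α) dx ≈", target)
```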

Processus de Poisson et méthodes actuarielles (2015-2016)
Exercise sheet 4: Ruin theory

In all exercises that use the Cramér-Lundberg model, we denote by $c > 0$ the premium rate, by $\lambda > 0$ the intensity of the Poisson process that models the number of claims, and by $u$ the initial wealth of the insurer.

Exercise 1.

1. Show that the following distributions are thin tailed:
(a) the distribution of a nonnegative bounded random variable;
(b) the Gamma distribution;
(c) the Weibull distribution with parameters $C > 0$, $\tau \ge 1$. The density function of a Weibull distribution with parameters $C, \tau$ is $f(x) = C\tau\, x^{\tau - 1} \exp(-C x^{\tau})\, \mathbf{1}_{\{x > 0\}}$.
2. Show that the following distributions are sub-exponential:
(a) the Pareto distribution with parameters $\alpha > 0$, $\beta > 0$ ($f(x) = \alpha \beta^{\alpha} / (\beta + x)^{\alpha + 1}$, $x > 0$);
(b) the Weibull distribution with parameters $C > 0$, $\tau < 1$.

Exercise 2. The parameters $c > 0$, $\lambda > 0$ and $\beta > 0$ are fixed throughout. For every integer $k \in \mathbb{N}$, we consider the Cramér-Lundberg model where the costs of the claims are distributed according to a $\Gamma(k, \beta)$ distribution. Write $\psi^{(k)}(u)$ for the ruin probability of this model. Show that for every $u > 0$ and every $k \in \mathbb{N}$,
$$\psi^{(k)}(u) \le \psi^{(k+1)}(u).$$

Exercise 3. We consider the Cramér-Lundberg model, where the costs of the claims follow an exponential distribution with parameter $\alpha > 0$. The safety loading is positive. We wish to give an explicit formula for the ruin probability $\psi(u)$.

1. Show that the exponential distribution is thin tailed and compute the corresponding adjustment coefficient $R$.
2. Derive a good upper bound for the ruin probability thanks to the Lundberg inequality.
3. Write the renewal equation satisfied by $u \mapsto e^{Ru}\, \psi(u)$.
4. Using the renewal theorem, solve the equation and compute $\psi(u)$ as a function of $\lambda$, $\alpha$, $c$ and $u$.
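A Monte Carlo experiment is a useful companion to Exercise 3. The sketch below (NumPy; the values of λ, α, c and u are illustrative and chosen so that the safety loading is positive) simulates the surplus at claim instants over a long but finite horizon and compares the empirical ruin frequency with the classical closed form $\psi(u) = \frac{\lambda}{\alpha c}\, e^{-(\alpha - \lambda/c)u}$ for exponential claims, the formula that Question 4 aims at.

```python
import numpy as np

# Sketch for Exercise 3: Monte Carlo ruin probability for exponential(α) claims,
# compared with the classical closed form ψ(u) = (λ/(αc)) exp(-(α - λ/c) u).
# Ruin is only checked at claim instants, over a finite horizon, which slightly
# underestimates ψ(u).  All numerical values are illustrative.
rng = np.random.default_rng(8)
lam, alpha, c, u = 1.0, 1.0, 1.25, 3.0           # positive safety loading: c > λ/α
n_runs, n_claims = 20_000, 1_000
ruined = 0
for _ in range(n_runs):
    arrivals = np.cumsum(rng.exponential(1 / lam, n_claims))   # claim times T_i
    costs = np.cumsum(rng.exponential(1 / alpha, n_claims))    # cumulated claim costs
    surplus = u + c * arrivals - costs                         # surplus after each claim
    if surplus.min() < 0:
        ruined += 1
print("simulated ψ(u) ≈", ruined / n_runs)
print("closed form      ", lam / (alpha * c) * np.exp(-(alpha - lam / c) * u))
```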

Exercise 4. We consider the setting of the Cramér-Lundberg model, where the costs $X_i$, $i \ge 1$, follow a Pareto distribution with index $\alpha > 1$ and $\beta = 1$, i.e.
$$\bar{F}_X(x) = (1 + x)^{-\alpha}, \qquad x \ge 0.$$

1. Compute $\mu = E[X_1]$ and the associated safety loading $\rho$. For which values of $c$ do we have $\rho > 0$?
2. Show that $\int_0^{\infty} e^{ux}\, F_{X,I}(dx) = \infty$ for every $u > 0$. Derive that $F_{X,I}$ is not thin tailed.
3. Show that $F_{X,I}$ is subexponential. What can we say about the ruin probability $\psi(u)$ as $u \to \infty$?

Exercise 5. We work in the Cramér-Lundberg setting.

Part A. The r.v. $X_i$, $i \ge 1$, that model the claim costs have density
$$f(x) = \frac{1}{2\sqrt{x}}\, e^{-\sqrt{x}}\, \mathbf{1}_{\{x > 0\}}.$$

1. Compute $\mu = E[X_1]$ and $\bar{F}_X(x)$, $x \ge 0$.
2. For every $x \ge 0$, set $\bar{F}_{X,I}(x) = \mu^{-1} \int_x^{\infty} \bar{F}_X(y)\, dy$ and $q(x) = \bar{F}_X(x) / \big( \mu\, \bar{F}_{X,I}(x) \big)$.
(a) Show that
$$\int_x^{\infty} e^{-\sqrt{y}}\, dy = 2\, e^{-\sqrt{x}}\, (\sqrt{x} + 1), \qquad \forall x \ge 0,$$
and derive a simple expression for $q(x)$.
(b) Derive that $F_{X,I}$ is the cumulative distribution function of a subexponential distribution.
3. Give an equivalent of the ruin probability $\psi(u)$ as $u \to \infty$. Express this equivalent as a function of $f$ and the parameters $c$, $\lambda$.

Part B. We now assume that the $X_i$, $i \ge 1$, have density
$$g(x) = 2x\, e^{-x^2}\, \mathbf{1}_{\{x > 0\}}.$$

1. Show that $\mu = \sqrt{\pi}/2$.
2. Show that $X_1$ is thin tailed.
3. Prove the existence of the adjustment coefficient $R$.
4. Express the integral $\int_0^{\infty} y\, e^{Ry - y^2}\, dy$ as a function of $c$, $\lambda$ and $R$. Derive an expression for $\int_0^{\infty} e^{Ry - y^2}\, dy$ as a function of $c$, $\lambda$ and $R$.
5. Compute $dF_{X,I}$ and give the renewal equation satisfied by the function $u \mapsto e^{Ru}\, \psi(u)$, and check that the required conditions are satisfied here.
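Two computations on this page are easy to sanity-check numerically. The sketch below (NumPy, plain Riemann sums on a truncated grid, so the figures are approximate) verifies the identity of Exercise 5, Part A, Question 2(a) at a few points, and the value $\mu = \sqrt{\pi}/2$ of Part B, Question 1.

```python
import numpy as np

# Sketch: numerical checks for Exercise 5.
# (i)  Part A, 2(a): ∫_x^∞ e^(-√y) dy should equal 2 e^(-√x) (√x + 1).
# (ii) Part B, 1:    ∫_0^∞ x · 2x e^(-x²) dx should equal √π / 2.
dy = 1e-4
y = np.arange(dy / 2, 400.0, dy)                  # truncated midpoint grid
for x in (0.0, 1.0, 4.0):
    lhs = np.sum(np.exp(-np.sqrt(y[y > x]))) * dy
    rhs = 2.0 * np.exp(-np.sqrt(x)) * (np.sqrt(x) + 1.0)
    print(f"x = {x}:  integral ≈ {lhs:.4f}   2e^(-√x)(√x+1) = {rhs:.4f}")
mu = np.sum(y * 2.0 * y * np.exp(-y ** 2)) * dy
print("µ ≈", round(mu, 6), "   √π/2 =", round(np.sqrt(np.pi) / 2.0, 6))
```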

6. Give the asymptotic behaviour of the ruin probability $\psi(u)$ as $u \to \infty$, as a function of $c$, $\lambda$, $R$ and $\mu$.

Exercise 6.

Part 1. An insurer has a risky portfolio whose risks are partitioned into two classes: the big claims, denoted by $X^1_i$, $i \ge 1$, and the small claims, denoted by $X^2_i$, $i \ge 1$. It is moreover assumed that the two kinds of risks are independent. The total claim amount of the insurer at time $t$ is denoted by
$$S_t = S^1_t + S^2_t,$$
where $S^1_t = \sum_{i=1}^{N^1_t} X^1_i$ is the total claim amount of the first kind (big claims) and $S^2_t = \sum_{i=1}^{N^2_t} X^2_i$ is the total claim amount of the second kind (small claims). The processes $(N^i)_{i=1,2}$ are independent Poisson processes with intensities $\lambda_i$, and they are independent of the different costs $X^1_i, X^2_i$, $i \ge 1$. We assume that the $(X^1_i, i \ge 1)$ are i.i.d. with distribution $F_1$ and that the $(X^2_i, i \ge 1)$ are i.i.d. with distribution $F_2$.

(a) Compute the moment generating function $M_{S^1_t}$ of $S^1_t$ and the moment generating function of $S^2_t$, and derive the moment generating function of $S_t$.
(b) Check that $S$ is a compound Poisson process that can be written in the form
$$S_t = \sum_{i=1}^{N_t} Y_i, \qquad t \ge 0,$$
where $N$ is a Poisson process with intensity $\lambda = \lambda_1 + \lambda_2$ and the $Y_i$, $i \ge 1$, are i.i.d. with distribution $F$ being a mixture of $F_1$ and $F_2$. Compute the mixture coefficients explicitly.
(c) We now assume that $F_1 = \mathcal{E}(\alpha)$ is the exponential distribution with parameter $\alpha > 0$ and that $F_2 = \mathrm{Par}(a, b)$ is the Pareto distribution with parameters $a, b$, with $a > 1$. Compute in that case the density function $f_{Y,I}(y)$, the function $\bar{F}_{Y,I}(y)$, the expectation $E[Y_1]$ and the coefficient $q(y) = f_{Y,I}(y) / \bar{F}_{Y,I}(y)$.
(d) Consider the Cramér-Lundberg model
$$U_t = u + ct - S_t, \qquad t \ge 0,$$
where $u \ge 0$ is the initial wealth of the company. We assume that the safety loading coefficient is the same for each class and we take as premium rate
$$c := (1 + \rho)\, \lambda\, E[Y_1], \qquad \text{with } \rho > 0.$$
Under the assumption of Question (c), compute $c$ as a function of the model parameters and compute an asymptotic equivalent of $\psi(u)$.
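The decomposition in Part 1(b) can be illustrated by simulation, as in the sketch below (NumPy). The numerical values of λ1, λ2, α and of the Pareto shape a (taken with scale b = 1) are illustrative assumptions. The aggregate amount at time t is built once as the sum of two independent compound Poisson processes and once as a single compound Poisson process whose claims are Exp(α) with probability λ1/(λ1+λ2) and Pareto otherwise; the two constructions should match in distribution.

```python
import numpy as np

# Sketch for Exercise 6, Part 1(b): S = S¹ + S² is a compound Poisson process with
# intensity λ1 + λ2 and claim law the (λ1/λ, λ2/λ)-mixture of F_1 and F_2.
# Here F_1 = Exp(α) and F_2 is Pareto with tail (1+x)^(-a); all values illustrative.
rng = np.random.default_rng(9)
lam1, lam2, alpha, a, t, n = 2.0, 1.0, 1.5, 3.0, 5.0, 50_000
lam = lam1 + lam2

# construction 1: two independent compound Poisson processes
S_two = np.array([rng.exponential(1 / alpha, k).sum() + rng.pareto(a, m).sum()
                  for k, m in zip(rng.poisson(lam1 * t, n), rng.poisson(lam2 * t, n))])

# construction 2: a single compound Poisson process with mixture claims
S_one = np.empty(n)
for i, k in enumerate(rng.poisson(lam * t, n)):
    n_exp = (rng.random(k) < lam1 / lam).sum()       # claims coming from class 1
    S_one[i] = rng.exponential(1 / alpha, n_exp).sum() + rng.pareto(a, k - n_exp).sum()

print("two-process construction: mean / var =", S_two.mean(), S_two.var())
print("mixture construction:     mean / var =", S_one.mean(), S_one.var())
```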

Part 2. The insurer decides to mix the two groups, adding an insurance excess $a > 0$. This means that the insurer only pays for claims with a cost greater than a threshold $a > 0$, and for a claim with cost $Z > a$, the insurer only covers the amount $(Z - a)_+$. We consider the Cramér-Lundberg model
$$U^a_t = u + ct - S^a_t, \qquad \text{where } S^a_t = \sum_{i=1}^{N_t} Y^a_i \text{ and } Y^a_i = (Z_i - a)_+,$$
$N$ being a Poisson process with intensity $\lambda$.

(a) Compute $\mu_a = E[Y^a_1] = E[(Z - a)_+]$ when the claims have a cost $Z$ following an $\mathcal{E}(\alpha)$ distribution.
(b) Compute $M_{Y^a}$, the moment generating function of $Y^a_1$, and derive the moment generating function of $S^a_t$.
(c) Show that $M_{S^a_t}(u) = M_{\tilde{S}_t}(u)$, where
$$\tilde{S}_t = \sum_{i=1}^{\tilde{N}_t} Z_i,$$
$\tilde{N}$ being a Poisson process with intensity $\lambda \exp(-\alpha a)$ independent of the $Z_i$'s.
(d) Derive that the processes $S^a$ and $\tilde{S}$ have the same distribution.
(e) Derive that the risk process $U^a$ has the same distribution as $\tilde{U}$ defined by
$$\tilde{U}_t = u + ct - \tilde{S}_t, \qquad t \ge 0.$$
Show that $\psi^a(u) = P[\inf_{t \ge 0} U^a_t < 0] = P[\inf_{t \ge 0} \tilde{U}_t < 0]$ and compute an asymptotic equivalent for $\psi^a(u)$.
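The equivalence established in Part 2(c)-(d) states that, for Exp(α) claims, applying the excess a is the same in law as thinning the Poisson process by the factor e^{-αa} while keeping full exponential claims. The sketch below (NumPy; λ, α, a and t are illustrative values) builds the aggregate amount at time t both ways and compares the empirical mean, variance and probability of being exactly zero.

```python
import numpy as np

# Sketch for Exercise 6, Part 2: with Z ~ Exp(α), Σ_{i≤N_t} (Z_i - a)_+ (N with
# intensity λ) has the same law as Σ_{i≤Ñ_t} Z_i with Ñ of intensity λ e^(-αa).
# All numerical values below are illustrative.
rng = np.random.default_rng(10)
lam, alpha, a, t, n = 2.0, 1.0, 1.5, 10.0, 50_000
S_a = np.array([np.maximum(rng.exponential(1 / alpha, k) - a, 0.0).sum()
                for k in rng.poisson(lam * t, n)])                  # with the excess a
S_tilde = np.array([rng.exponential(1 / alpha, k).sum()
                    for k in rng.poisson(lam * np.exp(-alpha * a) * t, n)])   # thinned
for name, S in (("excess a ", S_a), ("thinned  ", S_tilde)):
    print(name, " mean:", round(S.mean(), 3), " var:", round(S.var(), 3),
          " P(S_t = 0):", round(float(np.mean(S == 0.0)), 4))
```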