Processus de Poisson et méthodes actuarielles (25-26)

Feuille d'exercices n° 1 : Poisson processes

Exercise 1. Let (τ_n, n ≥ 1) be a sequence of IID (independent and identically distributed) non-negative random variables. Set T_n = τ_1 + ... + τ_n, n ≥ 1 (T_0 = 0), and N_t = #{i ≥ 1 : T_i ≤ t}, t ≥ 0.
1. Give a necessary and sufficient condition for having P(N only makes jumps of size 1) > 0.
2. Show that under this condition P(N only makes jumps of size 1) = 1.
3. Is it possible that (T_n, n ≥ 1) converges to a finite limit with positive probability?
4. Compute the probability of the event {∃ t ≥ 0 : N(t) = ∞}.

Exercise 2. Let N be a Poisson process with intensity λ > 0.¹ Prove and give an interpretation of the following properties.
1. P(N_h = 1) = λh + o(h) (h → 0).
2. P(N_h ≥ 2) = o(h) (h → 0).
3. P(N_h = 0) = 1 − λh + o(h) (h → 0).
4. ∀ t ≥ 0, P(N jumps at time t) = 0.
5. Compute Cov(N_s, N_t), ∀ s, t ≥ 0.

Exercise 3. Let N be a counting process with stationary and independent increments. Assume that there exists λ > 0 such that
P(N_h = 1) = λh + o(h),  P(N_h ≥ 2) = o(h).
For u ∈ R, let g_t(u) = E[e^{iuN_t}].
1. Prove that g_{t+h}(u) = g_t(u) g_h(u) for every t, h ≥ 0.

¹ In English, "positive" means strictly positive; for positive in the wide sense one says "nonnegative". Likewise the terms "negative", "bigger", "smaller" are to be understood in the strict sense.
2. Prove that
d/dt g_t(u) = λ(e^{iu} − 1) g_t(u),  g_0(u) = 1.
3. Conclude.

Exercise 4. Let N be a Poisson process with intensity λ > 0, modelling the arrival times of the claims for an insurance company. Let T_1 denote the arrival time of the first claim. Show that the conditional law of T_1 given N_t = 1 is uniformly distributed over [0, t].

Exercise 5. Let (T_n, n ≥ 1) (T_0 = 0) be a renewal process and N its associated counting process. Assume that N has independent and stationary increments.
1. Show that
P(T_1 > s + t) = P(T_1 > t) P(T_1 > s),  ∀ s, t ≥ 0.
2. Derive that N is a Poisson process.

Exercise 6.
1. Show that two independent Poisson processes a.s. cannot jump simultaneously.
2. Let N^1 and N^2 be two independent Poisson processes with parameters λ_1 > 0 and λ_2 > 0 respectively. Show that the process N_t = N^1_t + N^2_t, t ≥ 0, is a Poisson process and give its intensity.
3. Derive that the sum of n independent Poisson processes with respective intensities λ_1 > 0, ..., λ_n > 0 is a Poisson process and give its intensity.

Exercise 7. Insects fall into a soup bowl according to a Poisson process N with intensity λ > 0 (the event {N_t = n} means that there are n insects in the bowl at time t). Assume that every insect is green with probability p ∈ (0, 1) and that its colour is independent of the colour of the other insects. Show that the number of green insects that fall into the bowl, as a function of time, is a Poisson process with intensity pλ.

Exercise 8. Liver transplants arrive at an operating block following a Poisson process N with intensity λ > 0. Two patients wait for a transplant. The first patient has lifetime T^1 (before the transplant) following an exponential distribution with parameter µ_1. The second one has lifetime T^2 (before the transplant) following an exponential distribution with parameter µ_2. The rule is that the first transplant to arrive at the hospital is given to the first patient if still alive, and to the second patient otherwise. Assume that T^1, T^2 and N are independent.
1. Compute the probability that the second patient is transplanted.
2. Compute the probability that the first patient is transplanted.
3. Let X denote the number of transplants arrived at the hospital during [0, T^1]. Compute the law of X.
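The infinitesimal description of a Poisson process (Exercise 2 above) can be checked numerically by building the process from i.i.d. exponential interarrival times. A minimal Monte Carlo sketch, with arbitrary parameter values:

```python
import random

random.seed(0)

def poisson_path(lam, horizon):
    """Jump times T_1 < T_2 < ... of a Poisson process with intensity lam, up to `horizon`."""
    times, t = [], 0.0
    while True:
        t += random.expovariate(lam)  # i.i.d. Exp(lam) interarrival times
        if t > horizon:
            return times
        times.append(t)

# Monte Carlo check of P(N_h = 1) = lam*h + o(h) for a small h
lam, h, runs = 2.0, 0.01, 200_000
ones = sum(1 for _ in range(runs) if len(poisson_path(lam, h)) == 1)
print(ones / runs, lam * h)  # the two values should be close for small h
```

The same construction, with a larger horizon, can be reused to experiment with the superposition and thinning statements of Exercises 6 and 7.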
Exercise 9. The bus paradox. Buses arrive at a given bus stop according to a Poisson process with intensity λ > 0. You arrive at the bus stop at time t.
1. Give a first guess for the value of the average waiting time before the following bus arrives.
2. Let A_t = T_{N_t+1} − t be the waiting time before the next bus, and let B_t = t − T_{N_t} denote the elapsed time since the last bus arrival. Compute the joint distribution of (A_t, B_t) (hint: compute first P(A_t > x_1, B_t > x_2) for x_1, x_2 ≥ 0).
3. Derive that the random variables A_t and B_t are independent. What are their distributions?
4. In particular, compute E[A_t]. Compare with your initial first guess.

Exercise 10. Law of large numbers and central limit theorem.
1. Recall and prove a law of large numbers for a Poisson process with intensity λ > 0.
2. Prove that N satisfies the following central limit theorem:
(N_t − λt) / √(λt) → N(0, 1) in law as t → ∞,
(a) by using characteristic functions;
(b) by showing first that (N_n − λn)/√n converges in distribution as n → ∞ and then that max_{t∈[n,n+1)} (N_t − N_n)/√n → 0 in probability.

Exercise 11.
1. Give an expression for the density function of the conditional distribution of (T_1, ..., T_n) given N_t = n, when N is a Poisson process with intensity λ and 0 < T_1 < ... < T_n < ... are its jump times.
2. Derive an expression for the density of T_i given N_t = n, ∀ 1 ≤ i ≤ n, and similarly for (T_i, T_j) given N_t = n, ∀ 1 ≤ i < j ≤ n.
3. Set U_{i,j} = T_j − T_i, 1 ≤ i < j ≤ n. Give an expression for the density of U_{i,j} given N_t = n. Derive an expression for the density of T_n − T_{n−1} given N_t = n.
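The conditional-density questions above can be probed by simulation: given N_t = n, the jump times should behave like the order statistics of n uniforms on [0, t], so in particular E[T_1 | N_t = n] = t/(n+1). A small rejection-sampling sketch, with arbitrary parameter values:

```python
import random

random.seed(1)

# Keep only paths with N_t = n and record T_1; its mean should be t/(n+1).
lam, t, n = 1.0, 5.0, 4
first_jumps = []
while len(first_jumps) < 20_000:
    times, s = [], 0.0
    while True:
        s += random.expovariate(lam)
        if s > t:
            break
        times.append(s)
    if len(times) == n:               # condition on N_t = n
        first_jumps.append(times[0])  # record T_1

mean_t1 = sum(first_jumps) / len(first_jumps)
print(mean_t1, t / (n + 1))  # should agree up to Monte Carlo error
```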
Exercise 12. Martingales. Let X = (X_t, t ≥ 0) be a continuous-time process and F = (F_t, t ≥ 0) a filtration, i.e. a nested family of sigma-fields F_s ⊂ F_t ⊂ A, ∀ s ≤ t, where A is the sigma-field of the probability space (Ω, A, P) over which X is defined. The process X is a martingale with respect to the filtration F if X_t is F_t-measurable and integrable ∀ t ≥ 0 and
E[X_t | F_s] = X_s,  ∀ 0 ≤ s ≤ t.
Let N = (N_t, t ≥ 0) be a Poisson process with intensity λ > 0. Show that the three processes
1. (N_t − λt, t ≥ 0);
2. ((N_t − λt)² − λt, t ≥ 0);
3. (exp(uN_t + λt(1 − e^u)), t ≥ 0) (for a given real number u);
are martingales with respect to the filtration generated by N, i.e. F^N_t = σ(N_s, s ≤ t).

Exercise 13. Let N be a Poisson process with intensity λ > 0 and let 0 < T_1 < ... < T_n < ... denote its jump times.
1. Show that T_n/n converges almost surely as n → ∞ and identify its limit.
2. Show that Σ_{i≥1} T_i^{−2} converges almost surely. Let X denote its limit.
3. Show that X_{N_t} = Σ_{i=1}^{N(t)} T_i^{−2} → X a.s. as t → ∞.
4. Let (U_i, i ≥ 1) denote a sequence of independent uniform random variables on [0, 1]. We admit the following result:
n^{−2} Σ_{i=1}^{n} U_i^{−2} → Z in law as n → ∞,
where Z is a positive random variable whose Laplace transform is given by E[exp(−sZ)] = exp(−c √s), ∀ s ≥ 0, for some c > 0. The goal is to show that X and c′ Z have the same law for some c′ that we will compute explicitly. We assume moreover that (U_i, i ≥ 1) is independent of N.
(a) Show that for every n ≥ 1 and every t > 0, the law of X_{N_t} given N_t = n is the same as the law of t^{−2} Σ_{i=1}^{n} U_i^{−2}.
(b) Derive that X_{N(t)} has the same distribution as t^{−2} Σ_{i=1}^{N(t)} U_i^{−2}.
(c) Prove that N(t)^{−2} Σ_{i=1}^{N(t)} U_i^{−2} → Z in law as t → ∞.
(d) Recall the law of large numbers for Poisson processes and conclude.
5. Derive that E[X] = ∞.
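A martingale started at 0 keeps expectation 0 at every fixed time, which gives a quick numerical sanity check for the first two compensated processes of the martingale exercise above. A Monte Carlo sketch, with arbitrary parameter values:

```python
import random

random.seed(2)

def draw_n_t(lam, t):
    """One draw of N_t: count Exp(lam) interarrivals falling in [0, t]."""
    count, s = 0, random.expovariate(lam)
    while s <= t:
        count += 1
        s += random.expovariate(lam)
    return count

lam, t, runs = 3.0, 2.0, 100_000
draws = [draw_n_t(lam, t) for _ in range(runs)]
m1 = sum(x - lam * t for x in draws) / runs                    # estimates E[N_t - lam*t]
m2 = sum((x - lam * t) ** 2 - lam * t for x in draws) / runs   # estimates E[(N_t - lam*t)^2 - lam*t]
print(m1, m2)  # both should be close to 0
```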
Processus de Poisson et méthodes actuarielles (25-26)

Feuille d'exercices n° 2 : Mixed Poisson process, total claim amount, renewal theory

Exercise 1. Let Ñ be a mixed Poisson process and denote by 0 < T̃_1 < ... < T̃_n < ... its jump times. Prove that the conditional distribution of (T̃_1, ..., T̃_n) given Ñ(t) = n (n ∈ N \ {0}) coincides with the distribution of the order statistics of n independent random variables with common uniform distribution on [0, t].

Exercise 2. A random variable X follows a negative binomial distribution on {0, 1, 2, ...} with parameters r > 0 and p ∈ (0, 1) if
P(X = k) = Γ(r + k) / (Γ(r) k!) · p^r (1 − p)^k,  ∀ k ≥ 0.
Let Ñ be a mixed Poisson process with mixing distribution Γ(α, β). What is the distribution of Ñ(t)? The process Ñ is called a negative binomial process. The negative binomial law is also called mixed Poisson or mixed Gamma-Poisson distribution.

Exercise 3. Let N = (N_t, t ≥ 0) be a standard Poisson process with intensity λ > 0. Let f : [0, ∞) → [0, ∞) be a locally bounded Borel function. Set
N(f)_t = Σ_{i≥1} f(T_i) 1_{T_i ≤ t}  for t ≥ 0,
where the (T_i)_{i≥1} are the jump times of N.
1. Show that for all t ≥ 0, we have N(f)_t < ∞ almost surely.
2. If f(s) = 1_{(a,b]}(s) where [a, b] ⊂ [0, t], what is the distribution of N(1_{(a,b]})_t?
3. Show that for u ≥ 0, we have
E[e^{−uN(f)_t} | N_t = n] = ( (1/t) ∫_0^t e^{−uf(s)} ds )^n.
4. Derive E[e^{−uN(f)_t}] and find back the result of Question 2.
5. Compute E[N(f)_t] and Var[N(f)_t].
6. Prove that N(f)_t − λ ∫_0^t f(s) ds is a martingale.
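The negative binomial process of Exercise 2 can be simulated directly from its definition: draw a Gamma intensity, then run a Poisson count at that intensity. If Λ ~ Gamma(shape r, rate b), the negative binomial moments E[Ñ(t)] = rt/b and Var[Ñ(t)] = rt/b + r(t/b)² should come out of the simulation. A sketch with arbitrary parameter values:

```python
import random

random.seed(3)

def draw_n_t(lam, t):
    """One draw of N_t for a Poisson process with intensity lam."""
    count, s = 0, random.expovariate(lam)
    while s <= t:
        count += 1
        s += random.expovariate(lam)
    return count

r, b, t, runs = 2.0, 1.5, 3.0, 50_000
# random.gammavariate takes (shape, scale); rate b means scale 1/b.
draws = [draw_n_t(random.gammavariate(r, 1.0 / b), t) for _ in range(runs)]
mean = sum(draws) / runs
var = sum((x - mean) ** 2 for x in draws) / runs
print(mean, r * t / b)                     # negative binomial mean
print(var, r * t / b + r * (t / b) ** 2)   # negative binomial variance
```

Note the over-dispersion: the variance exceeds the mean, unlike a plain Poisson count.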
Exercise 4. The total claim amount of a portfolio for a year is modelled by
X = Σ_{j=1}^{N} C_j,
where N is the number of claims in the year and C_j is the cost of the j-th claim. Assume that N follows a mixed Poisson distribution with random parameter Λ, i.e. the conditional distribution of N given Λ = λ is Poisson(λ). Assume moreover that Λ is distributed according to a Γ(b, b) distribution, for some b > 0. Assume that the costs of the claims (C_j)_{j≥1} are independent and identically distributed random variables, independent of N.
1. Compute E(Λ) and Var(Λ).
2. Compute E(N) and Var(N).
3. We assume that C_1 ~ Exponential(γ) for some γ > 0. Show that the conditional law of X given N is a Gamma distribution and identify its parameters. What is the pure premium?
4. Show that the conditional law of Λ given (X, N) is independent of X and that it is a Gamma distribution. Identify its parameters.

Exercise 5. Let (ξ_i, i ≥ 1) be a sequence of i.i.d. real-valued random variables with second-order moments, independent of the Poisson process N = (N_t, t ≥ 0) with intensity λ > 0. For t ≥ 0, we set
X_t = Σ_{i=1}^{N_t} ξ_i
(putting X_t = 0 on {N_t = 0}).
1. Show that t^{−1} X_t converges almost surely as t → ∞ and identify its limit.
We set µ = E[ξ_1], σ² = Var(ξ_1) and, for t ≥ 0,
M_t = (X_t − µ N_t) / √(N_t)
(putting M_t = 0 on {N_t = 0}).
3. Compute E[e^{iuM_t}] and derive that M_t converges in distribution as t → ∞. Identify its limit.
5. Show that √t (X_t/t − µ N_t/t) converges in distribution as t → ∞ and identify its limit.

Exercise 6. Let (T_n)_{n≥1} be a renewal process and let N_t denote its counting function. We assume that the common law of the interarrival times admits a density function f on R (with value 0 on (−∞, 0]). For n ≥ 1, we denote by f_n the density function of T_n and by F_n its cumulative distribution function.
1. Check that f_1(t) = f(t) and show that
f_{n+1}(t) = ∫_0^∞ f_n(t − s) f(s) ds.
Define r(t) = E[N_t] if t ≥ 0 and r(t) = 0 for t < 0.
2. Show that r(t) = Σ_{n≥1} F_n(t) and that for n ≥ 2, we have
P(T_n ≤ t | T_1) = F_{n−1}(t − T_1).
3. Show that r satisfies the renewal equation
r(t) = F(t) + ∫_0^t r(t − s) f(s) ds,  t ≥ 0,
where F denotes the cumulative distribution function of the common interarrival times.
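The renewal equation above can also be solved numerically by discretizing the convolution on a grid. A minimal sketch using a rectangle rule, with Exp(λ) interarrivals chosen so that the exact answer r(t) = λt is available for comparison:

```python
import math

# Discretize r(t) = F(t) + \int_0^t r(t-s) f(s) ds with a right-endpoint
# rectangle rule on a grid of step h = T/n.
lam, T, n = 2.0, 3.0, 1500
h = T / n
F = lambda x: 1.0 - math.exp(-lam * x)   # interarrival cdf
f = lambda x: lam * math.exp(-lam * x)   # interarrival density

r = [0.0] * (n + 1)                      # r[k] approximates r(k*h)
for k in range(1, n + 1):
    conv = h * sum(r[k - j] * f(j * h) for j in range(1, k + 1))
    r[k] = F(k * h) + conv

print(r[n], lam * T)  # numerical value vs the exact r(T) = lam*T
```

The same loop works for any interarrival density, which is the point: most renewal functions have no closed form.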
Processus de Poisson et méthodes actuarielles (25-26)

Feuille d'exercices n° 3 : Compound Poisson processes, renewal processes

Exercise 1. Give a necessary and sufficient condition for a compound Poisson process to be a standard Poisson process.

Exercise 2. Find the renewal function and the renewal measure of a renewal process when the interarrival times are distributed according to a Gamma distribution with parameters (2, λ), λ > 0. Find back the renewal theorem and the key renewal theorem in that case.

Exercise 3. An appliance has repeated failures in time and is repaired after each failure. Denote by (X_i, i ≥ 1) the successive durations when the appliance is functioning, and by (Y_i, i ≥ 1) the successive durations when the appliance is being repaired. In other words, the appliance is in order in the time interval [0, X_1), it is being repaired in the time interval [X_1, X_1 + Y_1), then it is in order again in the time interval [X_1 + Y_1, X_1 + Y_1 + X_2), and so on. Assume that the X_i's are i.i.d. and that their common law has no atom and satisfies P(X_1 > 0) = 1 and E[X_1] < ∞. Assume that the Y_i's have the same properties and that they are moreover independent of the X_i's. Denote by p(t) the probability that the appliance is functioning at time t.
1. Let Z = X_1 + Y_1 and let F be the cumulative distribution function of Z. Show that p(t) satisfies the renewal equation
p(t) = P(X_1 > t) + ∫_0^t p(t − s) dF(s),  ∀ t ≥ 0.
2. Derive that
lim_{t→∞} p(t) = E[X_1] / E[X_1 + Y_1].

Exercise 4. Let N be a renewal process and let F(t) = T_{N(t)+1} − t denote the time elapsed between t and the time of the (N(t)+1)-th renewal, for t ≥ 0. We denote by P_T the distribution of the interarrival times and by F_T its cumulative distribution function.
1. Show that for every x ≥ 0, the function P(F(t) > x) satisfies the renewal equation
P(F(t) > x) = 1 − F_T(t + x) + ∫_0^t P(F(t − u) > x) dP_T(u),  t ≥ 0.
(Hint: start by decomposing P(F(t) > x) in two terms, according to the event {T_1 > t} or {T_1 ≤ t}.)
2. Solve this equation when the interarrival times follow an exponential distribution.

Exercise 5. In the same setting as Exercise 4, we consider a renewal process with interarrival times distributed according to a Pareto distribution:
P(τ_1 > x) = (1 + x)^{−α},  x ≥ 0.
1. Let X be a nonnegative random variable. Show that for all r > 0,
∫_0^∞ r x^{r−1} P(X > x) dx = E[X^r].
2. Use the previous identity and the renewal equation satisfied by F(t) to show that
E[F(t)²] = ∫_0^t ( ∫_0^∞ 2x (1 + t − u + x)^{−α} dx ) dm(u),
where m(t) = E[N(t)] is the renewal measure of N.
3. Derive that for α > 3, we have
E[F(t)²] → 2 ∫_0^∞ x (1 + x)^{1−α} dx  as t → ∞,
and compute this limit explicitly.
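The limit p(t) → E[X_1]/E[X_1 + Y_1] from the appliance exercise above lends itself to a direct simulation: alternate up-times and repair times and check the state at a large time t. A sketch with arbitrary exponential choices (any i.i.d. laws with finite means would do):

```python
import random

random.seed(4)

# Up-times X_i ~ Exp(1) (mean 1), repair times Y_i ~ Exp(2) (mean 1/2):
# the long-run availability should be E[X_1]/E[X_1 + Y_1] = 1/(1 + 1/2) = 2/3.
def functioning_at(t):
    s = 0.0
    while True:
        s += random.expovariate(1.0)   # end of current up-time
        if s > t:
            return True                # t fell in an up period
        s += random.expovariate(2.0)   # end of current repair
        if s > t:
            return False               # t fell in a repair period

t, runs = 50.0, 20_000
p_hat = sum(functioning_at(t) for _ in range(runs)) / runs
print(p_hat, 2 / 3)
```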
Processus de Poisson et méthodes actuarielles (25-26)

Feuille d'exercices n° 4 : Ruin theory

In all exercises that use the Cramér-Lundberg model, we denote by c > 0 the premium rate, by λ > 0 the intensity of the Poisson process that models the number of claims, and by u ≥ 0 the initial wealth of the insurer.

Exercise 1.
1. Show that the following distributions are thin-tailed:
(a) the distribution of a nonnegative bounded random variable;
(b) the Gamma distribution;
(c) the Weibull distribution with parameters C > 0, τ ≥ 1. (The density function of a Weibull distribution with parameters C, τ is f(x) = Cτ x^{τ−1} exp(−C x^τ) 1_{x>0}.)
2. Show that the following distributions are sub-exponential:
(a) the Pareto distribution with parameters α > 0, β > 0 (f(x) = α β^α / (β + x)^{α+1}, x > 0);
(b) the Weibull distribution with parameters C > 0, 0 < τ < 1.

Exercise 2. The parameters c > 0, λ > 0 and γ > 0 are fixed throughout. For every integer k ∈ N, we consider the Cramér-Lundberg model where the costs of the claims are distributed according to a Γ(k, γ) distribution. Write ψ^{(k)}(u) for the ruin probability of this model. Show that for every u > 0 and every k ∈ N,
ψ^{(k)}(u) ≤ ψ^{(k+1)}(u).

Exercise 3. We consider the Cramér-Lundberg model where the costs of the claims follow an exponential distribution with parameter γ > 0. The safety loading is positive. We wish to give an explicit formula for the ruin probability ψ(u).
1. Show that the exponential distribution is thin-tailed and compute the corresponding adjustment coefficient R.
2. Derive a good upper bound for the ruin probability thanks to the Lundberg inequality.
3. Write the renewal equation satisfied by u ↦ e^{Ru} ψ(u).
4. Using the renewal theorem, solve the equation and compute ψ(u) as a function of λ, γ, c and u.
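For exponential claims the ruin probability has a classical closed form, ψ(u) = (λ/(cγ)) e^{−(γ − λ/c)u} when the safety loading is positive, which is the kind of formula Question 4 should produce. Taking that formula as given, a Monte Carlo sketch that checks the surplus at claim instants (parameter values arbitrary):

```python
import math
import random

random.seed(5)

def ruined(u, lam, gamma, c, max_claims=300):
    """One surplus path, checked at claim instants only (where ruin can occur)."""
    level = u
    for _ in range(max_claims):
        level += c * random.expovariate(lam)   # premium income until the next claim
        level -= random.expovariate(gamma)     # claim cost
        if level < 0:
            return True
    return False   # with positive loading, ruin after many claims is negligible

u, lam, gamma, c, runs = 2.0, 1.0, 1.0, 1.5, 20_000
psi_hat = sum(ruined(u, lam, gamma, c) for _ in range(runs)) / runs
psi = (lam / (c * gamma)) * math.exp(-(gamma - lam / c) * u)   # classical closed form
print(psi_hat, psi)
```

Truncating at `max_claims` undercounts ruin only by a vanishing amount here, since the surplus drifts upward by c/λ − 1/γ per claim on average.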
Exercise 4. We consider the setting of the Cramér-Lundberg model, where the costs X_i, i ≥ 1, follow a Pareto distribution with index α > 1, β = 1, i.e.
F̄_{X_1}(x) = 1 − F_{X_1}(x) = (1 + x)^{−α},  x ≥ 0.
1. Compute µ = E[X_1] and the associated safety loading ρ. For which values of c do we have ρ > 0?
2. Show that ∫_0^∞ e^{ux} F_{X_1,I}(dx) = ∞ for every u > 0. Derive that F_{X_1,I} is not thin-tailed.
3. Show that F_{X_1,I} is subexponential. What can we say about the ruin probability ψ(u) as u → ∞?

Exercise 5. We work in the Cramér-Lundberg setting.
Part A. The r.v. X_i, i ≥ 1, that model the claim costs have density
f(x) = 1/(2√x) · e^{−√x} 1_{x>0}.
1. Compute µ = E[X_1] and F̄_{X_1}(x) = 1 − F_{X_1}(x), x ≥ 0.
2. For every x ≥ 0, set F_{X_1,I}(x) = µ^{−1} ∫_0^x F̄_{X_1}(y) dy and q(x) = F̄_{X_1}(x) / (µ F̄_{X_1,I}(x)), where F̄_{X_1,I} = 1 − F_{X_1,I}.
(a) Show that
∫_x^∞ e^{−√y} dy = 2 e^{−√x} (√x + 1),  ∀ x ≥ 0,
and derive a simple expression for q(x).
(b) Derive that F_{X_1,I} is the cumulative distribution function of a subexponential distribution.
3. Give an equivalent of the ruin probability ψ(u) as u → ∞. Express this equivalent as a function of f and the parameters c, λ.

Part B. We now assume that the X_i, i ≥ 1, have density
g(x) = 2x e^{−x²} 1_{x>0}.
1. Show that µ = √π / 2.
2. Show that X_1 is thin-tailed.
3. Prove the existence of the adjustment coefficient R.
4. Express the integral ∫_0^∞ y e^{Ry − y²} dy as a function of c, λ and R. Derive an expression for ∫_0^∞ e^{Ry − y²} dy as a function of c, λ and R.
5. Compute dF_{X_1,I} and give the renewal equation satisfied by the function u ↦ e^{Ru} ψ(u), and check that the required conditions are satisfied here.
6. Give the asymptotic behaviour of the ruin probability ψ(u) as u → ∞, as a function of c, λ, R and µ.

Exercise 6.
1. Part 1. An insurer has a risky portfolio with risks which are partitioned into two classes: the big claims, denoted by X^1_i, i ≥ 1, and the small claims, denoted by X^2_i, i ≥ 1. It is moreover assumed that the two kinds of risks are independent. The total claim amount of the insurer at time t is denoted by S_t = S^1_t + S^2_t, where S^1_t = Σ_{i=1}^{N^1_t} X^1_i is the total claim amount of the first kind (big claims) and S^2_t = Σ_{i=1}^{N^2_t} X^2_i is the total claim amount of the second kind (small claims). The processes (N^i)_{i=1,2} are independent Poisson processes with intensities λ_i, and they are independent of the different costs X^1_i, X^2_i, i ≥ 1. We assume that the (X^1_i, i ≥ 1) are i.i.d. with distribution F_1 and that the (X^2_i, i ≥ 1) are i.i.d. with distribution F_2.
(a) Compute the moment generating function M_{S^1_t} of S^1_t, the moment generating function of S^2_t, and derive the moment generating function of S_t.
(b) Check that S is a compound Poisson process that can be written in the form
S_t = Σ_{i=1}^{N_t} Y_i,  t ≥ 0,
where N is a Poisson process with intensity λ = λ_1 + λ_2 and the Y_i, i ≥ 1, are i.i.d. with distribution F being a mixture of F_1 and F_2. Compute the mixture coefficients explicitly.
(c) We now assume that F_1 = E(γ) is the exponential distribution with parameter γ > 0 and F_2 = Par(α, β) is the Pareto distribution with parameters α, β, with α > 1. Compute in that case the density function f_{Y_1,I}(y), the function F̄_{Y_1,I}(y), the expectation E[Y_1] and the coefficient q(y) = f_{Y_1,I}(y) / F̄_{Y_1,I}(y).
(d) Consider the Cramér-Lundberg model U_t = u + ct − S_t, t ≥ 0, where u ≥ 0 is the initial wealth of the company. We assume that the safety loading coefficient ρ is the same for each class and we take as premium rate c := (1 + ρ) λ E[Y_1], with ρ > 0. Under the assumptions of Question (c), compute c as a function of the model parameters and compute an asymptotic equivalent of ψ(u).
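Question 1(b) above can be illustrated by simulation: the sum of two independent compound Poisson processes should match, in law, a single compound Poisson process with intensity λ_1 + λ_2 whose jumps mix F_1 and F_2 with weights λ_i/(λ_1 + λ_2). A sketch comparing the two constructions through their means, with Exp(0.5) and Exp(5.0) as arbitrary stand-ins for the big and small claim laws:

```python
import random

random.seed(6)

lam1, lam2, t, runs = 1.0, 3.0, 2.0, 30_000

def compound(lam, draw):
    """One draw of a compound Poisson value at time t."""
    total, s = 0.0, random.expovariate(lam)
    while s <= t:
        total += draw()
        s += random.expovariate(lam)
    return total

def two_parts():
    # S_t = S^1_t + S^2_t with independent parts
    return (compound(lam1, lambda: random.expovariate(0.5))
            + compound(lam2, lambda: random.expovariate(5.0)))

def mixture():
    # single process, jump law mixing F_1 and F_2 with weights lam_i/(lam1+lam2)
    def draw():
        if random.random() < lam1 / (lam1 + lam2):
            return random.expovariate(0.5)   # big claim from F_1
        return random.expovariate(5.0)       # small claim from F_2
    return compound(lam1 + lam2, draw)

m_two = sum(two_parts() for _ in range(runs)) / runs
m_mix = sum(mixture() for _ in range(runs)) / runs
print(m_two, m_mix)  # both estimate t*(lam1*2 + lam2*0.2) = 5.2
```

Matching means alone does not prove equality in law, of course; the moment generating functions of Question 1(a) do.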
2. Part 2. The insurer decides to mix the two groups by adding an insurance excess a > 0. This means that the insurer only pays for claims with a cost greater than a threshold a > 0, and for a claim with cost Z > a, the insurer only covers the amount Z − a. We consider the Cramér-Lundberg model
U_t = u + ct − S_t,  where S_t = Σ_{i=1}^{N_t} Y^a_i and Y^a_i = (Z_i − a)^+,
N being a Poisson process with intensity λ.
(a) Compute µ = E[Y^a_1] = E[(Z_1 − a)^+] when the claims have a cost Z_1 following an E(γ) distribution.
(b) Compute M_{Y^a_1}, the moment generating function of Y^a_1, and derive the moment generating function of S_t.
(c) Show that M_{S_t}(u) = M_{S̃_t}(u), where
S̃_t = Σ_{i=1}^{Ñ_t} Z_i,
Ñ being a Poisson process with intensity λ exp(−γa), independent of the Z_i's.
(d) Derive that the processes S and S̃ have the same distribution.
(e) Derive that the risk process U has the same distribution as Ũ defined as
Ũ_t = u + ct − S̃_t,  t ≥ 0.
Show that ψ(u) = P[inf_{t≥0} U_t < 0] = P[inf_{t≥0} Ũ_t < 0] and compute an asymptotic equivalent for ψ(u).
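The thinning identity of Questions (c) and (d) can be checked numerically: paying (Z − a)^+ on a Poisson(λ) claim stream with Z ~ Exp(γ) should match paying full Exp(γ) costs on a thinned stream with intensity λ e^{−γa}. A sketch comparing the two representations through their means, with arbitrary parameter values:

```python
import math
import random

random.seed(7)

lam, gamma, a, t, runs = 2.0, 1.0, 1.0, 3.0, 30_000

def compound(intensity, draw):
    """One draw of a compound Poisson value at time t."""
    total, s = 0.0, random.expovariate(intensity)
    while s <= t:
        total += draw()
        s += random.expovariate(intensity)
    return total

excess = lambda: max(random.expovariate(gamma) - a, 0.0)   # (Z - a)^+
full = lambda: random.expovariate(gamma)                   # Z

m_excess = sum(compound(lam, excess) for _ in range(runs)) / runs
m_thinned = sum(compound(lam * math.exp(-gamma * a), full) for _ in range(runs)) / runs
print(m_excess, m_thinned)  # both estimate lam*t*exp(-gamma*a)/gamma
```

The equality works because (Z − a)^+ for exponential Z is an atom at 0 with probability 1 − e^{−γa} and an Exp(γ) variable otherwise, and zero-size jumps can be absorbed into a thinner Poisson stream.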
More informationPart IA Probability. Definitions. Based on lectures by R. Weber Notes taken by Dexter Chua. Lent 2015
Part IA Probability Definitions Based on lectures by R. Weber Notes taken by Dexter Chua Lent 2015 These notes are not endorsed by the lecturers, and I have modified them (often significantly) after lectures.
More informationRenewal theory and its applications
Renewal theory and its applications Stella Kapodistria and Jacques Resing September 11th, 212 ISP Definition of a Renewal process Renewal theory and its applications If we substitute the Exponentially
More informationNotes for Math 324, Part 19
48 Notes for Math 324, Part 9 Chapter 9 Multivariate distributions, covariance Often, we need to consider several random variables at the same time. We have a sample space S and r.v. s X, Y,..., which
More informationActuarial Science Exam 1/P
Actuarial Science Exam /P Ville A. Satopää December 5, 2009 Contents Review of Algebra and Calculus 2 2 Basic Probability Concepts 3 3 Conditional Probability and Independence 4 4 Combinatorial Principles,
More informationA Ruin Model with Compound Poisson Income and Dependence Between Claim Sizes and Claim Intervals
Acta Mathematicae Applicatae Sinica, English Series Vol. 3, No. 2 (25) 445 452 DOI:.7/s255-5-478- http://www.applmath.com.cn & www.springerlink.com Acta Mathema cae Applicatae Sinica, English Series The
More informationSTAT 414: Introduction to Probability Theory
STAT 414: Introduction to Probability Theory Spring 2016; Homework Assignments Latest updated on April 29, 2016 HW1 (Due on Jan. 21) Chapter 1 Problems 1, 8, 9, 10, 11, 18, 19, 26, 28, 30 Theoretical Exercises
More informationRuin Theory. A thesis submitted in partial fulfillment of the requirements for the degree of Master of Science at George Mason University
Ruin Theory A thesis submitted in partial fulfillment of the requirements for the degree of Master of Science at George Mason University by Ashley Fehr Bachelor of Science West Virginia University, Spring
More informationASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata
ASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA (krzysio@krzysio.net) Errata Effective July 5, 3, only the latest edition of this manual will have its errata
More informationContinuous distributions
CHAPTER 7 Continuous distributions 7.. Introduction A r.v. X is said to have a continuous distribution if there exists a nonnegative function f such that P(a X b) = ˆ b a f(x)dx for every a and b. distribution.)
More informationAnalysis of the ruin probability using Laplace transforms and Karamata Tauberian theorems
Analysis of the ruin probability using Laplace transforms and Karamata Tauberian theorems Corina Constantinescu and Enrique Thomann Abstract The classical result of Cramer-Lundberg states that if the rate
More information. Find E(V ) and var(v ).
Math 6382/6383: Probability Models and Mathematical Statistics Sample Preliminary Exam Questions 1. A person tosses a fair coin until she obtains 2 heads in a row. She then tosses a fair die the same number
More informationMath 576: Quantitative Risk Management
Math 576: Quantitative Risk Management Haijun Li lih@math.wsu.edu Department of Mathematics Washington State University Week 11 Haijun Li Math 576: Quantitative Risk Management Week 11 1 / 21 Outline 1
More informationMultivariate Risk Processes with Interacting Intensities
Multivariate Risk Processes with Interacting Intensities Nicole Bäuerle (joint work with Rudolf Grübel) Luminy, April 2010 Outline Multivariate pure birth processes Multivariate Risk Processes Fluid Limits
More informationAsymptotics of random sums of heavy-tailed negatively dependent random variables with applications
Asymptotics of random sums of heavy-tailed negatively dependent random variables with applications Remigijus Leipus (with Yang Yang, Yuebao Wang, Jonas Šiaulys) CIRM, Luminy, April 26-30, 2010 1. Preliminaries
More informationµ X (A) = P ( X 1 (A) )
1 STOCHASTIC PROCESSES This appendix provides a very basic introduction to the language of probability theory and stochastic processes. We assume the reader is familiar with the general measure and integration
More informationChapter 5. Statistical Models in Simulations 5.1. Prof. Dr. Mesut Güneş Ch. 5 Statistical Models in Simulations
Chapter 5 Statistical Models in Simulations 5.1 Contents Basic Probability Theory Concepts Discrete Distributions Continuous Distributions Poisson Process Empirical Distributions Useful Statistical Models
More informationContinuous Random Variables
Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability. Often, there is interest in random variables
More informationBrownian Motion. 1 Definition Brownian Motion Wiener measure... 3
Brownian Motion Contents 1 Definition 2 1.1 Brownian Motion................................. 2 1.2 Wiener measure.................................. 3 2 Construction 4 2.1 Gaussian process.................................
More informationIntroduction to Rare Event Simulation
Introduction to Rare Event Simulation Brown University: Summer School on Rare Event Simulation Jose Blanchet Columbia University. Department of Statistics, Department of IEOR. Blanchet (Columbia) 1 / 31
More informationSTAT 380 Markov Chains
STAT 380 Markov Chains Richard Lockhart Simon Fraser University Spring 2016 Richard Lockhart (Simon Fraser University) STAT 380 Markov Chains Spring 2016 1 / 38 1/41 PoissonProcesses.pdf (#2) Poisson Processes
More informationIt can be shown that if X 1 ;X 2 ;:::;X n are independent r.v. s with
Example: Alternative calculation of mean and variance of binomial distribution A r.v. X has the Bernoulli distribution if it takes the values 1 ( success ) or 0 ( failure ) with probabilities p and (1
More informationSolution: The process is a compound Poisson Process with E[N (t)] = λt/p by Wald's equation.
Solutions Stochastic Processes and Simulation II, May 18, 217 Problem 1: Poisson Processes Let {N(t), t } be a homogeneous Poisson Process on (, ) with rate λ. Let {S i, i = 1, 2, } be the points of the
More informationModèles de dépendance entre temps inter-sinistres et montants de sinistre en théorie de la ruine
Séminaire de Statistiques de l'irma Modèles de dépendance entre temps inter-sinistres et montants de sinistre en théorie de la ruine Romain Biard LMB, Université de Franche-Comté en collaboration avec
More informationFinite-time Ruin Probability of Renewal Model with Risky Investment and Subexponential Claims
Proceedings of the World Congress on Engineering 29 Vol II WCE 29, July 1-3, 29, London, U.K. Finite-time Ruin Probability of Renewal Model with Risky Investment and Subexponential Claims Tao Jiang Abstract
More informationp. 6-1 Continuous Random Variables p. 6-2
Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability (>). Often, there is interest in random variables
More informationMultiple Random Variables
Multiple Random Variables Joint Probability Density Let X and Y be two random variables. Their joint distribution function is F ( XY x, y) P X x Y y. F XY ( ) 1, < x
More informationRandom variables and transform methods
Chapter Random variables and transform methods. Discrete random variables Suppose X is a random variable whose range is {,,..., } and set p k = P (X = k) for k =,,..., so that its mean and variance are
More informationContinuous-time Markov Chains
Continuous-time Markov Chains Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ October 23, 2017
More informationSPRING 2005 EXAM M SOLUTIONS. = When (as given in the problem), (x) dies in the second year from issue, the curtate future lifetime x + 1 is 0, so
SPRING 005 EXAM M SOLUTIONS Question # Key: B Let K be the curtate future lifetime of (x + k) K + k L 000v 000Px :3 ak + When (as given in the problem), (x) dies in the second year from issue, the curtate
More informationSTAT331 Lebesgue-Stieltjes Integrals, Martingales, Counting Processes
STAT331 Lebesgue-Stieltjes Integrals, Martingales, Counting Processes This section introduces Lebesgue-Stieltjes integrals, and defines two important stochastic processes: a martingale process and a counting
More informationLecture 4a: Continuous-Time Markov Chain Models
Lecture 4a: Continuous-Time Markov Chain Models Continuous-time Markov chains are stochastic processes whose time is continuous, t [0, ), but the random variables are discrete. Prominent examples of continuous-time
More informationSTAT Chapter 5 Continuous Distributions
STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range
More informationRecap. Probability, stochastic processes, Markov chains. ELEC-C7210 Modeling and analysis of communication networks
Recap Probability, stochastic processes, Markov chains ELEC-C7210 Modeling and analysis of communication networks 1 Recap: Probability theory important distributions Discrete distributions Geometric distribution
More informationBrief Review of Probability
Maura Department of Economics and Finance Università Tor Vergata Outline 1 Distribution Functions Quantiles and Modes of a Distribution 2 Example 3 Example 4 Distributions Outline Distribution Functions
More informationMAS223 Statistical Inference and Modelling Exercises
MAS223 Statistical Inference and Modelling Exercises The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions,
More informationASM Study Manual for Exam P, Second Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata
ASM Study Manual for Exam P, Second Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA (krzysio@krzysio.net) Errata Effective July 5, 3, only the latest edition of this manual will have its errata
More informationChapter 4 Parametric Families of Lifetime Distributions
Chapter 4 Parametric Families of Lifetime istributions In this chapter, we present four parametric families of lifetime distributions Weibull distribution, gamma distribution, change-point model, and mixture
More informationExtremes and ruin of Gaussian processes
International Conference on Mathematical and Statistical Modeling in Honor of Enrique Castillo. June 28-30, 2006 Extremes and ruin of Gaussian processes Jürg Hüsler Department of Math. Statistics, University
More informationRandom Variables (Continuous Case)
Chapter 6 Random Variables (Continuous Case) Thus far, we have purposely limited our consideration to random variables whose ranges are countable, or discrete. The reason for that is that distributions
More informationChapter 1. Poisson processes. 1.1 Definitions
Chapter 1 Poisson processes 1.1 Definitions Let (, F, P) be a probability space. A filtration is a collection of -fields F t contained in F such that F s F t whenever s
More informationRare-Event Simulation
Rare-Event Simulation Background: Read Chapter 6 of text. 1 Why is Rare-Event Simulation Challenging? Consider the problem of computing α = P(A) when P(A) is small (i.e. rare ). The crude Monte Carlo estimator
More informationTHE QUEEN S UNIVERSITY OF BELFAST
THE QUEEN S UNIVERSITY OF BELFAST 0SOR20 Level 2 Examination Statistics and Operational Research 20 Probability and Distribution Theory Wednesday 4 August 2002 2.30 pm 5.30 pm Examiners { Professor R M
More informationRuin probabilities in multivariate risk models with periodic common shock
Ruin probabilities in multivariate risk models with periodic common shock January 31, 2014 3 rd Workshop on Insurance Mathematics Universite Laval, Quebec, January 31, 2014 Ruin probabilities in multivariate
More informationA Note On The Erlang(λ, n) Risk Process
A Note On The Erlangλ, n) Risk Process Michael Bamberger and Mario V. Wüthrich Version from February 4, 2009 Abstract We consider the Erlangλ, n) risk process with i.i.d. exponentially distributed claims
More informationThe story of the film so far... Mathematics for Informatics 4a. Continuous-time Markov processes. Counting processes
The story of the film so far... Mathematics for Informatics 4a José Figueroa-O Farrill Lecture 19 28 March 2012 We have been studying stochastic processes; i.e., systems whose time evolution has an element
More informationBMIR Lecture Series on Probability and Statistics Fall 2015 Discrete RVs
Lecture #7 BMIR Lecture Series on Probability and Statistics Fall 2015 Department of Biomedical Engineering and Environmental Sciences National Tsing Hua University 7.1 Function of Single Variable Theorem
More information