Thinning of Renewal Process

By Nan-Cheng Su

Department of Applied Mathematics, National Sun Yat-sen University, Kaohsiung, Taiwan, 804, R.O.C.

June 2001

Contents

Abstract
1. Introduction
2. Preliminaries
3. Multinomial Thinning
4. Generalization of Multinomial Thinning
References

Abstract

In this thesis we investigate thinning of the renewal process. After multinomial thinning of a renewal process A, we obtain the k thinned processes A_i, i = 1, ..., k. Based on some characterizations of the Poisson process as a renewal process, we give further characterizations of the Poisson process in terms of relations among the expectations, variances, covariances and residual lives of the k thinned processes. Secondly, we allow the number of arrivals at each arrival time to be i.i.d. random variables, and the mass of each unit atom to be split into k new atoms, the ith new atom being assigned to the process D_i, i = 1, ..., k. We again obtain characterizations of the Poisson process from relations between the expectations and variances of the processes D_i, i = 1, ..., k.

Key words: characterization, exponential distribution, Poisson distribution, Poisson process, renewal process, residual life, thinning process

1 Introduction

In this thesis we investigate thinning of the renewal process. Let {X_n, n ≥ 1} be a sequence of independent and identically distributed (i.i.d.) positive random variables with common distribution function F. F is called the life distribution and we assume F(0) = 0 throughout this thesis. Let S_0 = 0 and S_n = Σ_{i=1}^{n} X_i, n ≥ 1. Also let A(t) = sup{n : S_n ≤ t}, t ≥ 0. Then A ≡ {A(t), t ≥ 0} is called a renewal process, X_n is called the n-th inter-arrival time of A, and the sequence {S_n, n ≥ 1} is called the sequence of arrival times of A. The random variables δ_t = t - S_{A(t)}, γ_t = S_{A(t)+1} - t and β_t = γ_t + δ_t will be called, respectively, the current life at t, the residual life at t and the total life at t of the renewal process A.

In the above set-up, if the X_n, n ≥ 1, are still assumed to be independent, but only X_2, X_3, ... are identically distributed with common distribution function F, while X_1 has a possibly different distribution function, say H with H(0) = 0, then {X_n, n ≥ 1} form the inter-arrival times of a delayed renewal process, say A_d = {A_d(t), t ≥ 0}. In particular, if

    H(t) = (1/E(X_2)) ∫_0^t (1 - F(u)) du

holds, then the process becomes a stationary renewal process, say A_s = {A_s(t), t ≥ 0}. When F is exponentially distributed, A becomes a Poisson process.
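To make these definitions concrete, here is a small simulation sketch (added for illustration and not part of the original text; the gamma interarrival law, the horizon t and the helper names are arbitrary choices). It generates one path of a renewal process and reads off A(t), the current life δ_t and the residual life γ_t.

```python
import numpy as np

rng = np.random.default_rng(0)

def renewal_arrivals(t_max, draw_gap, rng):
    """Arrival times S_1 < S_2 < ... up to t_max, plus the first arrival beyond t_max."""
    arrivals, s = [], 0.0
    while True:
        s += draw_gap(rng)                 # next inter-arrival time X_n
        if s > t_max:
            return np.array(arrivals), s
        arrivals.append(s)

draw_gap = lambda r: r.gamma(2.0, 1.0)     # an arbitrary life distribution F with F(0) = 0
t = 10.0
S, first_after = renewal_arrivals(t, draw_gap, rng)

A_t = len(S)                               # A(t) = sup{n : S_n <= t}
S_At = S[-1] if A_t > 0 else 0.0           # S_{A(t)}, with S_0 = 0
delta_t = t - S_At                         # current life at t
gamma_t = first_after - t                  # residual life at t
beta_t = delta_t + gamma_t                 # total life at t
print(A_t, delta_t, gamma_t, beta_t)
```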

Poisson processes are related to many distributions. The most important ones are the uniform, exponential and Poisson distributions. Many interesting properties of the Poisson process are more or less based on the characteristics of these three distributions. For example, it is known that given A(t) = n, S_1 ≤ S_2 ≤ ... ≤ S_n are distributed as the order statistics of n i.i.d. random variables with the common uniform distribution on [0, t]. Huang et al. (1993) prove that, given a function G, under mild conditions, as long as

    E(G(δ_t) | A(t) = n) = E(G(X_1) | A(t) = n),  t ≥ 0,    (1)

holds for a single positive integer n, then A is a Poisson process. Li et al. (1994) and Huang and Su (1997) prove that for some fixed n and k, k ≤ n, if

    E(S_k^r | A(t) = n) = a t^r  and  E(S_k^s | A(t) = n) = b t^s,  t ≥ 0,    (2)

for certain pairs of r and s, where a and b are independent of t, then A has to be a Poisson process. Note that the fact that (1) and (2) characterize the Poisson process rests on the so-called order statistics property.

When A is a Poisson process, not only X_n but also γ_t is exponentially distributed, independently of t. This results from the memoryless property of the exponential distribution. Gupta and Gupta (1986) prove that, given a function G, under certain conditions, if E(G(γ_t)) = c, t ≥ 0, where c > 0 is a constant, then A is a Poisson process. If Var(γ_t) is a constant and F is assumed to be continuous, Huang and Li (1993) prove that F is the exponential distribution function. Obviously, when A is a Poisson process, it is true that Var(γ_t) = E^2(γ_t), t ≥ 0. Conversely, Huang and Chang (2000) prove that the equality Var(γ_t) = E^2(γ_t), t ≥ 0, implies that A is a Poisson process.

The Poisson process also has the superposition property: the superposition of independent Poisson processes is still a Poisson process. In the other direction, if A is a Poisson process, independent thinned processes can be obtained through thinning operations on A. Based on this Poisson characteristic, Chandramohan et al. (1985), Chandramohan and Liang (1985) and Huang (1989) prove that the existence of a pair of uncorrelated thinned processes implies that the renewal process is Poisson. As a by-product, Chandramohan et al. (1985) prove that for a stationary renewal process A_s, Var(A_s(t)) = E(A_s(t)), t ≥ 0, holds if and only if A_s is a Poisson process. A similar result for an ordinary renewal process is given by Chen (1994). Note that equality of mean and variance is also an interesting property enjoyed by the Poisson random variable.

Furthermore, we introduce some important results about thinning of point processes. Let N be a point process on R_+. We obtain a thinned point process N_p by retaining every point of the process with a certain constant probability p and deleting it with probability 1 - p, independently of all other points and independently of the point process. N_p is called the p-thinning of N, N is the p-inverse of N_p, and q = 1 - p is the thinning probability. The p-inverse of any thinned point process is unique in distribution (cf. Thedéen, 1986) and it will also be called the original process. Both the Cox (also called doubly stochastic Poisson) processes and the renewal processes are natural generalizations of Poisson processes. There is a basic characterization of Cox processes due to Mecke (1968). It says that a point process is a Cox process if and only if, for every p in (0, 1), there exists a point process which is the p-inverse of the process. Yannaros (1988b) gives some necessary and sufficient conditions for an ordinary or delayed gamma renewal process to be Cox. When F is gamma distributed, the renewal process is called a gamma renewal process. Yannaros proves that an ordinary gamma renewal process is a Cox process if and only if 0 < α ≤ 1, where α is the shape parameter of the gamma distribution. For definitions and properties of Cox processes see Grandell (1976) or Karr (1986). Yannaros (1989) gives the following result: if a renewal process is a Cox process, then it cannot be the thinning of a non-renewal process, and all the possible original processes are p-thinnings of other renewal processes for every thinning parameter p ∈ (0, 1). This property characterizes the processes which are both Cox and renewal. Yannaros (1988a) proves that a renewal process is not the p-thinning of any non-renewal process for any p < 1, and that the inverse of a thinned renewal process has to be renewal.

In this thesis, based on the above results, we present some characterizations of the Poisson process from thinning of the renewal process.
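As an added illustration of p-thinning (not part of the thesis; the rate λ, retention probability p and horizon are arbitrary choices), the following sketch thins a simulated Poisson process and compares the empirical mean gap of the thinned process with 1/(pλ), the value predicted by the thinning property of the Poisson process.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, p, t_max = 2.0, 0.3, 50_000.0        # rate, retention probability, horizon

# Poisson process on [0, t_max]: i.i.d. exponential(lam) inter-arrival times
gaps = rng.exponential(1.0 / lam, size=int(2 * lam * t_max))
arrivals = np.cumsum(gaps)
arrivals = arrivals[arrivals <= t_max]

# p-thinning: retain each point independently with probability p
thinned = arrivals[rng.random(arrivals.size) < p]

print("empirical mean gap of thinned process:", np.diff(thinned).mean())
print("1/(p*lambda) =", 1.0 / (p * lam))
```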

2 Preliminaries

Let M(t) = E(A(t)), and let supp(G) denote the support of a function G. The following result is proved by Huang and Chang (2000).

Theorem 1. Let E(X_1) < ∞, and let M(t) = E(A(t)) be continuous with inf supp(M) = 0. Also assume that M'_+(0) exists and that Var(A(t)) = bE(A(t)), t ≥ 0, where b is a constant. Then b = 1 and F is exponential.

The next theorem is for the stationary renewal process. It is also proved by Huang and Chang (2000). Note that it is implicitly assumed that the common inter-arrival time distribution function F is not periodic for a stationary renewal process.

Theorem 2. Let Var(A_s(t)) = bE(A_s(t)), t ≥ 0, where b is a constant. Then b = 1 and {A_s(t), t ≥ 0} is a Poisson process.

As mentioned in Section 1, Huang and Li (1993) prove the following theorem, which is a characterization of the Poisson process.

Theorem 3. Assume F is an absolutely continuous distribution function with density function F'(t), t ≥ 0. Then Var(γ(t)) is independent of t if and only if F is exponentially distributed.
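The following Monte Carlo sketch (added here as an illustration; the parameters are arbitrary) estimates E(A(t)) and Var(A(t)) for an exponential and a non-exponential life distribution with the same mean. In line with Theorems 1 and 2, the ratio Var(A(t))/E(A(t)) stays near 1 only in the exponential (Poisson) case.

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_A_t(t, draw_gap, n_paths, rng):
    """Monte Carlo sample of A(t) for a renewal process with inter-arrival sampler draw_gap."""
    out = np.empty(n_paths, dtype=int)
    for k in range(n_paths):
        s, n = 0.0, 0
        while True:
            s += draw_gap(rng)
            if s > t:
                break
            n += 1
        out[k] = n
    return out

t, n_paths = 8.0, 20_000
cases = [("exponential(1)", lambda r: r.exponential(1.0)),
         ("gamma(2, 0.5)", lambda r: r.gamma(2.0, 0.5))]   # both have mean 1
for name, draw_gap in cases:
    a = sample_A_t(t, draw_gap, n_paths, rng)
    print(f"{name}: E(A(t)) ~ {a.mean():.2f}, Var(A(t)) ~ {a.var():.2f}, ratio ~ {a.var() / a.mean():.2f}")
```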

The following lemma has been proved by Huang and Li (1993); it gives an expression for the covariance of γ_i(t) and γ_j(t), the residual lives at time t of the thinned processes A_i defined in Section 3 below.

Lemma 1. Assume Var(X_1) < ∞. For every i ≠ j and t ≥ 0,

    Cov(γ_i(t), γ_j(t)) = Var(γ(t)) - E^2(X_1) + ((1 - p_i - p_j)/(p_i + p_j)) (Var(X_1) - E^2(X_1)).    (3)

3 Multinomial Thinning

Now let {ξ_n, n ≥ 1} be a sequence of i.i.d. random variables with distribution given by P(ξ_n = i) = p_i, i = 1, ..., k, where 0 < p_i < 1 and Σ_{i=1}^{k} p_i = 1. Thus we obtain the k thinned processes {A_i(t), t ≥ 0}, i = 1, ..., k, by thinning the renewal process A through {ξ_n, n ≥ 1}, as follows:

    A_i(t) = Σ_{n=1}^{∞} I(S_n ≤ t) I(ξ_n = i).

In other words, the n-th point of A is assigned to the process A_i according to whether ξ_n = i, i = 1, ..., k. Obviously, E(A_i(t)) = p_i p_j^{-1} E(A_j(t)), i ≠ j, i.e. E(A_i(t)) is proportional to E(A_j(t)), t ≥ 0, i ≠ j, for every renewal process A. For the case k = 2, Huang and Chang (2000) stated and proved that if Var(A_1(t)) = c Var(A_2(t)), t ≥ 0, where c is a constant, then A is a Poisson process and c = p_1/(1 - p_1). Based on Theorem 1, we investigate characterizations of A as a Poisson process through the relationship between the variances or expectations of A_i and A_j, i ≠ j. We give the general case in the following.

Theorem 4. If, for some fixed i, j, i ≠ j,

    Var(A_i(t)) = c Var(A_j(t)),  t ≥ 0,    (4)

where c is a constant, then A is a Poisson process and c = p_i/p_j.
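Before the proof, here is a small numerical check of the Poisson case of Theorem 4 (added for illustration, not part of the thesis; k = 3 and the probabilities p_i are arbitrary choices). A Poisson process is thinned multinomially and the empirical variance ratios are compared with p_i/p_j.

```python
import numpy as np

rng = np.random.default_rng(3)
lam, t, n_paths = 1.0, 10.0, 20_000
p = np.array([0.2, 0.3, 0.5])                       # thinning probabilities, sum to 1

counts = np.zeros((n_paths, p.size), dtype=int)     # counts[path, i] = A_i(t)
for path in range(n_paths):
    s = 0.0
    while True:
        s += rng.exponential(1.0 / lam)             # Poisson process: exponential gaps
        if s > t:
            break
        counts[path, rng.choice(p.size, p=p)] += 1  # xi_n = i with probability p_i

v = counts.var(axis=0)
print("Var(A_1)/Var(A_2) ~", round(v[0] / v[1], 3), " p_1/p_2 =", p[0] / p[1])
print("Var(A_1)/Var(A_3) ~", round(v[0] / v[2], 3), " p_1/p_3 =", p[0] / p[2])
```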

Proof. The following arguments are very similar to the proof of Corollary 1 of Huang and Chang (2000). We know that the conditional distribution of A_i(t) given A(t) is Binomial(A(t), p_i), i = 1, ..., k. First it is straightforward to obtain

    Var(A_i(t)) = p_i(1 - p_i) E(A(t)) + p_i^2 Var(A(t)),
    Var(A_j(t)) = p_j(1 - p_j) E(A(t)) + p_j^2 Var(A(t)).

Substituting these into (4) yields

    Var(A(t)) = ((c p_j(1 - p_j) - p_i(1 - p_i)) / (p_i^2 - c p_j^2)) E(A(t)).

The assertions then follow from Theorem 1 immediately.

The following consequence is immediate, hence we omit the proof. As expected, there is a similar corollary based on Theorem 2 for the case of a stationary renewal process. We also omit the statement of this result.

Corollary 1. If, for some fixed i, j, i ≠ j, Var(A_i(t)) = c E(A_j(t)), t ≥ 0, where c is a constant, then A is a Poisson process and c = p_i/p_j.

As mentioned in Section 1, Chandramohan et al. (1985), Chandramohan and Liang (1985) and Huang (1989) investigate characterizations of A as a Poisson process from the uncorrelatedness of A_i and A_j, for some i ≠ j. Let γ_i(t) denote the residual life at time t of the renewal process A_i, i = 1, ..., k. From a different direction, Huang and Li (1993) prove that if {A(t), t ≥ 0} is a Poisson process, then for every i ≠ j, Cov(γ_i(t), γ_j(t)) = 0, t ≥ 0. Conversely, Huang and Li (1993) also have the result that if there exist i ≠ j such that Cov(γ_i(t), γ_j(t)) = 0, t ≥ 0, then {A(t), t ≥ 0} is a Poisson process. Based on Theorem 3 and Lemma 1 we are now ready to state and prove the next corollary.

Corollary 2. Fix i ≠ j and m ≠ n with (i, j) ≠ (m, n), and suppose Cov(γ_i(t), γ_j(t)) is not identically 0. Assume, for some constant b ≠ 0,

    Cov(γ_i(t), γ_j(t)) = b Cov(γ_m(t), γ_n(t)),  t ≥ 0.    (5)

Then (i) b ≠ 1 implies A is a Poisson process; (ii) b = 1 implies p_i + p_j = p_m + p_n.

Proof. Based on Lemma 1, we substitute (3) in (5) and obtain, for b ≠ 1,

    Var(γ(t)) = E^2(X_1) + ( (b/(1 - b)) (1 - p_m - p_n)/(p_m + p_n) - (1/(1 - b)) (1 - p_i - p_j)/(p_i + p_j) ) (Var(X_1) - E^2(X_1)).

As Var(γ(t)) equals a constant, Theorem 3 shows that A is a Poisson process. This proves part (i). Now assume b = 1 and substitute (3) into (5) again. After some simplification we obtain

    (1 - p_i - p_j)/(p_i + p_j) = (1 - p_m - p_n)/(p_m + p_n),

and part (ii) follows.

Now we want to see how Var(γ_i(t)) can be used to characterize the process. The following lemma gives an expression for the variance of γ_i(t).

Lemma 2. Assume Var(X_1) < ∞. Then for every i and t ≥ 0,

    Var(γ_i(t)) = Var(γ(t)) + ((1 - p_i)/p_i) Var(X_1) + ((1 - p_i)/p_i^2) E^2(X_1).    (6)
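Before turning to the proof, a Monte Carlo check of (6) can be run as follows (added here for illustration; the gamma life distribution, the time t and the probability p_1 are arbitrary choices). Both sides of (6) are estimated by simulating the residual lives γ(t) and γ_1(t).

```python
import numpy as np

rng = np.random.default_rng(4)
t, n_paths, p1 = 15.0, 40_000, 0.3
shape, scale = 2.0, 1.0                     # gamma interarrival times X_n
EX, VarX = shape * scale, shape * scale**2

gamma_t = np.empty(n_paths)                 # residual life of A at t
gamma_1 = np.empty(n_paths)                 # residual life of the thinned process A_1 at t
for k in range(n_paths):
    s, res, res1 = 0.0, None, None
    # labels of arrivals at or before t do not affect gamma_1(t), so they need not be drawn
    while res1 is None:
        s += rng.gamma(shape, scale)
        if s <= t:
            continue
        if res is None:
            res = s - t                     # first arrival after t gives gamma(t)
        if rng.random() < p1:               # label xi_n = 1 with probability p_1
            res1 = s - t                    # first label-1 arrival after t gives gamma_1(t)
    gamma_t[k], gamma_1[k] = res, res1

lhs = gamma_1.var()
rhs = gamma_t.var() + (1 - p1) / p1 * VarX + (1 - p1) / p1**2 * EX**2
print("Var(gamma_1(t)) ~", round(lhs, 2), "  right-hand side of (6) ~", round(rhs, 2))
```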

Proof. The following arguments are similar to those of Lemma 1. First, for convenience we adopt the conventions

    Π_{r=k}^{h} I(ξ_r ≠ i) = 1  and  Σ_{m=k}^{h} X_m = 0  if k > h.

It is easy to see that

    γ_i(t) = Σ_{h=A(t)+1}^{∞} (γ(t) + S_h - S_{A(t)+1}) I(ξ_h = i) Π_{r=A(t)+1}^{h-1} I(ξ_r ≠ i).    (7)

From (7) we obtain

    E[γ_i(t)] = E[γ(t)] + ((1 - p_i)/p_i) E[X_1].    (8)

We also obtain

    γ_i^2(t) = ( Σ_{h=A(t)+1}^{∞} (γ(t) + S_h - S_{A(t)+1}) I(ξ_h = i) Π_{r=A(t)+1}^{h-1} I(ξ_r ≠ i) )^2
             = Σ_{h=A(t)+1}^{∞} (γ(t) + S_h - S_{A(t)+1})^2 I(ξ_h = i) Π_{r=A(t)+1}^{h-1} I(ξ_r ≠ i)
               + 2 Σ_{h=A(t)+1}^{∞} Σ_{l=h+1}^{∞} (γ(t) + S_h - S_{A(t)+1})(γ(t) + S_l - S_{A(t)+1}) I(ξ_h = i) I(ξ_l = i) Π_{r=A(t)+1}^{h-1} I(ξ_r ≠ i) Π_{s=A(t)+1}^{l-1} I(ξ_s ≠ i)
             = Σ_{h=A(t)+1}^{∞} ( γ^2(t) + 2γ(t) Σ_{m=A(t)+2}^{h} X_m + Σ_{m=A(t)+2}^{h} X_m^2 + 2 Σ_{m=A(t)+2}^{h-1} Σ_{n=m+1}^{h} X_m X_n ) I(ξ_h = i) Π_{r=A(t)+1}^{h-1} I(ξ_r ≠ i),

where the cross terms vanish because, for l > h, the product Π_{s=A(t)+1}^{l-1} I(ξ_s ≠ i) contains the factor I(ξ_h ≠ i), which is 0 whenever I(ξ_h = i) = 1. Here we have also used the facts that

    S_h - S_{A(t)+1} = Σ_{m=A(t)+2}^{h} X_m  and  ( Σ_{m=A(t)+2}^{h} X_m )^2 = Σ_{m=A(t)+2}^{h} X_m^2 + 2 Σ_{m=A(t)+2}^{h-1} Σ_{n=m+1}^{h} X_m X_n.

Taking the expectation of γ_i^2(t), we obtain

    E[γ_i^2(t)] = Σ_{h=A(t)+1}^{∞} p_i (1 - p_i)^{h-1-A(t)} ( E[γ^2(t)] + 2E[γ(t)] E[ Σ_{m=A(t)+2}^{h} X_m ] + E[ Σ_{m=A(t)+2}^{h} X_m^2 ] + 2E[ Σ_{m=A(t)+2}^{h-1} Σ_{n=m+1}^{h} X_m X_n ] )
                = Σ_{h=A(t)+1}^{∞} p_i (1 - p_i)^{h-1-A(t)} ( E[γ^2(t)] + 2(h - A(t) - 1) E[γ(t)] E[X_1] + (h - A(t) - 1) E[X_1^2] + (h - A(t) - 2)(h - A(t) - 1) E^2[X_1] )
                = Σ_{h=0}^{∞} p_i (1 - p_i)^h ( E[γ^2(t)] + 2h E[γ(t)] E[X_1] + h E[X_1^2] + h(h - 1) E^2[X_1] )
                = Σ_{h=0}^{∞} p_i (1 - p_i)^h ( E[γ^2(t)] + 2h E[γ(t)] E[X_1] + h Var(X_1) + h^2 E^2[X_1] )
                = p_i E[γ^2(t)] Σ_{h=0}^{∞} (1 - p_i)^h + 2 p_i E[γ(t)] E[X_1] Σ_{h=0}^{∞} h (1 - p_i)^h + p_i Var(X_1) Σ_{h=0}^{∞} h (1 - p_i)^h + p_i E^2[X_1] Σ_{h=0}^{∞} h^2 (1 - p_i)^h
                = E[γ^2(t)] + (2(1 - p_i)/p_i) E[γ(t)] E[X_1] + ((1 - p_i)/p_i) Var(X_1) + ((2 - p_i)(1 - p_i)/p_i^2) E^2[X_1].    (9)

Finally, it follows from (8) and (9) that

    Var(γ_i(t)) = E[γ_i^2(t)] - E^2[γ_i(t)] = Var(γ(t)) + ((1 - p_i)/p_i) Var(X_1) + ((1 - p_i)/p_i^2) E^2(X_1),

as desired.

Having given Theorem 3 and Lemma 2, we are now ready to state and prove the next corollary.

Corollary 3. Fix i ≠ j. Assume, for some constant b > 0,

    Var(γ_i(t)) = b Var(γ_j(t)),  t ≥ 0.    (10)

Then (i) b ≠ 1 implies A is a Poisson process; (ii) b = 1 implies p_i = p_j.

Proof. By using (6), (10) becomes, for b ≠ 1,

    Var(γ(t)) = ( (b/(1 - b)) (1 - p_j)/p_j - (1/(1 - b)) (1 - p_i)/p_i ) Var(X_1) + ( (b/(1 - b)) (1 - p_j)/p_j^2 - (1/(1 - b)) (1 - p_i)/p_i^2 ) E^2(X_1).

Again Theorem 3 implies part (i). Now assume b = 1. Substituting (6) in (10) and simplifying, we obtain

    ((p_i - p_j)/(p_i p_j)) Var(X_1) + ((p_i - p_j)(p_i + p_j - p_i p_j)/(p_i^2 p_j^2)) E^2(X_1) = 0.

By using the fact that Var(X_1) > 0 and E^2(X_1) > 0, p_i = p_j is the only possibility. This proves part (ii).

Note that when A is a Poisson process,

    Var(γ_i(t)) = p_j^2 p_i^{-2} Var(γ_j(t)),  E(γ_i(t)) = p_j p_i^{-1} E(γ_j(t))  and  Var(γ_i(t)) = p_j^2 p_i^{-2} E^2(γ_j(t)),  i ≠ j,

hold. Next we give another consequence. Again the proof is omitted.

Corollary 4. Fix i. Assume, for some constant b > 0, Var(γ_i(t)) = b Var(γ(t)), t ≥ 0. Then (i) b ≠ 1 implies A is a Poisson process; (ii) b = 1 implies p_i = 1.

4 Generalization of Multinomial Thinning

Consider a renewal process A ≡ {A(t), t ≥ 0} defined on a probability space (Ω, F, P), where it is assumed that (i) the state space consists of the non-negative integers, (ii) A(0) = 0 and A(t) < ∞ a.s., and (iii) A is a separable point process with non-decreasing right-continuous sample paths with unit step jumps.

Let {T_n, n ≥ 1} be the sequence of arrival times of A. Assume there are X_n arrivals at time T_n, where {X_n, n ≥ 1} are i.i.d. random variables with the same distribution as some random variable, say X, which takes only positive integer values. For each arrival, independently of everything else, split the mass of this unit atom into k new atoms according to β, where β = (β_1, β_2, ..., β_k) is a random vector in R^k such that P(β_1 + β_2 + ... + β_k = 1) = 1, so that for i = 1, 2, ..., k, the ith new atom has (possibly negative) mass β_i and is assigned to the process D_i ≡ {D_i(t), t ≥ 0}. Thus D_i(t) can be represented as

    D_i(t) = Σ_{n=1}^{A(t)} Σ_{m=1}^{X_n} β_{imn},    (11)

where {β_mn}, with β_mn = (β_{1mn}, β_{2mn}, ..., β_{kmn}), is a two-dimensional array of i.i.d. random vectors with the same distribution as β. Again let E_k denote the set {e_1, ..., e_k}, where e_i is the element of R^k with a 1 in the ith position and zeros elsewhere. The multinomial thinning of Section 3 corresponds to the case β ∈ E_k. Let E(X) = μ, E(β_i) = p_i and E(β_i^2) = γ_ii, i = 1, ..., k. Assume p_i < ∞, γ_ii < ∞ and E(X^2) < ∞.
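To make the representation (11) concrete, the following sketch (an added illustration; the exponential interarrival law, the geometric batch size and the uniform split β = (U, 1 - U) are arbitrary choices) simulates D_1(t) and D_2(t), and compares the empirical mean of D_1(t) with p_1 μ E(A(t)) and its variance with the formula given in Lemma 3 below, in the Poisson case.

```python
import numpy as np

rng = np.random.default_rng(5)
lam, t, n_paths = 1.0, 10.0, 20_000

def sample_D(rng):
    """One realisation of (D_1(t), D_2(t)): Poisson arrival times T_n, geometric batch
    sizes X_n, and an independent uniform split beta = (U, 1 - U) for every unit atom."""
    d, s = np.zeros(2), 0.0
    while True:
        s += rng.exponential(1.0 / lam)     # next arrival time T_n
        if s > t:
            return d
        x = rng.geometric(0.5)              # X_n arrivals at T_n (values 1, 2, ...)
        u = rng.random(x)                   # beta_{1mn} = U for each of the X_n atoms
        d[0] += u.sum()
        d[1] += (1.0 - u).sum()

D = np.array([sample_D(rng) for _ in range(n_paths)])

mu, varX, g11, p1 = 2.0, 2.0, 1.0 / 3.0, 0.5   # E(X), Var(X), E(beta_1^2), E(beta_1)
EA = lam * t                                    # for a Poisson process E(A(t)) = Var(A(t))
print("E(D_1(t)) ~", round(D[:, 0].mean(), 2), "  p_1*mu*E(A(t)) =", p1 * mu * EA)
pred_var = p1**2 * mu**2 * EA + (p1**2 * varX - p1**2 * mu + g11 * mu) * EA
print("Var(D_1(t)) ~", round(D[:, 0].var(), 2), "  Lemma 3 formula =", round(pred_var, 2))
```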

Huang (1989) gives an expression for the pointwise covariance of D_i and D_j. For completeness we supply a proof of the corresponding variance formula here.

Lemma 3.

    Var(D_i(t)) = γ_ii μ E[A(t)] + p_i^2 E[X(X - 1)] E[A(t)] + p_i^2 μ^2 E[A(t)(A(t) - 1)] - (p_i μ E[A(t)])^2    (12)
                = p_i^2 μ^2 Var(A(t)) + [p_i^2 Var(X) - p_i^2 μ + γ_ii μ] E[A(t)].    (13)

Proof. First it is easy to obtain

    E[D_i(t)] = p_i μ E[A(t)].    (14)

Now

    E[D_i^2(t)] = E[ ( Σ_{n=1}^{A(t)} Σ_{m=1}^{X_n} β_{imn} )^2 ]
                = E[ Σ_{n=1}^{A(t)} Σ_{m=1}^{X_n} β_{imn}^2 + Σ_{n=1}^{A(t)} Σ_{m ≠ r} β_{imn} β_{irn} + Σ_{n ≠ s} ( Σ_{m=1}^{X_n} β_{imn} ) ( Σ_{r=1}^{X_s} β_{irs} ) ]
                = μ γ_ii E[A(t)] + p_i^2 E[X(X - 1)] E[A(t)] + p_i^2 μ^2 E[A(t)(A(t) - 1)],    (15)

where we have used the facts that E[β_{imn}^2] = E[β_i^2] = γ_ii, E[β_{imn} β_{irs}] = E[β_i] E[β_i] = p_i^2 for (m, n) ≠ (r, s), and, for n ≠ s, E[X_n X_s] = (E[X])^2 = μ^2. Finally, the assertion follows easily from (14) and (15).

Obviously, E(D_i(t)) = p_i p_j^{-1} E(D_j(t)) for 1 ≤ i, j ≤ k, i ≠ j, i.e. E(D_i(t)) is proportional to E(D_j(t)), t ≥ 0, for every renewal process A. In contrast, if Var(D_i(t)) is proportional to Var(D_j(t)), t ≥ 0, then A must be a Poisson process. We state and prove this consequence of Theorem 1 in the following.

Corollary 5. If, for some fixed i, j, i ≠ j,

    Var(D_i(t)) = c Var(D_j(t)),  t ≥ 0,    (16)

where c is a constant, then A is a Poisson process and

    c = (γ_ii μ + p_i^2 (μ^2 - μ + Var(X))) / (γ_jj μ + p_j^2 (μ^2 - μ + Var(X))).

Proof. From (12), it is easy to get

    Var(D_i(t)) = p_i^2 μ^2 Var(A(t)) + [p_i^2 Var(X) - p_i^2 μ + γ_ii μ] E(A(t)),
    Var(D_j(t)) = p_j^2 μ^2 Var(A(t)) + [p_j^2 Var(X) - p_j^2 μ + γ_jj μ] E(A(t)).

Substituting these into (16) yields

    Var(A(t)) = ( (c(p_j^2 Var(X) - p_j^2 μ + γ_jj μ) - (p_i^2 Var(X) - p_i^2 μ + γ_ii μ)) / (p_i^2 μ^2 - c p_j^2 μ^2) ) E(A(t)).

The assertions now follow from Theorem 1 immediately.

By using Theorem 1, it can also be shown that if Var(D_i(t)) is proportional to E(D_j(t)), t ≥ 0, then A is a Poisson process.

Corollary 6. If, for some fixed i, j, i ≠ j, Var(D_i(t)) = c E(D_j(t)), t ≥ 0, where c is a constant, then A is a Poisson process and

    c = (p_i^2 μ^2 - p_i^2 μ + γ_ii μ + p_i^2 Var(X)) / (p_j μ).
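For the reader's convenience, here is a worked step (added here, following directly from (13), (14) and Theorem 1) showing where the constants in Corollaries 5 and 6 come from: once Theorem 1 forces Var(A(t)) = E(A(t)), they can be read off from (13) and (14).

```latex
% Once Theorem 1 forces Var(A(t)) = E(A(t)), (13) collapses to
\[
  \operatorname{Var}(D_i(t))
    = \bigl[\gamma_{ii}\mu + p_i^{2}\,(\mu^{2}-\mu+\operatorname{Var}(X))\bigr]\,E(A(t)),
\]
% while (14) gives E(D_j(t)) = p_j \mu\, E(A(t)).  Hence
\[
  \frac{\operatorname{Var}(D_i(t))}{\operatorname{Var}(D_j(t))}
    = \frac{\gamma_{ii}\mu + p_i^{2}(\mu^{2}-\mu+\operatorname{Var}(X))}
           {\gamma_{jj}\mu + p_j^{2}(\mu^{2}-\mu+\operatorname{Var}(X))},
  \qquad
  \frac{\operatorname{Var}(D_i(t))}{E(D_j(t))}
    = \frac{\gamma_{ii}\mu + p_i^{2}(\mu^{2}-\mu+\operatorname{Var}(X))}{p_j\,\mu},
\]
% which are exactly the constants appearing in Corollaries 5 and 6.
```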

In the following we consider the case of a stationary renewal process. Again we assume that the common interarrival time distribution function F is not periodic for a stationary renewal process. We use

    D_si(t) = Σ_{n=1}^{A_s(t)} Σ_{m=1}^{X_n} β_{imn}

instead of (11) for the stationary renewal process. We can obtain the following result. Let M_s(t) = E(A_s(t)) and let M^{(k)} denote the k-fold convolution of M with itself.

Theorem 5. If, for some fixed i, j, i ≠ j,

    Var(D_si(t)) = c Var(D_sj(t)),  t ≥ 0,    (17)

where c is a constant, then A_s is a Poisson process and

    c = (γ_ii μ + p_i^2 E[X(X - 1)]) / (γ_jj μ + p_j^2 E[X(X - 1)]).

Proof. From Lemma 3, it is easy to obtain

    Var(D_si(t)) = μ γ_ii E[A_s(t)] + p_i^2 E[X(X - 1)] E[A_s(t)] + p_i^2 μ^2 E[A_s(t)(A_s(t) - 1)] - (p_i μ E[A_s(t)])^2.

Again, using for every positive integer k the identity

    E(A_s(t)(A_s(t) - 1) ⋯ (A_s(t) - k + 1)) = k! (M_s ∗ M^{(k-1)})(t),  t ≥ 0,

(see Lemma 2.1 of Chandramohan et al. (1985)), we have

    E(A_s(t)(A_s(t) - 1)) = (2/η_1) ∫_0^t M(t - x) dx = (2/η_1) ∫_0^t M(x) dx,

where η_1 = E(X_2). We also know that E(A_s(t)) = t/η_1. Substituting these into (17) yields

    μ γ_ii (t/η_1) + p_i^2 E[X(X - 1)] (t/η_1) + p_i^2 μ^2 (2/η_1) ∫_0^t M(x) dx - (p_i μ t/η_1)^2
    = c μ γ_jj (t/η_1) + c p_j^2 E[X(X - 1)] (t/η_1) + c p_j^2 μ^2 (2/η_1) ∫_0^t M(x) dx - c (p_j μ t/η_1)^2.

Taking derivatives of both sides with respect to t and simplifying, we obtain

    M(t) = ( c(γ_jj μ + p_j^2 E[X(X - 1)]) - (γ_ii μ + p_i^2 E[X(X - 1)]) ) / ( 2 μ^2 (p_i^2 - c p_j^2) ) + t/η_1.

Finally, letting t = 0, we obtain

    c = (γ_ii μ + p_i^2 E[X(X - 1)]) / (γ_jj μ + p_j^2 E[X(X - 1)]),

and M(t) = t/η_1 follows. Consequently F is exponential, as required.

Note that the above theorem can also be proved by using Theorem 2. For the stationary case, E(D_si(t)) = p_i p_j^{-1} E(D_sj(t)) still holds as in the ordinary case; a similar remark applies to Theorem 5 and the following Corollary 7.

Corollary 7. Let, for some fixed i, j, i ≠ j, Var(D_si(t)) = c E(D_sj(t)), t ≥ 0, where c is a constant. Then {A_s(t), t ≥ 0} is a Poisson process and

    c = (γ_ii μ + p_i^2 E[X(X - 1)]) / (p_j μ).

References

1. Chandramohan J, Foley RD, Disney RL (1985) Thinning of point processes-covariance analysis. Adv. Appl. Prob. 17.
2. Chandramohan J, Liang LK (1985) Bernoulli, multinomial and Markov chain thinning of some point processes and some results about the superposition of dependent renewal processes. J. Appl. Prob. 22.
3. Chen YH (1994) Classes of life distributions and renewal counting processes. J. Appl. Prob. 23.
4. Grandell J (1976) Doubly Stochastic Poisson Processes. Lect. Notes Math. 529, Springer-Verlag, New York.
5. Gupta PL, Gupta RC (1986) A characterization of the Poisson process. J. Appl. Prob. 23.
6. Huang WJ (1989) On some characterizations of the Poisson process. J. Appl. Prob. 26.
7. Huang WJ, Li SH (1993) Characterizations of the Poisson process using the variance. Commun. Statist.-Theory Meth. 22.
8. Huang WJ, Li SH, Su JC (1993) Some characterizations of the Poisson process and geometric renewal process. J. Appl. Prob. 30.
9. Huang WJ, Su JC (1997) On a study of renewal process connected with certain conditional moments. Sankhyā Ser. A 59.
10. Huang WJ, Chang WC (2000) On a study of the exponential and Poisson characteristics of the Poisson process. Metrika 50.

11. Karr AF (1986) Point Processes and Their Statistical Inference. Marcel Dekker, New York.
12. Li SH, Huang WJ, Huang MNL (1994) Characterizations of the Poisson process as a renewal process via two conditional moments. Ann. Inst. Stat. Math. 46.
13. Mecke J (1968) Eine charakteristische Eigenschaft der doppelt stochastischen Poissonschen Prozesse. Z. Wahrschein. und Verw. Geb. 11.
14. Thedéen T (1986) The inverses of thinned point processes. Dept. of Statistics, University of Stockholm, Research Report 1986.
15. Yannaros N (1988a) The inverses of thinned renewal processes. J. Appl. Prob. 25.
16. Yannaros N (1988b) On Cox processes and gamma renewal processes. J. Appl. Prob. 25.
17. Yannaros N (1989) On Cox and renewal processes. Stat. Prob. Lett. 7.
