Stat667 Random Processes: Poisson Processes

This is to supplement the textbook, not to replace it.

Contents
1 Continuous-Time Chains
2 Markovian Streams
3 Regular Streams
4 Poisson Streams
5 Poisson Processes
6 Random Streams
7 Superposition of Poisson Streams
8 Decomposition of a Stream
9 Compound Poisson Processes
10 Spatial Poisson Processes
11 Examples
1 Continuous-Time Chains

Continuous-time chains can be studied partially by looking at the embedded discrete-time chains. This method, while it greatly simplifies the study and provides certain insight into the understanding of the chain, suffers a loss of information about the behavior of the chain in the time domain. To fully understand the chain, the times at which transitions take place must be considered.

2 Markovian Streams

DEFINITION 2.1 (STREAM (OF EVENTS)) A sequence of similar events occurring in time is called a stream (of events).

(Figure: events E_0, E_1, E_2, E_3 occurring along the time axis t at times 0 = T_0 < T_1 < T_2 < T_3.)

DEFINITION 2.2 (MARKOVIAN STREAM) A stream of events whose development into the future is independent of its past is called a Markovian stream. That is, if a stream is memoryless, it is Markovian.

REMARK 2.3 (NONOVERLAPPING INTERVALS) The developments within any two nonoverlapping time intervals of a Markovian stream are independent.

3 Regular Streams

DEFINITION 3.1 (EXPLOSION) A stream is said to have an explosion at a particular time T < ∞ if, approaching T, events occur more and more intensely and, by time T, an infinite number of occurrences have already taken place.

EXAMPLE 3.2 (A BOUNCING BALL) For a ball bouncing on the floor, let the state of the system be the number of bounces it has made. A physically reasonable assumption is that the time (in seconds) between the nth bounce and the (n+1)th bounce is 2^(−n). Then x_n = n and the time at which the nth bounce takes place is

T_n = Σ_{i=0}^{n−1} 2^(−i) = 2 − 2^(1−n).

Clearly T_n < 2 and T_n → 2 = T as n → ∞. The process hence explodes at time T = 2.

DEFINITION 3.3 (REGULAR STREAM) A stream having no explosion is called a regular stream.
THEOREM 3.4 (NECESSARY AND SUFFICIENT CONDITION OF REGULAR STREAM) A stream is regular if and only if, within an infinitesimal interval (t, t+h], there can be at most one occurrence of the event. That is, a stream is regular if and only if

P(two or more occurrences within (t, t+h]) = o(h).

Hence, denoting

m_i(t, h) = P(exactly i occurrences within (t, t+h]),

we have

m_0(t, h) = 1 − λ(t)h + o(h)
m_1(t, h) = λ(t)h + o(h)
m_i(t, h) = o(h), i > 1.

4 Poisson Streams

DEFINITION 4.1 (POISSON STREAM) A stream that is Markovian and regular is called a Poisson stream. The expected number of occurrences within (t, t+h] is

0·[1 − λ(t)h + o(h)] + 1·[λ(t)h + o(h)] + Σ_{i=2}^∞ i·o(h) = λ(t)h + o(h).

Hence,

λ(t) = lim_{h→0} E{number of occurrences within (t, t+h]} / h.

DEFINITION 4.2 (OCCURRENCE RATE) For a Poisson stream, λ(t), the expected number of occurrences per unit time at time t, is called the occurrence rate (or instantaneous rate (of occurrence), or intensity) of the chain.

DEFINITION 4.3 (AVERAGE OCCURRENCE RATE) For a Poisson stream, the average occurrence rate during (t, t+τ] is

λ(t, τ) = E{number of occurrences within (t, t+τ]} / τ.

DEFINITION 4.4 (TIME-HOMOGENEOUS POISSON STREAMS) A Poisson stream is called time-homogeneous if λ(t) = λ for all t. Consequently, λ = λ(t) = λ(t, τ) for all t > 0 and τ > 0. Unless otherwise mentioned, time-homogeneity is assumed hereafter.

DEFINITION 4.5 (REMAINING LIVES) The remaining life (or residual life) of a (time-homogeneous) Poisson stream, denoted r(t), is the time (from t) to the next occurrence.
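The definitions above can be exercised with a short simulation. The Python sketch below (the function name poisson_stream and the rate λ = 2 are our choices, not the notes') generates a time-homogeneous stream by summing i.i.d. exponential lives, then checks that the average occurrence rate λ(t, τ) over a long horizon approximates the constant rate λ:

```python
import random

def poisson_stream(rate, horizon, rng):
    """Occurrence times of a time-homogeneous Poisson stream on (0, horizon]."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)  # lives are i.i.d. Exponential(rate)
        if t > horizon:
            return times
        times.append(t)

rng = random.Random(42)
lam, horizon = 2.0, 10_000.0
times = poisson_stream(lam, horizon, rng)

# Average occurrence rate over (0, horizon] should be close to lam = 2.0.
rate_hat = len(times) / horizon
```

The same generator is reused, in spirit, by the later sections: superposition merges such streams and decomposition thins one.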
THEOREM 4.6 (REMAINING LIFE IS EXPONENTIALLY DISTRIBUTED) The remaining life r(t) is distributed as E(λ).

The above theorem holds because

P(r(t) > s) = P(zero occurrences within (t, t+s])
= lim_{k→∞} P( ∩_{j=1}^k (zero occurrence within (t + (j−1)s/k, t + js/k]) )
= lim_{k→∞} Π_{j=1}^k P(zero occurrence within (t + (j−1)s/k, t + js/k])
= lim_{k→∞} (1 − λs/k + o(1/k))^k = e^(−λs).

DEFINITION 4.7 (LIVES OF A STREAM) The duration l_i between two consecutive occurrences E_{i−1} and E_i in a stream is called the ith life. That is,

l_i = T_i − T_{i−1} = ith life = interoccurrence time.

A graphical view of the lives of a Poisson stream is given below:

(Figure: lives l_1, l_2, l_3, l_4 between events E_0, E_1, E_2, E_3, E_4 occurring at times 0 = T_0 < T_1 < T_2 < T_3 < T_4.)

THEOREM 4.8 (LIVES AND REMAINING LIVES OF POISSON STREAMS) The following are equivalent:
a. the stream is a (time-homogeneous) Poisson stream;
b. the remaining life is exponentially distributed;
c. the lives are exponentially distributed.

Note that the equivalence of (a) and (b) is already established above. Also, (b) implies (c) by observing l_n = r(T_{n−1}). It remains to show that (c) implies (b). Observing the following figure

(Figure: T_{n−1} = t − θ < t < t + s < T_{n+1} on the time axis, with the life l_n starting at T_{n−1} and the remaining life r(t) starting at t.)

we have then

P(r(t) < s | l_n > θ) = P(r(t) < s and l_n > θ) / P(l_n > θ)
= P(θ < l_n < θ + s) / P(l_n > θ)
= (e^(−λθ) − e^(−λ(θ+s))) / e^(−λθ)
= 1 − e^(−λs).
DEFINITION 4.9 (SHORTER REMAINING LIFE) Consider two completely independent Poisson streams: S_X having rate λ_X and remaining life r_X(t), and S_Y having rate λ_Y and remaining life r_Y(t). Define R_2(t) = min(r_X(t), r_Y(t)) as the shorter remaining life of the two streams. If the event (R_2(t) = r_X(t)) = (r_X(t) < r_Y(t)) occurs, then stream S_X is said to have a shorter remaining life than stream S_Y does.

THEOREM 4.10 (DISTRIBUTION OF SHORTER REMAINING LIFE) The distribution of R_2(t) follows E(λ_X + λ_Y), and

P(r_X(t) < r_Y(t)) = λ_X / (λ_X + λ_Y).

To verify the above theorem, first note that

P(R_2(t) > s) = P(r_X(t) > s and r_Y(t) > s) = P(r_X(t) > s) P(r_Y(t) > s) = e^(−λ_X s) e^(−λ_Y s) = e^(−(λ_X + λ_Y)s).

So R_2(t) is exponentially distributed with rate λ_X + λ_Y. Now, consider an infinitesimal duration (t, t+h]. Denote by O_{ij} the event that exactly i occurrences in stream S_X and exactly j occurrences in stream S_Y take place within (t, t+h]. Employing a continuous version of the first-step analysis by conditioning on the occurrences of the two streams within (t, t+h]:

P(r_X(t) < r_Y(t)) = Σ_{i=0}^∞ Σ_{j=0}^∞ P(r_X(t) < r_Y(t) | O_{ij}) P(O_{ij})
= 1·(λ_X h (1 − λ_Y h) + o(h)) + 0·((1 − λ_X h) λ_Y h + o(h))
  + P(r_X(t) < r_Y(t)) ((1 − λ_X h)(1 − λ_Y h) + o(h)) + Σ_{i>1 or j>1} P(r_X(t) < r_Y(t) | O_{ij}) P(O_{ij})
= λ_X h + (1 − λ_X h − λ_Y h) P(r_X(t) < r_Y(t)) + o(h).

Hence,

P(r_X(t) < r_Y(t)) = λ_X / (λ_X + λ_Y) + o(h) / ((λ_X + λ_Y)h) → λ_X / (λ_X + λ_Y) as h → 0.

REMARK 4.11 (SHORTEST REMAINING LIFE) Consider k independent Poisson streams S_i with rates λ_i and remaining lives r_i(t), i = 1, ..., k. Let R_k(t) be the shortest remaining life of these streams from time t. That is,

R_k(t) = min_{1≤i≤k} r_i(t).
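Theorem 4.10 is easy to check by Monte Carlo. The sketch below (rates λ_X = 1, λ_Y = 3 chosen arbitrarily) samples the two remaining lives directly as independent exponentials and estimates both the winning probability and the mean of the minimum:

```python
import random

rng = random.Random(7)
lam_x, lam_y = 1.0, 3.0
n = 100_000

shorter_x = 0
sum_min = 0.0
for _ in range(n):
    rx = rng.expovariate(lam_x)  # remaining life of stream S_X
    ry = rng.expovariate(lam_y)  # remaining life of stream S_Y
    shorter_x += rx < ry
    sum_min += min(rx, ry)

p_hat = shorter_x / n      # theory: lam_x / (lam_x + lam_y) = 0.25
mean_min = sum_min / n     # theory: 1 / (lam_x + lam_y) = 0.25
```

Both estimates should land near 0.25, matching E(λ_X + λ_Y) for the minimum and λ_X/(λ_X + λ_Y) for the race.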
Then R_k(t) follows E(λ), where

λ = λ_1 + ... + λ_k.

In addition,

P(r_j(t) = R_k(t)) = λ_j / λ.

5 Poisson Processes

DEFINITION 5.1 (COUNTING PROCESSES) A counting process {m(t) : t ≥ 0} is a continuous-time chain which counts the number of occurrences of the event in a stream within (0, t]. Define m(0) = 0. The following is observed:

REMARK 5.2 (PROPERTIES OF COUNTING PROCESSES) For a counting process: the state space is {0, 1, 2, ...}; it is nondecreasing, i.e., m(t) ≤ m(s) if t < s; it increases by one whenever an event occurs.

DEFINITION 5.3 (POISSON PROCESSES) A counting process is called a (time-homogeneous) Poisson process if m(t) is Poisson distributed with parameter λt for every t > 0. The parameter λ is called the rate of the Poisson process.

THEOREM 5.4 (PROPERTIES OF POISSON PROCESSES) A Poisson process has
INDEPENDENT INCREMENTS: the counting process increases independently over nonoverlapping intervals (Markovian);
STATIONARY INCREMENTS: time-homogeneous;
POISSON SEGMENTS: m(t+τ) − m(t) ~ P(λτ) for all τ > 0 and t > 0; i.e., the number of occurrences of events within any time segment is Poisson distributed with mean = occurrence rate λ × (length of the segment).

REMARK 5.5 (POISSON PROCESS STREAM IS REGULAR) The stream associated with a Poisson process is regular and has exponentially distributed lives, since the tail probability of the remaining life is

P(r(t) > τ) = P(no occurrence during (t, t+τ]) = e^(−λτ).
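A signature of Definition 5.3 is that m(t) has equal mean and variance, both λt. The sketch below (parameters λ = 1.5, t = 4 are arbitrary) builds m(t) from exponential lives many times and compares the sample mean and variance of the counts against λt = 6:

```python
import random

rng = random.Random(1)
lam, t, reps = 1.5, 4.0, 50_000

counts = []
for _ in range(reps):
    # Count occurrences in (0, t] by summing i.i.d. exponential lives.
    s, n = rng.expovariate(lam), 0
    while s <= t:
        n += 1
        s += rng.expovariate(lam)
    counts.append(n)

mean = sum(counts) / reps
var = sum((c - mean) ** 2 for c in counts) / reps
# For a Poisson process, m(t) ~ P(lam * t): mean and variance both near 6.
```

Agreement of the two moments is a quick (though not conclusive) diagnostic, in the spirit of Remark 5.6's caution about verifying Poisson assumptions.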
REMARK 5.6 (VERIFICATION OF POISSON STREAMS) First pick a convenient time interval, then test the (null) hypothesis that the number of occurrences in this interval is Poisson. Caution should be exercised in interpreting the outcome of the hypothesis test: if the null hypothesis is not rejected, this only shows that the data do not violate the Poisson stream assumption.

DEFINITION 5.7 (ERLANG DISTRIBUTIONS) From the definition of lives of a Poisson stream,

T_k = time to kth occurrence = l_1 + l_2 + ... + l_k

has an Erlang distribution with parameters k and λ.

REMARK 5.8 (ERLANG DISTRIBUTION IS A GAMMA DISTRIBUTION) The Erlang distribution is a special case of the more general family of distributions, namely, the gamma distribution with parameters α = k and λ. Hence the p.d.f., the mean, and the variance of an Erlang distribution with parameters k and λ are

f(t) = λ^k t^(k−1) e^(−λt) / Γ(k) = λ^k t^(k−1) e^(−λt) / (k−1)!, t > 0,
E(T_k) = k/λ,
Var(T_k) = k/λ².

6 Random Streams

Poisson streams are also referred to as random streams.

THEOREM 6.1 (CONDITIONAL UNIFORM OCCURRENCES) For a Poisson process m(t), given that m(t) = n (i.e., given that there are n occurrences within (0, t]), the n (ordered) occurrence times T_1, T_2, ..., T_n have the same joint distribution as the order statistics corresponding to n independent uniform (U(0, t)) random variables over the interval (0, t). That is,

f_{T_1,...,T_n | m(t)}(t_1, ..., t_n | n) = n! / t^n, 0 < t_1 < t_2 < ... < t_n < t.

(Figure: occurrence times T_1 < T_2 < T_3 < ... < T_n within the interval (0, t].)

REMARK 6.2 (UNORDERED UNIFORM) Intuitively, we usually say that, under the condition that n occurrences of events have taken place within (0, t], the times at which events occur, considered as unordered random variables, form a sample of size n from U(0, t].
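Theorem 6.1 can be illustrated numerically. Below (a sketch; λ = 2 and t = 5 are arbitrary), event times of many simulated Poisson paths on (0, t] are pooled; if the conditional-uniform property holds, the pooled times should look like draws from U(0, t], with mean t/2 = 2.5 and variance t²/12 ≈ 2.083:

```python
import random

rng = random.Random(3)
lam, t = 2.0, 5.0

# Pool event times over many replications; by the conditional uniform
# theorem they should be distributed as U(0, t].
pooled = []
for _ in range(20_000):
    s = rng.expovariate(lam)
    while s <= t:
        pooled.append(s)
        s += rng.expovariate(lam)

m = sum(pooled) / len(pooled)
v = sum((x - m) ** 2 for x in pooled) / len(pooled)
```

A formal check would feed `pooled` into a Kolmogorov-Smirnov test against U(0, t], as Remark 6.3 below suggests; the two moments are a lightweight proxy.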
REMARK 6.3 (TESTING POISSON) The above theorem may also be used to test the hypothesis that a given counting process is a Poisson process. This may be done by observing the process for a fixed time t. If in this time period we observed n occurrences and if the process is Poisson, then the unordered occurrence times would be independently and uniformly distributed on (0, t]. Hence, we may test whether the process is Poisson by testing the hypothesis that the n occurrence times come from a uniform (0, t] population. This may be done by standard statistical procedures such as the Kolmogorov-Smirnov test.

EXAMPLE 6.4 (AN INFINITE SERVER POISSON QUEUE) Suppose that customers arrive at a service station according to a Poisson process with rate λ. Assume that, upon arrival, each customer is immediately served by one of an infinite number of servers. Assume also that the service times are independent, having common distribution G. Let X(t) denote the number of customers in the system at time t.

(Figure: X(t) = number of customers in the system at time t, with arrival times T_1, T_2, T_3, ..., T_n.)

To determine the distribution of X(t), condition on m(t), the total number of customers who have arrived by t. By the Law of Total Probability we have

P(X(t) = j) = Σ_{n=0}^∞ P(X(t) = j | m(t) = n) e^(−λt) (λt)^n / n!.

Now, the probability that a customer who arrives at time x (the time x, given that m(t) = n, is selected uniformly from (0, t] according to the above theorem) is still present at time t is

1 − G(t − x) = P(service time exceeds t − x).

Hence, given that m(t) = n, the probability that an arbitrary one of these customers is still present at time t is

p = ∫_0^t [1 − G(t − x)] (1/t) dx = ∫_0^t [1 − G(x)] / t dx,

independently of the others. Consequently,

P(X(t) = j | m(t) = n) = C(n, j) p^j (1 − p)^(n−j), for j = 0, 1, ..., n, and 0 for j > n,

and thus

P(X(t) = j) = Σ_{n=j}^∞ C(n, j) p^j (1 − p)^(n−j) e^(−λt) (λt)^n / n! = e^(−λpt) (λpt)^j / j!.
In other words, X(t) is Poisson distributed with mean λ ∫_0^t [1 − G(x)] dx; that is, {X(t), t ≥ 0} is a time-inhomogeneous Poisson-distributed count with this mean function. If the service distribution G is exponential with mean service rate µ, then the mean becomes (λ/µ)(1 − e^(−µt)) and the whole system is said to be an M/M/∞ queueing system, which stands for Memoryless arrival time / Memoryless service time / infinite number of servers; the name follows by recalling that the lives and the remaining lives of Poisson streams are exponentially distributed (memoryless). The figure below shows a sample path of {X(t)} from an M/M/∞ with arrival rate λ = 3 and service rate µ = 2:

(Figure: a sample path of X(t), with arrivals and departures marked along the time axis.)

If, instead, G follows a general (non-exponential) distribution, then the system is called an M/G/∞ queueing system.

EXAMPLE 6.5 (AN ELECTRONIC COUNTER) Electrical pulses having random amplitudes arrive at a counter in accordance with a Poisson process with rate λ. With initial amplitude A, the amplitude decays in time, independently of its initial amplitude, and is modeled as Ae^(−αt) a time t later. Suppose the initial amplitudes of the pulses are independent, having a common distribution function F. Denote the initial amplitudes of the pulses A_1, A_2, .... The total amplitude at time t is then

A(t) = Σ_{n=1}^{m(t)} A_n e^(−α(t − T_n)).

Note that A(t) is a continuous-time, continuous-state-space random process. A sample path of A(t) is shown below:
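The M/M/∞ result above can be checked by simulation. The sketch below (using the figure's parameters λ = 3, µ = 2, and an arbitrary observation time t = 4) generates arrivals, attaches an independent exponential service time to each, and counts who is still in service at time t; the mean count should approximate (λ/µ)(1 − e^(−µt)):

```python
import random
import math

rng = random.Random(11)
lam, mu, t, reps = 3.0, 2.0, 4.0, 20_000

in_system = []
for _ in range(reps):
    count, s = 0, rng.expovariate(lam)
    while s <= t:
        if s + rng.expovariate(mu) > t:  # service still in progress at time t
            count += 1
        s += rng.expovariate(lam)
    in_system.append(count)

mean_hat = sum(in_system) / reps
mean_theory = (lam / mu) * (1 - math.exp(-mu * t))  # about 1.5
```

Swapping `rng.expovariate(mu)` for any other service-time sampler turns this into an M/G/∞ check against λ ∫_0^t [1 − G(x)] dx.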
(Figure: a sample path of A(t) over time.)

We shall determine the distribution of A(t) by calculating its moment generating function

M_{A(t)}(u) = E[e^{uA(t)}] = Σ_{n=0}^∞ E[e^{uA(t)} | m(t) = n] e^(−λt) (λt)^n / n!.

Conditioning on m(t) = n, the unordered arrival times are i.i.d. U(0, t] distributed. Hence, given that m(t) = n, A(t) has the same distribution as

Σ_{j=1}^n A_j e^(−α(t − Y_j)), where the Y_j are i.i.d. U(0, t].

It follows then that

E[e^{uA(t)} | m(t) = n] = E[exp{u Σ_{j=1}^n A_j e^(−α(t − Y_j))}] = (E[exp{u A_1 e^(−α(t − Y_1))}])^n.

Now,

E[exp{u A_1 e^(−α(t − Y_1))} | Y_1 = y] = M_{A_1}(u e^(−α(t − y)))

and hence

E[e^{uA(t)} | m(t) = n] = (∫_0^t M_{A_1}(u e^(−α(t − y))) (1/t) dy)^n = ((1/t) ∫_0^t M_{A_1}(u e^(−αy)) dy)^n.

Therefore, with Φ_{m(t)} denoting the probability generating function of m(t),

M_{A(t)}(u) = E[((1/t) ∫_0^t M_{A_1}(u e^(−αy)) dy)^{m(t)}] = Φ_{m(t)}((1/t) ∫_0^t M_{A_1}(u e^(−αy)) dy)
= exp{−λt [1 − (1/t) ∫_0^t M_{A_1}(u e^(−αy)) dy]}
= exp{−λ ∫_0^t [1 − M_{A_1}(u e^(−αy))] dy}.
The moments of A(t) may then be calculated by differentiation. For example, the mean function of the process {A(t), t ≥ 0} is

µ_A(t) = E[A(t)] = M'_{A(t)}(0) = λ ∫_0^t M'_{A_1}(0) e^(−αy) dy = λ E(A_1) (1 − e^(−αt)) / α.

7 Superposition of Poisson Streams

Consider k independent Poisson streams S_i having respective rates λ_i and remaining lives r_i(t) at time t, for 1 ≤ i ≤ k.

DEFINITION 7.1 (SUPERPOSITION OF POISSON STREAMS) A stream S_X is called the superposition of the S_i if an event in S_X occurs whenever an event occurs in at least one of the S_i. Denote by r_X(t) the remaining life of stream S_X at time t.

THEOREM 7.2 (SUPERPOSED STREAM) The remaining life of stream S_X is the shortest remaining life of the S_i, i.e.,

r_X(t) = min_{1≤i≤k} r_i(t).

Moreover, stream S_X is also a Poisson stream, with rate λ_X = Σ_{i=1}^k λ_i. An example of the superposition of two Poisson streams is illustrated in a figure in the textbook.

REMARK 7.3 (NEXT OCCURRENCE) As remarked in Section 4 (Remark 4.11), the probability that the next occurrence of an event in stream S_X comes from stream S_i is λ_i/λ_X. In addition, the time until the next occurrence of an event in stream S_X is independent of whether it comes from any particular stream S_j. That is,

P(r_X(t) < τ and r_j(t) = r_X(t)) = (λ_j/λ_X) [1 − e^(−λ_X τ)], t, τ > 0;

see Exercise 8.21 in the textbook.

Now, consider the Poisson processes associated with the streams. Let {m_i(t)} be the Poisson process defined on stream S_i, 1 ≤ i ≤ k, and {m_X(t)} the Poisson process defined on stream S_X. The theorem below then follows.

THEOREM 7.4 (SUM OF POISSON PROCESSES) The sum of independent Poisson processes is also a Poisson process, with rate equal to the sum of the rates of the independent Poisson processes. The following figure shows sample paths of two independent Poisson processes with rates 1 and 2, respectively, and the sum process:
(Figure: sample paths of m_1(t), m_2(t), and the sum process m_X(t), number of occurrences versus time.)

8 Decomposition of a Stream

Consider a Poisson stream S having rate λ. Upon an occurrence in S, a Bernoulli trial independent of the stream is performed. An occurrence in stream S_X is triggered if the outcome is a success (with probability p); otherwise, an occurrence in stream S_Y is triggered.

DEFINITION 8.1 (DECOMPOSITION OF A STREAM) The two streams S_X and S_Y are said to decompose the stream S. Now, in any duration τ,

P(m occurrences in S_X and k occurrences in S_Y)
= P(m + k occurrences in S, and m successes in m + k trials)
= P(m + k occurrences in S) P(m successes in m + k trials)
= [(λτ)^(m+k) / (m + k)!] e^(−λτ) C(m + k, m) p^m q^k
= [e^(−λpτ) (λpτ)^m / m!] [e^(−λqτ) (λqτ)^k / k!]
= P(m occurrences in S_X) P(k occurrences in S_Y).

Hence the following theorem holds.
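Theorem 7.4 can be checked with a small simulation (a sketch; the rates 1 and 2 mirror the sample-path figure above). Two independent streams are generated from exponential lives and merged; the merged stream's empirical rate and mean inter-occurrence time should match λ_X = 1 + 2 = 3:

```python
import random

rng = random.Random(5)

def stream(rate, horizon, rng):
    """Occurrence times of a homogeneous Poisson stream on (0, horizon]."""
    times, s = [], rng.expovariate(rate)
    while s <= horizon:
        times.append(s)
        s += rng.expovariate(rate)
    return times

horizon = 5_000.0
merged = sorted(stream(1.0, horizon, rng) + stream(2.0, horizon, rng))

rate_hat = len(merged) / horizon   # should be near 1 + 2 = 3

# Inter-occurrence times of the merged stream: mean should be near 1/3.
prev, gaps = 0.0, []
for s in merged:
    gaps.append(s - prev)
    prev = s
mean_gap = sum(gaps) / len(gaps)
```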
THEOREM 8.2 (DECOMPOSED STREAMS) The two streams S_X and S_Y are Poisson streams with rates λp and λq, respectively.

DEFINITION 8.3 (MARKED POISSON PROCESSES) Suppose T_1, T_2, ... are the ordered times of occurrences of a Poisson stream S, and Y_1, Y_2, ... are independent random variables, independent of the stream, having common distribution function G. The sequence of pairs (T_1, Y_1), (T_2, Y_2), ... is called a marked Poisson process.

It follows then from the discussion above:

THEOREM 8.4 (DECOMPOSED POISSON PROCESSES) Let {m(t), t ≥ 0} be the Poisson process corresponding to the Poisson stream S, and suppose the Bernoulli trials are incurred upon occurrences. Then the Poisson processes corresponding to the two streams S_X and S_Y are

m_1(t) = Σ_{k=1}^{m(t)} Y_k and m_0(t) = m(t) − m_1(t) = Σ_{k=1}^{m(t)} (1 − Y_k),

respectively. Moreover, {m_1(t)} is a Poisson process with rate λp and {m_0(t)} is a Poisson process with rate λq. In addition, the two decomposed processes are independent.

REMARK 8.5 (GENERAL DECOMPOSED POISSON PROCESSES) Consider G as a discrete distribution having possible values 0, 1, 2, ... with probability function P(Y_n = k) = a_k > 0 for k = 0, 1, ..., with Σ_{k=0}^∞ a_k = 1. Define

m_k(t) = Σ_{n=1}^{m(t)} I_(Y_n = k), for k = 0, 1, 2, ....

Then {m_0(t)}, {m_1(t)}, ... are independent Poisson processes with rates λa_0, λa_1, ..., respectively. The following is an example for which a_i > 0, i = 0, 1, 2, 3, and a_0 + a_1 + a_2 + a_3 = 1:

(Figure: marks Y_1 = 1, Y_2 = 0, Y_3 = 3, Y_4 = 1, Y_5 = 2, Y_6 = 1, Y_7 = 0, Y_8 = 2, Y_9 = 3, Y_10 = 1 at times T_1, ..., T_10, routed into the component processes m_0(t), m_1(t), m_2(t), m_3(t).)
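Theorem 8.2 (thinning) is equally easy to simulate. The sketch below (λ = 4 and p = 0.4 chosen arbitrarily) performs an independent Bernoulli trial at each occurrence of a rate-λ stream and checks the empirical rates of the two resulting streams against λp and λq:

```python
import random

rng = random.Random(9)
lam, p, horizon = 4.0, 0.4, 2_000.0

n_success = n_failure = 0
s = rng.expovariate(lam)
while s <= horizon:
    if rng.random() < p:
        n_success += 1   # occurrence routed to S_X
    else:
        n_failure += 1   # occurrence routed to S_Y
    s += rng.expovariate(lam)

rate_x = n_success / horizon   # theory: lam * p = 1.6
rate_y = n_failure / horizon   # theory: lam * q = 2.4
```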
EXAMPLE 8.6 (CUSTOMERS' PURCHASES) Customers enter a store according to a Poisson process of rate λ = 8 per hour. According to past experience, each customer, independent of the others, buys something with probability p = 0.4 and leaves without making a purchase with probability q = 1 − p = 0.6. What is the probability that during the first hour 9 people enter the store, with 4 making purchases and 5 not?

To see this, first note that N_1 = m_1(1), the number of customers who make purchases during the first hour, and N_0 = m_0(1), the number of customers who do not make a purchase, are independently Poisson distributed with respective means λp = 3.2 and λq = 4.8. Now,

P(4 make purchases and 5 do not) = P(N_1 = 4, N_0 = 5) = P(N_1 = 4) P(N_0 = 5)
= [3.2^4 e^(−3.2) / 4!] [4.8^5 e^(−4.8) / 5!] = (0.1781)(0.1747) = 0.0311.

EXERCISE 8.7 (FOR MARKED POISSON PROCESSES)

a. Customers demanding service at a central processing station arrive according to a Poisson process of rate λ = 10. Each demand, independent of others, is classified as urgent with probability p_u = 0.1, high priority with probability p_h = 0.2, and low priority with probability p_l = 0.7. What is the probability that 3 urgent, 6 high-priority, and 12 low-priority demands arise in the first two units of time?

b. Shocks occur to a system according to a Poisson process of rate λ. Each shock causes some damage Y to the system, and these damages accumulate. Denote by m(t) the number of shocks up to time t. Assume the damages Y_i are independently and identically distributed. The total damage up to time t is

D(t) = Σ_{i=1}^{m(t)} Y_i.

(a) Suppose P(Y > y) = e^(−θy) for y > 0. Determine the mean and the variance of D(t).

(b) Suppose now P(Y > n) = q^n for n = 0, 1, ..., where 0 < q < 1. The system continues to function as long as the total damage is strictly less than some critical value a, and fails in the contrary circumstance. Determine the mean time to system failure.

c.
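The arithmetic in Example 8.6 can be reproduced exactly with a few lines (a direct evaluation of the two Poisson p.m.f.s; no simulation needed):

```python
import math

def poisson_pmf(k, mean):
    """P(N = k) for N ~ Poisson(mean)."""
    return mean ** k * math.exp(-mean) / math.factorial(k)

lam, p = 8.0, 0.4
q = 1 - p
# N1 ~ Poisson(lam*p) and N0 ~ Poisson(lam*q), independent by Theorem 8.4.
prob = poisson_pmf(4, lam * p) * poisson_pmf(5, lam * q)
print(round(prob, 4))  # 0.0311
```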
Alpha particles are emitted from a fixed mass of material in accordance with a Poisson process of rate λ. Each particle exists for a random duration and is then annihilated. Assume the lifetimes Y_i of these particles are i.i.d., having common distribution function G. Determine the distribution of N(t), the number of particles existing at time t.

d. A tour boat that makes trips through the Houston ship channel leaves every T minutes from Pier 1. Tourists arrive at the pier according to a Poisson process with rate λ. The time until a tourist runs out of patience waiting for the tour boat follows an exponential distribution with rate µ; upon running out of patience, he or she simply leaves without taking the tour boat. Determine the expected number of tourists present at each boat departure epoch.
9 Compound Poisson Processes

In a Poisson stream having rate λ, whenever an event occurs there is a reward Y (or cost). These rewards are independent and identically distributed random variables having common distribution G, mean µ, and variance σ². Let {m(t)} be the Poisson process defined on the stream. Consider the process {n(t), t ≥ 0} defined by

n(t) = Σ_{i=1}^{m(t)} Y_i.

DEFINITION 9.1 (COMPOUND POISSON PROCESSES) The process {n(t)} as described above is called a compound Poisson process. Note that m_1(t) described in the previous section, where each Y is a Bernoulli trial, is a compound Poisson process.

THEOREM 9.2 (MEAN AND VARIANCE OF A COMPOUND POISSON PROCESS)

E[n(t)] = µλt and Var[n(t)] = E(Y²)λt.

EXAMPLE 9.3 (OF COMPOUND POISSON PROCESSES)

a. (RISK THEORY) Suppose claims arrive at an insurance company according to a Poisson process with rate λ. Denote by Y_k the magnitude of claim k. Then A(t) = Σ_{k=1}^{m(t)} Y_k represents the cumulative amount claimed up to time t.

b. (STOCK PRICES) Suppose that transactions in a certain stock take place according to a Poisson process with rate λ. Denote by Y_k the market price change of the stock between the (k−1)th and the kth transactions. Let X(t) = Σ_{k=1}^{m(t)} Y_k represent the total price change up to time t. The convolution notation below can be used to denote the distribution of the sum of independent random variables:

G^(0)(y) = 1 for y ≥ 0, and 0 for y < 0;
G^(n)(y) = P(Σ_{k=1}^n Y_k ≤ y) = ∫ G^(n−1)(y − z) dG(z).

We have then

P(X(t) ≤ x) = Σ_{n=0}^∞ P(Σ_{k=1}^{m(t)} Y_k ≤ x | m(t) = n) (λt)^n e^(−λt) / n! = Σ_{n=0}^∞ G^(n)(x) (λt)^n e^(−λt) / n!.
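Theorem 9.2 can be checked by simulation. In the sketch below (parameters are arbitrary: λ = 2, t = 3, and Exponential(1) rewards, so µ = 1 and E(Y²) = 2), the sample mean and variance of n(t) should approach µλt = 6 and E(Y²)λt = 12:

```python
import random

rng = random.Random(2024)
lam, t, theta, reps = 2.0, 3.0, 1.0, 40_000

totals = []
for _ in range(reps):
    total, s = 0.0, rng.expovariate(lam)
    while s <= t:
        total += rng.expovariate(theta)  # reward collected at this occurrence
        s += rng.expovariate(lam)
    totals.append(total)

mean_hat = sum(totals) / reps                                  # theory: 6
var_hat = sum((x - mean_hat) ** 2 for x in totals) / reps      # theory: 12
```

Note the variance involves the second moment E(Y²) = σ² + µ², not σ² alone; the simulation makes the distinction visible (12 here, versus λtσ² = 6).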
10 Spatial Poisson Processes

DEFINITION 10.1 (POINT PROCESSES) Let T ⊆ R^n and let the parameter space be A = {A : A ⊆ T}. A point process in T is a random process {m(A)} indexed by the sets A in A, taking nonnegative integers as possible values, where m(A) = # of points in A with certain characteristics. Also, for A ∩ B = ∅ with A, B ∈ A, we have m(A ∪ B) = m(A) + m(B).

EXAMPLE 10.2 (OF POINT PROCESSES)
a. spatial distribution of stars or galaxies;
b. spatial distribution of plants or animals;
c. spatial distribution of bacteria on a slide;
d. spatial distribution of defects on a surface or in a volume;
e. spatial distribution of oil in a region.

DEFINITION 10.3 ((HOMOGENEOUS) POISSON POINT PROCESSES) A point process {m(A)} is a (homogeneous) Poisson point process of intensity λ > 0 if
a. m(A) takes only nonnegative integer values, and 0 < P(m(A) = 0) < 1 if 0 < |A| < ∞;
b. the distribution of m(A) depends on A only through its size |A| (independent of shape and location), and P(m(A) = 1) = λ|A| + o(|A|), P(m(A) ≥ 2) = o(|A|) as |A| → 0;
c. if A_1, ..., A_n are mutually exclusive regions, then m(A_1), ..., m(A_n) are independent and

m(∪_{i=1}^n A_i) = Σ_{i=1}^n m(A_i).

THEOREM 10.4 (PROPERTIES OF POISSON POINT PROCESSES)
a. For A ∈ A, m(A) ~ P(λ|A|).
b. For finitely many mutually exclusive sets A_1, ..., A_n in A, we have that m(A_1), ..., m(A_n) are independent. That is, the outcome in one region A does not influence, nor is it influenced by, the outcomes in other regions not overlapping the region A.
c. If B ⊆ A, then P(m(B) = 1 | m(A) = 1) = |B| / |A|.

d. Given that m(A) = n ≥ 1, the n points are uniformly distributed over A.

e. Suppose |A| > 0 and m(A) = n ≥ 1. Consider a partition {A_1, ..., A_k} of A. Then (m(A_1), ..., m(A_k)), given m(A) = n, is multinomially distributed with parameters n and |A_1|/|A|, ..., |A_k|/|A|. That is,

P(m(A_1) = n_1, ..., m(A_k) = n_k | m(A) = n)
= [n! / (n_1! ⋯ n_k!)] |A_1|^{n_1} ⋯ |A_k|^{n_k} / |A|^n, if n = Σ_{i=1}^k n_i with the n_i nonnegative integers,
= 0, otherwise.

EXAMPLE 10.5 (APPLICATIONS OF POISSON POINT PROCESSES)

a. (IN ASTRONOMY) Consider stars distributed in space according to a 3-D Poisson point process of intensity λ. For x, y ∈ R³, let the light intensity exerted at x by a star located at y be

g(x, y, α) = α / ‖x − y‖² = α / [(x_1 − y_1)² + (x_2 − y_2)² + (x_3 − y_3)²],

where α is a random parameter depending on the intensity of the star at y. The intensities associated with stars are i.i.d. random variables having common mean µ_α and variance σ_α². Assume the total intensity exerted at the point x due to light created by different stars accumulates additively. Denote by W(x, A) the total light intensity at x due to light created from all sources located in region A. Then

W(x, A) = Σ_{i=1}^{m(A)} g(x, y_i, α_i) = Σ_{i=1}^{m(A)} α_i / ‖x − y_i‖²,

where y_i is the location of the ith star in A. Since the stars distributed in space follow a Poisson point process, y_i is uniformly distributed in A and hence

E(‖x − y_i‖^(−2)) = (1/|A|) ∫_A ‖x − y‖^(−2) dy.

We have then

E[g(x, y, α)] = E(α) E(‖x − y_i‖^(−2)) = µ_α (1/|A|) ∫_A ‖x − y‖^(−2) dy.

Now,

E[W(x, A)] = E[m(A)] E[g(x, y, α)] = λ µ_α ∫_A ‖x − y‖^(−2) dy.
b. Customer arrivals at a service station follow a Poisson process of unknown rate. Suppose it is known that 10 customers have arrived during the first three hours. Denote by N_i the number of customers arriving during the ith hour, i = 1, 2, 3. Determine the probability that N_1 = 3, N_2 = 4, and N_3 = 3.
Answer: [10! / (3! 4! 3!)] (1/3)³ (1/3)⁴ (1/3)³ = 4200 / 3^10 ≈ 0.0711.

c. Bacteria are distributed throughout a volume of liquid according to a Poisson process of intensity λ = 0.8 organisms per mm³. A measuring device counts the number of bacteria in a 5 mm³ volume of the liquid. Determine the probability that more than three bacteria are in this measured volume.
Answer: The number of bacteria in this measured volume follows a Poisson distribution with mean 5λ = 4. Hence the desired probability is

1 − e^(−4) − 4e^(−4) − 8e^(−4) − (32/3)e^(−4) ≈ 0.5665.

d. For a star in space, denote by D the distance to its nearest neighbor. Show that D has the probability density function

f(d) = (4λπd²) exp{−4λπd³/3}, d > 0,

if the stars are distributed in space in accordance with a Poisson point process of intensity λ.

e. In furniture making, defects (such as air bubbles, contaminants, chips) occur over the surface of a varnished tabletop according to a Poisson point process at a mean rate of one defect per top. Two inspectors each check separate halves of a given table; determine the probability that both inspectors find defects.

f. A piece of a fibrous composite material is sliced across its circular cross section of radius R, revealing fiber ends distributed across the circular area in accordance with a Poisson point process of rate λ fibers per cross section. Denote by D the radial distance of a fiber from the center of the circular cross section. Determine the probability density function of D.

g. Men and women enter a supermarket according to two independent Poisson processes having respective rates (per minute) of two for men and five for women.
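The numerical answers in parts (b) and (c) follow directly from the multinomial property (Theorem 10.4e) and the Poisson p.m.f., and can be reproduced exactly:

```python
import math

# (b) Given 10 arrivals in 3 hours, (N1, N2, N3) is multinomial(10; 1/3, 1/3, 1/3).
p_b = (math.factorial(10) // (math.factorial(3) * math.factorial(4) * math.factorial(3))) * (1 / 3) ** 10
print(round(p_b, 4))  # 0.0711

# (c) Count in 5 mm^3 is Poisson with mean 0.8 * 5 = 4; want P(count > 3).
mean = 4.0
p_c = 1 - sum(mean ** k * math.exp(-mean) / math.factorial(k) for k in range(4))
print(round(p_c, 4))  # 0.5665
```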
(a) Starting at an arbitrary time, calculate the probability that at least two men arrive before the first woman arrives.

(b) Determine the probability that at least two men arrive before the third woman arrives.
h. Poisson point processes can be applied to check random variates generated by a computer algorithm. Say a sequence U_0, U_1, U_2, ... of uniform (0, 1) random variates is generated and a large number of pairs (U_{2n}, U_{2n+1}) are plotted on the unit square. If the numbers generated are random, then the counts of points in a fixed number of small disjoint squares within the unit square should behave like independent Poisson random variables.

11 Examples

EXAMPLE 11.1 (AN OPTIMIZATION PROBLEM) Items arrive at a processing plant in accordance with a Poisson process with rate λ. At a fixed time T, all items are dispatched from the system. To minimize the total waiting time of all items, we wish to pick an intermediate time t ∈ (0, T) at which all items then in the system are also dispatched. Now, if all items are dispatched at times t and T, then the expected total waiting time is

λt²/2 + λ(T − t)²/2,   (1)

which can be reasoned as follows. Over the duration (0, t], the expected number of items is λt; given the count, each arrival time is U(0, t), so the expected wait of an item until the dispatch at t is t/2; hence the expected total waiting time contributed is λt · t/2 = λt²/2. By the same reasoning over the duration (t, T], the expected total waiting time contributed is λ(T − t)²/2.

From Equation (1), the intermediate dispatch time minimizing the expected total waiting time is T/2, with minimal expected total waiting time λT²/4.

Moreover, not only does T/2 minimize the expected total waiting time, it also maximizes the probability that the wait is less than y, for every y. To see this, let t be any intermediate dispatch time. Then the total waiting time is W(T) = Σ_{n=1}^{m(T)} W_t(n), where

W_t(n) = t − T_(n) if T_(n) ≤ t, and T − T_(n) if T_(n) > t,

and T_(1), T_(2), ... are the (ordered) arrival times of the items.
(Figure: number of items versus time, showing arrivals T_(2), ..., T_(n) and their waiting times W_t(2), W_t(n) relative to the dispatch times t and T.)

Now, conditioning on the event that m(T) = n, the (unordered) arrivals T_1, T_2, ... are i.i.d. U(0, T). Therefore, W(T) has the same distribution as Σ_{n=1}^{m(T)} Z_t(n), where m(T) ~ P(λT) and the Z_t(n) are i.i.d. and independent of m(T), having

P(Z_t(n) < y) = [min(y, t) + min(y, T − t)] / T for 0 ≤ y ≤ T, and 1 for y > T.   (2)

Equation (2) holds for the following reasons. To have a waiting time less than y, the item must arrive either between (t − y, t) if t > y, or (0, t) if t < y, an interval of length min(y, t); or between (T − y, T) if T − y > t, or (t, T) if T − y < t, an interval of length min(y, T − t). Hence it has to arrive in one of two disjoint intervals having lengths min(y, t) and min(y, T − t), respectively. The result in Equation (2) then follows since the arrival time of the item is uniformly distributed on (0, T).

It follows then from Equation (2) that, for fixed y, P(Z_t(n) < y) is maximized by t = T/2. Since the distribution of m(T) is independent of t and of the Z_t(n), we have that P(Σ_{n=1}^{m(T)} Z_t(n) < y) = P(Σ_{n=1}^{m(T)} W_t(n) < y) is maximized by t = T/2. Thus, not only does T/2 minimize the expected total waiting time, it also possesses the stronger property of maximizing the probability that the total wait is less than y, for every y.

EXAMPLE 11.2 (SHOT NOISE) The shot noise process models fluctuations in electrical currents that are due to chance arrivals of electrons at an anode. Assume that electrons arrive at an anode in accordance with a Poisson process with intensity λ. Assume also that an arriving electron produces a current whose intensity x time units after arrival is given by the impulse response function h(x). Examples of impulse response functions include
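The minimization claim for Equation (1) in Example 11.1 can be confirmed with a direct grid search (λ = 1 and T = 10 are arbitrary illustrative values):

```python
def expected_wait(lam, T, t):
    """Expected total waiting time with dispatches at times t and T (Equation 1)."""
    return lam * t ** 2 / 2 + lam * (T - t) ** 2 / 2

lam, T = 1.0, 10.0
grid = [i * T / 1000 for i in range(1, 1000)]
best_t = min(grid, key=lambda t: expected_wait(lam, T, t))
# Minimizer should be T/2 = 5.0, with minimal value lam*T^2/4 = 25.0.
```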
the power law shot noise: h(x) = x^(−θ), for x > 0;
the decaying exponential shot noise: h(x) = e^(−θx), for x > 0;
triangles, rectangles, etc.

The intensity of the current at time t is then the shot noise

I(t) = Σ_{k=1}^{m(t)} h(t − T_k),

where the T_k are the (ordered) arrival times. Note that, conditioning on m(t) = n, I(t) has the same distribution as

Σ_{k=1}^n h(U_k),

where the U_k are i.i.d. U(0, t]. That is, the (unconditional) distribution of I(t) is the same as that of the random sum Σ_{k=1}^{m(t)} ε_k, where ε_k = h(U_k), k = 1, 2, ..., are i.i.d.

EXERCISE 11.3 (LAST ONE)

a. Miller Auditorium accepts season ticket subscriptions during an interval (0, t] starting at time 0. Each subscription, with probability p_i, entails the purchase of i tickets, i = 1, 2, 3, 4, at a cost of c dollars per ticket payable at the time of purchase. A dollar received u time units from now is discounted with a factor of β (i.e., has the present value e^(−βu)). Assume the subscriptions arrive according to a Poisson process with rate λ. Compute the expected present value at time 0 of all revenues received from the ticket sales in (0, t].

b. An earthquake caused serious damage in a given locality. The locations of fatalities follow a two-dimensional homogeneous Poisson point process with intensity λ = 0.1 per square mile. What is the probability that there are no fatalities within a ten-mile radius of the city hall?

c. (DATABASE MAINTENANCE) At time 0, a file in a database is loaded with r records. Additional records will be added in the future according to a Poisson process. For each record in the file, its useful life follows a distribution G. At the end of an interval of length t, there are two types of records, dead and live records. Denote by D(t) and L(t), respectively, the
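For the decaying exponential shot noise, the mean of I(t) can be computed the same way as E[A(t)] in Example 6.5 (the electronic counter is exactly this model with unit amplitudes): E[I(t)] = λ ∫_0^t h(x) dx = λ(1 − e^(−θt))/θ. The sketch below (λ = 2, θ = 1, t = 5 chosen arbitrarily) checks this by simulation:

```python
import random
import math

rng = random.Random(17)
lam, theta, t, reps = 2.0, 1.0, 5.0, 30_000

vals = []
for _ in range(reps):
    # I(t) = sum over arrivals s <= t of h(t - s), with h(x) = exp(-theta*x).
    total, s = 0.0, rng.expovariate(lam)
    while s <= t:
        total += math.exp(-theta * (t - s))
        s += rng.expovariate(lam)
    vals.append(total)

mean_hat = sum(vals) / reps
mean_theory = lam * (1 - math.exp(-theta * t)) / theta  # about 1.99
```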
numbers of dead and live records. Then m(t) = D(t) + L(t). (Dead records are logically flagged as deleted, with actual physical removal done when the file is compacted at database maintenance time.) The ratio

    a(t) = E[m(t)] / E[L(t)]

gives the mean search overhead per live record reference at time t (or the average number of record accesses per reference). Express a(t) for each of the cases below. Also, for each numerical example given in parentheses, plot the result:

                                       arrival rate
    useful life
    distribution     λ(t) = λ (constant)              λ(t) (time-varying)
    -----------------------------------------------------------------------------
    E(l)             case (a)                         case (b)
                     (r = 5, l = 2 days,              (r = 5, l = 2 days,
                      λ = 5.58 per day)                ln λ(t) = t sin(πt/15) + .275 cos(πt/15))
    G                case (c)                         case (d)

pp-21
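As a numerical check of the random-sum representation in Example 11.2, the sketch below simulates the shot noise I(t) both from the ordered arrival times and as the random sum of the ε_k's, comparing the sample means against E[I(t)] = λ times the integral of h over (0, t). The decaying exponential h(x) = e^{-θx} and all parameter values here are arbitrary choices for illustration:

```python
import math
import random

def shot_noise(lam, theta, t, rng):
    """I(t) = sum_k h(t - T_k) from the ordered Poisson(lam) arrival
    times T_k on (0, t), with h(x) = exp(-theta * x)."""
    current, s = 0.0, rng.expovariate(lam)
    while s < t:
        current += math.exp(-theta * (t - s))
        s += rng.expovariate(lam)
    return current

def shot_noise_random_sum(lam, theta, t, rng):
    """Same distribution via the random sum: m(t) ~ Poisson(lam*t) terms
    eps_k = h(U_k) with U_k i.i.d. Uniform(0, t]."""
    n, s = 0, rng.expovariate(lam)
    while s < t:  # count arrivals on (0, t) to draw m(t)
        n, s = n + 1, s + rng.expovariate(lam)
    return sum(math.exp(-theta * rng.uniform(0.0, t)) for _ in range(n))

lam, theta, t, reps = 3.0, 0.5, 8.0, 20000
rng = random.Random(0)
m1 = sum(shot_noise(lam, theta, t, rng) for _ in range(reps)) / reps
m2 = sum(shot_noise_random_sum(lam, theta, t, rng) for _ in range(reps)) / reps
exact = lam * (1.0 - math.exp(-theta * t)) / theta  # lam * integral of h on (0, t)
print(m1, m2, exact)  # the two sample means should both be close to exact
```

Both estimators target the same distribution, so their sample means agree with the exact mean up to Monte Carlo error.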