Continuous-time Markov Chains


1 Continuous-time Markov Chains
Gonzalo Mateos
Dept. of ECE and Goergen Institute for Data Science, University of Rochester
October 23, 2017
Introduction to Random Processes: Continuous-time Markov Chains

2 Exponential random variables
- Exponential random variables
- Counting processes and definition of Poisson processes
- Properties of Poisson processes

3 Exponential distribution
- Exponential RVs often model times at which events occur, or the time elapsed between occurrences of random events
- RV $T \sim \exp(\lambda)$ is exponential with parameter $\lambda$ if its pdf is $f_T(t) = \lambda e^{-\lambda t}$, for all $t \geq 0$
- The cdf, the integral of the pdf, is $F_T(t) = P(T \leq t) = 1 - e^{-\lambda t}$
- The complementary (c)cdf is $P(T > t) = 1 - F_T(t) = e^{-\lambda t}$
[Figure: pdf and cdf of T ~ exp(λ) for λ = 1]
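As a quick numerical sanity check on the slide's pdf/cdf relationship (a sketch, not part of the slides; the rate λ = 2 and evaluation point t = 1.3 are arbitrary choices), numerically integrating the pdf over [0, t] recovers the closed-form cdf 1 − e^{−λt}:

```python
import math

lam = 2.0  # assumed rate, arbitrary choice for illustration

def pdf(t):
    return lam * math.exp(-lam * t)

def cdf_numeric(t, steps=100_000):
    # Midpoint Riemann sum of the pdf over [0, t]
    h = t / steps
    return sum(pdf((k + 0.5) * h) for k in range(steps)) * h

t = 1.3
cdf_closed = 1 - math.exp(-lam * t)      # closed-form cdf from the slide
gap = abs(cdf_numeric(t) - cdf_closed)   # integration vs closed form
```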

4 Expected value
- Expected value of time $T \sim \exp(\lambda)$ is
  $E[T] = \int_0^\infty t \lambda e^{-\lambda t}\,dt = \left[-t e^{-\lambda t}\right]_0^\infty + \int_0^\infty e^{-\lambda t}\,dt = \frac{1}{\lambda}$
- Integrate by parts with $u = t$, $dv = \lambda e^{-\lambda t}\,dt$
- Mean time is the inverse of the parameter $\lambda$
  ⇒ $\lambda$ is the rate/frequency of events happening at intervals T
  ⇒ Interpret: an average of $\lambda t$ events by time t
- Bigger $\lambda$ ⇒ smaller expected times ⇒ larger frequency of events
[Figure: timeline of interarrival times T_1, ..., T_10 and arrival times S_1, ..., S_10, with markers at t = 5/λ and t = 10/λ]

5 Second moment and variance
- For the second moment, also integrate by parts ($u = t^2$, $dv = \lambda e^{-\lambda t}\,dt$)
  $E[T^2] = \int_0^\infty t^2 \lambda e^{-\lambda t}\,dt = \left[-t^2 e^{-\lambda t}\right]_0^\infty + \int_0^\infty 2t e^{-\lambda t}\,dt$
- The first term is 0, the second is $(2/\lambda) E[T]$
  $E[T^2] = \frac{2}{\lambda} \int_0^\infty t \lambda e^{-\lambda t}\,dt = \frac{2}{\lambda^2}$
- The variance is computed from the mean and second moment
  $\mathrm{var}[T] = E[T^2] - E^2[T] = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}$
- Parameter $\lambda$ controls both mean and variance of the exponential RV
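A minimal simulation sketch of these moment formulas (the rate λ = 2 and sample size are assumptions, not from the slides): the empirical mean and variance of exponential draws should approach 1/λ = 0.5 and 1/λ² = 0.25.

```python
import random
import statistics

random.seed(0)
lam = 2.0  # assumed rate for illustration
samples = [random.expovariate(lam) for _ in range(200_000)]

mean_est = statistics.fmean(samples)     # should be close to 1/lam = 0.5
var_est = statistics.pvariance(samples)  # should be close to 1/lam**2 = 0.25
```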

6 Memoryless random times
- Def: Consider a random time T. We say T is memoryless if
  $P(T > s + t \mid T > t) = P(T > s)$
- The probability of waiting s extra units of time (e.g., seconds), given that we already waited t seconds, is just the probability of waiting s seconds
  ⇒ The system does not remember it has already waited t seconds
  ⇒ Same probability irrespective of the time already elapsed
- Ex: Chemical reaction A + B → AB occurs when molecules A and B collide. A and B move around randomly. Time T until the reaction occurs

7 Exponential RVs are memoryless
- Write the memoryless property in terms of the joint distribution
  $P(T > s + t \mid T > t) = \frac{P(T > s + t, T > t)}{P(T > t)} = P(T > s)$
- Notice the event $\{T > s + t, T > t\}$ is equivalent to $\{T > s + t\}$
- Replace $P(T > s + t, T > t) = P(T > s + t)$ and reorder
  $P(T > s + t) = P(T > t)\, P(T > s)$
- If $T \sim \exp(\lambda)$, the ccdf is $P(T > t) = e^{-\lambda t}$ so that
  $P(T > s + t) = e^{-\lambda(s+t)} = e^{-\lambda t} e^{-\lambda s} = P(T > t)\, P(T > s)$
- If random time T is exponential ⇒ T is memoryless
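The memoryless identity can also be checked by simulation (a sketch; the values λ = 1, s = 0.7, t = 1.2 are arbitrary assumptions): among draws that exceed t, the fraction that also exceed t + s should match the unconditional P(T > s) = e^{−λs}.

```python
import math
import random

random.seed(1)
lam, s, t = 1.0, 0.7, 1.2  # assumed values for illustration
draws = [random.expovariate(lam) for _ in range(300_000)]

# Conditional probability P(T > s + t | T > t), estimated empirically
survived_t = [x for x in draws if x > t]
cond_est = sum(x > s + t for x in survived_t) / len(survived_t)

uncond = math.exp(-lam * s)  # P(T > s), which memorylessness says it equals
```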

8 Continuous memoryless RVs are exponential
- Consider a function g(t) with the property $g(t + s) = g(t)\, g(s)$
- Q: Functional form of g(t)? Take logarithms
  $\log g(t + s) = \log g(t) + \log g(s)$
  ⇒ Only holds for all t and s if $\log g(t) = ct$ for some constant c
  ⇒ Which, in turn, can only hold if $g(t) = e^{ct}$ for some constant c
- Compare this observation with the statement of the memoryless property
  $P(T > s + t) = P(T > t)\, P(T > s)$
- It must be that $P(T > t) = e^{ct}$ for some constant c (necessarily c < 0)
  ⇒ T continuous: only true for the exponential $T \sim \exp(-c)$
  ⇒ T discrete: only the geometric $P(T > t) = (1 - p)^t$ with $(1 - p) = e^{c}$
- If a continuous random time T is memoryless ⇒ T is exponential

9 Main memoryless property result
Theorem: A continuous random variable T is memoryless if and only if it is exponentially distributed. That is,
  $P(T > s + t \mid T > t) = P(T > s)$ if and only if $f_T(t) = \lambda e^{-\lambda t}$ for some $\lambda > 0$
- Exponential RVs are memoryless; they do not remember elapsed time
  ⇒ Only type of continuous memoryless RVs
- A discrete RV T is memoryless if and only if it is geometric
  ⇒ Geometrics are discrete approximations of exponentials
  ⇒ Exponentials are continuous limits of geometrics
- Exponential = time until success. Geometric = number of trials until success

10 Exponential times example
- The first customer's arrival to a store takes $T \sim \exp(1/10)$ minutes
- Suppose 5 minutes have passed without an arrival
- Q: How likely is it that the customer arrives within the next 3 minutes? Use the memoryless property of exponential T
  $P(T \leq 8 \mid T > 5) = 1 - P(T > 8 \mid T > 5) = 1 - P(T > 3) = 1 - e^{-3\lambda} = 1 - e^{-0.3}$
- Q: How likely is it that the customer arrives after time T = 9?
  $P(T > 9 \mid T > 5) = P(T > 4) = e^{-4\lambda} = e^{-0.4}$
- Q: What is the expected additional time until the first arrival? The additional time is T − 5, and $P(T - 5 > t \mid T > 5) = P(T > t)$
  $E[T - 5 \mid T > 5] = E[T] = 1/\lambda = 10$
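The three answers in the store example evaluate numerically as follows (λ = 1/10 per minute, as given in the slide):

```python
import math

lam = 1 / 10  # arrival rate, per minute (from the example)

# P(T <= 8 | T > 5) = 1 - P(T > 3), by memorylessness
p_within_3 = 1 - math.exp(-3 * lam)   # 1 - e^{-0.3}

# P(T > 9 | T > 5) = P(T > 4)
p_after_9 = math.exp(-4 * lam)        # e^{-0.4}

# E[T - 5 | T > 5] = E[T] = 1/lambda
expected_extra = 1 / lam              # 10 minutes
```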

11 Time to first event
- Independent exponential RVs $T_1, T_2$ with parameters $\lambda_1, \lambda_2$
- Q: Probability distribution of the time to the first event, i.e., $T := \min(T_1, T_2)$?
- To have T > t we need both $T_1 > t$ and $T_2 > t$. Using independence of $T_1$ and $T_2$ we can write
  $P(T > t) = P(T_1 > t, T_2 > t) = P(T_1 > t)\, P(T_2 > t)$
- Substituting the expressions of the exponential ccdfs
  $P(T > t) = e^{-\lambda_1 t} e^{-\lambda_2 t} = e^{-(\lambda_1 + \lambda_2) t}$
- $T := \min(T_1, T_2)$ is exponentially distributed with parameter $\lambda_1 + \lambda_2$
- In general, for n independent RVs $T_i \sim \exp(\lambda_i)$ define $T := \min_i T_i$
  ⇒ T is exponentially distributed with parameter $\sum_{i=1}^n \lambda_i$
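A simulation sketch of the minimum of two exponentials (the rates λ₁ = 1 and λ₂ = 2 are assumed values): the empirical mean of min(T₁, T₂) should approach 1/(λ₁ + λ₂) = 1/3.

```python
import random

random.seed(2)
lam1, lam2 = 1.0, 2.0  # assumed rates for illustration
n = 200_000
mins = [min(random.expovariate(lam1), random.expovariate(lam2))
        for _ in range(n)]

mean_min = sum(mins) / n  # should approach 1 / (lam1 + lam2) = 1/3
```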

12 First event to happen
- Q: Probability $P(T_1 < T_2)$ of $T_1 \sim \exp(\lambda_1)$ happening before $T_2 \sim \exp(\lambda_2)$?
- Condition on $T_2 = t$, integrate over the pdf of $T_2$, and use independence
  $P(T_1 < T_2) = \int_0^\infty P(T_1 < t \mid T_2 = t)\, f_{T_2}(t)\,dt = \int_0^\infty F_{T_1}(t)\, f_{T_2}(t)\,dt$
- Substitute the expressions for the exponential pdf and cdf
  $P(T_1 < T_2) = \int_0^\infty (1 - e^{-\lambda_1 t})\, \lambda_2 e^{-\lambda_2 t}\,dt = \frac{\lambda_1}{\lambda_1 + \lambda_2}$
- Either $T_1$ comes before $T_2$ or vice versa, hence
  $P(T_2 < T_1) = 1 - P(T_1 < T_2) = \frac{\lambda_2}{\lambda_1 + \lambda_2}$
- Probabilities are relative values of the rates (parameters)
  ⇒ Larger rate ⇒ smaller average ⇒ higher probability of happening first
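A Monte Carlo check of the rate-ratio formula (assumed rates λ₁ = 1, λ₂ = 3, so the theoretical answer is 1/4):

```python
import random

random.seed(3)
lam1, lam2 = 1.0, 3.0  # assumed rates for illustration
n = 200_000
wins = sum(random.expovariate(lam1) < random.expovariate(lam2)
           for _ in range(n))

p_first = wins / n               # empirical P(T1 < T2)
p_theory = lam1 / (lam1 + lam2)  # = 0.25
```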

13 Additional properties of exponential RVs
- Consider n independent RVs $T_i \sim \exp(\lambda_i)$, with $T_i$ the time to the i-th event
- Probability of the j-th event happening first
  $P\left(T_j = \min_i T_i\right) = \frac{\lambda_j}{\sum_{i=1}^n \lambda_i}, \quad j = 1, \ldots, n$
- The time to the first event and the rank ordering of events are independent
  $P\left(\min_i T_i \geq t,\; T_{i_1} < \ldots < T_{i_n}\right) = P\left(\min_i T_i \geq t\right) P\left(T_{i_1} < \ldots < T_{i_n}\right)$
- Suppose $T \sim \exp(\lambda)$, independent of a non-negative RV X. The strong memoryless property asserts
  $P(T > X + t \mid T > X) = P(T > t)$
  ⇒ Also forgets random, but independent, elapsed times

14 Strong memoryless property example
- Independent customer arrival times $T_i \sim \exp(\lambda_i)$, i = 1, 2, 3
- Suppose customer 3 arrives first, i.e., $\min(T_1, T_2) > T_3$
- Q: Probability that the time between the first and second arrival exceeds t?
- We want to compute $P\left(\min(T_1, T_2) - T_3 > t \mid \min(T_1, T_2) > T_3\right)$
- Denote by $T_{i_2} := \min(T_1, T_2)$ the time to the second arrival
- Recall $T_{i_2} \sim \exp(\lambda_1 + \lambda_2)$, with $T_{i_2}$ independent of $T_{i_1} = T_3$
- Apply the strong memoryless property
  $P\left(T_{i_2} - T_3 > t \mid T_{i_2} > T_3\right) = P(T_{i_2} > t) = e^{-(\lambda_1 + \lambda_2) t}$

15 Probability of event in infinitesimal time
- Q: Probability of an event happening in infinitesimal time h?
- Want P(T < h) for small h. Equivalent to
  $P(T < h) = \int_0^h \lambda e^{-\lambda t}\,dt \approx \lambda h$
- Sometimes also written as $P(T < h) = \lambda h + o(h)$
  ⇒ o(h) implies $\lim_{h \to 0} \frac{o(h)}{h} = 0$, read as "negligible with respect to h"
- Q: Two independent events in infinitesimal time h?
  $P(T_1 \leq h, T_2 \leq h) \approx (\lambda_1 h)(\lambda_2 h) = \lambda_1 \lambda_2 h^2 = o(h)$
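A quick numeric look at P(T < h) = λh + o(h) (the rate λ = 2 is an assumed value): the ratio P(T < h)/h approaches λ as h shrinks.

```python
import math

lam = 2.0  # assumed rate for illustration

# Exact P(T < h)/h for shrinking h; should approach lam from below
ratios = [(1 - math.exp(-lam * h)) / h for h in (0.1, 0.01, 0.001)]
```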

16 Counting and Poisson processes
- Exponential random variables
- Counting processes and definition of Poisson processes
- Properties of Poisson processes

17 Counting processes
- Random process N(t) in continuous time $t \in \mathbb{R}_+$
- Def: A counting process N(t) counts the number of events by time t
  ⇒ Nonnegative integer valued: N(0) = 0, N(t) ∈ {0, 1, 2, ...}
  ⇒ Nondecreasing: N(s) ≤ N(t) for s < t
  ⇒ Event counter: N(t) − N(s) = number of events in the interval (s, t]
- N(t) is continuous from the right
  ⇒ $N(S_4) - N(S_2) = 2$, while $N(S_4) - N(S_2 - \epsilon) = 3$ for small $\epsilon > 0$
- Ex.1: Number of text messages (SMS) typed since the beginning of class
- Ex.2: Number of economic crises since a reference year
- Ex.3: Number of customers at Wegmans since 8 am this morning
[Figure: sample path of N(t) with jumps at arrival times S_1, ..., S_6]

18 Independent increments
- Consider times $s_1 < t_1 < s_2 < t_2$ and intervals $(s_1, t_1]$ and $(s_2, t_2]$
  ⇒ $N(t_1) - N(s_1)$ events occur in $(s_1, t_1]$
  ⇒ $N(t_2) - N(s_2)$ events occur in $(s_2, t_2]$
- Def: Independent increments means the latter two are independent
  $P(N(t_1) - N(s_1) = k,\, N(t_2) - N(s_2) = l) = P(N(t_1) - N(s_1) = k)\, P(N(t_2) - N(s_2) = l)$
  ⇒ Numbers of events in disjoint time intervals are independent
- Ex.1: Likely true for SMS, except for follow-up messages one has to send
- Ex.2: Most likely not true for economic crises (business cycle)
- Ex.3: Likely true for Wegmans, except for unforeseen events (storms)
- Does not mean N(t) is independent of N(s), say for t > s
  ⇒ These are clearly dependent, since N(t) is at least N(s)

19 Stationary increments
- Consider time intervals (0, t] and (s, s + t]
  ⇒ N(t) events occur in (0, t]
  ⇒ N(s + t) − N(s) events occur in (s, s + t]
- Def: Stationary increments means the latter two have the same probability distribution
  $P(N(s + t) - N(s) = k) = P(N(t) = k)$
  ⇒ The probability distribution of the number of events depends only on the length of the interval
- Ex.1: Likely true if the lecture is good and you keep interest in the class
- Ex.2: Maybe true, if you do not believe we become better at preventing crises
- Ex.3: Most likely not true because of, e.g., rush hours and slow days

20 Poisson process
- Def: A counting process N(t) is a Poisson process if
  (a) The process has stationary and independent increments
  (b) The number of events in (0, t] has Poisson distribution with mean λt
  $P(N(t) = n) = e^{-\lambda t} \frac{(\lambda t)^n}{n!}$
- An equivalent definition is the following
  (i) The process has stationary and independent increments
  (ii) Single event in infinitesimal time: $P(N(h) = 1) = \lambda h + o(h)$
  (iii) Multiple events in infinitesimal time: $P(N(h) > 1) = o(h)$
- A more intuitive definition (even if hard to grasp now)
  ⇒ Conditions (i) and (a) are the same
  ⇒ That (b) implies (ii) and (iii) is obvious: substitute small h in the Poisson pmf's expression for P(N(t) = n)
  ⇒ To see that (ii) and (iii) imply (b) requires some work
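As a sanity check on the Poisson pmf in (b) (the values λ = 2, t = 3 are assumptions for illustration), the probabilities sum to 1 and the mean is λt:

```python
import math

lam, t = 2.0, 3.0  # assumed rate and horizon for illustration
mu = lam * t       # mean number of events in (0, t]

def pmf(n):
    # Poisson pmf: e^{-mu} mu^n / n!
    return math.exp(-mu) * mu**n / math.factorial(n)

N = 60  # truncation point; the tail beyond is negligible for mu = 6
total = sum(pmf(n) for n in range(N))
mean = sum(n * pmf(n) for n in range(N))
```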

21 Explanation of model (i)-(iii)
- Consider time T and divide the interval (0, T] into n subintervals
  ⇒ Subintervals are of duration h = T/n; h vanishes as n increases
  ⇒ The m-th subinterval spans $((m-1)h, mh]$
- Define $A_m$ as the number of events that occur in the m-th subinterval
  $A_m = N(mh) - N((m-1)h)$
- The total number of events in (0, T] is the sum of the $A_m$, m = 1, ..., n
  $N(T) = \sum_{m=1}^n A_m = \sum_{m=1}^n \left[ N(mh) - N((m-1)h) \right]$
- In the figure, N(T) = 5; $A_1, A_2, A_4, A_7, A_8$ are 1 and $A_3, A_5, A_6$ are 0
[Figure: timeline (0, T] divided into subintervals of width h, with events at S_1, ..., S_5]

22 Probability distribution of A_m (intuitive argument)
- Note first that since increments are stationary as per (i), it holds
  $P(A_m = k) = P\left(N(mh) - N((m-1)h) = k\right) = P(N(h) = k)$
- In particular, using (ii) and (iii)
  $P(A_m = 1) = P(N(h) = 1) = \lambda h + o(h)$
  $P(A_m > 1) = P(N(h) > 1) = o(h)$
- Set aside the o(h) probabilities ⇒ they are negligible with respect to λh
  $P(A_m = 1) = \lambda h, \qquad P(A_m = 0) = 1 - \lambda h$
- $A_m$ is Bernoulli with parameter λh
[Figure: timeline (0, T] divided into subintervals of width h, with events at S_1, ..., S_5]

23 Probability distribution of N(T) (intuitive argument)
- Since increments are also independent as per (i), the $A_m$ are independent
  ⇒ N(T) is a sum of n independent Bernoulli RVs with parameter λh
  ⇒ N(T) is binomial with parameters $(n, \lambda h) = (n, \lambda T / n)$
- As the interval length h → 0, the number of intervals n → ∞
  ⇒ The product $n \lambda h = \lambda T$ stays constant
  ⇒ N(T) is Poisson with parameter λT (law of rare events)
- Then (ii)-(iii) imply (b), and the definitions are equivalent
  ⇒ Not a proof, because we neglected the o(h) terms, but it explains what a Poisson process is
[Figure: timeline (0, T] divided into subintervals of width h, with events at S_1, ..., S_5]
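The binomial-to-Poisson limit (law of rare events) can be seen numerically (a sketch; λT = 6 is an assumed value): Binomial(n, λT/n) pmfs approach the Poisson(λT) pmf as n grows.

```python
import math

mu = 6.0  # assumed value of lambda * T

def poisson_pmf(k):
    return math.exp(-mu) * mu**k / math.factorial(k)

def binom_pmf(k, n):
    # Binomial(n, mu/n) pmf, the distribution of N(T) before taking limits
    p = mu / n
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Largest pointwise pmf gap over k = 0..30, for increasingly fine partitions
gaps = [max(abs(binom_pmf(k, n) - poisson_pmf(k)) for k in range(31))
        for n in (60, 600, 6000)]
```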

24 What is a Poisson process?
- Fundamental defining properties of a Poisson process
  ⇒ Events happen in a small interval h with probability λh, proportional to h
  ⇒ Whether an event happens in an interval has no effect on other intervals
- Modeling questions
  Q1: Do we expect the probability of an event to be proportional to the length of the interval?
  Q2: Do we expect subsequent intervals to behave independently?
  ⇒ If the answers are positive, then a Poisson process model is appropriate
- Typically arises in a large population of agents acting independently
  ⇒ The larger the interval, the larger the chance an agent takes an action
  ⇒ The action of one agent has no effect on the actions of other agents, and therefore a negligible effect on the action of the group

25 Examples of Poisson processes
- Ex.1: Number of people arriving at a subway station. Number of cars arriving at a highway entrance. Number of customers entering a store. A large number of agents (people, drivers, customers) acting independently
- Ex.2: SMS generated by all students in the class. Once you send an SMS you are likely to stay silent for a while, but in a large population this has a minimal effect on the probability of someone generating an SMS
- Ex.3: Count of molecule reactions. Molecules are removed from the pool of reactants once they react, but the effect is negligible in a large population. Eventually the reactants are depleted, but on a small time scale the process is approximately Poisson

26 Formal argument to show equivalence
- Define $A_{\max} := \max_{m=1,\ldots,n} A_m$, the maximum number of events in one interval
  ⇒ If $A_{\max} \leq 1$, all intervals have 0 or 1 events
- Consider the probability $P\left(N(T) = k \mid A_{\max} \leq 1\right)$
- For given h, N(T) conditioned on $A_{\max} \leq 1$ is binomial
  ⇒ Parameters are $n = T/h$ and $p = \lambda h + o(h)$
- Interval length h → 0 ⇒ parameter p → 0, number of intervals n → ∞. For the product np,
  $\lim_{h \to 0} np = \lim_{h \to 0} (T/h)(\lambda h + o(h)) = \lambda T$
- N(T) conditioned on $A_{\max} \leq 1$ is Poisson with parameter λT
  $P\left(N(T) = k \mid A_{\max} \leq 1\right) \to e^{-\lambda T} \frac{(\lambda T)^k}{k!}$

27 Formal argument to show equivalence (continued)
- Separate the study into $A_{\max} \leq 1$ and $A_{\max} > 1$. That is, condition
  $P(N(T) = k) = P\left(N(T) = k \mid A_{\max} \leq 1\right) P(A_{\max} \leq 1) + P\left(N(T) = k \mid A_{\max} > 1\right) P(A_{\max} > 1)$
- Property (iii) implies that $P(A_{\max} > 1)$ vanishes as h → 0
  $P(A_{\max} > 1) \leq \sum_{m=1}^n P(A_m > 1) = n\,o(h) = T\,\frac{o(h)}{h} \to 0$
- Thus, as h → 0, $P(A_{\max} > 1) \to 0$ and $P(A_{\max} \leq 1) \to 1$. Then
  $\lim_{h \to 0} P(N(T) = k) = \lim_{h \to 0} P\left(N(T) = k \mid A_{\max} \leq 1\right)$
- The right-hand side is Poisson ⇒ N(T) is Poisson with parameter λT

28 Properties of Poisson processes
- Exponential random variables
- Counting processes and definition of Poisson processes
- Properties of Poisson processes

29 Arrival times and interarrival times
[Figure: sample path of N(t) with interarrival times T_1, ..., T_6 and arrival times S_1, ..., S_6]
- Let $S_1, S_2, \ldots$ be the sequence of absolute times of events (arrivals)
- Def: $S_i$ is known as the i-th arrival time. Notice that $S_i = \min\{t : N(t) \geq i\}$
- Let $T_1, T_2, \ldots$ be the sequence of times between events
- Def: $T_i$ is known as the i-th interarrival time
- Useful identities: $S_i = \sum_{k=1}^i T_k$ and $T_i = S_i - S_{i-1}$, where $S_0 = 0$

30 Interarrival times are i.i.d. exponential RVs
- Ccdf of $T_1$: $P(T_1 > t) = P(N(t) = 0) = e^{-\lambda t}$
  ⇒ $T_1$ has exponential distribution with parameter λ
- Since increments are stationary and independent, it is likely the $T_i$ are i.i.d.
- Theorem: The interarrival times $T_i$ of a Poisson process are independent identically distributed exponential random variables with parameter λ, i.e., $P(T_i > t) = e^{-\lambda t}$
- We have already proved this for $T_1$. Let us see the rest

31 Interarrival times are i.i.d. exponential RVs
Proof. Recall $S_i$ is the i-th arrival time. Condition on $S_i$:
  $P(T_{i+1} > t) = \int P\left(T_{i+1} > t \mid S_i = s\right) f_{S_i}(s)\,ds$
- To have $T_{i+1} > t$ given that $S_i = s$, it must be that $N(s + t) = N(s)$
  $P\left(T_{i+1} > t \mid S_i = s\right) = P\left(N(t + s) - N(s) = 0 \mid N(s) = i\right)$
- Since increments are independent, drop the conditioning on N(s) = i
  $P\left(T_{i+1} > t \mid S_i = s\right) = P(N(t + s) - N(s) = 0)$
- Since increments are also stationary and N(t) is Poisson, then
  $P\left(T_{i+1} > t \mid S_i = s\right) = P(N(t) = 0) = e^{-\lambda t}$
- Substituting into the integral yields $P(T_{i+1} > t) = e^{-\lambda t}$ ∎

32 Interarrival times example
- Let $N_1(t)$ and $N_2(t)$ be Poisson processes with rates $\lambda_1$ and $\lambda_2$. Suppose $N_1(t)$ and $N_2(t)$ are independent
- Q: What is the expected time till the first arrival from either process?
- Denote by $S_1^{(i)}$ the first arrival time from process i = 1, 2
  ⇒ We are looking for $E\left[\min\left(S_1^{(1)}, S_1^{(2)}\right)\right]$
- Note that $S_1^{(1)} = T_1^{(1)}$ and $S_1^{(2)} = T_1^{(2)}$
  ⇒ $S_1^{(1)} \sim \exp(\lambda_1)$ and $S_1^{(2)} \sim \exp(\lambda_2)$; also, $S_1^{(1)}$ and $S_1^{(2)}$ are independent
- Recall that $\min\left(S_1^{(1)}, S_1^{(2)}\right) \sim \exp(\lambda_1 + \lambda_2)$, then
  $E\left[\min\left(S_1^{(1)}, S_1^{(2)}\right)\right] = \frac{1}{\lambda_1 + \lambda_2}$

33 Alternative definition of Poisson process
- Start with a sequence of independent random times $T_1, T_2, \ldots$
  ⇒ Times $T_i \sim \exp(\lambda)$ have exponential distribution with parameter λ
- Define the i-th arrival time $S_i = T_1 + T_2 + \ldots + T_i$
- Define the counting process of events occurred by time t
  $N(t) = \max\{i : S_i \leq t\}$
  ⇒ N(t) is a Poisson process
[Figure: sample path of N(t) with interarrival times T_i and arrival times S_1, ..., S_6]
- If N(t) is a Poisson process ⇒ interarrival times $T_i$ are i.i.d. exponential
- To show that this definition is equivalent we have to show the converse
  ⇒ If interarrival times are i.i.d. exponential, the process is Poisson
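Def. 3 suggests a direct simulation recipe (a sketch; the values λ = 1.5 and t = 4 are assumptions): draw i.i.d. exponential interarrivals, accumulate arrival times, and count those at most t. The count's empirical mean should approach λt.

```python
import random

random.seed(4)
lam, t = 1.5, 4.0  # assumed rate and horizon for illustration

def poisson_count(lam, t):
    """Simulate N(t): sum exponential interarrivals until they exceed t."""
    s, count = 0.0, 0
    while True:
        s += random.expovariate(lam)
        if s > t:
            return count
        count += 1

counts = [poisson_count(lam, t) for _ in range(100_000)]
mean_count = sum(counts) / len(counts)  # should approach lam * t = 6
```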

34 Alternative definition of Poisson process (continued)
- Exponential i.i.d. interarrival times ⇒ Q: Poisson process? Show this implies definition (i)-(iii)
- Stationarity: true because all $T_i$ have the same distribution
- Independent increments: true because interarrival times are independent and exponential RVs are memoryless
- We can have more than one event in (0, h] only if $T_1 < h$ and $T_2 < h$
  $P(N(h) > 1) \leq P(T_1 \leq h)\, P(T_2 \leq h) = (1 - e^{-\lambda h})^2 = (\lambda h)^2 + o(h^2) = o(h)$
- We have no event in (0, h] if $T_1 > h$
  $P(N(h) = 0) = P(T_1 > h) = e^{-\lambda h} = 1 - \lambda h + o(h)$
- The remaining case is N(h) = 1, whose probability is
  $P(N(h) = 1) = 1 - P(N(h) = 0) - P(N(h) > 1) = \lambda h + o(h)$

35 Three definitions of Poisson processes
- Def. 1: Probability of an event proportional to interval width; intervals independent
  ⇒ Physical model definition: can a phenomenon be reasonably modeled as a Poisson process?
  ⇒ The other two definitions are used for analysis and/or simulation
- Def. 2: Probability distribution of events in (0, t] is Poisson
  ⇒ Event-centric definition: number of events in given time intervals
  ⇒ Allows analysis and simulation; used when information about the number of events in a given time is desired
- Def. 3: Probability distribution of interarrival times is exponential
  ⇒ Time-centric definition: times at which events happen
  ⇒ Allows analysis and simulation; used when information about event times is of interest

36 Number of visitors to a web page example
- Ex: Count the number of visits to a webpage between 6:00pm and 6:10pm
- Def. 1: Q: Poisson process? Yes, it seems reasonable to have
  ⇒ Probability of a visit proportional to the time interval's duration
  ⇒ Independent visits over disjoint time intervals
  ⇒ Model as a Poisson process with rate λ visits/second (v/s)
- Def. 2: Arrivals in (s, s + t] are Poisson with parameter λt
  ⇒ Probability of exactly 5 visits in 1 sec? $P(N(1) = 5) = e^{-\lambda} \lambda^5 / 5!$
  ⇒ Expected number of visits in 10 minutes? $E[N(600)] = 600\lambda$
  ⇒ On average, data shows N visits in 10 minutes ⇒ estimate $\hat{\lambda} = N/600$
- Def. 3: Interarrival times are i.i.d. $T_i \sim \exp(\lambda)$
  ⇒ Expected time between visits? $E[T_i] = 1/\lambda$
  ⇒ Expected arrival time $S_n$ of the n-th visitor? Recall $S_n = \sum_{i=1}^n T_i$, hence $E[S_n] = \sum_{i=1}^n E[T_i] = n/\lambda$
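The quantities in the web-page example evaluate as follows (the slide leaves λ symbolic; here λ = 0.2 v/s and n = 20 are assumed values for concreteness):

```python
import math

lam = 0.2  # assumed rate, visits per second

p_5_in_1s = math.exp(-lam) * lam**5 / math.factorial(5)  # P(N(1) = 5)
expected_10min = 600 * lam                               # E[N(600)]
mean_interarrival = 1 / lam                              # E[T_i]
expected_arrival_20th = 20 / lam                         # E[S_20] = n / lam
```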

37 Superposition of Poisson processes
- Let $N_1(t), N_2(t)$ be Poisson processes with rates $\lambda_1$ and $\lambda_2$. Suppose $N_1(t)$ and $N_2(t)$ are independent
- Then $N(t) := N_1(t) + N_2(t)$ is a Poisson process with rate $\lambda_1 + \lambda_2$
[Figure: sample paths of N_1(t) and N_2(t), and of the merged process N(t) with arrival times S_1, ..., S_5]
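A merging sketch (assumed λ₁ = 1, λ₂ = 2, t = 3; a mean-and-variance check only, not a full distributional test): superposing two independently simulated streams gives counts whose mean and variance both approach (λ₁ + λ₂)t, as a Poisson count requires.

```python
import random

random.seed(5)

def count_by(lam, t):
    """Number of arrivals by time t, via exponential interarrivals."""
    s, n = 0.0, 0
    while True:
        s += random.expovariate(lam)
        if s > t:
            return n
        n += 1

lam1, lam2, t = 1.0, 2.0, 3.0  # assumed values for illustration
merged = [count_by(lam1, t) + count_by(lam2, t) for _ in range(100_000)]

mean_merged = sum(merged) / len(merged)  # should approach (lam1 + lam2) * t = 9
var_merged = sum((x - mean_merged) ** 2 for x in merged) / len(merged)
```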

38 Thinning of a Poisson process
- Let $B_1, B_2, \ldots$ be an i.i.d. sequence of Bernoulli(p) RVs
- Let N(t) be a Poisson process with rate λ, independent of the $B_i$
- Then $M(t) := \sum_{i=1}^{N(t)} B_i$ is a Poisson process with rate λp
[Figure: sample path of N(t) with Bernoulli marks B_i, and the thinned process M(t) keeping arrivals with B_i = 1]
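A thinning sketch (assumed λ = 3, p = 0.4, t = 2): keep each arrival independently with probability p; the kept count's empirical mean should approach λpt.

```python
import random

random.seed(6)
lam, p, t = 3.0, 0.4, 2.0  # assumed values for illustration

def thinned_count(lam, p, t):
    """Count arrivals by time t, keeping each with probability p."""
    s, kept = 0.0, 0
    while True:
        s += random.expovariate(lam)
        if s > t:
            return kept
        if random.random() < p:  # Bernoulli(p) mark for this arrival
            kept += 1

kept_counts = [thinned_count(lam, p, t) for _ in range(100_000)]
mean_kept = sum(kept_counts) / len(kept_counts)  # -> lam * p * t = 2.4
```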

39 Splitting of a Poisson process
- Let $Z_1, Z_2, \ldots$ be an i.i.d. sequence of RVs with $Z_i \in \{1, \ldots, m\}$
- Let N(t) be a Poisson process with rate λ, independent of the $Z_i$
- Define $N_k(t) = \sum_{i=1}^{N(t)} \mathbb{1}\{Z_i = k\}$, for each k = 1, ..., m
- Then each $N_k(t)$ is a Poisson process with rate $\lambda P(Z_1 = k)$
[Figure: sample path of N(t) with labels Z_i, split into sub-processes N_1(t), N_2(t), N_3(t)]
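A splitting sketch (assumed λ = 2, t = 3, and labels Z ∈ {1, 2, 3} with probabilities 0.5, 0.3, 0.2): each substream's count should have empirical mean λt·P(Z = k).

```python
import random

random.seed(7)
lam, t = 2.0, 3.0                # assumed values for illustration
probs = {1: 0.5, 2: 0.3, 3: 0.2}  # assumed label distribution

def split_counts(lam, t):
    """Simulate one run: split arrivals by time t into labeled counts."""
    s = 0.0
    counts = {k: 0 for k in probs}
    while True:
        s += random.expovariate(lam)
        if s > t:
            return counts
        k = random.choices(list(probs), weights=list(probs.values()))[0]
        counts[k] += 1

runs = [split_counts(lam, t) for _ in range(50_000)]
mean_k1 = sum(r[1] for r in runs) / len(runs)  # -> 2 * 3 * 0.5 = 3.0
mean_k2 = sum(r[2] for r in runs) / len(runs)  # -> 2 * 3 * 0.3 = 1.8
```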

40 Glossary
Random times; Exponential distribution; Memoryless random times; Time to first event; First event to happen; Strong memoryless property; Event in infinitesimal interval; Continuous-time process; Counting process; Right-continuous function; Poisson process; Independent increments; Stationary increments; Equivalent definitions; Arrival times; Interarrival times; Event and time centric definitions; Superposition of Poisson processes; Thinning of a Poisson process; Splitting of a Poisson process


More information

Things to remember when learning probability distributions:

Things to remember when learning probability distributions: SPECIAL DISTRIBUTIONS Some distributions are special because they are useful They include: Poisson, exponential, Normal (Gaussian), Gamma, geometric, negative binomial, Binomial and hypergeometric distributions

More information

Moments. Raw moment: February 25, 2014 Normalized / Standardized moment:

Moments. Raw moment: February 25, 2014 Normalized / Standardized moment: Moments Lecture 10: Central Limit Theorem and CDFs Sta230 / Mth 230 Colin Rundel Raw moment: Central moment: µ n = EX n ) µ n = E[X µ) 2 ] February 25, 2014 Normalized / Standardized moment: µ n σ n Sta230

More information

PoissonprocessandderivationofBellmanequations

PoissonprocessandderivationofBellmanequations APPENDIX B PoissonprocessandderivationofBellmanequations 1 Poisson process Let us first define the exponential distribution Definition B1 A continuous random variable X is said to have an exponential distribution

More information

Poisson processes and their properties

Poisson processes and their properties Poisson processes and their properties Poisson processes. collection {N(t) : t [, )} of rando variable indexed by tie t is called a continuous-tie stochastic process, Furtherore, we call N(t) a Poisson

More information

(b) What is the variance of the time until the second customer arrives, starting empty, assuming that we measure time in minutes?

(b) What is the variance of the time until the second customer arrives, starting empty, assuming that we measure time in minutes? IEOR 3106: Introduction to Operations Research: Stochastic Models Fall 2006, Professor Whitt SOLUTIONS to Final Exam Chapters 4-7 and 10 in Ross, Tuesday, December 19, 4:10pm-7:00pm Open Book: but only

More information

The story of the film so far... Mathematics for Informatics 4a. Continuous-time Markov processes. Counting processes

The story of the film so far... Mathematics for Informatics 4a. Continuous-time Markov processes. Counting processes The story of the film so far... Mathematics for Informatics 4a José Figueroa-O Farrill Lecture 19 28 March 2012 We have been studying stochastic processes; i.e., systems whose time evolution has an element

More information

System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models

System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models Fatih Cavdur fatihcavdur@uludag.edu.tr March 20, 2012 Introduction Introduction The world of the model-builder

More information

Birth-Death Processes

Birth-Death Processes Birth-Death Processes Birth-Death Processes: Transient Solution Poisson Process: State Distribution Poisson Process: Inter-arrival Times Dr Conor McArdle EE414 - Birth-Death Processes 1/17 Birth-Death

More information

Arrivals and waiting times

Arrivals and waiting times Chapter 20 Arrivals and waiting times The arrival times of events in a Poisson process will be continuous random variables. In particular, the time between two successive events, say event n 1 and event

More information

EE126: Probability and Random Processes

EE126: Probability and Random Processes EE126: Probability and Random Processes Lecture 19: Poisson Process Abhay Parekh UC Berkeley March 31, 2011 1 1 Logistics 2 Review 3 Poisson Processes 2 Logistics 3 Poisson Process A continuous version

More information

LECTURE #6 BIRTH-DEATH PROCESS

LECTURE #6 BIRTH-DEATH PROCESS LECTURE #6 BIRTH-DEATH PROCESS 204528 Queueing Theory and Applications in Networks Assoc. Prof., Ph.D. (รศ.ดร. อน นต ผลเพ ม) Computer Engineering Department, Kasetsart University Outline 2 Birth-Death

More information

Optional Stopping Theorem Let X be a martingale and T be a stopping time such

Optional Stopping Theorem Let X be a martingale and T be a stopping time such Plan Counting, Renewal, and Point Processes 0. Finish FDR Example 1. The Basic Renewal Process 2. The Poisson Process Revisited 3. Variants and Extensions 4. Point Processes Reading: G&S: 7.1 7.3, 7.10

More information

Markov processes and queueing networks

Markov processes and queueing networks Inria September 22, 2015 Outline Poisson processes Markov jump processes Some queueing networks The Poisson distribution (Siméon-Denis Poisson, 1781-1840) { } e λ λ n n! As prevalent as Gaussian distribution

More information

errors every 1 hour unless he falls asleep, in which case he just reports the total errors

errors every 1 hour unless he falls asleep, in which case he just reports the total errors I. First Definition of a Poisson Process A. Definition: Poisson process A Poisson Process {X(t), t 0} with intensity λ > 0 is a counting process with the following properties. INDEPENDENT INCREMENTS. For

More information

TMA 4265 Stochastic Processes

TMA 4265 Stochastic Processes TMA 4265 Stochastic Processes Norges teknisk-naturvitenskapelige universitet Institutt for matematiske fag Solution - Exercise 9 Exercises from the text book 5.29 Kidney transplant T A exp( A ) T B exp(

More information

POISSON PROCESSES 1. THE LAW OF SMALL NUMBERS

POISSON PROCESSES 1. THE LAW OF SMALL NUMBERS POISSON PROCESSES 1. THE LAW OF SMALL NUMBERS 1.1. The Rutherford-Chadwick-Ellis Experiment. About 90 years ago Ernest Rutherford and his collaborators at the Cavendish Laboratory in Cambridge conducted

More information

Midterm Exam 1 Solution

Midterm Exam 1 Solution EECS 126 Probability and Random Processes University of California, Berkeley: Fall 2015 Kannan Ramchandran September 22, 2015 Midterm Exam 1 Solution Last name First name SID Name of student on your left:

More information

Random variables. DS GA 1002 Probability and Statistics for Data Science.

Random variables. DS GA 1002 Probability and Statistics for Data Science. Random variables DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Motivation Random variables model numerical quantities

More information

Introduction to Statistical Data Analysis Lecture 3: Probability Distributions

Introduction to Statistical Data Analysis Lecture 3: Probability Distributions Introduction to Statistical Data Analysis Lecture 3: Probability Distributions James V. Lambers Department of Mathematics The University of Southern Mississippi James V. Lambers Statistical Data Analysis

More information

Chapter 6: Random Processes 1

Chapter 6: Random Processes 1 Chapter 6: Random Processes 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.

More information

Special distributions

Special distributions Special distributions August 22, 2017 STAT 101 Class 4 Slide 1 Outline of Topics 1 Motivation 2 Bernoulli and binomial 3 Poisson 4 Uniform 5 Exponential 6 Normal STAT 101 Class 4 Slide 2 What distributions

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY Department of Electrical Engineering and Computer Science

MASSACHUSETTS INSTITUTE OF TECHNOLOGY Department of Electrical Engineering and Computer Science MASSACHUSETTS INSTITUTE OF TECHNOLOGY Department of Electrical Engineering and Computer Science 6.262 Discrete Stochastic Processes Midterm Quiz April 6, 2010 There are 5 questions, each with several parts.

More information

Lecture 20. Poisson Processes. Text: A Course in Probability by Weiss STAT 225 Introduction to Probability Models March 26, 2014

Lecture 20. Poisson Processes. Text: A Course in Probability by Weiss STAT 225 Introduction to Probability Models March 26, 2014 Lecture 20 Text: A Course in Probability by Weiss 12.1 STAT 225 Introduction to Probability Models March 26, 2014 Whitney Huang Purdue University 20.1 Agenda 1 2 20.2 For a specified event that occurs

More information

Continuous-Time Markov Chain

Continuous-Time Markov Chain Continuous-Time Markov Chain Consider the process {X(t),t 0} with state space {0, 1, 2,...}. The process {X(t),t 0} is a continuous-time Markov chain if for all s, t 0 and nonnegative integers i, j, x(u),

More information

Simulating events: the Poisson process

Simulating events: the Poisson process Simulating events: te Poisson process p. 1/15 Simulating events: te Poisson process Micel Bierlaire micel.bierlaire@epfl.c Transport and Mobility Laboratory Simulating events: te Poisson process p. 2/15

More information

Chapter 5. Chapter 5 sections

Chapter 5. Chapter 5 sections 1 / 43 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions

More information

Lecture 4: Bernoulli Process

Lecture 4: Bernoulli Process Lecture 4: Bernoulli Process Hyang-Won Lee Dept. of Internet & Multimedia Eng. Konkuk University Lecture 4 Hyang-Won Lee 1 / 12 Stochastic Process A stochastic process is a mathematical model of a probabilistic

More information

STAT 430/510 Probability Lecture 12: Central Limit Theorem and Exponential Distribution

STAT 430/510 Probability Lecture 12: Central Limit Theorem and Exponential Distribution STAT 430/510 Probability Lecture 12: Central Limit Theorem and Exponential Distribution Pengyuan (Penelope) Wang June 15, 2011 Review Discussed Uniform Distribution and Normal Distribution Normal Approximation

More information

ELEMENTS OF PROBABILITY THEORY

ELEMENTS OF PROBABILITY THEORY ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable

More information

ECE 650 Lecture #10 (was Part 1 & 2) D. van Alphen. D. van Alphen 1

ECE 650 Lecture #10 (was Part 1 & 2) D. van Alphen. D. van Alphen 1 ECE 650 Lecture #10 (was Part 1 & 2) D. van Alphen D. van Alphen 1 Lecture 10 Overview Part 1 Review of Lecture 9 Continuing: Systems with Random Inputs More about Poisson RV s Intro. to Poisson Processes

More information

EE/CpE 345. Modeling and Simulation. Fall Class 5 September 30, 2002

EE/CpE 345. Modeling and Simulation. Fall Class 5 September 30, 2002 EE/CpE 345 Modeling and Simulation Class 5 September 30, 2002 Statistical Models in Simulation Real World phenomena of interest Sample phenomena select distribution Probabilistic, not deterministic Model

More information

Lectures 16-17: Poisson Approximation. Using Lemma (2.4.3) with θ = 1 and then Lemma (2.4.4), which is valid when max m p n,m 1/2, we have

Lectures 16-17: Poisson Approximation. Using Lemma (2.4.3) with θ = 1 and then Lemma (2.4.4), which is valid when max m p n,m 1/2, we have Lectures 16-17: Poisson Approximation 1. The Law of Rare Events Theorem 2.6.1: For each n 1, let X n,m, 1 m n be a collection of independent random variables with PX n,m = 1 = p n,m and PX n,m = = 1 p

More information

Math/Stat 352 Lecture 8

Math/Stat 352 Lecture 8 Math/Stat 352 Lecture 8 Sections 4.3 and 4.4 Commonly Used Distributions: Poisson, hypergeometric, geometric, and negative binomial. 1 The Poisson Distribution Poisson random variable counts the number

More information

6.041/6.431 Fall 2010 Final Exam Solutions Wednesday, December 15, 9:00AM - 12:00noon.

6.041/6.431 Fall 2010 Final Exam Solutions Wednesday, December 15, 9:00AM - 12:00noon. 604/643 Fall 200 Final Exam Solutions Wednesday, December 5, 9:00AM - 2:00noon Problem (32 points) Consider a Markov chain {X n ; n 0,, }, specified by the following transition diagram 06 05 09 04 03 2

More information

Northwestern University Department of Electrical Engineering and Computer Science

Northwestern University Department of Electrical Engineering and Computer Science Northwestern University Department of Electrical Engineering and Computer Science EECS 454: Modeling and Analysis of Communication Networks Spring 2008 Probability Review As discussed in Lecture 1, probability

More information

Transform Techniques - CF

Transform Techniques - CF Transform Techniques - CF [eview] Moment Generating Function For a real t, the MGF of the random variable is t t M () t E[ e ] e Characteristic Function (CF) k t k For a real ω, the characteristic function

More information

Optimization and Simulation

Optimization and Simulation Optimization and Simulation Simulating events: the Michel Bierlaire Transport and Mobility Laboratory School of Architecture, Civil and Environmental Engineering Ecole Polytechnique Fédérale de Lausanne

More information

Assignment 3 with Reference Solutions

Assignment 3 with Reference Solutions Assignment 3 with Reference Solutions Exercise 3.: Poisson Process Given are k independent sources s i of jobs as shown in the figure below. The interarrival time between jobs for each source is exponentially

More information

Computer Systems Modelling

Computer Systems Modelling Computer Systems Modelling Dr RJ Gibbens Computer Science Tripos, Part II Lent Term 2017/18 Last revised: 2018-01-10 (34a98a7) Department of Computer Science & Technology The Computer Laboratory University

More information

Chapter 4 Continuous Random Variables and Probability Distributions

Chapter 4 Continuous Random Variables and Probability Distributions Chapter 4 Continuous Random Variables and Probability Distributions Part 3: The Exponential Distribution and the Poisson process Section 4.8 The Exponential Distribution 1 / 21 Exponential Distribution

More information

We introduce methods that are useful in:

We introduce methods that are useful in: Instructor: Shengyu Zhang Content Derived Distributions Covariance and Correlation Conditional Expectation and Variance Revisited Transforms Sum of a Random Number of Independent Random Variables more

More information

Math 597/697: Solution 5

Math 597/697: Solution 5 Math 597/697: Solution 5 The transition between the the ifferent states is governe by the transition matrix 0 6 3 6 0 2 2 P = 4 0 5, () 5 4 0 an v 0 = /4, v = 5, v 2 = /3, v 3 = /5 Hence the generator

More information

Chapters 3.2 Discrete distributions

Chapters 3.2 Discrete distributions Chapters 3.2 Discrete distributions In this section we study several discrete distributions and their properties. Here are a few, classified by their support S X. There are of course many, many more. For

More information

Stat 426 : Homework 1.

Stat 426 : Homework 1. Stat 426 : Homework 1. Moulinath Banerjee University of Michigan Announcement: The homework carries 120 points and contributes 10 points to the total grade. (1) A geometric random variable W takes values

More information

Lecture Notes 7 Random Processes. Markov Processes Markov Chains. Random Processes

Lecture Notes 7 Random Processes. Markov Processes Markov Chains. Random Processes Lecture Notes 7 Random Processes Definition IID Processes Bernoulli Process Binomial Counting Process Interarrival Time Process Markov Processes Markov Chains Classification of States Steady State Probabilities

More information

System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models

System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models System Simulation Part II: Mathematical and Statistical Models Chapter 5: Statistical Models Fatih Cavdur fatihcavdur@uludag.edu.tr March 29, 2014 Introduction Introduction The world of the model-builder

More information

Modeling Rare Events

Modeling Rare Events Modeling Rare Events Chapter 4 Lecture 15 Yiren Ding Shanghai Qibao Dwight High School April 24, 2016 Yiren Ding Modeling Rare Events 1 / 48 Outline 1 The Poisson Process Three Properties Stronger Property

More information

CAS Exam MAS-1. Howard Mahler. Stochastic Models

CAS Exam MAS-1. Howard Mahler. Stochastic Models CAS Exam MAS-1 Howard Mahler Stochastic Models 2019 Stochastic Models, HCM 12/31/18, Page 1 The slides are in the same order as the sections of my study guide. Section # Section Name 1 Introduction 2 Exponential

More information

December 19, Probability Theory Instituto Superior Técnico. Poisson Convergence. João Brazuna. Weak Law of Small Numbers

December 19, Probability Theory Instituto Superior Técnico. Poisson Convergence. João Brazuna. Weak Law of Small Numbers Simple to Probability Theory Instituto Superior Técnico December 19, 2016 Contents Simple to 1 Simple 2 to Contents Simple to 1 Simple 2 to Simple to Theorem - Events with low frequency in a large population

More information

Predator-Prey Population Dynamics

Predator-Prey Population Dynamics Predator-Prey Population Dynamics Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ October 2,

More information

Math 180A. Lecture 16 Friday May 7 th. Expectation. Recall the three main probability density functions so far (1) Uniform (2) Exponential.

Math 180A. Lecture 16 Friday May 7 th. Expectation. Recall the three main probability density functions so far (1) Uniform (2) Exponential. Math 8A Lecture 6 Friday May 7 th Epectation Recall the three main probability density functions so far () Uniform () Eponential (3) Power Law e, ( ), Math 8A Lecture 6 Friday May 7 th Epectation Eample

More information

E[X n ]= dn dt n M X(t). ). What is the mgf? Solution. Found this the other day in the Kernel matching exercise: 1 M X (t) =

E[X n ]= dn dt n M X(t). ). What is the mgf? Solution. Found this the other day in the Kernel matching exercise: 1 M X (t) = Chapter 7 Generating functions Definition 7.. Let X be a random variable. The moment generating function is given by M X (t) =E[e tx ], provided that the expectation exists for t in some neighborhood of

More information

Brief Review of Probability

Brief Review of Probability Maura Department of Economics and Finance Università Tor Vergata Outline 1 Distribution Functions Quantiles and Modes of a Distribution 2 Example 3 Example 4 Distributions Outline Distribution Functions

More information

Midterm 2 Review. CS70 Summer Lecture 6D. David Dinh 28 July UC Berkeley

Midterm 2 Review. CS70 Summer Lecture 6D. David Dinh 28 July UC Berkeley Midterm 2 Review CS70 Summer 2016 - Lecture 6D David Dinh 28 July 2016 UC Berkeley Midterm 2: Format 8 questions, 190 points, 110 minutes (same as MT1). Two pages (one double-sided sheet) of handwritten

More information

Stochastic process. X, a series of random variables indexed by t

Stochastic process. X, a series of random variables indexed by t Stochastic process X, a series of random variables indexed by t X={X(t), t 0} is a continuous time stochastic process X={X(t), t=0,1, } is a discrete time stochastic process X(t) is the state at time t,

More information

6.1 Moment Generating and Characteristic Functions

6.1 Moment Generating and Characteristic Functions Chapter 6 Limit Theorems The power statistics can mostly be seen when there is a large collection of data points and we are interested in understanding the macro state of the system, e.g., the average,

More information

Continuous Probability Spaces

Continuous Probability Spaces Continuous Probability Spaces Ω is not countable. Outcomes can be any real number or part of an interval of R, e.g. heights, weights and lifetimes. Can not assign probabilities to each outcome and add

More information

Name of the Student: Problems on Discrete & Continuous R.Vs

Name of the Student: Problems on Discrete & Continuous R.Vs Engineering Mathematics 08 SUBJECT NAME : Probability & Random Processes SUBJECT CODE : MA645 MATERIAL NAME : University Questions REGULATION : R03 UPDATED ON : November 07 (Upto N/D 07 Q.P) (Scan the

More information

M/G/1 and M/G/1/K systems

M/G/1 and M/G/1/K systems M/G/1 and M/G/1/K systems Dmitri A. Moltchanov dmitri.moltchanov@tut.fi http://www.cs.tut.fi/kurssit/elt-53606/ OUTLINE: Description of M/G/1 system; Methods of analysis; Residual life approach; Imbedded

More information

S n = x + X 1 + X X n.

S n = x + X 1 + X X n. 0 Lecture 0 0. Gambler Ruin Problem Let X be a payoff if a coin toss game such that P(X = ) = P(X = ) = /2. Suppose you start with x dollars and play the game n times. Let X,X 2,...,X n be payoffs in each

More information

Since D has an exponential distribution, E[D] = 0.09 years. Since {A(t) : t 0} is a Poisson process with rate λ = 10, 000, A(0.

Since D has an exponential distribution, E[D] = 0.09 years. Since {A(t) : t 0} is a Poisson process with rate λ = 10, 000, A(0. IEOR 46: Introduction to Operations Research: Stochastic Models Chapters 5-6 in Ross, Thursday, April, 4:5-5:35pm SOLUTIONS to Second Midterm Exam, Spring 9, Open Book: but only the Ross textbook, the

More information

BMIR Lecture Series on Probability and Statistics Fall, 2015 Uniform Distribution

BMIR Lecture Series on Probability and Statistics Fall, 2015 Uniform Distribution Lecture #5 BMIR Lecture Series on Probability and Statistics Fall, 2015 Department of Biomedical Engineering and Environmental Sciences National Tsing Hua University s 5.1 Definition ( ) A continuous random

More information

Disjointness and Additivity

Disjointness and Additivity Midterm 2: Format Midterm 2 Review CS70 Summer 2016 - Lecture 6D David Dinh 28 July 2016 UC Berkeley 8 questions, 190 points, 110 minutes (same as MT1). Two pages (one double-sided sheet) of handwritten

More information

Chapter 2. Poisson Processes

Chapter 2. Poisson Processes Chapter 2. Poisson Processes Prof. Ai-Chun Pang Graduate Institute of Networking and Multimedia, epartment of Computer Science and Information Engineering, National Taiwan University, Taiwan utline Introduction

More information

THE QUEEN S UNIVERSITY OF BELFAST

THE QUEEN S UNIVERSITY OF BELFAST THE QUEEN S UNIVERSITY OF BELFAST 0SOR20 Level 2 Examination Statistics and Operational Research 20 Probability and Distribution Theory Wednesday 4 August 2002 2.30 pm 5.30 pm Examiners { Professor R M

More information