STAT 380 Markov Chains


1. STAT 380 Markov Chains: Poisson Processes

Richard Lockhart, Simon Fraser University, Spring 2016 (PoissonProcesses.pdf)

2. Poisson Processes

Particles arrive over time at a particle detector. There are several ways to describe the most common model.

Approach 1: the number of particles arriving in an interval has a Poisson distribution with mean proportional to the length of the interval, and the numbers of particles in several non-overlapping intervals are independent.

3. Notation, formal assumptions

For s < t, denote the number of arrivals in (s, t] by N(s, t). The model is:

1. N(s, t) has a Poisson(λ(t - s)) distribution.
2. For 0 ≤ s_1 < t_1 ≤ s_2 < t_2 ≤ ... ≤ s_k < t_k, the variables N(s_i, t_i), i = 1, ..., k, are independent.

4. Approach 2

Let 0 < S_1 < S_2 < ... be the times at which the particles arrive, and let T_i = S_i - S_{i-1}, with S_0 = 0 by convention. Then T_1, T_2, ... are independent Exponential random variables with mean 1/λ. Note that P(T_i > x) = e^{-λx} is called the survival function of T_i.

Approaches 1 and 2 are equivalent. Both are deductions from a model based on the local behaviour of the process.
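The equivalence of Approaches 1 and 2 can be checked numerically. The following is a minimal simulation sketch (the rate, horizon, and replicate count are my own choices, and numpy is assumed): build arrival times as cumulative sums of iid Exponential gaps, as in Approach 2, and verify that the count on [0, t] has the Poisson mean and variance λt promised by Approach 1.

```python
import numpy as np

# Sketch: arrival times S_i are cumulative sums of iid Exponential(mean 1/lam)
# gaps; N(t) counts arrivals in [0, t].  If Approaches 1 and 2 agree, N(t)
# should be approximately Poisson(lam * t), so mean and variance both near 20.
rng = np.random.default_rng(0)
lam, t, reps = 2.0, 10.0, 20000

gaps = rng.exponential(scale=1.0 / lam, size=(reps, 200))  # T_1, T_2, ...
arrival_times = np.cumsum(gaps, axis=1)                    # S_1 < S_2 < ...
counts = (arrival_times <= t).sum(axis=1)                  # N(t), one per run

mean_count = counts.mean()  # should be near lam * t = 20
var_count = counts.var()    # Poisson: variance equals the mean
```

The 200 gaps per replicate are far more than ever land in [0, 10] at this rate, so no arrivals are missed by truncating the sequence.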

5. Approach 3

Assume:

1. Given all the points in [0, t], the probability of exactly 1 point in the interval (t, t+h] is of the form λh + o(h).
2. Given all the points in [0, t], the probability of 2 or more points in the interval (t, t+h] is of the form o(h).

All 3 approaches are equivalent. I show: 3 implies 1, 1 implies 2, and 2 implies 3. First I explain the o, O notation.


7. Landau's big and little o

Notation: given functions f and g we write

    f(h) = g(h) + o(h)

provided

    lim_{h→0} [f(h) - g(h)]/h = 0.

Aside: if there is a constant M such that

    limsup_{h→0} |f(h) - g(h)|/h ≤ M

we say f(h) = g(h) + O(h). Another form: f(h) = g(h) + O(h) means there are δ > 0 and M such that for all |h| < δ,

    |f(h) - g(h)| ≤ Mh.

Idea: o(h) is tiny compared to h, while O(h) is (very) roughly the same size as h.
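As a small numeric illustration (my own example, not from the slides): for a Poisson process, the probability of at least one point in an interval of length h is 1 - e^{-λh}, and saying this equals λh + o(h) means the error divided by h vanishes as h shrinks.

```python
import math

# Illustration: f(h) = P(at least one point in length-h interval) = 1 - e^{-lam h}.
# The claim f(h) = lam*h + o(h) says (f(h) - lam*h)/h -> 0 as h -> 0.
lam = 3.0
f = lambda h: 1.0 - math.exp(-lam * h)

ratios = [(f(h) - lam * h) / h for h in (1e-1, 1e-2, 1e-3, 1e-4)]
# the ratios shrink roughly in proportion to h (the error here is in fact O(h^2))
```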

8. Model 3 implies 1

Fix t and define f_t(s) to be the conditional probability of 0 points in (t, t+s], given the value of the process on [0, t]. We derive a differential equation for f.

Given the process on [0, t] and 0 points in (t, t+s], the probability of no points in (t, t+s+h] is

    f_{t+s}(h) = 1 - λh + o(h).

Given the process on [0, t], the probability of no points in (t, t+s] is f_t(s). Using P(AB|C) = P(A|BC)P(B|C) gives

    f_t(s+h) = f_t(s) f_{t+s}(h) = f_t(s)(1 - λh + o(h)).

9. 3 implies 1, continued

Now rearrange and divide by h to get

    [f_t(s+h) - f_t(s)]/h = -λ f_t(s) + o(h)/h.

Let h → 0 and find

    ∂f_t(s)/∂s = -λ f_t(s).

This differential equation has solution

    f_t(s) = f_t(0) exp(-λs) = exp(-λs).

Notice: this is the survival function of an exponential rv.

10. General case

Notation: N(t) = N(0, t); N(t) is a non-decreasing function of t. Let

    P_k(t) = P(N(t) = k).

Evaluate P_k(t+h) by conditioning on N(s), 0 ≤ s < t, and N(t) = j. Given N(t) = j, the probability that N(t+h) = k is the conditional probability of k - j points in (t, t+h]. So, for j ≤ k - 2:

    P(N(t+h) = k | N(t) = j; N(s), 0 ≤ s < t) = o(h).

11. General case, continued

For j = k - 1 we have

    P(N(t+h) = k | N(t) = k-1; N(s), 0 ≤ s < t) = λh + o(h).

For j = k we have

    P(N(t+h) = k | N(t) = k; N(s), 0 ≤ s < t) = 1 - λh + o(h).

N is non-decreasing, so we only consider j ≤ k:

    P_k(t+h) = Σ_{j=0}^{k} P(N(t+h) = k | N(t) = j) P_j(t)
             = P_k(t)(1 - λh) + λh P_{k-1}(t) + o(h).

Rearrange, divide by h, and let h → 0 to get

    P_k'(t) = -λ P_k(t) + λ P_{k-1}(t).


13. General case, continued III

For k = 0 the term P_{k-1} is dropped and

    P_0'(t) = -λ P_0(t).

Using P_0(0) = 1 we get

    P_0(t) = e^{-λt}.

Put this into the equation for k = 1 to get

    P_1'(t) = -λ P_1(t) + λ e^{-λt}.

Multiply by e^{λt} to see

    (e^{λt} P_1(t))' = λ.

With P_1(0) = 0 we get

    P_1(t) = λt e^{-λt}.

14. General case, continued IV

For general k we have P_k(0) = 0 and

    (e^{λt} P_k(t))' = λ e^{λt} P_{k-1}(t).

Check by induction that

    e^{λt} P_k(t) = (λt)^k / k!.

Hence: N(t) has a Poisson(λt) distribution. Similar ideas permit a proof that

    P(N(s, t) = k | N(u), 0 ≤ u ≤ s) = {λ(t-s)}^k e^{-λ(t-s)} / k!.

By induction one can prove that N has independent Poisson increments.
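The differential equations above can also be checked numerically. This sketch (step size, rate, and truncation level are my own choices) integrates the system P_k'(t) = -λP_k(t) + λP_{k-1}(t) with small Euler steps and compares the result with the Poisson(λt) pmf; the truncation at k = K is harmless because P_k depends only on P_{k-1}.

```python
import math

# Euler-integrate P_k'(t) = -lam*P_k + lam*P_{k-1}, with P_0(0)=1, P_k(0)=0,
# and compare with the Poisson(lam*t) pmf derived on the slides.
lam, t_end, dt, K = 1.5, 2.0, 1e-4, 8
P = [1.0] + [0.0] * K  # P[k] approximates P_k(t)

for _ in range(int(round(t_end / dt))):
    newP = [P[0] - dt * lam * P[0]]
    for k in range(1, K + 1):
        newP.append(P[k] + dt * (-lam * P[k] + lam * P[k - 1]))
    P = newP

exact = [(lam * t_end) ** k * math.exp(-lam * t_end) / math.factorial(k)
         for k in range(K + 1)]
max_err = max(abs(p - e) for p, e in zip(P, exact))
```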


16. Exponential Interarrival Times

Suppose N is a Poisson process. Define T_1, T_2, ... to be the times between 0 and the first point, between the first point and the second, and so on.

Fact: T_1, T_2, ... are iid exponential rvs with mean 1/λ.

We have already done T_1 rigorously: the event T_1 > t is exactly the event N(t) = 0, so

    P(T_1 > t) = exp(-λt),

which is the survival function of an exponential rv.

17. Exponential Interarrival Times deduced

I do the case of T_1, T_2. Let t_1, t_2 be two positive numbers and set s_1 = t_1, s_2 = t_1 + t_2. Consider the event

    {t_1 < T_1 ≤ t_1 + δ_1} ∩ {t_2 < T_2 ≤ t_2 + δ_2}.

This is almost the same as the intersection of the four events:

    N(0, t_1] = 0
    N(t_1, t_1 + δ_1] = 1
    N(t_1 + δ_1, t_1 + δ_1 + t_2] = 0
    N(s_2 + δ_1, s_2 + δ_1 + δ_2] = 1,

which has probability

    e^{-λt_1} · λδ_1 e^{-λδ_1} · e^{-λt_2} · λδ_2 e^{-λδ_2}.

18. Less Rigor

Divide by δ_1 δ_2 and let δ_1 and δ_2 go to 0 to get the joint density of T_1, T_2:

    λ² e^{-λt_1} e^{-λt_2}.

This is the joint density of two independent exponential variates.
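This factorization can be checked by Monte Carlo. The sketch below (parameters and sample sizes are my own choices) builds each realization from the counting description, using the standard fact that given the total count the points are uniform on the interval, then tests whether the first two gaps satisfy P(T_1 > a, T_2 > b) ≈ e^{-λa} e^{-λb}.

```python
import numpy as np

# Build Poisson-process realizations on [0, T] from the counting description:
# M ~ Poisson(lam*T) points, uniform on [0, T], sorted.  Then check that the
# first two interarrival gaps behave like independent Exponential(lam) rvs.
rng = np.random.default_rng(1)
lam, T, reps = 1.0, 50.0, 20000
a, b = 0.7, 1.3

t1 = np.empty(reps)
t2 = np.empty(reps)
for i in range(reps):
    m = rng.poisson(lam * T)
    while m < 2:                      # discard the (negligible) chance of < 2 points
        m = rng.poisson(lam * T)
    s = np.sort(rng.uniform(0.0, T, size=m))
    t1[i], t2[i] = s[0], s[1] - s[0]  # T_1 = S_1, T_2 = S_2 - S_1

joint = np.mean((t1 > a) & (t2 > b))              # estimates P(T_1 > a, T_2 > b)
product = np.exp(-lam * a) * np.exp(-lam * b)     # product of the two survivals
```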

19. More Rigor

Find the joint density of S_1, ..., S_k, then use a change of variables to find the joint density of T_1, ..., T_k.

20. More Rigor, II

First step: compute

    P(0 < S_1 ≤ s_1 < S_2 ≤ s_2 < ... < S_k ≤ s_k).

This is just the event of exactly 1 point in each interval (s_{i-1}, s_i] for i = 1, ..., k-1 (with s_0 = 0) and at least one point in (s_{k-1}, s_k]. This event has probability

    [ ∏_{i=1}^{k-1} λ(s_i - s_{i-1}) e^{-λ(s_i - s_{i-1})} ] · (1 - e^{-λ(s_k - s_{k-1})}).

Second step: write this in terms of the joint cdf of S_1, ..., S_k.

21. More Rigor, III, k = 2

    P(0 < S_1 ≤ s_1 < S_2 ≤ s_2) = F_{S_1,S_2}(s_1, s_2) - F_{S_1,S_2}(s_1, s_1).

Notice the tacit assumption s_1 < s_2. Differentiate twice, that is, take ∂²/∂s_1 ∂s_2, to get

    f_{S_1,S_2}(s_1, s_2) = ∂²/∂s_1 ∂s_2 [ λs_1 e^{-λs_1} (1 - e^{-λ(s_2 - s_1)}) ].

Simplify, and recall the tacit assumption, to get

    f_{S_1,S_2}(s_1, s_2) = λ² e^{-λs_2} 1(0 < s_1 < s_2).

22. More Rigor, IV, change of variables

Now compute the joint cdf of T_1, T_2:

    F_{T_1,T_2}(t_1, t_2) = P(S_1 ≤ t_1, S_2 - S_1 ≤ t_2).

This is

    P(S_1 ≤ t_1, S_2 - S_1 ≤ t_2) = ∫_0^{t_1} ∫_{s_1}^{s_1 + t_2} λ² e^{-λs_2} ds_2 ds_1
        = λ ∫_0^{t_1} ( e^{-λs_1} - e^{-λ(s_1 + t_2)} ) ds_1
        = 1 - e^{-λt_1} - e^{-λt_2} + e^{-λ(t_1 + t_2)}.

Differentiate twice to get

    f_{T_1,T_2}(t_1, t_2) = λe^{-λt_1} · λe^{-λt_2}.

This is the joint density of 2 independent exponential rvs.

23. Summary so far

Have shown:

- The instantaneous rates model implies the independent Poisson increments model.
- The independent Poisson increments model implies independent exponential interarrivals.

Next: show that independent exponential interarrivals imply the instantaneous rates model.

24. Exponential interarrivals implies rates

Suppose T_1, T_2, ... are iid exponential rvs with mean 1/λ. Define N(t) by

    N(t) = k if and only if T_1 + ... + T_k ≤ t < T_1 + ... + T_{k+1}.

Let A be the event {N(s) = n(s); 0 < s ≤ t}. We are to show

    P(N(t, t+h] = 1 | N(t) = k, A) = λh + o(h)

and

    P(N(t, t+h] ≥ 2 | N(t) = k, A) = o(h).

If n(s) is a possible trajectory consistent with N(t) = k, then n has jumps at the points s_1 = t_1, s_2 = t_1 + t_2, ..., s_k = t_1 + ... + t_k < t and at no other points in (0, t].

25. Continued

So given {N(s) = n(s); 0 < s ≤ t} with n(t) = k, we are essentially being given

    T_1 = t_1, ..., T_k = t_k, T_{k+1} > t - s_k

and asked, in the first case, for the conditional probability of the event B given by

    t - s_k < T_{k+1} ≤ t - s_k + h < T_{k+1} + T_{k+2}.

Conditioning on T_1, ..., T_k is irrelevant (independence), so

    P(N(t, t+h] = 1 | N(t) = k, A)/h = P(B | T_{k+1} > t - s_k)/h = P(B) / (h e^{-λ(t - s_k)}).

26. Continued

The numerator may be evaluated by integration (s_1 is the value of T_{k+1} and s_2 the value of T_{k+2}):

    P(B) = ∫_{t-s_k}^{t-s_k+h} ∫_{t-s_k+h-s_1}^{∞} λ² e^{-λ(s_1 + s_2)} ds_2 ds_1.

Let h → 0 to get the limit

    P(N(t, t+h] = 1 | N(t) = k, A)/h → λ

as required. The computation of

    lim_{h→0} P(N(t, t+h] ≥ 2 | N(t) = k, A)/h = 0

is similar.

27. Properties of exponential rvs

Convolution: if X and Y are independent rvs with densities f and g respectively, and Z = X + Y, then

    P(Z ≤ z) = ∫_{-∞}^{∞} ∫_{-∞}^{z-x} f(x) g(y) dy dx.

Differentiating wrt z we get

    f_Z(z) = ∫_{-∞}^{∞} f(x) g(z - x) dx.

This integral is called the convolution of the densities f and g.

If T_1, ..., T_n are iid Exponential(λ), then S_n = T_1 + ... + T_n has a Gamma(n, λ) distribution. The density of S_n is

    f_{S_n}(s) = λ(λs)^{n-1} e^{-λs} / (n-1)!

for s > 0.
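The Gamma claim is easy to sanity-check by simulation. This sketch (n, λ, and the replicate count are my own choices) sums n iid exponentials and compares the sample mean and variance with the Gamma(n, λ) values n/λ and n/λ².

```python
import numpy as np

# S_n = sum of n iid Exponential(lam) gaps should be Gamma(n, lam):
# mean n/lam = 2.5 and variance n/lam**2 = 1.25 for the values below.
rng = np.random.default_rng(2)
lam, n, reps = 2.0, 5, 200000

S_n = rng.exponential(scale=1.0 / lam, size=(reps, n)).sum(axis=1)
mean_S = S_n.mean()
var_S = S_n.var()
```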

28. Proof

Since S_n > s exactly when fewer than n points land in (0, s],

    P(S_n > s) = P(N(0, s] < n) = Σ_{j=0}^{n-1} (λs)^j e^{-λs} / j!.

Then

    f_{S_n}(s) = d/ds P(S_n ≤ s) = d/ds {1 - P(S_n > s)}
               = λ e^{-λs} Σ_{j=0}^{n-1} (λs)^j / j! - λ e^{-λs} Σ_{j=1}^{n-1} (λs)^{j-1} / (j-1)!.

This telescopes to

    f_{S_n}(s) = λ(λs)^{n-1} e^{-λs} / (n-1)!.

29. Extreme Values

Suppose X_1, ..., X_n are independent exponential rvs with means 1/λ_1, ..., 1/λ_n. Then Y = min{X_1, ..., X_n} has an exponential distribution with mean

    1 / (λ_1 + ... + λ_n).

Proof:

    P(Y > y) = P(∀k: X_k > y) = ∏_k e^{-λ_k y} = e^{-(Σ_k λ_k) y}.
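A quick simulation sketch of this fact (the rates and evaluation point are my own choices): take the minimum of three independent exponentials and compare its empirical survival and mean with the exponential distribution whose rate is the sum.

```python
import numpy as np

# Minimum of independent exponentials with rates 0.5, 1.0, 2.5 should be
# Exponential with rate 4.0, i.e. mean 0.25 and P(Y > y) = exp(-4*y).
rng = np.random.default_rng(3)
rates = np.array([0.5, 1.0, 2.5])
reps, y = 200000, 0.4

X = rng.exponential(scale=1.0 / rates, size=(reps, 3))  # one column per rate
Y = X.min(axis=1)

empirical = np.mean(Y > y)
theoretical = np.exp(-rates.sum() * y)
```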

30. Memoryless Property

Suppose X has an exponential distribution. The conditional distribution of X - x given X ≥ x is also exponential.

Proof:

    P(X - x > y | X ≥ x) = P(X > x + y, X ≥ x) / P(X ≥ x)
                         = P(X > x + y) / P(X ≥ x)
                         = e^{-λ(x+y)} / e^{-λx}
                         = e^{-λy}.
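The memoryless property can be seen empirically as well. In this sketch (λ, x, y, and the sample size are my own choices), among samples with X ≥ x the overshoot X - x has the same survival probability at y as a fresh Exponential(λ) variable.

```python
import numpy as np

# Check P(X - x > y | X >= x) ~ P(X > y) = exp(-lam*y) by simulation.
rng = np.random.default_rng(4)
lam, x, y, reps = 1.5, 0.8, 0.6, 400000

X = rng.exponential(scale=1.0 / lam, size=reps)
overshoot = X[X >= x] - x                # residual life, given survival to x

cond_survival = np.mean(overshoot > y)   # estimates P(X - x > y | X >= x)
target = np.exp(-lam * y)                # memoryless prediction
```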

31. Hazard Rates

Assume the rv T > 0 has density f and cdf F. The hazard rate, or instantaneous failure rate, is

    r(t) = lim_{δ→0} P(t < T ≤ t + δ | T ≥ t) / δ.

This is just

    r(t) = f(t) / (1 - F(t)).

For an exponential random variable with mean 1/λ this is

    r(t) = λe^{-λt} / e^{-λt} = λ.

The exponential distribution has constant failure rate.

32. The Weibull distribution

Weibull random variables have density, for t > 0,

    f(t; λ, α) = αλ(λt)^{α-1} e^{-(λt)^α}.

The corresponding survival function is, for t > 0,

    1 - F(t) = e^{-(λt)^α}.

The hazard rate is

    r(t) = αλ(λt)^{α-1}.

The hazard rate is increasing for α > 1 and decreasing for α < 1. For α = 1 this is the exponential distribution.

33. Weibull distribution, continued

Since

    r(t) = (dF(t)/dt) / (1 - F(t)) = -d log(1 - F(t)) / dt,

we can integrate to find

    1 - F(t) = exp{ -∫_0^t r(s) ds }.

So r determines F and f.
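The relation 1 - F(t) = exp{-∫_0^t r(s) ds} can be verified numerically for the Weibull case. This sketch (λ, α, t, and the grid size are my own choices) integrates the hazard αλ(λs)^{α-1} with a midpoint rule and compares with the closed-form survival e^{-(λt)^α}.

```python
import math

# Recover the Weibull survival function from its hazard by numerical
# integration, then compare with the closed form exp(-(lam*t)**alpha).
lam, alpha, t, n = 2.0, 1.7, 1.2, 100000

ds = t / n
integral = sum(alpha * lam * (lam * (i + 0.5) * ds) ** (alpha - 1) * ds
               for i in range(n))                 # midpoint rule for ∫ r(s) ds
survival_from_hazard = math.exp(-integral)
closed_form = math.exp(-(lam * t) ** alpha)
```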

34. Properties of Poisson Processes

1) If N_1 and N_2 are independent Poisson processes with rates λ_1 and λ_2, respectively, then N = N_1 + N_2 is a Poisson process with rate λ_1 + λ_2.

2) Suppose N is a Poisson process with rate λ. Suppose each point is marked with a label, say one of L_1, ..., L_r, independently of all other occurrences. Suppose p_i is the probability that a given point receives label L_i. Let N_i count the points with label i (so that N = N_1 + ... + N_r). Then N_1, ..., N_r are independent Poisson processes with rates p_i λ.
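Property 2 (thinning) can be illustrated by simulation. In this sketch (rate, label probability, horizon, and replicate count are my own choices), given the total count on [0, T] the number of points receiving a given label is Binomial, and the resulting labeled counts should have the thinned Poisson means and be uncorrelated.

```python
import numpy as np

# Thinning sketch: each of the M ~ Poisson(lam*T) points gets label L_1 with
# probability p, so the label-L_1 count given M is Binomial(M, p).  The two
# labeled counts should have means p*lam*T and (1-p)*lam*T and ~0 correlation.
rng = np.random.default_rng(5)
lam, p, T, reps = 3.0, 0.25, 5.0, 50000

m = rng.poisson(lam * T, size=reps)   # total points of N on [0, T]
counts0 = rng.binomial(m, p)          # points labeled L_1
counts1 = m - counts0                 # remaining points

mean0, mean1 = counts0.mean(), counts1.mean()
corr = np.corrcoef(counts0, counts1)[0, 1]   # near 0 if the counts are independent
```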

35. Properties of Poisson Processes

3) Suppose U_1, U_2, ... are independent rvs, each uniformly distributed on [0, T]. Suppose M is a Poisson(λT) random variable independent of the U's. Let

    N(t) = Σ_{i=1}^{M} 1(U_i ≤ t).

Then N is a Poisson process on [0, T] with rate λ.
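Property 3 gives a direct recipe for simulating a Poisson process. This sketch (rate, horizon, subintervals, and replicate count are my own choices) builds N from M uniform points and checks that counts in two disjoint subintervals have the right Poisson means and are uncorrelated, as independent increments require.

```python
import numpy as np

# Build N from M ~ Poisson(lam*T) uniform points on [0, T], then check the
# counts in the disjoint intervals (0, 3] and (3, 7]: means lam*3 = 6 and
# lam*4 = 8, with ~0 correlation between them.
rng = np.random.default_rng(6)
lam, T, reps = 2.0, 10.0, 30000

n_a = np.empty(reps)
n_b = np.empty(reps)
for i in range(reps):
    m = rng.poisson(lam * T)
    u = rng.uniform(0.0, T, size=m)
    n_a[i] = np.count_nonzero(u <= 3.0)                 # N(0, 3]
    n_b[i] = np.count_nonzero((u > 3.0) & (u <= 7.0))   # N(3, 7]

mean_a, mean_b = n_a.mean(), n_b.mean()
corr = np.corrcoef(n_a, n_b)[0, 1]
```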

36. Properties of Poisson Processes

4) Suppose N is a Poisson process with rate λ. Let S_1 < S_2 < ... be the times at which points arrive. Given N(T) = n, the times S_1, ..., S_n have the same distribution as the order statistics of a sample of size n from the uniform distribution on [0, T].

5) Given S_{n+1} = T, the times S_1, ..., S_n have the same distribution as the order statistics of a sample of size n from the uniform distribution on [0, T].
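Property 4 can be probed from the other direction: run the process via exponential gaps, condition on N(T) = n, and compare E[S_1 | N(T) = n] with the minimum of n uniforms on [0, T], whose mean is T/(n+1). The sketch below uses my own parameter choices and assumes numpy.

```python
import numpy as np

# Generate the process from exponential gaps, keep realizations with exactly
# n = 4 arrivals in [0, T] = [0, 4], and check E[S_1 | N(T) = 4] ~ T/(n+1) = 0.8,
# the mean of the first order statistic of 4 uniforms on [0, 4].
rng = np.random.default_rng(7)
lam, T, n, reps = 1.0, 4.0, 4, 200000

gaps = rng.exponential(scale=1.0 / lam, size=(reps, 20))  # 20 gaps is ample
s = np.cumsum(gaps, axis=1)                               # arrival times S_i
counts = (s <= T).sum(axis=1)                             # N(T) per realization

firsts = s[counts == n, 0]      # S_1 on realizations with N(T) = n
mean_first = firsts.mean()
```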

37. Indications of some proofs

1) N_1, ..., N_r are independent Poisson processes with rates λ_i, and N = Σ N_i. Let A_h be the event of 2 or more points of N in the time interval (t, t+h], and let B_h be the event of exactly one point of N in (t, t+h]. Let A_{ih} and B_{ih} be the corresponding events for N_i. Let H_t denote the history of the processes up to time t; we condition on H_t. We are given:

    P(A_{ih} | H_t) = o(h) and P(B_{ih} | H_t) = λ_i h + o(h).

38. Indications of some proofs, II

Note that

    A_h ⊂ [ ∪_{i≠j} (B_{ih} ∩ B_{jh}) ] ∪ [ ∪_{i=1}^{r} A_{ih} ].

Since

    P(B_{ih} ∩ B_{jh} | H_t) = P(B_{ih} | H_t) P(B_{jh} | H_t)
                             = (λ_i h + o(h))(λ_j h + o(h)) = O(h²) = o(h),

and P(A_{ih} | H_t) = o(h), we get P(A_h | H_t) = o(h): we have checked one of the two infinitesimal conditions for a Poisson process.

39. Indications of some proofs, III

Next let C_h be the event of no points of N in the time interval (t, t+h], and let C_{ih} be the same for N_i. Then

    P(C_h | H_t) = P(∩_i C_{ih} | H_t) = ∏_i P(C_{ih} | H_t)
                 = ∏_i (1 - λ_i h + o(h)) = 1 - (Σ_i λ_i) h + o(h).

This shows

    P(B_h | H_t) = 1 - P(C_h | H_t) - P(A_h | H_t) = (Σ_i λ_i) h + o(h).

Hence N is a Poisson process with rate Σ_i λ_i.

40. Indications of some proofs, IV

2) The infinitesimal approach used for 1) can do part of this; see the text for the rest. Events are defined as in 1):

- B_{ih} is the event that there is exactly one point of N_i in (t, t+h].
- B_h is the event that there is exactly one point, in total, among the r processes in (t, t+h].

B_{ih} is the union of the subset of B_h on which the single point is labeled i, and the subset of A_h on which there are two or more points of N in (t, t+h] but exactly one is labeled i. Since P(A_h | H_t) = o(h),

    P(B_{ih} | H_t) = p_i P(B_h | H_t) + o(h) = p_i (λh + o(h)) + o(h) = p_i λh + o(h).

41. Indications of some proofs, V

Similarly, A_{ih} is a subset of A_h, so

    P(A_{ih} | H_t) = o(h).

This shows each N_i is a Poisson process with rate λ p_i. Getting independence requires more work; see the text for the algebraic method, which is easier.


More information

Stat 512 Homework key 2

Stat 512 Homework key 2 Stat 51 Homework key October 4, 015 REGULAR PROBLEMS 1 Suppose continuous random variable X belongs to the family of all distributions having a linear probability density function (pdf) over the interval

More information

1 Sequences of events and their limits

1 Sequences of events and their limits O.H. Probability II (MATH 2647 M15 1 Sequences of events and their limits 1.1 Monotone sequences of events Sequences of events arise naturally when a probabilistic experiment is repeated many times. For

More information

CS 237: Probability in Computing

CS 237: Probability in Computing CS 237: Probability in Computing Wayne Snyder Computer Science Department Boston University Lecture 13: Normal Distribution Exponential Distribution Recall that the Normal Distribution is given by an explicit

More information

Slides 8: Statistical Models in Simulation

Slides 8: Statistical Models in Simulation Slides 8: Statistical Models in Simulation Purpose and Overview The world the model-builder sees is probabilistic rather than deterministic: Some statistical model might well describe the variations. An

More information

Chapter 2. Poisson Processes

Chapter 2. Poisson Processes Chapter 2. Poisson Processes Prof. Ai-Chun Pang Graduate Institute of Networking and Multimedia, epartment of Computer Science and Information Engineering, National Taiwan University, Taiwan utline Introduction

More information

ELEMENTS OF PROBABILITY THEORY

ELEMENTS OF PROBABILITY THEORY ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable

More information

Solution Manual for: Applied Probability Models with Optimization Applications by Sheldon M. Ross.

Solution Manual for: Applied Probability Models with Optimization Applications by Sheldon M. Ross. Solution Manual for: Applied Probability Models with Optimization Applications by Sheldon M Ross John L Weatherwax November 14, 27 Introduction Chapter 1: Introduction to Stochastic Processes Chapter 1:

More information

Why study probability? Set theory. ECE 6010 Lecture 1 Introduction; Review of Random Variables

Why study probability? Set theory. ECE 6010 Lecture 1 Introduction; Review of Random Variables ECE 6010 Lecture 1 Introduction; Review of Random Variables Readings from G&S: Chapter 1. Section 2.1, Section 2.3, Section 2.4, Section 3.1, Section 3.2, Section 3.5, Section 4.1, Section 4.2, Section

More information

Notes largely based on Statistical Methods for Reliability Data by W.Q. Meeker and L. A. Escobar, Wiley, 1998 and on their class notes.

Notes largely based on Statistical Methods for Reliability Data by W.Q. Meeker and L. A. Escobar, Wiley, 1998 and on their class notes. Unit 2: Models, Censoring, and Likelihood for Failure-Time Data Notes largely based on Statistical Methods for Reliability Data by W.Q. Meeker and L. A. Escobar, Wiley, 1998 and on their class notes. Ramón

More information

Review of Mathematical Concepts. Hongwei Zhang

Review of Mathematical Concepts. Hongwei Zhang Review of Mathematical Concepts Hongwei Zhang http://www.cs.wayne.edu/~hzhang Outline Limits of real number sequences A fixed-point theorem Probability and random processes Probability model Random variable

More information

Solution: The process is a compound Poisson Process with E[N (t)] = λt/p by Wald's equation.

Solution: The process is a compound Poisson Process with E[N (t)] = λt/p by Wald's equation. Solutions Stochastic Processes and Simulation II, May 18, 217 Problem 1: Poisson Processes Let {N(t), t } be a homogeneous Poisson Process on (, ) with rate λ. Let {S i, i = 1, 2, } be the points of the

More information

Random variables. DS GA 1002 Probability and Statistics for Data Science.

Random variables. DS GA 1002 Probability and Statistics for Data Science. Random variables DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Motivation Random variables model numerical quantities

More information

1.1 Review of Probability Theory

1.1 Review of Probability Theory 1.1 Review of Probability Theory Angela Peace Biomathemtics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology. CRC Press,

More information

STAT 830 Hypothesis Testing

STAT 830 Hypothesis Testing STAT 830 Hypothesis Testing Richard Lockhart Simon Fraser University STAT 830 Fall 2018 Richard Lockhart (Simon Fraser University) STAT 830 Hypothesis Testing STAT 830 Fall 2018 1 / 30 Purposes of These

More information

6.1 Moment Generating and Characteristic Functions

6.1 Moment Generating and Characteristic Functions Chapter 6 Limit Theorems The power statistics can mostly be seen when there is a large collection of data points and we are interested in understanding the macro state of the system, e.g., the average,

More information

1. Compute the c.d.f. or density function of X + Y when X, Y are independent random variables such that:

1. Compute the c.d.f. or density function of X + Y when X, Y are independent random variables such that: Final exam study guide Probability Theory (235A), Fall 2013 The final exam will be held on Thursday, Dec. 5 from 1:35 to 3:00 in 1344 Storer Hall. Please come on time! It will be a closed-book exam. The

More information

Queueing Theory. VK Room: M Last updated: October 17, 2013.

Queueing Theory. VK Room: M Last updated: October 17, 2013. Queueing Theory VK Room: M1.30 knightva@cf.ac.uk www.vincent-knight.com Last updated: October 17, 2013. 1 / 63 Overview Description of Queueing Processes The Single Server Markovian Queue Multi Server

More information

Sampling Distributions

Sampling Distributions Sampling Distributions In statistics, a random sample is a collection of independent and identically distributed (iid) random variables, and a sampling distribution is the distribution of a function of

More information

Chapter 5. Statistical Models in Simulations 5.1. Prof. Dr. Mesut Güneş Ch. 5 Statistical Models in Simulations

Chapter 5. Statistical Models in Simulations 5.1. Prof. Dr. Mesut Güneş Ch. 5 Statistical Models in Simulations Chapter 5 Statistical Models in Simulations 5.1 Contents Basic Probability Theory Concepts Discrete Distributions Continuous Distributions Poisson Process Empirical Distributions Useful Statistical Models

More information

Chapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued

Chapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued Chapter 3 sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional

More information

2905 Queueing Theory and Simulation PART III: HIGHER DIMENSIONAL AND NON-MARKOVIAN QUEUES

2905 Queueing Theory and Simulation PART III: HIGHER DIMENSIONAL AND NON-MARKOVIAN QUEUES 295 Queueing Theory and Simulation PART III: HIGHER DIMENSIONAL AND NON-MARKOVIAN QUEUES 16 Queueing Systems with Two Types of Customers In this section, we discuss queueing systems with two types of customers.

More information

IEOR 4703: Homework 2 Solutions

IEOR 4703: Homework 2 Solutions IEOR 4703: Homework 2 Solutions Exercises for which no programming is required Let U be uniformly distributed on the interval (0, 1); P (U x) = x, x (0, 1). We assume that your computer can sequentially

More information

Probability Review. Yutian Li. January 18, Stanford University. Yutian Li (Stanford University) Probability Review January 18, / 27

Probability Review. Yutian Li. January 18, Stanford University. Yutian Li (Stanford University) Probability Review January 18, / 27 Probability Review Yutian Li Stanford University January 18, 2018 Yutian Li (Stanford University) Probability Review January 18, 2018 1 / 27 Outline 1 Elements of probability 2 Random variables 3 Multiple

More information

STAT509: Continuous Random Variable

STAT509: Continuous Random Variable University of South Carolina September 23, 2014 Continuous Random Variable A continuous random variable is a random variable with an interval (either finite or infinite) of real numbers for its range.

More information

M/G/1 queues and Busy Cycle Analysis

M/G/1 queues and Busy Cycle Analysis queues and Busy Cycle Analysis John C.S. Lui Department of Computer Science & Engineering The Chinese University of Hong Kong www.cse.cuhk.edu.hk/ cslui John C.S. Lui (CUHK) Computer Systems Performance

More information

Computer Science, Informatik 4 Communication and Distributed Systems. Simulation. Discrete-Event System Simulation. Dr.

Computer Science, Informatik 4 Communication and Distributed Systems. Simulation. Discrete-Event System Simulation. Dr. Simulation Discrete-Event System Simulation Chapter 4 Statistical Models in Simulation Purpose & Overview The world the model-builder sees is probabilistic rather than deterministic. Some statistical model

More information

Stochastic process. X, a series of random variables indexed by t

Stochastic process. X, a series of random variables indexed by t Stochastic process X, a series of random variables indexed by t X={X(t), t 0} is a continuous time stochastic process X={X(t), t=0,1, } is a discrete time stochastic process X(t) is the state at time t,

More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

Stat 150 Practice Final Spring 2015

Stat 150 Practice Final Spring 2015 Stat 50 Practice Final Spring 205 Instructor: Allan Sly Name: SID: There are 8 questions. Attempt all questions and show your working - solutions without explanation will not receive full credit. Answer

More information

Stochastic Processes

Stochastic Processes Stochastic Processes 8.445 MIT, fall 20 Mid Term Exam Solutions October 27, 20 Your Name: Alberto De Sole Exercise Max Grade Grade 5 5 2 5 5 3 5 5 4 5 5 5 5 5 6 5 5 Total 30 30 Problem :. True / False

More information

Fundamental Tools - Probability Theory II

Fundamental Tools - Probability Theory II Fundamental Tools - Probability Theory II MSc Financial Mathematics The University of Warwick September 29, 2015 MSc Financial Mathematics Fundamental Tools - Probability Theory II 1 / 22 Measurable random

More information

Math Spring Practice for the Second Exam.

Math Spring Practice for the Second Exam. Math 4 - Spring 27 - Practice for the Second Exam.. Let X be a random variable and let F X be the distribution function of X: t < t 2 t < 4 F X (t) : + t t < 2 2 2 2 t < 4 t. Find P(X ), P(X ), P(X 2),

More information

Lecture 7. Poisson and lifetime processes in risk analysis

Lecture 7. Poisson and lifetime processes in risk analysis Lecture 7. Poisson and lifetime processes in risk analysis Jesper Rydén Department of Mathematics, Uppsala University jesper.ryden@math.uu.se Statistical Risk Analysis Spring 2014 Example: Life times of

More information

MATH 564/STAT 555 Applied Stochastic Processes Homework 2, September 18, 2015 Due September 30, 2015

MATH 564/STAT 555 Applied Stochastic Processes Homework 2, September 18, 2015 Due September 30, 2015 ID NAME SCORE MATH 56/STAT 555 Applied Stochastic Processes Homework 2, September 8, 205 Due September 30, 205 The generating function of a sequence a n n 0 is defined as As : a ns n for all s 0 for which

More information

Markov chains. 1 Discrete time Markov chains. c A. J. Ganesh, University of Bristol, 2015

Markov chains. 1 Discrete time Markov chains. c A. J. Ganesh, University of Bristol, 2015 Markov chains c A. J. Ganesh, University of Bristol, 2015 1 Discrete time Markov chains Example: A drunkard is walking home from the pub. There are n lampposts between the pub and his home, at each of

More information

Chapter 6: Random Processes 1

Chapter 6: Random Processes 1 Chapter 6: Random Processes 1 Yunghsiang S. Han Graduate Institute of Communication Engineering, National Taipei University Taiwan E-mail: yshan@mail.ntpu.edu.tw 1 Modified from the lecture notes by Prof.

More information

Lecture 2: Repetition of probability theory and statistics

Lecture 2: Repetition of probability theory and statistics Algorithms for Uncertainty Quantification SS8, IN2345 Tobias Neckel Scientific Computing in Computer Science TUM Lecture 2: Repetition of probability theory and statistics Concept of Building Block: Prerequisites:

More information

Review 1: STAT Mark Carpenter, Ph.D. Professor of Statistics Department of Mathematics and Statistics. August 25, 2015

Review 1: STAT Mark Carpenter, Ph.D. Professor of Statistics Department of Mathematics and Statistics. August 25, 2015 Review : STAT 36 Mark Carpenter, Ph.D. Professor of Statistics Department of Mathematics and Statistics August 25, 25 Support of a Random Variable The support of a random variable, which is usually denoted

More information

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable

Lecture Notes 1 Probability and Random Variables. Conditional Probability and Independence. Functions of a Random Variable Lecture Notes 1 Probability and Random Variables Probability Spaces Conditional Probability and Independence Random Variables Functions of a Random Variable Generation of a Random Variable Jointly Distributed

More information

3 Continuous Random Variables

3 Continuous Random Variables Jinguo Lian Math437 Notes January 15, 016 3 Continuous Random Variables Remember that discrete random variables can take only a countable number of possible values. On the other hand, a continuous random

More information

i=1 k i=1 g i (Y )] = k f(t)dt and f(y) = F (y) except at possibly countably many points, E[g(Y )] = f(y)dy = 1, F(y) = y

i=1 k i=1 g i (Y )] = k f(t)dt and f(y) = F (y) except at possibly countably many points, E[g(Y )] = f(y)dy = 1, F(y) = y Math 480 Exam 2 is Wed. Oct. 31. You are allowed 7 sheets of notes and a calculator. The exam emphasizes HW5-8, and Q5-8. From the 1st exam: The conditional probability of A given B is P(A B) = P(A B)

More information

Continuous case Discrete case General case. Hazard functions. Patrick Breheny. August 27. Patrick Breheny Survival Data Analysis (BIOS 7210) 1/21

Continuous case Discrete case General case. Hazard functions. Patrick Breheny. August 27. Patrick Breheny Survival Data Analysis (BIOS 7210) 1/21 Hazard functions Patrick Breheny August 27 Patrick Breheny Survival Data Analysis (BIOS 7210) 1/21 Introduction Continuous case Let T be a nonnegative random variable representing the time to an event

More information

Tom Salisbury

Tom Salisbury MATH 2030 3.00MW Elementary Probability Course Notes Part V: Independence of Random Variables, Law of Large Numbers, Central Limit Theorem, Poisson distribution Geometric & Exponential distributions Tom

More information

Probability Midterm Exam 2:15-3:30 pm Thursday, 21 October 1999

Probability Midterm Exam 2:15-3:30 pm Thursday, 21 October 1999 Name: 2:15-3:30 pm Thursday, 21 October 1999 You may use a calculator and your own notes but may not consult your books or neighbors. Please show your work for partial credit, and circle your answers.

More information

Supratim Ray

Supratim Ray Supratim Ray sray@cns.iisc.ernet.in Biophysics of Action Potentials Passive Properties neuron as an electrical circuit Passive Signaling cable theory Active properties generation of action potential Techniques

More information

Chapter 4: Continuous Probability Distributions

Chapter 4: Continuous Probability Distributions Chapter 4: Continuous Probability Distributions Seungchul Baek Department of Statistics, University of South Carolina STAT 509: Statistics for Engineers 1 / 57 Continuous Random Variable A continuous random

More information

Stat410 Probability and Statistics II (F16)

Stat410 Probability and Statistics II (F16) Stat4 Probability and Statistics II (F6 Exponential, Poisson and Gamma Suppose on average every /λ hours, a Stochastic train arrives at the Random station. Further we assume the waiting time between two

More information

1 Random Variable: Topics

1 Random Variable: Topics Note: Handouts DO NOT replace the book. In most cases, they only provide a guideline on topics and an intuitive feel. 1 Random Variable: Topics Chap 2, 2.1-2.4 and Chap 3, 3.1-3.3 What is a random variable?

More information