Introduction to Probability and Statistics Winter 2015


CALIFORNIA INSTITUTE OF TECHNOLOGY
Ma 3/103  KC Border  Introduction to Probability and Statistics  Winter 2015

Lecture 11: The Poisson Process

Relevant textbook passages: Pitman [15]: Sections 2.4, 3.8, 4.2. Larsen–Marx [14]: Sections 3.8, 4.2.

11.1 What makes a density?

Any function $f \colon \mathbf{R}_+ \to \mathbf{R}_+$ such that $\int_0^\infty f(x)\,dx < \infty$ can be turned into a probability density function by normalizing it. That is, if the real number $c$ satisfies $c = \int_0^\infty f(x)\,dx$, then $f(x)/c$ is a probability density. Likewise for functions $f \colon \mathbf{R} \to \mathbf{R}_+$ such that $\int_{-\infty}^\infty f(x)\,dx < \infty$. The constants $c$ are sometimes called normalizing constants, and they account for the odd look of many densities. For instance,
$$ \int_{-\infty}^\infty e^{-z^2/2}\,dz = \sqrt{2\pi}, $$
so $1/\sqrt{2\pi}$ is the normalizing constant for the Normal family.

Much space is devoted in introductory statistics and probability textbooks to computing various integrals. In this course, I shall not spend time in lecture on the details of evaluating integrals. My view is that the evaluation of integrals, while a necessary part of the subject, frequently offers little insight. You have all had serious calculus classes recently, so you are probably better at integration than I am these days. On the occasions where it does provide some insight, we may spend some time on it. I do recommend the exposition in Pitman [15, Section 4.4] and Larsen–Marx [14, Section 3.8].

We shall now consider some other important densities and pmfs.

11.2 The Gamma Function

Definition: The Gamma function is defined by
$$ \Gamma(s) = \int_0^\infty t^{s-1} e^{-t}\,dt. $$
(Larsen–Marx [14]: Section 4.6. Pitman [15].)
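As a quick numerical sanity check on the normalizing-constant idea (an illustrative sketch of my own, not part of the notes; the `integral` helper and the crude midpoint rule are my own choices):

```python
import math

def integral(f, lo, hi, n=100_000):
    """Crude midpoint-rule approximation of the integral of f over [lo, hi]."""
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

# The Normal normalizing constant: the integral of exp(-z^2/2) over R is
# sqrt(2*pi), so 1/sqrt(2*pi) normalizes it into the standard Normal density.
c = integral(lambda z: math.exp(-z * z / 2), -20.0, 20.0)
assert abs(c - math.sqrt(2 * math.pi)) < 1e-6

# Any nonnegative integrable f becomes a density after dividing by its integral.
f = lambda x: x**2 * math.exp(-x)        # integral over [0, inf) is Gamma(3) = 2
cf = integral(f, 0.0, 60.0)
density = lambda x: f(x) / cf
assert abs(integral(density, 0.0, 60.0) - 1.0) < 1e-9
```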

The Gamma function is a continuous version of the factorial, and has the properties that
$$ \Gamma(s+1) = s\,\Gamma(s) \quad\text{for every } s > 0, \qquad\text{and}\qquad \Gamma(m) = (m-1)! \quad\text{for every natural number } m. $$
In particular, $\Gamma(2) = \Gamma(1) = 1$, and $\Gamma(1/2) = \sqrt{\pi}$. There is no closed-form formula for the Gamma function except at integer multiples of 1/2. See for instance Pitman [15] or Apostol [2] for the cited properties, which follow from integration by parts. (See also Pitman [15, p. 294].)

11.3 The Gamma family of distributions

The density of the general Gamma(r, λ) distribution is given by
$$ f(t) = \frac{\lambda^r}{\Gamma(r)}\, t^{r-1} e^{-\lambda t} \qquad (t > 0). \tag{1} $$
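The identities above, and the normalization of the Gamma(r, λ) density in (1), are easy to check numerically. The sketch below is my own illustration (not part of the notes), using Python's `math.gamma` and a crude midpoint rule:

```python
import math

# Gamma-function facts cited in the text, checked with math.gamma:
assert abs(math.gamma(0.5) - math.sqrt(math.pi)) < 1e-12   # Gamma(1/2) = sqrt(pi)
assert math.gamma(1.0) == 1.0 and math.gamma(2.0) == 1.0   # Gamma(1) = Gamma(2) = 1
assert abs(math.gamma(6.0) - math.factorial(5)) < 1e-9     # Gamma(m) = (m-1)!
s = 2.71
assert abs(math.gamma(s + 1) - s * math.gamma(s)) < 1e-9   # Gamma(s+1) = s Gamma(s)

def gamma_pdf(t, r, lam):
    """The Gamma(r, lam) density of equation (1)."""
    return lam**r / math.gamma(r) * t**(r - 1) * math.exp(-lam * t)

# The normalizing constant lam^r / Gamma(r) makes the density integrate to 1:
r, lam, n, hi = 2.5, 1.5, 200_000, 40.0
h = hi / n
total = sum(gamma_pdf((i + 0.5) * h, r, lam) for i in range(n)) * h
assert abs(total - 1.0) < 1e-4
```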

The parameter $r$ is referred to as the shape parameter or index, and $\lambda$ is a scale parameter. Why is it called the scale parameter? Because
$$ T \sim \text{Gamma}(r, \lambda) \iff \lambda T \sim \text{Gamma}(r, 1). $$
The distribution has mean and variance given by
$$ \mathbf{E}\,X = \frac{r}{\lambda}, \qquad \operatorname{Var} X = \frac{r}{\lambda^2}. $$
According to Pitman [15, p. 291], "In applications the distribution of a random variable may be unknown, but reasonably well approximated by some gamma distribution."

[Figure: Gamma(2, λ) densities for λ = 0.5, 1, 2.]
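A quick Monte Carlo check of the mean, the variance, and the scaling property (my own sketch, not part of the notes; note that Python's `random.gammavariate(alpha, beta)` takes a shape and a *scale*, so `beta = 1/λ` in the convention of equation (1)):

```python
import random
import statistics

random.seed(0)
r, lam = 3.0, 2.0
# random.gammavariate(alpha, beta) uses shape alpha and SCALE beta, so
# Gamma(r, lam) in the text's convention is gammavariate(r, 1/lam).
xs = [random.gammavariate(r, 1.0 / lam) for _ in range(200_000)]

assert abs(statistics.fmean(xs) - r / lam) < 0.02        # E X = r/lam = 1.5
assert abs(statistics.variance(xs) - r / lam**2) < 0.03  # Var X = r/lam^2 = 0.75

# Scale property: if T ~ Gamma(r, lam), then lam*T ~ Gamma(r, 1), whose mean is r.
ys = [lam * x for x in xs]
assert abs(statistics.fmean(ys) - r) < 0.04
```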

[Figures: Gamma(r, λ) densities for λ = 0.5, 1, 2.]

[Figures: Gamma(r, λ) densities for λ = 0.5, 1, 2, and for r = 0.5, 1, 2.]

[Figure: Gamma(r, λ) densities for r = 0.5, 1, 2.]

Hey: Read Me

There are (at least) three incompatible, but easy to translate, naming conventions for the Gamma distribution.

Pitman [15, p. 286] and Larsen and Marx [14, p. 272] refer to their parameters as r and λ, and call the function in equation (1) the Gamma(r, λ) density. Note that the shape parameter is the first parameter and the scale parameter is the second parameter for Pitman and for Larsen and Marx. This is the convention that I used above in equation (1).

Feller [12, p. 47] calls the scale parameter α instead of λ, and he calls the shape parameter ν instead of r. Cramér [8, p. 126] also calls the scale parameter α instead of λ, but the shape parameter he calls λ instead of r. Other than that, they agree that equation (1) is the Gamma density, but they list the parameters in reverse order; that is, they list the scale parameter first and the shape parameter second.

Casella and Berger [4, p. 99] call the scale parameter β and the shape parameter α, and list the shape parameter first and the scale parameter second. But here is the confusing part: their scale parameter β is our 1/λ.^a Mathematica [20] and R [16] also invert the scale parameter. To get my Gamma(r, λ) density in Mathematica 8, you have to call PDF[GammaDistribution[r, 1/λ], t]; to get it in R, you would call dgamma(t, r, scale = 1/λ). I'm sorry. It's not my fault. But you do have to be careful to know what convention is being used.

^a That is, Casella and Berger write the gamma(α, β) density as $\frac{1}{\Gamma(\alpha)\beta^\alpha}\, t^{\alpha-1} e^{-t/\beta}$.

11.4 Random Lifetime

For a lifetime or duration T chosen at random according to a density f(t) on [0, ∞), with cdf F(t), the survival function is (Pitman [15, Section 4.2])
$$ G(t) = P(T > t) = 1 - F(t) = \int_t^\infty f(s)\,ds. $$
When T is the (random) time to failure, the survival function G(t) at epoch t gives the probability of surviving (not failing) until t. Note the convention that the present is time 0, and durations are measured as times after that.

Aside: If you've ever done any programming involving a calendar, you know the difference between a point in time, called an epoch by probabilists, and a duration, which is the difference between two epochs.

The hazard rate λ(t) is defined by (Pitman [15, Section 4.3])
$$ \lambda(t) = \lim_{h \downarrow 0} \frac{P\bigl(T \in (t, t+h] \mid T > t\bigr)}{h}. $$
Equivalently,
$$ \lambda(t) = \frac{f(t)}{G(t)}. $$
Proof: By definition,
$$ P\bigl(T \in (t, t+h] \mid T > t\bigr) = \frac{P\bigl(T \in (t, t+h]\bigr)}{P(T > t)} = \frac{P\bigl(T \in (t, t+h]\bigr)}{G(t)}. $$
Moreover $P\bigl(T \in (t, t+h]\bigr) = F(t+h) - F(t)$, so dividing by $h$ and letting $h \downarrow 0$, the limit is just $F'(t)/G(t) = f(t)/G(t)$.

The hazard rate f(t)/G(t) is often thought of as the instantaneous probability of death or failure.

11.5 The Exponential Distribution

The Exponential(λ) distribution is widely used to model random durations or times. It is another name for the Gamma(1, λ) distribution. That is, the random time T has an Exponential(λ) distribution if it has density
$$ f(t) = \lambda e^{-\lambda t} \qquad (t \geq 0) $$
and cdf
$$ F(t) = 1 - e^{-\lambda t}, $$
which give survival function
$$ G(t) = e^{-\lambda t} $$
and hazard rate
$$ \lambda(t) = \lambda. $$
That is, it has a constant hazard rate. The only distribution with a constant hazard rate λ > 0 is the Exponential(λ) distribution.
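The constant hazard rate is easy to verify directly from these formulas; here is a small numerical sketch of my own (the particular λ is arbitrary):

```python
import math

lam = 1.7
f = lambda t: lam * math.exp(-lam * t)   # Exponential(lam) density
G = lambda t: math.exp(-lam * t)         # survival function

# The hazard rate f(t)/G(t) equals lam at every t:
for t in (0.1, 1.0, 5.0):
    assert abs(f(t) / G(t) - lam) < 1e-12

# It also matches the limit definition P(T in (t, t+h] | T > t)/h for small h:
t, h = 2.0, 1e-6
cond_prob = (G(t) - G(t + h)) / G(t)     # P(T in (t, t+h] | T > t)
assert abs(cond_prob / h - lam) < 1e-4
```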

The mean of an Exponential(λ) random variable is given by
$$ \mathbf{E}\,T = \int_0^\infty \lambda t e^{-\lambda t}\,dt = \frac{1}{\lambda}. $$
Proof: Use the integration by parts formula $\int h' g = hg - \int g' h$, with $h'(t) = \lambda e^{-\lambda t}$ and $g(t) = t$ (so that $h(t) = -e^{-\lambda t}$ and $g'(t) = 1$), to get
$$ \mathbf{E}\,T = \int_0^\infty \lambda t e^{-\lambda t}\,dt = \bigl[-t e^{-\lambda t}\bigr]_0^\infty + \int_0^\infty e^{-\lambda t}\,dt = 0 + \Bigl[-\frac{1}{\lambda} e^{-\lambda t}\Bigr]_0^\infty = \frac{1}{\lambda}. $$

The variance of an Exponential(λ) is $1/\lambda^2$.

Proof: $\operatorname{Var} T = \mathbf{E}(T^2) - (\mathbf{E}\,T)^2$. Setting $h'(t) = \lambda e^{-\lambda t}$ and $g(t) = t^2$ and integrating by parts, we get
$$ \mathbf{E}(T^2) = \int_0^\infty t^2 \lambda e^{-\lambda t}\,dt = \underbrace{\bigl[-t^2 e^{-\lambda t}\bigr]_0^\infty}_{=0} + 2\int_0^\infty t e^{-\lambda t}\,dt = \frac{2}{\lambda}\,\mathbf{E}\,T = \frac{2}{\lambda^2}, $$
so
$$ \operatorname{Var} T = \frac{2}{\lambda^2} - \frac{1}{\lambda^2} = \frac{1}{\lambda^2}. $$
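Both moments can be confirmed by simulation with `random.expovariate`, whose argument is the rate λ (an illustrative sketch of my own, not part of the notes):

```python
import random
import statistics

random.seed(1)
lam = 2.5
# random.expovariate(lam) draws from Exponential(lam) (rate parameterization).
ts = [random.expovariate(lam) for _ in range(400_000)]

assert abs(statistics.fmean(ts) - 1.0 / lam) < 0.005       # E T = 1/lam = 0.4
assert abs(statistics.variance(ts) - 1.0 / lam**2) < 0.005 # Var T = 1/lam^2 = 0.16
```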

11.6 The Exponential is Memoryless

A property that is closely related to having a constant hazard rate is that the exponential distribution is memoryless: for an Exponential random variable T (Pitman [15, p. 279]),
$$ P(T > t + s \mid T > t) = P(T > s) \qquad (s, t > 0). $$
To see this, recall that by definition,
$$ P(T > t + s \mid T > t) = \frac{P\bigl((T > t+s) \cap (T > t)\bigr)}{P(T > t)} = \frac{P(T > t+s)}{P(T > t)} \quad\text{as } (T > t+s) \subset (T > t) $$
$$ = \frac{G(t+s)}{G(t)} = \frac{e^{-\lambda(t+s)}}{e^{-\lambda t}} = e^{-\lambda s} = G(s) = P(T > s). $$

In fact, the only memoryless distributions are Exponential.

Proof: Rewrite memorylessness as
$$ \frac{G(t+s)}{G(t)} = G(s), \qquad\text{or}\qquad G(t+s) = G(t)\,G(s) \qquad (t, s > 0). $$
It is well known that this last property (plus the assumption of continuity at at least one point) is enough to prove that G must be an exponential (or identically zero) on the interval $(0, \infty)$. See J. Aczél [1, Theorem 1, p. 3].^1

^1 Aczél [1] points out that there is another kind of solution to the functional equation when we extend the domain to $[0, \infty)$, namely $G(0) = 1$ and $G(t) = 0$ for $t > 0$.

11.7 Joint distribution of independent Exponentials

Let X ~ Exponential(λ) and Y ~ Exponential(µ) be independent (Pitman [15, p. 352]). Then
$$ f(x, y) = \lambda e^{-\lambda x}\,\mu e^{-\mu y} = \lambda\mu\, e^{-\lambda x - \mu y}, $$
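Memorylessness can also be seen in simulation: among the draws that survive past t, the fraction surviving an additional s matches the unconditional fraction surviving past s. (A sketch of my own; the particular λ, t, s are arbitrary.)

```python
import random

random.seed(2)
lam, t, s, n = 1.0, 0.8, 0.5, 1_000_000
draws = [random.expovariate(lam) for _ in range(n)]

survivors = [x for x in draws if x > t]
lhs = sum(1 for x in survivors if x > t + s) / len(survivors)  # P(T > t+s | T > t)
rhs = sum(1 for x in draws if x > s) / n                       # P(T > s)

# Both should be near exp(-lam * s):
assert abs(lhs - rhs) < 0.01
```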

so
$$ P(X < Y) = \int_0^\infty \int_0^y \lambda e^{-\lambda x}\,\mu e^{-\mu y}\,dx\,dy = \int_0^\infty \mu e^{-\mu y} \Bigl(\int_0^y \lambda e^{-\lambda x}\,dx\Bigr) dy = \int_0^\infty \mu e^{-\mu y}\bigl(1 - e^{-\lambda y}\bigr)\,dy $$
$$ = \underbrace{\int_0^\infty \mu e^{-\mu y}\,dy}_{=1} - \frac{\mu}{\lambda+\mu} \underbrace{\int_0^\infty (\lambda+\mu)\, e^{-(\lambda+\mu)y}\,dy}_{=1} = 1 - \frac{\mu}{\lambda+\mu} = \frac{\lambda}{\lambda+\mu}. $$

11.8 The sum of independent Exponentials

Let X and Y be independent and identically distributed Exponential(λ) random variables. The density of the sum for t > 0 is given by the convolution
$$ f_{X+Y}(t) = \int_0^t f_X(t-y)\,f_Y(y)\,dy \qquad\text{(since } f_X(t-y) = 0 \text{ if } y > t\text{)} $$
$$ = \int_0^t \lambda e^{-\lambda(t-y)}\,\lambda e^{-\lambda y}\,dy = \int_0^t \lambda^2 e^{-\lambda t}\,dy = t\lambda^2 e^{-\lambda t}. $$
This is a Gamma(2, λ) distribution. More generally, the sum of n independent and identically distributed Exponential(λ) random variables has a Gamma(n, λ) distribution (Pitman [15]), given by
$$ f(t) = \lambda^n e^{-\lambda t}\,\frac{t^{n-1}}{(n-1)!}. $$

11.9 Survival functions and moments

For a nonnegative random variable with a continuous density f, integration by parts allows us to prove the following.
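Both the P(X < Y) formula and the Gamma(2, λ) law of the sum are easy to confirm by simulation (my own sketch, not part of the notes):

```python
import random
import statistics

random.seed(3)
lam, mu, n = 2.0, 3.0, 300_000
xs = [random.expovariate(lam) for _ in range(n)]
ys = [random.expovariate(mu) for _ in range(n)]

# P(X < Y) = lam / (lam + mu) = 0.4 here:
p = sum(1 for x, y in zip(xs, ys) if x < y) / n
assert abs(p - lam / (lam + mu)) < 0.005

# The sum of two iid Exponential(lam)'s is Gamma(2, lam): mean 2/lam, var 2/lam^2.
zs = [x + random.expovariate(lam) for x in xs]
assert abs(statistics.fmean(zs) - 2.0 / lam) < 0.01
assert abs(statistics.variance(zs) - 2.0 / lam**2) < 0.01
```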

11.9.1 Proposition: Let F be a cdf with continuous density f on [0, ∞). Then the p-th moment can be calculated as
$$ \int_0^\infty x^p f(x)\,dx = \int_0^\infty p x^{p-1}\bigl(1 - F(x)\bigr)\,dx = \int_0^\infty p x^{p-1} G(x)\,dx. $$
Proof: Use the integration by parts formula $\int h' g = hg - \int g' h$, with $h'(x) = f(x)$ and $g(x) = x^p$ (so that $h(x) = F(x)$ and $g'(x) = p x^{p-1}$), to get
$$ \int_0^b x^p f(x)\,dx = \bigl[x^p F(x)\bigr]_0^b - \int_0^b p x^{p-1} F(x)\,dx = b^p F(b) - \int_0^b p x^{p-1} F(x)\,dx $$
$$ = F(b)\int_0^b p x^{p-1}\,dx - \int_0^b p x^{p-1} F(x)\,dx = \int_0^b p x^{p-1}\bigl(F(b) - F(x)\bigr)\,dx, $$
and let $b \to \infty$.

In particular, the first moment, the mean, is given by the area under the survival function:
$$ \mathbf{E}\,T = \int_0^\infty \bigl(1 - F(x)\bigr)\,dx = \int_0^\infty G(x)\,dx. $$

11.10 The Poisson(λ) distribution

The Poisson(λ) distribution (Larsen–Marx [14]: Section 4.3; Pitman [15]) is a discrete distribution that is concentrated on the nonnegative integers, and is sometimes used as a counter of rare events. For a random variable X with the Poisson(λ) distribution, where λ > 0, the probability mass function is
$$ P(X = k) = p_\lambda(k) = e^{-\lambda}\,\frac{\lambda^k}{k!} \qquad (k = 0, 1, 2, 3, \dots). $$
To see why $\sum_k p_\lambda(k) = 1$, consult the supplementary notes on series. The Poisson distribution is named for the French mathematician Siméon Denis Poisson, not for fish. But there is a model of fishing that predicts that the number of fish caught per hour by an angler follows a Poisson distribution.
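For the Exponential(λ) these facts are easy to check numerically: the area under the survival function is 1/λ, and $\int p x^{p-1} G(x)\,dx$ recovers the p-th moment. The sketch below (my own, not part of the notes) uses a crude midpoint rule, and also checks that the Poisson(λ) pmf sums to 1:

```python
import math

lam = 1.3
G = lambda x: math.exp(-lam * x)   # survival function of Exponential(lam)

# Midpoint-rule integration over a truncated range [0, hi]:
n, hi = 100_000, 30.0
h = hi / n
mids = [(i + 0.5) * h for i in range(n)]

# First moment: the area under G is 1/lam.
mean = sum(G(x) for x in mids) * h
assert abs(mean - 1.0 / lam) < 1e-6

# Second moment via p x^(p-1) G(x) with p = 2: should be 2/lam^2.
m2 = sum(2.0 * x * G(x) for x in mids) * h
assert abs(m2 - 2.0 / lam**2) < 1e-4

# The Poisson(lam) pmf sums to 1:
total = sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(60))
assert abs(total - 1.0) < 1e-12
```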

[Figures: Poisson(λ) probability mass functions for several values of λ.]

11.11 The mean and variance of the Poisson

If X has a Poisson(λ) distribution, then
$$ \mathbf{E}\,X = \sum_{k=1}^\infty k\, e^{-\lambda}\frac{\lambda^k}{k!} = \lambda \sum_{k=1}^\infty e^{-\lambda}\frac{\lambda^{k-1}}{(k-1)!} = \lambda \sum_{j=0}^\infty e^{-\lambda}\frac{\lambda^j}{j!} = \lambda. $$
A similar argument shows that
$$ \operatorname{Var} X = \lambda. $$

11.12 The Poisson approximates the Binomial

Another useful property of the Poisson is that it approximates the Binomial distribution in a peculiar way. Fix λ and let $X_n$ have the Binomial(n, λ/n) distribution. As n gets large, for each k we have
$$ P(X_n = k) = \binom{n}{k}\Bigl(\frac{\lambda}{n}\Bigr)^k\Bigl(1 - \frac{\lambda}{n}\Bigr)^{n-k} = \frac{n(n-1)(n-2)\cdots(n-k+1)}{n^k}\cdot\frac{\lambda^k}{k!}\cdot\Bigl(1 - \frac{\lambda}{n}\Bigr)^n\Bigl(1 - \frac{\lambda}{n}\Bigr)^{-k} $$
$$ = \underbrace{\frac{n}{n}\cdot\frac{n-1}{n}\cdots\frac{n-k+1}{n}}_{\to\,1}\cdot\frac{\lambda^k}{k!}\cdot\underbrace{\Bigl(1 - \frac{\lambda}{n}\Bigr)^n}_{\to\,e^{-\lambda}}\cdot\underbrace{\Bigl(1 - \frac{\lambda}{n}\Bigr)^{-k}}_{\to\,1} \longrightarrow \frac{\lambda^k}{k!}\,e^{-\lambda}, $$
which is the Poisson(λ) probability of k.
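The limit can be watched numerically with `math.comb` (my own sketch; for λ fixed and n large, the Binomial(n, λ/n) pmf is already very close to the Poisson(λ) pmf):

```python
import math

lam, n = 2.0, 10_000
p = lam / n

def binom_pmf(k):
    """Binomial(n, lam/n) probability of k successes."""
    return math.comb(n, k) * p**k * (1.0 - p)**(n - k)

def poisson_pmf(k):
    """Poisson(lam) probability of k."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# The two pmfs agree to within about lam^2/n at every small k:
for k in range(10):
    assert abs(binom_pmf(k) - poisson_pmf(k)) < 1e-3
```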

[Figures: Binomial(n, λ/n) probability mass functions compared with the Poisson(λ) pmf.]

11.13 How might the Poisson distribution arise?

The following story is a variation on one told by Feller [11, Section VI.6] and Pitman [15, Section 3.5], where they describe the Poisson scatter. This is not quite that phenomenon. Consider a number m of BBs (small metal balls) that are to be scattered

among n bins. Let $E_{i,b}$ be the event that BB i hits Bin b. Assume that for each BB i,
$$ \operatorname{Prob}(E_{i,b}) = \frac{1}{n} \quad\text{(each bin is equally likely to be hit by BB } i\text{)}, $$
and that for distinct BBs $i_1, \dots, i_k$ and any collection $b_1, \dots, b_k$ of bins, the events $E_{i_s, b_s}$, $s = 1, \dots, k$, are independent.

Set λ = m/n, and let $p_\lambda(k)$ be the Poisson(λ) probability mass function. Then, fixing λ and k, if n is large enough,
$$ \text{the number of bins with } k \text{ hits} \approx n\,p_\lambda(k). \tag{2} $$
This result was called the Law of Small Numbers by von Bortkiewicz [18].

Here is a mildly bogus argument to convince you that it is plausible: Pick a bin, say Bin b, and pick some number k of hits. The probability that BB i hits Bin b is 1/n = λ/m. So the number of hits on Bin b has a Binomial(m, λ/m) distribution, which for fixed λ and large m is approximated by the Poisson(λ) distribution, so
$$ \operatorname{Prob}(\text{Bin } b \text{ contains } k \text{ BBs}) \approx p_\lambda(k). $$
But because the total number of BBs, m, is fixed, the numbers of balls in Bin b and Bin c are not independent. (In fact, the joint distribution is one giant multinomial.) So I can't simply multiply this by n bins to get the number of bins with k hits.

So imagine independently replicating this experiment r times. Say the experiment itself is a success if Bin b has k hits. The probability of a successful experiment is thus $p_\lambda(k)$. By the Law of Large Numbers, the number of successes in a large number r of experiments is close to $r\,p_\lambda(k)$. Now note that there is nothing special about Bin b; there are n bins, so summing over all bins and all replications, one would expect the number of bins with k hits to be n times the number of experiments in which Bin b has k hits (namely, $r\,p_\lambda(k)$). Thus all together, in the r replications there are about $n\,r\,p_\lambda(k)$ bins with k hits. Since all the replications are the same experiment, there should be about
$$ \frac{n\,r\,p_\lambda(k)}{r} = n\,p_\lambda(k) $$
bins with k hits per experiment.
In this argument, I did a little handwaving (using the terms "close" and "about"). Note though that r has to be chosen after k, so we don't expect (2) to hold for all values of k, just the smallish ones.

11.13.1 Applications of the Poisson distribution

Does the argument I just gave you work? There are many stories of data that fit this model, and many are told without any attribution. Many of these examples can ultimately be traced back to the very carefully written book by William Feller [10] in 1950. (I have the third edition, so will cite it.)

During the Second World War, Nazi Germany used unmanned aircraft, the V1 Buzz Bombs, to attack London. (They weren't quite drones, since they were never designed to return or to be remote controlled. Once launched, where they came down was reasonably random.) Feller [11] cites R. D. Clarke [7] (an insurance adjuster for The Prudential), who reports that 144 square kilometres of South London was divided into 576 sectors of about 1/4 square kilometre each, and the number of hits in each sector was recorded. There were 537 Buzz Bombs that hit South London, so λ = 537/576 ≈ 0.9323. Our model then predicts that the number of sectors with k hits should be approximately $576\,p_{0.9323}(k)$. Here is the actual data compared to the model's prediction:

No. of hits k:                   0        1       2       3      4     5 and over
No. of sectors with k hits:    229      211      93      35      7      1
Poisson model prediction:    226.74   211.39   98.54   30.62   7.14   1.57

That is amazingly close.

The bins don't have to be geographical; they can be temporal. So distributing a fixed average number of events per time period over many independent time periods should also give a Poisson distribution. Indeed Chung [6, p. 196] cites John Maynard Keynes [13, p. 42], who reports that Ladislaus von Bortkiewicz [18] found that the distribution of the number of cavalrymen killed each year by horse kicks is described by a Poisson distribution! Here is Keynes's table. It covers the years 1875–1894 and fourteen different Prussian Cavalry Corps.

Annual number of casualties N:            0       1      2      3      4
Corps-years with N casualties (actual):  144      91     32     11     2
Corps-years (theoretical):             139.0    97.3   34.1    8.0   1.4
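The model predictions in the flying-bomb table come straight from $576\,p_\lambda(k)$ with λ = 537/576; the sketch below (my own, not part of the notes) reproduces them:

```python
import math

bombs, sectors = 537, 576
lam = bombs / sectors                  # about 0.9323

def poisson_pmf(k):
    return math.exp(-lam) * lam**k / math.factorial(k)

# Predicted number of sectors with exactly k hits, k = 0..4, then "5 or more":
predicted = [sectors * poisson_pmf(k) for k in range(5)]
predicted.append(sectors - sum(predicted))

assert abs(predicted[0] - 226.74) < 0.05   # k = 0
assert abs(predicted[1] - 211.39) < 0.05   # k = 1
assert abs(predicted[2] - 98.54) < 0.05    # k = 2
assert abs(predicted[5] - 1.57) < 0.05     # k >= 5
```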

Keynes [13, p. 42] also reports that von Bortkiewicz [18] found that the annual number of child suicides follows a Poisson distribution.

Chung [6, p. 196] also lists the following as examples of Poisson distributions:
- The number of color-blind people in a large group.
- The number of raisins in cookies. (Who did this research?)
- The number of misprints on a page. (Again, who did the counting?^2)

Feller [11, Section VI.7] lists these additional phenomena, and supplies citations to back up his claims:
- An experiment by Rutherford, Chadwick, and Ellis on the emission of α-particles over N = 2608 time intervals of 7.5 seconds each. Feller cites Harald Cramér [8, p. 436], who reports remarkable agreement with the Poisson. See the first three columns of the table from Cramér reproduced below.
- The number of chromosome interchanges in cells subjected to X-ray radiation [5].
- Telephone connections to a wrong number (Frances Thorndike [17]).
- Bacterial and blood counts.

The Poisson distribution describes the number of occurrences of a rare phenomenon in independent large samples.

11.14 Sums of independent Poissons

Let X be Poisson(µ) and Y be Poisson(λ), and let them be independent. Then X + Y is Poisson(µ + λ).

Convolution:
$$ P(X + Y = n) = \sum_{j=0}^n P(X = j,\, Y = n-j) = \sum_{j=0}^n P(X = j)\,P(Y = n-j) $$
$$ = \sum_{j=0}^n e^{-\mu}\frac{\mu^j}{j!}\, e^{-\lambda}\frac{\lambda^{n-j}}{(n-j)!} = e^{-(\mu+\lambda)}\,\frac{(\mu+\lambda)^n}{n!}, $$

^2 According to my late coauthor, Roko Aliprantis, Apostol's Law states that there are an infinite number of misprints in any book. The proof is that every time you open a book, you find another misprint.
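The convolution identity can be verified term by term (a sketch of my own, with arbitrary µ and λ):

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

mu, lam = 1.3, 2.2
# Convolving Poisson(mu) with Poisson(lam) gives exactly the Poisson(mu+lam) pmf:
for n in range(15):
    conv = sum(poisson_pmf(j, mu) * poisson_pmf(n - j, lam) for j in range(n + 1))
    assert abs(conv - poisson_pmf(n, mu + lam)) < 1e-12
```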

[Figure: table of α-particle counts, reproduced from Cramér [8, p. 436].]

where the last step comes from the binomial theorem:
$$ (\mu + \lambda)^n = \sum_{j=0}^n \frac{n!}{j!\,(n-j)!}\,\mu^j \lambda^{n-j}. $$

11.15 The Poisson Arrival Process

The Poisson arrival process (Pitman [15]: Section 4.2) is a mathematical model that is useful in modeling the number of events (called arrivals) over a continuous time period: for instance, the number of telephone calls per minute, the number of Google queries in a second, the number of radioactive decays in a minute, the number of earthquakes per year, etc. In these phenomena, the events are rare enough to be counted, and to have measurable delays between them. (Interestingly, the Poisson model is not a good description of LAN traffic; see [3, 19].)

The Poisson arrival process with parameter λ works like this: Let $W_1, W_2, \dots$ be a sequence of independent and identically distributed Exponential(λ) random variables, representing waiting times for an arrival, on the sample space (Ω, E, P). At each ω ∈ Ω, the first arrival happens at time $W_1(\omega)$; the second arrival happens a duration $W_2(\omega)$ later, at $W_1(\omega) + W_2(\omega)$; the third arrival happens at $W_1(\omega) + W_2(\omega) + W_3(\omega)$; and so on. Define
$$ T_n = W_1 + W_2 + \dots + W_n. $$
This is the epoch when the n-th event occurs. The sequence $T_n$ of random variables is a nondecreasing sequence.

An alternative description is this: Arrivals are scattered along the interval [0, ∞) so that the numbers of arrivals in disjoint intervals are independent, and the expected number of arrivals in an interval of length t is λt.

For each ω we can associate a step function of time, N(t), defined by
$$ N(t) = \text{the number of arrivals that have occurred by time } t = \text{the number of indices } n \text{ such that } T_n \leq t. $$

11.15.1 Remark: Since the function N depends on ω, I should probably write
$$ N(t, \omega) = \text{the number of indices } n \text{ such that } T_n(\omega) \leq t. $$
But that is not traditional. Something a little better than no mention of ω that you can find, say, in Doob's book [9] is a notation like $N_t(\omega)$.
But most of the time we want to think of N as a random function of time, and putting t in the subscript disguises this.
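The waiting-time construction can be simulated directly, and the simulated counts over [0, t] have mean and variance λt, consistent with N(t) having a Poisson(λt) distribution (my own sketch, not part of the notes):

```python
import random
import statistics

random.seed(4)
lam, t, reps = 3.0, 2.0, 100_000

def count_arrivals(lam, t):
    """N(t): arrivals in [0, t], built from iid Exponential(lam) waiting times."""
    n, clock = 0, random.expovariate(lam)
    while clock <= t:
        n += 1
        clock += random.expovariate(lam)
    return n

counts = [count_arrivals(lam, t) for _ in range(reps)]
# Poisson(lam * t) has mean and variance both equal to lam * t = 6:
assert abs(statistics.fmean(counts) - lam * t) < 0.05
assert abs(statistics.variance(counts) - lam * t) < 0.15
```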

11.15.2 Definition: The random function N is called the Poisson process with parameter λ.

So why is this called a Poisson process? Because N(t) has a Poisson(λt) distribution. And there is nothing special about starting at time 0: the Poisson process looks the same over every time interval. The Poisson process has the property that for any interval of length t, the distribution of the number of arrivals is Poisson(λt).

11.16 Stochastic Processes

A stochastic process is a set $\{X_t : t \in T\}$ of random variables on (Ω, E, P) indexed by time. The time set T might be the natural numbers or the integers (a discrete time process), or an interval of the real line (a continuous time process). Each random variable $X_t$, $t \in T$, is a function on Ω. The value $X_t(\omega)$ depends on both ω and t. Thus another way to view a stochastic process is as a random function on T. In fact, it is not uncommon to write X(t) instead of $X_t$.

The Poisson process is a continuous time process with discrete jumps at exponentially distributed intervals. Other important examples of stochastic processes include the Random Walk and its continuous time version, Brownian motion.

Bibliography

[1] J. D. Aczél. 2006. Lectures on functional equations and their applications. Mineola, NY: Dover. Reprint of the 1966 edition originally published by Academic Press, with an added list of errata and corrigenda. Originally published under the title Vorlesungen über Funktionalgleichungen und ihre Anwendungen by Birkhäuser Verlag, Basel.

[2] T. M. Apostol. Calculus, 2d. ed., volume 1. Waltham, Massachusetts: Blaisdell.

[3] J. Beran, R. P. Sherman, M. S. Taqqu, and W. Willinger. Variable-bit-rate video traffic and long-range dependence. IEEE Transactions on Communications 43(2/3/4).

[4] G. Casella and R. L. Berger. 2002. Statistical inference, 2d. ed. Pacific Grove, California: Wadsworth.

[5] D. G. Catchside, D. E. Lea, and J. M. Thoday. Types of chromosomal structural change induced by the irradiation of Tradescantia microspores. Journal of Genetics 47.

[6] K. L. Chung. Elementary probability theory with stochastic processes. Undergraduate Texts in Mathematics. New York, Heidelberg, and Berlin: Springer-Verlag.

[7] R. D. Clarke. An application of the Poisson distribution. Journal of the Institute of Actuaries 72:481.

[8] H. Cramér. Mathematical methods of statistics. Number 34 in Princeton Mathematical Series. Princeton, New Jersey: Princeton University Press. Reprinted.

[9] J. L. Doob. Stochastic processes. New York: Wiley.

[10] W. Feller. An introduction to probability theory and its applications, 1st. ed., volume 1. New York: Wiley.

[11] W. Feller. An introduction to probability theory and its applications, 3d. ed., volume 1. New York: Wiley.

[12] W. Feller. An introduction to probability theory and its applications, 2d. ed., volume 2. New York: Wiley.

[13] J. M. Keynes. A treatise on probability. London: Macmillan and Co.

[14] R. J. Larsen and M. L. Marx. An introduction to mathematical statistics and its applications, 5th ed. Boston: Prentice Hall.

[15] J. Pitman. Probability. Springer Texts in Statistics. New York, Berlin, and Heidelberg: Springer.

[16] R Core Team. R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing.

[17] F. Thorndike. Applications of Poisson's probability summation. Bell System Technical Journal 5(4).

[18] L. von Bortkiewicz. Das Gesetz der kleinen Zahlen [The law of small numbers]. Leipzig: B. G. Teubner.

[19] W. Willinger, M. S. Taqqu, R. P. Sherman, and D. V. Wilson. Self-similarity through high variability: Statistical analysis of ethernet LAN traffic at the source level (extended version). IEEE/ACM Transactions on Networking 5(1).

[20] Wolfram Research, Inc. 2010. Mathematica 8.0. Champaign, Illinois: Wolfram Research, Inc.


More information

Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of. F s F t

Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of. F s F t 2.2 Filtrations Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of σ algebras {F t } such that F t F and F t F t+1 for all t = 0, 1,.... In continuous time, the second condition

More information

Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators.

Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators. IE 230 Seat # Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators. Score Exam #3a, Spring 2002 Schmeiser Closed book and notes. 60 minutes. 1. True or false. (for each,

More information

DS-GA 1002 Lecture notes 2 Fall Random variables

DS-GA 1002 Lecture notes 2 Fall Random variables DS-GA 12 Lecture notes 2 Fall 216 1 Introduction Random variables Random variables are a fundamental tool in probabilistic modeling. They allow us to model numerical quantities that are uncertain: the

More information

3 Continuous Random Variables

3 Continuous Random Variables Jinguo Lian Math437 Notes January 15, 016 3 Continuous Random Variables Remember that discrete random variables can take only a countable number of possible values. On the other hand, a continuous random

More information

Randomness at the root of things 2: Poisson sequences

Randomness at the root of things 2: Poisson sequences SPECIAL FEATURE: FUZZY PHYSICS www.iop.org/journals/physed Randomness at the root of things 2: Poisson sequences Jon Ogborn 1, Simon Collins 2 and Mick Brown 3 1 Institute of Education, University of London,

More information

Continuous Distributions

Continuous Distributions A normal distribution and other density functions involving exponential forms play the most important role in probability and statistics. They are related in a certain way, as summarized in a diagram later

More information

Lectures prepared by: Elchanan Mossel Yelena Shvets

Lectures prepared by: Elchanan Mossel Yelena Shvets Introduction to probability Stat 134 FAll 2005 Berkeley Lectures prepared by: Elchanan Mossel Yelena Shvets Follows Jim Pitman s book: Probability Section 4.2 Random Times Random times are often described

More information

Math/Stat 352 Lecture 8

Math/Stat 352 Lecture 8 Math/Stat 352 Lecture 8 Sections 4.3 and 4.4 Commonly Used Distributions: Poisson, hypergeometric, geometric, and negative binomial. 1 The Poisson Distribution Poisson random variable counts the number

More information

The exponential distribution and the Poisson process

The exponential distribution and the Poisson process The exponential distribution and the Poisson process 1-1 Exponential Distribution: Basic Facts PDF f(t) = { λe λt, t 0 0, t < 0 CDF Pr{T t) = 0 t λe λu du = 1 e λt (t 0) Mean E[T] = 1 λ Variance Var[T]

More information

Pump failure data. Pump Failures Time

Pump failure data. Pump Failures Time Outline 1. Poisson distribution 2. Tests of hypothesis for a single Poisson mean 3. Comparing multiple Poisson means 4. Likelihood equivalence with exponential model Pump failure data Pump 1 2 3 4 5 Failures

More information

Common probability distributionsi Math 217 Probability and Statistics Prof. D. Joyce, Fall 2014

Common probability distributionsi Math 217 Probability and Statistics Prof. D. Joyce, Fall 2014 Introduction. ommon probability distributionsi Math 7 Probability and Statistics Prof. D. Joyce, Fall 04 I summarize here some of the more common distributions used in probability and statistics. Some

More information

1 Presessional Probability

1 Presessional Probability 1 Presessional Probability Probability theory is essential for the development of mathematical models in finance, because of the randomness nature of price fluctuations in the markets. This presessional

More information

Poisson Distributions

Poisson Distributions Poisson Distributions Engineering Statistics Section 3.6 Josh Engwer TTU 24 February 2016 Josh Engwer (TTU) Poisson Distributions 24 February 2016 1 / 14 Siméon Denis Poisson (1781-1840) Josh Engwer (TTU)

More information

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions

More information

1 Probability and Random Variables

1 Probability and Random Variables 1 Probability and Random Variables The models that you have seen thus far are deterministic models. For any time t, there is a unique solution X(t). On the other hand, stochastic models will result in

More information

Chapter 4 Continuous Random Variables and Probability Distributions

Chapter 4 Continuous Random Variables and Probability Distributions Chapter 4 Continuous Random Variables and Probability Distributions Part 3: The Exponential Distribution and the Poisson process Section 4.8 The Exponential Distribution 1 / 21 Exponential Distribution

More information

n px p x (1 p) n x. p x n(n 1)... (n x + 1) x!

n px p x (1 p) n x. p x n(n 1)... (n x + 1) x! Lectures 3-4 jacques@ucsd.edu 7. Classical discrete distributions D. The Poisson Distribution. If a coin with heads probability p is flipped independently n times, then the number of heads is Bin(n, p)

More information

Poisson Processes. Particles arriving over time at a particle detector. Several ways to describe most common model.

Poisson Processes. Particles arriving over time at a particle detector. Several ways to describe most common model. Poisson Processes Particles arriving over time at a particle detector. Several ways to describe most common model. Approach 1: a) numbers of particles arriving in an interval has Poisson distribution,

More information

Math/Stats 425, Sec. 1, Fall 04: Introduction to Probability. Final Exam: Solutions

Math/Stats 425, Sec. 1, Fall 04: Introduction to Probability. Final Exam: Solutions Math/Stats 45, Sec., Fall 4: Introduction to Probability Final Exam: Solutions. In a game, a contestant is shown two identical envelopes containing money. The contestant does not know how much money is

More information

IEOR 6711: Stochastic Models I, Fall 2003, Professor Whitt. Solutions to Final Exam: Thursday, December 18.

IEOR 6711: Stochastic Models I, Fall 2003, Professor Whitt. Solutions to Final Exam: Thursday, December 18. IEOR 6711: Stochastic Models I, Fall 23, Professor Whitt Solutions to Final Exam: Thursday, December 18. Below are six questions with several parts. Do as much as you can. Show your work. 1. Two-Pump Gas

More information

Midterm Exam 1 Solution

Midterm Exam 1 Solution EECS 126 Probability and Random Processes University of California, Berkeley: Fall 2015 Kannan Ramchandran September 22, 2015 Midterm Exam 1 Solution Last name First name SID Name of student on your left:

More information

Poisson processes Overview. Chapter 10

Poisson processes Overview. Chapter 10 Chapter 1 Poisson processes 1.1 Overview The Binomial distribution and the geometric distribution describe the behavior of two random variables derived from the random mechanism that I have called coin

More information

Probability Midterm Exam 2:15-3:30 pm Thursday, 21 October 1999

Probability Midterm Exam 2:15-3:30 pm Thursday, 21 October 1999 Name: 2:15-3:30 pm Thursday, 21 October 1999 You may use a calculator and your own notes but may not consult your books or neighbors. Please show your work for partial credit, and circle your answers.

More information

MATH 56A: STOCHASTIC PROCESSES CHAPTER 6

MATH 56A: STOCHASTIC PROCESSES CHAPTER 6 MATH 56A: STOCHASTIC PROCESSES CHAPTER 6 6. Renewal Mathematically, renewal refers to a continuous time stochastic process with states,, 2,. N t {,, 2, 3, } so that you only have jumps from x to x + and

More information

Probability Theory and Simulation Methods. April 6th, Lecture 19: Special distributions

Probability Theory and Simulation Methods. April 6th, Lecture 19: Special distributions April 6th, 2018 Lecture 19: Special distributions Week 1 Chapter 1: Axioms of probability Week 2 Chapter 3: Conditional probability and independence Week 4 Chapters 4, 6: Random variables Week 9 Chapter

More information

Summer School in Statistics for Astronomers & Physicists June 5-10, 2005

Summer School in Statistics for Astronomers & Physicists June 5-10, 2005 Summer School in Statistics for Astronomers & Physicists June 5-10, 2005 Session on Statistical Inference for Astronomers Poisson Processes and Gaussian Processes Donald Richards Department of Statistics

More information

Math Spring Practice for the Second Exam.

Math Spring Practice for the Second Exam. Math 4 - Spring 27 - Practice for the Second Exam.. Let X be a random variable and let F X be the distribution function of X: t < t 2 t < 4 F X (t) : + t t < 2 2 2 2 t < 4 t. Find P(X ), P(X ), P(X 2),

More information

errors every 1 hour unless he falls asleep, in which case he just reports the total errors

errors every 1 hour unless he falls asleep, in which case he just reports the total errors I. First Definition of a Poisson Process A. Definition: Poisson process A Poisson Process {X(t), t 0} with intensity λ > 0 is a counting process with the following properties. INDEPENDENT INCREMENTS. For

More information

Lecture Notes 2 Random Variables. Random Variable

Lecture Notes 2 Random Variables. Random Variable Lecture Notes 2 Random Variables Definition Discrete Random Variables: Probability mass function (pmf) Continuous Random Variables: Probability density function (pdf) Mean and Variance Cumulative Distribution

More information

STAT Chapter 5 Continuous Distributions

STAT Chapter 5 Continuous Distributions STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range

More information

Probability Models. 4. What is the definition of the expectation of a discrete random variable?

Probability Models. 4. What is the definition of the expectation of a discrete random variable? 1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions

More information

Chapter 8: An Introduction to Probability and Statistics

Chapter 8: An Introduction to Probability and Statistics Course S3, 200 07 Chapter 8: An Introduction to Probability and Statistics This material is covered in the book: Erwin Kreyszig, Advanced Engineering Mathematics (9th edition) Chapter 24 (not including

More information

S n = x + X 1 + X X n.

S n = x + X 1 + X X n. 0 Lecture 0 0. Gambler Ruin Problem Let X be a payoff if a coin toss game such that P(X = ) = P(X = ) = /2. Suppose you start with x dollars and play the game n times. Let X,X 2,...,X n be payoffs in each

More information

Basics of Stochastic Modeling: Part II

Basics of Stochastic Modeling: Part II Basics of Stochastic Modeling: Part II Continuous Random Variables 1 Sandip Chakraborty Department of Computer Science and Engineering, INDIAN INSTITUTE OF TECHNOLOGY KHARAGPUR August 10, 2016 1 Reference

More information

STAT 430/510 Probability Lecture 12: Central Limit Theorem and Exponential Distribution

STAT 430/510 Probability Lecture 12: Central Limit Theorem and Exponential Distribution STAT 430/510 Probability Lecture 12: Central Limit Theorem and Exponential Distribution Pengyuan (Penelope) Wang June 15, 2011 Review Discussed Uniform Distribution and Normal Distribution Normal Approximation

More information

ST 371 (V): Families of Discrete Distributions

ST 371 (V): Families of Discrete Distributions ST 371 (V): Families of Discrete Distributions Certain experiments and associated random variables can be grouped into families, where all random variables in the family share a certain structure and a

More information

ASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata

ASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata ASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA (krzysio@krzysio.net) Errata Effective July 5, 3, only the latest edition of this manual will have its errata

More information

3. Poisson Processes (12/09/12, see Adult and Baby Ross)

3. Poisson Processes (12/09/12, see Adult and Baby Ross) 3. Poisson Processes (12/09/12, see Adult and Baby Ross) Exponential Distribution Poisson Processes Poisson and Exponential Relationship Generalizations 1 Exponential Distribution Definition: The continuous

More information

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1).

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1). Name M362K Final Exam Instructions: Show all of your work. You do not have to simplify your answers. No calculators allowed. There is a table of formulae on the last page. 1. Suppose X 1,..., X 1 are independent

More information

Common Discrete Distributions

Common Discrete Distributions Common Discrete Distributions Statistics 104 Autumn 2004 Taken from Statistics 110 Lecture Notes Copyright c 2004 by Mark E. Irwin Common Discrete Distributions There are a wide range of popular discrete

More information

1 Delayed Renewal Processes: Exploiting Laplace Transforms

1 Delayed Renewal Processes: Exploiting Laplace Transforms IEOR 6711: Stochastic Models I Professor Whitt, Tuesday, October 22, 213 Renewal Theory: Proof of Blackwell s theorem 1 Delayed Renewal Processes: Exploiting Laplace Transforms The proof of Blackwell s

More information

Exponential & Gamma Distributions

Exponential & Gamma Distributions Exponential & Gamma Distributions Engineering Statistics Section 4.4 Josh Engwer TTU 7 March 26 Josh Engwer (TTU) Exponential & Gamma Distributions 7 March 26 / 2 PART I PART I: EXPONENTIAL DISTRIBUTION

More information

Introduction. Probability and distributions

Introduction. Probability and distributions Introduction. Probability and distributions Joe Felsenstein Genome 560, Spring 2011 Introduction. Probability and distributions p.1/18 Probabilities We will assume you know that Probabilities of mutually

More information

In a five-minute period, you get a certain number m of requests. Each needs to be served from one of your n servers.

In a five-minute period, you get a certain number m of requests. Each needs to be served from one of your n servers. Suppose you are a content delivery network. In a five-minute period, you get a certain number m of requests. Each needs to be served from one of your n servers. How to distribute requests to balance the

More information

Probability Distributions Columns (a) through (d)

Probability Distributions Columns (a) through (d) Discrete Probability Distributions Columns (a) through (d) Probability Mass Distribution Description Notes Notation or Density Function --------------------(PMF or PDF)-------------------- (a) (b) (c)

More information

BMIR Lecture Series on Probability and Statistics Fall, 2015 Uniform Distribution

BMIR Lecture Series on Probability and Statistics Fall, 2015 Uniform Distribution Lecture #5 BMIR Lecture Series on Probability and Statistics Fall, 2015 Department of Biomedical Engineering and Environmental Sciences National Tsing Hua University s 5.1 Definition ( ) A continuous random

More information

PROBABILITY DISTRIBUTIONS

PROBABILITY DISTRIBUTIONS Review of PROBABILITY DISTRIBUTIONS Hideaki Shimazaki, Ph.D. http://goo.gl/visng Poisson process 1 Probability distribution Probability that a (continuous) random variable X is in (x,x+dx). ( ) P x < X

More information

The story of the film so far... Mathematics for Informatics 4a. Continuous-time Markov processes. Counting processes

The story of the film so far... Mathematics for Informatics 4a. Continuous-time Markov processes. Counting processes The story of the film so far... Mathematics for Informatics 4a José Figueroa-O Farrill Lecture 19 28 March 2012 We have been studying stochastic processes; i.e., systems whose time evolution has an element

More information

Random Experiments; Probability Spaces; Random Variables; Independence

Random Experiments; Probability Spaces; Random Variables; Independence Department of Mathematics Ma 3/103 KC Border Introduction to Probability and Statistics Winter 2017 Lecture 2: Random Experiments; Probability Spaces; Random Variables; Independence Relevant textbook passages:

More information

Lecture 22: A Review of Linear Algebra and an Introduction to The Multivariate Normal Distribution

Lecture 22: A Review of Linear Algebra and an Introduction to The Multivariate Normal Distribution Department of Mathematics Ma 3/103 KC Border Introduction to Probability and Statistics Winter 2017 Lecture 22: A Review of Linear Algebra and an Introduction to The Multivariate Normal Distribution Relevant

More information

(b) What is the variance of the time until the second customer arrives, starting empty, assuming that we measure time in minutes?

(b) What is the variance of the time until the second customer arrives, starting empty, assuming that we measure time in minutes? IEOR 3106: Introduction to Operations Research: Stochastic Models Fall 2006, Professor Whitt SOLUTIONS to Final Exam Chapters 4-7 and 10 in Ross, Tuesday, December 19, 4:10pm-7:00pm Open Book: but only

More information

Performance Modelling of Computer Systems

Performance Modelling of Computer Systems Performance Modelling of Computer Systems Mirco Tribastone Institut für Informatik Ludwig-Maximilians-Universität München Fundamentals of Queueing Theory Tribastone (IFI LMU) Performance Modelling of Computer

More information

Chapter 4: Continuous Probability Distributions

Chapter 4: Continuous Probability Distributions Chapter 4: Continuous Probability Distributions Seungchul Baek Department of Statistics, University of South Carolina STAT 509: Statistics for Engineers 1 / 57 Continuous Random Variable A continuous random

More information

Stat410 Probability and Statistics II (F16)

Stat410 Probability and Statistics II (F16) Stat4 Probability and Statistics II (F6 Exponential, Poisson and Gamma Suppose on average every /λ hours, a Stochastic train arrives at the Random station. Further we assume the waiting time between two

More information

A Probability Primer. A random walk down a probabilistic path leading to some stochastic thoughts on chance events and uncertain outcomes.

A Probability Primer. A random walk down a probabilistic path leading to some stochastic thoughts on chance events and uncertain outcomes. A Probability Primer A random walk down a probabilistic path leading to some stochastic thoughts on chance events and uncertain outcomes. Are you holding all the cards?? Random Events A random event, E,

More information

Scientific Measurement

Scientific Measurement Scientific Measurement SPA-4103 Dr Alston J Misquitta Lecture 5 - The Binomial Probability Distribution Binomial Distribution Probability of k successes in n trials. Gaussian/Normal Distribution Poisson

More information

Modeling Random Experiments

Modeling Random Experiments Department of Mathematics Ma 3/103 KC Border Introduction to Probability and Statistics Winter 2018 Lecture 2: Modeling Random Experiments Relevant textbook passages: Pitman [4]: Sections 1.3 1.4., pp.

More information

Special distributions

Special distributions Special distributions August 22, 2017 STAT 101 Class 4 Slide 1 Outline of Topics 1 Motivation 2 Bernoulli and binomial 3 Poisson 4 Uniform 5 Exponential 6 Normal STAT 101 Class 4 Slide 2 What distributions

More information

Q = (c) Assuming that Ricoh has been working continuously for 7 days, what is the probability that it will remain working at least 8 more days?

Q = (c) Assuming that Ricoh has been working continuously for 7 days, what is the probability that it will remain working at least 8 more days? IEOR 4106: Introduction to Operations Research: Stochastic Models Spring 2005, Professor Whitt, Second Midterm Exam Chapters 5-6 in Ross, Thursday, March 31, 11:00am-1:00pm Open Book: but only the Ross

More information

Continuous Random Variables and Continuous Distributions

Continuous Random Variables and Continuous Distributions Continuous Random Variables and Continuous Distributions Continuous Random Variables and Continuous Distributions Expectation & Variance of Continuous Random Variables ( 5.2) The Uniform Random Variable

More information

MATH : EXAM 2 INFO/LOGISTICS/ADVICE

MATH : EXAM 2 INFO/LOGISTICS/ADVICE MATH 3342-004: EXAM 2 INFO/LOGISTICS/ADVICE INFO: WHEN: Friday (03/11) at 10:00am DURATION: 50 mins PROBLEM COUNT: Appropriate for a 50-min exam BONUS COUNT: At least one TOPICS CANDIDATE FOR THE EXAM:

More information

MATH2715: Statistical Methods

MATH2715: Statistical Methods MATH2715: Statistical Methods Exercises IV (based on lectures 7-8, work week 5, hand in lecture Mon 30 Oct) ALL questions count towards the continuous assessment for this module. Q1. If a random variable

More information

Continuous Distributions

Continuous Distributions Chapter 5 Continuous Distributions 5.1 Density and Distribution Functions In many situations random variables can take any value on the real line or in a certain subset of the real line. For concrete examples,

More information

ASM Study Manual for Exam P, Second Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata

ASM Study Manual for Exam P, Second Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata ASM Study Manual for Exam P, Second Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA (krzysio@krzysio.net) Errata Effective July 5, 3, only the latest edition of this manual will have its errata

More information

Probability: Why do we care? Lecture 2: Probability and Distributions. Classical Definition. What is Probability?

Probability: Why do we care? Lecture 2: Probability and Distributions. Classical Definition. What is Probability? Probability: Why do we care? Lecture 2: Probability and Distributions Sandy Eckel seckel@jhsph.edu 22 April 2008 Probability helps us by: Allowing us to translate scientific questions into mathematical

More information

(y 1, y 2 ) = 12 y3 1e y 1 y 2 /2, y 1 > 0, y 2 > 0 0, otherwise.

(y 1, y 2 ) = 12 y3 1e y 1 y 2 /2, y 1 > 0, y 2 > 0 0, otherwise. 54 We are given the marginal pdfs of Y and Y You should note that Y gamma(4, Y exponential( E(Y = 4, V (Y = 4, E(Y =, and V (Y = 4 (a With U = Y Y, we have E(U = E(Y Y = E(Y E(Y = 4 = (b Because Y and

More information