
Expectations

Definition. Let X be a discrete rv with set of possible values D and pmf p(x). The expected value or mean value of X, denoted by E(X) or µ_X, is

E(X) = µ_X = Σ_{x ∈ D} x · p(x)
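The defining sum is easy to compute directly once the pmf is tabulated. A minimal sketch in Python (the pmf values below are illustrative, not from the text):

```python
# Expected value of a discrete rv: E(X) = sum over D of x * p(x).
# Illustrative pmf (not from the text): X takes values 1, 2, 3.
pmf = {1: 0.2, 2: 0.5, 3: 0.3}

assert abs(sum(pmf.values()) - 1.0) < 1e-12  # a valid pmf sums to 1

mean = sum(x * p for x, p in pmf.items())
print(mean)  # E(X) = 1*0.2 + 2*0.5 + 3*0.3 = 2.1
```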

Proposition. If the rv X has a set of possible values D and pmf p(x), then the expected value of any function h(X), denoted by E[h(X)] or µ_{h(X)}, is computed by

E[h(X)] = Σ_{x ∈ D} h(x) · p(x)

Proposition. E(aX + b) = a · E(X) + b. (Or, using alternative notation, µ_{aX+b} = a · µ_X + b.)

e.g. for the previous example, E[h(X)] = E(800X − 900) = 800 · E(X) − 900 = 700

Corollary.
1. For any constant a, E(aX) = a · E(X).
2. For any constant b, E(X + b) = E(X) + b.

Definition. Let X have pmf p(x) and expected value µ. Then the variance of X, denoted by V(X) or σ_X², or just σ², is

V(X) = Σ_{x ∈ D} (x − µ)² · p(x) = E[(X − µ)²]

The standard deviation (SD) of X is σ_X = √(σ_X²).

Proposition. V(X) = σ² = [Σ_{x ∈ D} x² · p(x)] − µ² = E(X²) − [E(X)]²
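The shortcut formula can be checked numerically against the definition. A small sketch with an illustrative pmf (not from the text):

```python
# Check V(X) = E[(X - mu)^2] against the shortcut E(X^2) - [E(X)]^2
# for an illustrative pmf (values not from the text).
pmf = {0: 0.25, 1: 0.5, 2: 0.25}

mu = sum(x * p for x, p in pmf.items())                   # E(X)
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())  # definition
var_short = sum(x * x * p for x, p in pmf.items()) - mu ** 2  # shortcut

assert abs(var_def - var_short) < 1e-12
print(mu, var_def)  # 1.0 0.5
```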

Proposition. If h(X) is a function of a rv X, then

V[h(X)] = σ²_{h(X)} = Σ_{x ∈ D} {h(x) − E[h(X)]}² · p(x) = E[h(X)²] − {E[h(X)]}²

If h(X) is linear, i.e. h(x) = ax + b for some nonrandom constants a and b, then

V(aX + b) = σ²_{aX+b} = a² σ_X²  and  σ_{aX+b} = |a| σ_X

In particular, σ_{aX} = |a| σ_X and σ_{X+b} = σ_X.

Binomial Distribution

Definition. The binomial random variable X associated with a binomial experiment consisting of n trials is defined as

X = the number of S's among the n trials

Possible values for X in an n-trial experiment are x = 0, 1, 2, ..., n.

Notation. We use X ~ Bin(n, p) to indicate that X is a binomial rv based on n trials with success probability p. We use b(x; n, p) to denote the pmf of X, and B(x; n, p) to denote the cdf of X, where

B(x; n, p) = P(X ≤ x) = Σ_{y=0}^{x} b(y; n, p)

Theorem.

b(x; n, p) = C(n, x) p^x (1 − p)^{n−x}  for x = 0, 1, 2, ..., n, and 0 otherwise

where C(n, x) denotes the binomial coefficient "n choose x".

Theorem (Mean and Variance). If X ~ Bin(n, p), then E(X) = np, V(X) = np(1 − p) = npq, and σ_X = √(npq), where q = 1 − p.
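The pmf formula and the mean/variance theorem can be verified together with a short stdlib-only sketch (the values n = 10, p = 0.3 are illustrative, not from the text):

```python
from math import comb, sqrt

def binom_pmf(x, n, p):
    """b(x; n, p) = C(n, x) p^x (1-p)^(n-x)."""
    return comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 10, 0.3  # illustrative values, not from the text
pmf = [binom_pmf(x, n, p) for x in range(n + 1)]

assert abs(sum(pmf) - 1.0) < 1e-12            # pmf sums to 1
mean = sum(x * q for x, q in enumerate(pmf))
var = sum(x * x * q for x, q in enumerate(pmf)) - mean**2
assert abs(mean - n * p) < 1e-9               # E(X) = np
assert abs(var - n * p * (1 - p)) < 1e-9      # V(X) = np(1-p)
print(mean, var, sqrt(var))
```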

Hypergeometric Distribution

Assume we are drawing cards from a well-shuffled deck with replacement, one card per draw. We do this 5 times and record whether or not each outcome is a heart. Then this is a binomial experiment.

If we do the same thing without replacement, then it is NO LONGER a binomial experiment.

However, if we are drawing from 100 decks of cards without replacement and record only the first 5 outcomes, then it is approximately a binomial experiment.

What is the exact model for drawing cards without replacement?

1. The population or set to be sampled consists of N individuals, objects, or elements (a finite population).

2. Each individual can be characterized as a success (S) or a failure (F), and there are M successes in the population.

3. A sample of n individuals is selected without replacement in such a way that each subset of size n is equally likely to be chosen.

Definition. For any experiment which satisfies the above 3 conditions, let X = the number of S's in the sample. Then X is a hypergeometric random variable and we use h(x; n, M, N) to denote the pmf p(x) = P(X = x).

Example. In the second card-drawing example (without replacement, 52 cards in total), if we let X = the number of hearts in the first 5 draws, then X is a hypergeometric random variable with n = 5, M = 13 and N = 52.

For the pmf, the probability of getting exactly x hearts (x = 0, 1, 2, 3, 4, or 5) is calculated as follows:

p(x) = P(X = x) = [C(13, x) · C(39, 5 − x)] / C(52, 5)

where C(13, x) is the number of ways to get x hearts, C(39, 5 − x) is the number of ways to get the remaining 5 − x non-hearts, and C(52, 5) is the total number of ways to select 5 cards from 52.
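The card pmf above can be computed directly with `math.comb`, a stdlib-only sketch:

```python
from math import comb

def hyper_pmf(x, n, M, N):
    """h(x; n, M, N) = C(M, x) C(N-M, n-x) / C(N, n)."""
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

# X = number of hearts in 5 draws without replacement: n = 5, M = 13, N = 52.
probs = [hyper_pmf(x, 5, 13, 52) for x in range(6)]

assert abs(sum(probs) - 1.0) < 1e-12  # the six cases exhaust all outcomes
print(probs[0])  # P(X = 0) = C(39, 5) / C(52, 5)
```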

Example. For the same experiment (without replacement, 52 cards in total), if we let X = the number of hearts in the first 20 draws, then X is still a hypergeometric random variable, but with n = 20, M = 13 and N = 52.

However, in this case, the possible values for X are 0, 1, 2, ..., 13 and the pmf is

p(x) = P(X = x) = [C(13, x) · C(39, 20 − x)] / C(52, 20)

where 0 ≤ x ≤ 13.

Proposition. If X is the number of S's in a completely random sample of size n drawn from a population consisting of M S's and (N − M) F's, then the probability distribution of X, called the hypergeometric distribution, is given by

P(X = x) = h(x; n, M, N) = [C(M, x) · C(N − M, n − x)] / C(N, n)

for x an integer satisfying max(0, n − N + M) ≤ x ≤ min(n, M).

Remark: If n ≤ M, then the largest possible x is n; however, if n > M, then the largest possible x is M. Therefore we require x ≤ min(n, M). Similarly, if n ≤ N − M, then the smallest possible x is 0; however, if n > N − M, then the smallest possible x is n − (N − M). Thus x ≥ max(0, n − N + M).

Example (Problem 70). An instructor who taught two sections of engineering statistics last term, the first with 20 students and the second with 30, decided to assign a term project. After all projects had been turned in, the instructor randomly ordered them before grading. Consider the first 15 graded projects.

a. What is the probability that exactly 10 of these are from the second section?

b. What is the probability that at least 10 of these are from the second section?

c. What is the probability that at least 10 of these are from the same section?
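A sketch of how the three parts can be computed, letting X be the number of the 15 graded projects from the second section, so X is hypergeometric with n = 15, M = 30, N = 50:

```python
from math import comb

def hyper_pmf(x, n, M, N):
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

# X = number of the 15 graded projects from the second section.
n, M, N = 15, 30, 50

a = hyper_pmf(10, n, M, N)                             # P(X = 10)
b = sum(hyper_pmf(x, n, M, N) for x in range(10, 16))  # P(X >= 10)
# "At least 10 from the same section" means X >= 10 (second section)
# or 15 - X >= 10, i.e. X <= 5 (first section); the events are disjoint.
c = b + sum(hyper_pmf(x, n, M, N) for x in range(0, 6))
print(a, b, c)
```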

Proposition. The mean and variance of the hypergeometric rv X having pmf h(x; n, M, N) are

E(X) = n · (M/N)

V(X) = ((N − n)/(N − 1)) · n · (M/N) · (1 − M/N)

Remark: The ratio M/N is the proportion of S's in the population. If we replace M/N by p, then we get E(X) = np and V(X) = ((N − n)/(N − 1)) · np(1 − p).

Recall that the mean and variance of a binomial rv are np and np(1 − p). We see that the means of the binomial and hypergeometric rv's are equal, while the variances differ by the factor (N − n)/(N − 1).
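The finite population correction (N − n)/(N − 1) can be checked numerically for the 5-card example, a stdlib-only sketch:

```python
from math import comb

# Compare the hypergeometric variance with the binomial variance np(1-p):
# they differ by the finite population correction (N - n)/(N - 1).
n, M, N = 5, 13, 52
p = M / N

pmf = [comb(M, x) * comb(N - M, n - x) / comb(N, n) for x in range(n + 1)]
mean = sum(x * q for x, q in enumerate(pmf))
var = sum(x * x * q for x, q in enumerate(pmf)) - mean**2

assert abs(mean - n * p) < 1e-9                               # same mean np
assert abs(var - (N - n) / (N - 1) * n * p * (1 - p)) < 1e-9  # corrected variance
print(var, n * p * (1 - p))  # hypergeometric variance is the smaller one
```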

Example (Problem 70), continued. In the same setting as above:

d. What are the mean value and standard deviation of the number of projects among these 15 that are from the second section?

e. What are the mean value and standard deviation of the number of projects not among these 15 that are from the second section?

Negative Binomial Distribution

Consider the card-drawing example again. This time, we still draw cards from a well-shuffled deck with replacement, one card per draw. However, we keep drawing until we get 5 hearts. Let X = the number of draws which do not give us a heart. Then X is NO LONGER a binomial random variable, but a negative binomial random variable.

1. The experiment consists of a sequence of independent trials.

2. Each trial can result in either a success (S) or a failure (F).

3. The probability of success is constant from trial to trial, so P(S on trial i) = p for i = 1, 2, 3, ....

4. The experiment continues (trials are performed) until a total of r successes have been observed, where r is a specified positive integer.

Definition. For any experiment which satisfies the above 4 conditions, let X = the number of failures that precede the r-th success. Then X is a negative binomial random variable and we use nb(x; r, p) to denote the pmf p(x) = P(X = x).

Remark:
1. In some sources, the negative binomial rv is taken to be the number of trials X + r rather than the number of failures.

2. If r = 1, we call X a geometric random variable. The pmf for X is then the familiar one:

nb(x; 1, p) = (1 − p)^x · p,  x = 0, 1, 2, ...

Proposition. The pmf of the negative binomial rv X with parameters r = number of S's and p = P(S) is

nb(x; r, p) = C(x + r − 1, r − 1) · p^r · (1 − p)^x,  x = 0, 1, 2, ...

The mean and variance of X are E(X) = r(1 − p)/p and V(X) = r(1 − p)/p², respectively.
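A sketch checking the pmf against the stated mean and variance, taking r = 5 and p = 0.25 (e.g. waiting for the fifth heart when drawing with replacement, p = 13/52); since X is unbounded, the sums are truncated far into the tail:

```python
from math import comb

def nb_pmf(x, r, p):
    """nb(x; r, p) = C(x+r-1, r-1) p^r (1-p)^x."""
    return comb(x + r - 1, r - 1) * p**r * (1 - p) ** x

r, p = 5, 0.25  # fifth success, P(S) = 0.25
# Truncate at x = 500; the tail mass beyond that is negligible.
mean = sum(x * nb_pmf(x, r, p) for x in range(500))
var = sum(x * x * nb_pmf(x, r, p) for x in range(500)) - mean**2

assert abs(mean - r * (1 - p) / p) < 1e-6     # E(X) = r(1-p)/p = 15
assert abs(var - r * (1 - p) / p**2) < 1e-6   # V(X) = r(1-p)/p^2 = 60
print(mean, var)
```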

Example (Problem 78). Individual A has a red die and B has a green die (both fair). If they each roll until they obtain five doubles (1-1, 2-2, ..., 6-6), what is the pmf of X = the total number of times a die is rolled? What are E(X) and V(X)?

Poisson Distribution

Consider the following random variables:

1. The number of people arriving for treatment at an emergency room in each hour.

2. The number of drivers who travel between Salt Lake City and Sandy during each day.

3. The number of trees in each square mile of a forest.

None of them is a binomial, hypergeometric or negative binomial random variable. In fact, the experiments associated with the above random variables DO NOT involve trials. We use the Poisson distribution to model the occurrence of events of some type over time or area.

Definition. A random variable X is said to have a Poisson distribution with parameter λ (λ > 0) if the pmf of X is

p(x; λ) = e^{−λ} λ^x / x!,  x = 0, 1, 2, ...

1. The value λ is frequently a rate per unit time or per unit area.

2. e is the base of the natural logarithm system.

3. It is guaranteed that Σ_{x=0}^{∞} p(x; λ) = 1, since

e^λ = 1 + λ + λ²/2! + λ³/3! + ··· = Σ_{x=0}^{∞} λ^x / x!
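The normalization guaranteed by the exponential series can be confirmed numerically, a stdlib-only sketch:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """p(x; lambda) = e^{-lambda} lambda^x / x!."""
    return exp(-lam) * lam**x / factorial(x)

lam = 1.5  # illustrative rate per unit time or area
# The series e^lambda = sum of lambda^x / x! makes the pmf sum to 1;
# truncating at x = 100 leaves a negligible tail for lambda = 1.5.
total = sum(poisson_pmf(x, lam) for x in range(100))
assert abs(total - 1.0) < 1e-12
print(total)
```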

Example. The red blood cell (RBC) density in blood is estimated by means of a hemocytometer. A blood sample is thoroughly mixed with a saline solution, and then pipetted onto a slide. The RBCs are counted under a microscope through a square grid. Because the solution is thoroughly mixed, the RBCs have an equal chance of being in any particular square of the grid. It is known that the number of cells counted in a given square follows a Poisson distribution, and the parameter λ for a certain blood sample is believed to be 1.5.

Then what is the probability that there is no RBC in a given square? What is the probability of a square containing exactly 2 RBCs? What is the probability of a square containing at most 2 RBCs?
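The three questions reduce to evaluating p(x; 1.5) at a few points:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    return exp(-lam) * lam**x / factorial(x)

lam = 1.5  # expected RBCs per square

p0 = poisson_pmf(0, lam)                                   # no RBC:    ~0.223
p2 = poisson_pmf(2, lam)                                   # exactly 2: ~0.251
p_at_most_2 = sum(poisson_pmf(x, lam) for x in range(3))   # at most 2: ~0.809
print(p0, p2, p_at_most_2)
```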

Proposition. If X has a Poisson distribution with parameter λ, then E(X) = V(X) = λ.

We see that the parameter λ equals both the mean and the variance of the Poisson random variable X.

e.g. for the previous example, the expected number of RBCs per square is thus 1.5 and the variance is also 1.5.

In practice, the parameter is usually unknown to us. However, we can use the sample mean to estimate it. For example, if we observed 15 RBCs over 10 squares, then we can use x̄ = 15/10 = 1.5 to estimate λ.

Poisson Process: the occurrence of events over time.

1. There exists a parameter α > 0 such that for any short time interval of length Δt, the probability that exactly one event occurs is α·Δt + o(Δt).

2. The probability of more than one event occurring during Δt is o(Δt) [which, along with Assumption 1, implies that the probability of no events during Δt is 1 − α·Δt − o(Δt)].

3. The number of events occurring during the time interval Δt is independent of the number that occurred prior to this time interval.


Poisson Distribution

Proposition: Let P_k(t) denote the probability that k events will be observed during any particular time interval of length t. Then

P_k(t) = e^{−αt} (αt)^k / k!

In words, the number of events during a time interval of length t is a Poisson rv with parameter λ = αt. The expected number of events during any such time interval is then αt, so the expected number during a unit interval of time is α.
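The proposition translates directly into code; the rate α = 10 events per hour and the half-hour interval below are made-up illustrative numbers, not from the text:

```python
import math

def P_k(k, alpha, t):
    """Probability of observing k events in an interval of length t for a
    Poisson process with rate alpha: e^{-alpha*t} (alpha*t)^k / k!."""
    lam = alpha * t
    return math.exp(-lam) * lam ** k / math.factorial(k)

# e.g. with alpha = 10 events/hour over half an hour, lam = alpha * t = 5:
print(P_k(5, alpha=10, t=0.5))  # about 0.1755
```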


Poisson Distribution

Example (Problem 92): Automobiles arrive at a vehicle equipment inspection station according to a Poisson process with rate α = 10 per hour. Suppose that with probability 0.5 an arriving vehicle will have no equipment violations.

a. What is the probability that exactly ten arrive during the hour and all ten have no violations?

b. For any fixed y ≥ 10, what is the probability that y arrive during the hour, of which ten have no violations?

c. What is the probability that ten no-violation cars arrive during the next 45 minutes?
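The three parts can be computed numerically. The sketch below assumes the standard thinning property of Poisson processes for part c (no-violation cars themselves form a Poisson process with rate 0.5α = 5 per hour); the choice y = 12 in part b is just an illustration:

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

alpha = 10.0   # arrivals per hour
p_ok = 0.5     # probability an arrival has no violations

# a. exactly 10 arrive during the hour AND all 10 are violation-free
a = poisson_pmf(10, alpha) * p_ok ** 10

# b. y arrive, of which exactly 10 are violation-free (binomial thinning
#    of the y arrivals); shown here for the illustrative value y = 12
y = 12
b = poisson_pmf(y, alpha) * math.comb(y, 10) * p_ok ** 10 * (1 - p_ok) ** (y - 10)

# c. ten no-violation cars in 45 minutes: thinned rate 0.5 * alpha = 5/hour,
#    so lam = 5 * 0.75 = 3.75
c = poisson_pmf(10, 0.5 * alpha * 0.75)

print(a, b, c)  # roughly 0.000122, 0.00153, 0.00356
```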


Poisson Distribution

In some sense, the Poisson distribution can be recognized as the limit of a binomial experiment.

Proposition: Suppose that in the binomial pmf b(x; n, p), we let n → ∞ and p → 0 in such a way that np approaches a value λ > 0. Then b(x; n, p) → p(x; λ).

This tells us that in any binomial experiment in which n is large and p is small, b(x; n, p) ≈ p(x; λ), where λ = np. As a rule of thumb, this approximation can safely be applied if n > 50 and np < 5.
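The quality of the approximation is easy to check numerically; n = 100 and p = 0.02 below are arbitrary values satisfying the rule of thumb (n > 50, np = 2 < 5):

```python
import math

def binom_pmf(x, n, p):
    """Binomial pmf b(x; n, p) = C(n, x) p^x (1-p)^(n-x)."""
    return math.comb(n, x) * p ** x * (1 - p) ** (n - x)

def poisson_pmf(x, lam):
    """Poisson pmf p(x; lam) = e^{-lam} lam^x / x!."""
    return math.exp(-lam) * lam ** x / math.factorial(x)

n, p = 100, 0.02
lam = n * p  # = 2
for x in range(5):
    print(x, binom_pmf(x, n, p), poisson_pmf(x, lam))
# the two columns agree to about two decimal places
```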


Poisson Distribution

Example 3.40: A publisher of nontechnical books takes great pains to ensure that its books are free of typographical errors, so that the probability of any given page containing at least one such error is 0.005 and errors are independent from page to page. What is the probability that one of its 400-page novels will contain exactly one page with errors?

Let S denote a page containing at least one error, F denote an error-free page, and X denote the number of pages containing at least one error. Then X is a binomial rv, and

P(X = 1) = b(1; 400, 0.005) ≈ p(1; 400 × 0.005) = p(1; 2) = e^{−2} (2)^1 / 1! = 0.270671
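Both the exact binomial value and its Poisson approximation can be computed directly (a sketch of the calculation above):

```python
import math

n, p = 400, 0.005
exact = math.comb(n, 1) * p * (1 - p) ** (n - 1)    # b(1; 400, 0.005)
approx = math.exp(-2) * 2 ** 1 / math.factorial(1)  # p(1; 2)
print(exact, approx)  # both approximately 0.2707
```

The two values differ only in the sixth decimal place, as expected since n = 400 > 50 and np = 2 < 5.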


Poisson Distribution

A proof that b(x; n, p) → p(x; λ) as n → ∞ and p → 0 with np → λ:

lim_{n→∞} b(x; n, p) = lim_{n→∞} [n! / (x!(n−x)!)] p^x (1−p)^{n−x}
= lim_{n→∞} [n(n−1)···(n−x+1) / x!] p^x (1−p)^{n−x}
= lim_{n→∞} [(np)[(n−1)p]···[(n−x+1)p] / x!] (1−p)^{n−x}
= (λ^x / x!) lim_{n→∞} (1−p)^{n−x}

and, since np → λ,

lim_{n→∞} (1−p)^{n−x} = lim_{n→∞} {1 − np/n}^{n−x} = lim_{n→∞} {1 − λ/n}^{n−x} = e^{−λ}.

Hence b(x; n, p) → e^{−λ} λ^x / x! = p(x; λ).