Lecture 2. Binomial and Poisson Probability Distributions

1) Bernoulli Distribution or Binomial Distribution:

Consider a situation where there are only two possible outcomes (these are called Bernoulli trials).
Examples:
- flipping a coin: we either get a head or a tail
- rolling a die: we either get a six or we do not get a six (we get 1, 2, 3, 4, or 5)

Label the possible outcomes by the variable k. We want to find P(k), the probability for event k to occur. Since for this distribution k can take on only 2 values, we define those values as k = 0 or k = 1.

Let P(k = 0) = q (remember 0 ≤ q ≤ 1). Then we have P(k = 1) = p = 1 − q, since P(k = 0) + P(k = 1) = 1 (something must happen).

We can write the probability distribution P(k) as:

    P(k) = p^k q^(1−k)    (this is a Bernoulli distribution)

Note: for the coin toss we could define P(1) = probability for a head and assign its probability to be P(1) = 0.5. For the die, define P(1) = probability for a six to be rolled and assign P(1) = 1/6.

What is the mean (μ) of P(k)?

    μ = [Σ_{k=0}^{1} k P(k)] / [Σ_{k=0}^{1} P(k)] = (0·q + 1·p)/(q + p) = p

What is the variance (σ²) of P(k)?

    σ² = Σ_{k=0}^{1} k² P(k) − μ² = p − p² = (1 − q) − (1 − q)² = q(1 − q) = qp

Suppose we have N trials (e.g. we flip a coin N times). What is the probability to get m successes (= heads)?

First consider tossing a coin twice. List all possible outcomes from tossing two coins:

    m = 0 (no heads):  P(m = 0) = q²
    m = 1 (one head):  P(m = 1) = qp + pq  (toss 1 is a tail and toss 2 is a head, or toss 1 is a head and toss 2 is a tail)
    m = 2 (two heads): P(m = 2) = p²

Note that for the case where m = 1 we don't care which of the tosses is a head, so there are two outcomes that give us one head. We can write P(m = 1) as P(m = 1) = 2pq.

We want the probability distribution function P(m, N, p) where:

    p = probability for a success (e.g. 0.5 for a head in a coin toss)
    m = number of successes (number of heads)
    N = number of trials (e.g. coin tosses)

If we look at the three choices for the coin flip example, each term (m = 0, 1, 2) is of the form:

    C_m p^m q^(N−m)    (N = 2 for our example; q = 1 − p always!)

The coefficient C_m takes into account the number of ways an outcome can occur. For m = 0 or m = 2 there is only one way for the outcome to arise (both tosses give heads or both give tails), thus C_0 = C_2 = 1. For m = 1 (one head, two tosses) there are two ways this can occur, thus C_1 = 2.
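
As a quick numerical check, here is a minimal Python sketch (using p = 0.5 from the coin example above) that computes the Bernoulli mean and variance directly from P(k) = p^k q^(1−k) and brute-force enumerates the two-toss outcomes:

    from itertools import product

    p = 0.5                      # probability of a head (success); coin example
    q = 1.0 - p                  # probability of a tail

    def bernoulli(k):
        """Bernoulli distribution P(k) = p^k * q^(1-k), for k in {0, 1}."""
        return p**k * q**(1 - k)

    mu = sum(k * bernoulli(k) for k in (0, 1))                # mean = p
    var = sum(k**2 * bernoulli(k) for k in (0, 1)) - mu**2    # variance = pq
    print(mu, var)                                            # 0.5 0.25

    # Enumerate both tosses explicitly: P(m heads) for m = 0, 1, 2
    prob = {0: 0.0, 1: 0.0, 2: 0.0}
    for tosses in product("HT", repeat=2):
        m = tosses.count("H")
        prob[m] += p**m * q**(2 - m)   # each sequence has probability p^m q^(2-m)
    print(prob)                        # {0: q^2 = 0.25, 1: 2pq = 0.5, 2: p^2 = 0.25}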

These coefficients are called the "binomial coefficients"! The binomial coefficients are just the number of ways of taking N things m at a time and are given by:

    C(N, m) = N! / [m!(N − m)!]    (binomial coefficient)

Remember your factorials: 0! = 1, 1! = 1, 2! = 1·2 = 2, 3! = 1·2·3 = 6, ..., m! = 1·2·3···m.

Here the order of things is not important*. E.g. for 2 tosses and the one-head case (m = 1), we don't care if toss 1 produced the head or if toss 2 produced the head.

* Unordered groups such as our example are often called combinations.

There are plenty of times where the order of occurrence is important, for example when we have N distinguishable objects and we want to group them m at a time. For example, if we tossed a coin twice (N = 2) and asked how many ordered ways we can get one head (m = 1), the answer would be 2 (head on the first toss or head on the second). Ordered arrangements are called permutations and are given by:

    P(N, m) = N! / (N − m)!    (permutation)

Example: Suppose we have 3 balls, one white, one red, and one blue. The number of possible pairs we could have, keeping track of order, is 6 (rw, wr, rb, br, wb, bw). Using the permutation formula we have 3!/(3 − 2)! = 6. However, if order is not important (rw = wr), then we use the binomial coefficient to get 3!/[(3 − 2)! 2!] = 3.
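
Python's standard library computes both counts directly (math.perm and math.comb implement the two formulas above); the short sketch below redoes the ball and coin examples:

    from math import comb, perm

    # Ordered pairs of 3 distinguishable balls: P(3, 2) = 3!/(3-2)! = 6
    print(perm(3, 2))    # 6  (rw, wr, rb, br, wb, bw)

    # Unordered pairs: C(3, 2) = 3!/[2!(3-2)!] = 3
    print(comb(3, 2))    # 3  (rw, rb, wb)

    # Two coin tosses, one head: C(2, 1) = 2, matching the coefficient C_1 above
    print(comb(2, 1))    # 2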

The binomial distribution is given by:

    P(m, N, p) = C(N, m) p^m q^(N−m) = [N! / (m!(N − m)!)] p^m q^(N−m)

Does this formula make sense? E.g. if we sum over all possibilities, do we get 1? To show that this distribution is normalized properly, first remember the binomial theorem:

    (a + b)^k = Σ_{l=0}^{k} C(k, l) a^(k−l) b^l

For our case a = q = 1 − p and b = p, and (by definition) a + b = 1:

    Σ_{m=0}^{N} P(m, N, p) = Σ_{m=0}^{N} C(N, m) p^m q^(N−m) = (p + q)^N = 1^N = 1

Thus the distribution is normalized properly.

What is the mean of this distribution?

    μ = [Σ_{m=0}^{N} m P(m, N, p)] / [Σ_{m=0}^{N} P(m, N, p)] = Σ_{m=0}^{N} m C(N, m) p^m q^(N−m)

A cute way of evaluating the above sum is to take the derivative of the normalization condition with respect to p:

    ∂/∂p [Σ_{m=0}^{N} C(N, m) p^m (1 − p)^(N−m)] = ∂/∂p (1) = 0

    Σ_{m=0}^{N} C(N, m) [m p^(m−1) (1 − p)^(N−m) − (N − m) p^m (1 − p)^(N−m−1)] = 0

Multiplying through by p(1 − p) and using the definition of μ:

    (1 − p) Σ_{m=0}^{N} m C(N, m) p^m (1 − p)^(N−m) = p Σ_{m=0}^{N} (N − m) C(N, m) p^m (1 − p)^(N−m)
    (1 − p) μ = p (N − μ)
    μ = Np    for a binomial distribution

What's the variance of a binomial distribution? Using a similar trick as for the average, we find:

    σ² = Σ_{m=0}^{N} (m − μ)² P(m, N, p) = Npq

Example 1: Suppose you observed m special events in a sample of N events. The measured probability (sometimes called "efficiency") for a special event to occur is ε = m/N. The error on the probability is:

    δε = σ_m / N = √[Nε(1 − ε)] / N = √[ε(1 − ε)/N]    (sometimes called the "error on the efficiency")

Thus you want to have a sample (N) as large as possible to reduce the uncertainty in the probability measurement!

Example 2: Suppose a baseball player's batting average is 0.333 (1 for 3 on average). Consider the case where the player either gets a hit or makes an out (forget about walks here!). In this example: p = probability of a hit = 0.333 and q = 1 − p = 0.667 (probability of "no hit").

On average, how many hits does the player get in 100 at bats?

    μ = Np = 100 × 0.33 = 33 hits

What's the standard deviation for the number of hits in 100 at bats?

    σ = (Npq)^(1/2) = (100 × 0.33 × 0.67)^(1/2) ≈ 4.7 hits

Thus we expect 33 ± 5 hits per 100 at bats.

Consider a game where the player bats 4 times:

    Probability of 0/4 = (0.67)^4 ≈ 20%
    Probability of 1/4 = [4!/(3!1!)](0.33)^1(0.67)^3 ≈ 40%
    Probability of 2/4 = [4!/(2!2!)](0.33)^2(0.67)^2 ≈ 29%
    Probability of 3/4 = [4!/(1!3!)](0.33)^3(0.67)^1 ≈ 10%
    Probability of 4/4 = [4!/(0!4!)](0.33)^4(0.67)^0 ≈ 1%

Note: the probability of getting at least one hit is 1 − P(0) ≈ 0.8.
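
Here is a short sketch that reproduces Example 2 numerically, using math.comb for the binomial coefficients (the values 0.333, 100, and 4 come straight from the example); it also confirms the normalization of the 4-at-bat probabilities:

    from math import comb, sqrt

    p = 0.333                      # probability of a hit
    q = 1.0 - p                    # probability of "no hit"

    # 100 at bats: mean and standard deviation of the number of hits
    N = 100
    print(N * p, sqrt(N * p * q))  # ~33.3 hits, ~4.7 hits

    # One game, 4 at bats: P(m hits) = C(4, m) p^m q^(4-m)
    probs = [comb(4, m) * p**m * q**(4 - m) for m in range(5)]
    for m, pr in enumerate(probs):
        print(f"{m}/4: {pr:.1%}")          # ~20%, 40%, 29%, 10%, 1%
    print(sum(probs))                      # 1.0: the distribution is normalized
    print("at least one hit:", 1 - probs[0])   # ~0.80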

2) The Poisson Distribution:

Another popular discrete distribution is the Poisson distribution. Consider the following conditions:

a) p is very small and approaches 0. For example, suppose we had a 100-sided die instead of a 6-sided die. Here p = 1/100 instead of 1/6. Suppose we had a 1000-sided die: p = 1/1000, etc.
b) N is very large; it approaches infinity. For example, instead of throwing 2 dice, we could throw 100 or 1000 dice.
c) The product Np is finite.

A good example of the above conditions occurs when one considers radioactive decay. Suppose we have 25 mg of an element. This is ~10^20 atoms. Suppose the lifetime (λ) of this element is 10^12 years ≈ 5×10^19 seconds. The probability of a given nucleus decaying in one second is 1/λ = 2×10^−20 per second. For this example:

    N = 10^20 (very large)
    p = 2×10^−20 (very small)
    Np = 2 (finite!)

We can derive an expression for the Poisson distribution by taking the appropriate limits of the binomial distribution:

    P(m, N, p) = [N! / (m!(N − m)!)] p^m q^(N−m)

Using condition b) we obtain:

    N!/(N − m)! = N(N − 1)···(N − m + 1) → N^m    as N → ∞

    q^(N−m) = (1 − p)^(N−m) = 1 − p(N − m) + [p²(N − m)(N − m − 1)]/2! − ... ≈ 1 − pN + (pN)²/2! − ... → e^(−pN)

Putting this all together we obtain:

    P(m, N, p) = (N^m / m!) p^m e^(−pN) = e^(−μ) μ^m / m!

Here we've let μ = pN. It is easy to show that:

    μ = Np = mean of a Poisson distribution
    σ² = Np = μ = variance of a Poisson distribution

Note: m is always an integer ≥ 0, but μ does not have to be an integer. Since Np is fixed, we write P(m, μ) instead of P(m, N, p) (we only need two numbers to specify a Poisson distribution).

Back to our example of radioactivity:

a) What's the probability of zero decays in one second if the average is 2 decays/sec?

    P(0, 2) = e^(−2) 2^0 / 0! = e^(−2) ≈ 0.135 → 13.5%

b) What's the probability of more than one decay in one second if the average is 2 decays/sec?

    P(>1, 2) = 1 − P(0, 2) − P(1, 2) = 1 − e^(−2) 2^0/0! − e^(−2) 2^1/1! = 1 − 3e^(−2) ≈ 0.594 → 59.4%
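
The two decay-rate answers are easy to verify with a few lines of Python, a direct implementation of P(m, μ) = e^(−μ) μ^m / m!:

    from math import exp, factorial

    def poisson(m, mu):
        """Poisson probability P(m, mu) = e^(-mu) * mu^m / m!."""
        return exp(-mu) * mu**m / factorial(m)

    mu = 2.0                                    # average of 2 decays/sec
    print(poisson(0, mu))                       # a) P(0, 2) = e^-2 ~ 0.135
    print(1 - poisson(0, mu) - poisson(1, mu))  # b) P(>1, 2) = 1 - 3e^-2 ~ 0.594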

c) Estimate the most probable number of decays per second. We want:

    ∂P(m, μ)/∂m |_{m = m*} = 0

To solve this problem it's convenient to maximize ln P(m, μ) instead of P(m, μ):

    ln P(m, μ) = ln(e^(−μ) μ^m / m!) = −μ + m ln μ − ln m!

In order to handle the factorial when we take the derivative, we use Stirling's approximation:

    ln(m!) ≈ m ln(m) − m

so that:

    ∂/∂m ln P(m, μ) |_{m*} ≈ ∂/∂m (−μ + m ln μ − m ln m + m) |_{m*} = ln μ − ln m* − 1 + 1 = 0

    m* = μ

In this example the most probable value for m is just the average of the distribution. Therefore if you observed m events in an experiment, the error on m is σ = √μ = √m.

Caution: The above derivation is only approximate since we used Stirling's approximation, which is only valid for large m. Another subtle point is that, strictly speaking, m can only take on integer values while μ is not restricted to be an integer.

Below is a comparison of a binomial and a Poisson distribution, each with a mean of 1.

[Figure: probability vs. m for a Poisson distribution with μ = 1 and a binomial distribution with N = 3, p = 1/3.]

Below is a comparison between a binomial and a Poisson distribution where again the mean of each distribution is 1. As we see, for moderately small p (= 1/10) there's not much difference between the two distributions.

[Figure: probability vs. m for a Poisson distribution with μ = 1 and a binomial distribution with N = 10, p = 0.1; the two curves nearly coincide.]
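
The two figures are easy to reproduce numerically; the sketch below tabulates both distributions for each pair of parameters (N = 3, p = 1/3 and N = 10, p = 0.1, both with mean Np = μ = 1):

    from math import comb, exp, factorial

    def binomial(m, N, p):
        """Binomial probability P(m, N, p) = C(N, m) p^m (1-p)^(N-m)."""
        return comb(N, m) * p**m * (1 - p)**(N - m)

    def poisson(m, mu):
        """Poisson probability P(m, mu) = e^(-mu) * mu^m / m!."""
        return exp(-mu) * mu**m / factorial(m)

    # Both binomial parameter sets have mean Np = 1, matching the Poisson mu = 1
    for N, p in [(3, 1/3), (10, 0.1)]:
        print(f"N={N}, p={p:.2f}")
        for m in range(6):
            print(m, round(binomial(m, N, p), 4), round(poisson(m, 1.0), 4))
    # As N grows (with p shrinking at fixed Np), the binomial values
    # approach the Poisson ones, as the two figures show.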