Chapter 2: Discrete Distributions. 2.1 Random Variables of the Discrete Type

Chapter 2: Discrete Distributions
2.1 Random Variables of the Discrete Type
2.2 Mathematical Expectation
2.3 Special Mathematical Expectations
2.4 Binomial Distribution
2.5 Negative Binomial Distribution
2.6 Poisson Distribution

2.1 Random Variables of the Discrete Type

It may seem that we can already solve probability problems, but so far we can only handle simple ones. For example, suppose a fair coin is tossed 10 times and we want P(at least 5 heads). This is not easy with only the tools of Chapter 1, because too much counting is involved. We would like to study this type of problem mathematically and systematically. (A short counting sketch is given below.)

Random variable
Given a random experiment with an outcome space S, a function X that assigns one and only one real number X(s) = x to each element s in S is called a random variable. The space of X is the set of real numbers {x : X(s) = x, s ∈ S}, where s ∈ S means that the element s belongs to the set S.

Examples: Example 2.1-1, the coin-tossing example above, simple coin- and die-tossing examples, etc. Example 2.1-2 follows below.
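A minimal counting sketch in Python (not part of the original slides) for the 10-toss question above; it assumes nothing beyond equally likely head/tail sequences and uses math.comb to count the sequences with a given number of heads.

```python
from math import comb

# P(at least 5 heads in 10 tosses of a fair coin), by direct counting:
# each of the 2**10 equally likely sequences is counted once, and the
# number of sequences with exactly k heads is C(10, k).
n = 10
total = 2 ** n
favorable = sum(comb(n, k) for k in range(5, n + 1))
print(favorable / total)  # 0.623046875
```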

Example 2.1-2
Let the random experiment be the cast of a fair die. The outcome space associated with this experiment is S = {1, 2, 3, 4, 5, 6}, with the elements of S indicating the number of spots on the side facing up. For each s ∈ S, let X(s) = s. The space of the random variable X is then {1, 2, 3, 4, 5, 6}. Since we know the probability of each number showing up, find the following probabilities: P(X = 5), P(2 ≤ X ≤ 5), and P(X ≤ 2).
Solutions: P(X = 5) = 1/6, P(2 ≤ X ≤ 5) = 4/6, P(X ≤ 2) = 2/6.

Probability Mass Function
How do we find probabilities through a random variable? Use the probability mass function, also called the probability distribution. The probability mass function (p.m.f.) f(x) of a discrete random variable X is a function that satisfies the following properties:
(a) f(x) > 0, x ∈ S;
(b) \sum_{x \in S} f(x) = 1;
(c) P(X \in A) = \sum_{x \in A} f(x), where A ⊂ S.
In practice a table is often used instead of a formula for f(x), since it is much easier. Given the p.m.f., we can find any kind of probability; see Example 2.1-3 and the sketch below.
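A small sketch of properties (a)-(c) for the fair-die p.m.f. of Example 2.1-2, using a Python dict as the "table" form of f(x); the variable names are illustrative only.

```python
from fractions import Fraction

# p.m.f. of X = number of spots on a fair six-sided die, stored as a table
f = {x: Fraction(1, 6) for x in range(1, 7)}

# property (b): the probabilities sum to 1
assert sum(f.values()) == 1

# property (c): P(X in A) is the sum of f(x) over A, e.g. A = {2, 3, 4, 5}
A = {2, 3, 4, 5}
print(sum(f[x] for x in A))   # 2/3, i.e. P(2 <= X <= 5) = 4/6
```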

Examples for the p.m.f.

Example 2.1-3
Roll a fair four-sided die twice, and let X be the maximum of the two outcomes. The outcome space for this experiment is S_0 = {(d_1, d_2) : d_1 = 1, 2, 3, 4; d_2 = 1, 2, 3, 4}, where we assume that each of these 16 points has probability 1/16. Find the p.m.f. of X.
Solution
The possible values of X are 1, 2, 3, 4.
P(X = 1) = P[{(1, 1)}] = 1/16,
P(X = 2) = P[{(1, 2), (2, 1), (2, 2)}] = 3/16,
and similarly P(X = 3) = 5/16 and P(X = 4) = 7/16. In formula form, f(x) = P(X = x) = (2x − 1)/16, x = 1, 2, 3, 4, or in table form:

x      1     2     3     4
f(x)   1/16  3/16  5/16  7/16

Example 2.1-7
A fair four-sided die with outcomes 1, 2, 3, and 4 is rolled twice. Let X equal the sum of the two outcomes. Find the p.m.f. of X. The possible values of X are 2, 3, 4, 5, 6, 7, and 8. (A counting sketch is given below.)

Hypergeometric Distribution
Consider a collection of N = N_1 + N_2 similar objects, N_1 of them belonging to one of two dichotomous classes (red chips, say) and N_2 of them belonging to the second class (blue chips). A collection of n objects is selected from these N objects at random and without replacement. Find the probability that exactly x of these n objects belong to the first class and n − x belong to the second, where the nonnegative integer x satisfies x ≤ n, x ≤ N_1, and n − x ≤ N_2. If we assume that each of the \binom{N}{n} ways of selecting n objects from N = N_1 + N_2 objects has the same probability, it follows that the desired probability is

f(x) = \frac{\binom{N_1}{x} \binom{N_2}{n-x}}{\binom{N}{n}}.
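The p.m.f.s in Examples 2.1-3 and 2.1-7 can be checked by brute-force enumeration of the 16 equally likely outcomes. A rough sketch (the helper name pmf and the use of fractions are my own choices, not from the slides):

```python
from fractions import Fraction
from itertools import product

# enumerate the 16 equally likely outcomes of two rolls of a fair
# four-sided die and tabulate the p.m.f. of a statistic of the pair
def pmf(statistic):
    table = {}
    for d1, d2 in product(range(1, 5), repeat=2):
        x = statistic(d1, d2)
        table[x] = table.get(x, 0) + Fraction(1, 16)
    return dict(sorted(table.items()))

# Example 2.1-3: the maximum gets probabilities 1/16, 3/16, 5/16, 7/16
print(pmf(max))
# Example 2.1-7: the p.m.f. of the sum, x = 2, ..., 8
print(pmf(lambda a, b: a + b))
```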

Examples of the Hypergeometric Distribution

Example 2.1-5
In a small pond there are 50 fish, 10 of which have been tagged. A fisherman's catch consists of 7 fish selected at random and without replacement, and X denotes the number of tagged fish caught. Find the probability that exactly 2 tagged fish are caught.

Example 2.1-6
A lot (collection) consisting of 100 fuses is inspected by the following procedure: five fuses are chosen at random and tested; if all five blow at the correct amperage, the lot is accepted. Suppose that the lot contains 20 defective fuses, and let X be the random variable equal to the number of defective fuses in the sample of 5. Find the probability of accepting the lot. (A short numerical sketch of both examples is given below.)

Plots of hypergeometric distributions: see the text at page 46.

2.2 Mathematical Expectation
If f(x) is the p.m.f. of a random variable X of the discrete type with space S, and if the summation \sum_{x \in S} u(x) f(x) exists, then the sum is called the mathematical expectation or the expected value of the function u(X), and it is denoted by E[u(X)]. That is,

E[u(X)] = \sum_{x \in S} u(x) f(x).

Mathematical expectation is an extremely important concept in summarizing important characteristics of probability distributions.

Theorem 2.2-1: When it exists, the mathematical expectation E has the following properties:
(a) If c is a constant, then E(c) = c.
(b) If c is a constant and u is a function, then E[c u(X)] = c E[u(X)].
(c) If c_1 and c_2 are constants and u_1 and u_2 are functions, then E[c_1 u_1(X) + c_2 u_2(X)] = c_1 E[u_1(X)] + c_2 E[u_2(X)].

Let us see some examples from the text and Exercise 2.2-7.
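A numerical sketch of Examples 2.1-5 and 2.1-6, assuming the hypergeometric formula above; the helper name hyper is illustrative.

```python
from math import comb

# Hypergeometric probability: x objects of the first class in a sample of n
# drawn without replacement from N1 objects of the first class and N2 of the second.
def hyper(x, N1, N2, n):
    return comb(N1, x) * comb(N2, n - x) / comb(N1 + N2, n)

# Example 2.1-5: 50 fish, 10 tagged, catch of 7, exactly 2 tagged
print(hyper(2, 10, 40, 7))          # about 0.2964

# Example 2.1-6: 100 fuses, 20 defective; accept iff 0 defectives in the 5 tested
print(hyper(0, 20, 80, 5))          # about 0.3193
```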

Examples of Mathematical Expectation

Example 2.2-2
Let the random variable X have the p.m.f. f(x) = 1/3, x ∈ S_X, where S_X = {−1, 0, 1}. Let u(X) = X^2. Find E[u(X)].

Example 2.2-3
Let X have the p.m.f. f(x) = x/10, x = 1, 2, 3, 4. Find (a) E(X), (b) E(X^2), (c) E[X(5 − X)]. (A numerical check appears below.)

Example 2.2-4
Let u(x) = (x − b)^2, where b is not a function of X, and suppose E[(X − b)^2] exists. Find the value of b for which E[(X − b)^2] is a minimum.

Example 2.2-5
Let X have a hypergeometric distribution in which n objects are selected from N = N_1 + N_2 objects. Find the mean of X, that is, E(X).

Example 2.2-6: Geometric Distribution
Say an experiment has probability of success p, where 0 < p < 1, and probability of failure 1 − p = q. This experiment is repeated independently until the first success occurs; say this happens on trial X.
What are the possible values of X?
Find the p.m.f. of X.
Find the mean of X.
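A quick check of Example 2.2-3 using the definition E[u(X)] = Σ u(x) f(x); the helper E is a hypothetical name chosen for illustration.

```python
from fractions import Fraction

# Example 2.2-3: X has p.m.f. f(x) = x/10, x = 1, 2, 3, 4.
f = {x: Fraction(x, 10) for x in range(1, 5)}

def E(u):
    """E[u(X)] = sum of u(x) f(x) over the space of X."""
    return sum(u(x) * f[x] for x in f)

print(E(lambda x: x))             # E(X) = 3
print(E(lambda x: x**2))          # E(X^2) = 10
print(E(lambda x: x * (5 - x)))   # E[X(5 - X)] = 5
```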

2.3 Special Mathematical Expectations

Mean of the random variable X: μ = E(X). What does it mean?
Variance of the random variable X: σ^2 = Var(X) = E[(X − μ)^2]; equivalently, Var(X) = E(X^2) − μ^2.
Standard deviation of X: σ = \sqrt{E[(X − μ)^2]} = \sqrt{Var(X)}. What does it mean, and why do we use both the variance and the standard deviation?
Properties of mean and variance: E(aX + b) = aE(X) + b and Var(aX + b) = a^2 Var(X).
Moments of the random variable X: E(X^k), k = 1, 2, ...
Moments of X about its mean μ: E[(X − μ)^k], k = 1, 2, ...
We will see the usefulness of moments later. Examples from the text follow; a numerical sketch appears below.

Examples of Means and Variances

Example 2.3-1 Let X equal the number of spots on the side facing upward after a fair six-sided die is rolled.
Example 2.3-2 Let X have the p.m.f. f(x) = 1/3, x = −1, 0, 1.
Example 2.3-3 Let X have a uniform distribution on the first m positive integers.
Example 2.3-4 In Example 2.2-5, concerning the hypergeometric distribution, we found that the mean of that distribution is μ = n(N_1/N). To find the variance, we first find E[X(X − 1)].
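A numerical sketch of Example 2.3-1 (fair six-sided die), computing the mean and variance both from the definition and from the shortcut Var(X) = E(X^2) − μ^2:

```python
from fractions import Fraction

# Example 2.3-1: X = number of spots on a fair six-sided die.
f = {x: Fraction(1, 6) for x in range(1, 7)}

mu = sum(x * p for x, p in f.items())                  # mean E(X)
var = sum((x - mu) ** 2 * p for x, p in f.items())     # variance E[(X - mu)^2]
print(mu, var)                                         # 7/2 35/12

# the shortcut Var(X) = E(X^2) - mu^2 gives the same value
print(sum(x**2 * p for x, p in f.items()) - mu**2)     # 35/12
```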

Moment-Generating Function

The mean, variance, and standard deviation are important characteristics of a distribution. For some distributions, the binomial distribution for instance, it is fairly difficult to compute the mean and the variance directly. Although the moment-generating function is extremely important for this purpose, it also has a uniqueness property that is even more important: we can use the m.g.f. to find the mean and variance and to identify the distribution.

Moment-generating function of X
Let X be a random variable of the discrete type with p.m.f. f(x) and space S. If there is a positive number h such that

E(e^{tX}) = \sum_{x \in S} e^{tx} f(x)

exists and is finite for −h < t < h, then the function of t defined by M(t) = E(e^{tX}) is called the moment-generating function of X.

Examples: Moment-Generating Function
Example 2.3-5 If X has the m.g.f. given in the text, find its p.m.f. Solution: read the p.m.f. off the definition.
Example 2.3-6 Suppose the m.g.f. of X is as given. Find its p.m.f. Solution: change M(t) into the format of the definition.
(A small symbolic sketch follows below.)
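A small symbolic sketch of the definition M(t) = E(e^{tX}) using sympy; the three-point p.m.f. is a made-up example, not one from the text.

```python
import sympy as sp

t = sp.symbols('t')

# Build M(t) = sum over the space of e^{t x} f(x) for a small illustrative
# p.m.f. (hypothetical example: f(1) = 1/6, f(2) = 2/6, f(3) = 3/6).
f = {1: sp.Rational(1, 6), 2: sp.Rational(2, 6), 3: sp.Rational(3, 6)}
M = sum(prob * sp.exp(t * x) for x, prob in f.items())
print(M)   # e.g. exp(t)/6 + exp(2*t)/3 + exp(3*t)/2

# Reading the p.m.f. back off the m.g.f. (the idea behind Examples 2.3-5
# and 2.3-6): the coefficient of e^{t x} in M(t) is f(x).
```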

Finding Moments through the m.g.f.

From the theory of Laplace transforms, we have

M^{(r)}(0) = \sum_{x \in S} x^r f(x) = E(X^r), r = 1, 2, ...

Use this property to find the mean and variance. (See the sketch below for the geometric case.) For the moment-generating functions of common distributions, see the geometric distribution and Example 2.3-7: its moment-generating function, mean, and variance.

2.4 Binomial Distribution

A Bernoulli experiment is a random experiment that has only two mutually exclusive and exhaustive outcomes: success versus failure, with P(success) = p and P(failure) = 1 − p = q.
The Bernoulli distribution is the simplest distribution: f(x) = p^x (1 − p)^{1−x}, x = 0, 1. What are its mean and variance?

Example 2.4-1 Suppose that the probability of germination of a beet seed is 0.8 and the germination of a seed is called a success. If we plant 10 seeds and can assume that the germination of one seed is independent of the germination of another seed, this corresponds to 10 Bernoulli trials with p = 0.8.

Example 2.4-2 In the Michigan daily lottery the probability of winning when placing a six-way boxed bet is 0.006. A bet placed on each of 12 successive days corresponds to 12 Bernoulli trials with p = 0.006.
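A sketch of the derivative property M^{(r)}(0) = E(X^r) applied to the geometric distribution, assuming its m.g.f. M(t) = p e^t / (1 − q e^t) for the number of trials until the first success (the form relevant to Example 2.3-7); sympy does the differentiation.

```python
import sympy as sp

t, p = sp.symbols('t p', positive=True)
q = 1 - p

# m.g.f. of the geometric distribution (number of trials until the first
# success): M(t) = p e^t / (1 - q e^t), valid for t < -ln(q).
M = p * sp.exp(t) / (1 - q * sp.exp(t))

mean = sp.simplify(sp.diff(M, t).subs(t, 0))       # M'(0)  = E(X)
EX2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))     # M''(0) = E(X^2)
print(mean)                          # 1/p
print(sp.simplify(EX2 - mean**2))    # variance, (1 - p)/p**2
```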

Finding the Probability of Bernoulli Trials

Example 2.4-3
Out of millions of instant lottery tickets, suppose that 20% are winners. If five such tickets are purchased, then (0, 0, 0, 1, 0) is a possible observed sequence, in which the fourth ticket is a winner and the other four are losers. Assuming independence among winning and losing tickets, find the probability of this outcome.
Solution: (0.8)(0.8)(0.8)(0.2)(0.8) = (0.2)(0.8)^4

Example 2.4-4
If five beet seeds are planted in a row, a possible observed sequence would be (1, 0, 1, 0, 1), in which the first, third, and fifth seeds germinated and the other two did not. If the probability of germination is p = 0.8, find the probability of this outcome, assuming independence.
Solution: (0.8)(0.2)(0.8)(0.2)(0.8) = (0.8)^3 (0.2)^2

Binomial Distribution
A binomial experiment is a preset number of Bernoulli trials:
1. A Bernoulli (success-failure) experiment is performed n times.
2. The trials are identical and independent.
3. The probability of success on each trial is a constant p; the probability of failure is q = 1 − p.
4. The random variable X = the number of successes in the n trials.
Then X has a binomial distribution. How do we obtain the binomial distribution? Start from a simple case such as Example 2.4-3: what about exactly one winner? The general formula for the binomial p.m.f. is

f(x) = \binom{n}{x} p^x (1 − p)^{n−x}, x = 0, 1, 2, ..., n,

where the constants n and p are called the parameters. (A computational sketch follows below.)
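A computational sketch of the binomial p.m.f. and of Example 2.4-3 with "exactly one winner"; binom_pmf is an illustrative helper, not a library function.

```python
from math import comb

# Binomial p.m.f. f(x) = C(n, x) p^x (1 - p)^(n - x)
def binom_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Example 2.4-3: one specific ordering with one winner in five tickets
print(0.2 * 0.8**4)              # 0.08192

# "exactly one winner" allows C(5, 1) = 5 orderings of that sequence
print(binom_pmf(1, 5, 0.2))      # 0.4096
```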

Examples of the Binomial Distribution

Example 2.4-5 In the instant lottery with 20% winning tickets, if X is equal to the number of winning tickets among n = 8 that are purchased, then the probability of purchasing two winning tickets is f(2) = \binom{8}{2}(0.2)^2(0.8)^6 ≈ 0.2936.

Example 2.4-7 In Example 2.4-1, the number X of seeds that germinate in n = 10 independent trials is b(10, 0.8). Find P(X ≤ 8), P(X = 6), and P(X ≥ 6). (A computational sketch follows below.)

Plots of binomial distributions: p = 0.5 gives a symmetric distribution.

Moment-generating function: M(t) = (q + p e^t)^n.

Notes on the Binomial Distribution
Everything regarding the binomial distribution: first identify n and p, then find probabilities, the mean, and the standard deviation, and display the distribution graphically.
Cumulative distribution function (or simply distribution function) of the random variable X: F(x) = P(X ≤ x). With the cdf F(x) you can calculate any probability.
Computing note: use a TI-83 Plus or TI-84, a smartphone app, or Excel, where the function BINOM.DIST(x, n, p, logical_value) gives f(x) if logical_value = 0 and F(x) if logical_value = 1.
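A sketch of the probabilities in Example 2.4-7, mirroring what BINOM.DIST computes; binom_pmf and binom_cdf are illustrative helpers.

```python
from math import comb

def binom_pmf(x, n, p):
    """f(x) for b(n, p), analogous to Excel BINOM.DIST(x, n, p, 0)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

def binom_cdf(x, n, p):
    """F(x) = P(X <= x), analogous to Excel BINOM.DIST(x, n, p, 1)."""
    return sum(binom_pmf(k, n, p) for k in range(x + 1))

# Example 2.4-7: X is b(10, 0.8)
print(binom_cdf(8, 10, 0.8))         # P(X <= 8) ~ 0.6242
print(binom_pmf(6, 10, 0.8))         # P(X = 6)  ~ 0.0881
print(1 - binom_cdf(5, 10, 0.8))     # P(X >= 6) ~ 0.9672
```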

2.5 Negative Binomial Distribution

Wait until exactly r successes occur in a sequence of independent Bernoulli trials. How do we obtain the p.m.f. of the negative binomial distribution? The rth success occurs on the xth trial exactly when there are r − 1 successes in the first x − 1 trials and a success on trial x, so

f(x) = \binom{x − 1}{r − 1} p^r (1 − p)^{x − r}, x = r, r + 1, r + 2, ...

It covers the geometric distribution as a special case (r = 1). From the moment-generating function we obtain the moments, in particular the mean and variance. (A computational sketch follows after the examples.)

Examples

Example 2.5-1 Some biology students were checking eye color in a large number of fruit flies. For an individual fly, suppose that the probability of white eyes is 1/4 and the probability of red eyes is 3/4, and that we may treat these observations as independent Bernoulli trials. Find the probability that at least four flies have to be checked for eye color to observe a white-eyed fly.

Example 2.5-2 Suppose that during practice a basketball player can make a free throw 80% of the time. Furthermore, assume that a sequence of free-throw shooting can be thought of as independent Bernoulli trials. Let X equal the minimum number of free throws that this player must attempt to make a total of 10 shots. Find the p.m.f. of X and its mean and variance.

Example 2.5-4 Let the moments of X be defined by E(X^r) = 0.8, r = 1, 2, 3, ... Find the m.g.f. and p.m.f. of X.

Example 2.5-5 A fair six-sided die is rolled until each face is observed at least once. On average, how many rolls of the die are needed? See p. 78.
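A sketch of the examples above, assuming the negative binomial p.m.f. given earlier and the standard moments E(X) = r/p and Var(X) = r(1 − p)/p^2 for the trial-counting form of the distribution; nb_pmf is an illustrative helper.

```python
from math import comb

# Negative binomial p.m.f.: probability that the rth success occurs on trial x
def nb_pmf(x, r, p):
    return comb(x - 1, r - 1) * p**r * (1 - p)**(x - r)

# Example 2.5-1: P(at least 4 flies checked) = 1 - P(first white eye on trial 1, 2, or 3)
print(1 - sum(nb_pmf(x, 1, 0.25) for x in range(1, 4)))   # 0.421875, i.e. (3/4)**3

# Example 2.5-2: X = trials needed for r = 10 made free throws with p = 0.8;
# mean r/p and variance r(1-p)/p^2 (standard negative binomial moments)
r, p = 10, 0.8
print(r / p, r * (1 - p) / p**2)              # 12.5 3.125

# Example 2.5-5 (coupon-collector style): the expected number of rolls to see
# every face is a sum of geometric means, 6/6 + 6/5 + 6/4 + 6/3 + 6/2 + 6/1
print(sum(6 / k for k in range(1, 7)))        # 14.7
```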

2.6 The Poisson Distribution

Poisson process: a Poisson process is a stochastic process that counts the number of events, and the times at which those events occur, in a given time interval.
Poisson distribution: the Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time and/or space. Its p.m.f. is

f(x) = \frac{λ^x e^{−λ}}{x!}, x = 0, 1, 2, ..., where λ > 0.

Moments of the Poisson Distribution
The m.g.f. of the Poisson distribution follows from the Maclaurin series expansion e^λ = \sum_{x=0}^{∞} λ^x / x!:

M(t) = E(e^{tX}) = e^{λ(e^t − 1)}.

Using the m.g.f. we obtain the mean and variance: μ = M'(0) = λ and σ^2 = M''(0) − [M'(0)]^2 = λ, so the Poisson is the best-known distribution whose mean equals its variance. Can we calculate the mean and variance directly? E(X) is not that difficult; to find the variance, we first find E[X(X − 1)]. (A numerical sketch follows below.)
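A numerical check that the Poisson mean and variance both equal λ, truncating the infinite sums where the tail is negligible (here λ = 5 is an arbitrary choice):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    return lam**x * exp(-lam) / factorial(x)

# check numerically that mean and variance both equal lambda (here lambda = 5);
# summing x = 0, ..., 59 leaves only a negligible tail for this lambda
lam = 5
xs = range(60)
mean = sum(x * poisson_pmf(x, lam) for x in xs)
var = sum((x - mean) ** 2 * poisson_pmf(x, lam) for x in xs)
print(mean, var)    # both approximately 5.0
```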

Examples of Finding Probability

Example 2.6-1 Let X have a Poisson distribution with a mean of λ = 5. Find P(X ≤ 6), P(X > 5), and P(X = 6).

Example 2.6-3 A USB flash drive is sometimes used to back up computer files. However, in the past a less reliable backup system was the computer tape, and flaws occurred on these tapes. In a particular situation, flaws (bad records) on a used computer tape occurred on the average of one flaw per 1200 feet. If one assumes a Poisson distribution, what is the distribution of X, the number of flaws in a 4800-foot roll? Find P(X = 0) and P(X ≤ 4).

Example 2.6-4 In a large city, telephone calls to 911 come in on the average of two every 3 minutes. If one assumes an approximate Poisson process, what is the probability of five or more calls arriving in a 9-minute period, P(X ≥ 5)? (A computational sketch follows below.)

Graphical Displays
Graphical displays of the Poisson p.m.f. for different values of λ: see Examples 2.6-2 and 2.6-6, and note how the shape of the histogram approaches a bell curve. With modern computing the approximation itself has little practical importance, but b(n, p) ≈ Poisson(λ) with λ = np when n is large and p is small.
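A computational sketch of Examples 2.6-1, 2.6-3, and 2.6-4 using the Poisson p.m.f. and c.d.f.; poisson_pmf and poisson_cdf are illustrative helpers.

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    return lam**x * exp(-lam) / factorial(x)

def poisson_cdf(x, lam):
    return sum(poisson_pmf(k, lam) for k in range(x + 1))

# Example 2.6-1: lambda = 5
print(poisson_cdf(6, 5))        # P(X <= 6) ~ 0.762
print(1 - poisson_cdf(5, 5))    # P(X > 5)  ~ 0.384
print(poisson_pmf(6, 5))        # P(X = 6)  ~ 0.146

# Example 2.6-3: one flaw per 1200 ft, so lambda = 4 for a 4800-ft roll
print(poisson_pmf(0, 4), poisson_cdf(4, 4))   # ~ 0.0183, ~ 0.629

# Example 2.6-4: two calls per 3 minutes, so lambda = 6 for 9 minutes
print(1 - poisson_cdf(4, 6))    # P(X >= 5) ~ 0.715
```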

Common Discrete Distributions

For any distribution, if we know its p.m.f., we know everything about the distribution: its probabilities, its mean and standard deviation, and its moment-generating function.
Bernoulli distribution: the simplest distribution and the building block of the binomial distribution.
Binomial distribution.
Negative binomial distribution and geometric distribution: broad applications.
Poisson distribution: occurrence of events in a given time period.
Hypergeometric distribution.