An-Najah National University, Faculty of Engineering, Industrial Engineering Department. Course: Quantitative Methods (65211). Instructor: Eng. Tamer Haddad. 2nd Semester, 2009/2010

Chapter 3 Discrete Random Variables and Probability Distributions 3-1 DISCRETE RANDOM VARIABLES: Example: A voice communication system for a business contains 48 external lines. At a particular time, the system is observed, and some of the lines are being used. Let X denote the number of lines in use. Then X can assume any of the integer values 0 through 48. When the system is observed, if 10 lines are in use, x = 10.

Example: In a semiconductor manufacturing process, two wafers from a lot are tested. Each wafer is classified as pass or fail. Assume that the probability that a wafer passes the test is 0.8 and that wafers are independent. The random variable X is defined to be equal to the number of wafers that pass.

3-2 PROBABILITY DISTRIBUTION AND PROBABILITY MASS FUNCTION: Random variables are so important in random experiments that sometimes we essentially ignore the original sample space of the experiment and focus on the probability distribution of the random variable. The probability distribution of a random variable X is a description of the probabilities associated with the possible values of X. For a discrete random variable, the distribution is often specified by a list of the possible values along with the probability of each. Example: There is a chance that a bit transmitted through a digital transmission channel is received in error. Let X equal the number of bits in error in the next four bits transmitted. The possible values of X are {0, 1, 2, 3, 4}.

Suppose that: P(X = 0) = 0.6561, P(X = 1) = 0.2916, P(X = 2) = 0.0486, P(X = 3) = 0.0036, P(X = 4) = 0.0001. This set of values together with their probabilities is the probability distribution of X.
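These probabilities follow from treating the four bits as independent, each in error with probability 0.1 (the binomial model made explicit in Section 3-6). A minimal Python sketch (the helper name `binomial_pmf` is illustrative, not from the notes):

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x): probability of x 'successes' in n independent trials."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# X = number of bits in error among 4 transmitted bits, error prob 0.1 per bit
pmf = {x: binomial_pmf(x, 4, 0.1) for x in range(5)}
```

The five values sum to 1, as any probability distribution must.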

Probability Mass Function: For a discrete random variable X with possible values x1, x2, ..., xn, the probability mass function is f(xi) = P(X = xi). It satisfies f(xi) >= 0 for each xi, and the sum of f(xi) over all possible values equals 1.

Example: Let the random variable X denote the number of semiconductor wafers that need to be analyzed in order to detect a large particle of contamination. Assume P(large contamination) = 0.01 for each wafer and that the wafers are independent. Determine the probability distribution of X. Let p denote a wafer with a large particle and a a wafer without one. P(X = 1) = P(p) = 0.01, and P(X = 2) = P(ap) = 0.99(0.01) = 0.0099. In general, P(X = x) = (0.99)^(x-1)(0.01), x = 1, 2, 3, ...

Question 3-15. For the pmf f(-2) = 1/8, f(-1) = 2/8, f(0) = 2/8, f(1) = 2/8, f(2) = 1/8: a) P(X <= 2) = (1/8)+(2/8)+(2/8)+(2/8)+(1/8) = 1 b) P(X > -2) = (2/8)+(2/8)+(2/8)+(1/8) = 1 - (1/8) = 7/8 c) P(-1 <= X <= 1) = (2/8)+(2/8)+(2/8) = 6/8 d) P(X <= -1 or X = 2) = (1/8)+(2/8)+(1/8) = 4/8

3-3 CUMULATIVE DISTRIBUTION FUNCTION: Example: There is a chance that a bit transmitted through a digital transmission channel is received in error. Let X equal the number of bits in error in the next four bits transmitted. The possible values of X are {0, 1, 2, 3, 4}. We might be interested in the probability of three or fewer bits being in error. This question can be expressed as P(X <= 3). P(X <= 3) = P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3) = 0.6561 + 0.2916 + 0.0486 + 0.0036 = 0.9999. This approach can also be used to determine P(X = 3) = P(X <= 3) - P(X <= 2) = 0.0036.
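The cumulative probabilities above are just running sums of the pmf. A short Python sketch for the four-bit example (function names are illustrative):

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) for a binomial(n, p) random variable."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

def F(x, n=4, p=0.1):
    """Cumulative distribution function F(x) = P(X <= x).

    Works at noninteger x too: it sums the pmf over all integers k <= x.
    """
    return sum(binomial_pmf(k, n, p) for k in range(int(x) + 1) if k <= x)

p_at_most_3 = F(3)           # P(X <= 3)
p_exactly_3 = F(3) - F(2)    # P(X = 3) recovered from the CDF
```

Note that `F(1.5)` is perfectly well defined even though X only takes integer values, matching the remark later in this section.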

It is sometimes useful to be able to provide cumulative probabilities such as P(X <= x), and such probabilities can be used to find the probability mass function of a random variable. Using cumulative probabilities is an alternative method of describing the probability distribution of a random variable.

Cumulative Distribution Function: F(x) = P(X <= x) = sum of f(xi) over all xi <= x. It satisfies 0 <= F(x) <= 1, and F(x) is nondecreasing in x.

Like a probability mass function, a cumulative distribution function provides probabilities. Notice that even if the random variable X can assume only integer values, the cumulative distribution function is defined at noninteger values as well. For the last example: F(1.5) = P(X <= 1.5) = P(X = 0) + P(X = 1) = 0.6561 + 0.2916 = 0.9477.

Example: Determine the probability mass function of X from the following cumulative distribution function: F(x) = 0 for x < -2, F(x) = 0.2 for -2 <= x < 0, F(x) = 0.7 for 0 <= x < 2, and F(x) = 1.0 for x >= 2. The pmf is recovered from the jumps in F: f(-2) = 0.2 - 0 = 0.2, f(0) = 0.7 - 0.2 = 0.5, f(2) = 1.0 - 0.7 = 0.3.

Example: Suppose that a day's production of 850 manufactured parts contains 50 parts that do not conform to customer requirements. Two parts are selected at random, without replacement, from the batch. Let the random variable X equal the number of nonconforming parts in the sample. What is the cumulative distribution function of X?
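Because the sampling is without replacement, the probabilities can be computed by counting, as done explicitly in Section 3-8 for this same batch. A Python sketch of the pmf and CDF (variable names are illustrative):

```python
from math import comb

# 850 parts, 50 nonconforming; sample 2 without replacement.
# X = number of nonconforming parts in the sample.
N, K, n = 850, 50, 2
pmf = {x: comb(K, x) * comb(N - K, n - x) / comb(N, n) for x in range(n + 1)}

# CDF as the running sum of the pmf
cdf = {}
running = 0.0
for x in range(n + 1):
    running += pmf[x]
    cdf[x] = running
```

The resulting values (about 0.886, 0.111, and 0.003) reappear in the hypergeometric section later in the chapter.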

Question 3-35. a) P(X <= 3) = 1 b) P(X <= 2) = 0.5 c) P(1 <= X <= 2) = 0.5 d) P(X > 2) = 1 - P(X <= 2) = 1 - 0.5 = 0.5

3-4 MEAN AND VARIANCE OF A DISCRETE RANDOM VARIABLE: The mean, mu = E(X) = sum of x f(x), is a measure of the center or middle of the probability distribution. The variance, sigma^2 = V(X) = sum of (x - mu)^2 f(x) = sum of x^2 f(x) - mu^2, is a measure of the dispersion, or variability, in the distribution.

Two different distributions can have the same mean and variance. Still, these measures are simple, useful summaries of the probability distribution of X. (Figures in the slides show distributions with equal means and different variances, and with equal means and equal variances. See the derivation of V(X) on page 75.)

Example: Let X equal the number of bits in error in the next four bits transmitted. E(X) = 0(0.6561) + 1(0.2916) + 2(0.0486) + 3(0.0036) + 4(0.0001) = 0.4. Although X never assumes the value 0.4, the weighted average of the possible values is 0.4.

Example: Two new product designs are to be compared on the basis of revenue potential. The revenue from design A can be predicted quite accurately to be $3 million. Marketing concludes that there is a probability of 0.3 that the revenue from design B will be $7 million, but a 0.7 probability that the revenue will be only $2 million. Which design do you prefer? Let X denote the revenue from design A and Y the revenue from design B. E(X) = $3 million and E(Y) = 0.3(7) + 0.7(2) = $3.5 million, so we might prefer design B. However, design B is riskier: its standard deviation is $2.29 million.

Expected Value of a Function of a Discrete Random Variable: If X has pmf f(x), then E[h(X)] = sum of h(x) f(x), where the sum is over all possible values x of X.

Example: X is the number of bits in error in the next four bits transmitted. What is the expected value of the square of the number of bits in error? Here h(X) = X^2. Therefore, E(X^2) = 0^2(0.6561) + 1^2(0.2916) + 2^2(0.0486) + 3^2(0.0036) + 4^2(0.0001) = 0.52. In the special case that h(X) = aX + b for constants a and b, E[h(X)] = a E(X) + b.
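Both E(X) and E(X^2) for the four-bit example are weighted sums over the same pmf; a short Python sketch:

```python
from math import comb

# pmf of X = number of bits in error among 4 (p = 0.1 per bit, independent)
pmf = {x: comb(4, x) * 0.1**x * 0.9**(4 - x) for x in range(5)}

mean = sum(x * p for x, p in pmf.items())        # E(X), the weighted average
mean_sq = sum(x**2 * p for x, p in pmf.items())  # E[h(X)] with h(x) = x^2
```

Note that E(X^2) = 0.52 is not (E(X))^2 = 0.16; in fact V(X) = E(X^2) - (E(X))^2 = 0.36.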

Question 3-47. The range of the random variable X is {0, 1, 2, 3, x}, where x is unknown. If each value is equally likely and the mean of X is 6, determine x. f(xi) = 1/5 for each value, so 0(1/5) + 1(1/5) + 2(1/5) + 3(1/5) + x(1/5) = 6, which gives x = 24.

3-5 DISCRETE UNIFORM DISTRIBUTION: A random variable X has a discrete uniform distribution if each of the n values in its range has equal probability 1/n. Example: The first digit of a part's serial number is equally likely to be any one of the digits 0 through 9. If one part is selected from a large batch and X is the first digit of the serial number, X has a discrete uniform distribution with probability 0.1 for each value in R = {0, 1, 2, ..., 9}; that is, f(x) = 0.1.

Example: Let the random variable X denote the number of the 48 voice lines that are in use at a particular time. Assume that X is a discrete uniform random variable with a range of 0 to 48. Let the random variable Y denote the proportion of the 48 voice lines that are in use at a particular time; then Y = X/48.

Question 3-55. Product codes of 2, 3, or 4 letters are equally likely. What are the mean and standard deviation of the number of letters in 100 codes? The length of one code is discrete uniform on {2, 3, 4}: E(X) = (4 + 2)/2 = 3 and V(X) = [(4 - 2 + 1)^2 - 1]/12 = 8/12 = 0.667. For 100 codes, E(total) = 100(3) = 300 letters. If the 100 code lengths are independent, V(total) = 100(0.667) = 66.7, so S.D. = 8.16. (Treating the total as 100X, i.e., all codes the same length, would instead give a variance of 100^2(0.667) = 6666.67; note that this is a variance, not a standard deviation.)
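The discrete uniform formulas used above can be sketched in Python; the variance of the total below assumes the 100 code lengths are independent (an assumption stated in the solution above):

```python
def uniform_mean(a, b):
    """Mean of a discrete uniform variable on the integers a, a+1, ..., b."""
    return (a + b) / 2

def uniform_var(a, b):
    """Variance of a discrete uniform variable on the integers a..b."""
    return ((b - a + 1)**2 - 1) / 12

# One product code has 2, 3, or 4 letters, equally likely.
mean_one = uniform_mean(2, 4)   # letters per code
var_one = uniform_var(2, 4)     # letters^2 per code

# For 100 independent codes, means and variances add.
mean_100 = 100 * mean_one
var_100 = 100 * var_one
sd_100 = var_100 ** 0.5
```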

3-6 BINOMIAL DISTRIBUTION: Consider the following: 1. Flip a coin 10 times. Let X = number of heads obtained. 2. A machine produces 1% defective parts. Let X = number of defective parts in the next 25 parts produced. 3. In the next 20 births at a hospital, let X = the number of female births. Each of these random experiments can be thought of as consisting of a series of repeated, random trials. The random variable in each case is a count of the number of trials that meet a specified criterion. The outcome from each trial either meets the criterion that X counts or it does not; the two outcomes are labeled "success" and "failure."

A trial with only two possible outcomes is used so frequently as a building block of a random experiment that it is called a Bernoulli trial. It is usually assumed that the trials that constitute the random experiment are independent and that the probability of a success is constant from trial to trial. Example: The chance that a bit transmitted through a digital transmission channel is received in error is 0.1. Let E denote a bit in error and O a bit that is okay.

The event that X = 2 consists of the six outcomes {EEOO, EOEO, EOOE, OEEO, OEOE, OOEE}. Using the assumption that the trials are independent, each of these outcomes has probability (0.1)^2(0.9)^2, so P(X = 2) = 6(0.1)^2(0.9)^2 = 0.0486. In general, P(X = x) = C(4, x)(0.1)^x(0.9)^(4-x), x = 0, 1, 2, 3, 4.

Binomial Distribution: A random experiment consisting of n repeated, independent Bernoulli trials with constant probability of success p has pmf f(x) = C(n, x) p^x (1 - p)^(n-x), x = 0, 1, ..., n, where C(n, x) = n!/[x!(n - x)!].

Binomial Distribution Examples:

Example: Each sample of water has a 10% chance of containing a particular organic pollutant. Assume that the samples are independent with regard to the presence of the pollutant. Find the probability that in the next 18 samples, exactly 2 contain the pollutant: P(X = 2) = C(18, 2)(0.1)^2(0.9)^16 = 0.284. Determine the probability that at least four samples contain the pollutant: P(X >= 4) = 1 - P(X <= 3) = 0.098.

Determine the probability that 3 <= X < 7: P(3 <= X < 7) = P(X = 3) + P(X = 4) + P(X = 5) + P(X = 6) = 0.265. Binomial Distribution (Mean and Variance): E(X) = np and V(X) = np(1 - p). Example: For the last example, E(X) = 18(0.1) = 1.8 and V(X) = 18(0.1)(1 - 0.1) = 1.62.
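The probabilities quoted for the water-sample example can be checked numerically. A short Python sketch, assuming 18 independent samples with p = 0.1:

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) for a binomial(n, p) random variable."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 18, 0.1
p_exactly_2 = binomial_pmf(2, n, p)
p_at_least_4 = 1 - sum(binomial_pmf(x, n, p) for x in range(4))  # 1 - P(X <= 3)
p_3_to_6 = sum(binomial_pmf(x, n, p) for x in range(3, 7))        # P(3 <= X < 7)
mean, var = n * p, n * p * (1 - p)
```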

Question 3-71. The phone lines to an airline reservation system are occupied 40% of the time. Assume that the events that the lines are occupied on successive calls are independent, and that 10 calls are placed to the airline. A) What is the probability that for exactly three calls the lines are occupied? P(X = 3) = [10!/(3!7!)](0.4)^3(0.6)^7 = 0.215 B) What is the probability that for at least one call the lines are not occupied? Probability = 1 - P(X = 0) = 1 - [10!/(0!10!)](0.4)^0(0.6)^10 = 1 - 0.006 = 0.994 C) What is the expected number of calls in which the lines are all occupied? E(X) = 10(0.4) = 4

3-7 GEOMETRIC AND NEGATIVE BINOMIAL DISTRIBUTION: 3-7.1 Geometric Distribution: Assume a series of Bernoulli trials (independent trials with constant probability p of a success on each trial). Trials are conducted until a success is obtained. Let the random variable X denote the number of trials until the first success. Example: The chance that a bit transmitted through a digital transmission channel is received in error is 0.1. Assume the transmissions are independent events, and let X be the number of bits transmitted until the first error occurs. P(X = 5) = P(OOOOE) = (0.9)^4(0.1) = 0.066, where the range of X is {1, 2, 3, ...}.

Geometric Distribution: In a series of independent Bernoulli trials with constant probability p of a success, let X denote the number of trials until the first success. Then X is a geometric random variable with pmf f(x) = (1 - p)^(x-1) p, x = 1, 2, ...

Example: The probability that a wafer contains a large particle of contamination is 0.01. If it is assumed that the wafers are independent, what is the probability that exactly 125 wafers need to be analyzed before a large particle is detected? Let X be the number of wafers analyzed until a large particle is detected; X is a geometric random variable with p = 0.01, so P(X = 125) = (0.99)^124(0.01) = 0.0029.

Geometric Distribution (Mean and Variance): E(X) = 1/p and V(X) = (1 - p)/p^2. A geometric random variable also has the lack-of-memory property: the count of trials until the next success may be started at any trial without changing the distribution. Example: For the previous example, E(X) = 1/0.01 = 100 and V(X) = (1 - 0.01)/0.01^2 = 9900.
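The geometric pmf and moment formulas above can be sketched directly; applying them to the wafer example (p = 0.01):

```python
def geometric_pmf(x, p):
    """P(X = x): first success occurs on trial x, success prob p per trial."""
    return (1 - p)**(x - 1) * p

# Wafer contamination example: p = 0.01 per wafer, wafers independent.
p = 0.01
p_125 = geometric_pmf(125, p)   # exactly 125 wafers analyzed
mean = 1 / p                    # E(X) = 1/p
var = (1 - p) / p**2            # V(X) = (1 - p)/p^2
```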

3-7.2 Negative Binomial Distribution: A generalization of the geometric distribution in which the random variable is the number of Bernoulli trials required to obtain r successes results in the negative binomial distribution. Example: The probability that a bit transmitted through a digital transmission channel is received in error is 0.1. Assume the transmissions are independent events, and let the random variable X denote the number of bits transmitted until the fourth error. X has a negative binomial distribution with r = 4. What is P(X = 10)? The probability that exactly three errors occur in the first nine trials is determined from the binomial distribution to be C(9, 3)(0.1)^3(0.9)^6.

Because the trials are independent, the probability that exactly three errors occur in the first 9 trials and trial 10 results in the fourth error is the product of the probabilities of these two events, namely, P(X = 10) = [C(9, 3)(0.1)^3(0.9)^6](0.1) = C(9, 3)(0.1)^4(0.9)^6 = 0.0045. Negative Binomial Distribution: f(x) = C(x - 1, r - 1)(1 - p)^(x-r) p^r, x = r, r + 1, r + 2, ... If r = 1, this reduces to the geometric distribution.

A negative binomial random variable can be interpreted as the sum of r geometric random variables.

Negative Binomial Distribution (Mean and Variance): E(X) = r/p and V(X) = r(1 - p)/p^2. Example: A Web site contains three identical computer servers. Only one is used to operate the site, and the other two are spares. The probability of a failure on any request is 0.0005. Assuming that each request represents an independent trial, what is the mean number of requests until failure of all three servers? E(X) = 3/0.0005 = 6000 requests. What is the probability that all three servers fail within five requests? P(X <= 5) = P(X = 3) + P(X = 4) + P(X = 5), which is approximately 1.25 x 10^-9.
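The server example can be checked with the negative binomial pmf given above; a Python sketch, assuming r = 3 failures are needed and p = 0.0005 per request:

```python
from math import comb

def neg_binomial_pmf(x, r, p):
    """P(X = x): the r-th success occurs on trial x, success prob p per trial."""
    return comb(x - 1, r - 1) * (1 - p)**(x - r) * p**r

# Three servers; each request independently causes a failure with prob 0.0005.
# X = number of requests until the third (final) server fails.
r, p = 3, 0.0005
mean = r / p                                                   # E(X) = r/p
p_within_5 = sum(neg_binomial_pmf(x, r, p) for x in range(3, 6))  # P(X <= 5)
```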

Question 3-85. The probability of a successful optical alignment in the assembly of an optical data storage product is 0.8. Assume the trials are independent. Let X be the number of trials until the first successful alignment; X is a geometric random variable with p = 0.8. A) What is the probability that the first successful alignment requires exactly four trials? P(X = 4) = (1 - 0.8)^3(0.8) = (0.2)^3(0.8) = 0.0064 B) What is the probability that the first successful alignment requires at most four trials? P(X <= 4) = 0.8 + 0.2(0.8) + 0.2^2(0.8) + 0.2^3(0.8) = 0.9984 C) What is the probability that the first successful alignment requires at least four trials? P(X >= 4) = 1 - P(X <= 3) = 1 - [0.8 + 0.2(0.8) + 0.2^2(0.8)] = 1 - 0.992 = 0.008

3-8 HYPERGEOMETRIC DISTRIBUTION: A batch of 850 manufactured parts contains 50 nonconforming to customer requirements. Select two parts without replacement. Let A and B denote the events that the first and second parts are nonconforming, respectively. P(B|A) = 49/849 and P(A) = 50/850, so the trials are not independent. Let X be the number of nonconforming parts in the sample. P(X = 0) = (800/850)(799/849) = 0.886, P(X = 1) = (800/850)(50/849) + (50/850)(800/849) = 0.111, P(X = 2) = (50/850)(49/849) = 0.003. A general formula for computing probabilities when samples are selected without replacement is quite useful.

Hypergeometric Distribution: A set of N objects contains K objects classified as successes. A sample of size n is selected at random, without replacement. If X is the number of successes in the sample, then f(x) = C(K, x)C(N - K, n - x)/C(N, n), for max{0, n - N + K} <= x <= min{K, n}.

The expression min{K, n} is used in the definition of the range of X.

Example: A batch of parts contains 100 parts from a local supplier and 200 parts from a supplier in the next state. Four parts are selected randomly, without replacement. What is the probability that they are all from the local supplier? Let X be the number of parts in the sample from the local supplier. X has a hypergeometric distribution with N = 300, K = 100, n = 4, so P(X = 4) = C(100, 4)C(200, 0)/C(300, 4) = 0.0119.

What is the probability that two or more parts in the sample are from the local supplier? P(X >= 2) = P(X = 2) + P(X = 3) + P(X = 4) = 0.408. What is the probability that at least one part in the sample is from the local supplier? P(X >= 1) = 1 - P(X = 0) = 1 - C(200, 4)/C(300, 4) = 0.804.

Hypergeometric Distribution (Mean and Variance): With p = K/N, E(X) = np and V(X) = np(1 - p)(N - n)/(N - 1). Example: For the previous example, E(X) = 4(100/300) = 1.33 and V(X) = 4(1/3)(2/3)[(300 - 4)/299] = 0.88. The factor (N - n)/(N - 1) is the finite population correction; if n is small relative to N, the correction is close to 1 and the hypergeometric distribution is similar to the binomial.

Question 3-101. A company employs 800 men under the age of 55. Suppose that 30% carry a marker on the male chromosome that indicates an increased risk for high blood pressure. Let X be the number of tested men who carry the marker; X is hypergeometric with N = 800, K = 240, n = 10. A) If 10 men in the company are tested for the marker in this chromosome, what is the probability that exactly 1 man has the marker? P(X = 1) = C(240, 1)C(560, 9)/C(800, 10) = 0.1201 B) If 10 men in the company are tested for the marker in this chromosome, what is the probability that more than 1 has the marker? P(X = 0) = C(240, 0)C(560, 10)/C(800, 10) = 0.0276, so P(X > 1) = 1 - [P(X = 0) + P(X = 1)] = 1 - [0.0276 + 0.1201] = 0.8523
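The two answers above follow directly from the hypergeometric formula; a Python sketch (the helper name `hypergeom_pmf` is illustrative):

```python
from math import comb

def hypergeom_pmf(x, N, K, n):
    """P(X = x): x successes when drawing n items without replacement
    from N items of which K are successes."""
    return comb(K, x) * comb(N - K, n - x) / comb(N, n)

# 800 men, 240 carry the marker (30%), 10 are tested.
N, K, n = 800, 240, 10
p1 = hypergeom_pmf(1, N, K, n)                          # exactly 1 carrier
p_more_than_1 = 1 - hypergeom_pmf(0, N, K, n) - p1      # more than 1 carrier
```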

3-9 POISSON DISTRIBUTION: Example: Consider the transmission of n bits over a digital communication channel, and let X be the number of bits in error. When the probability that a bit is in error is constant and the transmissions are independent, X has a binomial distribution. Let lambda = np, so that E(X) = np = lambda. Suppose that n increases and p decreases so that E(X) = lambda remains constant. Then, in the limit, P(X = x) = e^(-lambda) lambda^x / x!.

Example: Flaws occur at random along the length of a thin copper wire. X : the # of flaws in a length of L millimeters of wire Suppose that the average number of flaws in L millimeters is λ. Partition the length of wire into n subintervals of small length, say, 1 micrometer each. If the subinterval chosen is small enough, the probability that more than one flaw occurs in the subinterval is negligible. We can interpret the assumption that flaws occur at random to imply that every subinterval has the same probability of containing a flaw, say, p. If we assume that the probability that a subinterval contains a flaw is independent of other subintervals, we can model the distribution of X as approximately a binomial random variable.

That is, the probability that a subinterval contains a flaw is p = lambda/n. With small enough subintervals, n is very large and p is very small, so the distribution of X is obtained as in the previous example. The same reasoning can be applied to any interval, including an interval of time, an area, or a volume. Poisson Distribution: If X counts events that occur at random in an interval with mean number of events lambda > 0, then X is a Poisson random variable with pmf f(x) = e^(-lambda) lambda^x / x!, x = 0, 1, 2, ...

It is important to use consistent units in the calculation of probabilities, means, and variances involving Poisson random variables. For example, if the average number of flaws per millimeter of wire is 3.4, then the average number of flaws in 10 millimeters of wire is 34, and the average number of flaws in 100 millimeters of wire is 340. Poisson Distribution (Mean and Variance): E(X) = lambda and V(X) = lambda.

Example: For the case of the thin copper wire, suppose that the number of flaws follows a Poisson distribution with a mean of 2.3 flaws per millimeter. Determine the probability of exactly 2 flaws in 1 millimeter of wire. Let X be the number of flaws in 1 mm of wire; E(X) = 2.3 flaws, so P(X = 2) = e^(-2.3)(2.3)^2/2! = 0.265. Determine the probability of 10 flaws in 5 millimeters of wire. Let X be the number of flaws in 5 mm of wire; E(X) = 5(2.3) = 11.5, so P(X = 10) = e^(-11.5)(11.5)^10/10! = 0.113. Determine the probability of at least 1 flaw in 2 millimeters of wire. Let X be the number of flaws in 2 mm of wire; E(X) = 2(2.3) = 4.6, so P(X >= 1) = 1 - P(X = 0) = 1 - e^(-4.6) = 0.9899.
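The three copper-wire probabilities illustrate the consistent-units rule: the Poisson mean is rescaled to the length of wire in question. A Python sketch:

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    """P(X = x) for a Poisson random variable with mean lam."""
    return exp(-lam) * lam**x / factorial(x)

# Flaws in copper wire: mean 2.3 flaws per millimeter.
p_2_in_1mm = poisson_pmf(2, 2.3)                    # lambda = 2.3 for 1 mm
p_10_in_5mm = poisson_pmf(10, 2.3 * 5)              # lambda = 11.5 for 5 mm
p_at_least_1_in_2mm = 1 - poisson_pmf(0, 2.3 * 2)   # lambda = 4.6 for 2 mm
```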

Question 3-111. Astronomers treat the number of stars in a given volume of space as a Poisson random variable. The density in the Milky Way Galaxy in the vicinity of our solar system is one star per 16 cubic light-years. A) What is the probability of two or more stars in 16 cubic light-years? With X the number of stars per 16 cubic light-years, lambda = 1, so P(X >= 2) = 1 - [P(X = 0) + P(X = 1)] = 1 - [e^(-1) + e^(-1)] = 0.2642 B) How many cubic light-years of space must be studied so that the probability of one or more stars exceeds 0.95? For P(X >= 1) = 1 - P(X = 0) = 1 - e^(-lambda) to exceed 0.95, we need lambda of approximately 3. Therefore 3(16) = 48 cubic light-years of space must be studied.