Chapter 3 Single Random Variables and Probability Distributions (Part 1)


Contents:
- What is a Random Variable?
- Probability Distribution Functions: Cumulative Distribution Function, Probability Density Function
- Common Random Variables and their Distribution Functions
- Transformation of a Single Random Variable
- Averages of Random Variables
- Characteristic Function
- Chebyshev's Inequality
- Computer Generation of Random Variables

What is a Random Variable? Why a random variable?: It is easier to describe and manipulate the outcomes and events of chance experiments using numerical values than in words. The purpose of a random variable is to map each point in the sample space S into a point on the real line R1. This type of transform or mapping is called a function.

What is a Random Variable? (Cont.) For example, x = X(s), where X is the random variable, x is its specific value, and s denotes the outcome of a chance experiment. Figure 3.1: A r.v. is a mapping of the sample space into a real line.

What is a Random Variable? (Cont.) NOTE: Events are now described by the random variable, rather than in words, and the corresponding probability of an event can be calculated via the r.v. Example: For a chance experiment of tossing a fair coin, define a r.v. X(s): X(head) = 1 and X(tail) = 0. Then, using the equally likely assignment of probability, we have: P(X = 1) = 1/2 and P(X = 0) = 1/2. The value assignment of a r.v. depends entirely on convenience, i.e., values 0 and 1 are more convenient to handle than, say, values π and e.

What is a Random Variable? (Cont.) Example 3.1 Consider an experiment of rolling a pair of dice, and find the probability of all possible values of the sum. Solution: Define a r.v. X as the sum of the two dice; then refer to Table 3.1 and apply the equally likely assignment of probability.

What is a Random Variable? (Cont.) Example 3.1 (Cont.) Table 3.1: Sum of the two dice.

                Die 2:  1    2    3    4    5    6
        Die 1:  1       2    3    4    5    6    7
                2       3    4    5    6    7    8
                3       4    5    6    7    8    9
                4       5    6    7    8    9   10
                5       6    7    8    9   10   11
                6       7    8    9   10   11   12

We have: P(X = 2) = 1/36, P(X = 3) = 2/36 = 1/18, P(X = 4) = 3/36 = 1/12, P(X = 5) = 4/36 = 1/9, P(X = 6) = 5/36, P(X = 7) = 6/36 = 1/6, P(X = 8) = 5/36, P(X = 9) = 4/36 = 1/9, P(X = 10) = 3/36 = 1/12, P(X = 11) = 2/36 = 1/18, P(X = 12) = 1/36.

What is a Random Variable? (Cont.) FACT: Functions of a r.v. are ALSO r.v.'s. For instance: Y = e^X, W = ln X, U = cos X, V = X².

What is a Random Variable? (Cont.) Categorization of random variables: 1. A discrete random variable assumes only a countable number of values. (e.g.) rolling dice: Example 3.1 (only 11 values). 2. A continuous random variable can assume a continuum of values. (e.g.) weather vane: the angle of the indicator (any value from 0 to 2π radians). 3. A mixed random variable is a combination of the above two types.

Probability Distribution Functions Description of a r.v. using probability: In the case of a discrete r.v.: probability mass distribution — tabulate (or plot) the probability of its values. Two general methods of description: (1) Cumulative (probability) distribution function: cdf; (2) Probability density function: pdf. These apply to and work for all three types of r.v., i.e., continuous, discrete, and mixed random variables.

Cumulative Distribution Function Definition 3.1 Cumulative Distribution Function: The cdf of a random variable X is defined as follows: F_X(x) = P(X ≤ x).

Cumulative Distribution Function (Cont.) Example 3.3 Consider a chance experiment of rolling a pair of fair dice, and define a r.v. X as the sum of the numbers showing up. Find the cdf of X. (Refer to Example 3.1.) Solution: Since the cdf is defined as F_X(x) = P(X ≤ x), for instance if x = 3, we have: F_X(3) = P(2 ≤ X ≤ 3) = P(X = 2) + P(X = 3) = P{(1,1)} + P{(1,2)} + P{(2,1)} = 1/36 + 1/36 + 1/36 = 1/12. Notice that: F_X(−∞) = P(∅) = 0, and F_X(∞) = P(S) = 1.

Cumulative Distribution Function (Cont.) Example 3.3 (Cont.) Figure 3.2: The cdf of X, the sum of two dice.
F_X(x) = 0 for x < 2; 1/36 for 2 ≤ x < 3; 3/36 for 3 ≤ x < 4; 6/36 for 4 ≤ x < 5; 10/36 for 5 ≤ x < 6; 15/36 for 6 ≤ x < 7; 21/36 for 7 ≤ x < 8; 26/36 for 8 ≤ x < 9; 30/36 for 9 ≤ x < 10; 33/36 for 10 ≤ x < 11; 35/36 for 11 ≤ x < 12; 36/36 for x ≥ 12.
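The pmf of Example 3.1 and the staircase cdf of Figure 3.2 can be checked by direct enumeration of the 36 equally likely outcomes. A minimal sketch in Python (the names `pmf` and `cdf` are ours, not from the text):

```python
from fractions import Fraction
from itertools import product

# Enumerate the 36 equally likely outcomes of two fair dice and
# tabulate the pmf of their sum X (Table 3.1).
pmf = {}
for d1, d2 in product(range(1, 7), repeat=2):
    pmf[d1 + d2] = pmf.get(d1 + d2, Fraction(0)) + Fraction(1, 36)

def cdf(x):
    """F_X(x) = P(X <= x) for the sum of two dice."""
    return sum((p for s, p in pmf.items() if s <= x), Fraction(0))

print(pmf[7], cdf(3), cdf(12))  # 1/6 1/12 1
```

Exact fractions are used so the values match the table entries without floating-point noise.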

Cumulative Distribution Function (Cont.) General properties of the cdf:
1. Limiting values: lim_{x→−∞} F_X(x) = 0 and lim_{x→∞} F_X(x) = 1.
2. The cdf is right-hand continuous, i.e.: F_X(x) = lim_{ε→0, ε>0} F_X(x + ε).
3. The cdf F_X(x) is a monotone non-decreasing function of x.
4. The probability of X taking values between x1 and x2 is given by: P(x1 < X ≤ x2) = F_X(x2) − F_X(x1).

The CDF (Cont.) Example 3.4 A coin is tossed three times. Let r.v. X be the number of heads in the three trials. S_X = {0, 1, 2, 3}. For a small positive number ε:
F_X(1 − ε) = P[X ≤ 1 − ε] = P{0 heads} = 1/8
F_X(1) = P[X ≤ 1] = P{0 or 1 heads} = 1/8 + 3/8 = 1/2
F_X(1 + ε) = P[X ≤ 1 + ε] = P{0 or 1 heads} = 1/2

The CDF (Cont.) Example 3.4 (Cont.) CDF and the derivative of F_X(x) with respect to x:
F_X(x) = (1/8)u(x) + (3/8)u(x − 1) + (3/8)u(x − 2) + (1/8)u(x − 3)
f_X(x) = (1/8)δ(x) + (3/8)δ(x − 1) + (3/8)δ(x − 2) + (1/8)δ(x − 3)

Cumulative Distribution Function (Cont.) Brief verification: 1. Notice that the inverse images of X at x = −∞ and x = ∞ are respectively X⁻¹(−∞) = ∅ and X⁻¹(∞) = S. 2. This is due to the fact that the cdf is defined as F_X(x) = P(X ≤ x) rather than F_X(x) = P(X < x); the detailed proof is omitted. To prove it, we need the so-called continuity axiom, which is beyond the scope of this class and will be discussed in a graduate-level course. 3. Follows from property #4.

Cumulative Distribution Function (Cont.) Brief verification (Cont.): 4. Let x1 < x2; then we have: {X ≤ x2} = {X ≤ x1} ∪ {x1 < X ≤ x2}, which is equivalent to: F_X(x2) = F_X(x1) + P(x1 < X ≤ x2). Rearranging the above provides: P(x1 < X ≤ x2) = F_X(x2) − F_X(x1) ≥ 0.

Cumulative Distribution Function (Cont.) Example 3.4 Find the probability that the sum of two dice is between 3 and 7 inclusive. Solution: From the definition and properties of the cdf, and using Figure 3.2, we have: P(3 ≤ X ≤ 7) = F_X(7) − F_X(3 − ε) = F_X(7) − F_X(2) = 21/36 − 1/36 = 20/36 = 5/9. Note that there are 20 outcomes favorable to the event; thus, applying the equally likely assignment of probability, we again get 20/36.

Cumulative Distribution Function (Cont.) Example 3.5 Is the F_X(x) below a valid cdf? F_X(x) = 1/2 + (1/π) tan⁻¹(x). Solution: Check whether all the properties of a cdf are satisfied: self study. Figure 3.3: A suitable cdf.

Probability Density Function Definition 3.2 Probability Density Function: The pdf of a (continuous) random variable X is defined as follows: f_X(x) = dF_X(x)/dx.

Probability Density Function (Cont.) NOTE: interpretation of the pdf!!! Using the definition of the derivative, we can rewrite the pdf as: f_X(x) = lim_{Δx→0} [F_X(x + Δx) − F_X(x)] / Δx. For sufficiently small Δx, we can remove the limit, and thus: f_X(x) Δx ≈ F_X(x + Δx) − F_X(x) = P(x < X ≤ x + Δx).

Probability Density Function (Cont.) General properties of the pdf: Since the pdf of a r.v. is the derivative of the cdf, the cdf F_X(x) can be expressed as the integral of the pdf f_X(x), i.e.: F_X(x) = ∫_{−∞}^{x} f_X(u) du.
1. The pdf is a non-negative function, i.e.: f_X(x) ≥ 0 for all x.
2. The area under the pdf is unity, i.e.: ∫_{−∞}^{∞} f_X(x) dx = 1.
3. The probability of X taking values between x1 and x2 is given by: P(x1 < X ≤ x2) = ∫_{x1}^{x2} f_X(x) dx.

Probability Density Function (Cont.) Brief verification: 1. Since the cdf is non-decreasing, and the pdf is its derivative (or slope), the pdf must be non-negative. 2. Using the relationship between the cdf and pdf, we have: ∫_{−∞}^{∞} f_X(x) dx = F_X(∞) = 1.

Probability Density Function (Cont.) Brief verification (Cont.): 3. From the property of the cdf, and using the relationship between the cdf and pdf, we have: P(x1 < X ≤ x2) = F_X(x2) − F_X(x1) = ∫_{−∞}^{x2} f_X(u) du − ∫_{−∞}^{x1} f_X(u) du = ∫_{x1}^{x2} f_X(u) du.

Probability Density Function (Cont.) Example 3.6 Obtain the pdf of a r.v. whose cdf is given as (Example 3.5): F_X(x) = 1/2 + (1/π) tan⁻¹(x), and find the probability of the event {2 < X ≤ 5}. Solution: Since d/dx tan⁻¹(x) = 1/(1 + x²), we easily get: f_X(x) = 1/[π(1 + x²)].

Probability Density Function (Cont.) Example 3.6 (Cont.) Using the property of the cdf, we obtain: P(2 < X ≤ 5) = (1/π)[tan⁻¹(5) − tan⁻¹(2)] ≈ 0.085. Figure 3.4: The pdf of X in Example 3.6.
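Example 3.6 can be verified numerically from the cdf alone. A minimal sketch, assuming only the cdf F_X(x) = 1/2 + (1/π) tan⁻¹(x) from Example 3.5:

```python
import math

# P(2 < X <= 5) = F_X(5) - F_X(2); the constant 1/2 terms cancel,
# leaving (atan(5) - atan(2)) / pi.
p = (math.atan(5.0) - math.atan(2.0)) / math.pi
print(round(p, 3))  # 0.085
```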

Probability Density Function (Cont.) Remark: The definition of the pdf can generally be applied to both continuous and discrete random variables!!! Motive: The cdf of a discrete r.v. can generally be expressed as a sum of weighted and shifted unit step functions u(x), i.e.: F_X(x) = Σ_{i=1}^{N} p_i u(x − x_i). Recall that the unit step function is defined as u(x) = 1 for x ≥ 0 and 0 elsewhere.

Probability Density Function (Cont.) Motive (Cont.): Since the derivative of u(x) is the unit impulse function δ(x), i.e.: δ(x) = du(x)/dx, the pdf f_X(x) of a discrete r.v. can generally be described as a sum of weighted and shifted unit impulse functions δ(x), i.e.: f_X(x) = Σ_{i=1}^{N} p_i δ(x − x_i).

Probability Density Function (Cont.) Example 3.7 Express the pdf of the r.v. discussed in Example 3.3. Solution: The cdf of X in terms of u(x) can be expressed as: F_X(x) = (1/36)[u(x−2) + 2u(x−3) + 3u(x−4) + 4u(x−5) + 5u(x−6) + 6u(x−7) + 5u(x−8) + 4u(x−9) + 3u(x−10) + 2u(x−11) + u(x−12)].

Probability Density Function (Cont.) Example 3.7 (Cont.) Solution (Cont.): Taking the derivative, we find the pdf to be: f_X(x) = (1/36)[δ(x−2) + 2δ(x−3) + 3δ(x−4) + 4δ(x−5) + 5δ(x−6) + 6δ(x−7) + 5δ(x−8) + 4δ(x−9) + 3δ(x−10) + 2δ(x−11) + δ(x−12)].

Probability Density Function (Cont.) Figure 3.5: The pdf of X in Example 3.7.

Common Random Variables and Their Distribution Functions Commonly occurring and widely used r.v.'s are discussed in terms of their cdf and pdf.

Common Random Variables and Their Distribution Functions (Cont.) Uniform Random Variable: The uniform random variable is defined in terms of its probability density function as follows: f_X(x) = 1/(b − a) for a ≤ x ≤ b (with b > a), and 0 otherwise. By integration, we can derive its cumulative distribution function (cdf) to be: F_X(x) = 0 for x < a; (x − a)/(b − a) for a ≤ x ≤ b; 1 for x > b.

Common Random Variables and Their Distribution Functions (Cont.) Figure 3.6: Uniform r.v.: (a) pdf, (b) cdf, in the case a = 0 and b = 5.

Common Random Variables and Their Distribution Functions (Cont.) Example 3.8 Resistors are known to be uniformly distributed within a 10% tolerance range. Find the probability that a nominal 1000-ohm resistor has a value between 990 and 1010 ohms. Solution: Since the tolerance region is bounded in the interval [900, 1100], centered at the nominal value of 1000, the pdf of the resistance R is given by: f_R(r) = 1/200 for 900 ≤ r ≤ 1100, and 0 otherwise.

Common Random Variables and Their Distribution Functions (Cont.) Example 3.8 (Cont.) Solution (Cont.): Therefore, the probability that the resistor has a resistance value in the range [990, 1010] is: P(990 ≤ R ≤ 1010) = ∫_{990}^{1010} dr/200 = 20/200 = 0.1.
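Example 3.8 is a direct application of the uniform cdf above. A minimal sketch (the helper name `uniform_cdf` is ours):

```python
def uniform_cdf(x, a, b):
    """cdf of a uniform r.v. on [a, b]: 0 below a, (x-a)/(b-a) inside, 1 above b."""
    if x < a:
        return 0.0
    if x > b:
        return 1.0
    return (x - a) / (b - a)

# Example 3.8: R uniform on [900, 1100]; P(990 <= R <= 1010).
p = uniform_cdf(1010, 900, 1100) - uniform_cdf(990, 900, 1100)
print(round(p, 3))  # 0.1
```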

Common Random Variables and Their Distribution Functions (Cont.) Gaussian Random Variable: The Gaussian random variable is defined in terms of its probability density function as follows: f_X(x) = (1/√(2πσ²)) e^{−(x−m)²/(2σ²)}, where m and σ² are parameters called the mean and variance, respectively. The cdf can be derived by direct integration of the pdf, which cannot be expressed in closed form; however...

Common Random Variables and Their Distribution Functions (Cont.) Gaussian Random Variable (Cont.): Instead, we use either of the following functions, defined as: (1) Q function: Q(x) = (1/√(2π)) ∫_{x}^{∞} e^{−u²/2} du, whose numerical values are tabulated in Appendix C. (2) Error function: erf(x) = (2/√π) ∫_{0}^{x} e^{−u²} du.

Common Random Variables and Their Distribution Functions (Cont.) Gaussian Random Variable (Cont.): In terms of the Q function, the cdf of the Gaussian r.v. is given by: F_X(x) = 1 − Q((x − m)/σ). Derivation: assignment. Question: Express the Q and error functions in terms of each other: assignment.

Common Random Variables and Their Distribution Functions (Cont.) Figure: Illustration of integration for Q and error functions.

Common Random Variables and Their Distribution Functions (Cont.) Figure 3.7: Gaussian distribution for m = 5 and σ = 2: (a) pdf; (b) cdf.

Common Random Variables and Their Distribution Functions (Cont.) Example 3.9 A mechanical component has a 5-mm thickness T known to have a Gaussian distribution with m = 5 and σ = 0.05. Find the probability that T is less than 4.9 mm OR greater than 5.1 mm. Solution: We use here the relationship Q(x) = 1 − Q(−x) for x < 0:
P(T < 4.9 mm OR T > 5.1 mm) = 1 − P(4.9 mm ≤ T ≤ 5.1 mm) = 1 − [F_T(5.1) − F_T(4.9)]
= 1 − [1 − Q((5.1 − 5)/0.05)] + [1 − Q((4.9 − 5)/0.05)] = 1 − [1 − Q(2)] + Q(2) = 2Q(2) = 2(0.02275) ≈ 0.0455.
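Instead of the Q-function table in Appendix C, Q(x) can be computed from the complementary error function via the standard identity Q(x) = (1/2) erfc(x/√2). A minimal sketch checking Example 3.9:

```python
import math

def Q(x):
    """Gaussian tail probability Q(x) = P(Z > x), via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# Example 3.9: T ~ Gaussian(m = 5, sigma = 0.05); P(T < 4.9 or T > 5.1) = 2*Q(2).
p = 2.0 * Q((5.1 - 5.0) / 0.05)
print(round(Q(2.0), 5), round(p, 4))  # 0.02275 0.0455
```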

Common Random Variables and Their Distribution Functions (Cont.) Exponential Random Variable: Models the time interval between occurrences of events; also used to model the lifetime of devices and systems.
f_X(x) = λe^{−λx} u(x), λ > 0
F_X(x) = (1 − e^{−λx}) u(x)

Common Random Variables and Their Distribution Functions (Cont.) Gamma Random Variable: The Gamma random variable has its probability density function as follows: f_X(x) = [x^{b−1} e^{−x/c} / (c^b Γ(b))] u(x), with b, c > 0, where Γ(b) is the gamma function given by the integral: Γ(b) = ∫_{0}^{∞} y^{b−1} e^{−y} dy. It can be shown by evaluation that Γ(1) = 1 and Γ(1/2) = √π, and by replacing b with b + 1 we can also show that Γ(b+1) = bΓ(b), which in turn provides Γ(n+1) = n! in the case of b an integer n.

Common Random Variables and Their Distribution Functions (Cont.) Special cases: 1. Chi-square pdf (statistics), where b and c are replaced by b = n/2 and c = 2: f_X(x) = [x^{n/2−1} e^{−x/2} / (2^{n/2} Γ(n/2))] u(x). 2. Erlang pdf (queuing theory), where b = n is an integer: f_X(x) = [x^{n−1} e^{−x/c} / (c^n (n−1)!)] u(x).

Common Random Variables and Their Distribution Functions (Cont.) Figure 3.8: Plots of (a) chi-square pdf and (b) Erlang pdf.

Common Random Variables and Their Distribution Functions (Cont.) Cauchy Random Variable: The Cauchy random variable has its probability density function as follows: f_X(x) = α/[π(α² + x²)]. The corresponding cdf is given by: F_X(x) = 1/2 + (1/π) tan⁻¹(x/α). Note: Example 3.5 with Figure 3.3 is the case of a Cauchy r.v. for α = 1.

Common Random Variables and Their Distribution Functions (Cont.) Binomial Random Variable: The r.v. is binomially distributed if it takes on the non-negative integer values 0, 1, 2, …, n with probabilities: P(X = k) = C(n, k) p^k q^{n−k}, k = 0, 1, 2, …, n, where C(n, k) is the binomial coefficient and p + q = 1. The corresponding pdf and cdf, using the unit impulse and step functions, are given by: f_X(x) = Σ_{k=0}^{n} C(n, k) p^k q^{n−k} δ(x − k); F_X(x) = Σ_{k=0}^{n} C(n, k) p^k q^{n−k} u(x − k).

Common Random Variables and Their Distribution Functions (Cont.) Remark: This is the distribution describing the number k of heads occurring in n tosses of a fair coin, in which case p = q = 0.5. Figure 3.9: The pdf of a binomial r.v. for n = 10: (a) p = q = 1/2; (b) p = 1/5, q = 4/5.
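The binomial probabilities of Figure 3.9 can be computed directly from the formula above. A minimal sketch (the helper name `binom_pmf` is ours):

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) = C(n, k) * p**k * (1 - p)**(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Number of heads in n = 10 tosses of a fair coin (Figure 3.9a):
print(round(binom_pmf(5, 10, 0.5), 4))  # 0.2461
print(round(sum(binom_pmf(k, 10, 0.5) for k in range(11)), 6))  # 1.0
```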

Common Random Variables and Their Distribution Functions (Cont.) Geometric Random Variable: Suppose we flip a biased coin where the probability of a head is p and the probability of a tail is 1 − p. Then the probability of getting the head (success) for the first time on the k-th toss is given by: P(first success at trial k) = P(X = k) = (1 − p)^{k−1} p, k = 1, 2, …, where p is the probability of success. This is called the geometric distribution; each probability is the geometric mean of the probabilities at the neighboring values k − 1 and k + 1.
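The geometric-mean property that gives the distribution its name can be checked numerically. A minimal sketch (the helper name `geom_pmf` is ours):

```python
import math

def geom_pmf(k, p):
    """P(first success at trial k) = (1 - p)**(k - 1) * p."""
    return (1 - p)**(k - 1) * p

# Each P(X = k) is the geometric mean of its two neighbors:
p = 0.3
mid = geom_pmf(5, p)
print(math.isclose(mid, math.sqrt(geom_pmf(4, p) * geom_pmf(6, p))))  # True
```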

Common Random Variables and Their Distribution Functions (Cont.) Poisson Random Variable: The r.v. is Poisson with parameter a if it takes on the non-negative integer values 0, 1, 2, … with probabilities: P(X = k) = (a^k / k!) e^{−a}, k = 0, 1, 2, …. The corresponding cdf of a Poisson r.v. is given by: F_X(x) = Σ_{k ≤ x} (a^k / k!) e^{−a}.
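The Poisson probabilities plotted in Figure 3.10 follow directly from the formula. A minimal sketch (the helper name `poisson_pmf` is ours):

```python
import math

def poisson_pmf(k, a):
    """P(X = k) = a**k * exp(-a) / k!"""
    return a**k * math.exp(-a) / math.factorial(k)

# Figure 3.10a uses a = 0.9; the k = 0 value is just exp(-0.9):
print(round(poisson_pmf(0, 0.9), 3))  # 0.407
print(round(sum(poisson_pmf(k, 0.9) for k in range(50)), 6))  # 1.0
```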

Common Random Variables and Their Distribution Functions (Cont.) Figure 3.10: Poisson pdf (a) a = 0.9; (b) a = 0.2.

Common Random Variables and Their Distribution Functions (Cont.) Limiting Forms of Binomial and Poisson Distributions: Theorem 3.1 De Moivre-Laplace theorem: For n sufficiently large, the binomial distribution can be approximated by samples of a Gaussian curve properly scaled and shifted as: C(n, k) p^k q^{n−k} ≈ (1/√(2πσ²)) e^{−(k−m)²/(2σ²)}, n >> 1, where m = np and σ² = npq, respectively. Proof: omitted.

Common Random Variables and Their Distribution Functions (Cont.) The corresponding cdf of a binomial r.v., based on the De Moivre-Laplace theorem, can be approximated as: F_X(x) ≈ 1 − Q((x − np)/√(npq)), where Q(x) is the Q-function.

Common Random Variables and Their Distribution Functions (Cont.) Figure 3.11: Binomial cdf and De Moivre-Laplace approximation for n = 10: (a) p = 0.2; (b) p = 0.5.

Common Random Variables and Their Distribution Functions (Cont.) Limiting Forms of Binomial and Poisson Distributions (Cont.): Theorem 3.2: The Poisson distribution approaches a Gaussian distribution for a >> 1 as: (a^k / k!) e^{−a} ≈ (1/√(2πa)) e^{−(k−a)²/(2a)}. Proof: A binomial distribution approaches the Poisson distribution with a = np if n >> 1 and p << 1; since then m = np = a and σ² = npq ≈ a, applying the De Moivre-Laplace theorem proves it!

Common Random Variables and Their Distribution Functions (Cont.) Example 3.11 In a digital communication system, the probability of a bit error is 10⁻³. What is the probability of 3 errors in a transmission of 5000 bits? Solution: This has a binomial distribution, and the Poisson approximation with n = 5000, p = 10⁻³ (so a = np = 5), and k = 3 gives: P(3 errors) ≈ (5³ / 3!) e⁻⁵ ≈ 0.14037, whereas the exact value of the probability using the binomial distribution is: C(5000, 3) (10⁻³)³ (1 − 10⁻³)^{4997} ≈ 0.14036.
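Both numbers in Example 3.11 can be reproduced directly. A minimal sketch comparing the Poisson approximation with the exact binomial value:

```python
import math
from math import comb

# Example 3.11: n = 5000 bits, bit-error probability p = 1e-3, k = 3 errors.
n, p, k = 5000, 1e-3, 3
a = n * p  # Poisson parameter a = np = 5
approx = a**k * math.exp(-a) / math.factorial(k)
exact = comb(n, k) * p**k * (1 - p)**(n - k)
print(round(approx, 5), round(exact, 5))  # 0.14037 0.14036
```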

Common Random Variables and Their Distribution Functions (Cont.) Poisson Points and Exponential Probability Density Function: Consider a finite interval T (Fig. 3.12), where the probability of k occurrences of an event (e.g., arrivals of electrons at a certain point) in this interval obeys a Poisson distribution, i.e.: P(X = k) = ((λT)^k / k!) e^{−λT}, k = 0, 1, 2, …, where λ is the number of events per unit time.

Common Random Variables and Their Distribution Functions (Cont.) Poisson Points and Exponential Probability Density Function (Cont.): Then, the interval from an arbitrarily selected point to the next event is a r.v. W following an exponential distribution with pdf: f_W(w) = λe^{−λw} u(w). Figure 3.12: Poisson points on a finite interval T.

Common Random Variables and Their Distribution Functions (Cont.) Poisson Points and Exponential Probability Density Function (Cont.): Proof: First, find the cdf of W as: F_W(w) = P(W ≤ w) = 1 − P(X = 0) = 1 − e^{−λw}, w ≥ 0. {W ≤ w} corresponds to the event that there is at least one Poisson event in the interval (0, w). Differentiating it, we obtain: f_W(w) = dF_W(w)/dw = λe^{−λw} u(w).

Common Random Variables and Their Distribution Functions (Cont.) Example 3.12 Raindrops impinge on a tin roof at a rate of 100/sec. What is the probability that the interval between adjacent raindrops is greater than 1 msec, and greater than 10 msec? Solution: This can be approximated by a Poisson point process with λ = 100, and therefore: P(W > 10⁻³) = 1 − P(W ≤ 10⁻³) = e^{−100×0.001} = e^{−0.1} ≈ 0.905; P(W > 10⁻²) = 1 − P(W ≤ 10⁻²) = e^{−100×0.01} = e^{−1} ≈ 0.368.
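Example 3.12 reduces to evaluating the exponential tail probability P(W > w) = e^{−λw}. A minimal sketch:

```python
import math

# Example 3.12: raindrop interarrival times are exponential with lam = 100/sec,
# so P(W > w) = exp(-lam * w).
lam = 100.0
p_1ms = math.exp(-lam * 1e-3)    # P(W > 1 msec)
p_10ms = math.exp(-lam * 1e-2)   # P(W > 10 msec)
print(round(p_1ms, 3), round(p_10ms, 3))  # 0.905 0.368
```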

Common Random Variables and Their Distribution Functions (Cont.) Pascal Random Variable: The r.v. has a Pascal distribution if it takes on the positive integer values with probabilities: P(X = n) = C(n−1, k−1) p^k q^{n−k} for n = k, k+1, …, where q = 1 − p.

Common Random Variables and Their Distribution Functions (Cont.) Note: 1. For example, in a sequence of coin tosses, the probability of getting the k-th head on the n-th toss obeys a Pascal distribution, where p = prob. of a head. 2. Reasoning: We get the first k−1 heads in any order in the first n−1 tosses, which is binomial, and then must get a head on the n-th toss. 3. The geometric distribution is a special case of the Pascal distribution. Example 3.13 Self study.
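The reasoning in the note above multiplies a binomial probability for the first n−1 tosses by p for the final toss. A minimal sketch (the helper name `pascal_pmf` is ours), including the k = 1 case that recovers the geometric distribution:

```python
from math import comb

def pascal_pmf(n, k, p):
    """P(k-th success occurs on trial n) = C(n-1, k-1) * p**k * q**(n-k)."""
    q = 1 - p
    return comb(n - 1, k - 1) * p**k * q**(n - k)

# k = 1 reduces to the geometric distribution: P(n) = q**(n-1) * p.
print(pascal_pmf(3, 1, 0.5))  # 0.125
print(round(sum(pascal_pmf(n, 2, 0.5) for n in range(2, 100)), 6))  # 1.0
```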

Common Random Variables and Their Distribution Functions (Cont.) Hypergeometric Random Variable: Consider a box containing N items, K of which are defective. Then, the probability of obtaining k defective items in a selection of n items without replacement follows the hypergeometric distribution: P(X = k) = C(K, k) C(N−K, n−k) / C(N, n), k = 0, 1, 2, …, n. Example 3.14 Self study.
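The counting formula above can be checked by confirming the probabilities sum to one on a small example. A minimal sketch (the helper name `hypergeom_pmf` and the box sizes are ours):

```python
from math import comb

def hypergeom_pmf(k, N, K, n):
    """P(k defectives in n draws without replacement from N items, K defective)."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# Sanity check on a small box: N = 20 items, K = 5 defective, n = 4 drawn.
total = sum(hypergeom_pmf(k, 20, 5, 4) for k in range(5))
print(round(total, 6))  # 1.0
```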