Introductory Probability Joint Probability with Independence; Binomial Distributions Nicholas Nguyen nicholas.nguyen@uky.edu Department of Mathematics UK
Agenda Comparing Two Variables with Joint Random Variables and Double Integrals Binomial Distributions Review Announcements: The sixth homework is available and due next Monday. The seventh homework is available and due next Wednesday. The next quiz is on Friday.
Comparing Two Variables
Let X and Y be independent continuous random variables on [0, 1] with density functions f_X(x) = 2x and f_Y(y) = 3y^2 for 0 ≤ x, y ≤ 1. Let us find the probability that Y ≤ X^2.
Let's find the joint density function first. Since X and Y are independent, their joint density function is the product of their individual density functions:
f(x, y) = f_X(x) f_Y(y) = 2x · 3y^2.
We integrate the joint density function over the region in the square [0, 1] × [0, 1] that satisfies y ≤ x^2:
P(Y ≤ X^2) = ∬_{y ≤ x^2} 2x · 3y^2 dy dx.
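This probability can be sanity-checked numerically. Below is a minimal Monte Carlo sketch (not from the slides) using inverse-CDF sampling: since F_X(x) = x^2, X = √U has density 2x, and since F_Y(y) = y^3, Y = U^(1/3) has density 3y^2.

```python
import random

random.seed(0)

def sample_X():
    # Inverse-CDF sampling: F_X(x) = x^2 on [0, 1], so X = sqrt(U)
    # has density f_X(x) = 2x.
    return random.random() ** 0.5

def sample_Y():
    # F_Y(y) = y^3 on [0, 1], so Y = U**(1/3) has density f_Y(y) = 3y^2.
    return random.random() ** (1 / 3)

n = 200_000
hits = sum(sample_Y() <= sample_X() ** 2 for _ in range(n))
print(hits / n)  # should be close to the exact answer 1/4
```

With 200,000 samples the estimate typically lands within about 0.002 of 1/4.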
Region Y ≤ X^2
[Figure: the unit square [0, 1] × [0, 1] with the curve y = x^2; the region where y ≤ x^2 lies below the curve.]
End Limits
[Figure: the region y ≤ x^2 inside the unit square, bounded above by the curve y = x^2.]
The x-coordinate can go from 0 to 1 (the left and right sides of the square).
For a fixed x-value, the y-coordinate can range from 0 (the bottom edge) to x^2 (on the graph of y = x^2).
Evaluating the Integral
Thus, the probability that Y ≤ X^2 is
∫_0^1 ∫_0^{x^2} 2x · 3y^2 dy dx = ∫_0^1 2x [y^3]_0^{x^2} dx
= ∫_0^1 2x · x^6 dx
= ∫_0^1 2x^7 dx
= 2 · (1/8) x^8 |_0^1 = 1/4.
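The same double integral can be checked by brute force. A pure-Python midpoint-rule sketch (the grid resolution m is an arbitrary choice):

```python
# Midpoint-rule approximation of the double integral
#   integral_0^1 integral_0^{x^2} 2x * 3y^2 dy dx,
# which should come out close to the exact value 1/4.
m = 400  # grid resolution (an arbitrary choice)
h = 1.0 / m
total = 0.0
for i in range(m):
    x = (i + 0.5) * h       # midpoint in x
    hy = x * x / m          # the inner range [0, x^2] split into m pieces
    inner = 0.0
    for j in range(m):
        y = (j + 0.5) * hy  # midpoint in y
        inner += 3 * y * y * hy
    total += 2 * x * inner * h
print(total)  # approximately 0.25
```

The midpoint rule's error shrinks quadratically in the step size, so even this modest grid agrees with 1/4 to several decimal places.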
Independent Trials Process (Continuous)
A sequence of continuous random variables X_1, ..., X_n that are mutually independent and have the same density function f_X(x) is called an independent trials process. If X = (X_1, ..., X_n), then for any point (x_1, ..., x_n), the density function of X is the product of the individual density functions:
f(x_1, ..., x_n) = f_X(x_1) ⋯ f_X(x_n).
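As a small illustration of the product rule (reusing the density f_X(x) = 2x from the earlier example as the common density, an assumption for illustration):

```python
from math import prod

def f_X(x):
    # Common one-variable density, borrowed from the earlier example:
    # f_X(x) = 2x on [0, 1], and 0 outside.
    return 2 * x if 0 <= x <= 1 else 0.0

def joint_density(point):
    # Density of X = (X_1, ..., X_n) at (x_1, ..., x_n):
    # the product f_X(x_1) * ... * f_X(x_n).
    return prod(f_X(x) for x in point)

print(joint_density((0.5, 0.25, 1.0)))  # (2*0.5) * (2*0.25) * (2*1.0) = 1.0
```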
Independent Trials Process (Discrete)
A sequence of discrete random variables X_1, ..., X_n that are mutually independent and have the same distribution function m_X is called an independent trials process. If X = (X_1, ..., X_n) and ω = (ω_1, ..., ω_n) is a sequence of outcomes, then the distribution function m of X is
m(ω) = m_X(ω_1) ⋯ m_X(ω_n).
Bernoulli Trials Processes
An important example of a discrete independent trials process is a Bernoulli trials process: a sequence of independent trials with 2 outcomes each (success or failure). For each trial:
m(success) = p, m(failure) = 1 − p = q.
For example, with 5 trials, the sequence (S, S, F, F, F) has probability
m(success)^2 m(failure)^3 = p^2 q^3.
Let X record the number of successes in n trials. Then for any whole number k ≤ n,
P(X = k) = b(n, p, k) = (n choose k) p^k q^(n−k).
The distribution function b(n, p, k) (n and p fixed) is called the binomial distribution function.
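The formula above translates directly into code. A minimal sketch using the standard-library binomial coefficient:

```python
from math import comb

def b(n, p, k):
    # Binomial distribution function: probability of exactly k successes
    # in n Bernoulli trials with success probability p.
    q = 1 - p
    return comb(n, k) * p**k * q**(n - k)

# Sanity check: the probabilities b(n, p, 0), ..., b(n, p, n) sum to 1.
print(sum(b(5, 0.3, k) for k in range(6)))  # approximately 1
```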
Example
We have a coin that lands heads with probability 1/5 and toss it 3 times. Then the probability of getting at least one head is
1 − P(X = 0) = 1 − b(3, 1/5, 0)
= 1 − (3 choose 0) (1/5)^0 (4/5)^3
= 1 − 1 · 1 · (64/125) = 61/125.
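The arithmetic can be verified exactly with rational numbers:

```python
from fractions import Fraction
from math import comb

p = Fraction(1, 5)
q = 1 - p  # 4/5
# P(at least one head in 3 tosses) = 1 - b(3, 1/5, 0)
prob = 1 - comb(3, 0) * p**0 * q**3
print(prob)  # 61/125
```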
Bernoulli Trials Processes and Distributions
For any Bernoulli trials process, we can record:
The number of successes in a fixed number of trials (binomial distribution)
The number of trials up to and including the first success (geometric distribution)
The number of trials up to and including the kth success, k fixed (negative binomial distribution)
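The three quantities above can all be simulated from the same stream of Bernoulli trials. A minimal sketch (the function names are illustrative, not from the text):

```python
import random

random.seed(1)

def trial(p):
    # One Bernoulli trial: True (success) with probability p.
    return random.random() < p

def num_successes(n, p):
    # Binomial: number of successes in a fixed number n of trials.
    return sum(trial(p) for _ in range(n))

def trials_until_first_success(p):
    # Geometric: trials up to and including the first success.
    t = 1
    while not trial(p):
        t += 1
    return t

def trials_until_kth_success(k, p):
    # Negative binomial: trials up to and including the k-th success.
    t = successes = 0
    while successes < k:
        t += 1
        successes += trial(p)
    return t

print(num_successes(10, 0.5),
      trials_until_first_success(0.5),
      trials_until_kth_success(3, 0.5))
```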
Next Time Please read Section 5.1 (you can skip the historical remarks). We will study another distribution associated with Bernoulli trials: the geometric distribution. Homework 6 is due next Monday. Homework 7 is due next Wednesday. The next quiz is this Friday.