Ch. 5 Joint Probability Distributions and Random Samples


5.1 Jointly Distributed Random Variables

In Chapters 3 and 4, we learned about probability distributions for a single random variable. However, it is often useful to have more than one random variable defined in a random experiment, especially if you are interested in the relationship between those variables. In this class, we will learn about joint probability distributions of only TWO random variables, despite the fact that the book covers joint distributions of more than two random variables.

Two Discrete Random Variables

Joint Probability Mass Function of Discrete Random Variables
The joint probability mass function of the discrete random variables X and Y, denoted p(x, y), satisfies the following properties:
1. p(x, y) ≥ 0
2. Σ_x Σ_y p(x, y) = 1
3. p(x, y) = P(X = x and Y = y)
[Note: P(X = x and Y = y) can also be denoted P(X = x, Y = y)]

Just as the probability mass function of a single random variable X is assumed to be zero at all values outside the range of X, so is the joint probability mass function of X and Y assumed to be zero at values for which a probability is not specified.

STA 3032 Ch. 5 Notes

Example 1
Determine the value of c that makes the function p(x, y) = c(x + y) a joint probability mass function over the nine points with x = 1, 2, 3 and y = 1, 2, 3.

It is easy to see that the first property of joint probability distributions is satisfied, because all values of x and y are positive: as long as c is positive, p(x, y) = c(x + y) will always be greater than or equal to 0. Now we need to ensure that the sum over the range of (X, Y) equals 1. Let R denote the range of (X, Y).
Σ_R p(x, y) = p(1,1) + p(1,2) + p(1,3) + p(2,1) + p(2,2) + p(2,3) + p(3,1) + p(3,2) + p(3,3)
            = 2c + 3c + 4c + 3c + 4c + 5c + 4c + 5c + 6c = 36c
Since we want the sum to equal 1, we set this result equal to 1 and solve for c:
36c = 1, so c = 1/36. Thus, p(x, y) = (x + y)/36.

Example 2
Using the joint pmf from Example 1, p(x, y) = (x + y)/36 for x = 1, 2, 3 and y = 1, 2, 3, find the following:

a) P(X = 1, Y < 4)
To find this probability, we must add up the probability of each pair (x, y) where x equals 1 and y is less than 4.
P(X = 1, Y < 4) = p(1,1) + p(1,2) + p(1,3) = (1/36)[(1 + 1) + (1 + 2) + (1 + 3)] = (1/36)(2 + 3 + 4) = 9/36 = 1/4
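A quick way to sanity-check arithmetic like this is to enumerate the nine support points in code. The sketch below uses exact fractions; the variable names (`points`, `p`) are my own, not anything from the textbook:

```python
from fractions import Fraction

# Nine support points of (X, Y) from Example 1.
points = [(x, y) for x in (1, 2, 3) for y in (1, 2, 3)]

# p(x, y) = c*(x + y) must sum to 1 over the nine points,
# so c = 1 / sum of (x + y).
c = Fraction(1, sum(x + y for x, y in points))

def p(x, y):
    """Joint pmf p(x, y) = (x + y)/36 on the nine points."""
    return c * (x + y)

print(c)   # 1/36

# P(X = 1, Y < 4) from Example 2a:
prob = sum(p(x, y) for x, y in points if x == 1 and y < 4)
print(prob)   # 1/4
```

Working in `Fraction`s keeps every intermediate value exact, so the printed results match the hand calculations with no rounding.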

Example 2 Continued

b) P(X = 1)
To find this probability, we need to sum the probability of each pair (x, y) where x equals 1.
P(X = 1) = p(1,1) + p(1,2) + p(1,3) = 1/4

c) P(Y = 2)
This time the value of Y will remain constant, so we need to sum the probabilities of each (x, y) pair where y equals 2.
P(Y = 2) = p(1,2) + p(2,2) + p(3,2) = (1/36)(3 + 4 + 5) = 12/36 = 1/3

d) P(X < 2, Y < 2)
To find this answer, we must add up the probability of each (x, y) pair where both x and y are less than 2. There is only one (x, y) pair that satisfies that requirement: (1, 1).
P(X < 2, Y < 2) = p(1,1) = (1/36)(1 + 1) = 2/36 = 1/18

Marginal Probability Mass Functions of Discrete Random Variables
The individual probability distribution of a random variable is referred to as its marginal probability distribution. The marginal probability distribution of a variable can be determined from the joint probability distribution of that variable and other random variables. For a discrete random variable X, you can find the marginal distribution of X, P(X = x), by summing the joint distribution P(X = x, Y = y) over all points in the range of (X, Y) where X = x.

If the joint probability mass function of the discrete random variables X and Y is p(x, y), the marginal probability mass function of X is
P(X = x) = p_X(x) = Σ_y p(x, y)

and the marginal probability mass function of Y is
P(Y = y) = p_Y(y) = Σ_x p(x, y)

Example 3
Using the joint pmf from Examples 1 and 2, p(x, y) = (x + y)/36 for x = 1, 2, 3 and y = 1, 2, 3, find the marginal probability distribution of the random variable X.

We need to calculate the marginal pmf for each value of X. We do this by keeping the value of X constant and summing the joint pmf over all possible values of Y.
p_X(1) = Σ_y p(1, y) = p(1,1) + p(1,2) + p(1,3) = (1/36)(2 + 3 + 4) = 9/36 = 1/4
p_X(2) = Σ_y p(2, y) = p(2,1) + p(2,2) + p(2,3) = (1/36)(3 + 4 + 5) = 12/36 = 1/3
p_X(3) = Σ_y p(3, y) = p(3,1) + p(3,2) + p(3,3) = (1/36)(4 + 5 + 6) = 15/36 = 5/12

Two Continuous Random Variables

Joint Probability Density Function of Continuous Random Variables
A joint probability density function for the continuous random variables X and Y, denoted f(x, y), satisfies the following properties:
1. f(x, y) ≥ 0 for all x, y
2. ∫∫ f(x, y) dx dy = 1
3. For any region A of two-dimensional space, P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy
In particular, if A is any two-dimensional rectangle {(x, y): a ≤ x ≤ b, c ≤ y ≤ d}, then
P[(X, Y) ∈ A] = P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫_a^b ∫_c^d f(x, y) dy dx

Typically, f(x, y) is defined over all of two-dimensional space by assuming that f(x, y) = 0 at all points where f(x, y) is not specified.

Example 4
Suppose f(x, y) = 10e^(-2x-3y) is a joint probability density function over the range x > 0 and 0 < y < x. Determine the following:

a) P(X < 1, Y < 2)
Since the range of X is x > 0, we will need to integrate from 0 to 1 for x. The range of Y is 0 < y < x, so we will integrate over the interval 0 to x for y.
P(X < 1, Y < 2) = ∫_0^1 ∫_0^x 10e^(-2x-3y) dy dx
= ∫_0^1 (10/3) e^(-2x) [-e^(-3y)]_(y=0)^(y=x) dx
= (10/3) ∫_0^1 e^(-2x) [1 - e^(-3x)] dx
= (10/3) ∫_0^1 (e^(-2x) - e^(-5x)) dx
= (10/3) [-(1/2)e^(-2x) + (1/5)e^(-5x)]_0^1
= (10/3) [-(1/2)e^(-2) + (1/5)e^(-5) + (1/2) - (1/5)]
= (1/3)(2e^(-5) - 5e^(-2) + 3) ≈ 0.7789

b) P(1 < X < 2)
P(1 < X < 2) = ∫_1^2 ∫_0^x 10e^(-2x-3y) dy dx
= (10/3) ∫_1^2 (e^(-2x) - e^(-5x)) dx
= (10/3) [-(1/2)e^(-2x) + (1/5)e^(-5x)]_1^2
= (1/3)(2e^(-10) - 5e^(-4) - 2e^(-5) + 5e^(-2)) ≈ 0.1906
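Double integrals like these are easy to check numerically by summing the density over a fine grid. This is a rough sketch (the `prob_rect` midpoint-rule helper is my own, not from the text) comparing a brute-force estimate of part a) with the exact answer:

```python
import math

# Joint pdf from Example 4, zero outside the triangular region 0 < y < x.
def joint_pdf(x, y):
    return 10.0 * math.exp(-2.0 * x - 3.0 * y) if 0.0 < y < x else 0.0

def prob_rect(x_hi, y_hi, n=800):
    """Midpoint-rule estimate of P(X < x_hi, Y < y_hi) on an n-by-n grid."""
    hx, hy = x_hi / n, y_hi / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * hx
        for j in range(n):
            total += joint_pdf(x, (j + 0.5) * hy)
    return total * hx * hy

# Exact closed form from part a):
exact_a = (2 * math.exp(-5) - 5 * math.exp(-2) + 3) / 3
print(round(exact_a, 4))            # 0.7789
print(round(prob_rect(1.0, 2.0), 2))  # grid estimate, close to 0.78
```

The grid estimate is only accurate to a couple of decimal places because the density jumps to zero along the line y = x, but it is a useful check that the region of integration was set up correctly.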

Example 4 Continued

c) P(Y > 3)
Since 0 < y < x, we will integrate with respect to y from 3 to x and integrate with respect to x from 3 to ∞.
P(Y > 3) = ∫_3^∞ ∫_3^x 10e^(-2x-3y) dy dx
= (10/3) ∫_3^∞ e^(-2x) [-e^(-3y)]_(y=3)^(y=x) dx
= (10/3) ∫_3^∞ e^(-2x) [e^(-9) - e^(-3x)] dx
= (10/3) ∫_3^∞ (e^(-9) e^(-2x) - e^(-5x)) dx
= (10/3) [-(1/2) e^(-9) e^(-2x) + (1/5) e^(-5x)]_3^∞
= (10/3) [(1/2) e^(-9) e^(-6) - (1/5) e^(-15)]
= (1/3)(5e^(-15) - 2e^(-15)) = e^(-15) ≈ 0.000000306

d) P(X < 2, Y < 2)
Since x < 2 already forces y < x < 2, this is the same triangular region as before with x running from 0 to 2.
P(X < 2, Y < 2) = ∫_0^2 ∫_0^x 10e^(-2x-3y) dy dx
= (10/3) ∫_0^2 (e^(-2x) - e^(-5x)) dx
= (10/3) [-(1/2)e^(-2x) + (1/5)e^(-5x)]_0^2
= (1/3)(2e^(-10) - 5e^(-4) + 3) ≈ 0.9695

Marginal Probability Density Functions of Continuous Random Variables
If the joint probability density function of the continuous random variables X and Y is f(x, y), the marginal probability density functions of X and Y are
f_X(x) = ∫ f(x, y) dy   and   f_Y(y) = ∫ f(x, y) dx
where the first integral is over all points in the range of (X, Y) for which X = x, and the second integral is over all points in the range of (X, Y) for which Y = y.

Example 5
Using the joint pdf from Example 4, f(x, y) = 10e^(-2x-3y) for x > 0 and 0 < y < x, find the marginal probability distribution of X.

Just like with discrete joint distributions, we need to keep x constant. However, this time we will integrate the pdf with respect to y over the entire range of Y instead of summing like we did for discrete distributions.
f_X(x) = ∫_0^x 10e^(-2x-3y) dy = 10e^(-2x) [-(1/3)e^(-3y)]_0^x = (10/3) e^(-2x) (1 - e^(-3x)) = (10/3)(e^(-2x) - e^(-5x)) for x > 0

Independent Random Variables
In some random experiments, knowledge of the values of X does not change any of the probabilities associated with the values of Y. Two random variables X and Y are independent if, for every pair (x, y),
p(x, y) = p_X(x) · p_Y(y) for discrete X and Y
or
f(x, y) = f_X(x) · f_Y(y) for continuous X and Y

Example 6
Using the joint discrete pmf from Examples 1-3, p(x, y) = (x + y)/36 for x = 1, 2, 3 and y = 1, 2, 3, determine whether X and Y are independent.

In Example 3, we found the marginal distribution of X, but we also need the marginal distribution of Y to determine independence.
p_Y(1) = Σ_x p(x, 1) = p(1,1) + p(2,1) + p(3,1) = (1/36)(2 + 3 + 4) = 9/36 = 1/4

Example 6 Continued
p_Y(2) = Σ_x p(x, 2) = p(1,2) + p(2,2) + p(3,2) = (1/36)(3 + 4 + 5) = 12/36 = 1/3
p_Y(3) = Σ_x p(x, 3) = p(1,3) + p(2,3) + p(3,3) = (1/36)(4 + 5 + 6) = 15/36 = 5/12

Recall that p_X(1) = 1/4, p_X(2) = 1/3, and p_X(3) = 5/12.

In order for X and Y to be independent, p(x, y) = p_X(x) · p_Y(y) must hold for all values of X and Y. But
p(1,1) = 2/36 = 1/18, while p_X(1) · p_Y(1) = (1/4)(1/4) = 1/16 ≠ 1/18
Since the statement does not hold when X = 1 and Y = 1, they are not independent. X and Y are dependent.

More Than Two Random Variables
Please skip this section.

Conditional Distributions
When two variables are defined in a random experiment, knowledge of one can change the probabilities that we associate with the values of the other.

Conditional Probability Mass Function
Recall that the definition of conditional probability for events A and B is P(B | A) = P(A ∩ B)/P(A). This definition can be applied with the event A defined to be X = x and the event B defined to be Y = y. Thus, for the discrete random variables X and Y with joint pmf p(x, y), the conditional probability of Y = y given that X = x is
P(Y = y | X = x) = p_(Y|X)(y | x) = P(X = x, Y = y)/P(X = x) = p(x, y)/p_X(x)
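The independence check in Example 6 amounts to comparing p(x, y) with p_X(x)·p_Y(y) at every support point. A minimal sketch in code (the dictionary layout and names are my own):

```python
from fractions import Fraction

# Joint pmf from Examples 1-3: p(x, y) = (x + y)/36.
support = [(x, y) for x in (1, 2, 3) for y in (1, 2, 3)]
p = {(x, y): Fraction(x + y, 36) for x, y in support}

# Marginals, obtained by summing over the other variable.
p_X = {x: sum(p[(x, y)] for y in (1, 2, 3)) for x in (1, 2, 3)}
p_Y = {y: sum(p[(x, y)] for x in (1, 2, 3)) for y in (1, 2, 3)}

print(p_X[1], p_X[2], p_X[3])   # 1/4 1/3 5/12

# Independent only if p(x, y) == p_X(x) * p_Y(y) at every point.
independent = all(p[(x, y)] == p_X[x] * p_Y[y] for x, y in support)
print(independent)                 # False (fails already at x = y = 1)
print(p[(1, 1)], p_X[1] * p_Y[1])  # 1/18 1/16
```

One failing point is enough to conclude dependence, but the `all(...)` check confirms there is no point where the product rule holds by coincidence alone.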

Example 7
Using the joint pmf p(x, y) = (x + y)/36 for x = 1, 2, 3 and y = 1, 2, 3, find the following:

a) Conditional probability distribution of Y given that X = 1
Using the marginal distribution of X found in Example 3,
p_(Y|X)(1 | 1) = p(1,1)/p_X(1) = (2/36)/(1/4) = 8/36 = 2/9
p_(Y|X)(2 | 1) = p(1,2)/p_X(1) = (3/36)/(1/4) = 12/36 = 1/3
p_(Y|X)(3 | 1) = p(1,3)/p_X(1) = (4/36)/(1/4) = 16/36 = 4/9

b) Conditional probability distribution of X given that Y = 2
Using the marginal distribution of Y found in Example 6,
p_(X|Y)(1 | 2) = p(1,2)/p_Y(2) = (3/36)/(1/3) = 9/36 = 1/4
p_(X|Y)(2 | 2) = p(2,2)/p_Y(2) = (4/36)/(1/3) = 12/36 = 1/3
p_(X|Y)(3 | 2) = p(3,2)/p_Y(2) = (5/36)/(1/3) = 15/36 = 5/12

Conditional Probability Density Function
Given continuous random variables X and Y with joint probability density function f(x, y), the conditional probability density function of Y given X = x is
f_(Y|X)(y | x) = f(x, y)/f_X(x)   for f_X(x) > 0
The conditional probability density function provides the conditional probabilities for the values of Y given X = x.
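The conditional pmf computations in Example 7 follow a single recipe: divide the joint pmf by the appropriate marginal. A short sketch (helper names are mine), which also confirms that each conditional distribution sums to 1:

```python
from fractions import Fraction

# Joint pmf and marginal of X for Example 7.
p = {(x, y): Fraction(x + y, 36) for x in (1, 2, 3) for y in (1, 2, 3)}
p_X = {x: sum(p[(x, y)] for y in (1, 2, 3)) for x in (1, 2, 3)}

def cond_Y_given_X(y, x):
    """p_{Y|X}(y | x) = p(x, y) / p_X(x)."""
    return p[(x, y)] / p_X[x]

dist = [cond_Y_given_X(y, 1) for y in (1, 2, 3)]
print(dist)        # [Fraction(2, 9), Fraction(1, 3), Fraction(4, 9)]
print(sum(dist))   # 1 -- a conditional pmf must sum to 1
```

The same function with the roles of x and y swapped reproduces part b).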

Because the conditional probability density function f_(Y|X)(y | x) is a probability density function for all y in the range of Y, the following properties are satisfied:
1) f_(Y|X)(y | x) ≥ 0
2) ∫ f_(Y|X)(y | x) dy = 1
3) P(Y ∈ B | X = x) = ∫_B f_(Y|X)(y | x) dy for any set B in the range of Y

Example 8
Using the joint pdf from Examples 4 and 5, f(x, y) = 10e^(-2x-3y) for x > 0 and 0 < y < x, find the following:

a) Conditional probability distribution of Y given X = 1
Recall that, in Example 5, we found f_X(x) = (10/3)(e^(-2x) - e^(-5x)).
f_(Y|X)(y | 1) = f(1, y)/f_X(1) = 10e^(-2-3y) / [(10/3)(e^(-2) - e^(-5))] = e^(-3y) / [(1/3)(1 - e^(-3))] ≈ 3.157e^(-3y)   for 0 < y < 1

b) Conditional probability distribution of X given Y = 2
First we need to find the marginal distribution of Y.
f_Y(y) = ∫_y^∞ 10e^(-2x-3y) dx = 10e^(-3y) [-(1/2)e^(-2x)]_(x=y)^(x=∞) = 10e^(-3y) (0 + (1/2)e^(-2y)) = 5e^(-5y)   for y > 0
f_(X|Y)(x | 2) = f(x, 2)/f_Y(2) = 10e^(-2x-6) / (5e^(-10)) = 2e^(4) e^(-2x) = 2e^(4-2x)   for x > 2
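Property 2 above gives a quick way to check the answer to part a): the conditional density must integrate to 1 over 0 < y < 1. A hedged numerical check (the midpoint-rule `integrate` helper is my own):

```python
import math

# Conditional density from Example 8a:
# f_{Y|X}(y | 1) = 3*exp(-3y) / (1 - exp(-3)) on 0 < y < 1.
def f_cond(y):
    return 3.0 * math.exp(-3.0 * y) / (1.0 - math.exp(-3.0))

def integrate(f, a, b, n=100000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

print(round(f_cond(0.0), 3))                  # 3.157, the constant above
print(round(integrate(f_cond, 0.0, 1.0), 6))  # 1.0
```

The integral coming out to 1 confirms that dividing by f_X(1) normalized the slice of the joint density correctly.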

5.2 Expected Values, Covariance, and Correlation

Mean and Variance of Discrete Random Variables
Recall that for a single discrete random variable X with pmf p(x), E(X) = Σ x·p(x). Similarly, for a discrete random variable X with a joint probability mass function p(x, y), the expected value of X can be determined by the following:
E(X) = μ_X = Σ_R x·p(x, y)
where R is the range of (X, Y).

Recall that for a single discrete random variable X with pmf p(x), σ² = Σ (x - μ)²·p(x). Similarly, for a discrete random variable X with a joint probability mass function p(x, y), the variance of X can be determined by the following:
σ_X² = Σ_R (x - μ_X)²·p(x, y)
where R is the range of (X, Y).

Similarly, the formulas for the expected value and variance of Y are as follows:
E(Y) = μ_Y = Σ_R y·p(x, y)
σ_Y² = Σ_R (y - μ_Y)²·p(x, y)

Example 9
Using the joint pmf p(x, y) = (x + y)/36 for x = 1, 2, 3 and y = 1, 2, 3, find the expected value and variance of X.
E(X) = 1·p(1,1) + 1·p(1,2) + 1·p(1,3) + 2·p(2,1) + 2·p(2,2) + 2·p(2,3) + 3·p(3,1) + 3·p(3,2) + 3·p(3,3)
= 1(2/36) + 1(3/36) + 1(4/36) + 2(3/36) + 2(4/36) + 2(5/36) + 3(4/36) + 3(5/36) + 3(6/36)
= (9 + 24 + 45)/36 = 78/36 = 13/6 ≈ 2.17

Example 9 Continued
σ_X² = (1 - 13/6)²·p(1,1) + (1 - 13/6)²·p(1,2) + (1 - 13/6)²·p(1,3) + (2 - 13/6)²·p(2,1) + (2 - 13/6)²·p(2,2) + (2 - 13/6)²·p(2,3) + (3 - 13/6)²·p(3,1) + (3 - 13/6)²·p(3,2) + (3 - 13/6)²·p(3,3)
= (49/36)(9/36) + (1/36)(12/36) + (25/36)(15/36)
= (441 + 12 + 375)/1296 = 828/1296 = 23/36 ≈ 0.639

Mean and Variance of Continuous Random Variables
For a continuous random variable X with a joint probability density function f(x, y), the expected value and variance of X can be determined by the following:
E(X) = μ_X = ∫∫_R x·f(x, y) dx dy
σ_X² = ∫∫_R (x - μ_X)²·f(x, y) dx dy
where R is the range of (X, Y).

Similarly, the formulas for the expected value and variance of Y are as follows:
E(Y) = μ_Y = ∫∫_R y·f(x, y) dx dy
σ_Y² = ∫∫_R (y - μ_Y)²·f(x, y) dx dy
where R is the range of (X, Y).
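Example 9's nine-term sums are tedious by hand but trivial in code. A minimal sketch with exact fractions (variable names are mine):

```python
from fractions import Fraction

# Joint pmf from Example 9: p(x, y) = (x + y)/36.
support = [(x, y) for x in (1, 2, 3) for y in (1, 2, 3)]
p = {(x, y): Fraction(x + y, 36) for x, y in support}

# E(X) and Var(X) as sums over the whole range of (X, Y).
mu_X = sum(x * p[(x, y)] for x, y in support)
var_X = sum((x - mu_X) ** 2 * p[(x, y)] for x, y in support)

print(mu_X)          # 13/6
print(var_X)         # 23/36
print(float(var_X))  # ≈ 0.639
```

The same two sums with `x` replaced by `y` give E(Y) and Var(Y); by the symmetry of p(x, y) = (x + y)/36 they come out identical.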

Expected Value of a Function of Two Random Variables
Let X and Y be random variables and h(X, Y) be a function of those random variables.
E[h(X, Y)] = Σ_x Σ_y h(x, y)·p(x, y)   for discrete X and Y
E[h(X, Y)] = ∫∫ h(x, y)·f(x, y) dx dy   for continuous X and Y

Covariance
In experiments involving two or more random variables, we often are interested in measuring the relationship between the variables. A common measurement to describe the variation between two variables is covariance. The covariance between the random variables X and Y, denoted Cov(X, Y) or σ_XY, is
Cov(X, Y) = E[(X - μ_X)(Y - μ_Y)] = E(XY) - μ_X·μ_Y
If the covariance is negative, then X and Y have an inverse relationship, i.e. as X increases, Y decreases, or vice versa. However, if Y increases as X increases, their covariance will be positive. Thus, covariance is a measure of the linear relationship between the random variables. If the relationship between the variables is nonlinear, the covariance might not detect the relationship.

Correlation
Another measure of the relationship between two random variables is correlation. Correlation is often easier to interpret than covariance because it is scaled by the standard deviation of each variable. This allows the correlation of one pair of variables to be compared to the correlation of a different pair of variables. The correlation between random variables X and Y, denoted ρ_XY, is
ρ_XY = Cov(X, Y)/√[V(X)·V(Y)] = Cov(X, Y)/(σ_X·σ_Y)
where -1 ≤ ρ_XY ≤ 1.

Please read the two propositions on pg. 210 of the textbook.

Example 10
Determine the covariance and correlation for the following joint probability distribution:

x        -1     -0.5    0.5     1
y        -2     -1      1       2
p(x, y)  1/8    1/4     1/2     1/8

Before we can calculate the covariance, we need to find the expected values of X and Y.
μ_X = Σ_R x·p(x, y) = (-1)(1/8) + (-0.5)(1/4) + (0.5)(1/2) + (1)(1/8) = -0.125 - 0.125 + 0.25 + 0.125 = 1/8 = 0.125
μ_Y = Σ_R y·p(x, y) = (-2)(1/8) + (-1)(1/4) + (1)(1/2) + (2)(1/8) = -0.25 - 0.25 + 0.5 + 0.25 = 1/4 = 0.25

Now we need to find the expected value of XY.
E(XY) = Σ_R x·y·p(x, y) = (-1)(-2)(1/8) + (-0.5)(-1)(1/4) + (0.5)(1)(1/2) + (1)(2)(1/8) = 0.25 + 0.125 + 0.25 + 0.25 = 7/8 = 0.875

We have all the values necessary to find the covariance.
Cov(X, Y) = E(XY) - μ_X·μ_Y = 0.875 - (0.125)(0.25) = 0.875 - 0.03125 = 0.84375

Before we can find the correlation, we need to calculate the variances of X and Y.
σ_X² = Σ_R x²·p(x, y) - μ_X² = (-1)²(1/8) + (-0.5)²(1/4) + (0.5)²(1/2) + (1)²(1/8) - (0.125)² = 0.4375 - 0.015625 = 0.421875
σ_Y² = Σ_R y²·p(x, y) - μ_Y² = (-2)²(1/8) + (-1)²(1/4) + (1)²(1/2) + (2)²(1/8) - (0.25)² = 1.75 - 0.0625 = 1.6875

Now we can find the correlation.
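All of the running arithmetic above can be reproduced with a few lines of code; this is a sketch with my own variable names, storing the four support points as (x, y, probability) triples:

```python
import math

# The four support points and probabilities from Example 10's table.
pts = [(-1.0, -2.0, 0.125), (-0.5, -1.0, 0.25),
       (0.5, 1.0, 0.5), (1.0, 2.0, 0.125)]

mu_x = sum(x * p for x, y, p in pts)
mu_y = sum(y * p for x, y, p in pts)
e_xy = sum(x * y * p for x, y, p in pts)
cov = e_xy - mu_x * mu_y
var_x = sum(x * x * p for x, y, p in pts) - mu_x ** 2
var_y = sum(y * y * p for x, y, p in pts) - mu_y ** 2
rho = cov / math.sqrt(var_x * var_y)

print(mu_x, mu_y)   # 0.125 0.25
print(cov)          # 0.84375
print(rho)          # 1.0 -- y = 2x at every point, a perfect linear relationship
```

Every value here is a dyadic fraction, so the floating-point results are exact rather than approximations.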

Example 10 Continued
ρ_XY = Cov(X, Y)/(σ_X·σ_Y) = 0.84375/(√0.421875 · √1.6875) = 0.84375/0.84375 = 1
Thus, X and Y are perfectly correlated and have a positive linear relationship.

5.3 Statistics and Their Distributions
Please skip this section.

5.4 The Distribution of the Sample Mean
The assumption that distributions and population parameters are known is unrealistic. In statistics, we use sample data to estimate distributions and parameters.

Definitions
1. parameter - a number that describes the population (e.g. μ, σ², σ, p (binomial))
2. statistic - a function of the observations in a random sample; a number that describes the sample (e.g. x̄, s², s)
3. population distribution - the distribution from which we select our sample; its parameters are usually unknown
4. sample distribution - the distribution of the data we actually observe; we describe the data with statistics, such as the sample mean (x̄); the larger the sample size (n), the closer the sample represents the population
5. sampling distribution - a probability distribution of a sample statistic, such as X̄, when samples of size n are taken repeatedly; describes the variability of the statistic; generated by repeating a sampling experiment a very large number of times
6. point estimate - a single number calculated from sample data, used to estimate a parameter of the population
7. point estimator - a formula or rule that is used to calculate the point estimate for a particular set of data (i.e. a sample statistic)

Common Point Estimators

Population Parameter       Point Estimator (Sample Statistic)
Mean μ                     x̄
Variance σ²                s²
Standard Deviation σ       s
Binomial Proportion p      p̂ (read as "p-hat")

We have many different choices for the point estimator of a parameter. For example, if we wish to estimate the population mean, we might consider the sample mean or the sample median as point estimators. In order to decide which point estimator of a particular parameter is the best one to use, we need to examine their statistical properties and develop some criteria for comparing estimators.

The Case of a Normal Population Distribution
Please read this section in the book.

Central Limit Theorem (CLT)
Consider a random sample of n observations selected from a population of any shape with mean μ and standard deviation σ. Then, when n is sufficiently large (n ≥ 30), the sampling distribution of X̄ will be approximately normal with mean μ_X̄ and standard deviation σ_X̄ given by
μ_X̄ = μ    and    σ_X̄ = σ/√n
Note: If the population is normal, X̄ is normal for any size n.

Example 11
A sample of 4 observations is taken from a normal distribution with μ = 50 and σ = 6. Find the sampling distribution of X̄.

The CLT gives us three properties we must find to determine the sampling distribution of X̄:
1. Is the distribution normal? (Note: This will be true if the population is normal or if the sample size is greater than or equal to 30.)
2. μ_X̄ = μ
3. σ_X̄ = σ/√n
Let's find each of the properties.

Example 11 Continued
For the first property, we know that the distribution is normal because the sample is taken from a normal distribution. Notice that if the population were not normal, the sampling distribution would not be normal because the sample size is smaller than 30.

The mean of the sampling distribution of X̄ is equal to the mean of the population.
μ_X̄ = μ = 50

Finally, the standard deviation of the sampling distribution of X̄ is the population standard deviation divided by the square root of the sample size.
σ_X̄ = σ/√n = 6/√4 = 3

Note: I provided extra explanation in this example since it is the first problem of this type that we covered. The remaining examples will have shorter answers that are similar to what I expect you to provide on your assignments.

Example 12
A sample of 100 observations is taken from a Binomial distribution with n = 25 and p = 0.8. Determine the sampling distribution of the sample mean.

Note: I will denote the sample size as N to distinguish it from the number of trials, n. Since this sample is from a Binomial distribution, we will need to use some formulas from Ch. 3 to find the mean and standard deviation.
1. Normal? Yes, because N = 100 ≥ 30.
2. μ_X̄ = μ = np = 25(0.8) = 20
3. σ_X̄ = σ/√N = √(np(1 - p))/√N = √(25(0.8)(0.2))/√100 = 2/10 = 0.2
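Example 12 can be checked by simulation: repeatedly draw N = 100 binomial observations, record the sample mean each time, and see that the means cluster around μ = 20 with spread σ/√N = 0.2. A sketch under those assumptions (the simulation code is mine, not from the notes):

```python
import math
import random

random.seed(0)
n, p, N = 25, 0.8, 100          # binomial trials, success prob, sample size

mu = n * p                       # population mean = 20
sigma = math.sqrt(n * p * (1 - p))   # population sd = 2
print(mu, round(sigma / math.sqrt(N), 3))   # 20.0 0.2  (CLT values for X-bar)

def binomial():
    """One Binomial(25, 0.8) draw as a sum of Bernoulli trials."""
    return sum(random.random() < p for _ in range(n))

# 500 repetitions of the experiment "take N observations and average them".
means = [sum(binomial() for _ in range(N)) / N for _ in range(500)]
avg = sum(means) / len(means)
sd = math.sqrt(sum((m - avg) ** 2 for m in means) / len(means))
print(round(avg, 2), round(sd, 2))   # close to 20 and 0.2
```

With only 500 repetitions the simulated values wobble a little, but they should land near the theoretical 20 and 0.2.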

Example 13
A sample of 2 observations is taken from a uniform distribution over the interval 0 to 1. What is the sampling distribution of the sample mean?

Note: Since the sample is from a uniform distribution, we will need to use some formulas from Ch. 4 to find the mean and standard deviation.
1. Normal? No, because n = 2 < 30.
2. μ_X̄ = μ = (a + b)/2 = (0 + 1)/2 = 0.5
3. σ_X̄ = σ/√n = [(b - a)/√12]/√n = (1/√12)/√2 ≈ 0.2041

IMPORTANT: Notice in Example 13 that μ_X̄ = μ and σ_X̄ = σ/√n still hold true even though the sampling distribution of X̄ is NOT normal.

Finding P(X̄ ≤ x̄)
If X̄ is normal, use the z-table with the following z-score:
z = (x̄ - μ_X̄)/σ_X̄ = (x̄ - μ)/(σ/√n)

Example 14
Let X = ages of Sarasota residents, where μ is claimed to be 60 and σ = 16. Suppose we have a random sample of 64 residents. Find P(X̄ < 55).

First, we must find the sampling distribution of X̄. We know the distribution is normal because n = 64 > 30. Thus, we can use the z-table to find the probability. We need to calculate the z-score of 55.
z = (55 - 60)/(16/√64) = -5/2 = -2.5

Example 14 Continued
Therefore, P(X̄ < 55) = P(Z < -2.5). Now, draw the region that we're interested in and label it with the probabilities from the z-table.
P(X̄ < 55) = P(Z < -2.5) = 0.0062

Example 15
Suppose test scores are approximately normal with a mean of 72 and a standard deviation of 12. If 16 scores are randomly selected, what is the probability that the average score is higher than 70?

In this case, the sample size is less than 30, but since we are sampling from a normal distribution, the sampling distribution of X̄ is normal.
P(X̄ > 70) = P[(X̄ - μ_X̄)/σ_X̄ > (70 - 72)/(12/√16)] = P(Z > -2/3) = P(Z > -0.67)
Now, draw the region we're interested in and label it with the probabilities from the z-table.
P(X̄ > 70) = P(Z > -0.67) = 1 - P(Z < -0.67) = 1 - 0.2514 = 0.7486
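The z-table lookups in Examples 14 and 15 can be reproduced in code with the standard normal cdf, which is expressible through `math.erf` (the `phi` helper name is mine):

```python
import math

def phi(z):
    """Standard normal cdf: P(Z < z)."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Example 14: P(X-bar < 55) with mu = 60, sigma = 16, n = 64.
z14 = (55 - 60) / (16 / math.sqrt(64))
print(z14, round(phi(z14), 4))                 # -2.5 0.0062

# Example 15: P(X-bar > 70) with mu = 72, sigma = 12, n = 16.
z15 = (70 - 72) / (12 / math.sqrt(16))
print(round(z15, 2), round(1 - phi(z15), 4))   # -0.67 0.7475
```

Note the tiny discrepancy in Example 15: using the exact z = -2/3 gives 0.7475, while rounding to the two-decimal z-table value -0.67 first gives the 0.7486 shown above. Either is acceptable on assignments; the difference comes only from rounding the z-score.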

IMPORTANT: Notice that Example 15 asked for the probability of the average, so we needed to find the sampling distribution of X̄. If only 1 score had been randomly selected and we were asked for the probability that the score was less than or greater than some value, we would have used the z-score we learned about in Ch. 4.

Example 16
The distribution of violent crimes per day in a certain city possesses a mean of 1.3 and a standard deviation of 1.7. A random sample of 50 days is observed, and the daily mean number of crimes is found to be 0.9 with a standard deviation of 1.4. Find the sampling distribution of the sample mean.

In this case, we are given several bits of information in the problem, including information we don't need to find the sampling distribution of X̄. The first sentence describes the distribution of the population, because there is no mention of a sample being taken or a sample size. Thus, the mean and standard deviation presented in this sentence are for the population.
μ = 1.3    σ = 1.7
The second sentence describes a sample that was taken. The mean and standard deviation given in this sentence were found from the sample. Thus, these values are the sample mean and sample standard deviation.
x̄ = 0.9    s = 1.4
Now that we've identified all the values given in the problem, we can find the sampling distribution.
1. Normal? Yes, because n = 50 ≥ 30.
2. μ_X̄ = μ = 1.3
3. σ_X̄ = σ/√n = 1.7/√50 ≈ 0.24

Other Applications of the Central Limit Theorem
Please skip this section.

5.5 The Distribution of a Linear Combination
Please skip this section.

Notes 12 Autumn 2005

Notes 12 Autumn 2005 MAS 08 Probability I Notes Autumn 005 Conditional random variables Remember that the conditional probability of event A given event B is P(A B) P(A B)/P(B). Suppose that X is a discrete random variable.

More information

1 x 2 and 1 y 2. 0 otherwise. c) Estimate by eye the value of x for which F(x) = x + y 0 x 1 and 0 y 1. 0 otherwise

1 x 2 and 1 y 2. 0 otherwise. c) Estimate by eye the value of x for which F(x) = x + y 0 x 1 and 0 y 1. 0 otherwise Eample 5 EX: Which of the following joint density functions have (correlation) ρ XY = 0? (Remember that ρ XY = 0 is possible with symmetry even if X and Y are not independent.) a) b) f (, y) = π ( ) 2

More information

Joint probability distributions: Discrete Variables. Two Discrete Random Variables. Example 1. Example 1

Joint probability distributions: Discrete Variables. Two Discrete Random Variables. Example 1. Example 1 Joint probability distributions: Discrete Variables Two Discrete Random Variables Probability mass function (pmf) of a single discrete random variable X specifies how much probability mass is placed on

More information

Joint Probability Distributions, Correlations

Joint Probability Distributions, Correlations Joint Probability Distributions, Correlations What we learned so far Events: Working with events as sets: union, intersection, etc. Some events are simple: Head vs Tails, Cancer vs Healthy Some are more

More information

STAT 430/510: Lecture 16

STAT 430/510: Lecture 16 STAT 430/510: Lecture 16 James Piette June 24, 2010 Updates HW4 is up on my website. It is due next Mon. (June 28th). Starting today back at section 6.7 and will begin Ch. 7. Joint Distribution of Functions

More information

Lecture 16 : Independence, Covariance and Correlation of Discrete Random Variables

Lecture 16 : Independence, Covariance and Correlation of Discrete Random Variables Lecture 6 : Independence, Covariance and Correlation of Discrete Random Variables 0/ 3 Definition Two discrete random variables X and Y defined on the same sample space are said to be independent if for

More information

Jointly Distributed Random Variables

Jointly Distributed Random Variables Jointly Distributed Random Variables CE 311S What if there is more than one random variable we are interested in? How should you invest the extra money from your summer internship? To simplify matters,

More information

Joint Distribution of Two or More Random Variables

Joint Distribution of Two or More Random Variables Joint Distribution of Two or More Random Variables Sometimes more than one measurement in the form of random variable is taken on each member of the sample space. In cases like this there will be a few

More information

Chapter 4 Multiple Random Variables

Chapter 4 Multiple Random Variables Chapter 4 Multiple Random Variables Chapter 41 Joint and Marginal Distributions Definition 411: An n -dimensional random vector is a function from a sample space S into Euclidean space n R, n -dimensional

More information

Joint Probability Distributions, Correlations

Joint Probability Distributions, Correlations Joint Probability Distributions, Correlations What we learned so far Events: Working with events as sets: union, intersection, etc. Some events are simple: Head vs Tails, Cancer vs Healthy Some are more

More information

Covariance and Correlation

Covariance and Correlation Covariance and Correlation ST 370 The probability distribution of a random variable gives complete information about its behavior, but its mean and variance are useful summaries. Similarly, the joint probability

More information

EXAM # 3 PLEASE SHOW ALL WORK!

EXAM # 3 PLEASE SHOW ALL WORK! Stat 311, Summer 2018 Name EXAM # 3 PLEASE SHOW ALL WORK! Problem Points Grade 1 30 2 20 3 20 4 30 Total 100 1. A socioeconomic study analyzes two discrete random variables in a certain population of households

More information

Notes for Math 324, Part 19

Notes for Math 324, Part 19 48 Notes for Math 324, Part 9 Chapter 9 Multivariate distributions, covariance Often, we need to consider several random variables at the same time. We have a sample space S and r.v. s X, Y,..., which

More information

STAT Chapter 5 Continuous Distributions

STAT Chapter 5 Continuous Distributions STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range

More information

Final Exam # 3. Sta 230: Probability. December 16, 2012

Final Exam # 3. Sta 230: Probability. December 16, 2012 Final Exam # 3 Sta 230: Probability December 16, 2012 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use the extra sheets

More information

Bivariate distributions

Bivariate distributions Bivariate distributions 3 th October 017 lecture based on Hogg Tanis Zimmerman: Probability and Statistical Inference (9th ed.) Bivariate Distributions of the Discrete Type The Correlation Coefficient

More information

Problem Y is an exponential random variable with parameter λ = 0.2. Given the event A = {Y < 2},

Problem Y is an exponential random variable with parameter λ = 0.2. Given the event A = {Y < 2}, ECE32 Spring 25 HW Solutions April 6, 25 Solutions to HW Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in italics where

More information

Stochastic processes Lecture 1: Multiple Random Variables Ch. 5

Stochastic processes Lecture 1: Multiple Random Variables Ch. 5 Stochastic processes Lecture : Multiple Random Variables Ch. 5 Dr. Ir. Richard C. Hendriks 26/04/8 Delft University of Technology Challenge the future Organization Plenary Lectures Book: R.D. Yates and

More information

3 Multiple Discrete Random Variables

3 Multiple Discrete Random Variables 3 Multiple Discrete Random Variables 3.1 Joint densities Suppose we have a probability space (Ω, F,P) and now we have two discrete random variables X and Y on it. They have probability mass functions f

More information

Perhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows.

Perhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows. Chapter 5 Two Random Variables In a practical engineering problem, there is almost always causal relationship between different events. Some relationships are determined by physical laws, e.g., voltage

More information

Ch. 7 Statistical Intervals Based on a Single Sample

Ch. 7 Statistical Intervals Based on a Single Sample Ch. 7 Statistical Intervals Based on a Single Sample Before discussing the topics in Ch. 7, we need to cover one important concept from Ch. 6. Standard error The standard error is the standard deviation

More information

Practice Examination # 3

Practice Examination # 3 Practice Examination # 3 Sta 23: Probability December 13, 212 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use a single

More information

(y 1, y 2 ) = 12 y3 1e y 1 y 2 /2, y 1 > 0, y 2 > 0 0, otherwise.

(y 1, y 2 ) = 12 y3 1e y 1 y 2 /2, y 1 > 0, y 2 > 0 0, otherwise. 54 We are given the marginal pdfs of Y and Y You should note that Y gamma(4, Y exponential( E(Y = 4, V (Y = 4, E(Y =, and V (Y = 4 (a With U = Y Y, we have E(U = E(Y Y = E(Y E(Y = 4 = (b Because Y and

More information

Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators.

Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators. IE 230 Seat # Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators. Score Exam #3a, Spring 2002 Schmeiser Closed book and notes. 60 minutes. 1. True or false. (for each,

More information

Joint Probability Distributions and Random Samples (Devore Chapter Five)

Joint Probability Distributions and Random Samples (Devore Chapter Five) Joint Probability Distributions and Random Samples (Devore Chapter Five) 1016-345-01: Probability and Statistics for Engineers Spring 2013 Contents 1 Joint Probability Distributions 2 1.1 Two Discrete

More information

Bivariate Distributions

Bivariate Distributions STAT/MATH 395 A - PROBABILITY II UW Winter Quarter 17 Néhémy Lim Bivariate Distributions 1 Distributions of Two Random Variables Definition 1.1. Let X and Y be two rrvs on probability space (Ω, A, P).

More information

Math Review Sheet, Fall 2008

Math Review Sheet, Fall 2008 1 Descriptive Statistics Math 3070-5 Review Sheet, Fall 2008 First we need to know about the relationship among Population Samples Objects The distribution of the population can be given in one of the

More information

CHAPTER 5. Jointly Probability Mass Function for Two Discrete Distributed Random Variables:

CHAPTER 5. Jointly Probability Mass Function for Two Discrete Distributed Random Variables: CHAPTER 5 Jointl Distributed Random Variable There are some situations that experiment contains more than one variable and researcher interested in to stud joint behavior of several variables at the same

More information

MATH 151, FINAL EXAM Winter Quarter, 21 March, 2014

MATH 151, FINAL EXAM Winter Quarter, 21 March, 2014 Time: 3 hours, 8:3-11:3 Instructions: MATH 151, FINAL EXAM Winter Quarter, 21 March, 214 (1) Write your name in blue-book provided and sign that you agree to abide by the honor code. (2) The exam consists

More information

Notes for Math 324, Part 20

Notes for Math 324, Part 20 7 Notes for Math 34, Part Chapter Conditional epectations, variances, etc.. Conditional probability Given two events, the conditional probability of A given B is defined by P[A B] = P[A B]. P[B] P[A B]

More information

ENGG2430A-Homework 2

ENGG2430A-Homework 2 ENGG3A-Homework Due on Feb 9th,. Independence vs correlation a For each of the following cases, compute the marginal pmfs from the joint pmfs. Explain whether the random variables X and Y are independent,

More information

Class 26: review for final exam 18.05, Spring 2014

Class 26: review for final exam 18.05, Spring 2014 Probability Class 26: review for final eam 8.05, Spring 204 Counting Sets Inclusion-eclusion principle Rule of product (multiplication rule) Permutation and combinations Basics Outcome, sample space, event

More information

Probability and Statistics Notes

Probability and Statistics Notes Probability and Statistics Notes Chapter Five Jesse Crawford Department of Mathematics Tarleton State University Spring 2011 (Tarleton State University) Chapter Five Notes Spring 2011 1 / 37 Outline 1

More information

STAT/MATH 395 PROBABILITY II

STAT/MATH 395 PROBABILITY II STAT/MATH 395 PROBABILITY II Bivariate Distributions Néhémy Lim University of Washington Winter 2017 Outline Distributions of Two Random Variables Distributions of Two Discrete Random Variables Distributions

More information

Chapter 4 continued. Chapter 4 sections

Chapter 4 continued. Chapter 4 sections Chapter 4 sections Chapter 4 continued 4.1 Expectation 4.2 Properties of Expectations 4.3 Variance 4.4 Moments 4.5 The Mean and the Median 4.6 Covariance and Correlation 4.7 Conditional Expectation SKIP:

More information

Homework 10 (due December 2, 2009)

Homework 10 (due December 2, 2009) Homework (due December, 9) Problem. Let X and Y be independent binomial random variables with parameters (n, p) and (n, p) respectively. Prove that X + Y is a binomial random variable with parameters (n

More information

Math 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14

Math 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14 Math 325 Intro. Probability & Statistics Summer Homework 5: Due 7/3/. Let X and Y be continuous random variables with joint/marginal p.d.f. s f(x, y) 2, x y, f (x) 2( x), x, f 2 (y) 2y, y. Find the conditional

More information

Chapter 5 Class Notes

Chapter 5 Class Notes Chapter 5 Class Notes Sections 5.1 and 5.2 It is quite common to measure several variables (some of which may be correlated) and to examine the corresponding joint probability distribution One example

More information

f X, Y (x, y)dx (x), where f(x,y) is the joint pdf of X and Y. (x) dx

f X, Y (x, y)dx (x), where f(x,y) is the joint pdf of X and Y. (x) dx INDEPENDENCE, COVARIANCE AND CORRELATION Independence: Intuitive idea of "Y is independent of X": The distribution of Y doesn't depend on the value of X. In terms of the conditional pdf's: "f(y x doesn't

More information

Chapter 5 Joint Probability Distributions

Chapter 5 Joint Probability Distributions Applied Statistics and Probability for Engineers Sixth Edition Douglas C. Montgomery George C. Runger Chapter 5 Joint Probability Distributions 5 Joint Probability Distributions CHAPTER OUTLINE 5-1 Two

More information

This exam is closed book and closed notes. (You will have access to a copy of the Table of Common Distributions given in the back of the text.

This exam is closed book and closed notes. (You will have access to a copy of the Table of Common Distributions given in the back of the text. TEST #3 STA 536 December, 00 Name: Please read the following directions. DO NOT TURN THE PAGE UNTIL INSTRUCTED TO DO SO Directions This exam is closed book and closed notes. You will have access to a copy

More information

STAT 430/510: Lecture 15

STAT 430/510: Lecture 15 STAT 430/510: Lecture 15 James Piette June 23, 2010 Updates HW4 is up on my website. It is due next Mon. (June 28th). Starting today back at section 6.4... Conditional Distribution: Discrete Def: The conditional

More information

Lecture 11. Probability Theory: an Overveiw

Lecture 11. Probability Theory: an Overveiw Math 408 - Mathematical Statistics Lecture 11. Probability Theory: an Overveiw February 11, 2013 Konstantin Zuev (USC) Math 408, Lecture 11 February 11, 2013 1 / 24 The starting point in developing the

More information

ECE302 Exam 2 Version A April 21, You must show ALL of your work for full credit. Please leave fractions as fractions, but simplify them, etc.

ECE302 Exam 2 Version A April 21, You must show ALL of your work for full credit. Please leave fractions as fractions, but simplify them, etc. ECE32 Exam 2 Version A April 21, 214 1 Name: Solution Score: /1 This exam is closed-book. You must show ALL of your work for full credit. Please read the questions carefully. Please check your answers

More information

Probability Review. AP Statistics April 25, Dr. John Holcomb, Cleveland State University

Probability Review. AP Statistics April 25, Dr. John Holcomb, Cleveland State University Probability Review AP Statistics April 25, 2015 Dr. John Holcomb, Cleveland State University PROBLEM 1 The data below comes from a nutritional study conducted at Ohio University. Eighty three subjects

More information

Multivariate distributions

Multivariate distributions CHAPTER Multivariate distributions.. Introduction We want to discuss collections of random variables (X, X,..., X n ), which are known as random vectors. In the discrete case, we can define the density

More information

, 0 x < 2. a. Find the probability that the text is checked out for more than half an hour but less than an hour. = (1/2)2

, 0 x < 2. a. Find the probability that the text is checked out for more than half an hour but less than an hour. = (1/2)2 Math 205 Spring 206 Dr. Lily Yen Midterm 2 Show all your work Name: 8 Problem : The library at Capilano University has a copy of Math 205 text on two-hour reserve. Let X denote the amount of time the text

More information

Statistics, Data Analysis, and Simulation SS 2015

Statistics, Data Analysis, and Simulation SS 2015 Statistics, Data Analysis, and Simulation SS 2015 08.128.730 Statistik, Datenanalyse und Simulation Dr. Michael O. Distler Mainz, 27. April 2015 Dr. Michael O. Distler

More information

Multivariate Random Variable

Multivariate Random Variable Multivariate Random Variable Author: Author: Andrés Hincapié and Linyi Cao This Version: August 7, 2016 Multivariate Random Variable 3 Now we consider models with more than one r.v. These are called multivariate

More information

Covariance. Lecture 20: Covariance / Correlation & General Bivariate Normal. Covariance, cont. Properties of Covariance

Covariance. Lecture 20: Covariance / Correlation & General Bivariate Normal. Covariance, cont. Properties of Covariance Covariance Lecture 0: Covariance / Correlation & General Bivariate Normal Sta30 / Mth 30 We have previously discussed Covariance in relation to the variance of the sum of two random variables Review Lecture

More information

2 (Statistics) Random variables

2 (Statistics) Random variables 2 (Statistics) Random variables References: DeGroot and Schervish, chapters 3, 4 and 5; Stirzaker, chapters 4, 5 and 6 We will now study the main tools use for modeling experiments with unknown outcomes

More information

Bayesian statistics, simulation and software

Bayesian statistics, simulation and software Module 1: Course intro and probability brush-up Department of Mathematical Sciences Aalborg University 1/22 Bayesian Statistics, Simulations and Software Course outline Course consists of 12 half-days

More information

DEPARTMENT OF MATHEMATICS AND STATISTICS

DEPARTMENT OF MATHEMATICS AND STATISTICS DEPARTMENT OF MATHEMATICS AND STATISTICS Memorial University of Newfoundland St. John s, Newfoundland CANADA A1C 5S7 ph. (709) 737-8075 fax (709) 737-3010 Alwell Julius Oyet, Phd email: aoyet@math.mun.ca

More information

Mathematical Statistics. Gregg Waterman Oregon Institute of Technology

Mathematical Statistics. Gregg Waterman Oregon Institute of Technology Mathematical Statistics Gregg Waterman Oregon Institute of Technolog c Gregg Waterman This work is licensed under the Creative Commons Attribution. International license. The essence of the license is

More information

Covariance and Correlation Class 7, Jeremy Orloff and Jonathan Bloom

Covariance and Correlation Class 7, Jeremy Orloff and Jonathan Bloom 1 Learning Goals Covariance and Correlation Class 7, 18.05 Jerem Orloff and Jonathan Bloom 1. Understand the meaning of covariance and correlation. 2. Be able to compute the covariance and correlation

More information

Lecture 1: Bayesian Framework Basics

Lecture 1: Bayesian Framework Basics Lecture 1: Bayesian Framework Basics Melih Kandemir melih.kandemir@iwr.uni-heidelberg.de April 21, 2014 What is this course about? Building Bayesian machine learning models Performing the inference of

More information

University of Illinois ECE 313: Final Exam Fall 2014

University of Illinois ECE 313: Final Exam Fall 2014 University of Illinois ECE 313: Final Exam Fall 2014 Monday, December 15, 2014, 7:00 p.m. 10:00 p.m. Sect. B, names A-O, 1013 ECE, names P-Z, 1015 ECE; Section C, names A-L, 1015 ECE; all others 112 Gregory

More information

Introduction to Probability Theory for Graduate Economics Fall 2008

Introduction to Probability Theory for Graduate Economics Fall 2008 Introduction to Probability Theory for Graduate Economics Fall 008 Yiğit Sağlam October 10, 008 CHAPTER - RANDOM VARIABLES AND EXPECTATION 1 1 Random Variables A random variable (RV) is a real-valued function

More information

MTH135/STA104: Probability

MTH135/STA104: Probability MTH5/STA4: Probability Homework # Due: Tuesday, Dec 6, 5 Prof Robert Wolpert Three subjects in a medical trial are given drug A After one week, those that do not respond favorably are switched to drug

More information

Topic 5: Discrete Random Variables & Expectations Reference Chapter 4

Topic 5: Discrete Random Variables & Expectations Reference Chapter 4 Page 1 Topic 5: Discrete Random Variables & Epectations Reference Chapter 4 In Chapter 3 we studied rules for associating a probability value with a single event or with a subset of events in an eperiment.

More information

Class 8 Review Problems solutions, 18.05, Spring 2014

Class 8 Review Problems solutions, 18.05, Spring 2014 Class 8 Review Problems solutions, 8.5, Spring 4 Counting and Probability. (a) Create an arrangement in stages and count the number of possibilities at each stage: ( ) Stage : Choose three of the slots

More information

ECON 5350 Class Notes Review of Probability and Distribution Theory

ECON 5350 Class Notes Review of Probability and Distribution Theory ECON 535 Class Notes Review of Probability and Distribution Theory 1 Random Variables Definition. Let c represent an element of the sample space C of a random eperiment, c C. A random variable is a one-to-one

More information

STA 256: Statistics and Probability I

STA 256: Statistics and Probability I Al Nosedal. University of Toronto. Fall 2017 My momma always said: Life was like a box of chocolates. You never know what you re gonna get. Forrest Gump. There are situations where one might be interested

More information

More than one variable

More than one variable Chapter More than one variable.1 Bivariate discrete distributions Suppose that the r.v. s X and Y are discrete and take on the values x j and y j, j 1, respectively. Then the joint p.d.f. of X and Y, to

More information

STA2603/205/1/2014 /2014. ry II. Tutorial letter 205/1/

STA2603/205/1/2014 /2014. ry II. Tutorial letter 205/1/ STA263/25//24 Tutorial letter 25// /24 Distribution Theor ry II STA263 Semester Department of Statistics CONTENTS: Examination preparation tutorial letterr Solutions to Assignment 6 2 Dear Student, This

More information

The mean, variance and covariance. (Chs 3.4.1, 3.4.2)

The mean, variance and covariance. (Chs 3.4.1, 3.4.2) 4 The mean, variance and covariance (Chs 3.4.1, 3.4.2) Mean (Expected Value) of X Consider a university having 15,000 students and let X equal the number of courses for which a randomly selected student

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 13: Expectation and Variance and joint distributions Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin

More information

Lecture 8 Sampling Theory

Lecture 8 Sampling Theory Lecture 8 Sampling Theory Thais Paiva STA 111 - Summer 2013 Term II July 11, 2013 1 / 25 Thais Paiva STA 111 - Summer 2013 Term II Lecture 8, 07/11/2013 Lecture Plan 1 Sampling Distributions 2 Law of Large

More information

MATHEMATICS 154, SPRING 2009 PROBABILITY THEORY Outline #11 (Tail-Sum Theorem, Conditional distribution and expectation)

MATHEMATICS 154, SPRING 2009 PROBABILITY THEORY Outline #11 (Tail-Sum Theorem, Conditional distribution and expectation) MATHEMATICS 154, SPRING 2009 PROBABILITY THEORY Outline #11 (Tail-Sum Theorem, Conditional distribution and expectation) Last modified: March 7, 2009 Reference: PRP, Sections 3.6 and 3.7. 1. Tail-Sum Theorem

More information

Statistics for Economists Lectures 6 & 7. Asrat Temesgen Stockholm University

Statistics for Economists Lectures 6 & 7. Asrat Temesgen Stockholm University Statistics for Economists Lectures 6 & 7 Asrat Temesgen Stockholm University 1 Chapter 4- Bivariate Distributions 41 Distributions of two random variables Definition 41-1: Let X and Y be two random variables

More information

MAS108 Probability I

MAS108 Probability I 1 BSc Examination 2008 By Course Units 2:30 pm, Thursday 14 August, 2008 Duration: 2 hours MAS108 Probability I Do not start reading the question paper until you are instructed to by the invigilators.

More information

Problem Set #5. Econ 103. Solution: By the complement rule p(0) = 1 p q. q, 1 x 0 < 0 1 p, 0 x 0 < 1. Solution: E[X] = 1 q + 0 (1 p q) + p 1 = p q

Problem Set #5. Econ 103. Solution: By the complement rule p(0) = 1 p q. q, 1 x 0 < 0 1 p, 0 x 0 < 1. Solution: E[X] = 1 q + 0 (1 p q) + p 1 = p q Problem Set #5 Econ 103 Part I Problems from the Textbook Chapter 4: 1, 3, 5, 7, 9, 11, 13, 15, 25, 27, 29 Chapter 5: 1, 3, 5, 9, 11, 13, 17 Part II Additional Problems 1. Suppose X is a random variable

More information

ECE 302: Probabilistic Methods in Engineering

ECE 302: Probabilistic Methods in Engineering Purdue University School of Electrical and Computer Engineering ECE 32: Probabilistic Methods in Engineering Fall 28 - Final Exam SOLUTION Monday, December 5, 28 Prof. Sanghavi s Section Score: Name: No

More information

Math438 Actuarial Probability

Math438 Actuarial Probability Math438 Actuarial Probability Jinguo Lian Department of Math and Stats Jan. 22, 2016 Continuous Random Variables-Part I: Definition A random variable X is continuous if its set of possible values is an

More information

Gov Multiple Random Variables

Gov Multiple Random Variables Gov 2000-4. Multiple Random Variables Matthew Blackwell September 29, 2015 Where are we? Where are we going? We described a formal way to talk about uncertain outcomes, probability. We ve talked about

More information

Math 180A. Lecture 16 Friday May 7 th. Expectation. Recall the three main probability density functions so far (1) Uniform (2) Exponential.

Math 180A. Lecture 16 Friday May 7 th. Expectation. Recall the three main probability density functions so far (1) Uniform (2) Exponential. Math 8A Lecture 6 Friday May 7 th Epectation Recall the three main probability density functions so far () Uniform () Eponential (3) Power Law e, ( ), Math 8A Lecture 6 Friday May 7 th Epectation Eample

More information

The story of the film so far... Mathematics for Informatics 4a. Jointly distributed continuous random variables. José Figueroa-O Farrill

The story of the film so far... Mathematics for Informatics 4a. Jointly distributed continuous random variables. José Figueroa-O Farrill The story of the film so far... Mathematics for Informatics 4a José Figueroa-O Farrill Lecture 2 2 March 22 X a c.r.v. with p.d.f. f and g : R R: then Y g(x is a random variable and E(Y g(f(d variance:

More information

Chapter 2. Probability

Chapter 2. Probability 2-1 Chapter 2 Probability 2-2 Section 2.1: Basic Ideas Definition: An experiment is a process that results in an outcome that cannot be predicted in advance with certainty. Examples: rolling a die tossing

More information

Chapter 4. Chapter 4 sections

Chapter 4. Chapter 4 sections Chapter 4 sections 4.1 Expectation 4.2 Properties of Expectations 4.3 Variance 4.4 Moments 4.5 The Mean and the Median 4.6 Covariance and Correlation 4.7 Conditional Expectation SKIP: 4.8 Utility Expectation

More information

Random Variables. Cumulative Distribution Function (CDF) Amappingthattransformstheeventstotherealline.

Random Variables. Cumulative Distribution Function (CDF) Amappingthattransformstheeventstotherealline. Random Variables Amappingthattransformstheeventstotherealline. Example 1. Toss a fair coin. Define a random variable X where X is 1 if head appears and X is if tail appears. P (X =)=1/2 P (X =1)=1/2 Example

More information

Outline Properties of Covariance Quantifying Dependence Models for Joint Distributions Lab 4. Week 8 Jointly Distributed Random Variables Part II

Outline Properties of Covariance Quantifying Dependence Models for Joint Distributions Lab 4. Week 8 Jointly Distributed Random Variables Part II Week 8 Jointly Distributed Random Variables Part II Week 8 Objectives 1 The connection between the covariance of two variables and the nature of their dependence is given. 2 Pearson s correlation coefficient

More information

This does not cover everything on the final. Look at the posted practice problems for other topics.

This does not cover everything on the final. Look at the posted practice problems for other topics. Class 7: Review Problems for Final Exam 8.5 Spring 7 This does not cover everything on the final. Look at the posted practice problems for other topics. To save time in class: set up, but do not carry

More information

2009 GCE A Level H1 Mathematics Solution

2009 GCE A Level H1 Mathematics Solution 2009 GCE A Level H1 Mathematics Solution 1) x + 2y = 3 x = 3 2y Substitute x = 3 2y into x 2 + xy = 2: (3 2y) 2 + (3 2y)y = 2 9 12y + 4y 2 + 3y 2y 2 = 2 2y 2 9y + 7 = 0 (2y 7)(y 1) = 0 y = 7 2, 1 x = 4,

More information

18.440: Lecture 28 Lectures Review

18.440: Lecture 28 Lectures Review 18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is

More information

Lecture 1: August 28

Lecture 1: August 28 36-705: Intermediate Statistics Fall 2017 Lecturer: Siva Balakrishnan Lecture 1: August 28 Our broad goal for the first few lectures is to try to understand the behaviour of sums of independent random

More information

Chapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued

Chapter 3 sections. SKIP: 3.10 Markov Chains. SKIP: pages Chapter 3 - continued Chapter 3 sections 3.1 Random Variables and Discrete Distributions 3.2 Continuous Distributions 3.3 The Cumulative Distribution Function 3.4 Bivariate Distributions 3.5 Marginal Distributions 3.6 Conditional

More information

IE 336 Seat # Name (clearly) < KEY > Open book and notes. No calculators. 60 minutes. Cover page and five pages of exam.

IE 336 Seat # Name (clearly) < KEY > Open book and notes. No calculators. 60 minutes. Cover page and five pages of exam. Open book and notes. No calculators. 60 minutes. Cover page and five pages of exam. This test covers through Chapter 2 of Solberg (August 2005). All problems are worth five points. To receive full credit,

More information

Let X and Y denote two random variables. The joint distribution of these random

Let X and Y denote two random variables. The joint distribution of these random EE385 Class Notes 9/7/0 John Stensby Chapter 3: Multiple Random Variables Let X and Y denote two random variables. The joint distribution of these random variables is defined as F XY(x,y) = [X x,y y] P.

More information

Statistic: a that can be from a sample without making use of any unknown. In practice we will use to establish unknown parameters.

Statistic: a that can be from a sample without making use of any unknown. In practice we will use to establish unknown parameters. Chapter 9: Sampling Distributions 9.1: Sampling Distributions IDEA: How often would a given method of sampling give a correct answer if it was repeated many times? That is, if you took repeated samples

More information

Multivariate Distributions CIVL 7012/8012

Multivariate Distributions CIVL 7012/8012 Multivariate Distributions CIVL 7012/8012 Multivariate Distributions Engineers often are interested in more than one measurement from a single item. Multivariate distributions describe the probability

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 17: Continuous random variables: conditional PDF Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin

More information

6.041/6.431 Fall 2010 Quiz 2 Solutions

6.041/6.431 Fall 2010 Quiz 2 Solutions 6.04/6.43: Probabilistic Systems Analysis (Fall 200) 6.04/6.43 Fall 200 Quiz 2 Solutions Problem. (80 points) In this problem: (i) X is a (continuous) uniform random variable on [0, 4]. (ii) Y is an exponential

More information

TMA4265: Stochastic Processes

TMA4265: Stochastic Processes General information TMA4265: Stochastic Processes Andrea Riebler August 18th, 2015 You find all important information on the course webpage: https://wiki.math.ntnu.no/tma4265/2015h/start Please check this

More information

STT 441 Final Exam Fall 2013

STT 441 Final Exam Fall 2013 STT 441 Final Exam Fall 2013 (12:45-2:45pm, Thursday, Dec. 12, 2013) NAME: ID: 1. No textbooks or class notes are allowed in this exam. 2. Be sure to show all of your work to receive credit. Credits are

More information

Random Variables and Expectations

Random Variables and Expectations Inside ECOOMICS Random Variables Introduction to Econometrics Random Variables and Expectations A random variable has an outcome that is determined by an experiment and takes on a numerical value. A procedure

More information

Random Variables. P(x) = P[X(e)] = P(e). (1)

Random Variables. P(x) = P[X(e)] = P(e). (1) Random Variables Random variable (discrete or continuous) is used to derive the output statistical properties of a system whose input is a random variable or random in nature. Definition Consider an experiment

More information

EC212: Introduction to Econometrics Review Materials (Wooldridge, Appendix)

EC212: Introduction to Econometrics Review Materials (Wooldridge, Appendix) 1 EC212: Introduction to Econometrics Review Materials (Wooldridge, Appendix) Taisuke Otsu London School of Economics Summer 2018 A.1. Summation operator (Wooldridge, App. A.1) 2 3 Summation operator For

More information

Statistics STAT:5100 (22S:193), Fall Sample Final Exam B

Statistics STAT:5100 (22S:193), Fall Sample Final Exam B Statistics STAT:5 (22S:93), Fall 25 Sample Final Exam B Please write your answers in the exam books provided.. Let X, Y, and Y 2 be independent random variables with X N(µ X, σ 2 X ) and Y i N(µ Y, σ 2

More information

This exam contains 6 questions. The questions are of equal weight. Print your name at the top of this page in the upper right hand corner.

This exam contains 6 questions. The questions are of equal weight. Print your name at the top of this page in the upper right hand corner. GROUND RULES: This exam contains 6 questions. The questions are of equal weight. Print your name at the top of this page in the upper right hand corner. This exam is closed book and closed notes. Show

More information

UC Berkeley Department of Electrical Engineering and Computer Sciences. EECS 126: Probability and Random Processes

UC Berkeley Department of Electrical Engineering and Computer Sciences. EECS 126: Probability and Random Processes UC Berkeley Department of Electrical Engineering and Computer Sciences EECS 6: Probability and Random Processes Problem Set 3 Spring 9 Self-Graded Scores Due: February 8, 9 Submit your self-graded scores

More information