Business Statistics 41000: Homework # 2 Solutions


Drew Creal

February 9, 2014

Question # 1. Discrete Random Variables and Their Distributions

(a) The probabilities have to sum to 1, which means that P(X = 0.10) = 1 - 0.1 - 0.3 - 0.4. We can therefore deduce that P(X = 0.10) = 0.2.

(b) Here is a plot of the distribution. [Bar plot of p(x) against x omitted.]

(c) The random variable X can take on two values greater than 0.05. We sum their probabilities: P(X > 0.05) = P(X = 0.07) + P(X = 0.10) = 0.4 + 0.2 = 0.6.

(d) E[X] = 0.1(0.02) + 0.3(0.04) + 0.4(0.07) + 0.2(0.10) = 0.002 + 0.012 + 0.028 + 0.020 = 0.062

(e) V[X] = 0.1(0.02 - 0.062)^2 + 0.3(0.04 - 0.062)^2 + 0.4(0.07 - 0.062)^2 + 0.2(0.10 - 0.062)^2
= 0.1(-0.042)^2 + 0.3(-0.022)^2 + 0.4(0.008)^2 + 0.2(0.038)^2
= 0.1(0.001764) + 0.3(0.000484) + 0.4(0.000064) + 0.2(0.001444)
= 0.000636

(f) The standard deviation is the square root of the variance. We have σ_X = sqrt(σ_X^2) = sqrt(0.000636) ≈ 0.0252.
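These calculations are easy to double-check by direct enumeration. A minimal Python sketch using the distribution recovered above:

    # Question 1 check: mean, variance, and standard deviation of X
    values = [0.02, 0.04, 0.07, 0.10]
    probs = [0.1, 0.3, 0.4, 0.2]        # P(X = 0.10) = 1 - 0.1 - 0.3 - 0.4 = 0.2

    assert abs(sum(probs) - 1.0) < 1e-12   # probabilities must sum to one

    mean = sum(p * x for p, x in zip(probs, values))
    var = sum(p * (x - mean) ** 2 for p, x in zip(probs, values))
    sd = var ** 0.5

    print(mean, var, sd)   # 0.062, 0.000636, 0.0252...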

Question # 2. Discrete Random Variables and Their Distributions

(a) The random variable X can only take on two outcomes, which are 1 and 0. We are rolling a fair die, so the probability P(X = 1) is the same as the probability that we roll a six, which is P(X = 1) = 1/6. The probability that X = 0 is therefore 5/6. It is important that you recognize that X ~ Bernoulli(1/6). We can write this in a two-way table:

x       0     1
p(x)    5/6   1/6

(b) We know that X ~ Bernoulli(1/6). In class, we wrote out the formulas for the mean and variance of a Bernoulli(p) random variable. These are E[X] = p and V[X] = p(1 - p). Now, we just plug p = 1/6 into these formulas to get E[X] = 1/6 ≈ 0.167 and V[X] = (1/6)(5/6) ≈ 0.139.
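The Bernoulli formulas follow directly from the definitions: E[X] = 0(1 - p) + 1(p) = p and V[X] = (0 - p)^2 (1 - p) + (1 - p)^2 p = p(1 - p). A one-line check in Python, assuming nothing beyond the table above:

    # Bernoulli(p) mean and variance, evaluated at p = 1/6
    p = 1 / 6
    mean = 0 * (1 - p) + 1 * p                               # = p
    var = (0 - mean) ** 2 * (1 - p) + (1 - mean) ** 2 * p    # = p(1 - p)
    print(mean, var)   # 0.1666..., 0.1388...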

Question # 3. Discrete Random Variables and Their Distributions

(a) We just sum the probabilities between 0 and 0.15. P(0 < R < 0.15) = P(R = 0.01) + P(R = 0.05) + P(R = 0.10) = 0.2 + 0.3 + 0.2 = 0.7. Notice that we do not include P(R = 0.15) because it says less than 0.15, not less than or equal to.

(b) We want to compute P(R < 0) = P(R = -0.05) + P(R = -0.01) = 0.1 + 0.1 = 0.2.

(c) Here, we apply the formula for the expected value that we gave in class.

E[R] = 0.1(-0.05) + 0.1(-0.01) + 0.2(0.01) + 0.3(0.05) + 0.2(0.10) + 0.1(0.15)
= -0.005 - 0.001 + 0.002 + 0.015 + 0.020 + 0.015 = 0.046

(d) Here, we first compute the variance and then just take the square root.

V[R] = 0.1(-0.05 - 0.046)^2 + 0.1(-0.01 - 0.046)^2 + 0.2(0.01 - 0.046)^2 + 0.3(0.05 - 0.046)^2 + 0.2(0.10 - 0.046)^2 + 0.1(0.15 - 0.046)^2
= 0.1(-0.096)^2 + 0.1(-0.056)^2 + 0.2(-0.036)^2 + 0.3(0.004)^2 + 0.2(0.054)^2 + 0.1(0.104)^2
= 0.1(0.009216) + 0.1(0.003136) + 0.2(0.001296) + 0.3(0.000016) + 0.2(0.002916) + 0.1(0.010816)
= 0.003164

Taking the square root we get σ_R = sqrt(σ_R^2) = sqrt(0.003164) ≈ 0.0562.

(e) Here is a plot of the distribution. [Bar plot of p(r) against r omitted.]
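As a sanity check on parts (c) and (d), one can also simulate draws from this distribution and compare the sample moments to 0.046 and 0.0562. A small sketch, using the support and probabilities from above:

    # Question 3 check by simulation
    import random

    values = [-0.05, -0.01, 0.01, 0.05, 0.10, 0.15]
    probs = [0.1, 0.1, 0.2, 0.3, 0.2, 0.1]

    random.seed(0)
    draws = random.choices(values, weights=probs, k=200_000)

    n = len(draws)
    mean = sum(draws) / n
    var = sum((r - mean) ** 2 for r in draws) / n
    print(mean, var ** 0.5)   # close to E[R] = 0.046 and sigma_R = 0.0562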

Question # 4. Marginal and Conditional Distributions of Discrete Random Variables

(a) Here is the tree diagram.

G = 1 (GOOD), prob. 0.5:
    E = 1, prob. 0.8:  P(E = 1 and G = 1) = 0.5 * 0.8 = 0.40
    E = 0, prob. 0.2:  P(E = 0 and G = 1) = 0.5 * 0.2 = 0.10
G = 0 (BAD), prob. 0.5:
    E = 1, prob. 0.4:  P(E = 1 and G = 0) = 0.5 * 0.4 = 0.20
    E = 0, prob. 0.6:  P(E = 0 and G = 0) = 0.5 * 0.6 = 0.30

(b) Given the information from the tree diagram above, we can easily represent it as a two-way table.

          G = 1   G = 0   p_E(e)
E = 1     0.40    0.20    0.60
E = 0     0.10    0.30    0.40
p_G(g)    0.50    0.50

(c) Here, we are given P(E = 1 | G = 1) but we want to know P(G = 1 | E = 1). (NOTE: This is exactly the same problem as the example from Lecture # 3 where we were testing for a disease.) However, we have already computed the joint distribution of (E, G), and it is simple to obtain the marginal distributions. Therefore, we can use our formula for the definition of a conditional probability:

P(G = 1 | E = 1) = P(G = 1, E = 1) / P(E = 1) = 0.40 / 0.60 ≈ 0.667

One final comment to make: in this problem, we have implicitly used Bayes' Rule.
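The same Bayes' Rule computation in code form: a short sketch that builds the joint probabilities from the tree and then inverts the conditioning.

    # Question 4(c): from P(E | G) and P(G) to P(G = 1 | E = 1)
    p_G1 = 0.5
    p_E1_given_G1 = 0.8
    p_E1_given_G0 = 0.4

    joint_E1_G1 = p_E1_given_G1 * p_G1          # 0.40
    joint_E1_G0 = p_E1_given_G0 * (1 - p_G1)    # 0.20
    p_E1 = joint_E1_G1 + joint_E1_G0            # marginal P(E = 1) = 0.60

    print(joint_E1_G1 / p_E1)                   # P(G = 1 | E = 1) = 0.666...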

Question # 5. Marginal and Conditional Distributions of Discrete Random Variables

(a) To compute P(X ≤ 0.1, Y ≤ 0.1), we need to add the probabilities.

P(X ≤ 0.1, Y ≤ 0.1) = P(X = 0.05, Y = 0.05) + P(X = 0.1, Y = 0.05) + P(X = 0.05, Y = 0.1) + P(X = 0.1, Y = 0.1)
= 0.15 + 0.02 + 0.05 + 0.28 = 0.50

(b) To obtain the marginal distribution of X, we simply add each column downwards.

           X = 0.05   X = 0.10   X = 0.15   p_Y(y)
Y = 0.05   0.15       0.02       0.07       0.24
Y = 0.10   0.05       0.28       0.03       0.36
Y = 0.15   0.10       0.00       0.30       0.40
p_X(x)     0.30       0.30       0.40

Notice that p_X(x) is a probability distribution itself because P(X = 0.05) + P(X = 0.10) + P(X = 0.15) = 0.30 + 0.30 + 0.40 = 1. The probabilities sum to one.

(c) To obtain the marginal distribution of Y, we simply add each row across; this gives the right-hand column of the table above. Notice that p_Y(y) is also a probability distribution itself because P(Y = 0.05) + P(Y = 0.10) + P(Y = 0.15) = 0.24 + 0.36 + 0.40 = 1. The probabilities sum to one.

(d) We know the joint probabilities and the marginal probabilities from our work above; therefore, to determine the conditional distribution P(Y = y | X = 0.15) we can use our formulas from class. Since Y has three outcomes, there are three probabilities that we need to calculate.

P(Y = 0.05 | X = 0.15) = P(Y = 0.05, X = 0.15) / P(X = 0.15) = 0.07 / 0.40 = 0.175
P(Y = 0.10 | X = 0.15) = P(Y = 0.10, X = 0.15) / P(X = 0.15) = 0.03 / 0.40 = 0.075
P(Y = 0.15 | X = 0.15) = P(Y = 0.15, X = 0.15) / P(X = 0.15) = 0.30 / 0.40 = 0.75

Notice that P(Y = y | X = 0.15) is also a probability distribution itself because 0.175 + 0.075 + 0.75 = 1. The probabilities sum to one.

(e) We know the joint probabilities and the marginal probabilities from our work above; therefore, to determine the conditional distribution P(Y = y | X = 0.05) we can use our formulas from class. Since Y has three outcomes, there are three probabilities that we need to calculate.

P(Y = 0.05 | X = 0.05) = P(Y = 0.05, X = 0.05) / P(X = 0.05) = 0.15 / 0.30 = 0.50
P(Y = 0.10 | X = 0.05) = P(Y = 0.10, X = 0.05) / P(X = 0.05) = 0.05 / 0.30 ≈ 0.167
P(Y = 0.15 | X = 0.05) = P(Y = 0.15, X = 0.05) / P(X = 0.05) = 0.10 / 0.30 ≈ 0.333

Notice that P(Y = y | X = 0.05) is also a probability distribution itself because 0.50 + 0.167 + 0.333 = 1. The probabilities sum to one.

(f) By comparing (d) and (e), the distributions are clearly not independent, as the conditional distributions we calculated are not equal to one another, nor are they equal to the marginal distribution p_Y(y). To determine whether X and Y are positively or negatively correlated, compare the conditional distributions P(Y = 0.15 | X = 0.15) vs. P(Y = 0.15 | X = 0.05) and P(Y = 0.05 | X = 0.15) vs. P(Y = 0.05 | X = 0.05). Notice that conditional on X being small (large), the probabilities get larger for Y when it is also small (large). Therefore, they are positively correlated.

(g) To calculate the mean and variance of Y, we need to use the marginal probability distribution from part (c).

E[Y] = 0.24(0.05) + 0.36(0.10) + 0.40(0.15) = 0.012 + 0.036 + 0.060 = 0.108

V[Y] = 0.24(0.05 - 0.108)^2 + 0.36(0.10 - 0.108)^2 + 0.40(0.15 - 0.108)^2
= 0.24(-0.058)^2 + 0.36(-0.008)^2 + 0.40(0.042)^2
= 0.24(0.003364) + 0.36(0.000064) + 0.40(0.001764)
= 0.001536

(h) To calculate the conditional mean of the distribution P(Y = y | X = 0.15), we use the probabilities from this distribution.

E[Y | X = 0.15] = 0.175(0.05) + 0.075(0.10) + 0.75(0.15) = 0.00875 + 0.0075 + 0.1125 = 0.12875

If we know X = 15%, we would guess a higher value for Y because higher values for Y are more likely when X = 15%.
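The marginalize-then-condition mechanics of parts (b) through (e) can be automated. A sketch; the individual joint-table cells below are the ones used above and should be read against the table given in the problem set:

    # Question 5 mechanics: marginals and a conditional from a joint table
    values = [0.05, 0.10, 0.15]
    joint = {  # joint[(y, x)] = P(Y = y, X = x)
        (0.05, 0.05): 0.15, (0.05, 0.10): 0.02, (0.05, 0.15): 0.07,
        (0.10, 0.05): 0.05, (0.10, 0.10): 0.28, (0.10, 0.15): 0.03,
        (0.15, 0.05): 0.10, (0.15, 0.10): 0.00, (0.15, 0.15): 0.30,
    }

    # Marginals: sum over the other variable.
    p_x = {x: sum(joint[(y, x)] for y in values) for x in values}
    p_y = {y: sum(joint[(y, x)] for x in values) for y in values}

    # Conditional distribution of Y given X = 0.15.
    cond = {y: joint[(y, 0.15)] / p_x[0.15] for y in values}

    print(p_x)    # {0.05: 0.30, 0.10: 0.30, 0.15: 0.40}
    print(p_y)    # {0.05: 0.24, 0.10: 0.36, 0.15: 0.40}
    print(cond)   # {0.05: 0.175, 0.10: 0.075, 0.15: 0.75}
    print(sum(y * p for y, p in cond.items()))   # E[Y | X = 0.15] = 0.12875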

Question # 6. Sampling WITHOUT Replacement and WITH Replacement

(a) The distribution of Y1 is Bernoulli(0.5).

y1       0     1
p(y1)    0.5   0.5

(b) P(Y2 = y2 | Y1 = 1) is the conditional distribution of the second voter given that we chose a Democrat on the first pick:

y2                  0     1
P(y2 | Y1 = 1)      5/9   4/9

(c) and (d) To write out the joint probability distribution P(Y2 = y2, Y1 = y1) in a table, we first need to calculate the entries!

P(Y1 = 1, Y2 = 1) = P(Y2 = 1 | Y1 = 1) P(Y1 = 1) = (4/9)(1/2) ≈ 0.22
P(Y1 = 1, Y2 = 0) = P(Y2 = 0 | Y1 = 1) P(Y1 = 1) = (5/9)(1/2) ≈ 0.28
P(Y1 = 0, Y2 = 1) = P(Y2 = 1 | Y1 = 0) P(Y1 = 0) = (5/9)(1/2) ≈ 0.28
P(Y1 = 0, Y2 = 0) = P(Y2 = 0 | Y1 = 0) P(Y1 = 0) = (4/9)(1/2) ≈ 0.22

Then, we just put these values in a table. We can calculate the marginal distributions easily by summing over rows and columns.

           Y2 = 1   Y2 = 0   p_Y1(y1)
Y1 = 1     0.22     0.28     0.50
Y1 = 0     0.28     0.22     0.50
p_Y2(y2)   0.50     0.50

(e) Here, we can use our formulas relating joint distributions to conditional and marginal distributions, i.e.

P(Y1 = y1, Y2 = y2, Y3 = y3) = P(Y3 = y3 | Y2 = y2, Y1 = y1) P(Y2 = y2 | Y1 = y1) P(Y1 = y1).

This formula is applied in the table.

(y1, y2, y3)    p(y1, y2, y3)
(0,0,0)         (1/2)*(4/9)*(3/8) ≈ 0.083
(0,0,1)         (1/2)*(4/9)*(5/8) ≈ 0.139
(0,1,0)         (1/2)*(5/9)*(4/8) ≈ 0.139
(0,1,1)         (1/2)*(5/9)*(4/8) ≈ 0.139
(1,0,0)         (1/2)*(5/9)*(4/8) ≈ 0.139
(1,0,1)         (1/2)*(5/9)*(4/8) ≈ 0.139
(1,1,0)         (1/2)*(4/9)*(5/8) ≈ 0.139
(1,1,1)         (1/2)*(4/9)*(3/8) ≈ 0.083

(f) Now, we are sampling WITH replacement! The conditional distribution is:

y2                  0     1
P(y2 | Y1 = 1)      0.5   0.5

You should see that the marginal distribution of Y2 and the conditional distributions P(Y2 = y2 | Y1 = 0) and P(Y2 = y2 | Y1 = 1) are all the same. This is one way we can tell that Y1 and Y2 are independent.
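A quick way to convince yourself of these numbers is to simulate the urn directly. A sketch, assuming a population of 10 voters with 5 Democrats (coded 1) and 5 Republicans (coded 0), which is the setup behind the 4/9 and 5/9 above:

    # Question 6: simulate drawing two voters without replacement
    import random

    population = [1] * 5 + [0] * 5   # 5 Democrats, 5 Republicans
    random.seed(0)

    n_trials = 200_000
    count_y1 = 0    # times the first draw is a Democrat
    count_11 = 0    # times both draws are Democrats

    for _ in range(n_trials):
        y1, y2 = random.sample(population, 2)   # sampling without replacement
        count_y1 += y1
        count_11 += y1 * y2

    print(count_11 / count_y1)   # estimates P(Y2 = 1 | Y1 = 1) = 4/9 = 0.444...
    print(count_11 / n_trials)   # estimates P(Y1 = 1, Y2 = 1) = 2/9 = 0.222...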

Question # 7. Independence and Identical Distributions

(a) We can quickly calculate the marginal distributions:

           Y2 = 1   Y2 = 0   p_Y1(y1)
Y1 = 1     0.22     0.28     0.50
Y1 = 0     0.28     0.22     0.50
p_Y2(y2)   0.50     0.50

The conditional distributions are:

P(Y2 = 1 | Y1 = 1) = P(Y2 = 1, Y1 = 1) / P(Y1 = 1) = 0.22 / 0.50 ≈ 0.44
P(Y2 = 0 | Y1 = 1) = P(Y2 = 0, Y1 = 1) / P(Y1 = 1) = 0.28 / 0.50 ≈ 0.56
P(Y2 = 1 | Y1 = 0) = P(Y2 = 1, Y1 = 0) / P(Y1 = 0) = 0.28 / 0.50 ≈ 0.56
P(Y2 = 0 | Y1 = 0) = P(Y2 = 0, Y1 = 0) / P(Y1 = 0) = 0.22 / 0.50 ≈ 0.44

The conditional distributions P(Y2 = y2 | Y1 = 1) and P(Y2 = y2 | Y1 = 0) are not the same. They depend on what we observe for Y1. Therefore, they are NOT independent.

(b) The marginal distributions p_Y1(y1) and p_Y2(y2) are the same. Therefore, the random variables are identically distributed. They are not i.i.d. because they are not BOTH independent and identically distributed.

(c) We are sampling without replacement. If we sampled with replacement, Y1 and Y2 would be independent (we would put the first voter back, so the probability of the second voter being a Democrat would not depend on whether the first was a Democrat).

(d) We need to calculate the joint probabilities of Y1 and Y2.

P(Y2 = 1, Y1 = 1) = P(Y2 = 1 | Y1 = 1) P(Y1 = 1) = (6999/9999)(7000/10000) ≈ 0.4900
P(Y2 = 0, Y1 = 1) = P(Y2 = 0 | Y1 = 1) P(Y1 = 1) = (3000/9999)(7000/10000) ≈ 0.2100
P(Y2 = 1, Y1 = 0) = P(Y2 = 1 | Y1 = 0) P(Y1 = 0) = (7000/9999)(3000/10000) ≈ 0.2100
P(Y2 = 0, Y1 = 0) = P(Y2 = 0 | Y1 = 0) P(Y1 = 0) = (2999/9999)(3000/10000) ≈ 0.0900

Notice that the numbers are so large that it really depends on how we round the answers.

           Y2 = 1   Y2 = 0   p_Y1(y1)
Y1 = 1     0.49     0.21     0.70
Y1 = 0     0.21     0.09     0.30
p_Y2(y2)   0.70     0.30

The conditional probabilities are:

P(Y2 = 1 | Y1 = 1) = P(Y2 = 1, Y1 = 1) / P(Y1 = 1) = 6999/9999 ≈ 0.6999
P(Y2 = 1 | Y1 = 0) = P(Y2 = 1, Y1 = 0) / P(Y1 = 0) = 7000/9999 ≈ 0.7001

The probabilities do depend on what we observe for Y1, but notice that they are very close! The data is approximately i.i.d.
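Exact arithmetic makes the "approximately i.i.d." point sharp. A sketch using Python's fractions module, with 7,000 Democrats out of 10,000 voters as above:

    # Question 7(d): exact conditional probabilities with a large population
    from fractions import Fraction

    dems, total = 7000, 10000
    p_given_dem = Fraction(dems - 1, total - 1)   # P(Y2 = 1 | Y1 = 1) = 6999/9999
    p_given_rep = Fraction(dems, total - 1)       # P(Y2 = 1 | Y1 = 0) = 7000/9999

    print(float(p_given_dem))                # 0.69996999...
    print(float(p_given_rep))                # 0.70007000...
    print(float(p_given_rep - p_given_dem))  # about 0.0001: nearly i.i.d.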

Question # 8. Covariance, Correlation, Independence and Identical Distributions

          W = -5   W = 5   p_V(v)
V = -5    0.05     0.05    0.10
V = 5     0.45     0.45    0.90
p_W(w)    0.50     0.50

          X = -5   X = 5   p_Y(y)
Y = -5    0.45     0.05    0.50
Y = 5     0.05     0.45    0.50
p_X(x)    0.50     0.50

(a) First, we need to compute the means of both X and Y using the marginal distributions.

E[X] = 0.5(-5) + 0.5(5) = 0
E[Y] = 0.5(-5) + 0.5(5) = 0

The covariance between (X, Y) is

cov(X, Y) = 0.45(-5 - 0)(-5 - 0) + 0.05(-5 - 0)(5 - 0) + 0.05(5 - 0)(-5 - 0) + 0.45(5 - 0)(5 - 0)
= 11.25 - 1.25 - 1.25 + 11.25
= 20

(b) First, we need to compute the means of both W and V using the marginal distributions.

E[W] = 0.5(-5) + 0.5(5) = 0
E[V] = 0.1(-5) + 0.9(5) = 4

The covariance between (W, V) is

cov(W, V) = 0.05(-5 - 0)(-5 - 4) + 0.45(-5 - 0)(5 - 4) + 0.05(5 - 0)(-5 - 4) + 0.45(5 - 0)(5 - 4)
= 2.25 - 2.25 - 2.25 + 2.25
= 0

(c) If two random variables are independent, they ALWAYS have zero covariance. From part (a) we saw that σ_XY is not zero, and therefore X and Y are NOT independent.

(d) Be careful here! Two random variables with zero covariance are NOT always independent (see the next question!!). We need to calculate the conditional probabilities.

P(W = 5 | V = -5) = P(W = 5, V = -5) / P(V = -5) = 0.05 / 0.10 = 0.5
P(W = 5 | V = 5) = P(W = 5, V = 5) / P(V = 5) = 0.45 / 0.90 = 0.5

The conditionals are the same, so W and V are independent.

(e) X and Y are identically distributed: they take on the same values and the marginal probabilities are the same. They are NOT i.i.d. due to part (c).

(f) To compute the correlation between X and Y, we can use the formula

ρ_X,Y = cov(X, Y) / (σ_X σ_Y)

From part (a), we know that cov(X, Y) = 20. First, we need to compute the standard deviations of X and Y. From part (e), we know that they are identically distributed, so they have the same mean and the same variance (and standard deviation).

V[X] = 0.5(-5 - 0)^2 + 0.5(5 - 0)^2 = 12.5 + 12.5 = 25
σ_X = σ_Y = sqrt(25) = 5

This implies that

ρ_X,Y = cov(X, Y) / (σ_X σ_Y) = 20 / (5)(5) = 4/5
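Covariance and correlation from a joint table can be computed mechanically. A sketch for the (X, Y) table above:

    # Question 8: covariance and correlation of X and Y from the joint table
    vals = [-5, 5]
    joint = {(-5, -5): 0.45, (-5, 5): 0.05, (5, -5): 0.05, (5, 5): 0.45}

    p_x = {x: sum(joint[(x, y)] for y in vals) for x in vals}
    p_y = {y: sum(joint[(x, y)] for x in vals) for y in vals}

    mean_x = sum(x * p for x, p in p_x.items())   # 0
    mean_y = sum(y * p for y, p in p_y.items())   # 0

    cov = sum(p * (x - mean_x) * (y - mean_y) for (x, y), p in joint.items())
    var_x = sum(p * (x - mean_x) ** 2 for x, p in p_x.items())
    var_y = sum(p * (y - mean_y) ** 2 for y, p in p_y.items())

    rho = cov / (var_x ** 0.5 * var_y ** 0.5)
    print(cov, rho)   # 20.0, 0.8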

Question # 9. Covariance, Correlation, Independence and Identical Distributions

(a) First, we need to compute the means of both X and Y using the marginal distributions. The marginal distributions are

x         -1    0     1
p_X(x)    1/3   1/3   1/3

y         0     1
p_Y(y)    1/3   2/3

and therefore we get

E[X] = (1/3)(-1) + (1/3)(0) + (1/3)(1) = 0
E[Y] = (1/3)(0) + (2/3)(1) = 2/3

The covariance between (X, Y) is (here in the calculations we only take into consideration those combinations with non-zero joint probability)

cov(X, Y) = (1/3)(0 - 0)(0 - 0.667) + (1/3)(-1 - 0)(1 - 0.667) + (1/3)(1 - 0)(1 - 0.667)
= 0 - 0.111 + 0.111
= 0

(b) The conditional probabilities can be calculated as:

P(Y = 1 | X = -1) = P(Y = 1, X = -1) / P(X = -1) = (1/3) / (1/3) = 1
P(Y = 0 | X = -1) = P(Y = 0, X = -1) / P(X = -1) = 0 / (1/3) = 0
P(Y = 1 | X = 0) = P(Y = 1, X = 0) / P(X = 0) = 0 / (1/3) = 0
P(Y = 0 | X = 0) = P(Y = 0, X = 0) / P(X = 0) = (1/3) / (1/3) = 1
P(Y = 1 | X = 1) = P(Y = 1, X = 1) / P(X = 1) = (1/3) / (1/3) = 1
P(Y = 0 | X = 1) = P(Y = 0, X = 1) / P(X = 1) = 0 / (1/3) = 0

(c) No. The marginal and conditional probabilities are not the same. These two random variables have zero covariance but are NOT independent.

(d) Take another look at Y and X. There is a nonlinear relationship between these variables: Y = X^2. Remember, covariance and correlation only measure linear relationships. Here X and Y are related, but not linearly. The point of this question is to drive home the message that: If X and Y are independent, the covariance is ALWAYS zero. If the covariance is zero, X and Y are NOT ALWAYS independent.
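This example is easy to reproduce numerically: with X uniform on {-1, 0, 1} and Y = X^2, the covariance is exactly zero even though Y is a deterministic function of X. A sketch using exact fractions:

    # Question 9: zero covariance does not imply independence (Y = X^2)
    from fractions import Fraction

    xs = [-1, 0, 1]
    p = Fraction(1, 3)                     # X is uniform on {-1, 0, 1}

    mean_x = sum(p * x for x in xs)        # 0
    mean_y = sum(p * x * x for x in xs)    # E[Y] = E[X^2] = 2/3

    cov = sum(p * (x - mean_x) * (x * x - mean_y) for x in xs)
    print(cov)                             # 0

    # Yet Y is completely determined by X: P(Y = 1 | X = 1) = 1, while P(Y = 1) = 2/3.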

Question # 10. Expected Value and Variance of Linear Combinations

First, recognize that R_f is not a random variable. This means that the asset always has a return of 0.02 no matter what.

(a) Here, we can apply our formulas for expectations of linearly related random variables.

E[P] = E[0.4 R_f + 0.6 R3] = 0.4 R_f + 0.6 E[R3] = (0.4)(0.02) + 0.6 E[R3] = 0.008 + 0.6 E[R3]

(b) Here, we can apply our formulas for the variance of linearly related random variables.

V[P] = V[0.4 R_f + 0.6 R3] = (0.6)^2 V[R3] = (0.36)(0.0225) = 0.0081

It is important to note that R_f drops out from the calculation because it is not a random variable.

(c) ρ_{P,R3} = 1. The random variables P and R3 are linearly related. Therefore, their correlation is 1.

(d) ρ_{P,R2} = 0. The correlation between R3 and R2 is zero because these random variables are independent. Taking linear combinations does not change the correlation (remember problem # 3 part (g) of Homework # 1) unless you multiply by a negative number, which changes the sign.

(e) Here, we can apply our formulas for expectations of linearly related random variables.

E[P2] = E[0.2 R_f + 0.4 R1 + 0.4 R2] = 0.2 R_f + 0.4 E[R1] + 0.4 E[R2] = (0.2)(0.02) + 0.4 E[R1] + 0.4 E[R2] = 0.004 + 0.4 E[R1] + 0.4 E[R2]

(f) Here, we can apply our formulas for the variance of linearly related random variables.

V[P2] = V[0.2 R_f + 0.4 R1 + 0.4 R2]
= (0.4)^2 V[R1] + (0.4)^2 V[R2] + 2(0.4)(0.4) corr(R1, R2) σ_R1 σ_R2
= 0.16(0.05)^2 + 0.16(0.10)^2 + 2(0.4)(0.4)(0.5)(0.05)(0.10)
= 0.0004 + 0.0016 + 0.0008
= 0.0028

(g) P2 is a linear combination of R1 and R2, which are both independent of R3. So corr(P2, R3) = 0. We have not formally shown this, but it should make intuitive sense.
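Part (f) is just the two-asset variance formula V[aX + bY] = a^2 V[X] + b^2 V[Y] + 2ab cov(X, Y), with the riskless piece dropping out. A sketch with the numbers above (σ_R1 = 0.05, σ_R2 = 0.10, corr(R1, R2) = 0.5):

    # Question 10(f): variance of the portfolio P2 = 0.2 Rf + 0.4 R1 + 0.4 R2
    sd_r1, sd_r2, corr_12 = 0.05, 0.10, 0.5
    a, b = 0.4, 0.4                 # weights on R1 and R2; Rf is a constant

    cov_12 = corr_12 * sd_r1 * sd_r2
    var_p2 = a**2 * sd_r1**2 + b**2 * sd_r2**2 + 2 * a * b * cov_12
    print(var_p2)                   # 0.0028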

Question # 11. Expectation and Variance of Linear Combinations

(a) For each random variable i, we have E[Y_i] = E[W_i] = 1 and V[Y_i] = V[W_i] = 10.

E[Z] = E[Y1 + Y2 + Y3] = E[Y1] + E[Y2] + E[Y3] = 3

and

Var[Z] = Var[Y1 + Y2 + Y3] = Var[Y1] + Var[Y2] + Var[Y3] = 30

In the last part, the variance of the sum is equal to the sum of the variances because each Y_i is independent.

(b) We just use our formulas again:

E[V] = E[3W] = 3E[W] = 3

and

Var[V] = Var[3W] = 9 Var[W] = 90

Tripling your bet is more profitable on average but much riskier.

(c) This is simple if you realize that the mean of a sum is the sum of the means, and since the W's are independent, the variance of the sum is the sum of the variances. So, since A = T/2, E[T] = 2 and Var[T] = 20, while E[A] = 1 and Var[A] = 5.

(d) E[T] = E[Σ_{i=1}^n W_i] = Σ_{i=1}^n E[W_i] = n

and

Var[T] = Var[Σ_{i=1}^n W_i] = Σ_{i=1}^n Var[W_i] = Σ_{i=1}^n 10 = 10n

and

E[A] = E[(1/n) Σ_{i=1}^n W_i] = (1/n) Σ_{i=1}^n E[W_i] = (1/n)(n) = 1

and

Var[A] = Var[(1/n) Σ_{i=1}^n W_i] = (1/n^2) Var[Σ_{i=1}^n W_i] = (1/n^2)(10n) = 10/n

(e) E[X̄] = E[(1/n) Σ_{i=1}^n X_i] = (1/n) Σ_{i=1}^n E[X_i] = (1/n) Σ_{i=1}^n μ = (1/n)(nμ) = μ

and

Var[X̄] = Var[(1/n) Σ_{i=1}^n X_i] = (1/n^2) Σ_{i=1}^n Var[X_i] = (1/n^2)(n σ^2) = σ^2/n
