HW5 Solutions

1. (50 pts.) Random homeworks again

(a) (8 pts.) Show that if two random variables X and Y are independent, then E[XY] = E[X]E[Y].

Answer: Applying the definition of expectation and then independence, we have

    E[XY] = Σ_x Σ_y x y p_{X,Y}(x, y)
          = Σ_x Σ_y x y p_X(x) p_Y(y)
          = ( Σ_x x p_X(x) ) ( Σ_y y p_Y(y) )
          = E[X] E[Y]

(b) Define the covariance of two random variables X and Y as:

    Cov(X, Y) := E[(X - µ_X)(Y - µ_Y)]

(i) (7 pts.) What is the covariance between two independent random variables?

Answer: Note that X - µ_X is independent of Y - µ_Y (since they are shifted versions of X and Y respectively). Hence using part (a), we write the expectation of the product as the product of the expectations:

    Cov(X, Y) = E[(X - µ_X)(Y - µ_Y)]
              = E[X - µ_X] E[Y - µ_Y]
              = (E[X] - µ_X)(E[Y] - µ_Y)
              = 0.

(ii) (7 pts.) For two general random variables X and Y (not necessarily independent), express the variance of X + Y in terms of the variance of X, the variance of Y, and the covariance of X and Y.

Answer: Using the definition of variance we have

    Var(X + Y) = E[(X + Y - E[X + Y])^2]
               = E[(X - µ_X + Y - µ_Y)^2]
               = E[(X - µ_X)^2 + (Y - µ_Y)^2 + 2(X - µ_X)(Y - µ_Y)]

Now applying the linearity of expectation we have

    Var(X + Y) = E[(X - E[X])^2] + E[(Y - E[Y])^2] + 2 E[(X - µ_X)(Y - µ_Y)]
               = Var(X) + Var(Y) + 2 Cov(X, Y)

(iii) (7 pts.) Show that if X and Y are two independent random variables, then Var(X + Y) = Var(X) + Var(Y). Give an example of a dependent X and Y for which it is not true.

Answer: Combining our results from the previous parts (for independent X and Y, Cov(X, Y) = 0), we can write

    Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) = Var(X) + Var(Y).

As an example of dependent X and Y, choose Y = X with Var(X) > 0. Then

    Var(X + Y) = Var(2X) = 4 Var(X) ≠ Var(X) + Var(X) = 2 Var(X)

(c) (7 pts.) Generalize part (b)(ii) to compute the variance of the sum X + Y + Z of three random variables in terms of the variances of X, Y, and Z as well as the covariances between each pair of the random variables. What happens when X, Y, Z are mutually independent?

Answer: Using our answer from part (b) we can write

    Var(X + Y + Z) = Var(X + Y) + Var(Z) + 2 Cov(X + Y, Z)
                   = [Var(X) + Var(Y) + 2 Cov(X, Y)] + Var(Z) + 2 E[(X + Y - µ_X - µ_Y)(Z - µ_Z)]

We can expand out the last term in the previous line as

    E[(X + Y - µ_X - µ_Y)(Z - µ_Z)] = E[(X - µ_X)(Z - µ_Z) + (Y - µ_Y)(Z - µ_Z)]
                                    = Cov(X, Z) + Cov(Y, Z)

Plugging back into our previous equation we have

    Var(X + Y + Z) = Var(X) + Var(Y) + Var(Z) + 2 Cov(X, Y) + 2 Cov(X, Z) + 2 Cov(Y, Z)

If X, Y, Z are mutually independent, then Cov(X, Y) = Cov(X, Z) = Cov(Y, Z) = 0, hence

    Var(X + Y + Z) = Var(X) + Var(Y) + Var(Z)

(d) In the problem where n = 3 homeworks are randomly returned to n = 3 students, compute the variance of the number of students that get their own homeworks back in two ways.

(i) (7 pts.) Directly, by using the distribution of the number of students that get their homework back, which we calculated in class.

Answer: Recall from class that the pmf for the n = 3 case is given by

    p_X(x) = 1/3   if x = 0
             1/2   if x = 1
             1/6   if x = 3
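The identities in parts (a)-(c) are easy to sanity-check numerically. The sketch below is not part of the original solution; it assumes two fair six-sided dice purely as an illustrative choice of independent random variables, and compares the Monte Carlo estimates against the formulas above.

    import random

    random.seed(0)
    N = 200_000
    # Independent X and Y: two fair six-sided dice (an arbitrary illustrative choice).
    X = [random.randint(1, 6) for _ in range(N)]
    Y = [random.randint(1, 6) for _ in range(N)]

    def mean(v):
        return sum(v) / len(v)

    def var(v):
        m = mean(v)
        return sum((a - m) ** 2 for a in v) / len(v)

    def cov(u, v):
        mu, mv = mean(u), mean(v)
        return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)

    # Part (a): for independent X, Y the two sides should agree up to sampling error.
    print(mean([x * y for x, y in zip(X, Y)]), mean(X) * mean(Y))

    # Part (b)(ii): Var(X + W) = Var(X) + Var(W) + 2 Cov(X, W) for a dependent pair (X, W).
    W = [2 * x - y for x, y in zip(X, Y)]
    S = [x + w for x, w in zip(X, W)]
    print(var(S), var(X) + var(W) + 2 * cov(X, W))

    # Part (b)(iii): with Y = X the variance of the sum is 4 Var(X), not 2 Var(X).
    print(var([2 * x for x in X]), 4 * var(X), 2 * var(X))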

Also recall that the mean is 1. Using this information we calculate the variance as

    Var(X) = Σ_{x ∈ {0,1,3}} (x - 1)^2 p_X(x)
           = (0 - 1)^2 · 1/3 + (1 - 1)^2 · 1/2 + (3 - 1)^2 · 1/6
           = 1

(ii) (7 pts.) Using part (c) with an appropriate choice of X, Y, and Z.

Answer: Let X, Y, and Z be indicator random variables that are 1 when student 1, 2, or 3 (respectively) gets their own homework back, and 0 otherwise. We also note that the mean of each of these indicator random variables is 1/3 (in general the mean of an indicator random variable is the probability that it is 1). By symmetry

    Var(X) = Var(Y) = Var(Z) = (0 - 1/3)^2 · 2/3 + (1 - 1/3)^2 · 1/3 = 2/9

Now we compute the covariances as

    Cov(X, Y) = Cov(Y, Z) = Cov(X, Z) = E[(X - µ_X)(Z - µ_Z)] = E[XZ] - µ_X µ_Z

Since there is only one outcome in which both student 1 and student 3 get their correct homework back (namely when all 3 get them back), we have P(XZ = 1) = 1/6, so the expectation is

    E[XZ] = 1 · 1/6 + 0 · 5/6 = 1/6

Plugging into our expression for the covariance we find

    Cov(X, Z) = 1/6 - 1/9 = 1/18

Using part (c) we have

    Var(X + Y + Z) = 3 · 2/9 + 2 · 3 · 1/18 = 1
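As a cross-check on part (d) (not part of the original solution, just a small Monte Carlo sketch), one can simulate the random return of n = 3 homeworks and verify that the number of students who get their own homework back has mean 1 and variance 1:

    import random

    random.seed(0)
    n, N = 3, 200_000
    counts = []
    for _ in range(N):
        assignment = random.sample(range(n), n)                   # a uniformly random permutation
        counts.append(sum(assignment[i] == i for i in range(n)))  # fixed points = own homework back

    m = sum(counts) / N
    v = sum((c - m) ** 2 for c in counts) / N
    print(m, v)   # both should be close to 1, matching parts (d)(i) and (d)(ii)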

2. (50 pts.) Bit Torrent

Consider the Bit Torrent problem where m chunks of a movie are randomly distributed across an infinite number of servers. Each server has a uniformly and independently selected chunk of the movie. I query the servers one by one and download whatever chunk I find on each server to my local hard disk. Suppose each server query takes 1 second and each chunk of the movie plays for t seconds (t an integer). Answer each of these questions. None of the final answers should involve summations.

(a) (10 pts.) What is the expected time to get the first chunk of the movie so that I can start watching the movie? What is the distribution of this random variable?

Answer: Let T_1 be the random variable denoting the time to get the first chunk of the movie. Then T_1 ~ Geom(1/m), which we know has mean

    E[T_1] = m

(b) (20 pts.) What is the expected time to get the first two chunks of the movie? (Hint: consider the first time to see either the first chunk or the second chunk of the movie.)

Answer: Let T̃_1 ~ Geom(2/m) be the time to get either the first or second chunk of the movie, and let T̃_2 ~ Geom(1/m) be the additional time to get the remaining chunk. The second random variable is geometric by the memoryless property of the geometric distribution. Our total time to collect the first two chunks is T̃ = T̃_1 + T̃_2, so by linearity of expectation and the mean of a geometric random variable,

    E[T̃] = E[T̃_1] + E[T̃_2] = m/2 + m = (3/2) m

(c) (20 pts.) I start playing the movie once I get the first chunk of the movie. What is the probability that I have the second chunk of the movie on my hard disk before I finish playing the first chunk, so that I can immediately continue playing the movie?

Answer: Let T_1 and T_2 be the times until we collect the first and second chunks respectively. We want to calculate the probability P(T_2 - T_1 ≤ t), where t ∈ Z_{++} (this notation means t is a strictly positive integer; Z_+ with a single + would also include 0). Using total probability we can write

    P(T_2 - T_1 ≤ t) = P(T_2 - T_1 ≤ t, T_2 > T_1) + P(T_2 - T_1 ≤ t, T_2 < T_1)
                     = P(T_2 - T_1 ≤ t | T_2 > T_1) P(T_2 > T_1) + P(T_2 - T_1 ≤ t | T_2 < T_1) P(T_2 < T_1)

By the symmetry of the problem we know P(T_2 < T_1) = P(T_2 > T_1) = 1/2. Furthermore, P(T_2 - T_1 ≤ t | T_2 < T_1) = 1 since t is defined to be a positive quantity. Plugging in, and using the memoryless property (given T_2 > T_1, the additional time T needed after T_1 is Geom(1/m)), we can write

    P(T_2 - T_1 ≤ t) = 1/2 + 1/2 · P(T_2 - T_1 ≤ t | T_2 > T_1) = 1/2 + 1/2 · P(T ≤ t)

Now using the PMF of the geometric distribution and the sum of a geometric series we can simplify:

    P(T_2 - T_1 ≤ t) = 1/2 + 1/2 · Σ_{i=1}^{t} (1 - 1/m)^{i-1} (1/m)
                     = 1/2 + 1/2 · (1/m) Σ_{i=0}^{t-1} (1 - 1/m)^{i}
                     = 1/2 + 1/2 · (1/m) · (1 - (1 - 1/m)^t) / (1 - (1 - 1/m))
                     = 1/2 + 1/2 · (1 - (1 - 1/m)^t)

So our final answer is

    P(T_2 - T_1 ≤ t) = 1 - (1/2)(1 - 1/m)^t
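As with problem 1, the answers to parts (a)-(c) can be checked by simulation. The sketch below is not part of the original solution and uses the illustrative values m = 10 and t = 5 (both assumed); it estimates E[T_1], E[T̃], and P(T_2 - T_1 ≤ t) by Monte Carlo and prints them next to the analytic answers.

    import random

    random.seed(0)
    m, t, N = 10, 5, 100_000                      # assumed illustrative values for m and t
    sum_T1 = sum_both = hits = 0
    for _ in range(N):
        T1 = T2 = None                            # first times at which chunk 1 and chunk 2 appear
        q = 0
        while T1 is None or T2 is None:
            q += 1                                # each query takes 1 second
            chunk = random.randrange(m)           # each server holds a uniformly random chunk
            if chunk == 0 and T1 is None:
                T1 = q
            if chunk == 1 and T2 is None:
                T2 = q
        sum_T1 += T1
        sum_both += max(T1, T2)                   # time until both of the first two chunks are on disk
        hits += (T2 - T1 <= t)                    # chunk 2 arrives before chunk 1 finishes playing

    print(sum_T1 / N, m)                          # part (a): should be close to m
    print(sum_both / N, 1.5 * m)                  # part (b): should be close to 3m/2
    print(hits / N, 1 - 0.5 * (1 - 1/m) ** t)     # part (c): should be close to 1 - (1/2)(1 - 1/m)^t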