Homework 5 Solutions


126/DCP126 Probability, Fall 2014
Instructor: Prof. Wen-Guey Tzeng
Homework 5 Solutions
5-Jan-2015

1. Let the joint probability mass function of discrete random variables X and Y be given by

       p(x, y) = c(x + y)   if x = 1, 2, 3 and y = 1, 2;   p(x, y) = 0 elsewhere.

Answer.
(a) The value of the constant c: Σ_{x=1}^{3} Σ_{y=1}^{2} c(x + y) = 21c = 1 implies that c = 1/21.
(b) The marginal probability mass functions of X and Y:
       p_X(x) = Σ_{y=1}^{2} (1/21)(x + y) = (2x + 3)/21,   x = 1, 2, 3;
       p_Y(y) = Σ_{x=1}^{3} (1/21)(x + y) = (6 + 3y)/21,   y = 1, 2.
(c) P(X ≥ 2 | Y = 1) = [p(2, 1) + p(3, 1)]/p_Y(1) = (7/21)/(9/21) = 7/9.
(d) E(X) and E(Y):
       E(X) = Σ_{x=1}^{3} Σ_{y=1}^{2} (1/21) x(x + y) = 46/21;
       E(Y) = Σ_{x=1}^{3} Σ_{y=1}^{2} (1/21) y(x + y) = 11/7.

2. Two dice are rolled. The sum of the outcomes is denoted by X and the absolute value of their difference by Y. Calculate the joint probability mass function of X and Y and the marginal probability mass functions of X and Y.

Answer. Table 1 gives p(x, y), the joint probability mass function of X and Y; p_X(x), the marginal probability mass function of X; and p_Y(y), the marginal probability mass function of Y.

3. Two points are placed on a segment of length l independently and at random to divide the line into three parts. What is the probability that the length of none of the three parts exceeds a given value α, l/3 ≤ α ≤ l?

Answer. Let X and Y be the two points that are placed on the segment. Let E be the event that the length of none of the three parts exceeds the given value α. Clearly, P(E | X < Y) = P(E | Y < X) and P(X < Y) = P(Y < X) = 1/2. Therefore,

       P(E) = P(E | X < Y) P(X < Y) + P(E | Y < X) P(Y < X)
            = P(E | X < Y)(1/2) + P(E | X < Y)(1/2) = P(E | X < Y).

             y = 0   y = 1   y = 2   y = 3   y = 4   y = 5   p_X(x)
     x = 2    1/36                                             1/36
     x = 3            2/36                                     2/36
     x = 4    1/36            2/36                             3/36
     x = 5            2/36            2/36                     4/36
     x = 6    1/36            2/36            2/36             5/36
     x = 7            2/36            2/36            2/36     6/36
     x = 8    1/36            2/36            2/36             5/36
     x = 9            2/36            2/36                     4/36
     x = 10   1/36            2/36                             3/36
     x = 11           2/36                                     2/36
     x = 12   1/36                                             1/36
     p_Y(y)   6/36   10/36    8/36    6/36    4/36    2/36

Table 1: The joint probability mass function of X and Y in question 2 (blank entries are 0).

This shows that, for the calculation of P(E), we may reduce the sample space to the case where X < Y. The reduced sample space is S = {(x, y) : x < y, 0 < x < l, 0 < y < l}. The desired probability is the area of R = {(x, y) ∈ S : x < α, y − x < α, y > l − α} divided by area(S) = l²/2. But

       area(R) = (3α − l)²/2                  if l/3 ≤ α ≤ l/2;
       area(R) = l²/2 − (3l²/2)(1 − α/l)²     if l/2 ≤ α ≤ l.

Hence the desired probability is

       P(E) = (3α/l − 1)²         if l/3 ≤ α ≤ l/2;
       P(E) = 1 − 3(1 − α/l)²     if l/2 ≤ α ≤ l.

4. Let the joint probability mass function of random variables X and Y be given by

       p(x, y) = (1/7) x² y   if (x, y) ∈ {(1, 1), (1, 2), (2, 1)};   p(x, y) = 0 elsewhere.

Are X and Y independent? Why or why not?

Answer. Note that p(1, 1) = 1/7,
       p_X(1) = p(1, 1) + p(1, 2) = 1/7 + 2/7 = 3/7,
       p_Y(1) = p(1, 1) + p(2, 1) = 1/7 + 4/7 = 5/7.
Since p(1, 1) ≠ p_X(1) p_Y(1), X and Y are dependent.

5. The joint probability mass function p(x, y) of the random variables X and Y is given by the following table. Determine if X and Y are independent.

              y = 0    y = 1    y = 2    y = 3
     x = 0   0.1681   0.1804   0.0574   0.0041
     x = 1   0.1804   0.1936   0.0616   0.0044
     x = 2   0.0574   0.0616   0.0196   0.0014
     x = 3   0.0041   0.0044   0.0014   0.0001

Answer. For i, j ∈ {0, 1, 2, 3}, the sum of the numbers in the ith row is p_X(i) and the sum of the numbers in the jth column is p_Y(j). We have that p_X(0) = 0.41, p_X(1) = 0.44, p_X(2) = 0.14, p_X(3) = 0.01; p_Y(0) = 0.41, p_Y(1) = 0.44, p_Y(2) = 0.14, p_Y(3) = 0.01. Since for all x, y ∈ {0, 1, 2, 3}, p(x, y) = p_X(x) p_Y(y), X and Y are independent.

6. A point is selected at random from the disk R = {(x, y) ∈ ℝ² : x² + y² ≤ 1}. Let X be the x-coordinate and Y be the y-coordinate of the point selected. Determine if X and Y are independent random variables.

Answer. The joint probability density function of X and Y is given by

       f(x, y) = 1/area(R) = 1/π   if (x, y) ∈ R;   f(x, y) = 0 elsewhere.

Now

       f_X(x) = ∫ from −√(1 − x²) to √(1 − x²) of (1/π) dy = (2/π)√(1 − x²),
       f_Y(y) = ∫ from −√(1 − y²) to √(1 − y²) of (1/π) dx = (2/π)√(1 − y²).

Since f(x, y) ≠ f_X(x) f_Y(y), the random variables X and Y are not independent.

7. Let the joint probability density function of continuous random variables X and Y be given by

       f(x, y) = 2   if 0 < x < y < 1;   f(x, y) = 0 elsewhere.

Find f_{X|Y}(x | y).

Answer. Since f_Y(y) = ∫ from 0 to y of 2 dx = 2y, 0 < y < 1, we have that

       f_{X|Y}(x | y) = f(x, y)/f_Y(y) = 2/(2y) = 1/y,   0 < x < y, 0 < y < 1.

8. First a point Y is selected at random from the interval (0, 1). Then another point X is selected at random from the interval (Y, 1). Find the probability density function of X.

Answer. Let f(x, y) be the joint probability density function of X and Y. Clearly, f(x, y) = f_{X|Y}(x | y) f_Y(y). Thus

       f_X(x) = ∫ f_{X|Y}(x | y) f_Y(y) dy.

Now

       f_Y(y) = 1   if 0 < y < 1;   f_Y(y) = 0 elsewhere,

and

       f_{X|Y}(x | y) = 1/(1 − y)   if 0 < y < 1, y < x < 1.

Therefore, for 0 < x < 1,

       f_X(x) = ∫ from 0 to x of [1/(1 − y)] dy = −ln(1 − x),

and hence

       f_X(x) = −ln(1 − x)   if 0 < x < 1;   f_X(x) = 0 elsewhere.

9. A point is selected at random and uniformly from the region R = {(x, y) : |x| + |y| ≤ 1}. Find the conditional probability density function of X given Y = y.

Answer. Let f(x, y) be the joint probability density function of X and Y. Then

       f(x, y) = 0.5   if |x| + |y| ≤ 1;   f(x, y) = 0 elsewhere,

and f_Y(y) = 1 − |y|, −1 ≤ y ≤ 1. Hence

       f_{X|Y}(x | y) = 0.5/(1 − |y|) = 1/(2(1 − |y|)),   −1 + |y| ≤ x ≤ 1 − |y|, −1 ≤ y ≤ 1.

10. Let the joint probability density function of random variables X and Y be

       f(x, y) = 2e^{−(x + 2y)}   if x ≥ 0, y ≥ 0;   f(x, y) = 0 elsewhere.

Find E(X), E(Y), and E(X² + Y²).

Answer. Since f(x, y) = e^{−x} · 2e^{−2y}, X and Y are independent exponential random variables with parameters 1 and 2, respectively. Thus E(X) = 1, E(Y) = 1/2, E(X²) = Var(X) + [E(X)]² = 1 + 1 = 2, E(Y²) = Var(Y) + [E(Y)]² = 1/4 + 1/4 = 1/2. Therefore, E(X² + Y²) = 2 + 1/2 = 5/2.

11. Suppose that random digits are generated from the set {0, 1, ..., 9} independently and successively. Find the expected number of digits to be generated until the pattern (a) 007 appears, (b) 156156 appears, (c) 575757 appears.

Answer.
(a) E(∅ → 007) = 10³ = 1,000.
(b) E(∅ → 156156) = E(∅ → 156) + E(156 → 156156) = 10³ + 10⁶ = 1,001,000.
(c) E(∅ → 575757) = E(∅ → 57) + E(57 → 5757) + E(5757 → 575757) = 10² + 10⁴ + 10⁶ = 1,010,100.
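The pattern answers in problem 11 follow from a general rule: the expected number of random digits until a pattern first appears is the sum of 10^k over every length k at which the pattern's prefix of length k equals its suffix of length k (including k equal to the pattern's full length). A minimal Python sketch of that rule, with our own helper name `expected_wait` and reading the pattern in part (a) as 007:

```python
# Expected waiting time (in generated digits) until `pattern` first appears,
# for i.i.d. uniform digits over an alphabet of the given size. The sum runs
# over every k at which the length-k prefix equals the length-k suffix.
def expected_wait(pattern, alphabet=10):
    return sum(alphabet ** k
               for k in range(1, len(pattern) + 1)
               if pattern[:k] == pattern[-k:])

# "007" has no proper prefix equal to a suffix, so only k = 3 contributes;
# "156156" also matches at k = 3; "575757" matches at k = 2, 4, and 6.
assert expected_wait("007") == 1_000
assert expected_wait("156156") == 1_001_000
assert expected_wait("575757") == 1_010_100
```

The assertions reproduce the three answers above exactly.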

12. Suppose that 80 balls are placed into 40 boxes at random and independently. What is the expected number of empty boxes?

Answer. Let

       X_i = 1   if the ith box is empty;   X_i = 0 otherwise.

The expected number of empty boxes is

       E(X_1 + X_2 + ... + X_40) = 40 E(X_i) = 40 P(X_i = 1) = 40 (39/40)^80 ≈ 5.28.

13. Let the joint probability mass function of random variables X and Y be given by

       p(x, y) = (1/70) x(x + y)   if x = 1, 2, 3 and y = 3, 4;   p(x, y) = 0 elsewhere.

Find Cov(X, Y).

Answer.
       E(X) = Σ_{x=1}^{3} Σ_{y=3}^{4} (1/70) x²(x + y) = 17/7,
       E(Y) = Σ_{x=1}^{3} Σ_{y=3}^{4} (1/70) xy(x + y) = 124/35,
       E(XY) = Σ_{x=1}^{3} Σ_{y=3}^{4} (1/70) x²y(x + y) = 43/5.
Therefore Cov(X, Y) = E(XY) − E(X)E(Y) = 43/5 − (17/7)(124/35) = −1/245.

14. Let X and Y be the coordinates of a random point selected uniformly from the unit disk {(x, y) : x² + y² ≤ 1}. Are X and Y independent? Are they uncorrelated? Why or why not?

Answer. The joint probability density function of X and Y is given by

       f(x, y) = 1/π   if x² + y² ≤ 1;   f(x, y) = 0 elsewhere.

X and Y are dependent because, for example, P(0 < X < 0.5 | Y = 0) = 1/4, while

       P(0 < X < 0.5) = ∫ from 0 to 1/2 of ∫ from −√(1 − x²) to √(1 − x²) of (1/π) dy dx
                      = (2/π) ∫ from 0 to 1/2 of √(1 − x²) dx = 1/6 + √3/(4π) ≠ P(0 < X < 0.5 | Y = 0).

X and Y are uncorrelated because

       E(X) = ∫∫ over x² + y² ≤ 1 of (x/π) dx dy = (1/π) ∫ from 0 to 1 of ∫ from 0 to 2π of r² cos θ dθ dr = 0,
       E(Y) = (1/π) ∫ from 0 to 1 of ∫ from 0 to 2π of r² sin θ dθ dr = 0,
       E(XY) = (1/π) ∫ from 0 to 1 of ∫ from 0 to 2π of r³ cos θ sin θ dθ dr = 0,

implying that Cov(X, Y) = E(XY) − E(X)E(Y) = 0.

15. Mr. Ingham has invested money in three assets: 18% in the first asset, 40% in the second one, and 42% in the third one. Let r_1, r_2, and r_3 be the annual rates of return for these three investments, respectively. For 1 ≤ i, j ≤ 3, Cov(r_i, r_j) is the ith element in the jth row of the following table. [Note that Var(r_i) = Cov(r_i, r_i).] Find the standard deviation of the annual rate of return for Mr. Ingham's total investment.

Answer. Let r be the annual rate of return for Mr. Ingham's total investment. We have

              r_1      r_2      r_3
     r_1    0.064    0.03     0.015
     r_2    0.03     0.0144   0.021
     r_3    0.015    0.021    0.01

       Var(r) = Var(0.18 r_1 + 0.40 r_2 + 0.42 r_3)
              = (0.18)² Var(r_1) + (0.40)² Var(r_2) + (0.42)² Var(r_3)
                + 2(0.18)(0.40) Cov(r_1, r_2) + 2(0.18)(0.42) Cov(r_1, r_3) + 2(0.40)(0.42) Cov(r_2, r_3)
              = (0.18)²(0.064) + (0.40)²(0.0144) + (0.42)²(0.01)
                + 2(0.18)(0.40)(0.03) + 2(0.18)(0.42)(0.015) + 2(0.40)(0.42)(0.021)
              = 0.0197976.

Hence the standard deviation of the annual rate of return is σ_r = √0.0197976 ≈ 0.1407.

16. Let the joint probability density function of X and Y be given by

       f(x, y) = sin x sin y   if 0 ≤ x ≤ π/2, 0 ≤ y ≤ π/2;   f(x, y) = 0 elsewhere.

Calculate the correlation coefficient of X and Y.

Answer. Since f(x, y) = sin x · sin y factors into a function of x times a function of y, X and Y are independent random variables. [This can also be shown directly by verifying the relation f(x, y) = f_X(x) f_Y(y).] Hence Cov(X, Y) = 0, and therefore ρ(X, Y) = 0.