HW4 : Bivariate Distributions (1) Solutions

STAT/MATH 395 A - PROBABILITY II, UW, Winter Quarter 2017
Néhémy Lim

HW4 : Bivariate Distributions (1) Solutions

Problem 1. The joint probability mass function of X and Y is given by the following table:

              X = 1    X = 2
    Y = 1      1/8      1/8
    Y = 2      1/4      1/2

(a) Give the marginal probability mass functions of X and Y.

Answer. The marginal probability mass function of X is given by:

    p_X(1) = p_XY(1,1) + p_XY(1,2) = 1/8 + 1/4 = 3/8
    p_X(2) = p_XY(2,1) + p_XY(2,2) = 1/8 + 1/2 = 5/8

The marginal probability mass function of Y is given by:

    p_Y(1) = p_XY(1,1) + p_XY(2,1) = 1/8 + 1/8 = 1/4
    p_Y(2) = p_XY(1,2) + p_XY(2,2) = 1/4 + 1/2 = 3/4

(b) Are X and Y independent?

Answer. No, since p_X(1) p_Y(1) = (3/8)(1/4) = 3/32 ≠ 1/8 = p_XY(1,1).

(c) Compute P(XY ≤ 3).

Answer. The set of values that satisfy xy ≤ 3 is B = {(1,1), (1,2), (2,1)}. Thus,

    P(XY ≤ 3) = P((X,Y) ∈ B) = p_XY(1,1) + p_XY(1,2) + p_XY(2,1) = 1/8 + 1/4 + 1/8 = 1/2.

(d) Compute P(X/Y > 1).

Answer. The only couple (x,y) that satisfies x/y > 1 is (2,1). Thus,

    P(X/Y > 1) = p_XY(2,1) = 1/8.
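
The following short Python snippet is not part of the original assignment; it is a quick cross-check of the Problem 1 computations, assuming a standard Python 3 interpreter (only the standard library is used, and the dictionary name pmf is just an illustrative choice):

    from fractions import Fraction as F

    # Joint pmf of (X, Y), taken from the table above
    pmf = {(1, 1): F(1, 8), (1, 2): F(1, 4),
           (2, 1): F(1, 8), (2, 2): F(1, 2)}

    # Marginal pmfs of X and Y
    p_X = {x: sum(p for (a, b), p in pmf.items() if a == x) for x in (1, 2)}
    p_Y = {y: sum(p for (a, b), p in pmf.items() if b == y) for y in (1, 2)}
    print(p_X[1], p_X[2])   # 3/8 5/8
    print(p_Y[1], p_Y[2])   # 1/4 3/4

    # Independence would require p_XY(x, y) = p_X(x) p_Y(y) for every pair
    print(p_X[1] * p_Y[1], pmf[(1, 1)])   # 3/32 versus 1/8, so not independent

    # P(XY <= 3) and P(X/Y > 1)
    print(sum(p for (x, y), p in pmf.items() if x * y <= 3))   # 1/2
    print(sum(p for (x, y), p in pmf.items() if x / y > 1))    # 1/8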

Problem 2. Assume that X and Y have joint probability density function:

    f_XY(x,y) = (1/x) 1_S(x,y),   with S = {(x,y) : 0 < y < x < 1}.

(a) Verify that f_XY is a valid joint probability density function.

Answer. First, it is clear that f_XY(x,y) ≥ 0 for all x, y ∈ R. Second, we have:

    ∫∫ f_XY(x,y) dy dx = ∫_0^1 { ∫_0^x (1/x) dy } dx = ∫_0^1 1 dx = 1.

Therefore, f_XY is a valid joint probability density function.

(b) Give the marginal probability density function of X.

Answer. The marginal probability density function of X, for x ∈ (0,1), is given by:

    f_X(x) = ∫ f_XY(x,y) dy = ∫_0^x (1/x) dy = 1,

and f_X(x) = 0 otherwise. Thus:

    f_X(x) = 1_(0,1)(x),

that is, X is uniformly distributed on (0,1).

(c) Give the marginal probability density function of Y.

Answer. The marginal probability density function of Y, for y ∈ (0,1), is given by:

    f_Y(y) = ∫ f_XY(x,y) dx = ∫_y^1 (1/x) dx = [ln(x)]_y^1 = -ln(y),

and f_Y(y) = 0 otherwise. Thus:

    f_Y(y) = -ln(y) 1_(0,1)(y).

(d) Find the expected value of X.

Answer. Using the marginal probability density function of X, the expected value of X is:

    E[X] = ∫ x f_X(x) dx = ∫_0^1 x dx = 1/2.

(e) Find the expected value of Y.

Answer. Using the marginal probability density function of Y, the expected value of Y is:

    E[Y] = ∫ y f_Y(y) dy = -∫_0^1 y ln(y) dy.

Integrating by parts with u = ln(y), dv = y dy, so that du = dy/y and v = y^2/2, gives:

    E[Y] = -[ (y^2/2) ln(y) ]_0^1 + ∫_0^1 (y/2) dy = 0 + [ y^2/4 ]_0^1 = 1/4.
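
Again as a sanity check rather than part of the original solution, the simulation sketch below assumes Python 3 with numpy installed. It uses the fact (which follows from the work above) that X is Uniform(0,1) and that, conditionally on X = x, Y is uniform on (0, x), since f_{Y|X}(y|x) = f_XY(x,y)/f_X(x) = 1/x for 0 < y < x:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000

    # Simulate (X, Y): X ~ Uniform(0,1), then Y = U * X with U ~ Uniform(0,1)
    x = rng.uniform(0.0, 1.0, n)
    y = rng.uniform(0.0, 1.0, n) * x

    print(x.mean())   # close to E[X] = 1/2
    print(y.mean())   # close to E[Y] = 1/4

    # Crude estimate of the marginal density of Y at a few points, versus -ln(y)
    h = 0.01
    for y0 in (0.1, 0.3, 0.6):
        est = np.mean(np.abs(y - y0) < h) / (2 * h)
        print(y0, est, -np.log(y0))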

Problem 3.

1. The joint probability density function of X and Y is given by:

    f_XY(x,y) = x e^{-(x+y)} 1_S(x,y),   with S = {(x,y) : x > 0, y > 0}.

(a) Give the marginal probability density function of X.

Answer. The marginal probability density function of X, for x > 0, is given by:

    f_X(x) = ∫ f_XY(x,y) dy = ∫_0^∞ x e^{-x} e^{-y} dy = x e^{-x} [-e^{-y}]_0^∞ = x e^{-x},

and f_X(x) = 0 otherwise. Thus,

    f_X(x) = x e^{-x} 1_(0,∞)(x).

(b) Give the marginal probability density function of Y.

Answer. The marginal probability density function of Y, for y > 0, is given by:

    f_Y(y) = ∫ f_XY(x,y) dx = e^{-y} ∫_0^∞ x e^{-x} dx = e^{-y} · 1 = e^{-y}.

The third equality comes from the fact that we recognize the expected value of an exponential distribution with parameter λ = 1. Hence,

    f_Y(y) = e^{-y} 1_(0,∞)(y).

(c) Are X and Y independent?

Answer. Yes, since for all x, y ∈ R,

    f_X(x) f_Y(y) = x e^{-x} 1_(0,∞)(x) · e^{-y} 1_(0,∞)(y) = x e^{-(x+y)} 1_S(x,y) = f_XY(x,y).
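
A short numerical check of part 1, not part of the original solution and assuming Python 3 with numpy, integrates x e^{-x} on a fine grid (truncating the negligible tail beyond x = 50) to confirm that integrating x out of f_XY leaves e^{-y}:

    import numpy as np

    # Grid on (0, 50]; the tail beyond 50 contributes a negligible amount
    x = np.linspace(0.0, 50.0, 200_001)
    dx = x[1] - x[0]

    # Integral of x e^{-x} dx over (0, infinity), i.e. the mean of an Exp(1) variable
    print(np.sum(x * np.exp(-x)) * dx)   # approximately 1

    # Integrating x out of f_XY(x, y) = x e^{-(x+y)} should leave e^{-y}
    for y0 in (0.5, 1.0, 2.0):
        f_y = np.sum(x * np.exp(-(x + y0))) * dx
        print(f_y, np.exp(-y0))          # the two values agree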

2. The joint probability density function of X and Y is given by:

    f_XY(x,y) = 2 · 1_S(x,y),   with S = {(x,y) : 0 < x < y, 0 < y < 1}.

(a) Give the marginal probability density function of X.

Answer. The marginal probability density function of X, for x ∈ (0,1), is given by:

    f_X(x) = ∫ f_XY(x,y) dy = ∫_x^1 2 dy = 2(1 - x),

and f_X(x) = 0 otherwise. Thus,

    f_X(x) = 2(1 - x) 1_(0,1)(x).

(b) Give the marginal probability density function of Y.

Answer. The marginal probability density function of Y, for y ∈ (0,1), is given by:

    f_Y(y) = ∫ f_XY(x,y) dx = ∫_0^y 2 dx = 2y,

and f_Y(y) = 0 otherwise. Thus,

    f_Y(y) = 2y 1_(0,1)(y).

(c) Are X and Y independent?

Answer. No, since for instance,

    f_X(0.5) f_Y(0.5) = 2(1 - 0.5) · 2(0.5) = 1 ≠ 0 = f_XY(0.5, 0.5),

the point (0.5, 0.5) lying outside S because the condition x < y fails there.
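
The following simulation sketch, again not part of the original solution and assuming Python 3 with numpy, checks part 2 through probabilities of simple events. A uniform point on the triangle 0 < x < y < 1 (density 2) can be obtained as the minimum and maximum of two independent Uniform(0,1) draws:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1_000_000

    u = rng.uniform(size=(n, 2))
    x, y = u.min(axis=1), u.max(axis=1)

    # Marginal checks: P(X <= 1/2) = int_0^{1/2} 2(1-x) dx = 3/4,
    #                  P(Y <= 1/2) = int_0^{1/2} 2y dy = 1/4
    print(np.mean(x <= 0.5))                 # close to 0.75
    print(np.mean(y <= 0.5))                 # close to 0.25

    # Under independence P(X <= 1/2, Y <= 1/2) would be 3/4 * 1/4 = 3/16
    print(np.mean((x <= 0.5) & (y <= 0.5)))  # close to 1/4 instead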

Problem 4. Let X and Y be two discrete random variables with joint probability mass function given by:

    p_XY(x,y) = (1/6) (1/2)^min(x,y) 1_S(x,y),

where S = {(x,y) ∈ Z^2 : x ≥ 0, y ≥ 0, |x - y| ≤ 1}. Reminder: Z is the set of all integers.

(a) Plot the points in S in the xy-plane for x ≤ 3 and y ≤ 3.

Answer. There are 10 points in S such that x ≤ 3 and y ≤ 3, namely (0,0), (0,1), (1,0), (1,1), (1,2), (2,1), (2,2), (2,3), (3,2) and (3,3). [Figure: these ten points plotted in the xy-plane, with both axes running from 0 to 3.]

(b) Compute p_X(0), the marginal probability mass function of X for x = 0.

Answer. For x = 0, p_XY(x,y) is nonzero for y = 0 and y = 1. Therefore, the marginal probability mass function of X for x = 0 is

    p_X(0) = p_XY(0,0) + p_XY(0,1) = 1/6 + 1/6 = 1/3.

(c) Compute p_X(x), the marginal probability mass function of X, for x = 1, 2, 3.

Answer. The marginal probability mass function of X for x = 1, 2, 3 is given by

    p_X(1) = p_XY(1,0) + p_XY(1,1) + p_XY(1,2) = (1/6)(1/2)^0 + (1/6)(1/2)^1 + (1/6)(1/2)^1 = 1/3
    p_X(2) = p_XY(2,1) + p_XY(2,2) + p_XY(2,3) = (1/6)(1/2)^1 + (1/6)(1/2)^2 + (1/6)(1/2)^2 = 1/6
    p_X(3) = p_XY(3,2) + p_XY(3,3) + p_XY(3,4) = (1/6)(1/2)^2 + (1/6)(1/2)^3 + (1/6)(1/2)^3 = 1/12.

(d) Derive from (c) a general formula for p_X(x), the marginal probability mass function of X, for x ∈ N, x ≥ 1.

Answer. In general, we have, for x ∈ N, x ≥ 1,

    p_X(x) = p_XY(x, x-1) + p_XY(x, x) + p_XY(x, x+1)
           = (1/6)(1/2)^(x-1) + (1/6)(1/2)^x + (1/6)(1/2)^x
           = (1/3)(1/2)^(x-1).

(e) Using a simple argument, give p_Y(y), the marginal probability mass function of Y.

Answer. Noting the symmetry of the joint probability mass function (p_XY(x,y) = p_XY(y,x)), we have p_Y(y) = p_X(y), that is,

    p_Y(y) = 1/3                  if y = 0
    p_Y(y) = (1/3)(1/2)^(y-1)     if y ∈ N, y ≥ 1
    p_Y(y) = 0                    otherwise.

(f) Compute P(X < Y, X < 2).

Answer. The set of values in S that satisfy both x < y and x < 2 is B = {(0,1), (1,2)}. Hence,

    P(X < Y, X < 2) = P((X,Y) ∈ B) = p_XY(0,1) + p_XY(1,2) = 1/6 + 1/12 = 1/4.
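
To close, here is a small exact-arithmetic check of the Problem 4 answers; it is not part of the original solutions and assumes only the Python 3 standard library (the truncation level K is an arbitrary illustrative choice):

    from fractions import Fraction as F

    def p(x, y):
        # p_XY(x, y) = (1/6) (1/2)^min(x, y) on the band |x - y| <= 1, x, y >= 0
        if x >= 0 and y >= 0 and abs(x - y) <= 1:
            return F(1, 6) * F(1, 2) ** min(x, y)
        return F(0)

    K = 60  # truncation level; the neglected tail has total mass of order (1/2)**K
    support = [(x, y) for x in range(K) for y in range(K) if abs(x - y) <= 1]

    # Total mass (truncated) and the marginal values computed above
    print(float(sum(p(x, y) for x, y in support)))     # approximately 1.0
    for x0 in (0, 1, 2, 3):
        print(x0, sum(p(x0, y) for y in range(K)))     # 1/3, 1/3, 1/6, 1/12

    # P(X < Y, X < 2)
    print(sum(p(x, y) for x, y in support if x < y and x < 2))   # 1/4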