Math 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14


1. Let $X$ and $Y$ be continuous random variables with joint and marginal p.d.f.'s
$$f(x,y) = 2, \quad 0 \le x \le y \le 1, \qquad f_1(x) = 2(1-x), \quad 0 \le x \le 1, \qquad f_2(y) = 2y, \quad 0 \le y \le 1.$$
Find the conditional p.d.f. $h(y \mid x)$ and the conditional probability $P(\tfrac12 \le Y \le \tfrac34 \mid X = \tfrac14)$. What is the expected value of $Y$ when $X = \tfrac14$?

Solution: The conditional p.d.f. $h(y \mid x) = f(x,y)/f_1(x)$ is immediately seen to be
$$h(y \mid x) = \frac{2}{2(1-x)} = \frac{1}{1-x}, \quad x \le y \le 1.$$
To find $P(\tfrac12 \le Y \le \tfrac34 \mid X = \tfrac14)$ we integrate the conditional p.d.f. $h(y \mid \tfrac14)$ over the interval $\tfrac12 \le y \le \tfrac34$, and we obtain
$$P\left(\tfrac12 \le Y \le \tfrac34 \,\middle|\, X = \tfrac14\right) = \int_{1/2}^{3/4} \frac{4}{3}\,dy = \frac{4}{3}\left(\frac{3}{4} - \frac{1}{2}\right) = \frac{1}{3}.$$
Since $Y$ given $X = \tfrac14$ is uniform on $(\tfrac14, 1)$, its conditional mean is the midpoint of the interval:
$$E\left(Y \,\middle|\, X = \tfrac14\right) = \frac{1/4 + 1}{2} = \frac{5}{8}.$$

2. Let $X$ and $Y$ be discrete random variables with joint p.m.f.
$$f(x,y) = \frac{x+y}{32}, \quad x = 1, 2, \quad y = 1, 2, 3, 4.$$
Find the marginal p.m.f.'s of $X$ and $Y$ and the conditional p.m.f.'s $g(x \mid y)$ and $h(y \mid x)$. Find $P(1 \le Y \le 3 \mid X = 1)$ and $P(Y \le 2 \mid X = 2)$. Finally, find $E(Y \mid X = 1)$ and $\mathrm{Var}(Y \mid X = 1)$.

Solution: The marginal p.m.f.'s of $X$ and $Y$ are immediately seen to be
$$f_1(x) = \sum_{y=1}^{4} f(x,y) = \frac{4x + 10}{32}, \qquad f_2(y) = \sum_{x=1}^{2} f(x,y) = \frac{3 + 2y}{32}.$$

The conditional p.m.f.'s are thus seen to be
$$h(y \mid x) = \frac{f(x,y)}{f_1(x)} = \frac{(x+y)/32}{(4x+10)/32} = \frac{x+y}{4x+10}, \qquad g(x \mid y) = \frac{f(x,y)}{f_2(y)} = \frac{(x+y)/32}{(3+2y)/32} = \frac{x+y}{3+2y}.$$
Also, we have
$$P(1 \le Y \le 3 \mid X = 1) = \sum_{y=1}^{3} h(y \mid 1) = \frac{2 + 3 + 4}{14} = \frac{9}{14}.$$
Note, this can also be computed as $1 - h(4 \mid 1) = 1 - \frac{5}{14} = \frac{9}{14}$. Next, we compute
$$P(Y \le 2 \mid X = 2) = \sum_{y=1}^{2} h(y \mid 2) = \frac{2+1}{18} + \frac{2+2}{18} = \frac{7}{18}.$$
Finally, we have that
$$E(Y \mid X = 1) = \sum_{y=1}^{4} y\, h(y \mid 1) = (1)\frac{2}{14} + (2)\frac{3}{14} + (3)\frac{4}{14} + (4)\frac{5}{14} = \frac{40}{14} = \frac{20}{7},$$
and
$$\mathrm{Var}(Y \mid X = 1) = E(Y^2 \mid X = 1) - \left[E(Y \mid X = 1)\right]^2 = \frac{65}{7} - \left(\frac{20}{7}\right)^2 = \frac{55}{49} \approx 1.12.$$
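Both solutions involve only rational arithmetic, so they can be verified exactly. A minimal sketch using Python's standard `fractions` module (the variable names here are ours, not from the text):

```python
from fractions import Fraction as F

# Problem 1: Y | X = 1/4 is uniform on (1/4, 1), so h(y | 1/4) = 4/3 there.
h14 = 1 / (1 - F(1, 4))                        # 4/3
p = h14 * (F(3, 4) - F(1, 2))                  # P(1/2 <= Y <= 3/4 | X = 1/4)
m = (F(1, 4) + 1) / 2                          # E(Y | X = 1/4), the midpoint

# Problem 2: joint p.m.f. f(x, y) = (x + y)/32 on x = 1, 2 and y = 1, ..., 4.
f = {(x, y): F(x + y, 32) for x in (1, 2) for y in (1, 2, 3, 4)}
f1 = {x: sum(f[x, y] for y in (1, 2, 3, 4)) for x in (1, 2)}   # marginal of X
h = {(y, x): f[x, y] / f1[x] for (x, y) in f}                  # h(y | x)

p13 = sum(h[y, 1] for y in (1, 2, 3))          # P(1 <= Y <= 3 | X = 1)
p22 = sum(h[y, 2] for y in (1, 2))             # P(Y <= 2 | X = 2)
ey = sum(y * h[y, 1] for y in (1, 2, 3, 4))    # E(Y | X = 1)
vy = sum(y * y * h[y, 1] for y in (1, 2, 3, 4)) - ey ** 2      # Var(Y | X = 1)

print(p, m, p13, p22, ey, vy)
# -> 1/3 5/8 9/14 7/18 20/7 55/49
```

Because `Fraction` arithmetic is exact, the printed values match the fractions in the solutions with no rounding error.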

3. Let $W$ equal the weight of a box of oranges which is supposed to weigh 10 kg. Suppose that $P(W < 10) = 0.05$ and $P(W > 10.5) = 0.10$. Call a box of oranges light, good, or heavy depending on whether $W < 10$, $10 \le W \le 10.5$, or $W > 10.5$, respectively. In $n = 50$ independent observations of these boxes, let $X$ equal the number of light boxes and $Y$ the number of good boxes. Find the joint p.m.f. of $X$ and $Y$. How is $Y$ distributed? Name the distribution and state the values of the parameters associated to this distribution. Given $X = 3$, how is $Y$ distributed? Determine $E(Y \mid X = 3)$ and find the correlation coefficient $\rho$ of $X$ and $Y$.

Solution: The random variables are said to come from a trinomial distribution in this case, since there are three exhaustive and mutually exclusive outcomes (light, good, or heavy) having probabilities $p_1 = 0.05$, $p_2 = 0.85$, and $p_3 = 0.10$, respectively. It is easy to see that the trinomial p.m.f. in this case is
$$f(x,y) = \frac{50!}{x!\,y!\,(50-x-y)!}\, p_1^x\, p_2^y\, p_3^{50-x-y}.$$
That is, $f(x,y)$ is exactly the joint p.m.f. of $X$ and $Y$, the number of light boxes and good boxes, where $p_1 = 0.05$, $p_2 = 0.85$, and $p_3 = 0.10$ are the probabilities of the three events light, good, and heavy. The random variable $Y$ is binomially distributed, but conditionally the parameters of the distribution depend on the value of the random variable $X$. We have that $Y$ is (conditionally) binomially distributed $b\!\left(n - x, \frac{p_2}{1-p_1}\right)$, since the marginal distributions of $X$ and $Y$ are $b(n, p_1)$ and $b(n, p_2)$, and the conditional p.m.f. of $Y$ is thus, with $n = 50$,
$$h(y \mid x) = \frac{f(x,y)}{f_1(x)} = \frac{\dfrac{n!}{x!\,y!\,(n-x-y)!}\, p_1^x\, p_2^y\, p_3^{\,n-x-y}}{\dfrac{n!}{x!\,(n-x)!}\, p_1^x\, (1-p_1)^{n-x}} = \frac{(n-x)!}{y!\,(n-x-y)!} \left(\frac{p_2}{1-p_1}\right)^{y} \left(\frac{p_3}{1-p_1}\right)^{n-x-y}.$$
In this case, with the specified values of $n$, $p_1$, $p_2$, and $p_3$, we have for $X = 3$
$$h(y \mid 3) = \frac{47!}{y!\,(47-y)!}\,(0.8947)^{y}\,(0.1053)^{47-y},$$
so that $Y$ is conditionally $b(47, 0.8947)$ when $X = 3$. Since $\mu = np$ for a binomial distribution, we have that $E(Y \mid X = 3) = (47)(0.8947) \approx 42.05$.
It is not hard to see that in fact $E(Y \mid x) = (n-x)\frac{p_2}{1-p_1}$ in general, and that a similar formula holds for $E(X \mid y)$. The correlation coefficient is now found using the fact that, since each of the conditional expectations $E(Y \mid x) = (n-x)\frac{p_2}{1-p_1}$ and $E(X \mid y) = (n-y)\frac{p_1}{1-p_2}$ is linear, the square of the correlation coefficient $\rho^2$ is equal to the product of the respective coefficients of $x$ and $y$ in the conditional expectations:
$$\rho^2 = \left(-\frac{p_2}{1-p_1}\right)\left(-\frac{p_1}{1-p_2}\right) = \frac{p_1 p_2}{(1-p_1)(1-p_2)},$$
from which it follows that
$$\rho = -\sqrt{\frac{p_1 p_2}{(1-p_1)(1-p_2)}} \approx -0.546.$$
The fact that the correlation coefficient is negative follows from the fact that, for example, $E(Y \mid x) = \mu_Y + \rho\,\frac{\sigma_Y}{\sigma_X}(x - \mu_X)$, and the coefficient of $x$ in $E(Y \mid x)$ is seen to be negative (and also $\sigma_Y, \sigma_X > 0$).

4. Let $X$ have the uniform distribution $U(0, 2)$ and let the conditional distribution of $Y$, given that $X = x$, be $U(0, x)$. Find the joint p.d.f. $f(x,y)$ of $X$ and $Y$, and be sure to state the domain of $f(x,y)$. Find $E(Y \mid x)$.

Solution: We have that
$$f(x,y) = f_1(x)\,h(y \mid x) = \frac{1}{2}\cdot\frac{1}{x} = \frac{1}{2x}, \quad 0 < x \le 2, \quad 0 \le y \le x.$$
Now,
$$E(Y \mid x) = \int_0^x y\,\frac{1}{x}\,dy = \left.\frac{y^2}{2x}\right|_0^x = \frac{x}{2}.$$
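The numbers in Problem 3, and the conditional mean in Problem 4, can be checked numerically. A small sketch (all parameter values are taken from the solutions above; the cross-check via the trinomial covariance $\mathrm{Cov}(X,Y) = -n p_1 p_2$ is a standard fact, not part of the original solution):

```python
import math

# Problem 3: n = 50 boxes, P(light) = .05, P(good) = .85, P(heavy) = .10.
n, p1, p2 = 50, 0.05, 0.85

# Conditionally, Y | X = 3 is binomial b(n - 3, p2/(1 - p1)).
p_cond = p2 / (1 - p1)
e_y_given_3 = (n - 3) * p_cond

# rho from the product of the slopes of the two conditional means.
rho = -math.sqrt(p1 * p2 / ((1 - p1) * (1 - p2)))

# Cross-check rho via the trinomial covariance Cov(X, Y) = -n p1 p2.
rho_check = (-n * p1 * p2) / math.sqrt(n * p1 * (1 - p1) * n * p2 * (1 - p2))

# Problem 4: E(Y | x) = x/2 for Y | x ~ U(0, x); midpoint-rule check at x = 1.5.
x, m = 1.5, 1000
dy = x / m
e_y_given_x = sum((k + 0.5) * dy * (1 / x) * dy for k in range(m))

print(round(p_cond, 4), round(e_y_given_3, 2), round(rho, 3), round(e_y_given_x, 3))
# -> 0.8947 42.05 -0.546 0.75
```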

5. The support of a random variable $X$ is the set of $x$-values such that $f(x) > 0$. Given that $X$ has p.d.f. $f(x) = x^2/3$, $-1 < x < 2$, what is the support of $X^2$? Find the p.d.f. of the random variable $Y = X^2$.

Solution: The p.d.f. $g(y)$ of $Y = X^2$ is obtained as follows. We note that the possible $y$-values that can be obtained are in the range $0 \le y < 4$, so the support of $g(y)$ needs to be the interval $[0, 4)$. Now, on the interval $1 < y < 4$, there is a one-to-one transformation represented by $x = \sqrt{y}$. We first find $G(y) = P(Y \le y) = P(X^2 \le y) = P(X \le \sqrt{y})$ for $X = x$ in $(1, 2)$, corresponding to $Y = y$ in $(1, 4)$. We have
$$G(y) = \int_{-1}^{\sqrt{y}} f(x)\,dx = \int_{-1}^{\sqrt{y}} \frac{x^2}{3}\,dx,$$
and in particular $g(y) = G'(y) = \frac{(\sqrt{y})^2}{3}\,(\sqrt{y})'$, from the chain rule and the Fundamental Theorem of Calculus. Simplifying, we have
$$g(y) = \frac{\sqrt{y}}{6}, \quad 1 < y < 4.$$
In order to find $g(y)$ on $0 < y < 1$, we need to work a little harder. For $x$ in the interval $-1 < x < 1$ there is a two-to-one transformation given by $x = \sqrt{y}$ for $0 < x < 1$, and $x = -\sqrt{y}$ for $-1 < x < 0$. We then calculate $G(y)$ for $0 < y < 1$ (i.e., $-1 < x < 1$) as before, but now using two integrals:
$$G(y) = P(Y \le y) = P(X^2 \le y) = P(-\sqrt{y} < X < 0) + P(0 \le X < \sqrt{y}),$$
so for $y$ in the interval $0 < y < 1$ we have
$$G(y) = \int_{-\sqrt{y}}^{0} f(x)\,dx + \int_{0}^{\sqrt{y}} f(x)\,dx.$$
Again, from the chain rule and the Fundamental Theorem of Calculus we have
$$g(y) = G'(y) = f(\sqrt{y})\,\frac{1}{2\sqrt{y}} + f(-\sqrt{y})\,\frac{1}{2\sqrt{y}} = \frac{y}{3}\cdot\frac{1}{2\sqrt{y}} + \frac{y}{3}\cdot\frac{1}{2\sqrt{y}} = \frac{\sqrt{y}}{3}.$$
So,
$$g(y) = \begin{cases} \sqrt{y}/3 & \text{if } 0 < y < 1, \\ \sqrt{y}/6 & \text{if } 1 < y < 4. \end{cases}$$
There is no problem defining $g(0)$ and $g(1)$ arbitrarily, or even just leaving the p.d.f. undefined at the points $y = 0$ and $y = 1$.
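As a sanity check on the piecewise density just found, $g(y)$ should integrate to $1$, and $P(1 < Y < 4)$ computed from $g$ should match $P(1 < X < 2)$ computed from $f(x) = x^2/3$. A sketch using a simple midpoint rule (the helper names are ours):

```python
# g(y): the piecewise density of Y = X^2 derived above.
def g(y):
    return y ** 0.5 / 3 if y < 1 else y ** 0.5 / 6

def midpoint(fn, a, b, n=100000):
    # composite midpoint rule on [a, b]
    dx = (b - a) / n
    return sum(fn(a + (k + 0.5) * dx) for k in range(n)) * dx

total = midpoint(g, 0, 1) + midpoint(g, 1, 4)   # should be ~1
lhs = midpoint(g, 1, 4)                         # P(1 < Y < 4) from g
rhs = midpoint(lambda t: t * t / 3, 1, 2)       # P(1 < X < 2) from f
print(round(total, 4), round(lhs, 4), round(rhs, 4))
# -> 1.0 0.7778 0.7778
```

Both integrals of $g$ evaluate to $2/9$ and $7/9$, matching the exact values $\int_0^1 \sqrt{y}/3\,dy$ and $\int_1^4 \sqrt{y}/6\,dy$.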

6. Let $X_1, X_2$ denote two independent random variables, each with the $\chi^2(2)$ distribution. Find the joint p.d.f. of $Y_1 = X_1$ and $Y_2 = X_1 + X_2$. What is the support of $(Y_1, Y_2)$ (i.e., what is the domain of the joint p.d.f., where $f(y_1, y_2) > 0$)? Are $Y_1$ and $Y_2$ independent?

Solution: We have that $X_1, X_2$ have the same p.d.f.
$$h(x) = \frac{1}{2}\, e^{-x/2}, \quad 0 \le x < \infty,$$
corresponding to the $\chi^2(r)$ distribution with $r = 2$ degrees of freedom. By the way, this is the same as saying that $X_1, X_2$ follow exponential distributions with $\theta = 2$. Since $X_1, X_2$ are independent, the joint p.d.f. of $X_1$ and $X_2$ is
$$f(x_1, x_2) = h(x_1)\,h(x_2) = \frac{1}{4}\, e^{-(x_1 + x_2)/2}.$$
The change-of-variables formula is $g(y_1, y_2) = |J|\, f(v_1(y_1, y_2), v_2(y_1, y_2))$, using the determinant of the Jacobian
$$J = \begin{vmatrix} \dfrac{\partial v_1}{\partial y_1} & \dfrac{\partial v_1}{\partial y_2} \\[2mm] \dfrac{\partial v_2}{\partial y_1} & \dfrac{\partial v_2}{\partial y_2} \end{vmatrix},$$
where $v_i(y_1, y_2)$ is the inverse of $u_i$, $Y_i = u_i(X_1, X_2)$, $i = 1, 2$. In this case, $Y_1 = u_1(X_1, X_2) = X_1$, so $X_1 = v_1(Y_1, Y_2) = Y_1$, and $Y_2 = u_2(X_1, X_2) = X_1 + X_2$, so $X_2 = v_2(Y_1, Y_2) = Y_2 - Y_1$. Hence $J = 1$, so the joint p.d.f. of $Y_1$ and $Y_2$ is
$$g(y_1, y_2) = f(y_1, y_2 - y_1) = \frac{1}{4}\, e^{-y_2/2}, \quad 0 \le y_1 \le y_2 < \infty.$$
To determine if $Y_1$ and $Y_2$ are independent we compute the marginal p.d.f.'s. We have
$$g_1(y_1) = \int_{y_1}^{\infty} \frac{1}{4}\, e^{-y_2/2}\,dy_2 = \frac{1}{2}\, e^{-y_1/2},$$
and
$$g_2(y_2) = \int_{0}^{y_2} \frac{1}{4}\, e^{-y_2/2}\,dy_1 = \frac{y_2}{4}\, e^{-y_2/2}.$$
Since $g(y_1, y_2) \ne g_1(y_1)\,g_2(y_2)$, $Y_1$ and $Y_2$ are dependent.
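The marginals and the dependence conclusion can be checked numerically: integrating the joint $g$ over $y_2$ should recover $g_1$, and the joint should fail to factor at some point. A sketch (function names are ours; the truncation of the improper integral at $50$ is an assumption, justified because the exponential tail beyond it is negligible):

```python
import math

# The joint and marginal densities found above for Y1 = X1, Y2 = X1 + X2.
def g(y1, y2):
    return 0.25 * math.exp(-y2 / 2) if 0 <= y1 <= y2 else 0.0

def g1(y1):
    return 0.5 * math.exp(-y1 / 2)        # marginal of Y1: exponential, theta = 2

def g2(y2):
    return (y2 / 4) * math.exp(-y2 / 2)   # marginal of Y2: chi-square(4) density

# Recover g1(1.3) by integrating g over y2 on [1.3, 50] with the midpoint rule.
y1, n = 1.3, 100000
a, b = y1, 50.0
dx = (b - a) / n
marg = sum(g(y1, a + (k + 0.5) * dx) for k in range(n)) * dx

# The joint does not factor, e.g. at (y1, y2) = (1, 2), so Y1 and Y2 are dependent.
factored = math.isclose(g(1, 2), g1(1) * g2(2), rel_tol=1e-6)
print(round(marg, 4), factored)
# -> 0.261 False
```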