Multivariate distributions


CHAPTER 11

Multivariate distributions

11.1. Introduction

We want to discuss collections of random variables $(X_1, X_2, \ldots, X_n)$, which are known as random vectors. In the discrete case, we can define the density $p(x,y) = P(X = x, Y = y)$. Remember that here the comma means "and." In the continuous case a density is a function $f(x,y)$ such that

$$P(a \le X \le b, \ c \le Y \le d) = \int_a^b \int_c^d f(x,y) \, dy \, dx.$$

Example 11.1: If $f_{X,Y}(x,y) = c e^{-x} e^{-2y}$ for $0 < x < \infty$ and $x < y < \infty$, what is $c$?

Answer. We use the fact that a density must integrate to 1. So

$$\int_0^\infty \int_x^\infty c e^{-x} e^{-2y} \, dy \, dx = 1.$$

Recalling multivariable calculus, this is

$$\int_0^\infty c e^{-x} \frac{e^{-2x}}{2} \, dx = \frac{c}{6},$$

so $c = 6$.

The multivariate distribution function of $(X,Y)$ is defined by $F_{X,Y}(x,y) = P(X \le x, Y \le y)$. In the continuous case, this is

$$F_{X,Y}(x,y) = \int_{-\infty}^x \int_{-\infty}^y f(u,v) \, dv \, du,$$

and so we have

$$f(x,y) = \frac{\partial^2 F}{\partial x \, \partial y}(x,y).$$

The extension to $n$ random variables is exactly similar.

We have

$$P(a \le X \le b, \ c \le Y \le d) = \int_a^b \int_c^d f_{X,Y}(x,y) \, dy \, dx,$$

or

$$P((X,Y) \in D) = \iint_D f_{X,Y}(x,y) \, dy \, dx$$
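
Since the normalization in Example 11.1 is easy to get wrong, here is a quick numerical check (not part of the original notes); it assumes SciPy is available and simply restates the density and limits of the example:

```python
# Numerical sanity check of Example 11.1: integrate
# f(x, y) = 6 e^{-x} e^{-2y} over 0 < x < inf, x < y < inf.
import numpy as np
from scipy.integrate import dblquad

# dblquad expects func(y, x) with y the inner variable;
# here y runs from x to infinity for each fixed x.
total, abserr = dblquad(
    lambda y, x: 6.0 * np.exp(-x) * np.exp(-2.0 * y),
    0.0, np.inf,          # limits for the outer variable x
    lambda x: x,          # inner lower limit: y = x
    lambda x: np.inf,     # inner upper limit: y = infinity
)
print(total)  # ~1.0, confirming c = 6
```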

when $D$ is the set $\{(x,y) : a \le x \le b, \ c \le y \le d\}$. One can show this holds when $D$ is any set. For example,

$$P(X < Y) = \iint_{\{x < y\}} f_{X,Y}(x,y) \, dy \, dx.$$

If one has the joint density $f_{X,Y}(x,y)$ of $X$ and $Y$, one can recover the marginal densities of $X$ and of $Y$:

$$f_X(x) = \int_{-\infty}^\infty f_{X,Y}(x,y) \, dy, \qquad f_Y(y) = \int_{-\infty}^\infty f_{X,Y}(x,y) \, dx.$$

If we have a binomial with parameters $n$ and $p$, this can be thought of as the number of successes in $n$ trials, and

$$P(X = k) = \frac{n!}{k!(n-k)!} p^k (1-p)^{n-k}.$$

If we let $k_1 = k$, $k_2 = n - k$, $p_1 = p$, and $p_2 = 1 - p$, this can be rewritten as

$$\frac{n!}{k_1! \, k_2!} p_1^{k_1} p_2^{k_2},$$

as long as $n = k_1 + k_2$. Thus this is the probability of $k_1$ successes and $k_2$ failures, where the probabilities of success and failure are $p_1$ and $p_2$, resp. A multinomial random vector is $(X_1, \ldots, X_r)$ with

$$P(X_1 = n_1, \ldots, X_r = n_r) = \frac{n!}{n_1! \cdots n_r!} p_1^{n_1} \cdots p_r^{n_r},$$

where $n_1 + \cdots + n_r = n$ and $p_1 + \cdots + p_r = 1$. Thus this generalizes the binomial to more than 2 categories.

In the discrete case we say $X$ and $Y$ are independent if $P(X = x, Y = y) = P(X = x) P(Y = y)$ for all $x$ and $y$. In the continuous case, $X$ and $Y$ are independent if $P(X \in A, Y \in B) = P(X \in A) P(Y \in B)$ for all pairs of subsets $A$, $B$ of the reals. The left hand side is an abbreviation for

$$P(\{\omega : X(\omega) \text{ is in } A \text{ and } Y(\omega) \text{ is in } B\})$$

and similarly for the right hand side.

In the discrete case, if we have independence,

$$p_{X,Y}(x,y) = P(X = x, Y = y) = P(X = x) P(Y = y) = p_X(x) \, p_Y(y).$$
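
To make the marginal and independence definitions concrete, here is a small sketch (not from the notes); the joint pmf array and its values are made up for illustration:

```python
# Marginals of a discrete joint pmf stored as an array:
# rows are indexed by x-values, columns by y-values.
import numpy as np

# joint[i, j] = P(X = x_i, Y = y_j); an arbitrary example pmf
joint = np.array([[0.10, 0.20],
                  [0.30, 0.40]])

p_x = joint.sum(axis=1)   # marginal of X: sum over y
p_y = joint.sum(axis=0)   # marginal of Y: sum over x

# X and Y are independent iff the joint pmf is the outer product of marginals
print(p_x, p_y)
print(np.allclose(joint, np.outer(p_x, p_y)))  # False: this joint doesn't factor
```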

In other words, the joint density $p_{X,Y}$ factors. In the continuous case,

$$\int_a^b \int_c^d f_{X,Y}(x,y) \, dy \, dx = P(a \le X \le b, \ c \le Y \le d) = P(a \le X \le b) \, P(c \le Y \le d)$$
$$= \int_a^b f_X(x) \, dx \int_c^d f_Y(y) \, dy = \int_a^b \int_c^d f_X(x) f_Y(y) \, dy \, dx.$$

One can conclude from this by taking partial derivatives that $f_{X,Y}(x,y) = f_X(x) f_Y(y)$, or again the joint density factors. Going the other way, one can also see that if the joint density factors, then one has independence.

Example 11.2: Suppose one has a floor made out of wood planks and one drops a needle onto it. What is the probability the needle crosses one of the cracks? Suppose the needle is of length $L$ and the wood planks are $D$ across, with $L \le D$.

Answer. Let $X$ be the distance from the midpoint of the needle to the nearest crack and let $\Theta$ be the angle the needle makes with the vertical. Then $X$ and $\Theta$ will be independent. $X$ is uniform on $[0, D/2]$ and $\Theta$ is uniform on $[0, \pi/2]$. A little geometry shows that the needle will cross a crack if $L/2 > X/\cos\Theta$. We have $f_{X,\Theta} = \frac{4}{\pi D}$, and so we have to integrate this constant over the set where $X < L\cos\Theta/2$, $0 \le \Theta \le \pi/2$, and $0 \le X \le D/2$. The integral is

$$\int_0^{\pi/2} \int_0^{L\cos\theta/2} \frac{4}{\pi D} \, dx \, d\theta = \frac{2L}{\pi D}.$$

If $X$ and $Y$ are independent, then

$$P(X + Y \le a) = \iint_{\{x+y \le a\}} f_{X,Y}(x,y) \, dx \, dy = \iint_{\{x+y \le a\}} f_X(x) f_Y(y) \, dx \, dy$$
$$= \int_{-\infty}^\infty \int_{-\infty}^{a-y} f_X(x) f_Y(y) \, dx \, dy = \int_{-\infty}^\infty F_X(a-y) f_Y(y) \, dy.$$

Differentiating with respect to $a$, we have

$$f_{X+Y}(a) = \int_{-\infty}^\infty f_X(a-y) f_Y(y) \, dy.$$

There are a number of cases where this is interesting.
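
The needle answer $2L/(\pi D)$ is easy to confirm by simulation. The following Monte Carlo sketch is not in the original notes; the values of L and D, the seed, and the sample size are arbitrary choices satisfying $L \le D$:

```python
# Monte Carlo check of the Buffon needle probability 2L / (pi * D).
import numpy as np

rng = np.random.default_rng(0)
L, D, n = 1.0, 2.0, 1_000_000

x = rng.uniform(0.0, D / 2.0, n)          # midpoint distance to nearest crack
theta = rng.uniform(0.0, np.pi / 2.0, n)  # angle with the vertical
crosses = x < (L / 2.0) * np.cos(theta)   # crossing condition from the text

print(crosses.mean())          # simulated probability
print(2.0 * L / (np.pi * D))   # exact answer, ~0.3183 here
```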

(1) If $X$ is a gamma with parameters $s$ and $\lambda$ and $Y$ is a gamma with parameters $t$ and $\lambda$, then straightforward integration shows that $X + Y$ is a gamma with parameters $s + t$ and $\lambda$. In particular, the sum of $n$ independent exponentials with parameter $\lambda$ is a gamma with parameters $n$ and $\lambda$.

(2) If $Z$ is a $\mathcal{N}(0,1)$, then

$$F_{Z^2}(y) = P(Z^2 \le y) = P(-\sqrt{y} \le Z \le \sqrt{y}) = F_Z(\sqrt{y}) - F_Z(-\sqrt{y}).$$

Differentiating shows that $f_{Z^2}(y) = c e^{-y/2} y^{(1/2)-1}$, or $Z^2$ is a gamma with parameters $\frac12$ and $\frac12$. So using (1) above, if the $Z_i$ are independent $\mathcal{N}(0,1)$'s, then $\sum_{i=1}^n Z_i^2$ is a gamma with parameters $n/2$ and $\frac12$, i.e., a $\chi^2_n$.

(3) If $X_i$ is a $\mathcal{N}(\mu_i, \sigma_i^2)$ and the $X_i$ are independent, then some lengthy calculations show that $\sum_{i=1}^n X_i$ is a $\mathcal{N}(\sum \mu_i, \sum \sigma_i^2)$.

(4) The analogue for discrete random variables is easier. If $X$ and $Y$ take only nonnegative integer values, we have

$$P(X + Y = r) = \sum_{k=0}^r P(X = k, Y = r - k) = \sum_{k=0}^r P(X = k) \, P(Y = r - k).$$

In the case where $X$ is a Poisson with parameter $\lambda$ and $Y$ is a Poisson with parameter $\mu$, we see that $X + Y$ is a Poisson with parameter $\lambda + \mu$. To check this, use the above formula to get

$$P(X + Y = r) = \sum_{k=0}^r P(X = k) \, P(Y = r - k) = \sum_{k=0}^r e^{-\lambda} \frac{\lambda^k}{k!} \, e^{-\mu} \frac{\mu^{r-k}}{(r-k)!}$$
$$= e^{-(\lambda+\mu)} \frac{1}{r!} \sum_{k=0}^r \binom{r}{k} \lambda^k \mu^{r-k} = e^{-(\lambda+\mu)} \frac{(\lambda+\mu)^r}{r!},$$

using the binomial theorem.

Note that it is not always the case that the sum of two independent random variables will be a random variable of the same type. If $X$ and $Y$ are independent normals, then $-Y$ is also a normal (with $E(-Y) = -EY$ and $\mathrm{Var}(-Y) = (-1)^2 \mathrm{Var}\,Y = \mathrm{Var}\,Y$), and so $X - Y$ is also normal.

To define a conditional density in the discrete case, we write

$$p_{X \mid Y=y}(x \mid y) = P(X = x \mid Y = y).$$
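
As an illustration of item (4), one can compare the empirical pmf of a sum of independent Poisson samples with the Poisson$(\lambda+\mu)$ pmf. This sketch is not in the original notes; the parameters $\lambda = 2$, $\mu = 3$, the seed, and the sample size are arbitrary:

```python
# Simulation check: Poisson(lam) + Poisson(mu) behaves like Poisson(lam + mu).
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(1)
lam, mu, n = 2.0, 3.0, 1_000_000
s = rng.poisson(lam, n) + rng.poisson(mu, n)

for r in range(5):
    empirical = (s == r).mean()
    exact = exp(-(lam + mu)) * (lam + mu) ** r / factorial(r)
    print(r, round(empirical, 4), round(exact, 4))  # the two columns agree
```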

This is equal to

$$p_{X \mid Y=y}(x \mid y) = \frac{P(X = x, Y = y)}{P(Y = y)} = \frac{p(x,y)}{p_Y(y)}.$$

Analogously, we define in the continuous case

$$f_{X \mid Y=y}(x \mid y) = \frac{f(x,y)}{f_Y(y)}.$$

Just as in the one-dimensional case, there is a change of variables formula. Let us recall how the formula goes in one dimension. If $X$ has a density $f_X$ and $y = g(x)$, then

$$F_Y(y) = P(Y \le y) = P(g(X) \le y) = P(X \le g^{-1}(y)) = F_X(g^{-1}(y)).$$

Taking the derivative, using the chain rule, and recalling that the derivative of $g^{-1}(y)$ is $1/g'(g^{-1}(y))$, we have

$$f_Y(y) = f_X(g^{-1}(y)) \, \frac{1}{g'(g^{-1}(y))}.$$

The higher dimensional case is very analogous. Suppose $Y_1 = g_1(X_1, X_2)$ and $Y_2 = g_2(X_1, X_2)$. Let $h_1$ and $h_2$ be such that $X_1 = h_1(Y_1, Y_2)$ and $X_2 = h_2(Y_1, Y_2)$. (This plays the role of $g^{-1}$.) Let $J$ be the Jacobian of the mapping $(x_1, x_2) \to (g_1(x_1,x_2), g_2(x_1,x_2))$, so that

$$J = \frac{\partial g_1}{\partial x_1} \frac{\partial g_2}{\partial x_2} - \frac{\partial g_1}{\partial x_2} \frac{\partial g_2}{\partial x_1}.$$

(This is the analogue of $g'(y)$.) Using the change of variables theorem from multivariable calculus, we have

$$f_{Y_1,Y_2}(y_1,y_2) = f_{X_1,X_2}(x_1,x_2) \, |J|^{-1}.$$

Example 11.3: Suppose $X_1$ is $\mathcal{N}(0,1)$, $X_2$ is $\mathcal{N}(0,4)$, and $X_1$ and $X_2$ are independent. Let $Y_1 = X_1 + 2X_2$, $Y_2 = 2X_1 - 3X_2$. Then

$$y_1 = g_1(x_1,x_2) = x_1 + 2x_2, \qquad y_2 = g_2(x_1,x_2) = 2x_1 - 3x_2,$$

so

$$J = \begin{vmatrix} 1 & 2 \\ 2 & -3 \end{vmatrix} = -7.$$

(In general, $J$ might depend on $x$, and hence on $y$.) Some algebra leads to

$$x_1 = \tfrac{3}{7} y_1 + \tfrac{2}{7} y_2, \qquad x_2 = \tfrac{2}{7} y_1 - \tfrac{1}{7} y_2.$$

Since $X_1$ and $X_2$ are independent,

$$f_{X_1,X_2}(x_1,x_2) = f_{X_1}(x_1) f_{X_2}(x_2) = \frac{1}{\sqrt{2\pi}} e^{-x_1^2/2} \, \frac{1}{\sqrt{8\pi}} e^{-x_2^2/8}.$$

Therefore

$$f_{Y_1,Y_2}(y_1,y_2) = \frac{1}{\sqrt{2\pi}} e^{-(\frac37 y_1 + \frac27 y_2)^2/2} \, \frac{1}{\sqrt{8\pi}} e^{-(\frac27 y_1 - \frac17 y_2)^2/8} \cdot \frac{1}{7}.$$
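
The Jacobian and the inverse map in Example 11.3 can be checked symbolically. This sketch (not in the original notes) uses SymPy:

```python
# Symbolic check of the Jacobian and inverse map in Example 11.3.
import sympy as sp

x1, x2, y1, y2 = sp.symbols('x1 x2 y1 y2')
g1 = x1 + 2 * x2
g2 = 2 * x1 - 3 * x2

# Jacobian determinant of (x1, x2) -> (g1, g2)
J = sp.Matrix([[sp.diff(g1, x1), sp.diff(g1, x2)],
               [sp.diff(g2, x1), sp.diff(g2, x2)]]).det()
print(J)  # -7, so |J| = 7

# Invert the linear map to recover x1, x2 in terms of y1, y2
sol = sp.solve([sp.Eq(y1, g1), sp.Eq(y2, g2)], [x1, x2])
print(sol[x1], sol[x2])  # 3*y1/7 + 2*y2/7 and 2*y1/7 - y2/7
```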

11.2. Further examples and applications

Example 11.4: Suppose we roll two dice with sides 1, 1, 2, 2, 3, 3. Let $X$ be the largest value obtained on any of the two dice. Let $Y$ be the sum of the two dice. Find the joint p.m.f. of $X$ and $Y$.

Answer. First we make a table of all the possible outcomes. Note that individually, $X = 1, 2, 3$ and $Y = 2, 3, 4, 5, 6$. The table for possible outcomes of $(X,Y)$ jointly is:

  1st die \ 2nd die |   1    |   2    |   3
  ------------------+--------+--------+--------
          1         | (1, 2) | (2, 3) | (3, 4)
          2         | (2, 3) | (2, 4) | (3, 5)
          3         | (3, 4) | (3, 5) | (3, 6)

Using this table we have that the p.m.f. is given by:

  X \ Y |  2  |  3  |  4  |  5  |  6
  ------+-----+-----+-----+-----+-----
    1   | 1/9 |  0  |  0  |  0  |  0
    2   |  0  | 2/9 | 1/9 |  0  |  0
    3   |  0  |  0  | 2/9 | 2/9 | 1/9

Example 11.5: Let $X, Y$ have joint pdf

$$f(x,y) = \begin{cases} c e^{-x} e^{-2y}, & 0 < x < \infty, \ 0 < y < \infty, \\ 0 & \text{otherwise.} \end{cases}$$

(a) Find $c$ that makes this a joint pdf.

Answer. The region that we integrate over is the first quadrant, thus

$$1 = \int_0^\infty \int_0^\infty c e^{-x} e^{-2y} \, dx \, dy = c \int_0^\infty e^{-2y} \left[ -e^{-x} \right]_0^\infty dy = c \int_0^\infty e^{-2y} \, dy = \frac{c}{2}.$$

Then $c = 2$.

(b) Find $P(X < Y)$.

Copyright 2017 Phanuel Mariano, Patricia Alonso-Ruiz.
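
The table in Example 11.4 can be reproduced by brute-force enumeration. This sketch is not in the original notes; it treats each die as a fair three-valued die, since each of the values 1, 2, 3 appears on two of the six sides:

```python
# Enumeration check of the joint pmf in Example 11.4.
from collections import Counter
from fractions import Fraction

faces = [1, 2, 3]  # each value covers two of six sides, so P = 1/3 each
pmf = Counter()
for a in faces:
    for b in faces:
        pmf[(max(a, b), a + b)] += Fraction(1, 9)  # each ordered pair has prob 1/9

for (x, y), p in sorted(pmf.items()):
    print(f"P(X={x}, Y={y}) = {p}")
# matches the table: 1/9, 2/9, 1/9, 2/9, 2/9, 1/9
```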

Answer. We need to draw the region. Let $D = \{(x,y) : 0 < x < y, \ 0 < y < \infty\}$ and set up the double integral:

$$P(X < Y) = \iint_D f(x,y) \, dA = \int_0^\infty \int_0^y 2 e^{-x} e^{-2y} \, dx \, dy = \int_0^\infty 2 e^{-2y} (1 - e^{-y}) \, dy = 1 - \frac{2}{3} = \frac{1}{3}.$$

(c) Set up the double integral for $P(X > 1, Y < 1)$.

Answer.

$$P(X > 1, Y < 1) = \int_0^1 \int_1^\infty 2 e^{-x} e^{-2y} \, dx \, dy = (1 - e^{-2}) \, e^{-1}.$$

(d) Find the marginal $f_X(x)$.

Answer. We have

$$f_X(x) = \int_{-\infty}^\infty f(x,y) \, dy = \int_0^\infty 2 e^{-x} e^{-2y} \, dy = 2 e^{-x} \left[ -\tfrac12 e^{-2y} \right]_0^\infty = e^{-x}, \qquad 0 < x < \infty.$$

Example 11.6: Let $X, Y$ be r.v. with joint pdf $f(x,y) = 6 e^{-2x} e^{-3y}$, $0 < x < \infty$, $0 < y < \infty$. Are $X, Y$ independent?

Answer. Find $f_X$ and $f_Y$ and see if $f = f_X f_Y$. First

$$f_X(x) = \int_0^\infty 6 e^{-2x} e^{-3y} \, dy = 2 e^{-2x}, \qquad f_Y(y) = \int_0^\infty 6 e^{-2x} e^{-3y} \, dx = 3 e^{-3y},$$

which are both exponential. Since $f = f_X f_Y$, yes, they are independent!

Example 11.7: Let $X, Y$ have

$$f_{X,Y}(x,y) = x + y, \qquad 0 < x < 1, \ 0 < y < 1.$$

Are $X, Y$ independent?

Answer. Note that there is no way to factor $x + y$ as a product $f_X(x) f_Y(y)$, hence they can't be independent.
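
The factor-the-density criterion used in Examples 11.6 and 11.7 can be illustrated numerically. This check is not in the original notes; the grids are arbitrary, and the marginals $x + \frac12$ and $y + \frac12$ for Example 11.7 come from integrating $x + y$ over the other variable:

```python
# Grid test of the factorization criterion f(x, y) = f_X(x) f_Y(y).
import numpy as np

# Example 11.6: f = 6 e^{-2x} e^{-3y}, marginals 2 e^{-2x} and 3 e^{-3y}
x = np.linspace(0.1, 3.0, 50)
X, Y = np.meshgrid(x, x, indexing='ij')
f = 6 * np.exp(-2 * X) * np.exp(-3 * Y)
print(np.allclose(f, (2 * np.exp(-2 * X)) * (3 * np.exp(-3 * Y))))  # True

# Example 11.7: f = x + y on (0,1)^2, marginals x + 1/2 and y + 1/2
u = np.linspace(0.1, 0.9, 30)
U, V = np.meshgrid(u, u, indexing='ij')
print(np.allclose(U + V, (U + 0.5) * (V + 0.5)))  # False: not independent
```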

Proposition 11.1. If $X_i \sim \mathcal{N}(\mu_i, \sigma_i^2)$ are independent for $1 \le i \le n$, then

$$X_1 + \cdots + X_n \sim \mathcal{N}\!\left( \mu_1 + \cdots + \mu_n, \ \sigma_1^2 + \cdots + \sigma_n^2 \right).$$

In particular, if $X \sim \mathcal{N}(\mu_x, \sigma_x^2)$ and $Y \sim \mathcal{N}(\mu_y, \sigma_y^2)$, then $X + Y \sim \mathcal{N}(\mu_x + \mu_y, \ \sigma_x^2 + \sigma_y^2)$ and $X - Y \sim \mathcal{N}(\mu_x - \mu_y, \ \sigma_x^2 + \sigma_y^2)$. In general, for two independent Gaussians $X$ and $Y$ we have $cX + dY \sim \mathcal{N}(c\mu_x + d\mu_y, \ c^2\sigma_x^2 + d^2\sigma_y^2)$.

Example 11.8: Suppose $T \sim \mathcal{N}(95, 50)$ and $H \sim \mathcal{N}(65, 11)$ represent the grades of T. and H. in their Probability class.

(a) What is the probability that their average grade will be less than 90?

Answer. By the proposition, $T + H \sim \mathcal{N}(160, 61)$. Thus

$$P\left( \frac{T+H}{2} \le 90 \right) = P(T + H \le 180) = P\left( Z \le \frac{180 - 160}{\sqrt{61}} \right) = \Phi(2.56) \approx 0.9948.$$

(b) What is the probability that H. will have scored higher than T.?

Answer. Using $H - T \sim \mathcal{N}(-30, 61)$ we compute

$$P(H > T) = P(H - T > 0) = 1 - P(H - T \le 0) = 1 - P\left( Z \le \frac{0 - (-30)}{\sqrt{61}} \right) = 1 - \Phi(3.84) \approx 0.00006.$$

(c) Answer question (b) if $T \sim \mathcal{N}(90, 64)$ and $H \sim \mathcal{N}(70, 25)$.

Answer. By the proposition, $T - H \sim \mathcal{N}(20, 89)$ and so

$$P(H > T) = P(T - H < 0) = P\left( Z < \frac{0 - 20}{\sqrt{89}} \right) = 1 - \Phi(2.12) \approx 0.017.$$

Example 11.9: Suppose $X_1, X_2$ have joint pdf

$$f_{X_1,X_2}(x_1,x_2) = \begin{cases} x_1 + \frac32 x_2^2, & 0 \le x_1 \le 1, \ 0 \le x_2 \le 1, \\ 0 & \text{otherwise.} \end{cases}$$
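
A simulation cross-check of Example 11.8 (not part of the original notes), using the parameters given above; note that only the sum of the two variances, 61, actually enters the answers:

```python
# Monte Carlo check of Example 11.8 (a) and (b).
import numpy as np

rng = np.random.default_rng(2)
n = 2_000_000
T = rng.normal(95.0, np.sqrt(50.0), n)  # variance 50 as stated above
H = rng.normal(65.0, np.sqrt(11.0), n)  # variance 11 as stated above

print(((T + H) / 2 <= 90).mean())  # ~0.995, matching Phi(2.56)
print((H > T).mean())              # ~6e-5, matching 1 - Phi(30 / sqrt(61))
```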

Find the joint pdf of $Y_1 = X_1 + X_2$ and $Y_2 = 2X_2^2$.

Answer. Step 1: Find the Jacobian. Here

$$y_1 = g_1(x_1,x_2) = x_1 + x_2, \qquad y_2 = g_2(x_1,x_2) = 2x_2^2.$$

So

$$J = \begin{vmatrix} 1 & 1 \\ 0 & 4x_2 \end{vmatrix} = 4x_2.$$

Step 2: Solve for $x_1, x_2$ and get

$$x_2 = \sqrt{y_2/2}, \qquad x_1 = y_1 - \sqrt{y_2/2}.$$

Step 3: The joint pdf of $Y_1, Y_2$ is given by the formula:

$$f_{Y_1,Y_2}(y_1,y_2) = f_{X_1,X_2}(x_1,x_2) \, |J|^{-1} = f_{X_1,X_2}\!\left( y_1 - \sqrt{y_2/2}, \ \sqrt{y_2/2} \right) \frac{1}{4x_2}$$
$$= \begin{cases} \dfrac{1}{4\sqrt{y_2/2}} \left[ \left( y_1 - \sqrt{y_2/2} \right) + \frac34 y_2 \right], & \sqrt{y_2/2} \le y_1 \le 1 + \sqrt{y_2/2}, \ 0 \le y_2 \le 2, \\ 0 & \text{otherwise.} \end{cases}$$

11.3. Exercises

Exercise 11.1: Suppose that 2 balls are chosen without replacement from an urn consisting of 5 white and 8 red balls. Let $X$ equal 1 if the first ball selected is white and zero otherwise. Let $Y$ equal 1 if the second ball selected is white and zero otherwise.
(A) Find the probability mass function of $X, Y$.
(B) Find $E(XY)$.
(C) Is it true that $E(XY) = (EX)(EY)$?
(D) Are $X, Y$ independent?

Exercise 11.2: Suppose you roll two fair dice. Find the probability mass function of $X$ and $Y$, where $X$ is the largest value obtained on any die, and $Y$ is the sum of the values.

Exercise 11.3: Suppose the joint density function of $X$ and $Y$ is $f(x,y) = \frac14$ for $0 < x < 2$ and $0 < y < 2$.
(A) Calculate $P(0 < X < 1, \ \frac13 < Y < \frac43)$.
(B) Calculate $P(XY < 2)$.
(C) Calculate the marginal distributions $f_X(x)$ and $f_Y(y)$.

Exercise 11.4: The joint probability density function of $X$ and $Y$ is given by

$$f(x,y) = 2 e^{-(x+2y)}, \qquad 0 \le x < \infty, \ 0 \le y < \infty.$$

Find $P(X < Y)$.

Exercise 11.5: Suppose $X$ and $Y$ are independent random variables and that $X$ is exponential with $\lambda = \frac14$ and $Y$ is uniform on $(2, 5)$. Calculate the probability that $X + Y < 8$.

Exercise 11.6: Consider $X$ and $Y$ given by the joint density

$$f(x,y) = \begin{cases} 10 x^2 y, & 0 \le y \le x \le 1, \\ 0 & \text{otherwise.} \end{cases}$$

(A) Find the marginal pdfs, $f_X(x)$ and $f_Y(y)$.
(B) Are $X$ and $Y$ independent random variables?
(C) Find $P\left( Y \le \frac{X}{2} \right)$.
(D) Find $P\left( Y \le \frac{X}{4} \mid Y \le \frac{X}{2} \right)$.
(E) Find $E[X]$.

Exercise 11.7: Consider $X$ and $Y$ given by the joint density

$$f(x,y) = \begin{cases} 4xy, & 0 \le x, y \le 1, \\ 0 & \text{otherwise.} \end{cases}$$

(A) Find the marginal pdfs, $f_X$ and $f_Y$.
(B) Are $X$ and $Y$ independent?
(C) Find $EY$.

Exercise 11.8: Consider $X, Y$ given by the joint pdf

$$f(x,y) = \begin{cases} \frac23 (x + 2y), & 0 \le x \le 1, \ 0 \le y \le 1, \\ 0 & \text{otherwise.} \end{cases}$$

Are $X$ and $Y$ independent random variables?

Exercise 11.9: Suppose that gross weekly ticket sales for UConn basketball games are normally distributed with mean $2,200,000 and standard deviation $230,000. What is the probability that the total gross ticket sales over the next two weeks exceeds $4,600,000?

Exercise 11.10: Suppose the joint density function of the random variables $X_1$ and $X_2$ is

$$f(x_1,x_2) = \begin{cases} 4 x_1 x_2, & 0 < x_1 < 1, \ 0 < x_2 < 1, \\ 0 & \text{otherwise.} \end{cases}$$

Let $Y_1 = X_1 + 2X_2$ and $Y_2 = 2X_1 - 3X_2$. What is the joint density function of $Y_1$ and $Y_2$?

Exercise 11.11: Suppose the joint density function of the random variables $X_1$ and $X_2$ is

$$f(x_1,x_2) = \begin{cases} \frac32 (x_1^2 + x_2^2), & 0 < x_1 < 1, \ 0 < x_2 < 1, \\ 0 & \text{otherwise.} \end{cases}$$

Let $Y_1 = 2X_1 - X_2$ and $Y_2 = X_1 + 3X_2$. What is the joint density function of $Y_1$ and $Y_2$?

Exercise 11.12: We roll two dice. Let $X$ be the minimum of the two numbers that appear, and let $Y$ be the maximum. Find the joint probability mass function of $(X,Y)$, that is, $P(X = i, Y = j)$, filling in a table of the form

  i \ j | 1 | 2 | 3 | 4 | 5 | 6
  ------+---+---+---+---+---+---
    1   |   |   |   |   |   |
    2   |   |   |   |   |   |
    3   |   |   |   |   |   |
    4   |   |   |   |   |   |
    5   |   |   |   |   |   |
    6   |   |   |   |   |   |

Also find the marginal probability mass functions of $X$ and $Y$. Finally, find the conditional probability mass function of $X$ given that $Y$ is 5, $P(X = i \mid Y = 5)$, for $i = 1, \ldots, 6$.

11.4. Selected solutions

Solution to Exercise 11.1(A): We have

$$p(0,0) = P(X = 0, Y = 0) = P(RR) = \frac{8}{13} \cdot \frac{7}{12} = \frac{14}{39},$$
$$p(1,0) = P(X = 1, Y = 0) = P(WR) = \frac{5}{13} \cdot \frac{8}{12} = \frac{10}{39},$$
$$p(0,1) = P(X = 0, Y = 1) = P(RW) = \frac{8}{13} \cdot \frac{5}{12} = \frac{10}{39},$$
$$p(1,1) = P(X = 1, Y = 1) = P(WW) = \frac{5}{13} \cdot \frac{4}{12} = \frac{5}{39}.$$

Solution to Exercise 11.1(B):

$$E(XY) = 1 \cdot P(X = 1, Y = 1) = \frac{5}{39} \approx 0.128.$$

Solution to Exercise 11.1(C): Not true, because

$$(EX)(EY) = P(X = 1) \, P(Y = 1) = \frac{5}{13} \cdot \frac{5}{13} = \frac{25}{169} \approx 0.148 \ne \frac{5}{39}.$$

Solution to Exercise 11.1(D): $X$ and $Y$ are not independent because

$$P(X = 1, Y = 1) = \frac{5}{39} \ne P(X = 1) \, P(Y = 1) = \left( \frac{5}{13} \right)^2.$$

Solution to Exercise 11.2: First we need to figure out what values $X, Y$ can attain. Note that $X$ can be any of 1, 2, 3, 4, 5, 6, but $Y$ is the sum and can only be as low as 2 and as high as 12. First we make a table of possibilities for $(X,Y)$ given the values of the dice. Recall $X$ is the largest of the two, and $Y$ is the sum of them. The possible outcomes are given by:

  1st die \ 2nd die |   1    |   2    |   3    |   4     |   5     |   6
  ------------------+--------+--------+--------+---------+---------+---------
          1         | (1, 2) | (2, 3) | (3, 4) | (4, 5)  | (5, 6)  | (6, 7)
          2         | (2, 3) | (2, 4) | (3, 5) | (4, 6)  | (5, 7)  | (6, 8)
          3         | (3, 4) | (3, 5) | (3, 6) | (4, 7)  | (5, 8)  | (6, 9)
          4         | (4, 5) | (4, 6) | (4, 7) | (4, 8)  | (5, 9)  | (6, 10)
          5         | (5, 6) | (5, 7) | (5, 8) | (5, 9)  | (5, 10) | (6, 11)
          6         | (6, 7) | (6, 8) | (6, 9) | (6, 10) | (6, 11) | (6, 12)
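
The pmf in the solution to Exercise 11.1 can be verified by exact enumeration (not in the original notes); `Fraction` keeps the arithmetic exact:

```python
# Exact enumeration of the urn pmf: 5 white, 8 red, two draws without replacement.
from fractions import Fraction

white, red = 5, 8
total = white + red
pmf = {}
for first_white in (0, 1):
    for second_white in (0, 1):
        p1 = Fraction(white if first_white else red, total)
        # one ball is gone before the second draw
        p2 = Fraction(
            (white - first_white) if second_white else red - (1 - first_white),
            total - 1,
        )
        pmf[(first_white, second_white)] = p1 * p2

print(pmf)  # {(0,0): 14/39, (0,1): 10/39, (1,0): 10/39, (1,1): 5/39}
print(pmf[(1, 1)], float(pmf[(1, 1)]))  # E(XY) = 5/39 ~ 0.128
```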

Then we make a table of the pmf $p(x,y)$ (blank entries are 0):

  X \ Y |  2   |  3   |  4   |  5   |  6   |  7   |  8   |  9   |  10  |  11  |  12
  ------+------+------+------+------+------+------+------+------+------+------+------
    1   | 1/36 |      |      |      |      |      |      |      |      |      |
    2   |      | 2/36 | 1/36 |      |      |      |      |      |      |      |
    3   |      |      | 2/36 | 2/36 | 1/36 |      |      |      |      |      |
    4   |      |      |      | 2/36 | 2/36 | 2/36 | 1/36 |      |      |      |
    5   |      |      |      |      | 2/36 | 2/36 | 2/36 | 2/36 | 1/36 |      |
    6   |      |      |      |      |      | 2/36 | 2/36 | 2/36 | 2/36 | 2/36 | 1/36

Solution to Exercise 11.3(A): We integrate the pdf over the bounds and get

$$\int_0^1 \int_{1/3}^{4/3} \frac14 \, dy \, dx = (1 - 0) \left( \frac43 - \frac13 \right) \frac14 = \frac14.$$

Solution to Exercise 11.3(B): We need to find the region that is within $0 < x, y < 2$ and $y < \frac{2}{x}$. (Try to draw the region.) We get two regions from this: one with bounds $0 < x < 1$, $0 < y < 2$, and the region $1 < x < 2$, $0 < y < \frac{2}{x}$. Then

$$P(XY < 2) = \int_0^1 \int_0^2 \frac14 \, dy \, dx + \int_1^2 \int_0^{2/x} \frac14 \, dy \, dx = \frac12 + \int_1^2 \frac{1}{2x} \, dx = \frac12 + \frac{\ln 2}{2}.$$

Solution to Exercise 11.3(C): Recall that

$$f_X(x) = \int_{-\infty}^\infty f(x,y) \, dy = \int_0^2 \frac14 \, dy = \frac12$$

for $0 < x < 2$, and 0 otherwise. By symmetry, $f_Y$ is the same.

Solution to Exercise 11.4: Draw a picture of the region and note the integral needs to be set up in the following way:

$$P(X < Y) = \int_0^\infty \int_0^y 2 e^{-(x+2y)} \, dx \, dy = \int_0^\infty \left( 2e^{-2y} - 2e^{-3y} \right) dy = \left[ -e^{-2y} + \frac23 e^{-3y} \right]_0^\infty = 1 - \frac23 = \frac13.$$
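
A Monte Carlo cross-check of Exercise 11.3(B) (not in the original notes): since $f \equiv \frac14$ on $(0,2)^2$, the pair $(X,Y)$ is just two independent Uniform$(0,2)$ variables, which is easy to simulate:

```python
# Monte Carlo check: P(XY < 2) = 1/2 + ln(2)/2 for independent Uniform(0, 2).
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
x = rng.uniform(0.0, 2.0, n)
y = rng.uniform(0.0, 2.0, n)

print((x * y < 2.0).mean())      # ~0.8466
print(0.5 + np.log(2.0) / 2.0)   # exact value, 0.8466...
```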

Solution to Exercise 11.5: We know that

$$f_X(x) = \begin{cases} \frac14 e^{-x/4}, & x \ge 0, \\ 0 & \text{otherwise,} \end{cases} \qquad f_Y(y) = \begin{cases} \frac13, & 2 < y < 5, \\ 0 & \text{otherwise.} \end{cases}$$

Since $X, Y$ are independent, $f_{X,Y} = f_X f_Y$; thus

$$f_{X,Y}(x,y) = \begin{cases} \frac{1}{12} e^{-x/4}, & x \ge 0, \ 2 < y < 5, \\ 0 & \text{otherwise.} \end{cases}$$

Draw the region $\{X + Y < 8\}$, which corresponds to $x \ge 0$, $2 < y < 5$, and $y < 8 - x$. Drawing a picture of the region, we get the corresponding bounds $2 < y < 5$ and $0 < x < 8 - y$, so that

$$P(X + Y < 8) = \int_2^5 \int_0^{8-y} \frac{1}{12} e^{-x/4} \, dx \, dy = \int_2^5 \frac13 \left( 1 - e^{(y-8)/4} \right) dy = 1 - \frac43 \left( e^{-3/4} - e^{-3/2} \right) \approx 0.668.$$

Solution to Exercise 11.6(A): We have

$$f_X(x) = \begin{cases} 5x^4, & 0 \le x \le 1, \\ 0 & \text{otherwise,} \end{cases} \qquad f_Y(y) = \begin{cases} \frac{10}{3} y (1 - y^3), & 0 \le y \le 1, \\ 0 & \text{otherwise.} \end{cases}$$

Solution to Exercise 11.6(B): No, since $f_{X,Y} \ne f_X f_Y$.

Solution to Exercise 11.6(C): $P\left( Y \le \frac{X}{2} \right) = \frac14$.

Solution to Exercise 11.6(D): Also $\frac14$: since $P\left( Y \le \frac{X}{4} \right) = \frac{1}{16}$, the conditional probability is $(1/16)/(1/4) = \frac14$.

Solution to Exercise 11.6(E): Use $f_X$ and the definition of expected value: $E[X] = \int_0^1 x \cdot 5x^4 \, dx = \frac56$.

Solution to Exercise 11.7(A): $f_X(x) = 2x$ and $f_Y(y) = 2y$, each on $[0,1]$.

Solution to Exercise 11.7(B): Yes! Since $f(x,y) = f_X(x) f_Y(y)$.

Solution to Exercise 11.7(C): We have $EY = \int_0^1 y \cdot 2y \, dy = \frac23$.

Solution to Exercise 11.8: We get $f_X(x) = \frac23 x + \frac23$ while $f_Y(y) = \frac13 + \frac43 y$, and $f \ne f_X f_Y$, so $X$ and $Y$ are not independent.

Solution to Exercise 11.9: If $W = X_1 + X_2$ is the sales over the next two weeks, then $W$ is normal with mean $2{,}200{,}000 + 2{,}200{,}000 = 4{,}400{,}000$ and variance $230{,}000^2 + 230{,}000^2$.
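
A simulation cross-check of the Exercise 11.5 solution (not part of the original notes); note that NumPy parameterizes the exponential distribution by its mean, so rate $\lambda = \frac14$ corresponds to scale 4:

```python
# Monte Carlo check of P(X + Y < 8) for X ~ Exp(rate 1/4), Y ~ Uniform(2, 5).
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000
x = rng.exponential(scale=4.0, size=n)  # scale = 1/rate = 4
y = rng.uniform(2.0, 5.0, n)

print((x + y < 8.0).mean())                                # ~0.668
print(1.0 - (4.0 / 3.0) * (np.exp(-0.75) - np.exp(-1.5)))  # exact, 0.6677...
```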

Thus the standard deviation is $\sqrt{230{,}000^2 + 230{,}000^2} \approx 325{,}269.12$. Hence

$$P(W > 4{,}600{,}000) = P\left( Z > \frac{4{,}600{,}000 - 4{,}400{,}000}{325{,}269.12} \right) = P(Z > 0.61) = 1 - \Phi(0.61) \approx 0.27.$$

Solution to Exercise 11.10: Step 1: Find the Jacobian. Here

$$y_1 = g_1(x_1,x_2) = x_1 + 2x_2, \qquad y_2 = g_2(x_1,x_2) = 2x_1 - 3x_2.$$

So

$$J = \begin{vmatrix} 1 & 2 \\ 2 & -3 \end{vmatrix} = -7.$$

Step 2: Solve for $x_1, x_2$ and get

$$x_1 = \frac37 y_1 + \frac27 y_2, \qquad x_2 = \frac27 y_1 - \frac17 y_2.$$

Step 3: The joint pdf of $Y_1, Y_2$ is given by the formula:

$$f_{Y_1,Y_2}(y_1,y_2) = f_{X_1,X_2}(x_1,x_2) \, |J|^{-1} = f_{X_1,X_2}\!\left( \frac37 y_1 + \frac27 y_2, \ \frac27 y_1 - \frac17 y_2 \right) \frac17.$$

Since we are given the joint pdf of $X_1$ and $X_2$, plugging these into $f_{X_1,X_2}$ we have

$$f_{Y_1,Y_2}(y_1,y_2) = \begin{cases} \dfrac{4}{7^3} (3y_1 + 2y_2)(2y_1 - y_2), & 0 < 3y_1 + 2y_2 < 7 \ \text{and} \ 0 < 2y_1 - y_2 < 7, \\ 0 & \text{otherwise.} \end{cases}$$

Solution to Exercise 11.11: Step 1: Find the Jacobian. Here

$$y_1 = g_1(x_1,x_2) = 2x_1 - x_2, \qquad y_2 = g_2(x_1,x_2) = x_1 + 3x_2.$$

So

$$J = \begin{vmatrix} 2 & -1 \\ 1 & 3 \end{vmatrix} = 7.$$

Step 2: Solve for $x_1, x_2$ and get

$$x_1 = \frac17 (3y_1 + y_2), \qquad x_2 = \frac17 (-y_1 + 2y_2).$$

Step 3: The joint pdf of $Y_1, Y_2$ is given by the formula:

$$f_{Y_1,Y_2}(y_1,y_2) = f_{X_1,X_2}(x_1,x_2) \, |J|^{-1} = f_{X_1,X_2}\!\left( \frac17 (3y_1 + y_2), \ \frac17 (-y_1 + 2y_2) \right) \frac17.$$

Since we are given the joint pdf of $X_1$ and $X_2$, plugging these into $f_{X_1,X_2}$ we have

$$f_{Y_1,Y_2}(y_1,y_2) = \begin{cases} \dfrac{3}{2 \cdot 7^3} \left[ (3y_1 + y_2)^2 + (-y_1 + 2y_2)^2 \right], & 0 < 3y_1 + y_2 < 7 \ \text{and} \ 0 < -y_1 + 2y_2 < 7, \\ 0 & \text{otherwise.} \end{cases}$$