UNIT 4

1. Define joint distribution and joint probability density function for two random variables X and Y.

Let F_X(x) and F_Y(y) represent the probability distribution functions of the two random variables X and Y respectively:
F_X(x) = P{X ≤ x} and F_Y(y) = P{Y ≤ y}.
The probability of the joint event {X ≤ x, Y ≤ y} is the joint probability distribution function F_XY(x,y), defined as
F_XY(x,y) = P{X ≤ x, Y ≤ y}.
The joint probability density function f_XY(x,y) may be defined as the second derivative of the joint probability distribution function:
f_XY(x,y) = ∂²F_XY(x,y) / ∂x ∂y.

2. A joint probability density function is
f(x,y) = 1/(ab) for 0 < x < a, 0 < y < b
       = 0 elsewhere.
Find the joint probability distribution function.

F_XY(x,y) = ∫₋∞^y ∫₋∞^x f(u,v) du dv = ∫₀^y ∫₀^x (1/(ab)) du dv = xy/(ab) for 0 < x < a, 0 < y < b.

GRIET ECE
For the remaining regions:
F_XY(x,y) = 0 for x ≤ 0 or y ≤ 0
          = x/a for 0 < x < a, y ≥ b
          = y/b for x ≥ a, 0 < y < b
          = 1 for x ≥ a and y ≥ b.

3. (a) Let X and Y be random variables with a given joint density function f_XY(x,y). Find the marginal and conditional density functions. (b) Distinguish between joint distribution and marginal distribution.

(a) The marginal density functions are obtained by integrating the joint density over the other variable:
f_X(x) = ∫₋∞^∞ f_XY(x,y) dy and f_Y(y) = ∫₋∞^∞ f_XY(x,y) dx.
The conditional density functions are
f(x|y) = f_XY(x,y) / f_Y(y), f_Y(y) ≠ 0, and f(y|x) = f_XY(x,y) / f_X(x), f_X(x) ≠ 0.

(b) If (X,Y) is a two-dimensional discrete random variable such that P(X = x_i, Y = y_j) = p_ij, then p_ij is called the probability mass function (or probability function) of (X,Y), provided the following conditions are satisfied:
1. p_ij ≥ 0 for all i, j
2. Σ_i Σ_j p_ij = 1
The set of triples (x_i, y_j, p_ij), i = 1, 2, …, m, j = 1, 2, …, n, is called the joint probability distribution of (X,Y).
P(X = x_i) = Σ_j p_ij is called the marginal probability function of X; it is defined for X = x_i and denoted p_i*. The collection of pairs (x_i, p_i*), i = 1, 2, 3, …, is called the marginal probability distribution of X.
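As a quick sketch of 3(b), the marginal probability functions can be computed from a small joint pmf table. The table values below are invented purely for illustration; they are not from these notes.

```python
# Marginal probability functions from a hypothetical joint pmf table,
# illustrating p_i* = sum over j of p_ij.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

# Check the two pmf conditions: p_ij >= 0 and the total probability is 1.
assert all(p >= 0 for p in joint.values())
assert abs(sum(joint.values()) - 1.0) < 1e-12

# Marginal of X: sum the joint pmf over all values of Y (and vice versa).
p_x = {}
p_y = {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

print(p_x)  # marginal distribution of X: {0: 0.3, 1: 0.7} up to float error
print(p_y)  # marginal distribution of Y: {0: 0.4, 1: 0.6} up to float error
```

Note that each marginal sums to 1 on its own, which is why it is a valid (one-dimensional) probability distribution.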
4. Distinguish between point conditioning and interval conditioning.

We know that the conditional probability of A given B is written as
P(A|B) = P(A ∩ B) / P(B), P(B) ≠ 0.
This concept applies equally well to distribution functions. Let A be the event {X ≤ x} for the random variable X. If B is given, the conditional distribution function of the random variable X is
F_X(x|B) = P{X ≤ x, B} / P(B).
The event B is defined from some characteristic of the physical experiment. It may be defined in terms of the random variable X itself or of some random variable other than X.

If the event B is defined at a single point of a second random variable Y, i.e. B = {Y = y}, this is called point conditioning. The corresponding conditional density function is
f_X(x | Y = y) = f_XY(x,y) / f_Y(y), provided f_Y(y) ≠ 0.

If the random variable Y is defined in such a way that its value lies between two constants y_a and y_b, i.e. B = {y_a < Y ≤ y_b}, this is called interval conditioning. The corresponding conditional density function is given by
f_X(x | y_a < Y ≤ y_b) = ∫_{y_a}^{y_b} f_XY(x,y) dy / ∫_{y_a}^{y_b} f_Y(y) dy, provided the denominator is non-zero.
Similarly, if X is given,
f_Y(y | x_a < X ≤ x_b) = ∫_{x_a}^{x_b} f_XY(x,y) dx / ∫_{x_a}^{x_b} f_X(x) dx.
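The two kinds of conditioning can be illustrated numerically. The sketch below assumes the hypothetical joint density f(x,y) = x + y on the unit square (not a density from these notes) and compares point conditioning at Y = 0.5 with interval conditioning over 0 < Y ≤ 0.5, both evaluated at x = 0.25.

```python
# Point vs interval conditioning for the hypothetical density
# f(x,y) = x + y on 0 <= x,y <= 1, whose marginal is f_Y(y) = y + 1/2.
import numpy as np

def f_xy(x, y):
    return x + y  # integrates to 1 over the unit square, so a valid pdf

ys = np.linspace(0.0, 1.0, 100001)
dy = ys[1] - ys[0]
x = 0.25

# Point conditioning: f(x | Y = y0) = f(x, y0) / f_Y(y0).
y0 = 0.5
f_point = f_xy(x, y0) / (y0 + 0.5)

# Interval conditioning: ratio of two integrals over [ya, yb],
# approximated here by Riemann sums on the grid.
ya, yb = 0.0, 0.5
mask = (ys >= ya) & (ys <= yb)
num = np.sum(f_xy(x, ys[mask])) * dy          # ∫ f(x,y) dy over the interval
den = np.sum(ys[mask] + 0.5) * dy             # ∫ f_Y(y) dy over the interval
f_interval = num / den

print(f_point)     # 0.75
print(f_interval)  # close to 2/3
```

The two answers differ, as expected: point conditioning uses the density at a single y, while interval conditioning averages the joint density over the whole interval.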
5. Let X and Y be jointly continuous random variables with joint density function
f_XY(x,y) = exp[−(x + y)] for x > 0, y > 0
          = 0 otherwise.
Check whether X and Y are independent, and find P(X < 1, Y < 1).

The marginal density of X is
f_X(x) = ∫₀^∞ e^{−(x+y)} dy = e^{−x} ∫₀^∞ e^{−y} dy = e^{−x}, x > 0.
Similarly, f_Y(y) = e^{−y}, y > 0.
Since f_XY(x,y) = e^{−(x+y)} = e^{−x} · e^{−y} = f_X(x) · f_Y(y), X and Y are independent.

(i) By independence, P(X < 1, Y < 1) = P(X < 1) · P(Y < 1). Consider
P(X < 1) = ∫₀^1 e^{−x} dx = [−e^{−x}]₀^1 = 1 − e^{−1}
(the upper limit of the integral is 1 and the lower limit is 0).
Similarly, P(Y < 1) = 1 − e^{−1}. Therefore
P(X < 1, Y < 1) = (1 − e^{−1})².
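The factorisation in Question 5 can be sanity-checked by simulation. The sketch below uses only the Python standard library and draws X and Y as independent Exp(1) variables, matching the marginals found above; the sample size and seed are arbitrary choices.

```python
# Monte Carlo check of P(X < 1, Y < 1) = (1 - e**-1)**2 for the joint
# density exp(-(x+y)), x, y > 0, whose marginals are both Exp(1).
import math
import random

random.seed(1)
n = 200_000
count = sum(
    1 for _ in range(n)
    if random.expovariate(1.0) < 1 and random.expovariate(1.0) < 1
)
estimate = count / n
exact = (1 - math.exp(-1)) ** 2

print(estimate, exact)  # both close to 0.40
```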
6. If X and Y are two Gaussian random variables and a random variable Z is defined as Z = X + Y, find f_Z(z).

Let X and Y be two independent, normalized Gaussian random variables, i.e. σ_X² = σ_Y² = 1 and m_X = m_Y = 0, so that
f_X(x) = (1/√(2π)) exp(−x²/2) and f_Y(y) = (1/√(2π)) exp(−y²/2).
The density of Z is the convolution
f_Z(z) = ∫₋∞^∞ f_X(x) f_Y(z − x) dx = (1/2π) ∫₋∞^∞ exp(−x²/2) exp(−(z − x)²/2) dx.
The exponent can be rearranged by completing the square:
−[x² + (z − x)²]/2 = −(x − z/2)² − z²/4.
Let u = x − z/2, so du = dx. Then
f_Z(z) = (1/2π) exp(−z²/4) ∫₋∞^∞ exp(−u²) du = (1/2π) exp(−z²/4) · √π = (1/(2√π)) exp(−z²/4).
From the general form of the Gaussian pdf, this is (1/√(2π·2)) exp(−z²/(2·2)). So Z is also a Gaussian random variable, with zero mean and variance 2.
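The result of Question 6 can be verified numerically: convolving two sampled standard Gaussian pdfs should reproduce a Gaussian with variance 2. A minimal sketch assuming NumPy (grid range and spacing are arbitrary choices):

```python
# Numerical check: the convolution of two standard Gaussian pdfs equals a
# Gaussian pdf with zero mean and variance 2.
import numpy as np

x = np.linspace(-10.0, 10.0, 2001)  # odd point count keeps z = 0 on the grid
dx = x[1] - x[0]

def gauss(x, var):
    return np.exp(-x**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

fx = gauss(x, 1.0)
# Riemann-sum approximation of f_Z(z) = ∫ f_X(u) f_Y(z - u) du.
fz = np.convolve(fx, fx, mode="same") * dx

expected = gauss(x, 2.0)  # Gaussian with variance 2, as derived above
print(np.max(np.abs(fz - expected)))  # close to zero (grid error only)
```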
7. Two independent random variables X and Y have the probability density functions
f_X(x) = e^{−x} for x ≥ 0; f_Y(y) = 1 for 0 ≤ y ≤ 1
       = 0 otherwise.
Calculate the probability distribution and density functions of the random variable Z = X + Y.

Since X and Y are independent,
f_Z(z) = ∫₋∞^∞ f_Y(y) f_X(z − y) dy.
The integrand is non-zero only where 0 ≤ y ≤ 1 and z − y ≥ 0, i.e. y ≤ z.

Case 1: 0 ≤ z ≤ 1. Here y runs from 0 to z:
f_Z(z) = ∫₀^z e^{−(z−y)} dy = e^{−z} ∫₀^z e^{y} dy = e^{−z}(e^{z} − 1) = 1 − e^{−z}.
The corresponding distribution function is
F_Z(z) = ∫₀^z (1 − e^{−t}) dt = z + e^{−z} − 1, 0 ≤ z ≤ 1.

Case 2: z > 1. Here y runs from 0 to 1:
f_Z(z) = ∫₀^1 e^{−(z−y)} dy = e^{−z}(e − 1) = e^{−(z−1)} − e^{−z}.
The distribution function for z > 1 is
F_Z(z) = F_Z(1) + ∫₁^z (e^{−(t−1)} − e^{−t}) dt = 1 + e^{−z} − e^{−(z−1)},
and F_Z(z) = 0 for z < 0. As z → ∞, F_Z(z) → 1, as required.

8. If f_X(x) = f_Y(y) = 1/(2α) for −α < x < +α (and 0 otherwise), where X and Y are two independent random variables, and Z = X + Y, find f_Z(z).

We know that the pdf of the sum of two independent random variables is the convolution of the pdfs of the individual random variables:
f_Z(z) = f_X(z) * f_Y(z) = ∫₋∞^∞ f_X(x) f_Y(z − x) dx.
Here f_X(x) = f_Y(x) = 0.5/α on (−α, +α). Assuming that the variable x changes between −α and +α, the integrand is non-zero only where both −α < x < +α and −α < z − x < +α, an interval of length 2α − |z| whenever |z| < 2α. Thus
f_Z(z) = (1/(4α²)) (2α − |z|) for |z| ≤ 2α
       = 0 otherwise,
which is a triangular density on (−2α, +2α).
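The piecewise density obtained in Question 7 can be checked against a direct numerical evaluation of the convolution integral. A minimal sketch using only the Python standard library (the test points and step count are arbitrary choices):

```python
# Numerical check of Question 7: convolving f_X(x) = e**-x (x >= 0) with a
# uniform density on [0, 1] should match the piecewise result
#   f_Z(z) = 1 - e**-z           for 0 <= z <= 1
#   f_Z(z) = e**-(z-1) - e**-z   for z > 1.
import math

def f_z_numeric(z, steps=100_000):
    # Midpoint Riemann sum of  ∫ f_Y(y) f_X(z - y) dy  over 0 <= y <= 1.
    dy = 1.0 / steps
    total = 0.0
    for i in range(steps):
        y = (i + 0.5) * dy
        if z - y >= 0:
            total += math.exp(-(z - y)) * dy
    return total

def f_z_exact(z):
    if z < 0:
        return 0.0
    if z <= 1:
        return 1 - math.exp(-z)
    return math.exp(-(z - 1)) - math.exp(-z)

for z in (0.5, 1.5, 3.0):
    print(z, f_z_numeric(z), f_z_exact(z))  # numeric and exact values agree
```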
9. Explain how to determine the PDF of the sum of two random variables.

Let X and Y be two random variables, and let Z be another random variable defined as their sum, Z = X + Y. Let f_XY(x,y) be the joint probability density function of X and Y, and let z be a particular value of Z. The event {Z ≤ z} corresponds to the region of the X–Y plane lying below the line x + y = z, as shown in fig 4.9.1. The density function of Z can be determined by holding one of the random variables, say X, fixed and letting it take every value between −∞ and +∞. The CDF of the random variable Z is then
F_Z(z) = P{X + Y ≤ z} = ∫₋∞^∞ ∫₋∞^{z−x} f_XY(x,y) dy dx.
Differentiating with respect to z gives the density
f_Z(z) = ∫₋∞^∞ f_XY(x, z − x) dx.
Since X and Y are independent, f_XY(x, z − x) = f_X(x) f_Y(z − x), so
f_Z(z) = ∫₋∞^∞ f_X(x) f_Y(z − x) dx.
The convolution of any two density functions, such as those of X and Y, is itself a density function. Therefore the density function of Z is given by the convolution of the density functions of X and Y.

10. Explain the central limit theorem. State and prove the central limit theorem.

The central limit theorem states that a random variable X which is the sum of a large number of independent random variables always approaches the Gaussian distribution, irrespective of the type of distribution each individual variable possesses and of its amount of contribution to the sum.

As an illustration, consider two random variables X1 and X2 with rectangular densities, as shown in figures 4.10.1 (a) and (b). The new random variable X3 = X1 + X2 has a triangular density, obtained by convolving the two rectangular densities. When this triangular density is in turn convolved with a rectangular density like that of figure 4.10.1 (a), the result already approximately follows the Gaussian distribution, as shown in figure 4.10.1 (d).
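The convolution picture described for the central limit theorem can be reproduced numerically: convolving a rectangular density with itself gives the triangle, and one more convolution already lies close to the Gaussian limit. A sketch assuming NumPy (the grid and the rectangle width 1 are arbitrary choices):

```python
# Repeated convolution of a rectangular density, mirroring fig 4.10.1:
# rectangle -> triangle -> approximately Gaussian.
import numpy as np

x = np.linspace(-3.0, 3.0, 6001)
dx = x[1] - x[0]
rect = np.where(np.abs(x) <= 0.5, 1.0, 0.0)  # uniform density on [-1/2, 1/2]

tri = np.convolve(rect, rect, mode="same") * dx    # sum of 2: triangular
three = np.convolve(tri, rect, mode="same") * dx   # sum of 3 uniforms

# Gaussian with the matching mean 0 and variance 3 * (1/12) predicted by
# the central limit theorem (each uniform has variance 1/12).
var = 3.0 / 12.0
gauss = np.exp(-x**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

print(np.max(np.abs(three - gauss)))  # already small: close to Gaussian
```

With more terms in the sum the discrepancy keeps shrinking, which is exactly the convergence the theorem describes.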