UNIT 4
GRIET ECE

1. Define the joint distribution function and the joint probability density function of two random variables X and Y.

Let F_X(x) and F_Y(y) represent the probability distribution functions of the two random variables X and Y respectively:

F_X(x) = P{X ≤ x} and F_Y(y) = P{Y ≤ y}

The probability of the joint event {X ≤ x, Y ≤ y} is the joint probability distribution function F_XY(x, y), defined as

F_XY(x, y) = P{X ≤ x, Y ≤ y}

The joint probability density function f_XY(x, y) is defined as the second mixed partial derivative of the joint probability distribution function:

f_XY(x, y) = ∂²F_XY(x, y) / ∂x ∂y

2. A joint probability density function is

f_XY(x, y) = 1/(ab) for 0 < x < a, 0 < y < b
           = 0 elsewhere.

Find the joint probability distribution function.

F_XY(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f_XY(u, v) du dv

For 0 < x < a and 0 < y < b:

F_XY(x, y) = ∫_0^y ∫_0^x (1/(ab)) du dv = xy/(ab)

Collecting all the regions:

F_XY(x, y) = 0         for x ≤ 0 or y ≤ 0
           = xy/(ab)   for 0 < x < a, 0 < y < b
           = y/b       for x ≥ a, 0 < y < b
           = x/a       for 0 < x < a, y ≥ b
           = 1         for x ≥ a and y ≥ b

As a check, F_XY(−∞, −∞) = 0 and F_XY(∞, ∞) = 1.
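The uniform joint distribution of question 2 can be checked numerically. The sketch below is a minimal Monte Carlo check; the values a = 2 and b = 3 are assumed for illustration and are not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 2.0, 3.0          # assumed example rectangle dimensions
n = 1_000_000

# X ~ Uniform(0, a) and Y ~ Uniform(0, b), independent, so that
# the joint density is 1/(a*b) on the rectangle
x = rng.uniform(0, a, n)
y = rng.uniform(0, b, n)

# Empirical joint CDF at (x0, y0) versus the closed form x0*y0/(a*b)
x0, y0 = 1.0, 2.0
empirical = np.mean((x <= x0) & (y <= y0))
closed_form = x0 * y0 / (a * b)
print(empirical, closed_form)   # both close to 1/3
```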

3. (a) Let X and Y be random variables with joint density function f_XY(x, y). Find the marginal and conditional density functions. (b) Distinguish between a joint distribution and a marginal distribution.

(a) The marginal density functions are obtained by integrating the joint density over the other variable:

f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy and f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx

The conditional density functions are the joint density divided by the marginal density of the conditioning variable:

f_{Y|X}(y | x) = f_XY(x, y) / f_X(x), where f_X(x) > 0
f_{X|Y}(x | y) = f_XY(x, y) / f_Y(y), where f_Y(y) > 0

(b) If (X, Y) is a two-dimensional discrete random variable such that P(X = x_i, Y = y_j) = p_ij, then p_ij is called the probability mass function (or probability function) of (X, Y), provided the following conditions are satisfied:

1. p_ij ≥ 0 for all i, j
2. Σ_i Σ_j p_ij = 1

The set of triples (x_i, y_j, p_ij), i = 1, 2, …, m, j = 1, 2, …, n, is called the joint probability distribution of (X, Y).

P(X = x_i) = Σ_j p_ij is called the marginal probability function of X; it is defined for X = x_i and is denoted p_i. The collection of pairs (x_i, p_i), i = 1, 2, 3, …, is called the marginal probability distribution of X.
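The discrete definitions in part (b) can be sketched with a small table. The joint pmf below is a hypothetical example (not from the text); marginals are row and column sums, and a conditional pmf is a normalized row:

```python
import numpy as np

# Hypothetical joint pmf table: rows index x_i, columns index y_j
p = np.array([[0.10, 0.20, 0.10],
              [0.15, 0.25, 0.20]])

assert np.isclose(p.sum(), 1.0) and (p >= 0).all()   # valid joint pmf

p_x = p.sum(axis=1)     # marginal of X: sum over the y_j
p_y = p.sum(axis=0)     # marginal of Y: sum over the x_i

# Conditional pmf of Y given X = x_0: row 0 normalized by p_x[0]
p_y_given_x0 = p[0] / p_x[0]
print(p_x, p_y, p_y_given_x0)
```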

4. Distinguish between point conditioning and interval conditioning.

We know that the conditional probability of A given B is P(A | B) = P(A ∩ B)/P(B). This concept applies equally well to distribution functions. Let A be the event {X ≤ x} for the random variable X. If an event B with P(B) ≠ 0 is given, the conditional distribution function of X is

F_X(x | B) = P{X ≤ x, B} / P(B)

The event B can be defined from some characteristic of the physical experiment: in terms of the random variable X itself, or in terms of some other random variable.

If B is defined at a single point of a second random variable Y, i.e., B = {y − Δy < Y ≤ y} with Δy → 0, this is called point conditioning. The corresponding conditional density function is

f_X(x | Y = y) = f_XY(x, y) / f_Y(y), provided f_Y(y) ≠ 0

If B is defined so that the value of Y lies between two constants y_a and y_b, i.e., B = {y_a < Y ≤ y_b}, this is called interval conditioning. The corresponding conditional density function is

f_X(x | y_a < Y ≤ y_b) = ∫_{y_a}^{y_b} f_XY(x, y) dy / ∫_{y_a}^{y_b} f_Y(y) dy, provided P(y_a < Y ≤ y_b) ≠ 0

Similarly, if X is the conditioning variable,

f_Y(y | x_a < X ≤ x_b) = ∫_{x_a}^{x_b} f_XY(x, y) dx / ∫_{x_a}^{x_b} f_X(x) dx, provided P(x_a < X ≤ x_b) ≠ 0
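Interval conditioning can be illustrated by Monte Carlo. The sketch below uses an assumed dependent pair (Y = X plus independent exponential noise, not from the text) so that conditioning on an interval of Y visibly changes the distribution of X:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Assumed illustrative model: X ~ Exp(1), Y = X + W with W ~ Exp(1) independent,
# so Y depends on X and conditioning on Y matters.
x = rng.exponential(1.0, n)
y = x + rng.exponential(1.0, n)

# Interval conditioning on the event B = {1 < Y <= 2}
in_b = (y > 1) & (y <= 2)
p_b = in_b.mean()                     # P(B), must be nonzero
f_cond = np.mean(x[in_b] <= 0.5)      # estimate of F_X(0.5 | 1 < Y <= 2)
f_marg = np.mean(x <= 0.5)            # marginal F_X(0.5) = 1 - e^(-0.5)
print(p_b, f_cond, f_marg)            # conditional and marginal CDF values differ
```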

5. Let X and Y be jointly continuous random variables with joint density function

f_XY(x, y) = e^{−(x+y)} for x > 0, y > 0
           = 0 otherwise.

Check whether X and Y are independent, and find P{X ≤ 1, Y ≤ 1}.

The marginal density of X is

f_X(x) = ∫_0^∞ e^{−(x+y)} dy = e^{−x} [−e^{−y}]_0^∞ = e^{−x}, x > 0

Similarly, f_Y(y) = e^{−y}, y > 0.

Since f_XY(x, y) = e^{−(x+y)} = f_X(x) · f_Y(y), X and Y are independent.

(i) P{X ≤ 1, Y ≤ 1}: by independence the double integral factors,

P{X ≤ 1, Y ≤ 1} = ∫_0^1 ∫_0^1 e^{−(x+y)} dx dy
               = (∫_0^1 e^{−x} dx)(∫_0^1 e^{−y} dy)
               = (1 − e^{−1})(1 − e^{−1})
               = (1 − e^{−1})²
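A quick Monte Carlo sketch of question 5: sampling X and Y as independent Exp(1) variables matches the joint density e^{−(x+y)}, and the empirical probability should agree with (1 − e^{−1})²:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# X, Y independent Exp(1), matching f(x, y) = exp(-(x + y)) for x, y > 0
x = rng.exponential(1.0, n)
y = rng.exponential(1.0, n)

p_mc = np.mean((x <= 1) & (y <= 1))
p_exact = (1 - np.exp(-1))**2
print(p_mc, p_exact)   # both near 0.3996
```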

6. If X and Y are two Gaussian random variables and a random variable Z is defined as Z = X + Y, find f_Z(z).

Let X and Y be two independent normalized Gaussian random variables, i.e., σ_X² = σ_Y² = 1 and mean 0:

f_X(x) = (1/√(2π)) e^{−x²/2} and f_Y(y) = (1/√(2π)) e^{−y²/2}

Since Z is the sum of independent random variables, its density is the convolution

f_Z(z) = ∫_{−∞}^{∞} f_X(x) f_Y(z − x) dx
       = (1/2π) ∫_{−∞}^{∞} exp(−x²/2) exp(−(z − x)²/2) dx

The exponent is −(x² − zx + z²/2) = −(x − z/2)² − z²/4, so

f_Z(z) = (1/2π) e^{−z²/4} ∫_{−∞}^{∞} e^{−(x − z/2)²} dx

From the property of the Gaussian pdf, ∫_{−∞}^{∞} e^{−u²} du = √π. Therefore

f_Z(z) = (1/2π) e^{−z²/4} √π = (1/√(4π)) e^{−z²/4}

So Z is also a Gaussian random variable, with zero mean and variance σ_Z² = σ_X² + σ_Y² = 2.
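The convolution result of question 6 can be verified numerically: convolving the standard normal pdf with itself on a grid should reproduce the N(0, 2) density (1/√(4π)) e^{−z²/4}. A minimal sketch:

```python
import numpy as np

x = np.linspace(-10, 10, 2001)                    # odd-length grid, dx = 0.01
dx = x[1] - x[0]
phi = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)      # standard normal pdf

fz = np.convolve(phi, phi, mode="same") * dx      # numerical density of Z = X + Y
target = np.exp(-x**2 / 4) / np.sqrt(4 * np.pi)   # N(0, 2) density from the text
err = np.max(np.abs(fz - target))
print(err)                                        # small discretization error
```

The odd-length grid keeps the "same"-mode convolution aligned with the original x axis, so no half-sample shift is introduced.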

7. Two independent random variables X and Y have the probability density functions

f_X(x) = e^{−x} for x ≥ 0; f_Y(y) = 1 for 0 ≤ y ≤ 1; each = 0 otherwise.

Calculate the probability distribution and density functions of the random variable Z = X + Y.

Since X and Y are independent,

f_Z(z) = ∫_{−∞}^{∞} f_Y(y) f_X(z − y) dy = ∫_0^1 f_X(z − y) dy

Consider 0 ≤ z ≤ 1. Here f_X(z − y) is nonzero only for y ≤ z, so

f_Z(z) = ∫_0^z e^{−(z−y)} dy = e^{−z} [e^{y}]_0^z = 1 − e^{−z}

Consider z > 1. Here f_X(z − y) is nonzero over the whole interval 0 ≤ y ≤ 1, so

f_Z(z) = ∫_0^1 e^{−(z−y)} dy = e^{−z}(e − 1)

and f_Z(z) = 0 for z < 0. Collecting:

f_Z(z) = 1 − e^{−z}      for 0 ≤ z ≤ 1
       = (e − 1) e^{−z}  for z > 1
       = 0               for z < 0

The distribution function follows by integration. For 0 ≤ z ≤ 1:

F_Z(z) = ∫_0^z (1 − e^{−t}) dt = z − 1 + e^{−z}

For z > 1:

F_Z(z) = F_Z(1) + ∫_1^z (e − 1) e^{−t} dt = e^{−1} + (e − 1)(e^{−1} − e^{−z}) = 1 − (e − 1) e^{−z}

As a check, F_Z(z) → 1 as z → ∞.
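The piecewise distribution function of question 7 can be checked by simulating Z = X + Y directly and comparing empirical CDF values against both branches of the closed form:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
z = rng.exponential(1.0, n) + rng.uniform(0.0, 1.0, n)   # Z = X + Y

# Monte Carlo CDF values versus the piecewise closed form
cdf_half = np.mean(z <= 0.5)    # branch 0 <= z <= 1: exact is 0.5 - 1 + e^(-0.5)
cdf_two = np.mean(z <= 2.0)     # branch z > 1: exact is 1 - (e - 1) e^(-2)
print(cdf_half, 0.5 - 1 + np.exp(-0.5))
print(cdf_two, 1 - (np.e - 1) * np.exp(-2.0))
```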

8. If f_X(x) = f_Y(y), i.e., X and Y have identical probability density functions, and Z = X + Y, find f_Z(z).

We know that the pdf of the sum of two independent random variables is the convolution of the pdfs of the individual random variables:

f_Z(z) = f_X(z) * f_Y(z) = ∫_{−∞}^{∞} f_X(x) f_Y(z − x) dx

But f_X = f_Y, so

f_Z(z) = ∫_{−∞}^{∞} f_X(x) f_X(z − x) dx

Assuming that each random variable changes between −α and +α with uniform density, i.e., f_X(x) = 1/(2α) for −α < x < α, the product f_X(x) f_X(z − x) is nonzero only where the two rectangles overlap, and the overlap has length 2α − |z| when |z| ≤ 2α. We get

f_Z(z) = (2α − |z|)/(4α²) for |z| ≤ 2α
       = 0 otherwise

i.e., a triangular density on (−2α, 2α).
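The triangular result above can be confirmed by numerical convolution of the uniform density with itself (here with α = 1 as an example value):

```python
import numpy as np

alpha = 1.0
x = np.linspace(-5, 5, 2001)          # odd-length grid, dx = 0.005
dx = x[1] - x[0]
f = np.where(np.abs(x) <= alpha, 1 / (2 * alpha), 0.0)   # Uniform(-alpha, alpha) pdf

fz = np.convolve(f, f, mode="same") * dx                 # numerical density of Z = X + Y
tri = np.where(np.abs(x) <= 2 * alpha,
               (2 * alpha - np.abs(x)) / (4 * alpha**2), 0.0)
err = np.max(np.abs(fz - tri))
print(err)                            # small; largest near the kinks at 0 and ±2α
```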

9. Explain how to determine the PDF of the sum of two random variables.

Let X and Y be two random variables with joint probability density function f_XY(x, y), and let Z be another random variable defined as their sum, Z = X + Y. For a particular value z of Z, the event {Z ≤ z} corresponds to the region x + y ≤ z of the X-Y plane, as shown in fig 4.9.1. Holding X fixed at any value x between −∞ and +∞, y runs from −∞ to z − x. The CDF of the random variable Z is therefore

F_Z(z) = P{X + Y ≤ z} = ∫_{−∞}^{∞} ∫_{−∞}^{z−x} f_XY(x, y) dy dx

Differentiating with respect to z gives the density function

f_Z(z) = dF_Z(z)/dz = ∫_{−∞}^{∞} f_XY(x, z − x) dx

Since X and Y are independent, f_XY(x, y) = f_X(x) f_Y(y), so

f_Z(z) = ∫_{−∞}^{∞} f_X(x) f_Y(z − x) dx = f_X(z) * f_Y(z)

Therefore the density function of Z is given by the convolution of the density functions of X and Y.

10. State and prove the central limit theorem.

Statement: the random variable X which is the sum of a large number of independent random variables always approaches the Gaussian distribution, irrespective of the type of distribution each variable possesses and of its individual contribution to the sum.

Illustration: consider two random variables X_1 and X_2 with rectangular densities, as shown in figure 4.10.1 (a) and (b). The new random variable X_3 = X_1 + X_2 has the triangular density obtained by convolving the two rectangles (figure 4.10.1 (c)). When this triangular density is convolved with another rectangular density, the result already closely follows the Gaussian distribution, as shown in figure 4.10.1 (d); each further convolution brings the density still closer to Gaussian.

Proof sketch: let X_1, X_2, … be independent, identically distributed with zero mean and unit variance, and let Z_n = (X_1 + ⋯ + X_n)/√n. The characteristic function of Z_n is

Φ_{Z_n}(ω) = [Φ_X(ω/√n)]^n

Expanding Φ_X about the origin, Φ_X(u) = 1 − u²/2 + o(u²), so

Φ_{Z_n}(ω) = [1 − ω²/(2n) + o(1/n)]^n → e^{−ω²/2} as n → ∞

which is the characteristic function of a zero-mean, unit-variance Gaussian random variable. Hence Z_n converges in distribution to N(0, 1).
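The central limit theorem can be demonstrated numerically: the standardized sum of 30 i.i.d. Uniform(0, 1) variables should already track the standard normal CDF closely. A minimal sketch:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(4)
n_terms, n_samples = 30, 200_000

# Sum of 30 i.i.d. Uniform(0,1) terms, standardized to zero mean, unit variance
# (each term has mean 1/2 and variance 1/12)
u = rng.uniform(0, 1, (n_samples, n_terms))
z = (u.sum(axis=1) - n_terms * 0.5) / np.sqrt(n_terms / 12.0)

def std_normal_cdf(t):
    return 0.5 * (1 + erf(t / sqrt(2)))

# Empirical CDF of the standardized sum versus the Gaussian CDF
for t in (-1.0, 0.0, 1.0):
    print(t, np.mean(z <= t), std_normal_cdf(t))
```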
