Chapter 6

More than one variable

6.1 Bivariate discrete distributions

Suppose that the r.v.'s X and Y are discrete and take on the values x_j and y_j, j ≥ 1, respectively. Then the joint p.d.f. of X and Y, to be denoted by f_{X,Y}, is defined by

f_{X,Y}(x_j, y_j) = P(X = x_j, Y = y_j),

and f_{X,Y}(x, y) = 0 when (x, y) ≠ (x_j, y_j) for every j (i.e., at least one of x or y is not equal to some x_j or y_j, respectively).

The marginal distribution of X is defined by the probability function

P(X = x_i) = Σ_j P(X = x_i, Y = y_j),

and similarly the marginal distribution of Y is

P(Y = y_j) = Σ_i P(X = x_i, Y = y_j).

Note that P(X = x_i) ≥ 0 and Σ_i P(X = x_i) = 1. The mean and variance of X can be defined in the usual way.

The conditional distribution of X given Y = y_j is defined by the probability function

P(X = x_i | Y = y_j) = P(X = x_i, Y = y_j) / P(Y = y_j).

The conditional mean of X given Y = y_j is defined by

E[X | Y = y_j] = Σ_i x_i P(X = x_i | Y = y_j),

and similarly for the conditional variance:

Var[X | Y = y_j] = E(X^2 | Y = y_j) - (E(X | Y = y_j))^2.

Although E[X | Y = y_j] depends on the particular value of Y, it turns out that its average does not, and, indeed, is the same as E[X]. More precisely, it holds that

E[E(X | Y)] = E[X]   and   E[E(Y | X)] = E[Y].

That is, the expectation of the conditional expectation of X is equal to its expectation, and likewise for Y.
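The definitions above translate directly into a few lines of code. The following sketch is an addition to the text; the joint p.m.f. it uses is an arbitrary made-up example, chosen only so that the marginals, a conditional distribution, a conditional mean, and the identity E[E(X | Y)] = E[X] can be checked exactly with fractions.

from fractions import Fraction as F

# joint p.m.f. f_{X,Y}(x, y) = P(X = x, Y = y), stored as a dict (made-up example)
joint = {(0, 0): F(1, 8), (0, 1): F(3, 8), (1, 0): F(2, 8), (1, 1): F(2, 8)}

xs = sorted({x for x, _ in joint})
ys = sorted({y for _, y in joint})

# marginals: P(X = x) = sum over j of P(X = x, Y = y_j), and similarly for Y
pX = {x: sum(joint.get((x, y), 0) for y in ys) for x in xs}
pY = {y: sum(joint.get((x, y), 0) for x in xs) for y in ys}

# conditional p.m.f. and conditional mean of X given Y = y
def cond_X_given_Y(y):
    return {x: joint.get((x, y), 0) / pY[y] for x in xs}

def E_X_given_Y(y):
    return sum(x * p for x, p in cond_X_given_Y(y).items())

EX = sum(x * p for x, p in pX.items())
# law of total expectation: averaging E[X | Y = y] over the distribution of Y gives E[X]
E_of_cond = sum(E_X_given_Y(y) * pY[y] for y in ys)
print(EX, E_of_cond)   # both equal 1/2 for this example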

The covariance of X and Y is defined by

Cov[X, Y] = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y],

where

E[XY] = Σ_i Σ_j x_i y_j P(X = x_i, Y = y_j).

The result obtained next provides the range of values of the covariance of two r.v.'s; it is also referred to as a version of the Cauchy-Schwarz inequality.

Theorem 6.1 (Cauchy-Schwarz inequality)

1. Consider the r.v.'s X and Y with E[X] = E[Y] = 0 and Var[X] = Var[Y] = 1. Then always -1 ≤ E[XY] ≤ 1; moreover, E[XY] = 1 if and only if P(X = Y) = 1, and E[XY] = -1 if and only if P(X = -Y) = 1.

2. For any r.v.'s X and Y with finite expectations and positive variances σ_X^2 and σ_Y^2, it always holds that -σ_X σ_Y ≤ Cov(X, Y) ≤ σ_X σ_Y; moreover, Cov(X, Y) = σ_X σ_Y if and only if P[Y = E[Y] + (σ_Y/σ_X)(X - E[X])] = 1, and Cov(X, Y) = -σ_X σ_Y if and only if P[Y = E[Y] - (σ_Y/σ_X)(X - E[X])] = 1.

The correlation coefficient between X and Y is defined by

corr[X, Y] = E[ ((X - E[X])/σ_X) ((Y - E[Y])/σ_Y) ] = Cov[X, Y] / (σ_X σ_Y) = (E[XY] - E[X]E[Y]) / (σ_X σ_Y).

The correlation always lies between -1 and +1.

Example 6.1 Let X and Y be two r.v.'s with finite expectations and equal (finite) variances, and set U = X + Y and V = X - Y. Determine whether the r.v.'s U and V are correlated.

E[UV] = E[(X + Y)(X - Y)] = E(X^2 - Y^2) = E[X^2] - E[Y^2]

E[U]E[V] = [E(X + Y)][E(X - Y)] = (E[X] + E[Y])(E[X] - E[Y]) = (E[X])^2 - (E[Y])^2

Cov(U, V) = E[UV] - E[U]E[V] = (E[X^2] - E[X]^2) - (E[Y^2] - E[Y]^2) = Var(X) - Var(Y) = 0,

so U and V are uncorrelated.
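Example 6.1 can also be checked on simulated data. The following sketch is an addition to the text; the particular way Y is generated from X is an arbitrary construction chosen only so that X and Y are noticeably correlated while having the same variance.

import random

random.seed(0)
n = 100_000
X = [random.choice([1, 2, 3, 4, 5, 6]) for _ in range(n)]     # fair die
# Y keeps the value of X with probability 3/4 and reflects it otherwise, so Y has
# the same (uniform) marginal as X, hence the same variance, but corr[X, Y] = 1/2.
Y = [x if random.random() < 0.75 else 7 - x for x in X]

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((s - ma) * (t - mb) for s, t in zip(a, b)) / len(a)

U = [x + y for x, y in zip(X, Y)]
V = [x - y for x, y in zip(X, Y)]
print(cov(X, Y) / (cov(X, X) * cov(Y, Y)) ** 0.5)   # corr[X, Y], close to 0.5
print(cov(U, V))                                    # close to Var[X] - Var[Y] = 0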

For two r.v.'s X and Y with finite expectations and (positive) standard deviations σ_X and σ_Y, it holds that

Var(X + Y) = σ_X^2 + σ_Y^2 + 2 Cov(X, Y),

and

Var(X + Y) = σ_X^2 + σ_Y^2 if X and Y are uncorrelated.

Proof.

Var(X + Y) = E[(X + Y) - E(X + Y)]^2 = E[(X - E[X]) + (Y - E[Y])]^2
           = E(X - E[X])^2 + E(Y - E[Y])^2 + 2 E[(X - E[X])(Y - E[Y])]
           = σ_X^2 + σ_Y^2 + 2 Cov(X, Y).

Random variables X and Y are said to be independent if

P(X = x_i, Y = y_j) = P(X = x_i) P(Y = y_j) for all i, j.

If X and Y are independent then Cov[X, Y] = 0. The converse is NOT true: there exist many pairs of random variables with Cov[X, Y] = 0 which are not independent (see the numerical sketch below).

Example 6.2 A fair die is thrown three times. The result of the first throw is scored as X_1 = 1 if the die shows 5 or 6 and X_1 = 0 otherwise; X_2 and X_3 are scored likewise for the second and third throws. Let Y_1 = X_1 + X_2 and Y_2 = X_1 - X_3. Show that P(Y_1 = 0, Y_2 = -1) = 4/27. Calculate the remaining probabilities in the bivariate distribution of the pair (Y_1, Y_2) and display the joint probabilities in an appropriate table.

1. Find the marginal probability distributions of Y_1 and Y_2.
2. Calculate the means and variances of Y_1 and Y_2.
3. Calculate the covariance of Y_1 and Y_2.
4. Find the conditional distribution of Y_1 given Y_2 = 0.
5. Find the conditional mean of Y_1 given Y_2 = 0.
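Before working through the solution of Example 6.2, the following added sketch (not part of the original text) checks the two facts just stated on the classic pair X uniform on {-1, 0, 1} and Y = X^2: the variance formula holds, Cov[X, Y] = 0, and yet X and Y are not independent.

from fractions import Fraction as F

# X uniform on {-1, 0, 1} and Y = X^2: the classic uncorrelated-but-dependent pair.
joint = {(x, x * x): F(1, 3) for x in (-1, 0, 1)}

def E(g):
    return sum(g(x, y) * p for (x, y), p in joint.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: x * y) - EX * EY
varX = E(lambda x, y: x * x) - EX**2
varY = E(lambda x, y: y * y) - EY**2
varXplusY = E(lambda x, y: (x + y) ** 2) - (EX + EY) ** 2

print(cov)                                   # 0: X and Y are uncorrelated
print(varXplusY == varX + varY + 2 * cov)    # True: the variance identity
# not independent: P(X = 1, Y = 0) = 0 while P(X = 1) P(Y = 0) = (1/3)(1/3) != 0
print(joint.get((1, 0), F(0)), F(1, 3) * F(1, 3))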

P(X_1 = 1) = P({5, 6}) = 1/3, P(X_2 = 1) = 1/3, P(X_3 = 1) = 1/3.

For Y_1 to be 0, both X_1 and X_2 must be 0. Then for Y_2 to be -1, X_3 must be 1. Hence

P(Y_1 = 0, Y_2 = -1) = P(X_1 = 0) P(X_2 = 0) P(X_3 = 1) = (2/3)(2/3)(1/3) = 4/27.

The remaining probabilities are obtained in the same way, giving the joint table

              Y_1 = 0   Y_1 = 1   Y_1 = 2  | total
  Y_2 = -1     4/27      2/27      0       |  6/27
  Y_2 =  0     8/27      6/27      1/27    | 15/27
  Y_2 =  1     0         4/27      2/27    |  6/27
  total       12/27     12/27      3/27    |  1

1. Marginal probability distribution of Y_1:

   y_1            0       1       2
   P(Y_1 = y_1)   12/27   12/27   3/27

   Marginal probability distribution of Y_2:

   y_2            -1      0       1
   P(Y_2 = y_2)   6/27    15/27   6/27

2. E[Y_1] = 0 · 12/27 + 1 · 12/27 + 2 · 3/27 = 2/3
   E[Y_1^2] = 0^2 · 12/27 + 1^2 · 12/27 + 2^2 · 3/27 = 8/9
   Var[Y_1] = E[Y_1^2] - (E[Y_1])^2 = 8/9 - (2/3)^2 = 4/9

   E[Y_2] = (-1) · 6/27 + 0 · 15/27 + 1 · 6/27 = 0
   E[Y_2^2] = (-1)^2 · 6/27 + 0^2 · 15/27 + 1^2 · 6/27 = 4/9
   Var[Y_2] = E[Y_2^2] - (E[Y_2])^2 = 4/9

3. Cov[Y_1, Y_2] = E[Y_1 Y_2] - E[Y_1]E[Y_2], where
   E[Y_1 Y_2] = 1 · (-1) · 2/27 + 1 · 1 · 4/27 + 2 · 1 · 2/27 = 2/9,
   so Cov[Y_1, Y_2] = 2/9 - (2/3) · 0 = 2/9.

4. P(Y_1 = 0 | Y_2 = 0) = P(Y_1 = 0, Y_2 = 0) / P(Y_2 = 0) = (8/27)/(15/27) = 8/15,
   P(Y_1 = 1 | Y_2 = 0) = P(Y_1 = 1, Y_2 = 0) / P(Y_2 = 0) = (6/27)/(15/27) = 6/15,
   P(Y_1 = 2 | Y_2 = 0) = P(Y_1 = 2, Y_2 = 0) / P(Y_2 = 0) = (1/27)/(15/27) = 1/15.

5. E[Y_1 | Y_2 = 0] = 0 · 8/15 + 1 · 6/15 + 2 · 1/15 = 8/15.
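The computations above can be cross-checked by brute force. The following sketch is an addition to the text; it enumerates the eight outcomes of (X_1, X_2, X_3) and recovers the joint distribution of (Y_1, Y_2), the covariance 2/9 and the conditional mean 8/15.

from fractions import Fraction as F
from itertools import product

p1 = {1: F(1, 3), 0: F(2, 3)}            # P(X_i = 1) = 1/3 (die shows 5 or 6)

joint = {}                                # joint p.m.f. of (Y_1, Y_2)
for x1, x2, x3 in product((0, 1), repeat=3):
    y = (x1 + x2, x1 - x3)
    joint[y] = joint.get(y, F(0)) + p1[x1] * p1[x2] * p1[x3]

EY1 = sum(y1 * p for (y1, _), p in joint.items())
EY2 = sum(y2 * p for (_, y2), p in joint.items())
cov = sum(y1 * y2 * p for (y1, y2), p in joint.items()) - EY1 * EY2
print(cov)                                # 2/9

pY2_0 = sum(p for (_, y2), p in joint.items() if y2 == 0)
cond_mean = sum(y1 * p for (y1, y2), p in joint.items() if y2 == 0) / pY2_0
print(cond_mean)                          # 8/15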

Exercises

Exercise 6.1 The random variables X and Y have a joint probability function given by

f(x, y) = c(x^2 y + x)   for x = -2, -1, 0, 1, 2 and y = 1, 2, 3,
f(x, y) = 0              otherwise.

Determine the value of c.
Find P(X > 0) and P(X + Y = 0).
Find the marginal distributions of X and Y.
Find E[X] and Var[X].
Find E[Y] and Var[Y].
Find the conditional distribution of X given Y = 1 and E[X | Y = 1].
Find the probability function for Z = X + Y and show that E[Z] = E[X] + E[Y].
Find Cov[X, Y] and show that Var[Z] = Var[X] + Var[Y] + 2 Cov[X, Y].
Find the correlation between X and Y.
Are X and Y independent?

Table for the joint probabilities:

             X = -2   X = -1   X = 0   X = 1   X = 2  | total
  Y = 1       2c        0        0       2c      6c   |  10c
  Y = 2       6c        c        0       3c      10c  |  20c
  Y = 3       10c       2c       0       4c      14c  |  30c
  total       18c       3c       0       9c      30c  |  60c

Since the probabilities must sum to one, 60c = 1, so c = 1/60.

P(X > 0) = 9c + 30c = 39/60.

P(X + Y = 0) = P(X = -2, Y = 2) + P(X = -1, Y = 1) = 6/60 + 0 = 1/10.

Marginal distribution of X:

  x           -2      -1      0     1      2
  P(X = x)    18/60   3/60    0     9/60   30/60

Marginal distribution of Y:

  y           1       2       3
  P(Y = y)    10/60   20/60   30/60

E[X] = -2 · 18/60 - 1 · 3/60 + 0 · 0 + 1 · 9/60 + 2 · 30/60 = 30/60 = 1/2
E[X^2] = (-2)^2 · 18/60 + (-1)^2 · 3/60 + 0 · 0 + 1^2 · 9/60 + 2^2 · 30/60 = 3.4
Var[X] = E[X^2] - E[X]^2 = 3.4 - 0.5^2 = 3.15

E[Y] = 1 · 1/6 + 2 · 1/3 + 3 · 1/2 = 14/6 = 7/3
E[Y^2] = 1^2 · 1/6 + 2^2 · 1/3 + 3^2 · 1/2 = 36/6 = 6
Var[Y] = 6 - (7/3)^2 = 5/9

P(X = -2 | Y = 1) = 0.2, P(X = -1 | Y = 1) = 0, P(X = 0 | Y = 1) = 0, P(X = 1 | Y = 1) = 0.2, P(X = 2 | Y = 1) = 0.6.
E[X | Y = 1] = -2 · 0.2 - 1 · 0 + 0 · 0 + 1 · 0.2 + 2 · 0.6 = 1

Probability function for Z = X + Y:

  z           -1      0       1       2      3      4       5
  P(Z = z)    2/60    6/60    11/60   4/60   9/60   14/60   14/60

E[Z] = (1/60)(-1 · 2 + 0 · 6 + 1 · 11 + 2 · 4 + 3 · 9 + 4 · 14 + 5 · 14) = 170/60 = 17/6 = 1/2 + 7/3 = E[X] + E[Y]

E[XY] = (-2)(1)(2/60) + (-2)(2)(6/60) + (-2)(3)(10/60) + (-1)(2)(1/60) + (-1)(3)(2/60)
        + (1)(1)(2/60) + (1)(2)(3/60) + (1)(3)(4/60) + (2)(1)(6/60) + (2)(2)(10/60) + (2)(3)(14/60) = 1

Cov[X, Y] = E[XY] - E[X]E[Y] = 1 - (1/2)(7/3) = -1/6

E[Z^2] = (1/60)(1 · 2 + 1 · 11 + 4 · 4 + 9 · 9 + 16 · 14 + 25 · 14) = 684/60
Var[Z] = 684/60 - (170/60)^2 ≈ 3.3722
Var[X] + Var[Y] + 2 Cov[X, Y] = 3.15 + 5/9 - 2 · 1/6 ≈ 3.3722

corr[X, Y] = Cov[X, Y] / sqrt(Var[X] Var[Y]) = (-1/6) / sqrt(3.15 · 5/9) ≈ -0.126
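The answers above can be reproduced by enumerating the joint table in a few lines. This sketch is an addition to the original solution and computes everything with exact fractions except the final correlation.

from fractions import Fraction as F

xs, ys = [-2, -1, 0, 1, 2], [1, 2, 3]
weights = {(x, y): x * x * y + x for x in xs for y in ys}    # f(x, y) / c
c = F(1, sum(weights.values()))                              # 60c = 1, so c = 1/60
joint = {xy: c * w for xy, w in weights.items()}

def E(g):
    return sum(g(x, y) * p for (x, y), p in joint.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
varX = E(lambda x, y: x * x) - EX**2
varY = E(lambda x, y: y * y) - EY**2
cov = E(lambda x, y: x * y) - EX * EY
varZ = E(lambda x, y: (x + y) ** 2) - (EX + EY) ** 2

print(c, EX, varX)                                       # 1/60, 1/2, 63/20 (= 3.15)
print(EY, varY, cov)                                     # 7/3, 5/9, -1/6
print(varZ == varX + varY + 2 * cov)                     # True
print(float(cov) / (float(varX) * float(varY)) ** 0.5)   # corr, about -0.126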

X and Y are not independent: for example, P(X = 1, Y = 1) = 2/60, whereas P(X = 1)P(Y = 1) = (9/60)(10/60) = 1/40, so the joint probabilities do not factorize. (This is also clear from Cov[X, Y] = -1/6 ≠ 0.)

Exercise 6.2 The following experiment is carried out. Three fair coins are tossed. Any coins showing heads are removed and the remaining coins are tossed again. Let X be the number of heads on the first toss and Y the number of heads on the second toss. Note that if X = 3 then Y = 0. Find the joint probability function and the marginal distributions of X and Y.

We have that P(Y = y, X = x) = P(Y = y | X = x) P(X = x). Suppose X = 0; this has probability 0.5^3. Then Y | X = 0 has a Binomial distribution with parameters n = 3 and p = 0.5. Similarly, Y | X = 1 has a Binomial distribution with parameters n = 2 and p = 0.5. In this way we can produce a table of the joint probabilities:

             X = 0   X = 1   X = 2   X = 3  | total
  Y = 0       1/64    6/64    12/64   8/64  | 27/64
  Y = 1       3/64    12/64   12/64   0     | 27/64
  Y = 2       3/64    6/64    0       0     |  9/64
  Y = 3       1/64    0       0       0     |  1/64
  total       1/8     3/8     3/8     1/8   |  1

Marginal distribution of X:

  x           0      1      2      3
  P(X = x)    1/8    3/8    3/8    1/8

Marginal distribution of Y:

  y           0       1       2      3
  P(Y = y)    27/64   27/64   9/64   1/64
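As a final added check (not part of the original text), the joint table of Exercise 6.2 can be rebuilt directly from the binomial description P(X = x, Y = y) = P(X = x) P(Y = y | X = x).

from fractions import Fraction as F
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

p = F(1, 2)
joint = {}
for x in range(4):                      # heads on the first toss: X ~ Bin(3, 1/2)
    for y in range(4 - x):              # heads on the second toss: Y | X = x ~ Bin(3 - x, 1/2)
        joint[(x, y)] = binom_pmf(x, 3, p) * binom_pmf(y, 3 - x, p)

pX = {x: sum(v for (a, _), v in joint.items() if a == x) for x in range(4)}
pY = {y: sum(v for (_, b), v in joint.items() if b == y) for y in range(4)}
print(pX)   # marginal of X: 1/8, 3/8, 3/8, 1/8 (printed as Fractions)
print(pY)   # marginal of Y: 27/64, 27/64, 9/64, 1/64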