Lecture 4: Proofs for Expectation, Variance, and Covariance Formula


Lecture 4: Proofs for Expectation, Variance, and Covariance Formula
by Hiro Kasahara
Vancouver School of Economics, University of British Columbia

Discrete Random Variables: X and Y

Let $X$ and $Y$ be two discrete random variables. $X$ takes $n$ possible values, $\{x_1, \dots, x_n\}$, and $Y$ takes $m$ possible values, $\{y_1, \dots, y_m\}$.

Notation

The joint probability mass function is
$$p^{X,Y}_{ij} = P(X = x_i, Y = y_j), \quad i = 1, \dots, n; \; j = 1, \dots, m.$$
The marginal probability mass function of $X$ is
$$p^X_i = P(X = x_i) = \sum_{j=1}^{m} p^{X,Y}_{ij}, \quad i = 1, \dots, n,$$
and the marginal probability mass function of $Y$ is
$$p^Y_j = P(Y = y_j) = \sum_{i=1}^{n} p^{X,Y}_{ij}, \quad j = 1, \dots, m.$$

Notation

Table: Example of the joint distribution of $X$ and $Y$

|                    | $Y = y_1$      | $Y = y_2$      | $Y = y_3$      | Marg. prob. of $X$ |
|--------------------|----------------|----------------|----------------|--------------------|
| $X = x_1$          | $p^{X,Y}_{11}$ | $p^{X,Y}_{12}$ | $p^{X,Y}_{13}$ | $p^X_1$            |
| $X = x_2$          | $p^{X,Y}_{21}$ | $p^{X,Y}_{22}$ | $p^{X,Y}_{23}$ | $p^X_2$            |
| Marg. prob. of $Y$ | $p^Y_1$        | $p^Y_2$        | $p^Y_3$        | 1.00               |

For example, $P(X = x_1) = \sum_{j=1}^{3} p^{X,Y}_{1j} = p^{X,Y}_{11} + p^{X,Y}_{12} + p^{X,Y}_{13} = p^X_1$.

Notation

Table: Example of the joint distribution of $X$ and $Y$

|                    | $Y = 30$ | $Y = 60$ | $Y = 100$ | Marg. prob. of $X$ |
|--------------------|----------|----------|-----------|--------------------|
| $X = 0$            | 0.24     | 0.12     | 0.04      | 0.40               |
| $X = 1$            | 0.12     | 0.36     | 0.12      | 0.60               |
| Marg. prob. of $Y$ | 0.36     | 0.48     | 0.16      | 1.00               |

For example, $P(X = 0) = 0.24 + 0.12 + 0.04 = 0.40$.
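These definitions translate directly into code. Here is a minimal numpy sketch (the array holds the joint pmf from the table above) that recovers both marginal pmfs by summing the joint pmf over the other variable:

```python
import numpy as np

# Joint pmf from the table: rows index X in {0, 1}, columns index Y in {30, 60, 100}.
joint = np.array([[0.24, 0.12, 0.04],
                  [0.12, 0.36, 0.12]])

marg_x = joint.sum(axis=1)  # sum over j: P(X = x_i) -> [0.40, 0.60]
marg_y = joint.sum(axis=0)  # sum over i: P(Y = y_j) -> [0.36, 0.48, 0.16]

print(marg_x, marg_y, joint.sum())  # the joint pmf sums to 1.00
```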

Proposition. If $a$ and $b$ are constants, then $E(a + bX) = a + bE(X)$.

Proof:
$$\begin{aligned}
E(a + bX) &\overset{\text{def}}{=} \sum_{i=1}^{n} (a + bx_i)\, p^X_i \\
&= (a + bx_1)p^X_1 + \dots + (a + bx_n)p^X_n \\
&= (ap^X_1 + \dots + ap^X_n) + (bx_1 p^X_1 + \dots + bx_n p^X_n) \\
&= a\,(p^X_1 + \dots + p^X_n) + b\,(x_1 p^X_1 + \dots + x_n p^X_n) \\
&= a \sum_{i=1}^{n} p^X_i + b \sum_{i=1}^{n} x_i p^X_i \\
&= a + bE(X).
\end{aligned}$$
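To see the proposition numerically, here is a short sketch (the pmf values and the constants $a$, $b$ are arbitrary illustrative choices) that computes both sides from the definition of expectation:

```python
import numpy as np

# Hypothetical pmf for X: values x_i with probabilities p_i summing to 1.
x = np.array([1.0, 2.0, 5.0])
p = np.array([0.2, 0.5, 0.3])
a, b = 3.0, -2.0  # arbitrary constants

lhs = np.sum((a + b * x) * p)  # E(a + bX), computed from the definition
rhs = a + b * np.sum(x * p)    # a + b E(X)
assert np.isclose(lhs, rhs)
print(lhs, rhs)
```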

Clicker Question 4-1

For any constants $a$, $b$ and any function $g(\cdot)$, which of the following is true?

A) $E[a + b\,g(X)] = a + b\,g(E[X])$.
B) $E[a + b\,g(X)] = a + b\,E[g(X)]$.
C) $E[a + b\,g(X)] = a + b\,g(E[X]) = a + b\,E[g(X)]$.

Question 1 in Worksheet

Prove that, for any constants $a$, $b$ and any function $g(\cdot)$,
$$E[a + b\,g(X)] = a + b\,E[g(X)].$$

Proposition. $\mathrm{Cov}(a_1 + b_1 X, a_2 + b_2 Y) = b_1 b_2\,\mathrm{Cov}(X, Y)$, where $a_1$, $a_2$, $b_1$, and $b_2$ are constants.

Proof:
$$\begin{aligned}
\mathrm{Cov}(a_1 + b_1 X, a_2 + b_2 Y) &\overset{\text{def}}{=} E[(a_1 + b_1 X - E(a_1 + b_1 X))(a_2 + b_2 Y - E(a_2 + b_2 Y))] \\
&= E[(a_1 - a_1 + b_1 X - b_1 E(X))(a_2 - a_2 + b_2 Y - b_2 E(Y))] \\
&= E[b_1 b_2 (X - E(X))(Y - E(Y))] \\
&= \sum_{i=1}^{n} \sum_{j=1}^{m} b_1 b_2\,(x_i - E(X))(y_j - E(Y))\, p^{X,Y}_{ij} \\
&= b_1 b_2 \sum_{i=1}^{n} \sum_{j=1}^{m} [x_i - E(X)][y_j - E(Y)]\, p^{X,Y}_{ij} \\
&= b_1 b_2\,\mathrm{Cov}(X, Y).
\end{aligned}$$
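The same invariance can be checked numerically. A sketch using the joint pmf from the earlier table (the constants $a_1, b_1, a_2, b_2$ are arbitrary):

```python
import numpy as np

# Joint pmf from the earlier table: X in {0, 1}, Y in {30, 60, 100}.
joint = np.array([[0.24, 0.12, 0.04],
                  [0.12, 0.36, 0.12]])
x = np.array([0.0, 1.0])
y = np.array([30.0, 60.0, 100.0])

def cov(xv, yv, pmf):
    """Cov = sum_ij (x_i - E[X])(y_j - E[Y]) p_ij, with marginals from pmf."""
    ex = xv @ pmf.sum(axis=1)
    ey = yv @ pmf.sum(axis=0)
    return np.sum(np.outer(xv - ex, yv - ey) * pmf)

a1, b1, a2, b2 = 1.0, 2.0, -3.0, 0.5        # arbitrary constants
lhs = cov(a1 + b1 * x, a2 + b2 * y, joint)  # Cov(a1 + b1 X, a2 + b2 Y)
rhs = b1 * b2 * cov(x, y, joint)            # b1 b2 Cov(X, Y)
assert np.isclose(lhs, rhs)
print(lhs, rhs)
```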

Clicker Question 4-2

For any constants $a$ and $b$, which of the following is true?

A) $\mathrm{Var}[a + bX] = a + b\,\mathrm{Var}[X]$.
B) $\mathrm{Var}[a + bX] = b\,\mathrm{Var}[X]$.
C) $\mathrm{Var}[a + bX] = b^2\,\mathrm{Var}[X]$.

Question 2 in Worksheet

Prove that, for any constants $a$ and $b$,
$$\mathrm{Var}[a + bX] = b^2\,\mathrm{Var}[X].$$

Proposition. If $X$ and $Y$ are independent, then $\mathrm{Cov}(X, Y) = 0$.

Proof:
$$\begin{aligned}
\mathrm{Cov}(X, Y) &\overset{\text{def}}{=} E[(X - E(X))(Y - E(Y))] \\
&= \sum_{i=1}^{n} \sum_{j=1}^{m} [x_i - E(X)][y_j - E(Y)]\, P(X = x_i, Y = y_j) \\
&= \sum_{i=1}^{n} \sum_{j=1}^{m} [x_i - E(X)][y_j - E(Y)]\, p^X_i p^Y_j \quad \text{(by independence)} \\
&= \sum_{i=1}^{n} \sum_{j=1}^{m} \big\{[x_i - E(X)]\, p^X_i\big\}\big\{[y_j - E(Y)]\, p^Y_j\big\}
\end{aligned}$$

Proof continued:
$$\begin{aligned}
&= \sum_{i=1}^{n} \sum_{j=1}^{m} \big\{[x_i - E(X)]\, p^X_i\big\}\big\{[y_j - E(Y)]\, p^Y_j\big\} \\
&= \left\{\sum_{i=1}^{n} [x_i - E(X)]\, p^X_i\right\} \left\{\sum_{j=1}^{m} [y_j - E(Y)]\, p^Y_j\right\} \\
&= \left\{\sum_{i=1}^{n} x_i p^X_i - \sum_{i=1}^{n} E(X)\, p^X_i\right\} \left\{\sum_{j=1}^{m} y_j p^Y_j - \sum_{j=1}^{m} E(Y)\, p^Y_j\right\} \\
&= \left\{E(X) - E(X) \sum_{i=1}^{n} p^X_i\right\} \left\{E(Y) - E(Y) \sum_{j=1}^{m} p^Y_j\right\} \quad \text{(by the definitions of $E(X)$ and $E(Y)$)}
\end{aligned}$$

Proof continued:
$$\begin{aligned}
&= \left\{E(X) - E(X) \sum_{i=1}^{n} p^X_i\right\} \left\{E(Y) - E(Y) \sum_{j=1}^{m} p^Y_j\right\} \\
&= \{E(X) - E(X) \cdot 1\}\,\{E(Y) - E(Y) \cdot 1\} = 0 \cdot 0 = 0.
\end{aligned}$$
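A quick numerical illustration: if the joint pmf is built as the outer product of two marginal pmfs, $X$ and $Y$ are independent by construction, and the covariance computed from the definition comes out to zero. The marginal pmfs below are hypothetical values chosen for illustration:

```python
import numpy as np

# Hypothetical marginals; independence holds by construction since
# p_ij = p_i^X * p_j^Y for every cell of the joint pmf.
px = np.array([0.4, 0.6])       # P(X = x_i) for x in {0, 1}
py = np.array([0.3, 0.5, 0.2])  # P(Y = y_j) for y in {30, 60, 100}
joint = np.outer(px, py)

x = np.array([0.0, 1.0])
y = np.array([30.0, 60.0, 100.0])
ex, ey = x @ px, y @ py
cov_xy = np.sum(np.outer(x - ex, y - ey) * joint)
print(cov_xy)  # 0 up to floating-point error
```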

Clicker Question 4-3

Suppose that $X$ and $Y$ are independent. For any functions $g(\cdot)$ and $h(\cdot)$, which of the following is true?

A) $\mathrm{Cov}(g(X), h(Y)) = 0$ always holds.
B) $\mathrm{Cov}(g(X), h(Y)) = 0$ does not hold in some cases.

Question 4 in Worksheet

Prove that, when $X$ and $Y$ are independent,
$$\mathrm{Cov}(g(X), h(Y)) = 0$$
for any functions $g(\cdot)$ and $h(\cdot)$.

Proposition. $\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X, Y)$.

Proof: Let $\tilde{X} \overset{\text{def}}{=} X - E(X)$ and $\tilde{Y} \overset{\text{def}}{=} Y - E(Y)$. Then
$$\begin{aligned}
\mathrm{Var}(X + Y) &\overset{\text{def}}{=} E[(X + Y - E(X + Y))^2] \\
&= E[((X - E(X)) + (Y - E(Y)))^2] \\
&= E[(\tilde{X} + \tilde{Y})^2] \\
&= E[\tilde{X}^2 + \tilde{Y}^2 + 2\tilde{X}\tilde{Y}] \\
&= E[(X - E(X))^2] + E[(Y - E(Y))^2] + 2E[(X - E(X))(Y - E(Y))] \\
&= \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X, Y),
\end{aligned}$$
where the last step uses the definitions of variance and covariance.
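This decomposition can be verified on the joint pmf from the earlier table, where $X$ and $Y$ are not independent, so the covariance term matters. A sketch:

```python
import numpy as np

# Joint pmf from the earlier table: X in {0, 1}, Y in {30, 60, 100}.
joint = np.array([[0.24, 0.12, 0.04],
                  [0.12, 0.36, 0.12]])
x = np.array([0.0, 1.0])
y = np.array([30.0, 60.0, 100.0])
px, py = joint.sum(axis=1), joint.sum(axis=0)

ex, ey = x @ px, y @ py
var_x = (x - ex) ** 2 @ px
var_y = (y - ey) ** 2 @ py
cov_xy = np.sum(np.outer(x - ex, y - ey) * joint)

# Var(X + Y) computed directly from the joint pmf.
s = np.add.outer(x, y)  # every possible value of X + Y
es = np.sum(s * joint)
var_sum = np.sum((s - es) ** 2 * joint)

assert np.isclose(var_sum, var_x + var_y + 2 * cov_xy)
print(var_sum, var_x + var_y + 2 * cov_xy)
```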

Clicker Question 4-4

For any functions $g(\cdot)$ and $h(\cdot)$, which of the following is true?

A) $\mathrm{Var}(g(X) + h(Y)) = \mathrm{Var}(g(X)) + \mathrm{Var}(h(Y))$
B) $\mathrm{Var}(g(X) + h(Y)) = \mathrm{Var}(g(X)) + \mathrm{Var}(h(Y)) + 2\,\mathrm{Cov}(g(X), h(Y))$
C) $\mathrm{Var}(g(X) + h(Y)) = g(\mathrm{Var}(X)) + h(\mathrm{Var}(Y))$

Question 5 in Worksheet

Prove that, for any constants $a$ and $b$,
$$\mathrm{Var}[aX + bY] = a^2\,\mathrm{Var}[X] + b^2\,\mathrm{Var}[Y] + 2ab\,\mathrm{Cov}[X, Y].$$

Bernoulli Random Variable

The probability mass function:
$$X = \begin{cases} 1 & \text{with probability } p \\ 0 & \text{with probability } 1 - p \end{cases}$$
What are $E[X]$ and $\mathrm{Var}[X]$?

Proposition. If $X$ is a Bernoulli random variable with $P(X = 1) = p$, then $E[X] = p$.

Proof:
$$E(X) \overset{\text{def}}{=} \sum_{x = 0, 1} x\, p^x (1 - p)^{1 - x} = (0)(1 - p) + (1)(p) = p.$$
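The two-term sum in the proof is short enough to check directly (the value $p = 0.3$ is an arbitrary choice for illustration):

```python
p = 0.3                    # hypothetical success probability
e_x = 0 * (1 - p) + 1 * p  # E(X) = sum over x in {0, 1} of x * P(X = x)
assert e_x == p
print(e_x)
```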

Clicker Question 4-6

Suppose that $X_1$ and $X_2$ are two stochastically independent Bernoulli random variables with $P(X_1 = 1) = P(X_2 = 1) = p$. Which of the following is true?

A) $E[X_1 + X_2] = p$.
B) $E[X_1 + X_2] = 2p$.
C) $E[X_1 + X_2] = p/2$.

Question 7 in Worksheet

Prove that, if $X$ and $Y$ are two stochastically independent Bernoulli random variables with $P(X = 1) = P(Y = 1) = p$, then $E[X + Y] = 2p$.

Proposition. If $X$ is a Bernoulli random variable with $P(X = 1) = p$, then $\mathrm{Var}[X] = p(1 - p)$.

Proof:
$$\begin{aligned}
\mathrm{Var}(X) &\overset{\text{def}}{=} \sum_{x = 0, 1} (x - p)^2\, p^x (1 - p)^{1 - x} \\
&= (0 - p)^2 (1 - p) + (1 - p)^2 p \\
&= p^2 (1 - p) + (1 - p)^2 p \\
&= (p + (1 - p))\, p(1 - p) \\
&= p(1 - p).
\end{aligned}$$
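The variance formula admits the same direct check (again with an arbitrary $p = 0.3$):

```python
p = 0.3  # hypothetical success probability
# Var(X) = sum over x in {0, 1} of (x - p)^2 * P(X = x)
var_x = (0 - p) ** 2 * (1 - p) + (1 - p) ** 2 * p
assert abs(var_x - p * (1 - p)) < 1e-12
print(var_x)
```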

Clicker Question 4-7

Suppose that $X_1$ and $X_2$ are two stochastically independent Bernoulli random variables with $P(X_i = 1) = p$ for $i = 1, 2$. Which of the following is true?

A) $\mathrm{Var}[(X_1 + X_2)/2] = p(1 - p)$.
B) $\mathrm{Var}[(X_1 + X_2)/2] = p(1 - p)/2$.
C) $\mathrm{Var}[(X_1 + X_2)/2] = p(1 - p)/4$.

Question 9 in Worksheet

Prove that, if $X$ and $Y$ are two stochastically independent Bernoulli random variables with $P(X = 1) = P(Y = 1) = p$, then
$$\mathrm{Var}\!\left[\frac{X + Y}{2}\right] = \frac{p(1 - p)}{2}.$$

Proposition. Suppose that $X_i$, $i = 1, \dots, n$, are $n$ Bernoulli random variables, stochastically independent of each other, with $P(X_i = 1) = p$ for $i = 1, \dots, n$. Let $\bar{X} := \frac{1}{n} \sum_{i=1}^{n} X_i$. Then (1) $E[\bar{X}] = p$ and (2) $\mathrm{Var}[\bar{X}] = \frac{p(1 - p)}{n}$.

Proof of (1):
$$E[\bar{X}] = \frac{1}{n} E[X_1 + X_2 + \dots + X_n] = \frac{1}{n}\big\{E[X_1] + E[X_2] + \dots + E[X_n]\big\} = \frac{p + p + \dots + p}{n} = \frac{np}{n} = p.$$

Proof of (2), $\mathrm{Var}[\bar{X}] = \frac{p(1 - p)}{n}$:
$$\begin{aligned}
\mathrm{Var}[\bar{X}] &= \mathrm{Var}\!\left[\frac{1}{n} \sum_{i=1}^{n} X_i\right] = \left(\frac{1}{n}\right)^2 \mathrm{Var}[X_1 + X_2 + \dots + X_n] \\
&= \frac{1}{n^2}\big\{\mathrm{Var}[X_1] + \mathrm{Var}[X_2] + \dots + \mathrm{Var}[X_n] \\
&\qquad\quad + 2\,\mathrm{Cov}(X_1, X_2) + 2\,\mathrm{Cov}(X_1, X_3) + \dots + 2\,\mathrm{Cov}(X_{n-1}, X_n)\big\} \\
&= \frac{1}{n^2}\big\{p(1 - p) + p(1 - p) + \dots + p(1 - p) + 0\big\} \\
&= \frac{n\, p(1 - p)}{n^2} = \frac{p(1 - p)}{n}.
\end{aligned}$$
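Both results can be seen in a Monte Carlo simulation: draw many independent samples of size $n$, compute the sample mean of each, and compare the empirical mean and variance of $\bar{X}$ with the formulas. The parameter values below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.3, 50, 200_000  # hypothetical parameter values

# Each Binomial(n, p) draw is the sum of n independent Bernoulli(p) draws,
# so dividing by n gives the sample mean Xbar.
xbar = rng.binomial(n, p, size=reps) / n

print(xbar.mean(), p)               # empirical mean of Xbar vs. E[Xbar] = p
print(xbar.var(), p * (1 - p) / n)  # empirical variance vs. p(1 - p)/n
```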