Chapter 4 continued. Chapter 4 sections:
4.1 Expectation
4.2 Properties of Expectations
4.3 Variance
4.4 Moments
4.5 The Mean and the Median
4.6 Covariance and Correlation
4.7 Conditional Expectation
SKIP: 4.8 Utility

4.5 The Mean and the Median
Median: a measure of center
Def: Median. Let X be a random variable. Every number m that satisfies P(X ≤ m) ≥ 0.5 and P(X ≥ m) ≥ 0.5 is called a median of the distribution of X.

4.5 The Mean and the Median
Example: coin toss example from Lecture 4
Recall the pf of Y = the number of heads in 3 tosses:

y        0    1    2    3
f_Y(y)  1/8  3/8  3/8  1/8

Both 1 and 2 are medians, since P(Y ≤ 1) = 1/2 ≥ 0.5 and P(Y ≥ 1) = 7/8 ≥ 0.5, and P(Y ≤ 2) = 7/8 ≥ 0.5 and P(Y ≥ 2) = 1/2 ≥ 0.5.
In fact, every number in the interval [1, 2] is a median.

4.5 The Mean and the Median
Example: Median of the exponential distribution
Let X ~ Exp(λ). The pdf of X is f(x) = λe^(−λx), x > 0. Find the median of X.
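The slide leaves the computation as an exercise. Solving F(m) = 1 − e^(−λm) = 0.5 gives the closed form m = ln(2)/λ. Below is a minimal numerical sanity check of that formula; the value λ = 0.2 and the sample size are arbitrary choices for illustration, not part of the slides.

```python
# Check the exponential median m = ln(2)/lambda by simulation.
import numpy as np

lam = 0.2                                  # arbitrary rate, for illustration only
m = np.log(2) / lam                        # closed-form median, ~3.466
samples = np.random.default_rng(0).exponential(scale=1/lam, size=1_000_000)
print(m, np.median(samples))               # the two values should agree closely
```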

4.5 The Mean and the Median
Mean vs. Median
Both can be used as a measure of center.
For a symmetric distribution: median = mean (when the mean exists).
A median exists for every distribution, but the mean may not exist.
The mean is heavily influenced by the tails ("outliers"); the median is not.
For a skewed distribution: mean < median (left-skewed) or mean > median (right-skewed).
Estimating a random variable: the mean minimizes the mean squared error (Theorem 4.5.2) and the median minimizes the mean absolute error (Theorem 4.5.3).
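As an illustration of Theorems 4.5.2 and 4.5.3 (not from the slides), the sketch below brute-forces both criteria over a grid of candidate predictions c for a right-skewed sample: the minimizer of the mean squared error lands near the sample mean, and the minimizer of the mean absolute error lands near the sample median. The Gamma(2, 1) sample is an arbitrary choice.

```python
# Mean minimizes E[(X - c)^2]; median minimizes E[|X - c|] (grid-search sketch).
import numpy as np

rng = np.random.default_rng(1)
x = rng.gamma(shape=2.0, scale=1.0, size=20_000)     # right-skewed sample

grid = np.linspace(x.min(), x.max(), 401)
mse = [np.mean((x - c) ** 2) for c in grid]
mae = [np.mean(np.abs(x - c)) for c in grid]

print("mean:", x.mean(), " argmin MSE:", grid[np.argmin(mse)])
print("median:", np.median(x), " argmin MAE:", grid[np.argmin(mae)])
```

For this right-skewed sample the mean also exceeds the median, matching the skewness rule on the slide.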

4.6 Covariance and Correlation
Covariance
Def: Covariance. Let X and Y be random variables with finite means µ_X and µ_Y. The covariance of X and Y is defined as
Cov(X, Y) = E[(X − µ_X)(Y − µ_Y)],
if the expectation exists.
Another way of calculating the covariance: Cov(X, Y) = E(XY) − E(X)E(Y).
Covariance is a measure of how much X and Y depend (linearly) on each other.
The magnitude of Cov(X, Y) depends on the magnitudes (scales) of X and Y.
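The shortcut formula follows in one line by expanding the product and using linearity of expectation:
Cov(X, Y) = E[XY − µ_X Y − µ_Y X + µ_X µ_Y] = E(XY) − µ_X E(Y) − µ_Y E(X) + µ_X µ_Y = E(XY) − E(X)E(Y).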

4.6 Covariance and Correlation
Correlation
Def: Correlation. Let X and Y be random variables with finite variances σ_X² and σ_Y². The correlation of X and Y is defined as
Cor(X, Y) = ρ(X, Y) = Cov(X, Y) / (σ_X σ_Y).
−1 ≤ ρ(X, Y) ≤ 1. This is shown using the Schwarz inequality: (E(UV))² ≤ E(U²)E(V²).
Correlation is also a measure of how much X and Y depend (linearly) on each other, but unlike covariance it does not depend on the scales of X and Y.
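To fill in the step: apply the Schwarz inequality with U = X − µ_X and V = Y − µ_Y. Then Cov(X, Y)² = (E(UV))² ≤ E(U²)E(V²) = σ_X² σ_Y², and dividing through by σ_X² σ_Y² gives ρ(X, Y)² ≤ 1, i.e. −1 ≤ ρ(X, Y) ≤ 1.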

4.6 Covariance and Correlation
Example 2: Bivariate Normal
[Figure: joint pdf of two correlated Gaussian random variables]
[Figure: contours of the joint pdf of two correlated Gaussian random variables]

4.6 Covariance and Correlation
Example 3: Calculating Covariance and Correlation
Recall from Lecture 4 the joint pdf for random variables X and Y:
f(x, y) = 8xy for 0 < y < x < 1.
We have already established that the marginal pdfs are
f_X(x) = 4x³ for 0 < x < 1 and f_Y(y) = 4y − 4y³ for 0 < y < 1.
Find Cov(X, Y) and ρ(X, Y).
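For reference (the slide leaves the integrals as an exercise): E(X) = 4/5, E(Y) = 8/15, and E(XY) = ∫_0^1 ∫_0^x 8x²y² dy dx = 4/9, so Cov(X, Y) = 4/9 − (4/5)(8/15) = 4/225 ≈ 0.0178. With Var(X) = 2/75 and Var(Y) = 11/225, ρ(X, Y) = (4/225) / √((2/75)(11/225)) = 4/√66 ≈ 0.492. The Monte Carlo sketch below checks these values; the inverse-CDF sampling scheme is derived from the marginal and conditional CDFs and is not stated on the slides.

```python
# Monte Carlo check of Cov(X, Y) and rho(X, Y) for f(x, y) = 8xy, 0 < y < x < 1.
# X has CDF x^4, so X = U^(1/4); given X = x, Y has CDF (y/x)^2, so Y = x*sqrt(V).
import numpy as np

rng = np.random.default_rng(3)
n = 2_000_000
x = rng.uniform(size=n) ** 0.25
y = x * np.sqrt(rng.uniform(size=n))

cov = np.mean((x - x.mean()) * (y - y.mean()))
rho = cov / (x.std() * y.std())
print(cov, 4 / 225)            # both ~0.0178
print(rho, 4 / np.sqrt(66))    # both ~0.492
```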

4.6 Covariance and Correlation
Properties of covariance and correlation
Theorem 4.6.4. If X and Y are independent random variables with finite variances, then Cov(X, Y) = ρ(X, Y) = 0.
The converse is not true, i.e. two random variables can be uncorrelated without being independent.
Example: Zero covariance does not imply independence. Let
f(x, y) = 1/π for x² + y² ≤ 1, and 0 otherwise.
Then X and Y are clearly not independent, but Cov(X, Y) = 0.
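A quick simulation sketch of this example (the rejection sampler and sample size are arbitrary choices): the covariance estimate is near zero, while the conditional spread of Y changes sharply with |X|, which is exactly the dependence.

```python
# Uniform on the unit disk: Cov(X, Y) ~ 0, yet X and Y are dependent.
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000
pts = rng.uniform(-1, 1, size=(2 * n, 2))            # rejection sampling from the square
pts = pts[(pts ** 2).sum(axis=1) <= 1][:n]           # keep points inside the disk
x, y = pts[:, 0], pts[:, 1]

print(np.mean((x - x.mean()) * (y - y.mean())))      # ~0: uncorrelated
# Dependence: Y is tightly constrained when |X| is near 1
print(y[np.abs(x) < 0.1].std(), y[np.abs(x) > 0.9].std())
```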

4.6 Covariance and Correlation
Properties of covariance and correlation
If Y is a linear function of X, then X and Y are perfectly correlated, i.e. ρ(X, Y) = ±1.
Cov(X, X) = Var(X)
Cov(aX + b, cY + d) = ac Cov(X, Y)
Var(aX + bY + c) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y)
Note that the covariance term is zero if X and Y are independent.
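These identities are easy to sanity-check numerically. The sketch below, with arbitrary constants and a simulated correlated pair (not from the slides), compares both sides of the last two properties on a large sample.

```python
# Numerical check of Cov(aX + b, cY + d) = ac*Cov(X, Y) and the Var formula.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)       # correlated with x by construction

def cov(u, v):
    return np.mean((u - u.mean()) * (v - v.mean()))

a, b, c, d = 2.0, 1.0, -3.0, 4.0
print(cov(a * x + b, c * y + d), a * c * cov(x, y))                      # ~equal
print(np.var(a * x + c * y + d),                                         # shift d is irrelevant
      a ** 2 * np.var(x) + c ** 2 * np.var(y) + 2 * a * c * cov(x, y))   # ~equal
```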

4.7 Conditional Expectation
Conditional Expectation
Def: Conditional Mean. Let X and Y be random variables where Y has a finite mean. The conditional expectation of Y given X = x, denoted E(Y | x) or E(Y | X = x), is the mean of the conditional distribution of Y given X = x.
More explicitly:
E(Y | x) = ∫ y f(y | x) dy if Y is continuous,
E(Y | x) = Σ_{all y} y f(y | x) if Y is discrete.
E(Y | x) is a function of x; E(Y | X) is a function of X, and thus is a random variable.

4.7 Conditional Expectation
Conditional Variance
Def: Conditional Variance. Let X and Y be random variables. The conditional variance of Y given X = x, denoted Var(Y | x) or Var(Y | X = x), is the variance of the conditional distribution of Y given X = x:
Var(Y | x) = E[(Y − E(Y | x))² | x].
Var(Y | x) is a function of x; Var(Y | X) is a random variable.

4.7 Conditional Expectation
Example
Consider again the joint pdf for random variables X and Y:
f(x, y) = 8xy for 0 < y < x < 1.
We have already established that the marginal pdfs are
f_X(x) = 4x³ for 0 < x < 1 and f_Y(y) = 4y − 4y³ for 0 < y < 1.
Find E(Y | X = 0.5) and Var(Y | X = 0.5).
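For reference (the slide leaves this as an exercise): f(y | x) = f(x, y)/f_X(x) = 8xy/(4x³) = 2y/x² for 0 < y < x, so E(Y | x) = ∫_0^x y · 2y/x² dy = 2x/3 and Var(Y | x) = E(Y² | x) − E(Y | x)² = x²/2 − 4x²/9 = x²/18. At x = 0.5 this gives E(Y | X = 0.5) = 1/3 and Var(Y | X = 0.5) = 1/72. A minimal simulation check (the Y = x√V sampler comes from the conditional CDF (y/x)²):

```python
# Check E(Y | X = 0.5) = 1/3 and Var(Y | X = 0.5) = 1/72 by simulation.
import numpy as np

rng = np.random.default_rng(5)
y = 0.5 * np.sqrt(rng.uniform(size=1_000_000))   # draws from f(y | x = 0.5)
print(y.mean(), 1 / 3)                           # ~0.3333
print(y.var(), 1 / 72)                           # ~0.0139
```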

4.7 Conditional Expectation
Laws of total probability for E and Var
Theorem 4.7.1 (Law of total probability for expectations). Let X and Y be random variables such that Y has finite mean. Then
E(E(Y | X)) = E(Y).
Theorem 4.7.4 (Law of total probability for variances). Let X and Y be random variables. Then
Var(Y) = E(Var(Y | X)) + Var(E(Y | X)).

4.7 Conditional Expectation
Example: Hierarchical Model, using the laws of total probability
Screening for defective Halloween candy among n pieces:
1. John's daughter Emma screens all n items; the probability that an item passes her screening is p_1.
2. John screens what passed Emma's screening; the probability that a candy passes his screening is p_2.
How many candies can we expect to pass this double screening?
X = number of candies that pass the first screening: X ~ Binomial(n, p_1).
Y = number of candies that pass the second screening: Y | X = x ~ Binomial(x, p_2).
Find E(Y) and Var(Y).
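For reference, the two theorems give the answers directly: E(Y) = E(E(Y | X)) = E(p_2 X) = n p_1 p_2, and Var(Y) = E(Var(Y | X)) + Var(E(Y | X)) = n p_1 p_2 (1 − p_2) + p_2² n p_1 (1 − p_1) = n p_1 p_2 (1 − p_1 p_2); that is, Y ~ Binomial(n, p_1 p_2) marginally. A simulation sketch with arbitrary values n = 100, p_1 = 0.8, p_2 = 0.9 (not from the slides):

```python
# Double screening: check E(Y) = n*p1*p2 and Var(Y) = n*p1*p2*(1 - p1*p2).
import numpy as np

rng = np.random.default_rng(6)
n, p1, p2 = 100, 0.8, 0.9
trials = 1_000_000

x = rng.binomial(n, p1, size=trials)   # candies passing Emma's screening
y = rng.binomial(x, p2)                # candies passing John's screening, given X

print(y.mean(), n * p1 * p2)                  # ~72
print(y.var(), n * p1 * p2 * (1 - p1 * p2))   # ~20.16
```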

END OF CHAPTER 4