CHAPTER 4 MATHEMATICAL EXPECTATION

4.1 Mean of a Random Variable


The expected value, or mathematical expectation, E(X) of a random variable X is the long-run average value of X that would emerge after a very large number of observations. We often denote the expected value by µ_X, or simply µ when there is no confusion. µ_X = E(X) is also referred to as the mean of the random variable X, or the mean of the probability distribution of X. In the case of a finite population, the expected value is the population mean.

Consider a university with 15000 students and let X be the number of courses for which a randomly selected student is registered. The probability distribution of X is as follows:

x                 1      2      3      4      5
No. of students   300    900    2850   4500   6450
f(x)              0.02   0.06   0.19   0.30   0.43

The average number of courses per student, i.e. the average value of X in the population, is the total number of courses taken by all students divided by the number of students:

µ = [1(300) + 2(900) + 3(2850) + 4(4500) + 5(6450)] / 15000 = 60900 / 15000 = 4.06

Since f(1) = 300/15000 = 0.02, f(2) = 900/15000 = 0.06, and so on, an alternative expression for the mean is

µ = 1 f(1) + 2 f(2) + 3 f(3) + 4 f(4) + 5 f(5)
  = 1(0.02) + 2(0.06) + 3(0.19) + 4(0.30) + 5(0.43)
  = 0.02 + 0.12 + 0.57 + 1.20 + 2.15
  = 4.06
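Both routes to the 4.06 figure are easy to check numerically. A minimal Python sketch, using the course-count table from the example:

```python
# Number of registered courses x and student counts from the example
counts = {1: 300, 2: 900, 3: 2850, 4: 4500, 5: 6450}
n = sum(counts.values())  # population size: 15000

# Population mean: total courses divided by number of students
mu_population = sum(x * c for x, c in counts.items()) / n

# Equivalent computation through the PMF f(x) = count / n
f = {x: c / n for x, c in counts.items()}
mu_pmf = sum(x * f[x] for x in f)

print(mu_population, mu_pmf)  # both ≈ 4.06
```

The second computation is exactly the weighted sum µ = Σ x f(x) that the next definition formalizes.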

Mean, or Expected Value, of a Random Variable X
Let X be a random variable with probability distribution f(x). The mean, or expected value, of X is

µ = E(X) = Σ_x x f(x)       if X is discrete
µ = E(X) = ∫ x f(x) dx      if X is continuous

EXAMPLE 4.1 (Discrete). Suppose that a random variable X has the following PMF:

x      −1    0     1     2
f(x)   0.3   0.1   0.4   0.2

Find E(X), the mathematical expectation of X.

EXAMPLE 4.2 (Continuous). Consider a random variable X with PDF

f(x) = 3x²   if 0 < x < 1,
       0     otherwise.

Find E(X).

EXAMPLE 4.3 (Interview). Six men and five women apply for an executive position in a small company. Two of the applicants are selected for interview. Let X denote the number of women in the interview pool. We found the PMF of X in the previous chapter:

x      0      1      2
f(x)   3/11   6/11   2/11

How many women do you expect in the interview pool? That is, what is the expected value of X?

EXAMPLE 4.4 (Train Waiting). A commuter train arrives punctually at a station every half hour. Each morning, a commuter named John leaves his house and casually strolls to the train station. Let X denote the amount of time, in minutes, that John waits for the train from the time he reaches the station. It is known that the PDF of X is

f(x) = 1/30   for 0 < x < 30,
       0      otherwise.

Obtain and interpret the expected value of the random variable X.

EXAMPLE 4.5 (DVD Failure). The time to failure, in thousands of hours, of an important piece of electronic equipment used in a manufactured DVD player has the density function

f(x) = 2e^(−2x)   for x > 0,
       0          otherwise.

Find the expected life of this piece of equipment.

Mean of g(X)
Let X be a random variable with probability distribution f(x). The expected value of the random variable g(X) is

µ_g(X) = E[g(X)] = Σ_x g(x) f(x)      if X is discrete
µ_g(X) = E[g(X)] = ∫ g(x) f(x) dx     if X is continuous

STAT-3611 Lecture Notes, 2015 Fall, X. Li
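The two branches of the definition can be sanity-checked in Python. The discrete case uses the PMF of Example 4.1; the continuous case approximates ∫₀¹ x · 3x² dx from Example 4.2 with a midpoint Riemann sum (the exact value is 3/4):

```python
# Discrete: E(X) = sum of x * f(x), PMF from Example 4.1
pmf = {-1: 0.3, 0: 0.1, 1: 0.4, 2: 0.2}
e_discrete = sum(x * p for x, p in pmf.items())  # -0.3 + 0 + 0.4 + 0.4 ≈ 0.5

# Continuous: E(X) = integral of x * f(x) dx with f(x) = 3x^2 on (0, 1),
# approximated by a midpoint Riemann sum with n subintervals
n = 100_000
h = 1.0 / n
e_continuous = sum((i + 0.5) * h * 3 * ((i + 0.5) * h) ** 2 for i in range(n)) * h

print(e_discrete)    # ≈ 0.5
print(e_continuous)  # ≈ 0.75
```

The same sketch answers Example 4.1 directly: E(X) = 0.5.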

EXAMPLE 4.6. Refer to Example 4.1 (Discrete). Find the expected value of the random variable X² + 1.

EXAMPLE 4.7. Refer to Example 4.2 (Continuous). Calculate E(X²).

EXAMPLE 4.8. Refer to Example 4.3 (Interview). How many men do you expect in the interview pool? That is, find E(2 − X).

EXAMPLE 4.9. Refer to Example 4.4 (Train Waiting). What is E(X/60)? Can you interpret it?

EXAMPLE 4.10. Refer to Example 4.5 (DVD Failure). Find E(e^X).

EXAMPLE 4.11 (Insurance Payout). A group health insurance policy of a small business pays 100% of employee medical bills up to a maximum of $1 million per policy year. The total annual medical bills, X, in millions of dollars, incurred by the employees has PDF

f(x) = x(4 − x)/9   for 0 < x < 3,
       0            otherwise.

Determine the expected annual payout by the insurance company, i.e., the expected value of g(X) = min{X, 1}.

Mean of g(X, Y)
Let X and Y be two random variables with joint probability distribution f(x, y). The mean, or expected value, of the random variable g(X, Y) is

µ_g(X,Y) = E[g(X,Y)] = Σ_x Σ_y g(x,y) f(x,y)      if X and Y are discrete
µ_g(X,Y) = E[g(X,Y)] = ∫∫ g(x,y) f(x,y) dx dy     if X and Y are continuous

EXAMPLE 4.12 (Joint). X and Y are two random variables with the joint PMF:

f(x,y)    y = −1   y = 0   y = 1
x = 0     0        0.1     0.2
x = 1     0.3      0       0.1
x = 2     0.2      0.1     0

Find E(XY) and E(XY²).

Calculating E(X) or E(Y) from the joint PMF/PDF
Let g(x) and h(y) denote the marginal distributions of X and Y, respectively. Then

E(X) = Σ_x Σ_y x f(x,y) = Σ_x x g(x)        (discrete)
E(X) = ∫∫ x f(x,y) dy dx = ∫ x g(x) dx      (continuous)

E(Y) = Σ_y Σ_x y f(x,y) = Σ_y y h(y)        (discrete)
E(Y) = ∫∫ y f(x,y) dx dy = ∫ y h(y) dy      (continuous)

EXAMPLE 4.13. Refer to Example 4.12 (Joint).
(a) Find the marginal PMF of X. Use it to calculate E(X).
(b) Find the marginal PMF of Y. Use it to calculate E(Y).
(c) Calculate E(X) using the joint PMF, i.e., E(X) = Σ_x Σ_y x f(x,y).
(d) Calculate E(Y) using the joint PMF, i.e., E(Y) = Σ_y Σ_x y f(x,y).
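The identities in the box above say that summing over the joint PMF and summing over the marginal give the same answer. A Python sketch of both routes, using the table of Example 4.12 (rows x = 0, 1, 2; columns y = −1, 0, 1):

```python
# Joint PMF of Example 4.12: joint[(x, y)] = f(x, y)
joint = {
    (0, -1): 0.0, (0, 0): 0.1, (0, 1): 0.2,
    (1, -1): 0.3, (1, 0): 0.0, (1, 1): 0.1,
    (2, -1): 0.2, (2, 0): 0.1, (2, 1): 0.0,
}

def expect(g):
    """E[g(X, Y)] = sum over all (x, y) of g(x, y) * f(x, y)."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

e_xy = expect(lambda x, y: x * y)   # E(XY)
e_x = expect(lambda x, y: x)        # E(X) directly from the joint PMF

# Marginal PMF of X: g(x) = sum over y of f(x, y), then E(X) = sum x g(x)
marg_x = {}
for (x, y), p in joint.items():
    marg_x[x] = marg_x.get(x, 0.0) + p
e_x_marginal = sum(x * p for x, p in marg_x.items())
```

Here the two routes agree, as they must: parts (a) and (c) of Example 4.13 are the same number computed two ways.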

4.2 Variance and Covariance of Random Variables

The variance of a random variable X, or the variance of the probability distribution of X, is defined as the expected squared deviation from the expected value.

Variance & Standard Deviation
Let X be a random variable with probability distribution f(x) and mean µ. The variance of X is

σ² = Var(X) = E[(X − µ)²] = E{[X − E(X)]²}
            = Σ_x (x − µ)² f(x)      if X is discrete
            = ∫ (x − µ)² f(x) dx     if X is continuous

The positive square root of the variance, σ, is called the standard deviation of X.

EXAMPLE 4.14. Refer to Example 4.1 (Discrete). Find σ² = Var(X), the variance of X.

Note that

E[(X − µ)²] = E(X² − 2µX + µ²) = E(X²) − 2µE(X) + µ² = E(X²) − µ²

We therefore often calculate the variance via the shortcut formula

Var(X) = E(X²) − [E(X)]²

EXAMPLE 4.15. Refer to Example 4.1 (Discrete). Find σ² = Var(X) using the above formula.

EXAMPLE 4.16. Refer to Example 4.2 (Continuous). Find Var(X).

EXAMPLE 4.17. Refer to Example 4.4 (Train Waiting). Calculate σ, the standard deviation of X.

EXAMPLE 4.18. Refer to Example 4.5 (DVD Failure). Calculate the variance of X.

As with the mathematical expectation, we can extend the concept of the variance of a random variable X to the variance of a function of X, say g(X).

Variance of g(X)
Let X be a random variable with probability distribution f(x). The variance of the random variable g(X) is

σ²_g(X) = E{[g(X) − µ_g(X)]²}
        = Σ_x [g(x) − µ_g(X)]² f(x)      (discrete)
        = ∫ [g(x) − µ_g(X)]² f(x) dx     (continuous)

It can also be calculated as

Var[g(X)] = E{[g(X)]²} − {E[g(X)]}²
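The defining formula and the shortcut can both be checked on the PMF of Example 4.1, which is what Examples 4.14 and 4.15 ask for. A short Python sketch:

```python
# PMF of Example 4.1
pmf = {-1: 0.3, 0: 0.1, 1: 0.4, 2: 0.2}

mu = sum(x * p for x, p in pmf.items())                   # E(X)
# Defining formula: Var(X) = E[(X - mu)^2]
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())

# Shortcut: Var(X) = E(X^2) - [E(X)]^2
e_x2 = sum(x ** 2 * p for x, p in pmf.items())
var_shortcut = e_x2 - mu ** 2
```

Both give Var(X) = 1.25 here. The shortcut is usually less work, since E(X) is typically computed anyway.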

EXAMPLE 4.19. Refer to Example 4.1 (Discrete). Find the variance of X² + 1.

EXAMPLE 4.20. Refer to Example 4.2 (Continuous). Find Var(X²).

Covariance of X and Y
Let X and Y be random variables with joint probability distribution f(x,y). The covariance of X and Y is

σ_XY = Cov(X,Y) = E[(X − µ_X)(Y − µ_Y)]
     = Σ_x Σ_y (x − µ_X)(y − µ_Y) f(x,y)      (discrete)
     = ∫∫ (x − µ_X)(y − µ_Y) f(x,y) dx dy     (continuous)

We often calculate Cov(X,Y) via the shortcut formula

Cov(X,Y) = E(XY) − E(X)E(Y)

NOTE. The covariance is a measure of the association between two random variables. The sign of the covariance indicates whether the relationship between two dependent random variables is positive or negative. If X and Y are statistically independent, then the covariance is zero; the converse, however, is not generally true. The association that the covariance measures between X and Y is the linear relationship.

EXAMPLE 4.21. Refer to Example 4.12 (Joint). Calculate the covariance of X and Y.

Correlation Coefficient of X and Y
Let X and Y be random variables with covariance σ_XY and standard deviations σ_X and σ_Y, respectively. The correlation coefficient of X and Y is

ρ_XY = σ_XY / (σ_X σ_Y) = Cov(X,Y) / √(Var(X) Var(Y))

NOTE. Unlike the variance, the correlation coefficient ρ_XY is a scale-free measure: its magnitude does not depend on the units used to measure X and Y. The correlation coefficient indicates the strength of the linear relationship, and −1 ≤ ρ_XY ≤ 1.

EXAMPLE 4.22. Refer to Example 4.12 (Joint). Calculate the correlation coefficient of X and Y.

4.3 Means and Variances of Linear Combinations of Random Variables

Theorem. The expected value of the sum or difference of two or more functions of a random variable X is the sum or difference of the expected values of the functions. That is,

E[g(X) ± h(X)] = E[g(X)] ± E[h(X)]

Proof. For the continuous case,

E[g(X) ± h(X)] = ∫ [g(x) ± h(x)] f(x) dx
               = ∫ g(x) f(x) dx ± ∫ h(x) f(x) dx
               = E[g(X)] ± E[h(X)]
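Returning to Example 4.12 (Joint): the covariance and correlation-coefficient formulas, including the shortcut Cov(X,Y) = E(XY) − E(X)E(Y), can be checked numerically. A Python sketch covering Examples 4.21 and 4.22:

```python
# Joint PMF of Example 4.12 (rows x = 0, 1, 2; columns y = -1, 0, 1)
joint = {
    (0, -1): 0.0, (0, 0): 0.1, (0, 1): 0.2,
    (1, -1): 0.3, (1, 0): 0.0, (1, 1): 0.1,
    (2, -1): 0.2, (2, 0): 0.1, (2, 1): 0.0,
}

def expect(g):
    """E[g(X, Y)] over the joint PMF."""
    return sum(g(x, y) * p for (x, y), p in joint.items())

mu_x, mu_y = expect(lambda x, y: x), expect(lambda x, y: y)

# Defining formula: E[(X - mu_X)(Y - mu_Y)]
cov_def = expect(lambda x, y: (x - mu_x) * (y - mu_y))
# Shortcut: E(XY) - E(X)E(Y)
cov_short = expect(lambda x, y: x * y) - mu_x * mu_y

var_x = expect(lambda x, y: (x - mu_x) ** 2)
var_y = expect(lambda x, y: (y - mu_y) ** 2)
rho = cov_def / (var_x * var_y) ** 0.5   # correlation coefficient
```

The two covariance routes agree (Cov = −0.4 for this table), and ρ lands in [−1, 1] as the NOTE requires.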

Theorem. The expected value of the sum or difference of two or more functions of the random variables X and Y is the sum or difference of the expected values of the functions. That is,

E[g(X,Y) ± h(X,Y)] = E[g(X,Y)] ± E[h(X,Y)]

Proof. For the continuous case,

E[g(X,Y) ± h(X,Y)] = ∫∫ [g(x,y) ± h(x,y)] f(x,y) dx dy
                   = ∫∫ g(x,y) f(x,y) dx dy ± ∫∫ h(x,y) f(x,y) dx dy
                   = E[g(X,Y)] ± E[h(X,Y)]

COROLLARY. E[g(X) ± h(Y)] = E[g(X)] ± E[h(Y)]

COROLLARY. E(X ± Y) = E(X) ± E(Y)

Theorem. If a, b and c are constants, then

E(aX + bY + c) = aE(X) + bE(Y) + c

and

Var(aX + bY + c) = a² Var(X) + b² Var(Y) + 2ab Cov(X,Y)

Proof. E(aX + bY + c) = aE(X) + bE(Y) + c follows from the preceding corollaries. For the variance,

Var(aX + bY + c) = E{[(aX + bY + c) − E(aX + bY + c)]²}
                 = E{[(aX + bY + c) − (aE(X) + bE(Y) + c)]²}
                 = E{[a(X − E(X)) + b(Y − E(Y))]²}
                 = a² E{[X − E(X)]²} + b² E{[Y − E(Y)]²} + 2ab E{[X − E(X)][Y − E(Y)]}
                 = a² Var(X) + b² Var(Y) + 2ab Cov(X,Y)

COROLLARY. It can be easily verified that

E(c) = c                  Var(c) = 0
E(X + c) = E(X) + c       Var(X + c) = Var(X)
E(aX) = aE(X)             Var(aX) = a² Var(X)
E(aX + c) = aE(X) + c     Var(aX + c) = a² Var(X)

EXAMPLE 4.23. Suppose that X and Y are random variables with E(X) = 2, E(Y) = 3, Var(X) = 4, Var(Y) = 5, and correlation coefficient ρ = 0.6. Let Z = 2X + 4Y − 3. Find
(a) E(Z)
(b) Cov(X,Y)
(c) Var(Z)

EXAMPLE 4.24. Refer to Example 4.1 (Discrete). Find E[(X − 2)(X + 1)].

EXAMPLE 4.25. Refer to Example 4.2 (Continuous). Find E(3X² + 5X − 8).

EXAMPLE 4.26. Refer to Example 4.12 (Joint). Find Var(X − 2Y + 3).
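The theorem makes problems like Example 4.23 purely mechanical. A Python sketch with the numbers given there (reading the linear combination as Z = 2X + 4Y − 3; the sign of the constant term is an assumption recovered from the garbled transcription, though it only shifts E(Z) and leaves Var(Z) unchanged):

```python
import math

# Given in Example 4.23
e_x, e_y = 2, 3
var_x, var_y = 4, 5
rho = 0.6

# (b) Cov(X, Y) = rho * sigma_X * sigma_Y
cov_xy = rho * math.sqrt(var_x) * math.sqrt(var_y)

# Z = aX + bY + c with a = 2, b = 4, c = -3 (sign of c assumed)
a, b, c = 2, 4, -3
# (a) E(Z) = a E(X) + b E(Y) + c
e_z = a * e_x + b * e_y + c
# (c) Var(Z) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y)
var_z = a ** 2 * var_x + b ** 2 * var_y + 2 * a * b * cov_xy
```

Note that the constant c contributes to the mean but not to the variance, exactly as the corollaries state.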

Theorem. Let X and Y be two independent random variables. Then E(XY) = E(X)E(Y).

Proof. For the continuous case, independence gives f(x,y) = g(x)h(y), where g and h are the marginal densities of X and Y. Hence

E(XY) = ∫∫ xy f(x,y) dx dy = ∫∫ xy g(x)h(y) dx dy = [∫ x g(x) dx][∫ y h(y) dy] = E(X)E(Y)

COROLLARY. Let X and Y be two independent random variables. Then σ_XY = Cov(X,Y) = 0.

EXAMPLE 4.27. X and Y are random variables with joint density function

f(x,y) = 6e^(−(2x+3y))   for x > 0, y > 0,
         0               otherwise.

Find ρ_XY.

COROLLARY. If X and Y are independent random variables, then

Var(aX ± bY) = a² Var(X) + b² Var(Y)

COROLLARY. If X₁, X₂, ..., X_n are independent random variables, then

Var(Σ_{i=1}^n a_i X_i) = Σ_{i=1}^n a_i² Var(X_i)

That is, Var(a₁X₁ + a₂X₂ + ··· + a_nX_n) = a₁² Var(X₁) + a₂² Var(X₂) + ··· + a_n² Var(X_n).

4.4 Other properties

In general,

E(Σ_{i=1}^n a_i X_i) = Σ_{i=1}^n a_i E(X_i)

Var(Σ_{i=1}^n a_i X_i) = Σ_{i=1}^n Σ_{j=1}^n a_i a_j Cov(X_i, X_j)
                       = Σ_{i=1}^n a_i² Var(X_i) + Σ_{i≠j} a_i a_j Cov(X_i, X_j)
                       = Σ_{i=1}^n a_i² Var(X_i) + 2 Σ_{1≤i<j≤n} a_i a_j Cov(X_i, X_j)

Cov(Σ_{i=1}^n a_i X_i, Σ_{j=1}^m b_j Y_j) = Σ_{i=1}^n Σ_{j=1}^m a_i b_j Cov(X_i, Y_j)
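In Example 4.27 the joint density factors as (2e^(−2x))(3e^(−3y)), i.e. X and Y are independent exponentials with rates 2 and 3, so the corollary predicts Cov(X,Y) = 0 and hence ρ_XY = 0, with E(XY) = E(X)E(Y) = (1/2)(1/3) = 1/6. A quick Monte Carlo sketch of that prediction (seeded for reproducibility; equalities hold only up to sampling error):

```python
import random

random.seed(42)
n = 200_000

# Independent draws: X ~ Exponential(rate 2), Y ~ Exponential(rate 3)
xs = [random.expovariate(2) for _ in range(n)]
ys = [random.expovariate(3) for _ in range(n)]

def mean(v):
    return sum(v) / len(v)

mx, my = mean(xs), mean(ys)

# Sample covariance: average of (x - mx)(y - my); should be near 0
cov = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
# E(XY) should be near E(X)E(Y) = 1/6
e_xy = mean([x * y for x, y in zip(xs, ys)])
```

Independence is what makes the product factor; for dependent variables the same simulation would show a nonzero covariance.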

Some further useful properties of the covariance:

Cov(X, a) = 0
Cov(X, X) = Var(X)
Cov(X, Y) = Cov(Y, X)
Cov(aX, bY) = ab Cov(X, Y)
Cov(X + a, Y + b) = Cov(X, Y)
Cov(aX + bY, cZ + dW) = ac Cov(X,Z) + ad Cov(X,W) + bc Cov(Y,Z) + bd Cov(Y,W)

EXAMPLE 4.28. Prove that Cov(aX, bY) = ab Cov(X,Y), where a and b are constants.

EXAMPLE 4.29. Suppose that X and Y are random variables with Var(X) = 4, Var(Y) = 5, and Cov(X,Y) = 3. Calculate
(a) Cov(12X − 2013, 2014)
(b) Cov(5X, 6Y)
(c) Cov(X + 2013, Y + 2014)
(d) Cov(X + 2Y, 3X + 4Y)
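The scaling and shift properties hold for sample covariances as exact algebraic identities, which makes them easy to illustrate. A Python sketch with arbitrary made-up data (the data values are illustrative, not from the notes):

```python
def cov(u, v):
    """Sample covariance (dividing by n) of two equal-length lists."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)

# Arbitrary illustrative data
x = [1.0, 2.0, 4.0, 7.0]
y = [2.0, 1.0, 5.0, 6.0]
a, b = 3.0, -2.0

base = cov(x, y)

# Cov(aX, bY) = ab Cov(X, Y)
lhs_scale = cov([a * v for v in x], [b * w for w in y])
# Cov(X + a, Y + b) = Cov(X, Y)
lhs_shift = cov([v + a for v in x], [w + b for w in y])
```

The same checks confirm Cov(X, X) = Var(X) and the symmetry Cov(X, Y) = Cov(Y, X).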