Chapter 4 sections

4.1 Expectation
4.2 Properties of Expectations
4.3 Variance
4.4 Moments
4.5 The Mean and the Median
4.6 Covariance and Correlation
4.7 Conditional Expectation
SKIP: 4.8 Utility

4.1 Expectation: Summarizing distributions

The distribution of X contains everything there is to know about the probabilistic properties of X. However, sometimes we want to summarize the distribution of X in one or a few numbers, e.g. to more easily compare two or more distributions.

Examples of descriptive quantities:
- Mean (= Expectation): the center of mass, a weighted average
- Median
- Moments
- Variance, Interquartile Range (IQR)
- Covariance, Correlation

4.1 Expectation: Definition of Expectation

Def (Mean, a.k.a. Expected value): Let X be a random variable with p(d)f f(x). The mean, or expected value, of X, denoted E(X), is defined as follows:

X discrete: E(X) = \sum_{\text{all } x} x f(x), assuming the sum exists.

X continuous: E(X) = \int_{-\infty}^{\infty} x f(x) \, dx, assuming the integral exists.

If the sum or integral does not exist, we say that the expected value does not exist. The mean is often denoted by µ, so µ = E(X).
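
As a supplement to the definition (not on the original slide), here is a minimal Python sketch that evaluates E(X) from a pmf table in the discrete case and approximates the integral in the continuous case; the pmf and the Uniform(a, b) density are just illustrative choices:

```python
import numpy as np

# Discrete case: E(X) = sum over all x of x * f(x), using a small pmf table.
x_vals = np.array([0, 1, 2, 3])
pmf = np.array([1/8, 3/8, 3/8, 1/8])        # probabilities must sum to 1
print(np.sum(x_vals * pmf))                 # 1.5

# Continuous case: E(X) = integral of x * f(x) dx, here for Uniform(a, b),
# approximated by a Riemann sum (the exact answer is (a + b) / 2).
a, b = 2.0, 5.0
x = np.linspace(a, b, 1_000_000)
f = np.full_like(x, 1.0 / (b - a))          # Uniform(a, b) density on [a, b]
dx = (b - a) / (x.size - 1)
print(np.sum(x * f) * dx)                   # approx. 3.5
```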

4.1 Expectation: Examples

Recall the distribution of Y = the number of heads in 3 tosses (coin toss example from Lecture 4):

y        0    1    2    3
f_Y(y)  1/8  3/8  3/8  1/8

Then E(Y) = 0 · 1/8 + 1 · 3/8 + 2 · 3/8 + 3 · 1/8 = 12/8 = 3/2 = 1.5.

Find E(X) where X ~ Binom(n, p). The pf of X is

f(x) = \binom{n}{x} p^x (1-p)^{n-x} for x = 0, 1, ..., n

Find E(X) where X ~ Uniform(a, b). The pdf of X is

f(x) = \frac{1}{b-a} for a ≤ x ≤ b
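
For reference (the worked answers are not on the slide), the two exercises come out to E(X) = np for the binomial and E(X) = (a + b)/2 for the uniform. A quick Monte Carlo sanity check, with arbitrary parameter values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Binomial(n, p): the exercise works out to E(X) = n * p.
n, p = 10, 0.3
samples = rng.binomial(n, p, size=1_000_000)
print(samples.mean(), n * p)            # both approx. 3.0

# Uniform(a, b): the exercise works out to E(X) = (a + b) / 2.
a, b = 2.0, 5.0
samples = rng.uniform(a, b, size=1_000_000)
print(samples.mean(), (a + b) / 2)      # both approx. 3.5
```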

4.1 Expectation: Expectation of g(X)

Theorem 4.1.1: Let X be a random variable with p(d)f f(x) and let g(x) be a real-valued function. Then

X discrete: E(g(X)) = \sum_{\text{all } x} g(x) f(x)

X continuous: E(g(X)) = \int_{-\infty}^{\infty} g(x) f(x) \, dx

Example: Find E(X^2) where X ~ Uniform(a, b).
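
A sketch of the example (not worked on the slide): by Theorem 4.1.1, E(X^2) = \int_a^b x^2 \frac{1}{b-a} dx = \frac{b^3 - a^3}{3(b-a)} = \frac{a^2 + ab + b^2}{3}. A simulation check with arbitrary a, b:

```python
import numpy as np

rng = np.random.default_rng(0)

# E(X^2) for X ~ Uniform(a, b): by Theorem 4.1.1 this is the integral of
# x^2 / (b - a) over [a, b], which evaluates to (a^2 + a*b + b^2) / 3.
a, b = 2.0, 5.0
samples = rng.uniform(a, b, size=1_000_000)
print((samples**2).mean())              # approx. 13.0
print((a**2 + a*b + b**2) / 3)          # exactly 13.0
```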

4.1 Expectation: Expectation of g(X, Y)

Theorem 4.1.2: Let X and Y be random variables with joint p(d)f f(x, y) and let g(x, y) be a real-valued function. Then

X and Y discrete: E(g(X, Y)) = \sum_{\text{all } x, y} g(x, y) f(x, y)

X and Y continuous: E(g(X, Y)) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) f(x, y) \, dx \, dy

Example: Find E\left(\frac{X+Y}{2}\right) where X and Y are independent, X ~ Uniform(a, b) and Y ~ Uniform(c, d).
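
The example (answer not on the slide) comes out to ((a + b)/2 + (c + d)/2)/2, the average of the two means. A simulation check with arbitrary endpoints:

```python
import numpy as np

rng = np.random.default_rng(0)

# E((X + Y) / 2) with X ~ Uniform(a, b) and Y ~ Uniform(c, d) independent.
# By linearity of expectation it equals ((a + b)/2 + (c + d)/2) / 2.
a, b, c, d = 0.0, 1.0, 2.0, 6.0
x = rng.uniform(a, b, size=1_000_000)
y = rng.uniform(c, d, size=1_000_000)
print(((x + y) / 2).mean())                     # approx. 2.25
print(((a + b) / 2 + (c + d) / 2) / 2)          # exactly 2.25
```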

4.2 Properties of Expectations: Properties of Expectation

Theorems 4.2.1, 4.2.4 and 4.2.6:

- E(aX + b) = a E(X) + b for constants a and b.
- Let X_1, ..., X_n be n random variables, all with finite expectations E(X_i). Then
  E\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n E(X_i)
  Corollary: E(a_1 X_1 + ... + a_n X_n + b) = a_1 E(X_1) + ... + a_n E(X_n) + b for constants b, a_1, ..., a_n.
- Let X_1, ..., X_n be n independent random variables, all with finite expectations E(X_i). Then
  E\left(\prod_{i=1}^n X_i\right) = \prod_{i=1}^n E(X_i)

CAREFUL!!! In general E(g(X)) ≠ g(E(X)). For example: E(X^2) ≠ [E(X)]^2 unless Var(X) = 0.
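
A small simulation sketch of both points (not on the slide), using an arbitrarily chosen Exponential distribution with mean 2, so that Var(X) = 4 and E(X^2) = 8:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1_000_000)   # any distribution works here

# Linearity: E(aX + b) = a E(X) + b.
a, b = 3.0, -1.0
print((a * x + b).mean(), a * x.mean() + b)      # both approx. 5.0

# CAREFUL: E(g(X)) != g(E(X)) in general; here E(X^2) = 8 but [E(X)]^2 = 4,
# and the gap is exactly Var(X) = 4.
print((x**2).mean())                              # approx. 8.0
print(x.mean() ** 2)                              # approx. 4.0
```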

4.2 Properties of Expectations: Examples

If X_1, X_2, ..., X_n are i.i.d. Bernoulli(p) random variables, then Y = \sum_{i=1}^n X_i ~ Binomial(n, p).

E(X_i) = 0 · (1 - p) + 1 · p = p for i = 1, ..., n, so

E(Y) = E\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n E(X_i) = \sum_{i=1}^n p = np

Note: i.i.d. stands for independent and identically distributed.

4.3 Variance: Definition of Variance

Def (Variance): Let X be a random variable (discrete or continuous) with a finite mean µ = E(X). The variance of X is defined as

Var(X) = E\left((X - µ)^2\right)

The standard deviation of X is defined as \sqrt{Var(X)}. We often use σ^2 for the variance and σ for the standard deviation.

Theorem 4.3.1 (another way of calculating the variance): For any random variable X,

Var(X) = E(X^2) - [E(X)]^2
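
A minimal check (not on the slide) that the definition and Theorem 4.3.1 agree, using the coin-toss pmf from the earlier example:

```python
import numpy as np

# Coin-toss pmf from earlier: Y = number of heads in 3 tosses.
y = np.array([0, 1, 2, 3])
pmf = np.array([1/8, 3/8, 3/8, 1/8])

mu = np.sum(y * pmf)                       # E(Y) = 1.5
var_def = np.sum((y - mu)**2 * pmf)        # E((Y - mu)^2), the definition
var_alt = np.sum(y**2 * pmf) - mu**2       # E(Y^2) - [E(Y)]^2, Theorem 4.3.1
print(var_def, var_alt)                    # both 0.75
```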

4.3 Variance: Examples - calculating the variance

Recall the distribution of Y = the number of heads in 3 tosses (coin toss example from Lecture 4):

y        0    1    2    3
f_Y(y)  1/8  3/8  3/8  1/8

We already found that µ = E(Y) = 1.5. Then

Var(Y) = (0 - 1.5)^2 · 1/8 + (1 - 1.5)^2 · 3/8 + (2 - 1.5)^2 · 3/8 + (3 - 1.5)^2 · 1/8 = 0.75

Find Var(X) where X ~ Uniform(a, b).
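
For reference (the worked answer is not on the slide), the uniform exercise comes out to Var(X) = (b - a)^2 / 12. A quick simulation check with arbitrary a, b:

```python
import numpy as np

rng = np.random.default_rng(0)

# Var(X) for X ~ Uniform(a, b): the exercise works out to (b - a)^2 / 12.
a, b = 2.0, 5.0
samples = rng.uniform(a, b, size=1_000_000)
print(samples.var())                    # approx. 0.75
print((b - a)**2 / 12)                  # exactly 0.75
```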

4.3 Variance: Properties of the Variance

Theorems 4.3.2, 4.3.3, 4.3.4 and 4.3.5:

- Var(X) ≥ 0 for any random variable X.
- Var(X) = 0 if and only if X is a constant, i.e. P(X = c) = 1 for some constant c.
- Var(aX + b) = a^2 Var(X) for constants a and b.
- If X_1, ..., X_n are independent, then
  Var\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n Var(X_i)
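
A simulation sketch of the last two properties (distributions and constants chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=1_000_000)   # Var(X) = 4
y = rng.exponential(scale=3.0, size=1_000_000)       # Var(Y) = 9, independent of X

# Var(aX + b) = a^2 Var(X): the shift b has no effect on the spread.
a, b = 3.0, 100.0
print((a * x + b).var(), a**2 * x.var())             # both approx. 36

# Independence: Var(X + Y) = Var(X) + Var(Y).
print((x + y).var(), x.var() + y.var())              # both approx. 13
```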

4.3 Variance: Examples

If X_1, X_2, ..., X_n are i.i.d. Bernoulli(p) random variables, then Y = \sum_{i=1}^n X_i ~ Binomial(n, p).

E(X_i) = p for i = 1, ..., n
E(X_i^2) = 0^2 · (1 - p) + 1^2 · p = p for i = 1, ..., n
Var(X_i) = E(X_i^2) - [E(X_i)]^2 = p - p^2 = p(1 - p)

Therefore, we have

Var(Y) = Var\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n Var(X_i) = \sum_{i=1}^n p(1 - p) = np(1 - p)
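
A short simulation check of the result (parameter values arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Var(Y) for Y ~ Binomial(n, p) should be n * p * (1 - p).
n, p = 10, 0.3
samples = rng.binomial(n, p, size=1_000_000)
print(samples.var())                    # approx. 2.1
print(n * p * (1 - p))                  # exactly 2.1
```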

4.4 Moments: Moments and Central moments

Def (Moments): Let X be a random variable and k a positive integer.
- The expectation E(X^k) is called the k-th moment of X.
- Let E(X) = µ. The expectation E\left((X - µ)^k\right) is called the k-th central moment of X.

The first moment is the mean: µ = E(X^1).
The first central moment is zero: E(X - µ) = E(X) - E(X) = 0.
The second central moment is the variance: σ^2 = E\left((X - µ)^2\right).

4.4 Moments: Moments and Central moments (continued)

Symmetric distribution: the p(d)f f(x) is symmetric with respect to a point x_0 if

f(x_0 + δ) = f(x_0 - δ) for all δ

If the mean of a symmetric distribution exists, then it is the point of symmetry. If the distribution of X is symmetric w.r.t. its mean µ, then E\left((X - µ)^k\right) = 0 for k odd (if the central moment exists).

Skewness: E\left((X - µ)^3\right)/σ^3
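
A quick numeric illustration (not on the slide): the sample analogue of the skewness formula is near 0 for a symmetric distribution, and matches the known value 2 for the (asymmetric) standard exponential:

```python
import numpy as np

rng = np.random.default_rng(0)

def skewness(samples):
    """Sample version of E((X - mu)^3) / sigma^3."""
    mu = samples.mean()
    sigma = samples.std()
    return ((samples - mu)**3).mean() / sigma**3

# The normal distribution is symmetric about its mean, so its odd central
# moments (and hence its skewness) are 0; the exponential has skewness 2.
print(skewness(rng.normal(size=1_000_000)))        # approx. 0
print(skewness(rng.exponential(size=1_000_000)))   # approx. 2
```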

4.4 Moments: Moment generating function

Def (Moment Generating Function): Let X be a random variable. The function

ψ(t) = E\left(e^{tX}\right), t ∈ R

is called the moment generating function (m.g.f.) of X.

Theorem 4.4.2: Let X be a random variable whose m.g.f. ψ(t) is finite for t in an open interval around zero. Then the n-th moment of X is finite, for n = 1, 2, ..., and

E(X^n) = \frac{d^n}{dt^n} ψ(t) \Big|_{t=0}
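
A small symbolic sketch of Theorem 4.4.2 using SymPy, with the Bernoulli(p) m.g.f. ψ(t) = (1 - p) + p e^t as the example (this m.g.f. follows directly from the definition, since X takes only the values 0 and 1):

```python
import sympy as sp

t, p = sp.symbols('t p')

# m.g.f. of a Bernoulli(p) random variable: psi(t) = E(e^{tX}) = (1 - p) + p*e^t.
psi = (1 - p) + p * sp.exp(t)

# Theorem 4.4.2: the n-th moment is the n-th derivative of psi at t = 0.
first_moment = sp.diff(psi, t, 1).subs(t, 0)    # E(X)   = p
second_moment = sp.diff(psi, t, 2).subs(t, 0)   # E(X^2) = p
print(first_moment, second_moment)

# Recovers Var(X) = p(1 - p), matching Section 4.3.
print(sp.simplify(second_moment - first_moment**2))   # p - p**2
```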

4.4 Moments: Example

Let X ~ Gamma(n, β). Then X has the pdf

f(x) = \frac{1}{(n-1)! \, β^n} x^{n-1} e^{-x/β} for x > 0

Find the m.g.f. of X and use it to find the mean and the variance of X.
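
The exercise (answer not on the slide) works out to ψ(t) = (1 - βt)^{-n} for t < 1/β, which gives E(X) = nβ and Var(X) = nβ^2. A symbolic check of the differentiation step:

```python
import sympy as sp

t, beta, n = sp.symbols('t beta n', positive=True)

# The Gamma(n, beta) m.g.f. from the exercise: psi(t) = (1 - beta*t)^(-n),
# finite for t < 1/beta.
psi = (1 - beta * t)**(-n)

# Theorem 4.4.2: differentiate at t = 0.
mean = sp.diff(psi, t, 1).subs(t, 0)
second = sp.diff(psi, t, 2).subs(t, 0)
print(sp.simplify(mean))                   # beta*n      -> E(X)   = n*beta
print(sp.simplify(second - mean**2))       # beta**2*n   -> Var(X) = n*beta^2
```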

4.4 Moments: Properties of m.g.f.

Theorems 4.4.3 and 4.4.4:

- ψ_{aX+b}(t) = e^{bt} ψ_X(at)
- Let Y = \sum_{i=1}^n X_i where X_1, ..., X_n are independent random variables with m.g.f.s ψ_i(t) for i = 1, ..., n. Then
  ψ_Y(t) = \prod_{i=1}^n ψ_i(t)

Theorem 4.4.5 (Uniqueness of the m.g.f.): Let X and Y be two random variables with m.g.f.s ψ_X(t) and ψ_Y(t). If the m.g.f.s are finite and ψ_X(t) = ψ_Y(t) for all values of t in an open interval around zero, then X and Y have the same distribution.
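
An illustration of Theorem 4.4.4 (not on the slide), tying back to the Binomial example of Sections 4.2 and 4.3: the m.g.f. of a sum of i.i.d. Bernoulli(p) variables is the n-fold product of the Bernoulli m.g.f., which is the Binomial(n, p) m.g.f.:

```python
import sympy as sp

t, p = sp.symbols('t p')
n = 4                                     # a concrete n for the demo

# Y = X_1 + ... + X_n with X_i i.i.d. Bernoulli(p): by Theorem 4.4.4,
# psi_Y(t) = ((1 - p) + p*e^t)^n, the Binomial(n, p) m.g.f.
psi_bernoulli = (1 - p) + p * sp.exp(t)
psi_Y = psi_bernoulli**n

# Its first derivative at t = 0 recovers E(Y) = n*p, matching Section 4.2.
print(sp.simplify(sp.diff(psi_Y, t).subs(t, 0)))   # 4*p
```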

4.4 Moments: Example

Let X ~ N(µ, σ^2). X has the pdf

f(x) = \frac{1}{σ\sqrt{2π}} \exp\left(-\frac{(x - µ)^2}{2σ^2}\right)

and the m.g.f. for the normal distribution is

ψ(t) = \exp\left(µt + \frac{t^2 σ^2}{2}\right)

Show that ψ(t) is the m.g.f. of X.
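
A symbolic check of the exercise (SymPy can complete the square and evaluate the Gaussian integral directly):

```python
import sympy as sp

x, t, mu = sp.symbols('x t mu', real=True)
sigma = sp.symbols('sigma', positive=True)

# E(e^{tX}) for X ~ N(mu, sigma^2): integrate e^{tx} * f(x) over the real line.
# The result should equal exp(mu*t + sigma^2 * t^2 / 2).
f = sp.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * sp.sqrt(2 * sp.pi))
psi = sp.integrate(sp.exp(t * x) * f, (x, -sp.oo, sp.oo))
print(sp.simplify(psi))       # exp(mu*t + sigma**2*t**2/2), possibly factored
```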