Expectation, Variance and Standard Deviation for Continuous Random Variables Class 6, Jeremy Orloff and Jonathan Bloom


Class 6, 18.05
Jeremy Orloff and Jonathan Bloom

1 Learning Goals

1. Be able to compute and interpret expectation, variance, and standard deviation for continuous random variables.

2. Be able to compute and interpret quantiles for discrete and continuous random variables.

2 Introduction

So far we have looked at expected value, standard deviation, and variance for discrete random variables. These summary statistics have the same meaning for continuous random variables:

The expected value µ = E(X) is a measure of location or central tendency.

The standard deviation σ is a measure of the spread or scale.

The variance σ² = Var(X) is the square of the standard deviation.

To move from discrete to continuous, we will simply replace the sums in the formulas by integrals. We will do this carefully and go through many examples in the following sections. In the last section, we will introduce another type of summary statistic, quantiles. You may already be familiar with the 0.5 quantile of a distribution, otherwise known as the median or 50th percentile.

3 Expected value of a continuous random variable

Definition: Let X be a continuous random variable with range [a, b] and probability density function f(x). The expected value of X is defined by

    E(X) = ∫_a^b x f(x) dx.

Let's see how this compares with the formula for a discrete random variable:

    E(X) = Σ_{i=1}^n x_i p(x_i).

The discrete formula says to take a weighted sum of the values x_i of X, where the weights are the probabilities p(x_i). Recall that f(x) is a probability density; its units are prob/(unit of X).
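The defining integral can be checked numerically. The following sketch (in Python rather than the course's R, purely for illustration) approximates E(X) with a midpoint Riemann sum; the helper name `expected_value` is ours, not from the notes. It uses the density f(x) = (3/8)x² on [0, 2] from Example 2, whose exact mean is 3/2.

```python
# Approximate E(X) = ∫_a^b x f(x) dx by a midpoint Riemann sum.
def expected_value(f, a, b, n=100_000):
    """Approximate E(X) for a density f supported on [a, b]."""
    dx = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * dx   # midpoint of the i-th subinterval
        total += x * f(x) * dx   # weight the value x by probability f(x) dx
    return total

# Density from Example 2: f(x) = (3/8) x^2 on [0, 2]; exact mean is 1.5.
f = lambda x: 3 / 8 * x ** 2
print(expected_value(f, 0, 2))   # ≈ 1.5
```

This mirrors the discrete formula exactly: the sum weights each value x by the probability mass f(x) dx in a small interval around it.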

So f(x) dx represents the probability that X is in an infinitesimal range of width dx around x. Thus we can interpret the formula for E(X) as a weighted integral of the values x of X, where the weights are the probabilities f(x) dx. As before, the expected value is also called the mean or average.

3.1 Examples

Let's go through several example computations. Where the solution requires an integration technique, we push the computation of the integral to the appendix.

Example 1. Let X ~ uniform(0, 1). Find E(X).

answer: X has range [0, 1] and density f(x) = 1. Therefore,

    E(X) = ∫_0^1 x dx = x²/2 |_0^1 = 1/2.

Not surprisingly the mean is at the midpoint of the range.

Example 2. Let X have range [0, 2] and density f(x) = (3/8)x². Find E(X).

answer:

    E(X) = ∫_0^2 x f(x) dx = ∫_0^2 (3/8)x³ dx = (3/32)x⁴ |_0^2 = 3/2.

Does it make sense that the mean of this X is in the right half of its range?

answer: Yes. Since the probability density increases as x increases over the range, the average value of x should be in the right half of the range. µ is pulled to the right of the midpoint because there is more mass to the right.

[Figure: the pdf f(x) on [0, 2] with the mean µ = 1.5 marked to the right of the midpoint.]

Example 3. Let X ~ exp(λ). Find E(X).

answer: The range of X is [0, ∞) and its pdf is f(x) = λe^{-λx}. So (details in the appendix)

    E(X) = ∫_0^∞ x λe^{-λx} dx = [-xe^{-λx} - e^{-λx}/λ]_0^∞ = 1/λ.

[Figure: the pdf f(x) = λe^{-λx} with the mean µ = 1/λ marked. Mean of an exponential random variable.]
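Example 3's answer can also be sanity-checked by simulation. This Python sketch (illustrative only; the course uses R) draws exponential samples and checks that their average lands near 1/λ:

```python
import random

# Monte Carlo check that E(X) = 1/λ for X ~ exp(λ) (Example 3).
random.seed(0)                   # reproducible draws
lam = 2.0
n = 200_000
samples = [random.expovariate(lam) for _ in range(n)]
mean = sum(samples) / n
print(mean)                      # close to 1/λ = 0.5
```

By the law of large numbers the sample mean converges to E(X), so with 200,000 draws the estimate is within a few thousandths of 0.5.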

Example 4. Let Z ~ N(0, 1). Find E(Z).

answer: The range of Z is (-∞, ∞) and its pdf is φ(z) = (1/√(2π)) e^{-z²/2}. So (details in the appendix)

    E(Z) = (1/√(2π)) ∫_{-∞}^∞ z e^{-z²/2} dz = -(1/√(2π)) e^{-z²/2} |_{-∞}^∞ = 0.

[Figure: the standard normal pdf φ(z) with the mean µ = 0 marked.] The standard normal distribution is symmetric and has mean 0.

3.2 Properties of E(X)

The properties of E(X) for continuous random variables are the same as for discrete ones:

1. If X and Y are random variables on a sample space Ω then E(X + Y) = E(X) + E(Y). (linearity I)

2. If a and b are constants then E(aX + b) = aE(X) + b. (linearity II)

Example 5. In this example we verify that for X ~ N(µ, σ²) we have E(X) = µ.

answer: Example 4 showed that for standard normal Z, E(Z) = 0. We could mimic the calculation there to show that E(X) = µ. Instead we will use the linearity properties of E(X). In the class 5 notes on manipulating random variables we showed that if X ~ N(µ, σ²) is a normal random variable we can standardize it:

    Z = (X - µ)/σ ~ N(0, 1).

Inverting this formula we have X = σZ + µ. The linearity of expected value now gives

    E(X) = E(σZ + µ) = σE(Z) + µ = µ.

3.3 Expectation of Functions of X

This works exactly the same as the discrete case: if h(x) is a function, then Y = h(X) is a random variable and

    E(Y) = E(h(X)) = ∫ h(x) f_X(x) dx.

Example 6. Let X ~ exp(λ). Find E(X²).

answer: Using integration by parts we have

    E(X²) = ∫_0^∞ x² λe^{-λx} dx = [-x²e^{-λx} - (2x/λ)e^{-λx} - (2/λ²)e^{-λx}]_0^∞ = 2/λ².
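The formula E(h(X)) = ∫ h(x) f_X(x) dx is easy to check numerically for Example 6. This Python sketch (illustrative, not from the notes; the helper name `second_moment` is ours) integrates x² λe^{-λx} with a midpoint rule, truncating the infinite range where the exponential tail is negligible:

```python
import math

# Check E(X^2) = 2/λ^2 for X ~ exp(λ) (Example 6) by truncated
# midpoint integration of ∫_0^∞ x^2 · λ e^{-λx} dx.
def second_moment(lam, upper=40.0, n=200_000):
    dx = upper / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * dx                      # midpoint of subinterval
        total += x ** 2 * lam * math.exp(-lam * x) * dx
    return total

print(second_moment(1.0))   # ≈ 2/1^2 = 2
```

Truncating at x = 40 is safe here because the integrand decays like e^{-λx}, so the discarded tail is far below the numerical error of the sum.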

4 Variance

Now that we've defined expectation for continuous random variables, the definition of variance is identical to that of discrete random variables.

Definition: Let X be a continuous random variable with mean µ. The variance of X is

    Var(X) = E((X - µ)²).

4.1 Properties of Variance

These are exactly the same as in the discrete case.

1. If X and Y are independent then Var(X + Y) = Var(X) + Var(Y).

2. For constants a and b, Var(aX + b) = a² Var(X).

3. Theorem: Var(X) = E(X²) - E(X)² = E(X²) - µ².

For Property 1, note carefully the requirement that X and Y are independent. Property 3 gives a formula for Var(X) that is often easier to use in hand calculations. The proofs of Properties 2 and 3 are essentially identical to those in the discrete case. We will not give them here.

Example 7. Let X ~ uniform(0, 1). Find Var(X) and σ_X.

answer: In Example 1 we found µ = 1/2. Next we compute

    Var(X) = E((X - µ)²) = ∫_0^1 (x - 1/2)² dx = 1/12.

So σ_X = 1/√12.

Example 8. Let X ~ exp(λ). Find Var(X) and σ_X.

answer: In Examples 3 and 6 we computed

    E(X) = ∫_0^∞ xλe^{-λx} dx = 1/λ   and   E(X²) = ∫_0^∞ x²λe^{-λx} dx = 2/λ².

So by Property 3,

    Var(X) = E(X²) - E(X)² = 2/λ² - 1/λ² = 1/λ²   and   σ_X = 1/λ.

We could have skipped Property 3 and computed this directly from Var(X) = ∫_0^∞ (x - 1/λ)² λe^{-λx} dx.

Example 9. Let Z ~ N(0, 1). Show Var(Z) = 1.

Note: The notation for normal variables is X ~ N(µ, σ²). This is certainly suggestive, but as mathematicians we need to prove that E(X) = µ and Var(X) = σ². Above we showed E(X) = µ. This example shows that Var(Z) = 1, just as the notation suggests. In the next example we'll show Var(X) = σ².

answer: Since E(Z) = 0, we have

    Var(Z) = E(Z²) = (1/√(2π)) ∫_{-∞}^∞ z² e^{-z²/2} dz.
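Property 3 and Example 7 can be checked together by simulation. This Python sketch (an illustration under our own setup, not part of the notes) estimates Var(X) for X ~ uniform(0, 1) using Var(X) = E(X²) - E(X)², whose exact value is 1/12:

```python
import random

# Sanity check of Var(X) = E(X^2) - E(X)^2 for X ~ uniform(0, 1)
# (Examples 1 and 7): the exact answer is 1/12 ≈ 0.0833.
random.seed(1)
n = 200_000
xs = [random.random() for _ in range(n)]
mean = sum(xs) / n                               # estimates E(X) = 1/2
var = sum(x * x for x in xs) / n - mean ** 2     # E(X^2) - E(X)^2
print(var)                                       # ≈ 1/12
```

Note how the simulation uses the Property 3 form rather than E((X - µ)²) directly: it needs only two running sums over the data, which is why that form is the usual choice in hand and machine computation alike.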

Using integration by parts with u = z, v′ = ze^{-z²/2}, so u′ = 1, v = -e^{-z²/2}:

    Var(Z) = (1/√(2π)) ( -ze^{-z²/2} |_{-∞}^∞ + ∫_{-∞}^∞ e^{-z²/2} dz ).

The first term equals 0 because the exponential goes to zero much faster than z grows at both ±∞. The second term equals 1 because it is exactly the total probability integral of the pdf φ(z) for N(0, 1). So Var(Z) = 1.

Example 10. Let X ~ N(µ, σ²). Show Var(X) = σ².

answer: This is an exercise in change of variables. Letting z = (x - µ)/σ, we have

    Var(X) = E((X - µ)²) = (1/(σ√(2π))) ∫_{-∞}^∞ (x - µ)² e^{-(x-µ)²/2σ²} dx
           = (σ²/√(2π)) ∫_{-∞}^∞ z² e^{-z²/2} dz = σ².

The integral in the last line is the same one we computed for Var(Z).

5 Quantiles

Definition: The median of X is the value x for which P(X ≤ x) = 0.5, i.e. the value of x such that P(X ≤ x) = P(X ≥ x). In other words, X has equal probability of being above or below the median, and each probability is therefore 1/2. In terms of the cdf F(x) = P(X ≤ x), we can equivalently define the median as the value x satisfying F(x) = 0.5.

Think: What is the median of Z?

answer: By symmetry, the median is 0.

Example 11. Find the median of X ~ exp(λ).

answer: The cdf of X is F(x) = 1 - e^{-λx}. So the median is the value of x for which F(x) = 1 - e^{-λx} = 0.5. Solving for x we find: x = (ln 2)/λ.

Think: In this case the median does not equal the mean µ = 1/λ. Based on the graph of the pdf of X, can you argue why the median is to the left of the mean?

Definition: The p-th quantile of X is the value q_p such that P(X ≤ q_p) = p.

Notes. 1. In this notation the median is q_0.5. 2. We will usually write this in terms of the cdf: F(q_p) = p.

With respect to the pdf f(x), the quantile q_p is the value such that there is an area of p to the left of q_p and an area of 1 - p to the right of q_p. In the examples below, note how we can represent the quantile graphically using either the area under the pdf or the height of the cdf.

Example 12. Find the 0.6 quantile for X ~ U(0, 1).
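The defining equation F(q_p) = p says a quantile is found by inverting the cdf. When no closed form is available, that inversion can be done numerically; as a sketch (our own helper `quantile`, not from the notes), here is a bisection solver checked against the exponential median (ln 2)/λ from Example 11:

```python
import math

# The p-th quantile q_p solves F(q_p) = p.  Invert a continuous,
# increasing cdf by bisection on [lo, hi].
def quantile(cdf, p, lo=0.0, hi=100.0, tol=1e-10):
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if cdf(mid) < p:      # q_p lies to the right of mid
            lo = mid
        else:                 # q_p lies at or left of mid
            hi = mid
    return (lo + hi) / 2

# Exponential cdf with λ = 2; the median should be ln(2)/2 ≈ 0.3466.
lam = 2.0
F = lambda x: 1 - math.exp(-lam * x)
print(quantile(F, 0.5))
```

Bisection works here because a continuous cdf is monotone, so F(x) = p has a solution pinned between any lo with F(lo) < p and hi with F(hi) > p.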

answer: The cdf for X is F(x) = x on the range [0, 1]. So q_0.6 = 0.6.

[Figures: the pdf f(x) with left tail area 0.6 shaded up to q_0.6 = 0.6, and the cdf F(x) with F(q_0.6) = 0.6 marked.]

Example 13. Find the 0.6 quantile of the standard normal distribution.

answer: We don't have a formula for the cdf, so we use the R quantile function qnorm:

    q_0.6 = qnorm(0.6, 0, 1) = 0.2533

[Figures: the pdf φ(z) with left tail area 0.6 shaded up to q_0.6 = 0.253, and the cdf Φ(z) with Φ(q_0.6) = 0.6 marked.]

Quantiles give a useful measure of location for a random variable. We will use them more in coming lectures.

5.1 Percentiles, deciles, quartiles

For convenience, quantiles are often described in terms of percentiles, deciles or quartiles. The 60th percentile is the same as the 0.6 quantile. For example, you are in the 60th percentile for height if you are taller than 60 percent of the population, i.e. the probability that you are taller than a randomly chosen person is 60 percent.

Likewise, deciles represent steps of 1/10. The third decile is the 0.3 quantile. Quartiles are in steps of 1/4. The third quartile is the 0.75 quantile and the 75th percentile.

6 Appendix: Integral Computation Details

Example 3: Let X ~ exp(λ). Find E(X).

The range of X is [0, ∞) and its pdf is f(x) = λe^{-λx}. Therefore

    E(X) = ∫_0^∞ x f(x) dx = ∫_0^∞ λxe^{-λx} dx.
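If you prefer Python to R for the computation in Example 13, the standard library's statistics.NormalDist provides the same normal quantile via its inv_cdf method:

```python
from statistics import NormalDist

# Python counterpart of R's qnorm(0.6, 0, 1) from Example 13:
# the 0.6 quantile of the standard normal distribution.
q = NormalDist(mu=0, sigma=1).inv_cdf(0.6)
print(q)   # ≈ 0.2533
```

inv_cdf inverts the normal cdf, so this is exactly the q_0.6 satisfying Φ(q_0.6) = 0.6.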

Using integration by parts with u = x, v′ = λe^{-λx}, so u′ = 1, v = -e^{-λx}:

    E(X) = -xe^{-λx} |_0^∞ + ∫_0^∞ e^{-λx} dx = 0 + [-e^{-λx}/λ]_0^∞ = 1/λ.

We used the fact that xe^{-λx} and e^{-λx} go to 0 as x → ∞.

Example 4: Let Z ~ N(0, 1). Find E(Z).

The range of Z is (-∞, ∞) and its pdf is φ(z) = (1/√(2π)) e^{-z²/2}. By symmetry the mean must be 0. The only mathematically tricky part is to show that the integral converges, i.e. that the mean exists at all (some random variables do not have means, but we will not encounter this very often). For completeness we include the argument, though this is not something we will ask you to do. We first compute the integral from 0 to ∞:

    ∫_0^∞ z φ(z) dz = (1/√(2π)) ∫_0^∞ z e^{-z²/2} dz.

The u-substitution u = z²/2 gives du = z dz. So the integral becomes

    (1/√(2π)) ∫_0^∞ z e^{-z²/2} dz = (1/√(2π)) ∫_0^∞ e^{-u} du = -(1/√(2π)) e^{-u} |_0^∞ = 1/√(2π).

Similarly, ∫_{-∞}^0 z φ(z) dz = -1/√(2π). Adding the two pieces together gives E(Z) = 0.

Example 6: Let X ~ exp(λ). Find E(X²).

    E(X²) = ∫_0^∞ x² f(x) dx = ∫_0^∞ λx² e^{-λx} dx.

Using integration by parts with u = x², v′ = λe^{-λx}, so u′ = 2x, v = -e^{-λx}:

    E(X²) = -x²e^{-λx} |_0^∞ + ∫_0^∞ 2xe^{-λx} dx.

The first term is 0. For the second term use integration by parts again, with u = 2x, v′ = e^{-λx}, so u′ = 2, v = -e^{-λx}/λ:

    ∫_0^∞ 2xe^{-λx} dx = -(2x/λ)e^{-λx} |_0^∞ + (2/λ) ∫_0^∞ e^{-λx} dx = 0 + (2/λ)·(1/λ) = 2/λ².
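The u-substitution step above can be verified numerically. This Python sketch (an illustration we added; not part of the notes) computes the truncated integral ∫_0^∞ z e^{-z²/2} dz by a midpoint sum and checks it against the exact value 1 obtained from ∫_0^∞ e^{-u} du:

```python
import math

# Numerical check of the u-substitution result from the appendix:
# ∫_0^∞ z e^{-z²/2} dz = ∫_0^∞ e^{-u} du = 1 (truncating at z = 40,
# where the Gaussian tail is negligible).
def integral(n=200_000, upper=40.0):
    dz = upper / n
    return sum((i + 0.5) * dz * math.exp(-(((i + 0.5) * dz) ** 2) / 2) * dz
               for i in range(n))

print(integral())   # ≈ 1
```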

MIT OpenCourseWare
https://ocw.mit.edu

18.05 Introduction to Probability and Statistics
Spring 2014

For information about citing these materials or our Terms of Use, visit: https://ocw.mit.edu/terms.