Lecture 10. Variance and standard deviation


18.440: Lecture 10 Variance and standard deviation
Scott Sheffield, MIT

Outline: defining variance, examples, properties, decomposition trick.

Recall definitions for expectation

Recall: a random variable X is a function from the state space to the real numbers. We can interpret X as a quantity whose value depends on the outcome of an experiment. Say X is a discrete random variable if (with probability one) it takes one of a countable set of values. For each a in this countable set, write p(a) := P{X = a}. Call p the probability mass function. The expectation of X, written E[X], is defined by

E[X] = Σ_{x: p(x)>0} x·p(x).

More generally, E[g(X)] = Σ_{x: p(x)>0} g(x)·p(x).
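As a quick illustration of these definitions (my own sketch, not part of the lecture), both E[X] and E[g(X)] can be computed directly from a probability mass function:

```python
# E[g(X)] = sum over x with p(x) > 0 of g(x) * p(x), for a discrete pmf.

def expectation(pmf, g=lambda x: x):
    """pmf: dict mapping value -> probability. Returns E[g(X)]."""
    return sum(g(x) * p for x, p in pmf.items() if p > 0)

die = {k: 1 / 6 for k in range(1, 7)}  # standard fair die
print(expectation(die))                    # E[X] = 3.5
print(expectation(die, lambda x: x * x))   # E[X^2] = 91/6
```

Passing g as an argument mirrors the slide's formula for E[g(X)]: the same sum over the support, with g(x) in place of x.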

Defining variance

Let X be a random variable with mean µ. The variance of X, denoted Var(X), is defined by Var(X) = E[(X − µ)²]. Taking g(x) = (x − µ)² and recalling that E[g(X)] = Σ_{x: p(x)>0} g(x)·p(x), we find that

Var[X] = Σ_{x: p(x)>0} (x − µ)²·p(x).

Variance is one way to measure the amount a random variable varies from its mean over successive trials.

Very important alternate formula

Let X be a random variable with mean µ. We introduced above the formula Var(X) = E[(X − µ)²]. This can be written Var[X] = E[X² − 2Xµ + µ²]. By additivity of expectation, this is the same as

E[X²] − 2µE[X] + µ² = E[X²] − µ².

This gives us our very important alternate formula: Var[X] = E[X²] − (E[X])². The original formula gives an intuitive idea of what variance is (the expected square of the difference from the mean), but we will often use this alternate formula when we have to actually compute the variance.
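The two formulas can be checked against each other numerically. A minimal Python sketch (mine, not from the lecture) computing the variance both ways on an arbitrary pmf:

```python
# Compare Var(X) = E[(X - mu)^2] against Var(X) = E[X^2] - (E[X])^2.

def variance_both_ways(pmf):
    """pmf: dict mapping value -> probability. Returns (definition, alternate)."""
    mu = sum(x * p for x, p in pmf.items())
    by_definition = sum((x - mu) ** 2 * p for x, p in pmf.items())
    ex2 = sum(x ** 2 * p for x, p in pmf.items())
    return by_definition, ex2 - mu ** 2

# Number of heads in two fair coin tosses.
d, a = variance_both_ways({0: 0.25, 1: 0.5, 2: 0.25})
print(d, a)  # both equal 0.5
```

The two return values agree for any pmf, exactly as the algebra on the slide shows.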

Outline: defining variance, examples, properties, decomposition trick.

Variance examples

If X is the number on a standard die roll, what is Var[X]?

Var[X] = E[X²] − E[X]² = (1/6)(1² + 2² + 3² + 4² + 5² + 6²) − (7/2)² = 91/6 − 49/4 = 35/12.

Let Y be the number of heads in two fair coin tosses. What is Var[Y]? Recall P{Y = 0} = 1/4, P{Y = 1} = 1/2, and P{Y = 2} = 1/4. Then

Var[Y] = E[Y²] − E[Y]² = 0²·(1/4) + 1²·(1/2) + 2²·(1/4) − 1² = 1/2.
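Both of these examples can be verified exactly with rational arithmetic. A short check (my own, not in the slides):

```python
from fractions import Fraction as F

def var(pmf):
    """Var[X] = E[X^2] - (E[X])^2 for a discrete pmf given as value -> probability."""
    mu = sum(x * p for x, p in pmf.items())
    return sum(x ** 2 * p for x, p in pmf.items()) - mu ** 2

# Fair die: 91/6 - 49/4 = 35/12.
var_die = var({k: F(1, 6) for k in range(1, 7)})
print(var_die)  # 35/12

# Y = number of heads in two fair coin tosses: 3/2 - 1 = 1/2.
var_coin = var({0: F(1, 4), 1: F(1, 2), 2: F(1, 4)})
print(var_coin)  # 1/2
```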

More variance examples

You buy a lottery ticket that gives you a one-in-a-million chance to win a million dollars. Let X be the amount you win. What is the expectation of X? How about the variance? Variance is more sensitive than expectation to rare outlier events.

At a particular party, there are four five-foot-tall people, five six-foot-tall people, and one seven-foot-tall person. You pick one of these people uniformly at random. What is the expected height of the person you pick? The variance?
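Working out both examples exactly (my own computation, not given on the slide):

```python
from fractions import Fraction as F

def mean_var(pmf):
    """Return (E[X], Var[X]) for a discrete pmf given as value -> probability."""
    mu = sum(x * p for x, p in pmf.items())
    return mu, sum(x ** 2 * p for x, p in pmf.items()) - mu ** 2

# Lottery: win 10^6 dollars with probability 10^-6, else 0.
lottery = {10 ** 6: F(1, 10 ** 6), 0: 1 - F(1, 10 ** 6)}
print(mean_var(lottery))  # mean 1, variance 999999: huge variance, modest mean

# Heights: four 5-footers, five 6-footers, one 7-footer, picked uniformly.
heights = {5: F(4, 10), 6: F(5, 10), 7: F(1, 10)}
print(mean_var(heights))  # mean 57/10, variance 41/100
```

The lottery numbers illustrate the slide's point: the expectation is a modest 1 dollar, but the rare outlier makes the variance enormous.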

Outline: defining variance, examples, properties, decomposition trick.

Identity

If Y = X + b, where b is constant, does it follow that Var[Y] = Var[X]? Yes: adding a constant shifts the mean by b as well, so (Y − E[Y])² = (X − E[X])² and the variance is unchanged.

We showed earlier that E[aX] = aE[X]. We claim that Var[aX] = a²Var[X].

Proof: Var[aX] = E[a²X²] − (E[aX])² = a²E[X²] − a²(E[X])² = a²Var[X].
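A toy verification of both identities on a die roll (my own sketch, not from the lecture):

```python
from fractions import Fraction as F

def var(pmf):
    """Var[X] = E[X^2] - (E[X])^2 for a discrete pmf given as value -> probability."""
    mu = sum(x * p for x, p in pmf.items())
    return sum(x ** 2 * p for x, p in pmf.items()) - mu ** 2

die = {k: F(1, 6) for k in range(1, 7)}
shifted = {k + 10: p for k, p in die.items()}  # Y = X + 10: same spread, shifted mean
scaled = {3 * k: p for k, p in die.items()}    # Y = 3X: spread stretched by a factor of 3

print(var(shifted) == var(die))      # True: Var[X + b] = Var[X]
print(var(scaled) == 9 * var(die))   # True: Var[aX] = a^2 Var[X] with a = 3
```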

Standard deviation

Write SD[X] = √Var[X]. It satisfies the identity SD[aX] = |a|·SD[X], and it uses the same units as X itself. If we switch from feet to inches in our height-of-randomly-chosen-person example, then X, E[X], and SD[X] each get multiplied by 12, but Var[X] gets multiplied by 144.

Outline: defining variance, examples, properties, decomposition trick.

Number of aces

Choose five cards from a standard deck of 52 cards. Let A be the number of aces you see. Compute E[A] and Var[A].

How many five-card hands are there in total? Answer: C(52, 5).

How many such hands have exactly k aces? Answer: C(4, k)·C(48, 5 − k).

So P{A = k} = C(4, k)·C(48, 5 − k) / C(52, 5).

So E[A] = Σ_{k=0}^{4} k·P{A = k}, and Var[A] = Σ_{k=0}^{4} k²·P{A = k} − E[A]².
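These sums can be evaluated exactly from the counting formula. A short sketch (mine, not part of the slides) using `math.comb` for the binomial coefficients:

```python
from fractions import Fraction as F
from math import comb

# P{A = k} = C(4, k) * C(48, 5 - k) / C(52, 5), the hypergeometric pmf.
pmf = {k: F(comb(4, k) * comb(48, 5 - k), comb(52, 5)) for k in range(5)}
assert sum(pmf.values()) == 1  # sanity check: probabilities sum to one

EA = sum(k * p for k, p in pmf.items())
VarA = sum(k ** 2 * p for k, p in pmf.items()) - EA ** 2
print(EA)    # 5/13
print(VarA)  # 940/2873, i.e. 105/221 - 25/169
```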

Number of aces revisited

Choose five cards from a standard deck of 52 cards. Let A be the number of aces you see. Choose the five cards in order, and let A_i be 1 if the ith card chosen is an ace and zero otherwise.

Then A = Σ_{i=1}^{5} A_i, and E[A] = Σ_{i=1}^{5} E[A_i] = 5/13.

Now A² = (A_1 + A_2 + ... + A_5)² can be expanded into 25 terms: A² = Σ_{i=1}^{5} Σ_{j=1}^{5} A_i·A_j. So E[A²] = Σ_{i=1}^{5} Σ_{j=1}^{5} E[A_i·A_j].

There are five terms of the form E[A_i·A_j] with i = j, and twenty with i ≠ j. The first five contribute 1/13 each. How about the other twenty? E[A_i·A_j] = (1/13)(3/51) = (1/13)(1/17). So

E[A²] = 5/13 + 20/(13·17) = 105/(13·17).

Var[A] = E[A²] − E[A]² = 105/(13·17) − 25/(13·13).
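The indicator computation can be cross-checked against the direct pmf from the previous slide. A quick verification (my own, not from the lecture):

```python
from fractions import Fraction as F
from math import comb

# Indicator argument: E[A^2] = 5 * (1/13) + 20 * (1/13) * (3/51) = 105/221.
EA2_indicators = 5 * F(1, 13) + 20 * F(1, 13) * F(3, 51)

# Direct computation from P{A = k} = C(4, k) * C(48, 5 - k) / C(52, 5).
pmf = {k: F(comb(4, k) * comb(48, 5 - k), comb(52, 5)) for k in range(5)}
EA2_direct = sum(k ** 2 * p for k, p in pmf.items())

print(EA2_indicators == EA2_direct)  # True
print(EA2_indicators)                # 105/221
```

That the two routes agree is the point of the decomposition trick: expanding the square of a sum of indicators is often far easier than summing k²·P{A = k} directly.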

Hat problem variance

In the n-hat shuffle problem, let X be the number of people who get their own hat. What is Var[X]?

We showed earlier that E[X] = 1. So Var[X] = E[X²] − 1. But how do we compute E[X²]?

Decomposition trick: write the variable as a sum of simple variables. Let X_i be one if the ith person gets their own hat and zero otherwise. Then X = X_1 + X_2 + ... + X_n = Σ_{i=1}^{n} X_i.

We want to compute E[(X_1 + X_2 + ... + X_n)²]. Expanding this out and using linearity of expectation:

E[Σ_{i=1}^{n} Σ_{j=1}^{n} X_i·X_j] = Σ_{i=1}^{n} Σ_{j=1}^{n} E[X_i·X_j] = n·(1/n) + n(n − 1)·(1/(n(n − 1))) = 1 + 1 = 2.

So Var[X] = E[X²] − (E[X])² = 2 − 1 = 1.
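For small n, the claim E[X] = 1 and Var[X] = 1 can be confirmed by brute force over all permutations. A verification sketch (mine; the slide's argument of course covers every n ≥ 2):

```python
from fractions import Fraction as F
from itertools import permutations

def moments(n):
    """Exact (E[X], E[X^2]) where X = number of fixed points of a uniform permutation."""
    perms = list(permutations(range(n)))
    fixed = [sum(1 for i, v in enumerate(p) if i == v) for p in perms]
    m = len(perms)
    return F(sum(fixed), m), F(sum(x * x for x in fixed), m)

for n in (2, 3, 4, 5):
    EX, EX2 = moments(n)
    print(n, EX, EX2 - EX ** 2)  # mean 1 and variance 1 for every n
```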

MIT OpenCourseWare, http://ocw.mit.edu
18.440 Probability and Random Variables, Spring 2014
For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.