STAT 430/510 Probability Lecture 7: Random Variable and Expectation Pengyuan (Penelope) Wang June 2, 2011
Review: Properties of Probability, Conditional Probability, the Law of Total Probability, Bayes' Formula, Independence.
Random Variables In most problems, we are interested only in a particular aspect of the outcomes of experiments. Example: When we toss 10 coins, we are interested in the total number of heads, and not the outcome for each coin.
Definition For a given sample space S, a random variable (r.v.) is a real-valued function defined over the elements of S. This looks abstract, but it simply means that we care about only one aspect of the outcome (e.g., the total number of heads), and we denote that aspect by a random variable.
Example Suppose that our experiment consists of tossing 3 fair coins. If we let Y denote the number of heads that appear, then Y is a random variable taking on one of the values 0, 1, 2, and 3.
Example Three balls are to be randomly selected without replacement from an urn containing 5 balls numbered 1 through 5, and we want to consider the biggest number we get. What is the random variable here? What are the values it may take?
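One way to answer the question is to enumerate all equally likely draws. The sketch below (names are my own, not from the lecture) takes X to be the biggest number among the three selected balls and tabulates its possible values and pmf:

```python
from itertools import combinations
from fractions import Fraction

# X = the largest number among 3 balls drawn without replacement
# from balls numbered 1..5; all C(5,3) = 10 draws are equally likely.
draws = list(combinations(range(1, 6), 3))
pmf = {}
for draw in draws:
    x = max(draw)                                  # X takes values 3, 4, 5
    pmf[x] = pmf.get(x, Fraction(0)) + Fraction(1, len(draws))

for x in sorted(pmf):
    print(f"P(X = {x}) = {pmf[x]}")
# P(X = 3) = 1/10
# P(X = 4) = 3/10
# P(X = 5) = 3/5
```

The smallest possible maximum is 3 (the draw {1, 2, 3}), so X takes the values 3, 4, and 5.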
Random Variables: Continued A random variable reflects the aspect of a random experiment that is of interest to us. There are two types of random variables. A discrete random variable takes discrete values; it may take a finite or a countably infinite number of values. A continuous random variable may take any value in an interval of numbers.
Example of Discrete Random Variables X = the sum of two tosses of a fair die. Possible values: {2, 3, 4, ..., 12}. X = the total number of coin tosses required to see a head. Possible values: {1, 2, 3, ...}. X is a discrete random variable; the number of possible values is infinite.
Probability Mass Function If X is a discrete random variable, then p(x) = P(X = x) is called the probability mass function (pmf) of X, where x ranges over the values that X can take.
Example Suppose that our experiment consists of tossing 3 fair coins. If we let Y denote the number of heads that appear, then Y is a random variable taking on one of the values 0, 1, 2, and 3 with respective probabilities p(0) = P{Y = 0} = P{(T, T, T)} = 1/8, p(1) = P{Y = 1} = P{(T, T, H), (T, H, T), (H, T, T)} = 3/8, p(2) = P{Y = 2} = P{(T, H, H), (H, T, H), (H, H, T)} = 3/8, p(3) = P{Y = 3} = P{(H, H, H)} = 1/8.
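The pmf above can be reproduced by enumerating all 8 equally likely outcomes; a minimal sketch:

```python
from itertools import product
from fractions import Fraction

# Enumerate all 2**3 = 8 equally likely outcomes of tossing 3 fair coins
# and tabulate the pmf of Y = number of heads.
outcomes = list(product("HT", repeat=3))
pmf = {y: Fraction(0) for y in range(4)}
for outcome in outcomes:
    y = outcome.count("H")
    pmf[y] += Fraction(1, len(outcomes))

for y in sorted(pmf):
    print(f"p({y}) = {pmf[y]}")
# p(0) = 1/8
# p(1) = 3/8
# p(2) = 3/8
# p(3) = 1/8
```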
Cumulative Distribution Function The cumulative distribution function F(x) of a discrete random variable X with pmf p(x) is given by F(x) = P(X ≤ x) = Σ_{y ≤ x} p(y). For any x, F(x) is the probability that the observed value of X will be at most x.
Example: Continued F(0) = P{Y ≤ 0} = 1/8, F(1) = P{Y ≤ 1} = 1/2, F(2) = P{Y ≤ 2} = 7/8, F(3) = P{Y ≤ 3} = 1.
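Since F is just a running total of the pmf, these four values follow from a cumulative sum; a sketch:

```python
from fractions import Fraction

# Build F(x) = P(Y <= x) from the pmf of Y (number of heads in 3 tosses)
# by accumulating probabilities in increasing order of x.
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
cdf = {}
total = Fraction(0)
for x in sorted(pmf):
    total += pmf[x]
    cdf[x] = total

for x in sorted(cdf):
    print(f"F({x}) = {cdf[x]}")
# F(0) = 1/8
# F(1) = 1/2
# F(2) = 7/8
# F(3) = 1
```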
Expected Value If X is a discrete random variable having a probability mass function p(x), then the expected value of X is defined by E(X) = Σ_x x p(x). E(X) is a weighted average of the possible values that X can take on, with each value weighted by its probability. It can be interpreted as the best guess for the value of X, in a long-run average sense.
Example: Continued Suppose that our experiment consists of tossing 3 fair coins. If we let Y denote the number of heads that appear, what is E(Y)? E(Y) = 0 · p(0) + 1 · p(1) + 2 · p(2) + 3 · p(3) = 3/2.
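The weighted-average definition translates directly into code; a minimal sketch using the pmf from the coin example:

```python
from fractions import Fraction

# E(Y) = sum over values y of y * p(y): a probability-weighted average.
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
expectation = sum(y * p for y, p in pmf.items())

print(expectation)  # 3/2
```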
Example Find E(X), where X is the outcome when we roll a fair die. Solution: E(X) = (1 + 2 + 3 + 4 + 5 + 6)/6 = 3.5.
Expectation of a Function of a Random Variable If X is a discrete random variable that takes the values x_i, i ≥ 1, with probabilities p(x_i), then for any function g, E[g(X)] = Σ_i g(x_i) p(x_i).
Example: Continued Suppose that our experiment consists of tossing 3 fair coins. If we let Y denote the number of heads that appear, what is E(Y^2)? E(Y^2) = 0^2 · p(0) + 1^2 · p(1) + 2^2 · p(2) + 3^2 · p(3) = 3.
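The formula E[g(X)] = Σ_i g(x_i) p(x_i) works for any g, so it is natural to write it as a small helper that takes the function as an argument; a sketch (the helper name is my own):

```python
from fractions import Fraction

# E[g(Y)] = sum over values y of g(y) * p(y); here g(y) = y**2.
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

def expectation_of(g, pmf):
    """Expected value of g(Y) for a discrete pmf given as {value: probability}."""
    return sum(g(y) * p for y, p in pmf.items())

print(expectation_of(lambda y: y ** 2, pmf))  # 3
```

Note that E(Y^2) = 3 differs from (E(Y))^2 = 9/4: in general E[g(X)] is not g(E[X]).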
Example Let X denote a random variable that takes on the values -1, 0, and 1 with respective probabilities P(X = -1) = 0.2, P(X = 0) = 0.5, P(X = 1) = 0.3. Compute E[X] and E[X - 1]. Solution: E[X] = (-1)(0.2) + (0)(0.5) + (1)(0.3) = 0.1. E[X - 1] = (-1 - 1)(0.2) + (0 - 1)(0.5) + (1 - 1)(0.3) = -0.9. So E[X - 1] = E[X] - 1.
Corollary If a and b are constants, then E[aX + b] = aE[X] + b.
Example Assume that E[X] = 1. Calculate E[7X + 2]. Solution: E[7X + 2] = 7E[X] + 2 = 9.
Corollary: Continued If X and Y are random variables, then E[X + Y] = E[X] + E[Y] and E[aX + bY] = aE[X] + bE[Y]. For random variables X_1, X_2, ..., X_n, E[Σ_{i=1}^n X_i] = Σ_{i=1}^n E[X_i].
Example If E(X^2) = 1, E(Y^2) = 2, and E(XY) = 1, calculate E((X + Y)(X - 2Y)). E((X + Y)(X - 2Y)) = E(X^2 + XY - 2XY - 2Y^2) = E(X^2) - E(XY) - 2E(Y^2) = 1 - 1 - 4 = -4.
Example: Continued Suppose that our experiment consists of tossing 3 fair coins. If we let Y denote the number of heads that appear, what is E(Y)? Use the summation trick.
Important Trick: Using the Summation Find the expected value of the sum obtained when n fair dice are rolled. Let X be the sum and let Y_i be the upturned value on die i. Then E[X] = E[Σ_{i=1}^n Y_i] = Σ_{i=1}^n E[Y_i] = 3.5n.
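The point of the trick is that E[X] = 3.5n follows without ever computing the pmf of the sum. The sketch below checks the claim against a brute-force enumeration for one small n (n = 3 is my choice for illustration; the lecture's result holds for any n):

```python
from fractions import Fraction
from itertools import product

# E[Y_i] for a single fair die: (1 + 2 + ... + 6)/6 = 7/2.
e_one_die = sum(Fraction(v, 6) for v in range(1, 7))

# Brute-force check for n = 3: average the sum over all 6**3 equally
# likely outcomes and compare with 3.5 * n from the summation trick.
n = 3
outcomes = list(product(range(1, 7), repeat=n))
e_sum = sum(Fraction(sum(o), len(outcomes)) for o in outcomes)

print(e_one_die)  # 7/2
print(e_sum)      # 21/2
```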
Important Trick: Method of Indicators For any event A, the indicator of A, I_A, equals 1 if A occurs and 0 otherwise. The expected value of I_A is the probability of A: E[I_A] = P(A). If X is the number of events that occur among some collection of events A_1, ..., A_n, then E[X] = E[I_{A_1}] + ... + E[I_{A_n}] = P(A_1) + ... + P(A_n).
Example Suppose that we want to match 10 couples into 10 pairs. What is the expected number of husbands that will be paired with their wives? Let X be the number of husbands that will be paired with their wives, and let I_i be the indicator of whether the ith husband is matched with his wife. Then E[X] = E[Σ_{i=1}^{10} I_i] = Σ_{i=1}^{10} E[I_i] = Σ_{i=1}^{10} 1/10 = 1.
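The indicator argument predicts an expected count of 1 for any number of couples. The sketch below verifies this exactly for a smaller instance (n = 5 couples, an assumption of mine to keep the enumeration small) by averaging the number of matches over all equally likely pairings:

```python
from fractions import Fraction
from itertools import permutations

# A pairing is a random permutation: husband i is paired with wife perm[i].
# Count matches (fixed points) in every permutation and take the exact average.
n = 5
perms = list(permutations(range(n)))
total_matches = sum(
    sum(1 for i in range(n) if perm[i] == i)  # indicator I_i for each husband
    for perm in perms
)
expected = Fraction(total_matches, len(perms))

print(expected)  # 1
```

Each I_i has expectation 1/n, so the n indicators always sum to an expected value of 1, matching the lecture's answer for n = 10.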