STAT 430/510 Probability Lecture 7: Random Variable and Expectation


Pengyuan (Penelope) Wang, June 2, 2011

Review: Properties of Probability; Conditional Probability; The Law of Total Probability; Bayes' Formula; Independence

Random Variables In most problems, we are interested only in a particular aspect of the outcomes of experiments. Example: When we toss 10 coins, we are interested in the total number of heads, and not the outcome for each coin.

Definition For a given sample space S, a random variable (r.v.) is a real-valued function defined over the elements of S. This sounds abstract, but it simply means that we care about only one aspect of the outcome (e.g., the total number of heads), and we denote that aspect by a random variable.

Example Suppose that our experiment consists of tossing 3 fair coins. If we let Y denote the number of heads that appear, then Y is a random variable taking on one of the values 0, 1, 2, and 3.

Example Three balls are to be randomly selected without replacement from an urn containing 5 balls numbered 1 through 5, and we are interested in the largest number selected. What is the random variable here? What values may it take?

Random Variables: Continued A random variable reflects the aspect of a random experiment that is of interest to us. There are two types of random variables: A discrete random variable takes discrete values; it may take a finite or countably infinite number of values. A continuous random variable may take any value in an interval of numbers.

Examples of Discrete Random Variables X = the sum of two rolls of a fair die. Possible values: {2, 3, 4, ..., 12}. X = the total number of coin tosses required to see the first head. Possible values: {1, 2, 3, ...}. X is a discrete random variable; the number of possible values is infinite.

Probability Mass Function If X is a discrete random variable, then p(x) = P(X = x) is called the probability mass function (pmf) of X, where x ranges over the values that X can take.

Example Suppose that our experiment consists of tossing 3 fair coins. If we let Y denote the number of heads that appear, then Y is a random variable taking on one of the values 0, 1, 2, and 3 with respective probabilities
p(0) = P{Y = 0} = P{(T,T,T)} = 1/8
p(1) = P{Y = 1} = P{(T,T,H), (T,H,T), (H,T,T)} = 3/8
p(2) = P{Y = 2} = P{(T,H,H), (H,T,H), (H,H,T)} = 3/8
p(3) = P{Y = 3} = P{(H,H,H)} = 1/8
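This pmf can be reproduced by brute-force enumeration of the 8 equally likely outcomes; a minimal Python sketch (variable names are illustrative, not from the lecture):

```python
from itertools import product
from fractions import Fraction

# Enumerate all 8 equally likely outcomes of tossing 3 fair coins.
outcomes = list(product("HT", repeat=3))

# Y = number of heads; build the pmf p(y) = P(Y = y).
pmf = {}
for outcome in outcomes:
    y = outcome.count("H")
    pmf[y] = pmf.get(y, 0) + Fraction(1, len(outcomes))

for y in sorted(pmf):
    print(y, pmf[y])  # prints: 0 1/8, 1 3/8, 2 3/8, 3 1/8
```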

Cumulative Distribution Function The cumulative distribution function F(x) of a discrete random variable X with pmf p(x) is given by F(x) = P(X ≤ x) = Σ_{y ≤ x} p(y). For any x, F(x) is the probability that the observed value of X will be at most x.

Example: Continued F(0) = P{Y ≤ 0} = 1/8, F(1) = P{Y ≤ 1} = 1/2, F(2) = P{Y ≤ 2} = 7/8, F(3) = P{Y ≤ 3} = 1.
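These CDF values follow from summing the pmf; a short sketch using the coin-toss pmf (the `cdf` helper is illustrative, not from the lecture):

```python
from fractions import Fraction

# pmf of Y = number of heads in 3 fair coin tosses.
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

def cdf(x):
    """F(x) = P(Y <= x): sum p(y) over all y <= x in the support."""
    return sum(p for y, p in pmf.items() if y <= x)

for x in range(4):
    print(x, cdf(x))  # prints: 0 1/8, 1 1/2, 2 7/8, 3 1
```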

Expected Value If X is a discrete random variable having a probability mass function p(x), then the expected value of X is defined by E(X) = Σ_x x p(x). E(X) is a weighted average of the possible values that X can take on, with each value weighted by its probability. It can be interpreted as a kind of best guess for the value of X.

Example: Continued Suppose that our experiment consists of tossing 3 fair coins. If we let Y denote the number of heads that appear, what is E(Y)? E(Y) = 0·p(0) + 1·p(1) + 2·p(2) + 3·p(3) = 3/2

Example Find E(X), where X is the outcome when we roll a fair die. Solution: E(X) = (1 + 2 + ... + 6)·(1/6) = 3.5
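The die expectation can be checked directly from the definition E(X) = Σ_x x p(x); a minimal sketch:

```python
from fractions import Fraction

# E(X) for one roll of a fair die: each face 1..6 has probability 1/6.
EX = sum(x * Fraction(1, 6) for x in range(1, 7))
print(EX)  # prints: 7/2, i.e. 3.5
```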

Expectation of a Random Variable If X is a discrete random variable that takes values x_i, i ≥ 1, with probabilities p(x_i), then for any function g, E[g(X)] = Σ_i g(x_i) p(x_i)

Example: Continued Suppose that our experiment consists of tossing 3 fair coins. If we let Y denote the number of heads that appear, what is E(Y²)? E(Y²) = 0²·p(0) + 1²·p(1) + 2²·p(2) + 3²·p(3) = 3
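Both E(Y) and E(Y²) come from the same formula E[g(Y)] = Σ g(y)p(y); a small illustrative helper (names are not from the lecture):

```python
from fractions import Fraction

# pmf of Y = number of heads in 3 fair coin tosses.
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

def expectation(g, pmf):
    """E[g(Y)] = sum of g(y) * p(y) over the support of Y."""
    return sum(g(y) * p for y, p in pmf.items())

print(expectation(lambda y: y, pmf))       # prints: 3/2
print(expectation(lambda y: y ** 2, pmf))  # prints: 3
```

Note that E(Y²) = 3 is not (E(Y))² = 9/4: in general E[g(X)] ≠ g(E[X]).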

Example Let X denote a random variable that takes on the values -1, 0, and 1 with respective probabilities P(X = -1) = 0.2, P(X = 0) = 0.5, P(X = 1) = 0.3. Compute E[X] and E[X - 1]. Solution: E[X] = (-1)·P(X = -1) + 0·P(X = 0) + 1·P(X = 1) = 0.1 Solution: E[X - 1] = (-1 - 1)·P(X = -1) + (0 - 1)·P(X = 0) + (1 - 1)·P(X = 1) = -0.9 So E[X - 1] = E[X] - 1.

Corollary If a and b are constants, then E[aX + b] = aE[X] + b

Example Assume that E[X] = 1. Calculate E[7X + 2]. Solution: E[7X + 2] = 7E[X] + 2 = 9

Corollary: Continued If X and Y are random variables, then E[X + Y] = E[X] + E[Y] and E[aX + bY] = aE[X] + bE[Y]. For random variables X_1, X_2, ..., X_n, E[Σ_{i=1}^n X_i] = Σ_{i=1}^n E[X_i]

Example If E(X²) = 1, E(Y²) = 2, and E(XY) = 1, calculate E((X + Y)(X - 2Y)). E((X + Y)(X - 2Y)) = E(X² + XY - 2XY - 2Y²) = E(X²) - E(XY) - 2E(Y²) = 1 - 1 - 4 = -4
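The key step is the algebraic expansion before linearity is applied; a quick numeric spot-check of the identity (the test values are arbitrary):

```python
# Spot-check the identity used above:
# (x + y)(x - 2y) = x^2 - x*y - 2*y^2 for any numbers x, y.
for x, y in [(1, 2), (-3, 1), (5, -4)]:
    assert (x + y) * (x - 2 * y) == x * x - x * y - 2 * y * y
print("identity holds on all test points")
```

With the identity verified, linearity of expectation gives E(X²) - E(XY) - 2E(Y²) = 1 - 1 - 4 = -4.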

Example: Continued Suppose that our experiment consists of tossing 3 fair coins, and let Y denote the number of heads that appear. What is E(Y)? Use the trick of summation.

Important Trick: Using the Summation Find the expected value of the sum obtained when n fair dice are rolled. Let X be the sum, and let Y_i be the upturned value on die i. Then E[X] = E[Σ_{i=1}^n Y_i] = Σ_{i=1}^n E[Y_i] = 3.5n
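The 3.5n answer can be checked by simulation; a minimal Monte Carlo sketch (the seed, n, and trial count are illustrative choices):

```python
import random

# Monte Carlo check that E[sum of n fair dice] = 3.5 * n, via linearity.
random.seed(0)
n, trials = 10, 100_000
total = sum(sum(random.randint(1, 6) for _ in range(n)) for _ in range(trials))
estimate = total / trials
print(round(estimate, 2))  # close to 3.5 * n = 35.0
```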

Important Trick: Method of Indicators For any event A, the indicator of A, written I_A, equals 1 if A occurs and 0 otherwise. The expected value of I_A is the probability of A: E[I_A] = P(A). If X is the number of events that occur among some collection of events A_1, ..., A_n, then E[X] = E[I_{A_1}] + ... + E[I_{A_n}] = P(A_1) + ... + P(A_n)

Example Suppose that we randomly pair the husbands and wives of 10 couples. What is the expected number of husbands that will be paired with their own wives? Let X be the number of husbands paired with their wives, and let I_i be the indicator that the i-th husband is matched with his wife. Then E[X] = E[Σ_{i=1}^{10} I_i] = Σ_{i=1}^{10} E[I_i] = Σ_{i=1}^{10} 1/10 = 1
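The answer E[X] = 1 (which does not depend on the number of couples, since 10·(1/10) = 1) can be checked by simulation; a sketch with illustrative parameters:

```python
import random

# Monte Carlo check of the matching example: shuffle 10 "wives" and
# count fixed points (husband i matched with wife i).
random.seed(0)
trials = 100_000
matches = 0
for _ in range(trials):
    wives = list(range(10))
    random.shuffle(wives)
    matches += sum(1 for i, w in enumerate(wives) if i == w)
estimate = matches / trials
print(round(estimate, 2))  # close to E[X] = 1
```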