Feb 28th, 2018 Lecture 10: Random variables
Countdown to midterm (March 21st): 28 days
Week 1: Chapter 1 (Axioms of probability)
Week 2: Chapter 3 (Conditional probability and independence)
Week 4: Chapters 4, 5, 6, 7 (Random variables)
Week 9: Chapters 8, 9 (Bivariate and multivariate distributions)
Week 10: Chapter 10 (Expectations and variances)
Week 11: Chapter 11 (Limit theorems)
Week 12: Chapters 12, 13 (Selected topics)
Order of topics:
1. Discrete random variables (Chap 4)
2. Continuous random variables (Chap 6)
3. Special discrete distributions (Chap 5)
4. Special continuous distributions (Chap 7)
Chapter 4: Discrete random variables
4.1 Random variables
4.3 Discrete random variables
4.4 Expectations of discrete random variables
4.5 Variances and moments of discrete random variables
4.2 Distribution functions
Random variable
Definition. Let S be the sample space of an experiment. A real-valued function X : S → R is called a random variable of the experiment.
Example: tossing a coin two times
Experiment: toss a fair coin 2 times.
Let X be the number of heads; X is a random variable.
What are all possible values of X? What is the probability that X = 1?
A random variable is a function...
X is a function that maps S = {HH, HT, TH, TT} to {0, 1, 2} such that
HH ↦ 2, HT ↦ 1, TH ↦ 1, TT ↦ 0
... and its value is random
x        0     1    2
P(X = x) 0.25  0.5  0.25
Remark: the probability on the sample space induces a new probability on R.
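The induced probability can be computed by pushing the probability of each outcome in S through the function X. A minimal sketch (not from the lecture) for the two-coin example:

```python
from itertools import product
from fractions import Fraction

# Sample space of two fair coin tosses; each outcome has probability 1/4.
sample_space = ["".join(t) for t in product("HT", repeat=2)]
p_outcome = Fraction(1, 4)

# X maps each outcome to its number of heads.
def X(s):
    return s.count("H")

# The probability on S induces a pmf on the values of X:
# p(x) = P({s in S : X(s) = x})
pmf = {}
for s in sample_space:
    pmf[X(s)] = pmf.get(X(s), 0) + p_outcome

# pmf is now {0: 1/4, 1: 1/2, 2: 1/4}, matching the table above.
```

This answers the earlier question directly: P(X = 1) = pmf[1] = 1/2.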
Example: Brownian motion
Experiment: let a pollen grain move in water, starting at a given position, for 1 s.
Einstein's question: the trajectory of the pollen grain.
An outcome: a record of the locations and velocities of all the molecules in the system over time.
The sample space: the set of all possible outcomes.
Einstein's approach
Describes the increment of particle positions in an unrestricted one-dimensional domain as a random variable.
Computes the probability law of the location of the particle after time t.
(Enhanced) definition of random variables
Definition. Let S be the sample space of an experiment [that is so complicated and obscure that we cannot learn anything from it]. A random variable of the experiment is just a real-valued function X : S → R [for which people try to forget about its domain of definition].
Notations
When we write {X = 2}, we are (implicitly) referring to the set {s ∈ S : X(s) = 2}, which is an event of the experiment (a subset of S). When we write {X ∈ (0, 1)}, we are (implicitly) referring to the event {s ∈ S : X(s) ∈ (0, 1)}.
Question
x        0     1    2
P(X = x) 0.25  0.5  0.25
Question: if we don't know the probability on the sample space S, how can we compute the probability induced on X?
Answer: we can't. But at least X is quantifiable and usually observable.
Bayesian statistics
Bayesian learning:
Start with some prior belief on how likely each outcome of X is to happen.
As new information comes in (events E1, E2, ... happen), we update the belief about the likelihood of each outcome.
Discrete random variables
Discrete random variable
Definition. A random variable X is discrete if the set of all possible values of X is finite or countably infinite.
Note: a set A is countably infinite if its elements can be put in one-to-one correspondence with the set of natural numbers, i.e., we can index the elements of A as a sequence A = {x1, x2, ..., xn, ...}.
Discrete random variable
A discrete random variable X is described by its probability mass function (pmf) p(x) = P(X = x).
Discrete random variable: finite case
x     0     1    2
p(x)  0.25  0.5  0.25
Exercise
Problem. Let p be the function defined as follows:
p(x) = (1/2)(2/3)^x if x = 1, 2, 3, ...,
p(x) = 0 elsewhere.
Prove that p is a probability mass function.
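As a numerical sanity check (not a substitute for the requested proof), one can verify that a geometric-type pmf of the form p(x) = (1/2)(2/3)^x for x = 1, 2, 3, ... is nonnegative and that its series sums to 1, since sum_{x>=1} (2/3)^x = 2:

```python
# Candidate pmf: p(x) = (1/2) * (2/3)^x for x = 1, 2, 3, ...
def p(x):
    return 0.5 * (2 / 3) ** x if x >= 1 else 0.0

# Partial sum of the series; the geometric tail decays fast,
# so 200 terms is far more than enough for machine precision.
partial = sum(p(x) for x in range(1, 200))

# Closed form: (1/2) * (2/3) / (1 - 2/3) = 1, so partial ~ 1.
```

The analytic argument mirrors this: nonnegativity is immediate, and the geometric series formula gives total mass (1/2) * (2/3)/(1 - 2/3) = 1.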
Read the pmf table
A game of chance
Let's assume we have a biased coin that turns up heads 60% of the time. Consider two scenarios:
Scenario 1: I toss the coin; if the outcome is tails, you win $7; if it is heads, you lose $5.
Scenario 2: I toss the coin; if the outcome is tails, you win $6; if it is heads, you lose $4.
Which scenario is a fair game?
Expectation
Example: let X be the money you get by playing one game in Scenario 1:
x     7    -5
p(x)  0.4  0.6
Then E(X) = 7(0.4) + (-5)(0.6) = -0.2.
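The same computation settles which game is fair. A short sketch (the scenario payoffs are from the slides; the helper function is illustrative):

```python
# Expectation of a discrete random variable: E(X) = sum over x of x * p(x).
def expectation(pmf):
    return sum(x * px for x, px in pmf.items())

# Scenario 1: win $7 on tails (prob 0.4), lose $5 on heads (prob 0.6).
scenario1 = {7: 0.4, -5: 0.6}
# Scenario 2: win $6 on tails (prob 0.4), lose $4 on heads (prob 0.6).
scenario2 = {6: 0.4, -4: 0.6}

e1 = expectation(scenario1)  # 7*0.4 - 5*0.6 = -0.2 (you lose on average)
e2 = expectation(scenario2)  # 6*0.4 - 4*0.6 = 0 (fair game)
```

So Scenario 2, with expected winnings of $0, is the fair game.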