Random Variables. Statistics 110, Summer 2006. Copyright © 2006 by Mark E. Irwin


Random Variables

A Random Variable (RV) is a numeric response of a random phenomenon.

Examples:

1. Roll a die twice and record the sum. (Discrete RV)

2. The coin flip example where the response X is the flip number of the first tail. (Discrete RV with a countably infinite range)

3. Count α particles emitted from a low intensity radioactive source. Let T_i = waiting time between the (i-1)th and the ith emissions. (Continuous RVs) Let Y_T = number of emissions in the time interval [0, T]. (Discrete RV)

4. Throw a dart at a square target and record the X and Y coordinates of where the dart hits. Z = (X, Y) is a random vector. (Both coordinates are continuous RVs.)

Discrete: the values taken come from a discrete set. There may be a finite number of possible outcomes (e.g. the 11 possible values of the sum of two dice) or countably many (e.g. the flip on which the first tail occurs).

Continuous: the values taken come from an interval (possibly infinite). In the dart example X ∈ [-1, 1] (finite), whereas in the radiation example T_i ∈ [0, ∞) (infinite).

Note that RVs are usually indicated by capital letters, usually from the end of the alphabet (X, Y, Z, etc.). Letters from the beginning of the alphabet are left to denote events.

Instead of working with events, it is often easier to work with RVs. In fact, for any event A ⊆ Ω, we can define an indicator RV

    I_A = 1 if A occurs, 0 otherwise

From this definition, we get

    I_{A^c} = 1 - I_A
    I_{A ∩ B} = I_A I_B
    I_{A ∪ B} = I_A + I_B - I_A I_B

Instead of using logic and set operations when working with events, we use algebra and arithmetic operations on RVs.
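The three indicator identities can be checked exhaustively on a small sample space. A minimal sketch (the two-flip sample space and the events A and B are illustrative choices, not from the notes):

```python
from itertools import product

# Sample space of two coin flips; A and B are hypothetical events.
omega = set(product("HT", repeat=2))
A = {w for w in omega if w[0] == "H"}   # event: first flip is heads
B = {w for w in omega if w[1] == "H"}   # event: second flip is heads

def I(event, w):
    """Indicator RV of an event: 1 if the outcome w is in the event, else 0."""
    return 1 if w in event else 0

for w in omega:
    assert I(omega - A, w) == 1 - I(A, w)                        # complement
    assert I(A & B, w) == I(A, w) * I(B, w)                      # intersection
    assert I(A | B, w) == I(A, w) + I(B, w) - I(A, w) * I(B, w)  # union
```

Set operations on events on the left become arithmetic on indicator values on the right, which is exactly the point of the slide.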

For something to be a random variable, arithmetic operations have to make sense. I would not classify the following as an RV:

    X = 1 if Fred, 2 if Ethel, 3 if Ricky, 4 if Lucy

What would 1 + 4 (i.e. Fred + Lucy) mean here?

An RV X is a function from Ω (the domain) to the real numbers. The range of the function (the values it takes) will, obviously, depend on the problem.

Example: Flip a biased coin 3 times and let X be the number of heads. Assume the flips are independent with P[H] = p and P[T] = 1 - p.

    ω      X(ω)   P[ω]
    HHH    3      p^3
    HHT    2      p^2 (1-p)
    HTH    2      p^2 (1-p)
    THH    2      p^2 (1-p)
    HTT    1      p (1-p)^2
    THT    1      p (1-p)^2
    TTH    1      p (1-p)^2
    TTT    0      (1-p)^3

For this example, the range of X is {0, 1, 2, 3}. The probability measure on the original sample space Ω gives the probability measure for X.

For the above example, the probability measure is defined by

    P[X = i] = C(3, i) p^i (1-p)^(3-i),   i = 0, 1, 2, 3

where C(3, i) is the binomial coefficient. This happens to be the probability mass function for X.

Definition: Assume that the discrete RV X takes the values {x_1, x_2, ..., x_n} (n possibly ∞). The Probability Mass Function (PMF) of the discrete RV X is a function on the range of X that gives the probability of each possible value of X:

    p_X(x_i) = P[X = x_i] = Σ_{ω : X(ω) = x_i} P[ω]
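This construction (summing P[ω] over the outcomes with X(ω) = x_i) can be carried out directly and compared with the binomial formula; a sketch, with an arbitrary illustrative bias p = 1/3:

```python
from itertools import product
from math import comb
from fractions import Fraction

p = Fraction(1, 3)   # illustrative bias, not from the notes

# Build the PMF of X = number of heads by summing P[omega] over X(omega) = x.
pmf = {i: Fraction(0) for i in range(4)}
for omega in product("HT", repeat=3):
    x = omega.count("H")                     # X(omega)
    pmf[x] += p ** x * (1 - p) ** (3 - x)    # P[omega]

# Agrees with C(3, i) p^i (1-p)^(3-i), and sums to 1 as a valid PMF must.
for i in range(4):
    assert pmf[i] == comb(3, i) * p**i * (1 - p)**(3 - i)
assert sum(pmf.values()) == 1
```

Exact rational arithmetic (Fraction) avoids floating-point noise in the equality checks.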

Any valid PMF p(x) must satisfy

    p_X(x_i) ≥ 0 for all i,   and   Σ_i p_X(x_i) = 1

In the coin flipping example, assume that p = 0.5 (i.e. the coin is fair). Then the probability histogram of the PMF looks like:

[Figure: probability histogram, "PMF for number of heads in 3 flips"; P[X = x] on the vertical axis (0.00 to 0.30) against x = 0, 1, 2, 3 (number of heads).]

Definition: The Cumulative Distribution Function (CDF) of an RV X is a function on the real line that gives the probability of being less than or equal to x:

    F(x) = P[X ≤ x]

The CDF is a nondecreasing function satisfying

    lim_{x → -∞} F(x) = 0   and   lim_{x → ∞} F(x) = 1

The PMF and the CDF are closely related, since (for discrete RVs)

    F(x) = Σ_{x_i : x_i ≤ x} p(x_i)

Also, assuming that x_1 < x_2 < ...,

    p(x_i) = F(x_i) - F(x_{i-1})

The CDF for the coin flipping example looks like:

[Figure: step-function plot, "CDF for number of heads in 3 flips"; P[X ≤ x] on the vertical axis (0.0 to 1.0) against x from -1 to 4 (number of heads).]

For a discrete RV, the CDF is a step function, with jumps of p(x_i) at each x_i. It is also right continuous. For the coin flipping example,

    lim_{x → 0-} F(x) = 0   and   lim_{x → 0+} F(x) = p(0)
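The step behavior can be seen by evaluating F(x) = Σ_{x_i ≤ x} p(x_i) at a few points; a small sketch for the fair-coin example:

```python
from fractions import Fraction

# PMF of the number of heads in 3 fair flips (p = 0.5).
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

def F(x):
    """CDF: P[X <= x], the sum of p(x_i) over x_i <= x."""
    return sum(p for xi, p in pmf.items() if xi <= x)

assert F(-0.5) == 0               # to the left of the first jump
assert F(0) == Fraction(1, 8)     # right continuity: F(0) = p(0)
assert F(1.5) == Fraction(1, 2)   # flat between the jumps at 1 and 2
assert F(3) == 1                  # all mass accumulated
```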

For discrete RVs, independence is easily defined. Let X and Y be two discrete RVs taking values x_1, x_2, ... and y_1, y_2, ... respectively. Then X and Y are said to be independent if, for all i and j,

    P[X = x_i, Y = y_j] = P[X = x_i] P[Y = y_j]

(You don't need to check all possible events involving X and Y.) For three or more RVs, the definition extends in the obvious way.
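The product condition can be verified exhaustively on a small sample space. A sketch (two fair dice, with X = first die and Y = second die as illustrative choices):

```python
from itertools import product
from fractions import Fraction

# Sample space of two fair dice, each of the 36 outcomes equally likely.
omega = list(product(range(1, 7), repeat=2))
P = Fraction(1, 36)

def joint(i, j):
    """P[X = i, Y = j] where X is the first die and Y is the second."""
    return sum(P for w in omega if w[0] == i and w[1] == j)

def marg_X(i):
    return sum(P for w in omega if w[0] == i)

def marg_Y(j):
    return sum(P for w in omega if w[1] == j)

# Check P[X = i, Y = j] = P[X = i] P[Y = j] for every pair (i, j).
independent = all(joint(i, j) == marg_X(i) * marg_Y(j)
                  for i in range(1, 7) for j in range(1, 7))
assert independent
```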

Expected Values

Definition: The Expected Value of an RV X (with PMF p(x)) is

    E[X] = Σ_x x p(x)

assuming that Σ_i |x_i| p(x_i) < ∞. This is a technical point which, if ignored, can lead to paradoxes: there are well defined RVs that do not have a finite expectation. For example, see the St. Petersburg Paradox (Example D, page 113).

E[X] can be thought of as an average.

Example: Flipping 3 coins

    x      0    1    2    3
    p(x)   1/8  3/8  3/8  1/8

    E[X] = 0 · 1/8 + 1 · 3/8 + 2 · 3/8 + 3 · 1/8 = 1.5

If we picture an urn containing eight balls labelled 0, 1, 1, 1, 2, 2, 2, 3 (one per outcome), E[X] is the average value of the balls in this urn. E[X] serves as a central or typical value of the distribution.
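The "average" reading can be checked by simulation; a sketch (the seed, sample size, and tolerance are arbitrary illustrative choices):

```python
import random
from fractions import Fraction

# E[X] from the PMF of the 3-coin example.
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
EX = sum(x * p for x, p in pmf.items())
assert EX == Fraction(3, 2)

# Simulate many realizations of X = number of heads in 3 fair flips;
# the sample mean should land near E[X] = 1.5.
rng = random.Random(0)   # seeded for reproducibility
n = 100_000
draws = [sum(rng.random() < 0.5 for _ in range(3)) for _ in range(n)]
assert abs(sum(draws) / n - 1.5) < 0.05
```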

[Figure: the probability histogram of the PMF for the number of heads in 3 flips, repeated.]

You bet an amount θ on each draw (with replacement) from the urn for X. The payoff on each draw is the value drawn (i.e. X(ω) for the sample point drawn). What is the bet θ that makes this a fair game? [In the sense that, in the long run, the winnings or losses become smaller and smaller relative to the total amount bet.]

Answer: θ = E[X]

Justification: to come (Law of Large Numbers).

Theorem. If Y = g(X), then

    E[Y] = Σ_x g(x) p_X(x)

Proof. Since Y is also an RV, it also has its own PMF:

    p_Y(y) = P[{x : g(x) = y}] = Σ_{x : g(x) = y} p_X(x)

Therefore E[Y ] = y = y = y = x yp Y (y) y p X (x) {x:g(x)=y} {x:g(x)=y} g(x)p X (x) g(x)p X (x) So instead of figuring out the PMF for the transformed RV, we can directly calculate the expected value. Expected Values 15

Example: Let Y = X².

    x        -2    2    5
    p_X(x)   1/4   1/2  1/4

    y        4     25
    p_Y(y)   3/4   1/4

Example: Let X = I_A for some event A. Then

    E[X] = 1 · P[A] + 0 · P[A^c] = P[A]
    E[X²] = 1 · P[A] + 0 · P[A^c] = P[A]

In general, E[g(X)] ≠ g(E[X]).

Expected value of a linear function: if a and b are constants, then

    E[a + bX] = a + b E[X]

Proof. Simple algebra.

Expected value of a sum of two RVs:

    E[Y + Z] = E[Y] + E[Z]

Note that this is only defined if Y and Z are defined on a common sample space.

Proof. Both Y and Z are RVs on a common sample space Ω. Then

    E[Y + Z] = Σ_ω (Y + Z)(ω) P[ω]
             = Σ_ω (Y(ω) + Z(ω)) P[ω]
             = Σ_ω Y(ω) P[ω] + Σ_ω Z(ω) P[ω]
             = E[Y] + E[Z]

For practical reasons, the result may not make sense even when it is well defined. For example, let Y = number of coins in pocket and Z = time to get to the Science Center from home, when students in the class are sampled.
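Note that the proof never uses independence of Y and Z. A sketch checking the identity on two fair dice, with deliberately dependent RVs (Y = first die, Z = maximum of the two; illustrative choices):

```python
from itertools import product
from fractions import Fraction

# Common sample space: two fair dice, all 36 outcomes equally likely.
omega = list(product(range(1, 7), repeat=2))
P = Fraction(1, 36)

def Y(w):
    return w[0]        # first die

def Z(w):
    return max(w)      # max of the two dice; clearly dependent on Y

def E(V):
    """Expectation as a sum over the sample space: sum V(omega) P[omega]."""
    return sum(V(w) * P for w in omega)

# Linearity holds with no independence assumption.
assert E(lambda w: Y(w) + Z(w)) == E(Y) + E(Z)
```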