Random Variables

A mapping that transforms the events to the real line.

Example 1. Toss a fair coin. Define a random variable X where X is 1 if a head appears and X is 0 if a tail appears. Then

P(X = 0) = 1/2, P(X = 1) = 1/2.

Example 2. Cast two dice. Define the random variable X as the sum of the outcomes.

P(X = 2)  = P{(1,1)} = 1/36
P(X = 3)  = P{(1,2), (2,1)} = 2/36
P(X = 4)  = P{(1,3), (2,2), (3,1)} = 3/36
P(X = 5)  = P{(1,4), (2,3), (3,2), (4,1)} = 4/36
P(X = 6)  = P{(1,5), (2,4), (3,3), (4,2), (5,1)} = 5/36
P(X = 7)  = P{(1,6), (2,5), (3,4), (4,3), (5,2), (6,1)} = 6/36
P(X = 8)  = P{(2,6), (3,5), (4,4), (5,3), (6,2)} = 5/36
P(X = 9)  = P{(3,6), (4,5), (5,4), (6,3)} = 4/36
P(X = 10) = P{(4,6), (5,5), (6,4)} = 3/36
P(X = 11) = P{(5,6), (6,5)} = 2/36
P(X = 12) = P{(6,6)} = 1/36

Cumulative Distribution Function (CDF)

For any real number x,

F(x) = P(X ≤ x),

i.e., the probability that the random variable X takes on a value less than or equal to x.
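As a check on Example 2, here is a minimal sketch (standard library only) that enumerates the 36 equally likely outcomes of two dice and tabulates the pmf of their sum.

```python
# Enumerate the 36 equally likely outcomes of two dice and tabulate the
# pmf of their sum, reproducing the probabilities listed in Example 2.
from fractions import Fraction
from itertools import product

pmf = {}
for d1, d2 in product(range(1, 7), repeat=2):    # all 36 outcomes
    s = d1 + d2
    pmf[s] = pmf.get(s, Fraction(0)) + Fraction(1, 36)

for s in sorted(pmf):
    print(f"P(X = {s:2d}) = {pmf[s]}")           # e.g. P(X =  7) = 1/6
assert sum(pmf.values()) == 1                    # probabilities sum to 1
```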

Note:

P(a < X ≤ b) = P(X ≤ b) - P(X ≤ a) = F(b) - F(a).

Types of Random Variables

Discrete: X takes discrete values.

Probability mass function (pmf): p(a) = P(X = a).

CDF: F(a) = ∑_{x ≤ a} p(x), with F(∞) = ∑_{i=1}^{∞} p(x_i) = 1 and F(-∞) = 0.

Example 3. Cast a die and let X = outcome. The probability mass function is

p_X(i) = 1/6, i = 1, 2, ..., 6.

Figure 2.5: pmf and CDF of the die-cast experiment.
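To make the pmf-to-CDF relationship concrete, the following sketch (standard library only) builds F(a) = ∑_{x ≤ a} p(x) for the fair die of Example 3 and verifies the Note above.

```python
# Build the CDF of a fair die from its pmf and check F(-inf) = 0,
# F(inf) = 1, and P(a < X <= b) = F(b) - F(a).
from fractions import Fraction

p = {i: Fraction(1, 6) for i in range(1, 7)}     # pmf of one die

def F(a):
    """CDF: probability that the die shows a value <= a."""
    return sum(prob for x, prob in p.items() if x <= a)

assert F(0) == 0 and F(6) == 1
# P(2 < X <= 5) = F(5) - F(2)
print(F(5) - F(2))                               # 1/2
```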

Example 4. The cumulative distribution function F is

F(b) = 0,     b < 0
     = 1/2,   0 ≤ b < 1
     = 3/5,   1 ≤ b < 2
     = 4/5,   2 ≤ b < 3
     = 9/10,  3 ≤ b < 3.5
     = 1,     b ≥ 3.5

Figure 2.6: CDF and pmf.

The probability mass function of X is

p(0) = 1/2, p(1) = 1/10, p(2) = 1/5, p(3) = 1/10, p(3.5) = 1/10.

Check: F(∞) = ∑_{all x} p(x) = 1.

Continuous Random Variables

The possible values of X form an interval, and probabilities are computed from a density:

P(X ∈ B) = ∫_B f(x) dx,

where f(x) is called the probability density function (pdf) of X. Then:

CDF: F(a) = P{X ∈ (-∞, a]} = ∫_{-∞}^{a} f(x) dx,
F(∞) = ∫_{-∞}^{∞} f(x) dx = P[X ∈ (-∞, ∞)] = 1,
P(a ≤ X ≤ b) = ∫_a^b f(x) dx, but P(X = a) = ∫_a^a f(x) dx = 0,
(d/da) F(a) = f(a),
F(-∞) = 0.
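Conversely, the pmf of a discrete random variable can be read off its CDF as the jump sizes, p(x) = F(x) - F(x-). A small sketch of this for Example 4; the epsilon used to approximate the left limit is an implementation shortcut, not part of the theory.

```python
# Recover the pmf of Example 4 from its step CDF: p(x) is the size of
# the jump of F at x.
from fractions import Fraction

def F(b):
    """The step CDF of Example 4."""
    if b < 0:    return Fraction(0)
    if b < 1:    return Fraction(1, 2)
    if b < 2:    return Fraction(3, 5)
    if b < 3:    return Fraction(4, 5)
    if b < 3.5:  return Fraction(9, 10)
    return Fraction(1)

jump_points = [0, 1, 2, 3, 3.5]
pmf = {x: F(x) - F(x - 1e-9) for x in jump_points}   # jump sizes
for x in jump_points:
    print(x, pmf[x])          # 0 1/2, 1 1/10, 2 1/5, 3 1/10, 3.5 1/10
assert sum(pmf.values()) == 1
```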

Example 5. A 1 kg load is equally likely to be placed anywhere along the 10 m span of a beam. Define X to be the position of the load:

f(x) = c,  0 < x ≤ 10
     = 0,  otherwise

where c is a constant. To determine c, use F(∞) = 1:

∫_{-∞}^{∞} f(x) dx = 1  ⇒  ∫_0^{10} c dx = 1  ⇒  c · x |_0^{10} = 1  ⇒  c = 0.1.

Figure 2.7: pdf and CDF of the load-position example.

The cumulative distribution function is

F(x) = ∫_0^x c dx = cx = 0.1x,  0 < x ≤ 10
     = 1,                       x > 10
     = 0,                       x ≤ 0

What is the probability that the load is placed between 2 m and 5 m?

P(2 < X ≤ 5) = ∫_2^5 c dx = 0.3.
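A quick numerical confirmation of Example 5, assuming numpy is available; the Monte Carlo part simply drops a million loads uniformly on the span and counts how many land in (2, 5].

```python
# Check Example 5: normalization forces c = 0.1, and P(2 < X <= 5) = 0.3.
import numpy as np

span = 10.0                     # beam length in metres
c = 1.0 / span                  # F(inf) = 1  =>  c * span = 1  =>  c = 0.1

# analytic probability
print(c * (5 - 2))              # 0.3

# Monte Carlo confirmation: positions uniform along the span
rng = np.random.default_rng(0)
x = rng.uniform(0.0, span, size=1_000_000)
print(np.mean((x > 2) & (x <= 5)))   # approximately 0.3
```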

Expectation

Discrete: E[X] = ∑_i x_i P(X = x_i) = ∑_i x_i p(x_i).

Continuous: E[X] = ∫_{-∞}^{∞} x f(x) dx.

Example 6. Cast a die and denote the outcome by a random variable X.

E[X] = ∑_i x_i P(X = x_i)
     = 1·P(X = 1) + 2·P(X = 2) + 3·P(X = 3) + 4·P(X = 4) + 5·P(X = 5) + 6·P(X = 6)
     = 21 · (1/6) = 3.5

Example 2 (continued). From Example 2, the expected sum of two dice is

E[X] = 2·P(X = 2) + 3·P(X = 3) + ... + 12·P(X = 12)
     = ∑_{i=2}^{12} i · P(X = i)
     = 252/36 = 7

Example 5 (continued). From Example 5, the expected position of the load is

E[X] = ∫ x f(x) dx = ∫_0^{10} c x dx = 0.1 · x²/2 |_0^{10} = 5
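The three expectations above can be reproduced mechanically; this sketch assumes numpy and scipy are available for the continuous case.

```python
# Expectation computed three ways, mirroring Examples 6, 2, and 5.
from itertools import product
from scipy.integrate import quad

# Example 6: one fair die, E[X] = sum_i i * p(i)
print(sum(i * (1 / 6) for i in range(1, 7)))            # 3.5

# Example 2: sum of two dice over the 36 equally likely outcomes
outcomes = [d1 + d2 for d1, d2 in product(range(1, 7), repeat=2)]
print(sum(outcomes) / 36)                               # 7.0

# Example 5: continuous uniform load, E[X] = integral of 0.1 * x on (0, 10]
value, _ = quad(lambda x: 0.1 * x, 0.0, 10.0)
print(value)                                            # 5.0
```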

Variance

Var(X) = E[(X - µ)²], where µ = E[X]
       = E[X² - 2µX + µ²]
       = E[X²] - 2µE[X] + µ²
       = E[X²] - µ²
       = E[X²] - (E[X])²

Properties:

Var(aX + b) = a² Var(X)
Var(aX) = a² Var(X)
Var(b) = 0
Var(X + X) = 4 Var(X)

Standard deviation: σ = √Var(X).

Covariance of two random variables X, Y:

Cov(X, Y) = E[(X - µ_X)(Y - µ_Y)] = E[XY] - E[X]E[Y]
Cov(X, Y) = Cov(Y, X)
Cov(X, X) = Var(X)
Cov(aX, Y) = a Cov(X, Y)
Cov(X + Z, Y) = Cov(X, Y) + Cov(Z, Y)
Cov(∑_{i=1}^n X_i, ∑_{j=1}^m Y_j) = ∑_{i=1}^n ∑_{j=1}^m Cov(X_i, Y_j)
Var(∑_{i=1}^n X_i) = ∑_{i=1}^n Var(X_i) + ∑_{i=1}^n ∑_{j=1, j≠i}^n Cov(X_i, X_j)

If X and Y are independent random variables, then Cov(X, Y) = 0 and

Var(∑_{i=1}^n X_i) = ∑_{i=1}^n Var(X_i).

The correlation coefficient:

Corr(X, Y) = Cov(X, Y) / √(Var(X) Var(Y))
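These identities are easy to check by simulation. The following sketch (assuming numpy) verifies Var(X) = E[X²] - (E[X])² and Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) on independent die rolls, for which the covariance term is near zero.

```python
# Simulation check of the variance and covariance identities above.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.integers(1, 7, size=n).astype(float)    # die rolls
y = rng.integers(1, 7, size=n).astype(float)    # independent die rolls

# Var(X) = E[X^2] - (E[X])^2; both near 35/12 for a fair die
print(np.mean(x**2) - np.mean(x)**2, np.var(x))

# Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y); Cov near 0 by independence
cov = np.mean(x * y) - np.mean(x) * np.mean(y)
print(np.var(x + y), np.var(x) + np.var(y) + 2 * cov)   # nearly equal
```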

Properties of the Expected Value

Discrete RV: E[g(X)] = ∑_x g(x) p(x)
Continuous RV: E[g(X)] = ∫_{-∞}^{∞} g(x) f(x) dx

E[aX + b] = a E[X] + b

E[X^n] = ∑_x x^n p(x) (discrete RV), ∫_{-∞}^{∞} x^n f(x) dx (continuous RV)

Expected value of a function of two RVs:

E[g(X, Y)] = ∑_x ∑_y g(x, y) p(x, y) (discrete RV)
           = ∫_{-∞}^{∞} ∫_{-∞}^{∞} g(x, y) f(x, y) dx dy (continuous RV)

E[X + Y] = E[X] + E[Y]

Example 2 (continued). Let us denote the outcome of the first die by X_1 and of the second die by X_2, so X = X_1 + X_2. Then

E[X] = E[X_1 + X_2] = E[X_1] + E[X_2] = 3.5 + 3.5 = 7 (see Example 6).

Moment Generating Function, φ(t)

φ(t) = E[e^{tX}] = ∑_x e^{tx} p(x) (discrete RV), ∫_{-∞}^{∞} e^{tx} f(x) dx (continuous RV)

The nth moment of the random variable is E[X^n], and it can be computed from φ(t) using

E[X^n] = (d^n/dt^n) φ(t) |_{t=0},

i.e., E[X] = φ′(0) and E[X²] = φ″(0).
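A small sketch (standard library only) of the discrete rule E[g(X)] = ∑_x g(x) p(x) and of linearity, reproducing E[X] = 7/2, E[X²] = 91/6, and the expected dice sum of 7.

```python
# E[g(X)] for a fair die by the discrete formula, plus linearity of E.
from fractions import Fraction

p = {i: Fraction(1, 6) for i in range(1, 7)}    # pmf of one fair die

def E(g):
    """E[g(X)] = sum over x of g(x) * p(x)."""
    return sum(g(x) * prob for x, prob in p.items())

print(E(lambda x: x))        # E[X]   = 7/2
print(E(lambda x: x**2))     # E[X^2] = 91/6
# Linearity: expected sum of two dice = 7/2 + 7/2 = 7, matching Example 2
print(E(lambda x: x) + E(lambda x: x))   # 7
```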

Example 7. The moment generating function of X with pmf p(i) = λ^i e^{-λ}/i!, i = 0, 1, 2, ..., is

φ(t) = E[e^{tX}] = ∑_{i=0}^{∞} e^{ti} λ^i e^{-λ}/i!
     = e^{-λ} ∑_{i=0}^{∞} (λe^t)^i / i!
     = e^{-λ} e^{λe^t}
     = e^{λ(e^t - 1)}

Then

φ′(t) = λ e^t e^{λ(e^t - 1)},
φ″(t) = (λ e^t)² e^{λ(e^t - 1)} + λ e^t e^{λ(e^t - 1)}.

This gives E[X] = φ′(0) = λ and E[X²] = φ″(0) = λ² + λ.

Markov's Inequality

For a nonnegative random variable X and any value a > 0,

P(X ≥ a) ≤ E[X]/a.

Chebyshev's Inequality

If the mean of X is µ and the variance is σ², then for any k > 0,

P(|X - µ| ≥ k) ≤ σ²/k².

The Weak Law of Large Numbers

Let X_1, X_2, ... be a sequence of i.i.d. (independent and identically distributed) random variables with E[X_i] = µ. For any ε > 0,

P(|(X_1 + X_2 + ... + X_n)/n - µ| > ε) → 0 as n → ∞.

Jointly Distributed RVs

Joint CDF of X and Y:

F(x, y) = P(X ≤ x, Y ≤ y)

F_X(x) = P(X ≤ x) = P(X ≤ x, Y ≤ ∞) = F(x, ∞)
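The MGF results of Example 7 and the weak law can both be sanity-checked by simulation; this sketch assumes numpy, and λ = 2.5 is an arbitrary choice.

```python
# Check E[X] = lambda and E[X^2] = lambda^2 + lambda for the Poisson
# distribution of Example 7, then illustrate the weak law of large numbers.
import numpy as np

lam = 2.5
rng = np.random.default_rng(2)
x = rng.poisson(lam, size=1_000_000)

print(x.mean(), lam)                    # both near 2.5
print((x**2).mean(), lam**2 + lam)      # both near 8.75

# WLLN: the sample mean of n i.i.d. draws concentrates around mu = lambda
for n in (10, 1_000, 100_000):
    print(n, abs(x[:n].mean() - lam))   # deviation shrinks as n grows
```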

F_Y(y) = P(Y ≤ y) = P(X ≤ ∞, Y ≤ y) = F(∞, y)

Joint pmf: p(x_i, y_j) = P(X = x_i, Y = y_j)

Marginal pmf:

p_X(x_i) = P(X = x_i) = P(∪_j {X = x_i, Y = y_j}) = ∑_j p(x_i, y_j)
p_Y(y_j) = P(Y = y_j) = P(∪_i {X = x_i, Y = y_j}) = ∑_i p(x_i, y_j)

Joint pdf: f(a, b) = ∂²F(a, b)/∂a∂b

Marginal densities:

f_X(x) = ∫_{-∞}^{∞} f(x, y) dy
f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx

If X and Y are independent:

F(a, b) = F_X(a) F_Y(b)
Discrete RV: p(x, y) = p_X(x) p_Y(y)
Continuous RV: f(x, y) = f_X(x) f_Y(y)

Example 8. Let Y = X + W. The joint probability density of X and Y is

f(x, y) = λ² e^{-λy},  0 < x < y < ∞.

Determine the marginal densities. When X = x, Y varies over x < y < ∞, so

f_X(x) = ∫_x^{∞} f(x, y) dy = ∫_x^{∞} λ² e^{-λy} dy = λ e^{-λx}.

Similarly, when Y = y, W varies over 0 < w < y and X varies over 0 < x < y, so

f_Y(y) = ∫_0^y f(x, y) dx = ∫_0^y λ² e^{-λy} dx = λ² y e^{-λy}.
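Example 8 can be checked by Monte Carlo, assuming numpy: draw X and W independently from an Exponential(λ) distribution, form Y = X + W, and compare an empirical density estimate for Y against the derived marginal λ²y e^{-λy}.

```python
# Monte Carlo check of the marginal density of Y in Example 8.
import numpy as np

lam = 1.5
rng = np.random.default_rng(3)
x = rng.exponential(1 / lam, size=1_000_000)   # numpy uses scale = 1/lambda
w = rng.exponential(1 / lam, size=1_000_000)
y = x + w

# Empirical density of Y near y0, versus the derived lambda^2 * y * exp(-lambda*y)
y0, h = 1.0, 0.01
empirical = np.mean((y > y0 - h) & (y <= y0 + h)) / (2 * h)
analytic = lam**2 * y0 * np.exp(-lam * y0)
print(empirical, analytic)                     # close to each other
```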