Random Variables
A random variable is a mapping that transforms the events to the real line.

Example 1. Toss a fair coin. Define a random variable X where X = 1 if a head appears and X = 0 if a tail appears.
P(X = 0) = 1/2, P(X = 1) = 1/2

Example 2. Cast two dice. Define the random variable X as the sum of the outcomes.
P(X = 2) = P{(1, 1)} = 1/36
P(X = 3) = P{(1, 2), (2, 1)} = 2/36
P(X = 4) = P{(1, 3), (2, 2), (3, 1)} = 3/36
P(X = 5) = P{(1, 4), (2, 3), (3, 2), (4, 1)} = 4/36
P(X = 6) = P{(1, 5), (2, 4), (3, 3), (4, 2), (5, 1)} = 5/36
P(X = 7) = P{(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)} = 6/36
P(X = 8) = P{(2, 6), (3, 5), (4, 4), (5, 3), (6, 2)} = 5/36
P(X = 9) = P{(3, 6), (4, 5), (5, 4), (6, 3)} = 4/36
P(X = 10) = P{(4, 6), (5, 5), (6, 4)} = 3/36
P(X = 11) = P{(5, 6), (6, 5)} = 2/36
P(X = 12) = P{(6, 6)} = 1/36

Cumulative Distribution Function (CDF)
For any real number x,
F(x) = P(X ≤ x),
i.e., the probability that the random variable X takes on a value less than or equal to x.
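The pmf of Example 2 can be reproduced by brute-force enumeration of the 36 equally likely ordered outcomes; a minimal Python sketch:

```python
from fractions import Fraction
from itertools import product
from collections import Counter

# Count how many of the 36 equally likely ordered outcomes give each sum.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
pmf = {s: Fraction(n, 36) for s, n in counts.items()}

print(pmf[7])   # 1/6  (6 of the 36 outcomes sum to 7)
print(pmf[12])  # 1/36 (only (6, 6))
```

Exact `Fraction` arithmetic makes it easy to confirm the probabilities sum to 1.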
Note: P(a < X ≤ b) = P(X ≤ b) − P(X ≤ a) = F(b) − F(a)

Types of RV

Discrete
X takes discrete values.
Probability mass function (pmf): p(a) = P(X = a)
CDF: F(a) = Σ_{all x ≤ a} p(x), F(∞) = Σ_i p(x_i) = 1, F(−∞) = 0.

Example 3. Cast a die and let X = outcome. The probability mass function is
p_X(i) = 1/6, i = 1, 2, ..., 6

Figure 2.5: pmf and CDF of die cast experiment.
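The relation F(a) = Σ_{x ≤ a} p(x), together with the limits F(−∞) = 0 and F(∞) = 1, can be checked directly for the die of Example 3; a small sketch:

```python
from fractions import Fraction

# pmf of a fair die: p(i) = 1/6 for i = 1..6
p = {i: Fraction(1, 6) for i in range(1, 7)}

def F(a):
    """CDF: F(a) = sum of p(x) over all x <= a."""
    return sum((p[x] for x in p if x <= a), Fraction(0))

print(F(3))   # 1/2
print(F(0))   # 0  (below the support, as F(-inf) = 0)
print(F(6))   # 1  (the whole support, as F(inf) = 1)
```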
Example 4. The cumulative distribution function F is
F(b) = 0,    b < 0
     = 1/2,  0 ≤ b < 1
     = 3/5,  1 ≤ b < 2
     = 4/5,  2 ≤ b < 3
     = 9/10, 3 ≤ b < 3.5
     = 1,    b ≥ 3.5

Figure 2.6: CDF and pmf.

The probability mass function of X is
p(0) = 1/2, p(1) = 1/10, p(2) = 1/5, p(3) = 1/10, p(3.5) = 1/10.
Check: F(∞) = Σ_{all x} p(x) = 1

Continuous Random Variable
The set of possible values of X is an interval.
P(X ∈ B) = ∫_B f(x) dx
where f(x) is called the probability density function (pdf) of X.
CDF: F(a) = P{X ∈ (−∞, a]} = ∫_{−∞}^{a} f(x) dx
F(∞) = ∫_{−∞}^{∞} f(x) dx = P[X ∈ (−∞, ∞)] = 1
P(a ≤ X ≤ b) = ∫_a^b f(x) dx, but P(X = a) = ∫_a^a f(x) dx = 0
d/da F(a) = f(a), F(−∞) = 0.
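The pmf in Example 4 is obtained as the size of each jump of the CDF, p(b) = F(b) − F(b−); a sketch that reads those jumps off the piecewise definition and verifies the check:

```python
from fractions import Fraction as Fr

# CDF values just right (F(b)) and just left (F(b-)) of each jump point,
# read off Example 4's piecewise CDF.
F_right = {0: Fr(1, 2), 1: Fr(3, 5), 2: Fr(4, 5), 3: Fr(9, 10), 3.5: Fr(1)}
F_left  = {0: Fr(0),    1: Fr(1, 2), 2: Fr(3, 5), 3: Fr(4, 5),  3.5: Fr(9, 10)}

# p(b) = F(b) - F(b-): the jump size at b
pmf = {b: F_right[b] - F_left[b] for b in F_right}
print(sum(pmf.values()))   # 1
```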
Example 5. A 1 kg load is equally likely to be placed anywhere along the 10 m span of a beam. Define X to be the position of the load.
f(x) = c, 0 < x ≤ 10
     = 0, otherwise
where c is a constant. To determine c, use F(∞) = 1:
∫_{−∞}^{∞} f(x) dx = 1 ⇒ ∫_0^{10} c dx = 1 ⇒ c·x |_0^{10} = 1 ⇒ c = 0.1

Figure 2.7: pdf and CDF of the load position.

The cumulative distribution function is
F(x) = ∫_0^x c dx = cx = 0.1x, 0 < x ≤ 10
     = 1, x > 10
     = 0, x ≤ 0

What is the probability that the load is placed between 2 m and 5 m?
P(2 < X ≤ 5) = ∫_2^5 c dx = 0.3
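Example 5's calculation amounts to evaluating the uniform CDF at the interval endpoints; a minimal sketch (the function name `F` is illustrative):

```python
# Uniform density on (0, 10]: f(x) = c there, 0 elsewhere.
# The total area under f must be 1, which fixes c = 1/10.
c = 1 / 10

def F(x):
    """CDF of the load position."""
    if x <= 0:
        return 0.0
    if x > 10:
        return 1.0
    return c * x

# P(2 < X <= 5) = F(5) - F(2)
print(F(5) - F(2))   # 0.3
```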
Expectation
Discrete RV: E[X] = Σ_i x_i P(X = x_i) = Σ_i x_i p(x_i)
Continuous RV: E[X] = ∫_{−∞}^{∞} x f(x) dx

Example 6. Cast a die and denote the outcome by a random variable X.
E[X] = Σ_i x_i P(X = x_i)
     = 1·P(X = 1) + 2·P(X = 2) + 3·P(X = 3) + 4·P(X = 4) + 5·P(X = 5) + 6·P(X = 6)
     = 21 · (1/6) = 3.5

Example 2 contd. From Example 2, the expected sum of two dice is
E[X] = 2·P(X = 2) + 3·P(X = 3) + ... + 12·P(X = 12)
     = Σ_{i=2}^{12} i·P(X = i)
     = 252/36 = 7

Example 5 contd. From Example 5, the expected position of the load is
E[X] = ∫_{−∞}^{∞} x f(x) dx = ∫_0^{10} c x dx = 0.1 · x²/2 |_0^{10} = 5
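The sum Σ_{i=2}^{12} i·P(X = i) = 252/36 for the two-dice example can be verified exactly by enumeration; a short sketch:

```python
from fractions import Fraction
from itertools import product
from collections import Counter

# pmf of the sum of two fair dice (Example 2)
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
pmf = {s: Fraction(n, 36) for s, n in counts.items()}

# E[X] = sum over i of i * P(X = i)
EX = sum(i * p for i, p in pmf.items())
print(EX)   # 7  (= 252/36)
```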
Variance
Var(X) = E[(X − µ)²] where µ = E[X]
       = E[X² − 2µX + µ²]
       = E[X²] − 2µE[X] + µ²
       = E[X²] − µ² = E[X²] − (E[X])²

Var(aX + b) = a² Var(X)
Var(aX) = a² Var(X)
Var(b) = 0
Var(X + X) = 4 Var(X)

Standard deviation σ = √Var(X)

Covariance of two random variables X, Y
Cov(X, Y) = E[(X − µ_X)(Y − µ_Y)] = E[XY] − E[X]E[Y]
Cov(X, Y) = Cov(Y, X)
Cov(X, X) = Var(X)
Cov(aX, Y) = a Cov(X, Y)
Cov(X + Z, Y) = Cov(X, Y) + Cov(Z, Y)
Cov(Σ_{i=1}^{n} X_i, Σ_{j=1}^{m} Y_j) = Σ_{i=1}^{n} Σ_{j=1}^{m} Cov(X_i, Y_j)
Var(Σ_{i=1}^{n} X_i) = Σ_{i=1}^{n} Var(X_i) + Σ_{i=1}^{n} Σ_{j=1, j≠i}^{n} Cov(X_i, X_j)

If X and Y are independent random variables, then Cov(X, Y) = 0, and
Var(Σ_{i=1}^{n} X_i) = Σ_{i=1}^{n} Var(X_i)

The correlation coefficient
Corr(X, Y) = Cov(X, Y) / √(Var(X) Var(Y))
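These identities can be verified exactly on the two-dice experiment: two independent dice have zero covariance, so the variance of their sum is the sum of the variances. A sketch (the helper `E` is an illustrative expectation operator over the joint pmf):

```python
from fractions import Fraction
from itertools import product

# Joint pmf of two independent dice: each ordered pair has probability 1/36.
pairs = list(product(range(1, 7), repeat=2))
p = Fraction(1, 36)

E = lambda g: sum(g(x1, x2) * p for x1, x2 in pairs)  # E[g(X1, X2)]
EX1, EX2 = E(lambda a, b: a), E(lambda a, b: b)

cov = E(lambda a, b: a * b) - EX1 * EX2               # Cov = E[X1 X2] - E[X1]E[X2]
var1 = E(lambda a, b: a * a) - EX1 ** 2               # Var(X1) = E[X1^2] - (E[X1])^2
var_sum = E(lambda a, b: (a + b) ** 2) - E(lambda a, b: a + b) ** 2

print(cov)                  # 0 (independent dice are uncorrelated)
print(var_sum == 2 * var1)  # True: Var(X1 + X2) = Var(X1) + Var(X2)
```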
Properties of the Expected Value
Discrete RV: E[g(X)] = Σ_x g(x) p(x)
Continuous RV: E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx
E[aX + b] = a E[X] + b
E[X^n] = Σ_x x^n p(x) (discrete RV), ∫_{−∞}^{∞} x^n f(x) dx (continuous RV)

Expected value of a function of two RVs
E[g(X, Y)] = Σ_x Σ_y g(x, y) p(x, y) (discrete RV)
           = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f(x, y) dx dy (continuous RV)
E[X + Y] = E[X] + E[Y]

Example 2 contd. Let us denote the outcome of the first die by X_1 and the second die by X_2, so that X = X_1 + X_2.
E[X] = E[X_1 + X_2] = E[X_1] + E[X_2] = 3.5 + 3.5 = 7 (see Example 6)

Moment Generating Function, φ(t)
φ(t) = E[e^{tX}] = Σ_x e^{tx} p(x) (discrete RV), ∫_{−∞}^{∞} e^{tx} f(x) dx (continuous RV)
The nth moment of the random variable is E[X^n] and it can be computed from φ(t) using
E[X^n] = (d^n/dt^n) φ(t) |_{t=0}
i.e., E[X] = φ′(0) and E[X²] = φ″(0).
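The property E[X] = φ′(0) can be illustrated numerically for the die of Example 6: build φ(t) from the pmf and approximate the derivative at t = 0 by a central difference (the step size h is an arbitrary choice for this sketch):

```python
import math

# MGF of a fair die: phi(t) = E[e^{tX}] = (1/6) * sum_{i=1}^{6} e^{t i}
def phi(t):
    return sum(math.exp(t * i) for i in range(1, 7)) / 6

# Central-difference approximation of phi'(0), which equals E[X].
h = 1e-5
EX_approx = (phi(h) - phi(-h)) / (2 * h)
print(round(EX_approx, 6))   # 3.5
```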
Example 7. The moment generating function of X with pmf p(i) = λ^i e^{−λ} / i! is
φ(t) = E[e^{tX}] = Σ_{i=0}^{∞} e^{ti} λ^i e^{−λ} / i!
     = e^{−λ} Σ_{i=0}^{∞} (λe^t)^i / i!
     = e^{−λ} e^{λe^t} = e^{λ(e^t − 1)}
φ′(t) = λe^t e^{λ(e^t − 1)},
φ″(t) = (λe^t)² e^{λ(e^t − 1)} + λe^t e^{λ(e^t − 1)}.
This gives E[X] = φ′(0) = λ and E[X²] = φ″(0) = λ² + λ.

Markov's Inequality
For a nonnegative random variable X and any value a > 0,
P(X ≥ a) ≤ E[X]/a

Chebyshev's Inequality
If the mean of X is µ and the variance is σ², for any k > 0,
P(|X − µ| ≥ k) ≤ σ²/k²

The Weak Law of Large Numbers
Let X_1, X_2, ... be a sequence of i.i.d. (independent and identically distributed) random variables with E[X_i] = µ. For any ε > 0,
P(|(X_1 + X_2 + ... + X_n)/n − µ| > ε) → 0 as n → ∞

Jointly Distributed RVs
Joint CDF of X and Y:
F(x, y) = P(X ≤ x, Y ≤ y)
F_X(x) = P(X ≤ x) = P(X ≤ x, Y ≤ ∞) = F(x, ∞)
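The weak law of large numbers can be seen in a quick simulation: the sample mean of n fair-die rolls drifts toward µ = 3.5 as n grows (the seed is an arbitrary choice for reproducibility):

```python
import random

random.seed(0)   # reproducible demo

# Sample mean of n i.i.d. fair-die rolls; by the WLLN it
# concentrates around mu = 3.5 as n grows.
def sample_mean(n):
    return sum(random.randint(1, 6) for _ in range(n)) / n

for n in (10, 1_000, 100_000):
    print(n, sample_mean(n))
```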
F_Y(y) = P(Y ≤ y) = P(X ≤ ∞, Y ≤ y) = F(∞, y)

Joint pmf
p(x_i, y_j) = P(X = x_i, Y = y_j)
Marginal pmf:
p_X(x_i) = P(X = x_i) = P(∪_j {X = x_i, Y = y_j}) = Σ_j p(x_i, y_j)
p_Y(y_j) = P(Y = y_j) = P(∪_i {X = x_i, Y = y_j}) = Σ_i p(x_i, y_j)

Joint pdf
f(a, b) = ∂²F(a, b) / ∂a∂b
Marginal densities:
f_X(x) = ∫_{−∞}^{∞} f(x, y) dy
f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx

If X and Y are independent:
F(a, b) = F_X(a) F_Y(b)
Discrete RV: p(x, y) = p_X(x) p_Y(y)
Continuous RV: f(x, y) = f_X(x) f_Y(y)

Example 8. Let Y = X + W. The joint probability density of X and Y is
f(x, y) = λ² e^{−λy}, 0 < x < y < ∞
Determine the marginal densities.
When X = x, Y varies over x < y < ∞:
f_X(x) = ∫_x^{∞} f(x, y) dy = ∫_x^{∞} λ² e^{−λy} dy = λ e^{−λx}
Similarly, for a given Y = y, W varies over 0 < w < y, so x varies over 0 < x < y:
f_Y(y) = ∫_0^y f(x, y) dx = ∫_0^y λ² e^{−λy} dx = λ² y e^{−λy}
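Example 8's marginal f_X(x) = λe^{−λx} can be cross-checked by numerically integrating the joint density over y. A sketch, assuming λ = 1 (the value, and the midpoint-rule parameters `upper` and `n`, are arbitrary choices; the identity holds for any λ > 0):

```python
import math

lam = 1.0   # assumed rate for this demo

def f(x, y):
    """Joint pdf of Example 8: lam^2 * exp(-lam*y) on 0 < x < y < inf."""
    return lam**2 * math.exp(-lam * y) if 0 < x < y else 0.0

def fX_numeric(x, upper=50.0, n=20_000):
    """Midpoint-rule approximation of the marginal f_X(x) = integral of f over y."""
    dy = (upper - x) / n
    return sum(f(x, x + (k + 0.5) * dy) for k in range(n)) * dy

x = 0.7
print(fX_numeric(x))               # close to the analytic answer below
print(lam * math.exp(-lam * x))    # analytic marginal: lam * e^{-lam x}
```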