Chapter 5 Expectation

5.1 Introduction

Def 1 The expectation (mean), E[X] or µ_X, of a random variable X is defined by

E[X] = µ_X = Σ_i x_i p_X(x_i) if X is discrete, and E[X] = µ_X = ∫_{−∞}^{∞} x f(x) dx if X is continuous,

provided that the relevant sum or integral is absolutely convergent, i.e., Σ_i |x_i| p_X(x_i) < ∞ and ∫_{−∞}^{∞} |x| f(x) dx < ∞. Otherwise, E[X] does not exist.

Example 1 A lot contains 4 good components and 3 defective components. A sample of 3 is taken by an inspector. Find the expected value of the number of good components in this sample.

Let X denote the number of good components in the sample. Then

p_X(x) = C(4, x) C(3, 3−x) / C(7, 3), x = 0, 1, 2, 3,

where C(n, x) denotes the binomial coefficient. That is,

x      | 0     1      2      3
p_X(x) | 1/35  12/35  18/35  4/35

Hence, E[X] = 0·(1/35) + 1·(12/35) + 2·(18/35) + 3·(4/35) = 12/7 ≈ 1.7.

Example 2 Let X be the random variable that denotes the life in hours of a certain electronic device. The pdf is given by

f(x) = 20,000/x³ if x > 100, and 0 otherwise.

Find the expected life of this device.

E[X] = ∫_{100}^{∞} x · (20,000/x³) dx = 20,000 ∫_{100}^{∞} dx/x² = 20,000/100 = 200 hours.

Example 3 Let X be a discrete random variable with pmf

p_X(x) = 1/(x(x+1)) if x = 1, 2, ..., and 0 otherwise.

Show that E[X] is undefined.

pf) Σ_{x=1}^{∞} x · 1/(x(x+1)) = Σ_{x=1}^{∞} 1/(x+1) = ∞. Hence, E[X] is undefined.

5.2 Expectation of a Function of a Random Variable

Let X be a random variable, and let Y = g(X) be a real-valued function. Then

E[Y] = E[g(X)] = Σ_i g(x_i) p_X(x_i) if X is discrete, and ∫_{−∞}^{∞} g(x) f_X(x) dx if X is continuous.
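These expectations are easy to verify numerically. The following minimal sketch (plain Python; the variable names are ours, not part of the notes) rebuilds the pmf of Example 1 with exact fractions and confirms E[X] = 12/7.

```python
from fractions import Fraction
from math import comb

# pmf of Example 1: X = number of good components among 3 drawn
# from 4 good and 3 defective, p_X(x) = C(4,x)C(3,3-x)/C(7,3).
pmf = {x: Fraction(comb(4, x) * comb(3, 3 - x), comb(7, 3)) for x in range(4)}
assert sum(pmf.values()) == 1

EX = sum(x * p for x, p in pmf.items())
print(EX)   # 12/7, about 1.71 good components
```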

Example 4 Suppose that the number of cars, X, that pass through a car wash between 4:00 P.M. and 5:00 P.M. on any sunny Friday has the following probability distribution:

x        | 4     5     6    7    8    9
P(X = x) | 1/12  1/12  1/4  1/4  1/6  1/6

Let g(X) = 2X − 1 represent the amount of money, in dollars, paid to the attendant by the manager. Find the attendant's expected earnings for this particular time period.

E[g(X)] = E[2X − 1] = Σ_{x=4}^{9} (2x − 1) p_X(x) = $12.67.

Example 5 Let X be a random variable with pdf

f(x) = x²/3 if −1 < x < 2, and 0 otherwise.

Find the expected value of g(X) = 4X + 3.

E[g(X)] = E[4X + 3] = ∫_{−1}^{2} (4x + 3)(x²/3) dx = (1/3) ∫_{−1}^{2} (4x³ + 3x²) dx = (1/3)[x⁴ + x³]_{−1}^{2} = 8.

5.3 Moments

Def 2
1. E[X^k] is defined to be the kth moment of a random variable X.
2. E[(X − E[X])^k] (= µ_k) is defined to be the kth central moment of a random variable X.

Def 3 The variance of a random variable X is

Var[X] = σ_X² = E[(X − E[X])²] = E[(X − µ_X)²] = Σ_i (x_i − µ_X)² p_X(x_i) if X is discrete, and ∫_{−∞}^{∞} (x − µ_X)² f_X(x) dx if X is continuous.
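The integral E[g(X)] = ∫ g(x) f_X(x) dx can likewise be checked by numerical integration. A minimal sketch of Example 5, assuming SciPy is available:

```python
from scipy.integrate import quad

f = lambda x: x**2 / 3          # pdf of Example 5 on (-1, 2)
g = lambda x: 4 * x + 3

val, _ = quad(lambda x: g(x) * f(x), -1, 2)
print(val)                      # 8.0, matching E[4X + 3] = 8
```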

The square root, σ_X, is called the standard deviation of the random variable X.

Example 6 Let X ~ B(n, p). Find E[X] and Var[X].

E[X] = Σ_{x=0}^{n} x C(n, x) p^x q^{n−x} = np Σ_{x=1}^{n} C(n−1, x−1) p^{x−1} q^{(n−1)−(x−1)} = np(p + q)^{n−1} = np.

Var[X] = Σ_{x=0}^{n} (x − np)² C(n, x) p^x q^{n−x}
= Σ_{x=0}^{n} (x² − 2npx + n²p²) C(n, x) p^x q^{n−x}
= Σ_{x=0}^{n} x² C(n, x) p^x q^{n−x} − 2np Σ_{x=0}^{n} x C(n, x) p^x q^{n−x} + n²p²
= (npq + n²p²) − 2n²p² + n²p² = npq.

Example 7 Let X ~ Exp(λ). Find E[X] and Var[X].

E[X] = ∫_0^∞ x λ e^{−λx} dx = 1/λ.
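A quick Monte Carlo check of Example 6, assuming NumPy is available; the choice n = 20, p = 0.3 is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 0.3
x = rng.binomial(n, p, size=200_000)

print(x.mean(), n * p)               # sample mean vs np = 6.0
print(x.var(), n * p * (1 - p))      # sample variance vs npq = 4.2
```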

5 V ar[x] λ ( 0 0 Γ(3) λ 2 x λ) 2 λe λx dx x 2 e λx dx 2 xe λx dx + e λx dx 0 λ 0 2Γ(2) λ 2 + Γ() λ 2 λ Expectation of Functions of Multiple Random Variables Let X,...,X n be n random variables defined on the same probability space, and let Y g(x,...,x n ) be a real-valued function. Then E[Y ] E[g(X,...,X n )] g(x,...,x n )p(x,...,x n ) x x n g(x,...,x n )f(x,...,x n )dx dx n discrete case continuous case Example 8 A box contains 3 balls numbered,2 and 3. Now, select two ball without replacement from the box. Let X, Y denote the number of the first and the second balls, respectively. Find E[X + Y ] and E[XY ]. The jpmf of X,Y is as follows: Y X E[X + Y ] [( + 2) + ( + 3) + (2 + ) + (2 + 3) + (3 + ) + (3 + 2)] 6 4 5

E[XY] = (1/6)[(1·2) + (1·3) + (2·1) + (2·3) + (3·1) + (3·2)] = 11/3.

Example 9 Find E[Y/X] for the pdf

f(x, y) = x(1 + 3y²)/4 if 0 < x < 2, 0 < y < 1, and 0 otherwise.

E[Y/X] = ∫_0^1 ∫_0^2 (y/x) · x(1 + 3y²)/4 dx dy = ∫_0^1 (y + 3y³)/2 dy = 5/8.

Important Properties of Expectation

Linearity
E1. E[k] = k for any constant k.
E2. E[a_1 X_1 + a_2 X_2 + ··· + a_n X_n] = a_1 E[X_1] + a_2 E[X_2] + ··· + a_n E[X_n], where the a_i's are any constants and the X_i's are any random variables.

Example 10 Let X, Y be two continuous random variables. Show that E[X + Y] = E[X] + E[Y].

pf)
E[X + Y] = ∫∫ (x + y) f(x, y) dx dy
= ∫ x ( ∫ f(x, y) dy ) dx + ∫ y ( ∫ f(x, y) dx ) dy
= ∫ x f_X(x) dx + ∫ y f_Y(y) dy
= E[X] + E[Y].
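Example 8 is small enough to verify by brute-force enumeration (plain Python with exact fractions; helper names ours):

```python
from fractions import Fraction
from itertools import permutations

# All ordered draws of two distinct balls from {1, 2, 3}, each
# with probability 1/6, as in Example 8.
outcomes = list(permutations([1, 2, 3], 2))
p = Fraction(1, len(outcomes))

E_sum = sum((x + y) * p for x, y in outcomes)
E_prod = sum(x * y * p for x, y in outcomes)
print(E_sum, E_prod)    # 4 and 11/3
```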

Independence
E3. If random variables X_1, ..., X_n are independent, then E[Π_{i=1}^{n} X_i] = Π_{i=1}^{n} E[X_i].

Example 11 Let X and Y be two independent continuous random variables. Show that E[XY] = E[X]E[Y].

pf)
E[XY] = ∫∫ xy f(x, y) dy dx = ∫ x f_X(x) ( ∫ y f_Y(y) dy ) dx = E[X]E[Y].

Remark E[XY] = E[X]E[Y] does not imply that X and Y are independent.

Example 12 Suppose that the possible outcomes of (X, Y) are (1, 0), (−1, 0), (0, 1), and (0, −1), which are assumed to occur with the same probability. Show that E[XY] = E[X]E[Y], but X and Y are not independent.

pf)
E[XY] = (1/4)(1·0 + (−1)·0 + 0·1 + 0·(−1)) = 0,
E[X] = (1/4)(1 + (−1) + 0 + 0) = 0,
E[Y] = (1/4)(0 + 0 + 1 + (−1)) = 0.
Hence E[XY] = E[X]E[Y]. However,
P(X = 0, Y = 0) = 0 ≠ 1/4 = P(X = 0)P(Y = 0).
Hence, X and Y are not independent.

Covariance

We are now interested in finding Var[X + Y].

Var[X + Y] = E[((X + Y) − E[X + Y])²]
= E[((X + Y) − E[X] − E[Y])²]
= E[(X − E[X])² + (Y − E[Y])² + 2(X − E[X])(Y − E[Y])]
= E[(X − E[X])²] + E[(Y − E[Y])²] + 2E[(X − E[X])(Y − E[Y])]
= Var[X] + Var[Y] + 2E[(X − E[X])(Y − E[Y])].

Define

Cov(X, Y) = E[(X − E[X])(Y − E[Y])].

We have

Var[X + Y] = Var[X] + Var[Y] + 2Cov(X, Y).

Example 13 Let X, Y be two independent random variables. Show that Cov(X, Y) = 0.

pf)
Cov(X, Y) = E[(X − E[X])(Y − E[Y])]
= E[XY − X E[Y] − Y E[X] + E[X]E[Y]]
= E[XY] − E[X]E[Y] − E[Y]E[X] + E[X]E[Y]
= E[XY] − E[X]E[Y].
Since X, Y are independent, we have E[XY] = E[X]E[Y]. Hence Cov(X, Y) = 0.

Remark Cov(X, Y) = 0 does not imply that X and Y are independent.

E4. Cov(X, X) = Var[X].
E5. Var[X] = E[X²] − (E[X])².
pf) Var[X] = Cov(X, X) = E[X·X] − E[X]E[X] = E[X²] − (E[X])².
E6. Cov(X, Y) = Cov(Y, X).
E7. If X, Y are independent, Var[X + Y] = Var[X] + Var[Y]. More generally, if X_1, ..., X_n are independent, Var[X_1 + ··· + X_n] = Var[X_1] + ··· + Var[X_n].
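The decomposition Var[X + Y] = Var[X] + Var[Y] + 2Cov(X, Y) can be confirmed on simulated data (NumPy assumed; the particular dependent pair below is our own construction):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)   # deliberately correlated with x

lhs = np.var(x + y)
rhs = np.var(x) + np.var(y) + 2 * np.cov(x, y, bias=True)[0, 1]
print(lhs, rhs)                          # agree up to sampling error
```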

E8. Cov(Σ_{i=1}^{m} α_i X_i, Σ_{j=1}^{n} β_j Y_j) = Σ_{i=1}^{m} Σ_{j=1}^{n} α_i β_j Cov(X_i, Y_j).

pf)
Cov(Σ_i α_i X_i, Σ_j β_j Y_j) = E[(Σ_i α_i X_i)(Σ_j β_j Y_j)] − E[Σ_i α_i X_i] E[Σ_j β_j Y_j]
= E[Σ_i Σ_j α_i β_j X_i Y_j] − Σ_i Σ_j α_i β_j E[X_i]E[Y_j]
= Σ_i Σ_j α_i β_j E[X_i Y_j] − Σ_i Σ_j α_i β_j E[X_i]E[Y_j]
= Σ_i Σ_j α_i β_j (E[X_i Y_j] − E[X_i]E[Y_j])
= Σ_i Σ_j α_i β_j Cov(X_i, Y_j).

E9. Var[Σ_{i=1}^{n} α_i X_i] = Σ_{i=1}^{n} α_i² Var[X_i] + Σ_{i≠j} α_i α_j Cov(X_i, X_j).

Example 14 Let X ~ Γ(α, λ). Find E[X], E[X^m], and Var[X].

E[X^m] = ∫_0^∞ x^m (λ^α/Γ(α)) x^{α−1} e^{−λx} dx = (λ^α/Γ(α)) ∫_0^∞ x^{m+α−1} e^{−λx} dx = (λ^α/Γ(α)) · Γ(m+α)/λ^{m+α} = α(α+1)···(α+m−1)/λ^m.

E[X] = α/λ, E[X²] = α(α+1)/λ², and Var[X] = E[X²] − (E[X])² = α/λ².

Example 15 Let U_i ~ U(0, 1), i = 1, 2, ..., n, be independent random variables. Let

X = min(U_1, ..., U_n), Y = max(U_1, ..., U_n).

Find E[X^m], E[Y^m], E[X], E[Y], Var[X], Var[Y], Cov(X, Y).

From Chapter 4, we know

f(x, y) = n(n−1)(y−x)^{n−2} if 0 < x ≤ y < 1, and 0 otherwise;
f_X(x) = n(1−x)^{n−1} if 0 < x < 1, and 0 otherwise;
f_Y(y) = n y^{n−1} if 0 < y < 1, and 0 otherwise.

E[X^m] = n ∫_0^1 x^m (1−x)^{n−1} dx = n Γ(m+1)Γ(n)/Γ(m+n+1) = m! n! / (m+n)!.

E[X] = 1/(n+1), E[X²] = 2/((n+1)(n+2)), Var[X] = E[X²] − (E[X])² = n/((n+1)²(n+2)).

E[Y^m] = n ∫_0^1 y^m y^{n−1} dy = n ∫_0^1 y^{m+n−1} dy = n/(m+n).

E[Y] = n/(n+1), E[Y²] = n/(n+2), Var[Y] = E[Y²] − (E[Y])² = n/((n+1)²(n+2)).

E[XY] = n(n−1) ∫_0^1 ∫_0^y xy(y−x)^{n−2} dx dy = 1/(n+2).

Cov(X, Y) = E[XY] − E[X]E[Y] = 1/(n+2) − n/(n+1)² = 1/((n+1)²(n+2)).

Example 16 Suppose that E[X] = 2, E[Y] = 1, E[Z] = 3, Var[X] = 4, Var[Y] = 1, Var[Z] = 5, Cov(X, Y) = −2, Cov(X, Z) = 0, Cov(Y, Z) = 2. Let U = 3X − 2Y + Z, V = X + Y − 2Z. Find
1. E[U];
2. Var[U];
3. Cov(U, V).

1. E[U] = E[3X − 2Y + Z] = 3E[X] − 2E[Y] + E[Z] = 6 − 2 + 3 = 7.

2. Var[U] = 3² Var[X] + (−2)² Var[Y] + Var[Z] − 12Cov(X, Y) − 4Cov(Y, Z) + 6Cov(Z, X)
= 36 + 4 + 5 − 12(−2) − 4(2) + 6(0) = 61.

3. Cov(U, V) = Cov(3X − 2Y + Z, X + Y − 2Z)
= 3Cov(X, X) + 3Cov(X, Y) − 6Cov(X, Z) − 2Cov(Y, X) − 2Cov(Y, Y) + 4Cov(Y, Z) + Cov(Z, X) + Cov(Z, Y) − 2Cov(Z, Z)
= 12 − 6 − 0 + 4 − 2 + 8 + 0 + 2 − 10 = 8.
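The formulas of Example 15 are easy to corroborate by simulation (NumPy assumed; n = 5 is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 5, 200_000
u = rng.random((reps, n))
x, y = u.min(axis=1), u.max(axis=1)        # X = min, Y = max

print(x.mean(), 1 / (n + 1))               # E[X] = 1/(n+1)
print(y.mean(), n / (n + 1))               # E[Y] = n/(n+1)
print(np.cov(x, y, bias=True)[0, 1],
      1 / ((n + 1) ** 2 * (n + 2)))        # Cov(X, Y)
```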

Correlation Coefficient

Theorem 1 (Schwarz Inequality) Let X, Y be two random variables. Then

(E[XY])² ≤ E[X²]E[Y²].

pf) For any real λ,

0 ≤ E[(X − λY)²] = λ²E[Y²] − 2λE[XY] + E[X²].

We now find λ to minimize E[(X − λY)²] by setting the corresponding derivative to zero:

d E[(X − λY)²]/dλ = 2λE[Y²] − 2E[XY] = 0  ⟹  λ = E[XY]/E[Y²] =: a.

Hence,

0 ≤ E[(X − aY)²] = E[X²] − (E[XY])²/E[Y²].

Equivalently, (E[XY])² ≤ E[X²]E[Y²].

E10. Cov(X, Y)² ≤ Var[X]Var[Y].

Def 4 The correlation coefficient ρ_{X,Y} of two random variables X and Y is defined by

ρ_{X,Y} = Cov(X, Y) / √(Var[X]Var[Y]).

E11. |ρ_{X,Y}| ≤ 1.

E12. Let a be a positive constant, and let b be any constant. Then ρ_{X,Y} = 1 if Y = aX + b, and ρ_{X,Y} = −1 if Y = −aX + b.

pf) Consider Y = −aX + b.

Cov(X, Y) = Cov(X, −aX + b) = −aCov(X, X) + Cov(X, b) = −aVar[X],
Var[Y] = Var[−aX + b] = a²Var[X].

ρ_{X,Y} = Cov(X, Y)/√(Var[X]Var[Y]) = −aVar[X]/√(Var[X] · a²Var[X]) = −1.

Example 17 Find ρ_{X,Y} of Example 15.

Var[X] = Var[Y] = n/((n+1)²(n+2)), Cov(X, Y) = 1/((n+1)²(n+2)).

ρ_{X,Y} = Cov(X, Y)/√(Var[X]Var[Y]) = 1/n.

Example 18 Let X and Y be two independent random variables. Suppose that E[X] = 1, E[Y] = 2, Var[X] = 3, and Var[Y] = 4. Find
1. Var[3X − 2Y];
2. Var[XY];
3. ρ_{X+Y, X−Y}.

1. Var[3X − 2Y] = 3²Var[X] + 2²Var[Y] = 27 + 16 = 43.

2. Var[XY] = E[(XY)²] − (E[XY])² = E[X²]E[Y²] − (E[X]E[Y])² = (Var[X] + (E[X])²)(Var[Y] + (E[Y])²) − (E[X]E[Y])² = (3 + 1²)(4 + 2²) − (1·2)² = 32 − 4 = 28.

3. Cov(X + Y, X − Y) = Cov(X, X) − Cov(X, Y) + Cov(Y, X) − Cov(Y, Y) = Var[X] − Var[Y] = 3 − 4 = −1.
Var[X + Y] = Var[X] + Var[Y] = 7, Var[X − Y] = Var[X] + Var[Y] = 7.
ρ_{X+Y, X−Y} = −1/7.

Example 19 A box contains 3 red balls and 2 black balls. Now, select two balls without replacement from the box. Let X and Y denote the numbers of red balls and black balls, respectively. Find ρ_{X,Y}.

Method 1. The jpmf of X, Y is as follows:

(x, y)   | (0, 2)  (1, 1)  (2, 0)
p(x, y)  | 1/10    6/10    3/10

E[X] = 0·(1/10) + 1·(6/10) + 2·(3/10) = 6/5,
E[Y] = 2·(1/10) + 1·(6/10) + 0·(3/10) = 4/5,
E[XY] = 1·1·(6/10) = 3/5,
Cov(X, Y) = E[XY] − E[X]E[Y] = 3/5 − 24/25 = −9/25,
Var[X] = E[X²] − (E[X])² = 9/5 − 36/25 = 9/25,
Var[Y] = E[Y²] − (E[Y])² = 1 − 16/25 = 9/25,
ρ_{X,Y} = (−9/25)/(9/25) = −1.

Method 2. Since X + Y = 2, i.e., Y = −X + 2, we immediately have ρ_{X,Y} = −1.
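Method 1 of Example 19 can be reproduced by exhaustive enumeration (plain Python with exact fractions; helper names ours):

```python
from fractions import Fraction
from itertools import permutations

balls = ["R"] * 3 + ["B"] * 2            # 3 red, 2 black
draws = list(permutations(range(5), 2))  # 20 ordered draws w/o replacement
p = Fraction(1, len(draws))

EX = EY = EXY = EX2 = Fraction(0)
for i, j in draws:
    x = (balls[i] == "R") + (balls[j] == "R")   # number of red balls
    y = 2 - x                                   # number of black balls
    EX += x * p
    EY += y * p
    EXY += x * y * p
    EX2 += x * x * p

cov = EXY - EX * EY        # -9/25
var_x = EX2 - EX * EX      # 9/25; Var[Y] is the same by symmetry
print(EX, EY, cov, cov / var_x)   # rho = Cov/Var[X] = -1
```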

5.5 Conditional Expectation

Def 5 Let X and Y be two random variables. The conditional expectation of Y given X = x, denoted by E[Y|X = x] or E[Y|x] (= µ_{Y|x}), is defined by

E[Y|x] = Σ_i y_i p_{Y|X}(y_i|x) if Y is discrete, and ∫_{−∞}^{∞} y f_{Y|X}(y|x) dy if Y is continuous.

Remark
1. E[Y|x] is a function of x. Hence, E[E[Y|X]] is a real number.
2. The conditional variance of a random variable Y given that X = x is defined by Var[Y|x] = E[(Y − µ_{Y|x})²|x] = E[Y²|x] − (E[Y|x])².

Example 20 Let random variables X, Y have jpdf

f(x, y) = (2/3)(2x + y) if 0 < x < 1, 0 < y < 1, and 0 otherwise.

Find E[Y|X = 0.5] and Var[Y|X = 0.5].

f_X(x) = ∫_0^1 (2/3)(2x + y) dy = (1 + 4x)/3, 0 < x < 1,
f_{Y|X}(y|x) = f(x, y)/f_X(x) = 2(2x + y)/(1 + 4x), 0 < x < 1, 0 < y < 1,
f_{Y|X}(y|0.5) = (2/3)(y + 1), 0 < y < 1.

E[Y|0.5] = ∫_0^1 y f_{Y|X}(y|0.5) dy = (2/3) ∫_0^1 y(y + 1) dy = 5/9,
E[Y²|0.5] = (2/3) ∫_0^1 y²(y + 1) dy = 7/18,
Var[Y|0.5] = E[Y²|0.5] − (E[Y|0.5])² = 7/18 − 25/81 = 13/162.
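A numerical check of Example 20, assuming SciPy is available: integrating y and y² against f_{Y|X}(y|0.5) recovers the two conditional moments.

```python
from scipy.integrate import quad

f_cond = lambda y: 2 * (1 + y) / 3          # f_{Y|X}(y | 0.5) on (0, 1)

m1, _ = quad(lambda y: y * f_cond(y), 0, 1)
m2, _ = quad(lambda y: y * y * f_cond(y), 0, 1)
print(m1, m2 - m1**2)   # 5/9 ≈ 0.5556 and 13/162 ≈ 0.0802
```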

E13. E[E[Y|X]] = E[Y].

pf)
E[E[Y|X]] = ∫ µ_{Y|x} f_X(x) dx = ∫ ( ∫ y f_{Y|X}(y|x) dy ) f_X(x) dx = ∫∫ y (f(x, y)/f_X(x)) f_X(x) dy dx = ∫ y ( ∫ f(x, y) dx ) dy = ∫ y f_Y(y) dy = E[Y].

5.6 Moment Generating Functions

Def 6 The moment generating function M_X(t) of a random variable X is defined by

M_X(t) = E[e^{Xt}].

The domain of M_X(t) is all real numbers t such that e^{Xt} has finite expectation.

Example 21 Let X ~ B(n, p). Find M_X(t).

M_X(t) = E[e^{Xt}] = Σ_{x=0}^{n} e^{xt} C(n, x) p^x q^{n−x} = Σ_{x=0}^{n} C(n, x) (pe^t)^x q^{n−x} = (pe^t + q)^n.

Example 22 Let X ~ Γ(α, λ). Find M_X(t).

M_X(t) = E[e^{Xt}] = ∫_0^∞ e^{xt} (λ^α/Γ(α)) x^{α−1} e^{−λx} dx = (λ^α/Γ(α)) ∫_0^∞ x^{α−1} e^{−(λ−t)x} dx

= (λ^α/Γ(α)) · Γ(α)/(λ − t)^α = (λ/(λ − t))^α = (1 − t/λ)^{−α}, t < λ.

Facts
1. M_X(t) = E[e^{Xt}] = E[1 + Xt/1! + X²t²/2! + X³t³/3! + ···] = 1 + E[X]t/1! + E[X²]t²/2! + E[X³]t³/3! + ··· = Σ_{k=0}^{∞} E[X^k] t^k / k!.

2. M′_X(t) = E[X] + Σ_{k=2}^{∞} E[X^k] t^{k−1}/(k−1)!, so E[X] = M′_X(0).

3. M″_X(t) = E[X²] + Σ_{k=3}^{∞} E[X^k] t^{k−2}/(k−2)!, so E[X²] = M″_X(0) and Var[X] = M″_X(0) − (M′_X(0))².

4. E[X^k] = M_X^{(k)}(0).

Example 23 Use moment generating functions to find the means and variances of the following random variables:
1. X ~ B(n, p);
2. Y ~ Γ(α, λ).

1. M_X(t) = (pe^t + q)^n,
M′_X(t) = n(pe^t + q)^{n−1} pe^t,
M″_X(t) = n(n−1)(pe^t + q)^{n−2} (pe^t)² + n(pe^t + q)^{n−1} pe^t.
E[X] = M′_X(0) = np,
E[X²] = M″_X(0) = n(n−1)p² + np,
Var[X] = E[X²] − (E[X])² = np(1 − p) = npq.

2. M_Y(t) = (1 − t/λ)^{−α},
M′_Y(t) = (α/λ)(1 − t/λ)^{−α−1},
M″_Y(t) = (α(α+1)/λ²)(1 − t/λ)^{−α−2}.
E[Y] = M′_Y(0) = α/λ,
E[Y²] = M″_Y(0) = α(α+1)/λ²,
Var[Y] = E[Y²] − (E[Y])² = α/λ².
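These MGF differentiations can also be carried out symbolically; a minimal sketch with SymPy (assumed available) for the binomial case of Example 23:

```python
import sympy as sp

t, n, p, q = sp.symbols("t n p q", positive=True)
M = (p * sp.exp(t) + q) ** n                 # binomial MGF

EX = sp.diff(M, t).subs(t, 0)                # first moment at t = 0
EX2 = sp.diff(M, t, 2).subs(t, 0)            # second moment at t = 0
var = sp.simplify((EX2 - EX**2).subs(q, 1 - p))
print(sp.simplify(EX.subs(q, 1 - p)), var)   # n*p and n*p*(1 - p)
```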

Summary of Important Moments of Random Variables

Distribution               | Density                                       | Mean     | Variance    | MGF
Bernoulli, X ~ B(1, p)     | p^x q^{1−x}, x = 0, 1                         | p        | pq          | pe^t + q
Binomial, X ~ B(n, p)      | C(n, x) p^x q^{n−x}, x = 0, 1, ..., n         | np       | npq         | (pe^t + q)^n
Geometric, X ~ G(p)        | p q^{x−1}, x = 1, 2, ...                      | 1/p      | (1−p)/p²    | pe^t / (1 − qe^t)
Neg. Bin., X ~ NB(r, p)    | C(x−1, r−1) p^r q^{x−r}, x = r, r+1, ...      | r/p      | r(1−p)/p²   | (pe^t / (1 − qe^t))^r
Poisson, X ~ P(α)          | α^x e^{−α} / x!, x = 0, 1, ...                | α        | α           | e^{α(e^t − 1)}
Uniform, X ~ I(1, n)       | 1/n, x = 1, 2, ..., n                         | (n+1)/2  | (n²−1)/12   | (e^t − e^{(n+1)t}) / (n(1 − e^t))
Exponential, X ~ Exp(λ)    | λ e^{−λx}, x > 0                              | 1/λ      | 1/λ²        | (1 − t/λ)^{−1}
Gamma, X ~ Γ(α, λ)         | (λ^α/Γ(α)) x^{α−1} e^{−λx}, x > 0             | α/λ      | α/λ²        | (1 − t/λ)^{−α}
Chi-Square, X ~ χ²_n       | x^{n/2−1} e^{−x/2} / (Γ(n/2) 2^{n/2}), x > 0  | n        | 2n          | (1 − 2t)^{−n/2}
Normal, X ~ N(µ, σ²)       | (1/(√(2π)σ)) e^{−(x−µ)²/(2σ²)}, x ∈ R         | µ        | σ²          | e^{µt} e^{σ²t²/2}
Uniform, X ~ U(α, β)       | 1/(β−α), α < x < β                            | (α+β)/2  | (β−α)²/12   | (e^{βt} − e^{αt}) / ((β−α)t)

Theorem 2 (Correspondence Theorem or Uniqueness Theorem) Let X_1, X_2 be two random variables. Then M_{X_1}(t) = M_{X_2}(t) for all t if and only if F_{X_1}(x) = F_{X_2}(x) for all x.

Example 24 Find (a) the corresponding distributions and (b) the means and variances for the following moment generating functions:
1. M_X(t) = (0.3 + 0.7e^t)^5;
2. M_Y(t) = e^{4(e^t − 1)};
3. M_Z(t) = 0.3e^t/(1 − 0.7e^t), t < −ln 0.7;
4. M_W(t) = e^{2t + 2t²}.

1. (a) X ~ B(5, 0.7). (b) µ_X = 3.5, σ²_X = 1.05.
2. (a) Y ~ P(4). (b) µ_Y = 4, σ²_Y = 4.
3. (a) Z ~ G(0.3). (b) µ_Z = 10/3, σ²_Z = 70/9.
4. (a) W ~ N(2, 2²). (b) µ_W = 2, σ²_W = 4.

Theorem 3
1. [Linear Translation] If Y = aX + b, where a, b are constants, then M_Y(t) = e^{bt} M_X(at).
2. [Convolution] Let X_1, X_2, ..., X_n be n independent random variables, and let Y = X_1 + X_2 + ··· + X_n. Then M_Y(t) = M_{X_1}(t) M_{X_2}(t) ··· M_{X_n}(t).

pf)
1. M_Y(t) = E[e^{Yt}] = E[e^{(aX+b)t}] = E[e^{bt} e^{X(at)}] = e^{bt} E[e^{X(at)}] = e^{bt} M_X(at).
2. M_Y(t) = E[e^{Yt}] = E[e^{(X_1 + X_2 + ··· + X_n)t}] = E[e^{X_1 t} e^{X_2 t} ··· e^{X_n t}] = E[e^{X_1 t}] E[e^{X_2 t}] ··· E[e^{X_n t}] = M_{X_1}(t) M_{X_2}(t) ··· M_{X_n}(t).

Example 25 Let X_i ~ Exp(1/(iβ)), i = 1, ..., n, be n independent random variables. Let

Y = Σ_{i=1}^{n} X_i/(ni).

1. Find E[Y].
2. Let Z = 2nY/β. Find the distribution of Z.

1. E[X_i] = iβ, so E[Y] = Σ_{i=1}^{n} E[X_i]/(ni) = Σ_{i=1}^{n} iβ/(ni) = β.

2. M_{X_i}(t) = (1 − iβt)^{−1}, so M_{X_i/(ni)}(t) = M_{X_i}(t/(ni)) = (1 − βt/n)^{−1}.
M_Y(t) = (1 − βt/n)^{−n},
M_Z(t) = M_{2nY/β}(t) = M_Y(2nt/β) = (1 − 2t)^{−n} = (1 − 2t)^{−2n/2}.
Hence, Z ~ χ²_{2n}.
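Simulation corroborates the conclusion of Example 25 (NumPy and SciPy assumed; the values of n and β are arbitrary choices):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, beta, reps = 4, 2.0, 200_000

# X_i ~ Exp(1/(i*beta)) has mean i*beta, i.e. scale = i*beta.
xi = np.stack([rng.exponential(i * beta, reps) for i in range(1, n + 1)])
y = (xi / (n * np.arange(1, n + 1)[:, None])).sum(axis=0)
z = 2 * n * y / beta

print(y.mean(), beta)                                  # E[Y] = beta
print(stats.kstest(z, "chi2", args=(2 * n,)).pvalue)   # consistent with chi2_{2n}
```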

Example 26 Let random variable X satisfy

E[X^{2k}] = (2k)!/(2^k k!) and E[X^{2k+1}] = 0 for k = 0, 1, 2, ....

Find the pdf of X.

M_X(t) = Σ_{k=0}^{∞} E[X^k] t^k/k!
= Σ_{k=0}^{∞} E[X^{2k}] t^{2k}/(2k)! + Σ_{k=0}^{∞} E[X^{2k+1}] t^{2k+1}/(2k+1)!
= Σ_{k=0}^{∞} E[X^{2k}] t^{2k}/(2k)!   (since E[X^{2k+1}] = 0)
= Σ_{k=0}^{∞} ((2k)!/(2^k k!)) · t^{2k}/(2k)!
= Σ_{k=0}^{∞} (t²/2)^k / k!
= e^{t²/2}.

Hence, X ~ N(0, 1).

Theorem 4 Let X_1, ..., X_n be independent random variables.
1. X_i ~ B(1, p) ⟹ X_1 + ··· + X_n ~ B(n, p).
2. X_i ~ B(m_i, p) ⟹ X_1 + ··· + X_n ~ B(m_1 + ··· + m_n, p).
3. X_i ~ G(p) ⟹ X_1 + ··· + X_n ~ NB(n, p).
4. X_i ~ NB(r_i, p) ⟹ X_1 + ··· + X_n ~ NB(r_1 + ··· + r_n, p).
5. X_i ~ P(α_i) ⟹ X_1 + ··· + X_n ~ P(α_1 + ··· + α_n).
6. X_i ~ Exp(λ) ⟹ X_1 + ··· + X_n ~ Γ(n, λ).
7. X_i ~ Γ(α_i, λ) ⟹ X_1 + ··· + X_n ~ Γ(α_1 + ··· + α_n, λ).
8. X_i ~ χ²_1 ⟹ X_1 + ··· + X_n ~ χ²_n.
9. X_i ~ χ²_{m_i} ⟹ X_1 + ··· + X_n ~ χ²_{m_1 + ··· + m_n}.
10. X_i ~ N(µ_i, σ_i²) ⟹ X_1 + ··· + X_n ~ N(µ_1 + ··· + µ_n, σ_1² + ··· + σ_n²).

pf) 10. X_i ~ N(µ_i, σ_i²) ⟹ M_{X_i}(t) = e^{µ_i t} e^{σ_i² t²/2}. Then

M_{X_1 + ··· + X_n}(t) = e^{(µ_1 + ··· + µ_n)t} e^{(σ_1² + ··· + σ_n²)t²/2}.

Hence, X_1 + ··· + X_n ~ N(µ_1 + ··· + µ_n, σ_1² + ··· + σ_n²).

5.7 Inequalities

Theorem 5 (Markov Inequality) Let X be a nonnegative random variable with E[X] = µ. Then, for any t > 0,

P(X ≥ t) ≤ µ/t.

pf) Define a discrete r.v. Y as follows:

Y = 0 if X < t, and Y = t if X ≥ t.

Clearly, the pmf of Y is

p_Y(y) = P(X < t) if y = 0, and P(X ≥ t) if y = t.

Since Y ≤ X,

E[Y] = 0·P(X < t) + t·P(X ≥ t) = tP(X ≥ t) ≤ E[X].

Hence, P(X ≥ t) ≤ µ/t.

Example 27 Consider a system with MTTF = 100 hours.
1. Use the Markov inequality to approximate the reliability R(t) = P(X > t) for t = 90, 100, 110 and 200.
2. Suppose that the lifetime of the system is exponentially distributed. Find the reliability R(t) for t = 90, 100, 110 and 200.
3. What is your conclusion from the above answers?

1. P(X > 90) ≤ 100/90 ≈ 1.11, P(X > 100) ≤ 100/100 = 1, P(X > 110) ≤ 100/110 ≈ 0.91, P(X > 200) ≤ 100/200 = 0.5.

2. X ~ Exp(1/100):
P(X > 90) = e^{−0.9} ≈ 0.4066, P(X > 100) = e^{−1} ≈ 0.3679, P(X > 110) = e^{−1.1} ≈ 0.3329, P(X > 200) = e^{−2} ≈ 0.1353.

3. The performance of the Markov inequality is not so good.

Theorem 6 (Chebyshev's Inequality)

P(|X − µ_X| ≥ t) ≤ σ²_X/t².

pf) Substituting (X − µ_X)² for X in the Markov inequality, we have

P((X − µ_X)² ≥ t²) ≤ E[(X − µ_X)²]/t², i.e., P(|X − µ_X| ≥ t) ≤ σ²_X/t².

Facts
P(|X − µ_X| ≥ t) ≤ σ²_X/t², P(|X − µ_X| < t) ≥ 1 − σ²_X/t²,
P(|X − µ_X| ≥ kσ_X) ≤ 1/k², P(|X − µ_X| < kσ_X) ≥ 1 − 1/k².
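The comparison in Example 27, computed directly (plain Python):

```python
from math import exp

mttf = 100.0
for t in (90, 100, 110, 200):
    markov = min(1.0, mttf / t)      # Markov upper bound on P(X > t)
    exact = exp(-t / mttf)           # exponential reliability
    print(t, round(markov, 3), round(exact, 3))
```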

Example 28 Seventy new jobs are opening up at an automobile manufacturing plant, but 1,000 applicants show up for the 70 positions. To select the best 70 from among the applicants, the company gives a test to them. The mean and standard deviation of the scores turn out to be 60 and 6, respectively. Can a person who has an 84 score count on getting one of the jobs?

Let X denote the score. Then µ_X = 60 and σ_X = 6, and 84 − µ_X = 24 = 4σ_X. By Chebyshev's inequality,

P(X − µ_X ≥ 4σ_X) ≤ P(|X − µ_X| ≥ 4σ_X) ≤ 1/4² = 1/16.

This implies that at most 1000/16 ≈ 63 applicants have scores exceeding 84. Hence, this person can get a job.

5.8 The Weak Law of Large Numbers and Central Limit Theorems

Theorem 7 (Weak Law of Large Numbers) Let X_1, ..., X_n be independent, identically distributed random variables having finite mean µ. Define

X̄ = (1/n)(X_1 + ··· + X_n).

Then for any ε > 0,

lim_{n→∞} P(|X̄ − µ| > ε) = 0.

pf) Let the variance of the population be σ². Then

Var[X̄] = Var[(1/n)(X_1 + ··· + X_n)] = (1/n²)(Var[X_1] + ··· + Var[X_n]) = nσ²/n² = σ²/n.

By Chebyshev's inequality,

P(|X̄ − µ| > ε) ≤ σ²_{X̄}/ε² = σ²/(nε²).

Clearly, lim_{n→∞} P(|X̄ − µ| > ε) = 0.

Theorem 8 (Central Limit Theorem) Let X_1, ..., X_n be independent, identically distributed random variables having mean µ and finite nonzero variance σ². Define

S_n = X_1 + ··· + X_n, X̄ = (1/n)(X_1 + ··· + X_n).

Then,

1. lim_{n→∞} S_n ~ N(nµ, nσ²); or
2. lim_{n→∞} (S_n − nµ)/(σ√n) ~ N(0, 1); or
3. lim_{n→∞} X̄ ~ N(µ, σ²/n); or
4. lim_{n→∞} (X̄ − µ)/(σ/√n) ~ N(0, 1).

pf) We show the 4th part of the theorem. Define

Z_n = (X̄ − µ)/(σ/√n) = (1/√n) Σ_{i=1}^{n} (X_i − µ)/σ = (1/√n) Σ_{i=1}^{n} Y_i, where Y_i = (X_i − µ)/σ.

Clearly,

E[Y_i] = 0, E[Y_i²] = Var[Y_i] + (E[Y_i])² = 1.

Hence,

M_{Y_i}(t) = 1 + E[Y_i]t/1! + E[Y_i²]t²/2! + E[Y_i³]t³/3! + ··· = 1 + t²/2 + E[Y_i³]t³/6 + ···,

M_{Y_i/√n}(t) = M_{Y_i}(t/√n) = 1 + t²/(2n) + E[Y_i³]t³/(6n^{3/2}) + ···,

M_{Z_n}(t) = M_{Y_1/√n}(t) M_{Y_2/√n}(t) ··· M_{Y_n/√n}(t) = [1 + t²/(2n) + E[Y_i³]t³/(6n^{3/2}) + ···]^n.

By L'Hôpital's rule, one can show (exercise)

lim_{n→∞} M_{Z_n}(t) = e^{t²/2}.

This concludes lim_{n→∞} Z_n ~ N(0, 1).

Normal Approximation

By the central limit theorem, when a sample size is sufficiently large, we can use the normal distribution to approximate certain probabilities regarding the sample or the parameters of its corresponding population.

Example 29 Suppose that the lifetime of a bulb is exponentially distributed with mean length of 10 days. As soon as one light bulb burns out, a similar one is installed in its place. Find the probability that more than 50 bulbs will be required during a one-year period.

Let X_i denote the lifetime of the ith bulb. Clearly, X_i ~ Exp(1/10). Hence, µ_{X_i} = 10 and σ²_{X_i} = 100. Define S_50 = X_1 + ··· + X_50 to be the total lifetime of 50 bulbs. Then the probability that we want to find is P(S_50 < 365). Since n = 50 > 30, we can approximate the distribution of S_50 by N(50·10, 50·100) = N(500, 5000). Hence,

P(S_50 < 365) ≈ Φ((365 − 500)/√5000) = Φ(−1.91) ≈ 0.0281.

Example 30 Let X ~ B(50, 0.3). Use the normal approximation to compute P(X ≤ 20).

Let X_i ~ B(1, 0.3), i = 1, 2, ..., 50, be independent random variables. Then X = X_1 + ··· + X_50. Clearly, µ_{X_i} = p = 0.3 and σ²_{X_i} = pq = 0.21. According to the central limit theorem, we can approximate the distribution of X by N(15, 10.5). Hence,

P(X ≤ 20) ≈ Φ((20 − 15)/√10.5) = Φ(1.543) ≈ 0.9386.
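Both approximations can be compared with the exact distributions, since S_50 in Example 29 is exactly Γ(50, 1/10) by Theorem 4 and X in Example 30 is exactly binomial (SciPy assumed):

```python
from scipy import stats

# Example 29: sum of 50 Exp(mean 10) lifetimes is Gamma(shape=50, scale=10).
print(stats.norm(500, 5000 ** 0.5).cdf(365))   # ≈ 0.0281 (normal approx.)
print(stats.gamma(a=50, scale=10).cdf(365))    # exact value

# Example 30: X ~ B(50, 0.3), P(X <= 20).
print(stats.norm(15, 10.5 ** 0.5).cdf(20))     # ≈ 0.9386 (normal approx.)
print(stats.binom(50, 0.3).cdf(20))            # exact value
```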

5.9 Exercise

1. Find E[X] and Var[X] for random variable X.
(a) X ~ B(n, p); (b) X ~ G(p); (c) X ~ NB(r, p); (d) X ~ P(α); (e) X ~ Exp(λ); (f) X ~ Γ(α, λ); (g) X ~ N(µ, σ²); (h) X ~ U(a, b).

2. Let X be a nonnegative integer-valued random variable having a finite expectation. Show that E[X] = Σ_{x=1}^{∞} P(X ≥ x).

3. Let X be a geometrically distributed random variable and let M > 0 be an integer. Set Y = max(X, M), Z = min(X, M). Find E[Y] and E[Z]. (Hint: use the fact depicted in problem 2.)

4. Let the pmf of random variable X be

p(x) = 2x/(N(N+1)) if x = 1, 2, ..., N, and 0 otherwise.

Find E[X] and Var[X].

5. Let X ~ P(α). Find E[(1 + X)^{−1}].

6. Let X be a discrete random variable with pmf

x     | −1   0    1    2
p(x)  | 1/8  1/8  1/4  1/2

(a) Find E[X], Var[X].
(b) Let Y = X². Find the pmf p_Y, E[Y] and Var[Y].

7. Suppose that we have two decks of n cards, each numbered 1, 2, ..., n. The two decks are shuffled and the cards are matched against each other. We say a match occurs at position i if the ith card from each deck has the same number. Let S_n denote the number of matches. Compute E[S_n] and Var[S_n].

8. A box has 3 balls labelled 1, 2 and 3. Two balls are selected without replacement from the box. Let X, Y denote the numbers of the 1st and 2nd balls, respectively. Compute Cov(X, Y).

9. The length of time, in minutes, for an airplane to wait for clearance to take off at a certain airport is a random variable Y = 3X − 2, where X ~ Exp(1/4). Find the mean and the variance of Y.

10. The total time, measured in units of 100 hours, that a teenager runs her stereo set over a period of one year is a continuous random variable X that has pdf

f(x) = x if 0 < x < 1, 2 − x if 1 ≤ x < 2, and 0 otherwise.

Let Y = 60X² + 39X denote the number of kilowatt hours expended annually. Compute E[Y].

11. Let X_1, ..., X_n be independent random variables having a common density with mean µ and variance σ². Let X̄ = (X_1 + ··· + X_n)/n.
(a) Show that Σ_{k=1}^{n} (X_k − X̄)² = Σ_{k=1}^{n} (X_k − µ)² − n(X̄ − µ)².
(b) Conclude from (a) that E[Σ_{k=1}^{n} (X_k − X̄)²] = (n−1)σ².

12. If a random variable X is defined such that E[(X − 1)²] = 10 and E[(X − 2)²] = 6, find µ and σ².

13. If the jpdf of random variables X and Y is given by

f(x, y) = (2/7)(x + 2y) if 0 < x < 1, 1 < y < 2, and 0 otherwise,

compute E[(X/Y³) + X²Y].

14. Toss 3 fair dice 10 times. Let X denote the number of times with the same three faces landing up, and let Y denote the number of times with exactly the same two faces landing up. Find E[XY].

15. Let X and Y be two independent random variables. Suppose that E[X⁴] = 2, E[Y²] = 1, E[X²] = 1, and E[Y] = 0. Find Var[X²Y].

16. Let X ~ χ²_n. Compute E[√X].

17. Let X ~ N(0, σ²). Compute E[|X|] and E[X²].

18. Let E[X^n] = n!. Find the pdf of X.

19. Let the pdf of X be given by

f(x) = (1/2)e^{−|x|}, x ∈ R.

Find M_X(t), E[X], and Var[X].

20. Let M_X(t) = e^{166t + 200t²}. Find (a) µ_X, σ²_X; (b) P(170 < X < 200).

21. Suppose that the moment generating function of random variable X is

M_X(t) = (1/3)(2^t + 3^t + 5^t), t ∈ R.

Find the cdf of X.

22. Let X ~ Γ(α_1, λ) and Y ~ Γ(α_2, λ) be independent random variables, and set Z = X + Y. Find the conditional expectation of X given Z.

23. Use moment generating functions to prove Theorem 4.

24. Suppose that random variable X satisfies E[X^n] = n!. Find the pdf of X.

25. Let the jpdf of X, Y be defined by

f(x, y) = (1/6)e^{−(x/2 + y/3)} if x > 0, y > 0, and 0 otherwise.

Find (a) f_X and f_Y; (b) Cov(X, Y); (c) ρ_{X,Y}; (d) ρ_{2X−5, 3Y−2}.

26. Let X_1, X_2, X_3 have the jpdf

f(x_1, x_2, x_3) = (x_1 + x_2)e^{−x_3} if 0 < x_1, x_2 < 1, x_3 > 0, and 0 otherwise.

Compute Cov(X_1, X_3).

27. A bolt manufacturer knows that 5% of his production is defective. He gives a guarantee on his shipment of 10,000 parts by promising to refund the money if more than a bolts are defective. How small can the manufacturer choose a and still be assured that he need not give a refund more than 1% of the time?

28. Suppose that we are given a random variable X with E[X] = 120 and E[X²] = 14,464. Use Chebyshev's inequality to find bounds for the probabilities
(a) P(96 ≤ X ≤ 144); (b) P(108 ≤ X ≤ 132).

29. Let X ~ P(λ). Use Chebyshev's inequality to verify

(a) P(|X − λ| ≥ λ/2) ≤ 4/λ;
(b) P(X ≥ 2λ) ≤ 1/λ.

30. A local company manufactures telephone wire. The average length of the wire is 52 inches with a standard deviation of 6.5 inches. At most, what percentage of the telephone wire from this company exceeds 71.5 inches?

31. Let random variables X, Y have the jpdf

f(x, y) = 1/20 if x ≤ y ≤ x + 2, 0 ≤ x ≤ 10, and 0 otherwise.

Find f_{Y|X}, E[Y|x], and Var[Y|x].

32. Let random variables X, Y have the jpmf

p(x, y) = (x + y)/21 if x = 1, 2 and y = 1, 2, 3, and 0 otherwise.

Find (a) f_X; (b) f_{Y|X}; (c) E[Y|x]; (d) E[E[Y|x]]; (e) E[Y²|x]; (f) Var[Y|x]; (g) E[Var[Y|x]].

33. Let X and Y have the joint probability density function

f(x, y) = 1/2, 0 ≤ x ≤ y ≤ 2.

Compute (a) f_X(x); (b) E[Y|x]; (c) Var[Y|x]; (d) ρ_{X,Y}.

34. Suppose that the random variables X and Y have jpdf

f(x, y) = 2 if 0 < x < y < 1, and 0 otherwise.

Find (a) E[X]; (b) E[X|y]; (c) Var[E[X|y]].

35. Let random variables X, Y be independent. Show that E[Y|x] = E[Y].

36. Show that E[g(X)Y|x] = g(x)E[Y|x], where g is a real-valued function.

37. Show that Var[Y] = E[Var[Y|X]] + Var[E[Y|X]].

38. Let X ~ G(p), Y ~ G(p) be independent. Find E[Y|X + Y = z].

39. Let X ~ P(λ_1), Y ~ P(λ_2) be independent. Find E[Y|X + Y = z].

40. Let X_i ~ U(0, 1), i = 1, 2, ..., n, be independent, and let

Y_1 = min(X_1, ..., X_n), Y_n = max(X_1, ..., X_n).

(a) Are Y_1 and Y_n independent?

(b) Find E[Y_n | Y_1 = y_1].

41. The amount of time that a bank teller spends on a customer is a random variable with a mean µ = 3.2 minutes and a standard deviation σ = 1.6 minutes. If a random sample of 64 customers is observed, find the probability that their mean time at the teller's counter is
(a) at most 2.7 minutes;
(b) more than 3.5 minutes;
(c) at least 3.2 minutes but less than 3.4 minutes.

42. A random sample of size 25 is taken from a normal population having a mean of 80 and a standard deviation of 5. A second random sample of size 36 is taken from a different normal population having a mean of 75 and a standard deviation of 3. Find the probability that the sample mean computed from the 25 measurements will exceed the sample mean computed from the 36 measurements by at least 3.4 but less than 5.9.
