Section 9.1. Expected Values of Sums


1 Section 9.1 Expected Values of Sums

2 Theorem 9.1 For any set of random variables X_1, \ldots, X_n, the sum W_n = X_1 + \cdots + X_n has expected value

E[W_n] = E[X_1] + E[X_2] + \cdots + E[X_n].

3 Proof: Theorem 9.1 We prove this theorem by induction on n. In Theorem 5.11, we proved E[W_2] = E[X_1] + E[X_2]. Now we assume E[W_{n-1}] = E[X_1] + \cdots + E[X_{n-1}]. Notice that W_n = W_{n-1} + X_n. Since W_n is a sum of the two random variables W_{n-1} and X_n, we know that

E[W_n] = E[W_{n-1}] + E[X_n] = E[X_1] + \cdots + E[X_{n-1}] + E[X_n].

4 Theorem 9.2 The variance of W_n = X_1 + \cdots + X_n is

Var[W_n] = \sum_{i=1}^{n} Var[X_i] + 2 \sum_{i=1}^{n-1} \sum_{j=i+1}^{n} Cov[X_i, X_j].

5 Proof: Theorem 9.2 From the definition of the variance, we can write Var[W_n] = E[(W_n - E[W_n])^2]. For convenience, let \mu_i denote E[X_i]. Since W_n = \sum_{i=1}^{n} X_i and E[W_n] = \sum_{i=1}^{n} \mu_i, we can write

Var[W_n] = E[ ( \sum_{i=1}^{n} (X_i - \mu_i) )^2 ] = E[ \sum_{i=1}^{n} \sum_{j=1}^{n} (X_i - \mu_i)(X_j - \mu_j) ] = \sum_{i=1}^{n} \sum_{j=1}^{n} Cov[X_i, X_j]. (9.2)

In terms of the random vector X = [X_1 \cdots X_n]', we see that Var[W_n] is the sum of all the elements of the covariance matrix C_X. Recognizing that Cov[X_i, X_i] = Var[X_i] and Cov[X_i, X_j] = Cov[X_j, X_i], we place the diagonal terms of C_X in one sum and the off-diagonal terms (which occur in pairs) in another sum to arrive at the formula in the theorem.

6 Theorem 9.3 When X_1, \ldots, X_n are uncorrelated,

Var[W_n] = Var[X_1] + \cdots + Var[X_n].

7 Example 9.1 Problem X_0, X_1, X_2, \ldots is a sequence of random variables with expected values E[X_i] = 0 and covariances Cov[X_i, X_j] = 0.8^{|i-j|}. Find the expected value and variance of a random variable Y_i defined as the sum of three consecutive values of the random sequence:

Y_i = X_i + X_{i-1} + X_{i-2}. (9.3)

8 Example 9.1 Solution Theorem 9.1 implies that

E[Y_i] = E[X_i] + E[X_{i-1}] + E[X_{i-2}] = 0. (9.4)

Applying Theorem 9.2, we obtain for each i,

Var[Y_i] = Var[X_i] + Var[X_{i-1}] + Var[X_{i-2}] + 2 Cov[X_i, X_{i-1}] + 2 Cov[X_i, X_{i-2}] + 2 Cov[X_{i-1}, X_{i-2}]. (9.5)

We next note that Var[X_i] = Cov[X_i, X_i] = 0.8^{|i-i|} = 1 and that

Cov[X_i, X_{i-1}] = Cov[X_{i-1}, X_{i-2}] = 0.8^1 = 0.8, Cov[X_i, X_{i-2}] = 0.8^2 = 0.64. (9.6)

Therefore,

Var[Y_i] = 3(1) + 2(0.8) + 2(0.8) + 2(0.64) = 7.48. (9.7)
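
Since (9.5)-(9.7) amount to summing every entry of the 3x3 covariance matrix of (X_i, X_{i-1}, X_{i-2}), a two-line Matlab check is possible. This is our own sketch, not from the text, and assumes only base Matlab:

C = 0.8.^abs((1:3)'-(1:3));   % covariance matrix entries 0.8^|i-j|
varY = sum(C(:))              % returns 7.48, matching (9.7)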

9 Example 9.2 Problem At a party of n ≥ 2 people, each person throws a hat in a common box. The box is shaken and each person blindly draws a hat from the box without replacement. We say a match occurs if a person draws his own hat. What are the expected value and variance of V_n, the number of matches?

10 Example 9.2 Solution Let X_i denote an indicator random variable such that

X_i = 1 if person i draws his hat; 0 otherwise. (9.8)

The number of matches is V_n = X_1 + \cdots + X_n. Note that the X_i are generally not independent. For example, with n = 2 people, if the first person draws his own hat, then the second person must also draw her own hat. Note that the ith person is equally likely to draw any of the n hats, thus P_{X_i}(1) = 1/n and E[X_i] = P_{X_i}(1) = 1/n. Since the expected value of the sum always equals the sum of the expected values,

E[V_n] = E[X_1] + \cdots + E[X_n] = n(1/n) = 1. (9.9)

To find the variance of V_n, we will use Theorem 9.2. The variance of X_i is

Var[X_i] = E[X_i^2] - (E[X_i])^2 = 1/n - 1/n^2. (9.10)

To find Cov[X_i, X_j], we observe that

Cov[X_i, X_j] = E[X_i X_j] - E[X_i] E[X_j]. (9.11)

[Continued]

11 Example 9.2 Solution (Continued 2) Note that X_i X_j = 1 if and only if X_i = 1 and X_j = 1, and X_i X_j = 0 otherwise. Thus

E[X_i X_j] = P_{X_i,X_j}(1,1) = P_{X_i|X_j}(1|1) P_{X_j}(1). (9.12)

Given X_j = 1, that is, the jth person drew his own hat, then X_i = 1 if and only if the ith person draws his own hat from the n - 1 other hats. Hence P_{X_i|X_j}(1|1) = 1/(n-1) and

E[X_i X_j] = 1/(n(n-1)), Cov[X_i, X_j] = 1/(n(n-1)) - 1/n^2. (9.13)

Finally, we can use Theorem 9.2 to calculate

Var[V_n] = n Var[X_i] + n(n-1) Cov[X_i, X_j] = 1. (9.14)

That is, both the expected value and variance of V_n are 1, no matter how large n is!
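
The striking result E[V_n] = Var[V_n] = 1 is easy to check by simulation. The following Monte Carlo sketch is ours (the names n, m, V are arbitrary); it models the hat assignment with Matlab's randperm:

% Monte Carlo check of the hat-matching example (without replacement)
n = 10; m = 100000;
V = zeros(1,m);
for t = 1:m
  V(t) = sum(randperm(n) == (1:n));  % matches in a random permutation
end
[mean(V) var(V)]                     % both should be close to 1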

12 Example 9.3 Problem Continuing Example 9.2, suppose each person immediately returns to the box the hat that he or she drew. What are the expected value and variance of V_n, the number of matches?

13 Example 9.3 Solution In this case the indicator random variables X_i are independent and identically distributed (iid) because each person draws from the same bin containing all n hats. The number of matches V_n = X_1 + \cdots + X_n is the sum of n iid random variables. As before, the expected value of V_n is

E[V_n] = n E[X_i] = 1. (9.15)

In this case, the variance of V_n equals the sum of the variances,

Var[V_n] = n Var[X_i] = n (1/n - 1/n^2) = 1 - 1/n. (9.16)
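
For this with-replacement variant, randi replaces randperm in the simulation sketch above (again our own check, not from the text):

% Monte Carlo check of Example 9.3 (drawing with replacement)
n = 10; m = 100000;
V = sum(randi(n,n,m) == (1:n)', 1);  % column t holds one party's n draws
[mean(V) var(V)]                     % approximately 1 and 1 - 1/n = 0.9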

14 Quiz 9.1 Let W_n denote the sum of n independent throws of a fair four-sided die. Find the expected value and variance of W_n.

15 Quiz 9.1 Solution Let K_1, \ldots, K_n denote a sequence of iid random variables each with PMF

P_K(k) = 1/4 for k = 1, \ldots, 4; 0 otherwise. (1)

We can write W_n = K_1 + \cdots + K_n. First, we note that the first two moments of K_i are

E[K_i] = (1 + 2 + 3 + 4)/4 = 2.5, (2)
E[K_i^2] = (1 + 4 + 9 + 16)/4 = 7.5. (3)

Thus the variance of K_i is

Var[K_i] = E[K_i^2] - (E[K_i])^2 = 7.5 - (2.5)^2 = 1.25. (4)

Since E[K_i] = 2.5, the expected value of W_n is

E[W_n] = E[K_1] + \cdots + E[K_n] = 2.5n. (5)

Since the rolls are independent, the random variables K_1, \ldots, K_n are independent. Hence, by Theorem 9.3, the variance of the sum equals the sum of the variances. That is,

Var[W_n] = Var[K_1] + \cdots + Var[K_n] = 1.25n. (6)
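
A quick simulation corroborates (5) and (6); this sketch is ours, with arbitrary choices of n and m:

% Simulate W_n, the sum of n fair four-sided die throws
n = 8; m = 100000;
W = sum(randi(4,n,m), 1);   % each column is one experiment
[mean(W) var(W)]            % approximately 2.5n = 20 and 1.25n = 10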

16 Section 9.2 Moment Generating Functions

17 Definition 9.1 Moment Generating Function (MGF) For a random variable X, the moment generating function (MGF) of X is

\phi_X(s) = E[e^{sX}].

18 Theorem 9.4 A random variable X with MGF \phi_X(s) has nth moment

E[X^n] = d^n \phi_X(s)/ds^n |_{s=0}.

19 Proof: Theorem 9.4 The first derivative of \phi_X(s) is

d\phi_X(s)/ds = d/ds ( \int_{-\infty}^{\infty} e^{sx} f_X(x) dx ) = \int_{-\infty}^{\infty} x e^{sx} f_X(x) dx. (9.19)

Evaluating this derivative at s = 0 proves the theorem for n = 1:

d\phi_X(s)/ds |_{s=0} = \int_{-\infty}^{\infty} x f_X(x) dx = E[X]. (9.20)

Similarly, the nth derivative of \phi_X(s) is

d^n \phi_X(s)/ds^n = \int_{-\infty}^{\infty} x^n e^{sx} f_X(x) dx. (9.21)

The integral evaluated at s = 0 is the formula in the theorem statement.

20 Example 9.4 Problem X is an exponential random variable with MGF \phi_X(s) = \lambda/(\lambda - s). What are the first and second moments of X? Write a general expression for the nth moment.

21 Example 9.4 Solution The first moment is the expected value:

E[X] = d\phi_X(s)/ds |_{s=0} = \lambda/(\lambda - s)^2 |_{s=0} = 1/\lambda. (9.22)

The second moment of X is the mean square value:

E[X^2] = d^2\phi_X(s)/ds^2 |_{s=0} = 2\lambda/(\lambda - s)^3 |_{s=0} = 2/\lambda^2. (9.23)

Proceeding in this way, it should become apparent that the nth moment of X is

E[X^n] = d^n\phi_X(s)/ds^n |_{s=0} = n!\lambda/(\lambda - s)^{n+1} |_{s=0} = n!/\lambda^n. (9.24)
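
The pattern E[X^n] = n!/\lambda^n can be checked numerically. The sketch below is ours; it generates exponential (\lambda) samples by the inverse-transform method, x = -ln(U)/\lambda, so it needs no toolbox:

% Check E[X^k] = k!/lambda^k for an exponential (lambda) random variable
lambda = 2; m = 1e6; k = 3;
x = -log(rand(1,m))/lambda;          % exponential samples via inverse transform
[mean(x.^k) factorial(k)/lambda^k]   % the two values should nearly agree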

22 Theorem 9.5 The MGF of Y = aX + b is

\phi_Y(s) = e^{sb} \phi_X(as).

23 Proof: Theorem 9.5 From the definition of the MGF,

\phi_Y(s) = E[e^{s(aX+b)}] = e^{sb} E[e^{(as)X}] = e^{sb} \phi_X(as). (9.25)

24 Quiz 9.2 Random variable K has PMF

P_K(k) = 0.2 for k = 0, \ldots, 4; 0 otherwise. (9.26)

Use \phi_K(s) to find the first, second, third, and fourth moments of K.

25 Quiz 9.2 Solution The MGF of K is

\phi_K(s) = E[e^{sK}] = \sum_{k=0}^{4} (1/5) e^{sk} = (1 + e^s + e^{2s} + e^{3s} + e^{4s})/5. (1)

We find the moments by taking derivatives. The first derivative of \phi_K(s) is

d\phi_K(s)/ds = (e^s + 2e^{2s} + 3e^{3s} + 4e^{4s})/5. (2)

Evaluating the derivative at s = 0 yields

E[K] = d\phi_K(s)/ds |_{s=0} = (1 + 2 + 3 + 4)/5 = 2. (3)

[Continued]

26 Quiz 9.2 Solution (Continued 2) To find higher-order moments, we continue to take derivatives:

E[K^2] = d^2\phi_K(s)/ds^2 |_{s=0} = (e^s + 4e^{2s} + 9e^{3s} + 16e^{4s})/5 |_{s=0} = 6. (4)

E[K^3] = d^3\phi_K(s)/ds^3 |_{s=0} = (e^s + 8e^{2s} + 27e^{3s} + 64e^{4s})/5 |_{s=0} = 20. (5)

E[K^4] = d^4\phi_K(s)/ds^4 |_{s=0} = (e^s + 16e^{2s} + 81e^{3s} + 256e^{4s})/5 |_{s=0} = 70.8. (6)
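
Because K has a finite range, each moment is also a direct PMF sum, E[K^n] = \sum_k k^n P_K(k). Here is a one-line Matlab cross-check of the derivative calculations (our sketch):

k = 0:4; pk = 0.2*ones(1,5);
moments = arrayfun(@(n) sum(k.^n .* pk), 1:4)   % returns [2 6 20 70.8]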

27 Section 9.3 MGF of the Sum of Independent Random Variables

28 Theorem 9.6 For a set of independent random variables X_1, \ldots, X_n, the moment generating function of W = X_1 + \cdots + X_n is

\phi_W(s) = \phi_{X_1}(s) \phi_{X_2}(s) \cdots \phi_{X_n}(s).

When X_1, \ldots, X_n are iid, each with MGF \phi_{X_i}(s) = \phi_X(s),

\phi_W(s) = [\phi_X(s)]^n.

29 Proof: Theorem 9.6 From the definition of the MGF,

\phi_W(s) = E[e^{s(X_1 + \cdots + X_n)}] = E[e^{sX_1} e^{sX_2} \cdots e^{sX_n}]. (9.28)

Here, we have the expected value of a product of functions of independent random variables. Theorem 8.4 states that this expected value is the product of the individual expected values:

E[g_1(X_1) g_2(X_2) \cdots g_n(X_n)] = E[g_1(X_1)] E[g_2(X_2)] \cdots E[g_n(X_n)]. (9.29)

By Equation (9.29) with g_i(X_i) = e^{sX_i}, the expected value of the product is

\phi_W(s) = E[e^{sX_1}] E[e^{sX_2}] \cdots E[e^{sX_n}] = \phi_{X_1}(s) \phi_{X_2}(s) \cdots \phi_{X_n}(s). (9.30)

When X_1, \ldots, X_n are iid, \phi_{X_i}(s) = \phi_X(s) and thus \phi_W(s) = (\phi_X(s))^n.

30 Example 9.5 Problem J and K are independent random variables with probability mass functions

j        1    2    3
P_J(j)   0.2  0.6  0.2

k        -1   1
P_K(k)   0.5  0.5   (9.31)

Find the MGF of M = J + K. What are P_M(m) and E[M^3]?

31 Example 9.5 Solution J and K have moment generating functions

\phi_J(s) = 0.2e^s + 0.6e^{2s} + 0.2e^{3s}, \phi_K(s) = 0.5e^{-s} + 0.5e^{s}. (9.32)

Therefore, by Theorem 9.6, M = J + K has MGF

\phi_M(s) = \phi_J(s)\phi_K(s) = 0.1 + 0.3e^s + 0.2e^{2s} + 0.3e^{3s} + 0.1e^{4s}. (9.33)

The value of P_M(m) at any value of m is the coefficient of e^{ms} in \phi_M(s): the constant 0.1 is P_M(0), and the coefficients of e^s, e^{2s}, e^{3s}, e^{4s} are P_M(1), P_M(2), P_M(3), P_M(4). From the coefficients of \phi_M(s), we construct the table for the PMF of M:

m        0    1    2    3    4
P_M(m)   0.1  0.3  0.2  0.3  0.1

To find the third moment of M, we differentiate \phi_M(s) three times:

E[M^3] = d^3\phi_M(s)/ds^3 |_{s=0} = 0.3e^s + 0.2(2^3)e^{2s} + 0.3(3^3)e^{3s} + 0.1(4^3)e^{4s} |_{s=0} = 16.4. (9.34)
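
Since multiplying MGFs corresponds to convolving PMFs, Matlab's conv reproduces P_M(m) directly. In this sketch (ours), the PMF of K is written over the consecutive support {-1, 0, 1} with a zero at k = 0:

pj = [0.2 0.6 0.2];       % P_J(j), j = 1,2,3
pk = [0.5 0 0.5];         % P_K(k), k = -1,0,1
pm = conv(pj, pk)         % P_M(m) over m = 0,...,4: [0.1 0.3 0.2 0.3 0.1]
sm = 0:4;
EM3 = sum(sm.^3 .* pm)    % third moment, 16.4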

32 Theorem 9.7 If K_1, \ldots, K_n are independent Poisson random variables, W = K_1 + \cdots + K_n is a Poisson random variable.

33 Proof: Theorem 9.7 We adopt the notation E[K_i] = \alpha_i and note in Table 9.1 that K_i has MGF

\phi_{K_i}(s) = e^{\alpha_i(e^s - 1)}. (9.35)

By Theorem 9.6,

\phi_W(s) = e^{\alpha_1(e^s-1)} e^{\alpha_2(e^s-1)} \cdots e^{\alpha_n(e^s-1)} = e^{(\alpha_1 + \cdots + \alpha_n)(e^s-1)} = e^{\alpha_T(e^s-1)}, (9.36)

where \alpha_T = \alpha_1 + \cdots + \alpha_n. Examining Table 9.1, we observe that \phi_W(s) is the moment generating function of the Poisson (\alpha_T) random variable. Therefore,

P_W(w) = \alpha_T^w e^{-\alpha_T}/w! for w = 0, 1, \ldots; 0 otherwise. (9.37)

34 Theorem 9.8 The sum of n independent Gaussian random variables W = X_1 + \cdots + X_n is a Gaussian random variable.

35 Proof: Theorem 9.8 For convenience, let \mu_i = E[X_i] and \sigma_i^2 = Var[X_i]. Since the X_i are independent, we know that

\phi_W(s) = \phi_{X_1}(s) \phi_{X_2}(s) \cdots \phi_{X_n}(s) = e^{s\mu_1 + \sigma_1^2 s^2/2} e^{s\mu_2 + \sigma_2^2 s^2/2} \cdots e^{s\mu_n + \sigma_n^2 s^2/2} = e^{s(\mu_1 + \cdots + \mu_n) + (\sigma_1^2 + \cdots + \sigma_n^2) s^2/2}. (9.38)

From Equation (9.38), we observe that \phi_W(s) is the moment generating function of a Gaussian random variable with expected value \mu_1 + \cdots + \mu_n and variance \sigma_1^2 + \cdots + \sigma_n^2.

36 Theorem 9.9 If X_1, \ldots, X_n are iid exponential (\lambda) random variables, then W = X_1 + \cdots + X_n has the Erlang PDF

f_W(w) = \lambda^n w^{n-1} e^{-\lambda w}/(n-1)! for w ≥ 0; 0 otherwise.

37 Proof: Theorem 9.9 In Table 9.1 we observe that each X_i has MGF \phi_X(s) = \lambda/(\lambda - s). By Theorem 9.6, W has MGF

\phi_W(s) = ( \lambda/(\lambda - s) )^n. (9.39)

Returning to Table 9.1, we see that W has the MGF of an Erlang (n, \lambda) random variable.
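
A simulation sketch of Theorem 9.9 (ours): summing n inverse-transform exponential samples should give a sample mean near the Erlang mean n/\lambda and a sample variance near n/\lambda^2.

% First two moments of W, the sum of n iid exponential (lambda) variables
n = 5; lambda = 2; m = 100000;
W = sum(-log(rand(n,m))/lambda, 1);   % each column sums n exponential samples
[mean(W) var(W)]                      % approximately n/lambda = 2.5 and n/lambda^2 = 1.25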

38 Quiz 9.3(A) Let K_1, K_2, \ldots, K_m be iid discrete uniform random variables with PMF

P_K(k) = 1/n for k = 1, 2, \ldots, n; 0 otherwise. (9.40)

Find the MGF of J = K_1 + \cdots + K_m.

39 Quiz 9.3(A) Solution Each K_i has MGF

\phi_{K_i}(s) = E[e^{sK_i}] = (e^s + e^{2s} + \cdots + e^{ns})/n = e^s(1 - e^{ns})/(n(1 - e^s)). (1)

Since the sequence of K_i is independent, Theorem 9.6 says the MGF of J is

\phi_J(s) = (\phi_K(s))^m = e^{ms}(1 - e^{ns})^m / (n^m (1 - e^s)^m). (2)
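
A numeric spot check of (1) and (2), comparing the closed forms against direct evaluation at a single point s (our sketch; the values of n, m, s are arbitrary):

n = 6; m = 3; s = 0.2;
phiK_sum  = mean(exp(s*(1:n)));                  % E[e^{sK}] evaluated directly
phiK_form = exp(s)*(1-exp(n*s))/(n*(1-exp(s)));  % closed form (1)
[phiK_sum phiK_form phiK_form^m]                 % last entry is phi_J(s), form (2)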

40 Quiz 9.3(B) Let X_1, \ldots, X_n be independent Gaussian random variables with E[X_i] = 0 and Var[X_i] = i. Find the PDF of

W = \alpha X_1 + \alpha^2 X_2 + \cdots + \alpha^n X_n. (9.41)

41 Quiz 9.3(B) Solution Since the \alpha^j X_j are independent Gaussian random variables, Theorem 9.8 says that W is a Gaussian random variable. Thus to find the PDF of W, we need only find the expected value and variance. Since the expectation of the sum equals the sum of the expectations,

E[W] = \alpha E[X_1] + \alpha^2 E[X_2] + \cdots + \alpha^n E[X_n] = 0. (1)

Since the \alpha^j X_j are independent, the variance of the sum equals the sum of the variances:

Var[W] = \alpha^2 Var[X_1] + \alpha^4 Var[X_2] + \cdots + \alpha^{2n} Var[X_n] = \alpha^2 + 2(\alpha^2)^2 + \cdots + n(\alpha^2)^n. (2)

Defining q = \alpha^2, we can use Math Fact B.6 to write

Var[W] = ( \alpha^2 - \alpha^{2n+2}[1 + n(1 - \alpha^2)] ) / (1 - \alpha^2)^2. (3)

With E[W] = 0 and \sigma_W^2 = Var[W], we can write the PDF of W as

f_W(w) = (1/\sqrt{2\pi\sigma_W^2}) e^{-w^2/2\sigma_W^2}. (4)

42 Section 9.4 Central Limit Theorem

43 Figure 9.1 [Three PMF plots omitted.] The PMF of X, the number of heads in n coin flips, for n = 5, 10, 20. As n increases, the PMF more closely resembles a bell-shaped curve.

44 Theorem 9.10 Central Limit Theorem Given X_1, X_2, \ldots, a sequence of iid random variables with expected value \mu_X and variance \sigma_X^2, the CDF of Z_n = ( \sum_{i=1}^{n} X_i - n\mu_X ) / \sqrt{n\sigma_X^2} has the property

\lim_{n \to \infty} F_{Z_n}(z) = \Phi(z).

45 Figure 9.2 [Four plots omitted: (a) n = 1, (b) n = 2, (c) n = 3, (d) n = 4.] The PDF of W_n, the sum of n uniform (0, 1) random variables, and the corresponding central limit theorem approximation for n = 1, 2, 3, 4. The solid line denotes the PDF f_{W_n}(w), and the broken line denotes the Gaussian approximation.

46 Figure 9.3 [Four plots omitted, each showing a binomial CDF against its CLT approximation.] The binomial (n, p) CDF and the corresponding central limit theorem approximation for n = 4, 8, 16, 32, and p = 1/2.

47 Definition 9.2 Central Limit Theorem Approximation Let W_n = X_1 + \cdots + X_n be the sum of n iid random variables, each with E[X] = \mu_X and Var[X] = \sigma_X^2. The central limit theorem approximation to the CDF of W_n is

F_{W_n}(w) ≈ \Phi( (w - n\mu_X) / \sqrt{n\sigma_X^2} ).
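
In Matlab the approximation is a one-liner once \Phi is available; this sketch (ours) builds \Phi from the base erf function so that no toolbox is required:

Phi = @(z) 0.5*(1 + erf(z/sqrt(2)));                     % standard Gaussian CDF
cltcdf = @(w,n,mux,varx) Phi((w - n*mux)./sqrt(n*varx)); % CLT approximation to F_Wn(w)

For instance, cltcdf(7, 12, 0.5, 1/12) ≈ \Phi(1) = 0.8413 approximates P[W_12 ≤ 7] for the sum of twelve uniform (0, 1) random variables.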

48 Example 9.6 To gain some intuition into the central limit theorem, consider a sequence of iid continuous random variables X_i, where each random variable is uniform (0, 1). Let

W_n = X_1 + \cdots + X_n. (9.46)

Recall that E[X] = 0.5 and Var[X] = 1/12. Therefore, W_n has expected value E[W_n] = n/2 and variance n/12. The central limit theorem says that the CDF of W_n should approach a Gaussian CDF with the same expected value and variance. Moreover, since W_n is a continuous random variable, we would also expect that the PDF of W_n would converge to a Gaussian PDF. In Figure 9.2, we compare the PDF of W_n to the PDF of a Gaussian random variable with the same expected value and variance. First, W_1 is a uniform random variable with the rectangular PDF shown in Figure 9.2(a). This figure also shows the PDF of a Gaussian random variable with the same expected value \mu = 0.5 and variance \sigma^2 = 1/12. Here the PDFs are very dissimilar. When we consider n = 2, we have the situation in Figure 9.2(b). The PDF of W_2 is a triangle with expected value 1 and variance 2/12. The figure shows the corresponding Gaussian PDF. Figures 9.2(c) and 9.2(d) show the PDFs of W_3 and W_4. The convergence to a bell shape is apparent.

49 Example 9.7 Now suppose W_n = X_1 + \cdots + X_n is a sum of independent Bernoulli (p) random variables. We know that W_n has the binomial PMF

P_{W_n}(w) = \binom{n}{w} p^w (1-p)^{n-w}. (9.47)

No matter how large n becomes, W_n is always a discrete random variable and would have a PDF consisting of impulses. However, the central limit theorem says that the CDF of W_n converges to a Gaussian CDF. Figure 9.3 demonstrates the convergence of the sequence of binomial CDFs to a Gaussian CDF for p = 1/2 and four values of n, the number of Bernoulli random variables that are added to produce a binomial random variable. For n ≥ 32, Figure 9.3 suggests that approximations based on the Gaussian distribution are very accurate.

50 Example 9.8 Problem A compact disc (CD) contains digitized samples of an acoustic waveform. In a CD player with a one-bit digital-to-analog converter, each digital sample is represented to an accuracy of ±0.5 mV. The CD player oversamples the waveform by making eight independent measurements corresponding to each sample. The CD player obtains a waveform sample by calculating the average (sample mean) of the eight measurements. What is the probability that the error in the waveform sample is greater than 0.1 mV?

51 Example 9.8 Solution The measurements X_1, X_2, \ldots, X_8 all have a uniform distribution between v - 0.5 mV and v + 0.5 mV, where v mV is the exact value of the waveform sample. The compact disc player produces the output U = W_8/8, where

W_8 = \sum_{i=1}^{8} X_i. (9.48)

To find P[|U - v| > 0.1] exactly, we would have to find an exact probability model for W_8, either by computing an eightfold convolution of the uniform PDF of X_i or by using the moment generating function. Either way, the process is extremely complex. Alternatively, we can use the central limit theorem to model W_8 as a Gaussian random variable with E[W_8] = 8\mu_X = 8v mV and variance Var[W_8] = 8 Var[X] = 8/12. Therefore, U is approximately Gaussian with E[U] = E[W_8]/8 = v and variance Var[U] = Var[W_8]/64 = 1/96. Finally, the error U - v in the output waveform sample is approximately Gaussian with expected value 0 and variance 1/96. It follows that

P[|U - v| > 0.1] = 2[1 - \Phi(0.1/\sqrt{1/96})] = 0.3272. (9.49)
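
Using the erf-based \Phi of the earlier sketch, (9.49) is reproduced in two lines (our check):

Phi = @(z) 0.5*(1 + erf(z/sqrt(2)));
p = 2*(1 - Phi(0.1/sqrt(1/96)))   % returns 0.3272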

52 Example 9.9 Problem Transmit one million bits. Let A denote the event that there are at least 499,000 ones but no more than 501,000 ones. What is P[A]?

53 Example 9.9 Solution Let X_i be the value of bit i (either 0 or 1). The number of ones in one million bits is W = \sum_{i=1}^{10^6} X_i. Because X_i is a Bernoulli (0.5) random variable, E[X_i] = 0.5 and Var[X_i] = 0.25 for all i. Note that E[W] = 10^6 E[X_i] = 500,000 and Var[W] = 10^6 Var[X_i] = 250,000. Therefore, \sigma_W = 500. By the central limit theorem approximation,

P[A] = P[W ≤ 501,000] - P[W < 499,000] ≈ \Phi( (501,000 - 500,000)/500 ) - \Phi( (499,000 - 500,000)/500 ) = \Phi(2) - \Phi(-2) = 0.9545. (9.50)
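
The same erf-based \Phi evaluates (9.50), and a modest Monte Carlo run makes a plausibility check (our sketch; 100 trials of one million bits each):

Phi = @(z) 0.5*(1 + erf(z/sqrt(2)));
Phi(2) - Phi(-2)                    % 0.9545
W = zeros(1,100);
for t = 1:100
  W(t) = sum(rand(1,1e6) < 0.5);    % ones among 10^6 equiprobable bits
end
mean(W >= 499000 & W <= 501000)     % relative frequency of the event A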

54 Quiz 9.4 X milliseconds, the total access time (waiting time + read time) to get one block of information from a computer disk, is the continuous uniform (0, 12) random variable. Before performing a certain task, the computer must access 12 different blocks of information from the disk. (Access times for different blocks are independent of one another.) The total access time for all the information is a random variable A milliseconds.
(a) Find the expected value and variance of the access time X.
(b) Find the expected value and standard deviation of the total access time A.
(c) Use the central limit theorem to estimate P[A > 75 ms].
(d) Use the central limit theorem to estimate P[A < 48 ms].

55 Quiz 9.4 Solution
(a) The expected access time is

E[X] = \int x f_X(x) dx = \int_0^{12} (x/12) dx = 6 ms. (1)

The second moment of the access time is

E[X^2] = \int x^2 f_X(x) dx = \int_0^{12} (x^2/12) dx = 48. (2)

The variance of the access time is Var[X] = E[X^2] - (E[X])^2 = 48 - 36 = 12.
(b) Using X_i to denote the access time of block i, we can write

A = X_1 + X_2 + \cdots + X_{12}. (3)

Since the expectation of the sum equals the sum of the expectations,

E[A] = E[X_1] + \cdots + E[X_{12}] = 12 E[X] = 72 ms. (4)

[Continued]

56 Quiz 9.4 Solution (Continued 2) Since the X_i are independent,

Var[A] = Var[X_1] + \cdots + Var[X_{12}] = 12 Var[X] = 144. (5)

Thus A has standard deviation \sigma_A = 12.
(c) To use the central limit theorem, we use Table 4.1 to evaluate

P[A ≤ 75] = P[ (A - E[A])/\sigma_A ≤ (75 - E[A])/\sigma_A ] ≈ \Phi( (75 - 72)/12 ) = \Phi(0.25) = 0.5987. (6)

Then P[A > 75] = 1 - P[A ≤ 75] = 0.4013.
(d) Once again, we use the central limit theorem and Table 4.1 to estimate

P[A < 48] = P[ (A - E[A])/\sigma_A < (48 - E[A])/\sigma_A ] ≈ \Phi( (48 - 72)/12 ) = \Phi(-2) = 0.0228. (7)
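
A Monte Carlo sketch of parts (c) and (d), with uniform (0, 12) access times generated as 12*rand (ours, base Matlab only):

m = 100000;
A = sum(12*rand(12,m), 1);    % each column: 12 independent access times
[mean(A > 75) mean(A < 48)]   % compare with the CLT estimates 0.4013 and 0.0228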

57 Section 9.5 Matlab

58 Example 9.10 Problem X_1 and X_2 are independent discrete random variables with PMFs

P_{X_1}(x) = 0.04 for x = 1, \ldots, 25; 0 otherwise,
P_{X_2}(x) = x/550 for x = 10, 20, \ldots, 100; 0 otherwise.

What is the PMF of W = X_1 + X_2?

59 Example 9.10 Solution

%sumx1x2.m
sx1=(1:25);px1=0.04*ones(1,25);
sx2=10*(1:10);px2=sx2/550;
[SX1,SX2]=ndgrid(sx1,sx2);
[PX1,PX2]=ndgrid(px1,px2);
SW=SX1+SX2;PW=PX1.*PX2;
sw=unique(SW);
pw=finitepmf(SW,PW,sw);
pmfplot(sw,pw);

As in Example 5.25, sumx1x2.m uses ndgrid to generate a grid for all possible pairs of X_1 and X_2. The matrix SW holds the sum x_1 + x_2 for each possible pair x_1, x_2. The probability P_{X_1,X_2}(x_1, x_2) of each such pair is in the matrix PW. For each unique w generated by pairs x_1 + x_2, finitepmf finds the probability P_W(w). The graph of P_W(w) appears in Figure 9.4.

60 Figure 9.4 [Plot omitted.] The PMF P_W(w) for Example 9.10.

61 Figure 9.5 [Console output omitted: two runs of uniform12(10000), each printing a three-row table of T, \Phi(T), and the relative frequencies.] Two sample runs of uniform12.m.

62 Example 9.11 Problem Write a Matlab program to generate m = 10,000 samples of the random variable X = \sum_{i=1}^{12} U_i - 6, where the U_i are iid uniform (0, 1) random variables. Use the data to find the relative frequencies of the events {X ≤ T} for T = -3, -2, \ldots, 3. Calculate the probabilities of these events when X is a Gaussian (0, 1) random variable.

63 Example 9.11 Solution

function FX=uniform12(m);
x=sum(rand(12,m))-6;
T=(-3:3);FX=(count(x,T)/m)';
[T;phi(T);FX]

In uniform12(m), x holds the m samples of X. The function n=count(x,T) returns n(i) as the number of elements of x less than or equal to T(i). The output is a three-row table: T on the first row, the Gaussian probabilities P[X ≤ T] ≈ \Phi(T) second, and the relative frequencies third. Two sample runs of uniform12 are shown in Figure 9.5. We see that the relative frequencies and the probabilities diverge as T moves farther from zero. In fact, this program will never produce a value of |X| > 6, no matter how many times it runs. By contrast, Q(6) = 9.9 × 10^{-10}. This suggests that in a set of one billion independent samples of the Gaussian (0, 1) random variable, we can expect two samples with |X| > 6: one sample with X < -6 and one sample with X > 6.

64 Quiz 9.5 X is the binomial (100, 0.5) random variable and Y is the discrete uniform (0, 100) random variable. Calculate and graph the PMF of W = X + Y.

65 Quiz 9.5 Solution One solution to this problem is to follow the approach of Example 9.10:

%unifbinom100.m
sx=0:100;sy=0:100;
px=binomialpmf(100,0.5,sx);
py=duniformpmf(0,100,sy);
[SX,SY]=ndgrid(sx,sy);
[PX,PY]=ndgrid(px,py);
SW=SX+SY; PW=PX.*PY;
sw=unique(SW);
pw=finitepmf(SW,PW,sw);
pmfplot(sw,pw);

[Graph of the PMF P_W(w) omitted.] With some thought, it should be apparent that the finitepmf function is implementing the convolution of the two PMFs.
