HW Solution 7 Due: Oct 28, 9:19 AM (in tutorial session)


ECS 315: Probability and Random Processes 2015/1
HW Solution 7 Due: Oct 28, 9:19 AM (in tutorial session)
Lecturer: Prapun Suksompong, Ph.D.

Instructions
(a) ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected; so you should work on all of them.
(b) It is important that you try to solve all problems. (5 pt) The extra questions at the end are optional.
(c) Late submission will be heavily penalized.

Problem 1. [F2013/1] For each of the following random variables, find $P[1 < X \le 2]$.
(a) $X \sim \mathrm{Binomial}(3, 1/3)$
(b) $X \sim \mathrm{Poisson}(3)$

Solution:
(a) Because $X \sim \mathrm{Binomial}(3, 1/3)$, we know that $X$ can only take the values 0, 1, 2, 3. Only the value 2 satisfies the condition given. Therefore, $P[1 < X \le 2] = P[X = 2] = p_X(2)$. Recall that the pmf of a binomial random variable is
$$p_X(x) = \binom{n}{x} p^x (1-p)^{n-x} \quad \text{for } x = 0, 1, 2, \ldots, n.$$
Here, it is given that $n = 3$ and $p = 1/3$. Therefore,
$$p_X(2) = \binom{3}{2}\left(\frac{1}{3}\right)^2\left(1 - \frac{1}{3}\right) = 3 \times \frac{1}{9} \times \frac{2}{3} = \frac{2}{9}.$$
(b) Because $X \sim \mathrm{Poisson}(3)$, we know that $X$ can take the values 0, 1, 2, 3, .... As in the previous part, only the value 2 satisfies the condition given. Therefore, $P[1 < X \le 2] = P[X = 2] = p_X(2)$. Recall that the pmf of a Poisson random variable is
$$p_X(x) = e^{-\alpha}\frac{\alpha^x}{x!} \quad \text{for } x = 0, 1, 2, 3, \ldots.$$
Here, it is given that $\alpha = 3$. Therefore,
$$p_X(2) = e^{-3}\frac{3^2}{2!} = \frac{9}{2}e^{-3} \approx 0.2240.$$
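A quick numerical check of both answers, using only base MATLAB (binopdf and poisspdf from the Statistics Toolbox would give the same numbers; the variable names here are ours):

% Problem 1: P[1 < X <= 2] = p_X(2) for both random variables
n = 3; p = 1/3;                               % (a) X ~ Binomial(3, 1/3)
pa = nchoosek(n, 2) * p^2 * (1 - p)^(n - 2)   % 2/9, approx 0.2222
alpha = 3;                                    % (b) X ~ Poisson(3)
pb = exp(-alpha) * alpha^2 / factorial(2)     % (9/2)*exp(-3), approx 0.2240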

[Figure 7.1: CDF of X for Problem 2]

Problem 2. [M2011/1] The cdf of a random variable $X$ is plotted in Figure 7.1.
(a) Find the pmf $p_X(x)$.
(b) Find the family to which $X$ belongs. (Uniform, Bernoulli, Binomial, Geometric, Poisson, etc.)

Solution:
(a) For a discrete random variable, $P[X = x]$ is the jump size at $x$ on the cdf plot. In this problem, there are four jumps, at 0, 1, 2, 3.
$P[X = 0]$ = the jump size at 0 = $0.064 = (4/10)^3 = (2/5)^3$.
$P[X = 1]$ = the jump size at 1 = $0.352 - 0.064 = 0.288$.
$P[X = 2]$ = the jump size at 2 = $0.784 - 0.352 = 0.432$.
$P[X = 3]$ = the jump size at 3 = $1 - 0.784 = 0.216 = (6/10)^3 = (3/5)^3$.
In conclusion,
$$p_X(x) = \begin{cases} 0.064, & x = 0,\\ 0.288, & x = 1,\\ 0.432, & x = 2,\\ 0.216, & x = 3,\\ 0, & \text{otherwise.} \end{cases}$$
(b) Among all the pmfs that we discussed in class, only one can have support $\{0, 1, 2, 3\}$ with unequal probabilities. This is the binomial pmf. To check that it really is binomial, recall that the pmf of a binomial $X$ is given by $p_X(x) = \binom{n}{x} p^x (1-p)^{n-x}$ for $x = 0, 1, 2, \ldots, n$. Here, $n = 3$. Furthermore, observe that $p_X(0) = (1-p)^n$. By comparing $p_X(0)$ with what we had in part (a), we have $1 - p = 2/5$, or $p = 3/5$. For $x = 1, 2, 3$, plugging $p = 3/5$ and $n = 3$ into $p_X(x) = \binom{n}{x} p^x (1-p)^{n-x}$ gives the same values as what we had in part (a). So, $X$ is a binomial RV.
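As a sanity check of part (b), the Binomial(3, 3/5) pmf reproduces all four jump sizes (base MATLAB):

% Problem 2(b): Binomial(3, 3/5) pmf at x = 0, 1, 2, 3
n = 3; p = 3/5; x = 0:3;
pmf = arrayfun(@(k) nchoosek(n, k) * p^k * (1 - p)^(n - k), x)
% expected output: 0.0640  0.2880  0.4320  0.2160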

Problem 3. Arrivals of customers at the local supermarket are modeled by a Poisson process with a rate of $\lambda = 2$ customers per minute. Let $M$ be the number of customers arriving between 9:00 and 9:05. What is the probability that $M < 2$?

Solution: Here, we are given that $M \sim \mathcal{P}(\alpha)$ where $\alpha = \lambda T = 2 \times 5 = 10$. Recall that, for $M \sim \mathcal{P}(\alpha)$, we have
$$P[M = m] = \begin{cases} e^{-\alpha}\frac{\alpha^m}{m!}, & m \in \{0, 1, 2, 3, \ldots\},\\ 0, & \text{otherwise.} \end{cases}$$
Therefore,
$$P[M < 2] = P[M = 0] + P[M = 1] = e^{-\alpha}\frac{\alpha^0}{0!} + e^{-\alpha}\frac{\alpha^1}{1!} = e^{-\alpha}(1 + \alpha) = e^{-10}(1 + 10) = 11e^{-10} \approx 4.994 \times 10^{-4}.$$
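Numerically (base MATLAB):

% Problem 3: M ~ Poisson(alpha), alpha = lambda*T = 2*5 = 10
alpha = 2 * 5;
PMlt2 = exp(-alpha) * (1 + alpha)   % 11*exp(-10), approx 4.994e-04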

Problem 4. When $n$ is large, the binomial distribution $\mathrm{Binomial}(n, p)$ becomes difficult to compute directly because of the need to calculate factorial terms. In this question, we will consider an approximation when the value of $p$ is close to 0. In such a case, the binomial can be approximated (footnote 1) by the Poisson distribution with parameter $\alpha = np$. For this approximation to work, we will see in this exercise that $n$ does not have to be very large and $p$ does not need to be very small.
(a) Let $X \sim \mathrm{Binomial}(12, 1/36)$. (For example, roll two dice 12 times and let $X$ be the number of times a double 6 appears.) Evaluate $p_X(x)$ for $x = 0, 1, 2$.
(b) Compare your answers in part (a) with the Poisson approximation.
(c) Compare MATLAB plots of $p_X(x)$ in part (a) and the pmf of $\mathcal{P}(np)$.

Footnote 1: More specifically, suppose $X_n$ has a binomial distribution with parameters $n$ and $p_n$. If $p_n \to 0$ and $np_n \to \alpha$ as $n \to \infty$, then
$$P[X_n = k] \to e^{-\alpha}\frac{\alpha^k}{k!}.$$

Solution:
(a) For a $\mathrm{Binomial}(n, p)$ random variable,
$$p_X(x) = \begin{cases} \binom{n}{x} p^x (1-p)^{n-x}, & x \in \{0, 1, 2, \ldots, n\},\\ 0, & \text{otherwise.} \end{cases}$$
Here, we are given that $n = 12$ and $p = 1/36$. Plugging in $x = 0, 1, 2$, we get approximately 0.7132, 0.2445, and 0.0384, respectively.
(b) A Poisson random variable with parameter $\alpha = np$ can approximate a $\mathrm{Binomial}(n, p)$ random variable when $n$ is large and $p$ is small. Here, with $n = 12$ and $p = \frac{1}{36}$, we have $\alpha = 12 \times \frac{1}{36} = \frac{1}{3}$. The Poisson pmf at $x = 0, 1, 2$ is given by
$$e^{-\alpha}\frac{\alpha^x}{x!} = e^{-1/3}\frac{(1/3)^x}{x!}.$$
Plugging in $x = 0, 1, 2$ gives approximately 0.7165, 0.2388, and 0.0398, respectively.
(c) See Figure 7.2. Note how close they are!

[Figure 7.2: Poisson Approximation (stem plots of the binomial pmf and the Poisson pmf versus x)]
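For part (c), a minimal base-MATLAB sketch that reproduces Figure 7.2 (binopdf and poisspdf from the Statistics Toolbox would shorten it):

% Problem 4(c): Binomial(12, 1/36) vs. its Poisson(np) approximation
n = 12; p = 1/36; alpha = n * p;
x = 0:n;
pBinom = arrayfun(@(k) nchoosek(n, k) * p^k * (1 - p)^(n - k), x);
pPois  = exp(-alpha) .* alpha.^x ./ factorial(x);
stem(x, pBinom, 'b'); hold on;
stem(x + 0.1, pPois, 'r');     % small offset so the stems do not overlap
legend('Binomial pmf', 'Poisson pmf'); xlabel('x'); hold off;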

Problem 5. You go to a party with 500 guests. What is the probability that exactly one other guest has the same birthday as you? Calculate this exactly and also approximately by using the Poisson pmf. (For simplicity, exclude birthdays on February 29.) [Bertsekas and Tsitsiklis, 2008, Q2.2.2]

Solution: Let $N$ be the number of guests that have the same birthday as you. We may think of the comparison of your birthday with each of the guests as a Bernoulli trial. Here, there are 500 guests and therefore we are considering $n = 500$ trials. For each trial, the (success) probability that you have the same birthday as the corresponding guest is $p = 1/365$. Then, $N \sim \mathrm{Binomial}(n, p)$.
(a) Binomial: $P[N = 1] = \binom{n}{1} p^1 (1-p)^{n-1} = 500 \times \frac{1}{365} \times \left(\frac{364}{365}\right)^{499} \approx 0.3484$.
(b) Poisson: $P[N = 1] = e^{-np}\frac{(np)^1}{1!} = \frac{500}{365}e^{-500/365} \approx 0.3481$.
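The two answers agree to about three decimal places; a numeric check (base MATLAB):

% Problem 5: exact binomial vs. Poisson approximation
n = 500; p = 1/365;
exact   = n * p * (1 - p)^(n - 1)    % approx 0.3484
approx1 = exp(-n * p) * (n * p)      % approx 0.3481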

Extra Questions
Here are some optional questions for those who want more practice.

Problem 6. A sample of a radioactive material emits particles at a rate of 0.7 per second. Assuming that these are emitted in accordance with a Poisson distribution, find the probability that in one second (a) exactly one is emitted, (b) more than three are emitted, (c) between one and four (inclusive) are emitted [Applebaum, 2008, Q5.27].

Solution: Let $X$ be the number of particles emitted during the one second under consideration. Then $X \sim \mathcal{P}(\alpha)$ where $\alpha = \lambda T = 0.7 \times 1 = 0.7$.
(a) $P[X = 1] = e^{-\alpha}\frac{\alpha^1}{1!} = \alpha e^{-\alpha} = 0.7e^{-0.7} \approx 0.3476$.
(b) $P[X > 3] = 1 - P[X \le 3] = 1 - \sum_{k=0}^{3} e^{-0.7}\frac{0.7^k}{k!} \approx 0.0058$.
(c) $P[1 \le X \le 4] = \sum_{k=1}^{4} e^{-0.7}\frac{0.7^k}{k!} \approx 0.5026$.

Problem 7 (M2011/1). You are given an unfair coin with probability of obtaining a head equal to 1/3,000,000,000. You toss this coin 6,000,000,000 times. Let $A$ be the event that you get tails for all the tosses. Let $B$ be the event that you get heads for all the tosses.
(a) Approximate $P(A)$.
(b) Approximate $P(A \cup B)$.

Solution: Let $N$ be the number of heads among the $n$ tosses. Then, $N \sim \mathcal{B}(n, p)$. Here, we have small $p = 1/(3 \times 10^9)$ and large $n = 6 \times 10^9$. So, we can apply the Poisson approximation. In other words, $\mathcal{B}(n, p)$ is well-approximated by $\mathcal{P}(\alpha)$ where $\alpha = np = 2$.
(a) $P(A) = P[N = 0] \approx e^{-2}\frac{2^0}{0!} = e^{-2} \approx 0.1353$.
(b) $P(A \cup B) = P[N = 0] + P[N = n] \approx e^{-2}\frac{2^0}{0!} + e^{-2}\frac{2^n}{n!}$. The second term is extremely small compared to the first one. Hence, $P(A \cup B)$ is approximately the same as $P(A)$.
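Numeric values for both extra problems (base MATLAB):

% Problem 6: X ~ Poisson(0.7)
alpha = 0.7; k = 0:4;
pk = exp(-alpha) .* alpha.^k ./ factorial(k);   % pk(j+1) = P[X = j]
p6a = pk(2)                % P[X = 1],        approx 0.3476
p6b = 1 - sum(pk(1:4))     % P[X > 3],        approx 0.0058
p6c = sum(pk(2:5))         % P[1 <= X <= 4],  approx 0.5026
% Problem 7: N is approximately Poisson(2)
PA = exp(-2)               % P(A) = P[N = 0], approx 0.1353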

ECS 315: Probability and Random Processes 2015/1
HW Solution 8 Due: Nov 4, 9:19 AM (in tutorial session)
Lecturer: Prapun Suksompong, Ph.D.

Instructions
(a) ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected; so you should work on all of them.
(b) It is important that you try to solve all problems. (5 pt) The extra questions at the end are optional.
(c) Late submission will be heavily penalized.

Problem 1. Consider a random variable $X$ whose pmf is
$$p_X(x) = \begin{cases} 1/2, & x = -1,\\ 1/4, & x = 0, 1,\\ 0, & \text{otherwise.} \end{cases}$$
Let $Y = X^2$.
(a) Find $EX$.
(b) Find $E[X^2]$.
(c) Find $\mathrm{Var}\,X$.
(d) Find $\sigma_X$.
(e) Find $p_Y(y)$.
(f) Find $EY$.
(g) Find $E[Y^2]$.

Solution:
(a) $EX = \sum_x x\,p_X(x) = (-1)\frac{1}{2} + (0)\frac{1}{4} + (1)\frac{1}{4} = -\frac{1}{2} + \frac{1}{4} = -\frac{1}{4}$.

(b) $E[X^2] = \sum_x x^2 p_X(x) = (-1)^2\frac{1}{2} + (0)^2\frac{1}{4} + (1)^2\frac{1}{4} = \frac{1}{2} + \frac{1}{4} = \frac{3}{4}$.
(c) $\mathrm{Var}\,X = E[X^2] - (EX)^2 = \frac{3}{4} - \left(-\frac{1}{4}\right)^2 = \frac{3}{4} - \frac{1}{16} = \frac{11}{16}$.
(d) $\sigma_X = \sqrt{\mathrm{Var}\,X} = \sqrt{11/16} \approx 0.8292$.
(e) First, we build a table to see which values $y$ of $Y$ are possible from the values $x$ of $X$:

x     p_X(x)   y = x^2
-1    1/2      (-1)^2 = 1
0     1/4      (0)^2 = 0
1     1/4      (1)^2 = 1

Therefore, the random variable $Y$ can take two values: 0 and 1. $p_Y(0) = p_X(0) = 1/4$ and $p_Y(1) = p_X(-1) + p_X(1) = 1/2 + 1/4 = 3/4$. Therefore,
$$p_Y(y) = \begin{cases} 1/4, & y = 0,\\ 3/4, & y = 1,\\ 0, & \text{otherwise.} \end{cases}$$
(f) $EY = \sum_y y\,p_Y(y) = (0)\frac{1}{4} + (1)\frac{3}{4} = \frac{3}{4}$. Alternatively, because $Y = X^2$, we automatically have $E[Y] = E[X^2]$. Therefore, we can simply use the answer from part (b).
(g) $E[Y^2] = \sum_y y^2 p_Y(y) = (0)^2\frac{1}{4} + (1)^2\frac{3}{4} = \frac{3}{4}$. Alternatively, $E[Y^2] = E[X^4] = \sum_x x^4 p_X(x) = (-1)^4\frac{1}{2} + (0)^4\frac{1}{4} + (1)^4\frac{1}{4} = \frac{3}{4}$.
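These moments can be verified directly from the pmf (base MATLAB):

% Problem 1: moments of X and of Y = X^2
x = [-1 0 1]; px = [1/2 1/4 1/4];
EX   = sum(x .* px)        % -1/4
EX2  = sum(x.^2 .* px)     %  3/4
VarX = EX2 - EX^2          %  11/16
sigX = sqrt(VarX)          %  approx 0.8292
EY   = EX2                 %  E[Y] = E[X^2] = 3/4
EY2  = sum(x.^4 .* px)     %  E[Y^2] = E[X^4] = 3/4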

Problem 2. For each of the following random variables, find $EX$ and $\sigma_X$.
(a) $X \sim \mathrm{Binomial}(3, 1/3)$
(b) $X \sim \mathrm{Poisson}(3)$

Solution:
(a) From the lecture notes, we know that when $X \sim \mathrm{Binomial}(n, p)$, we have $EX = np$ and $\mathrm{Var}\,X = np(1-p)$. Here, $n = 3$ and $p = 1/3$. Therefore, $EX = 3 \times \frac{1}{3} = 1$. Also, because $\mathrm{Var}\,X = 3 \times \frac{1}{3} \times \frac{2}{3} = \frac{2}{3}$, we have $\sigma_X = \sqrt{\mathrm{Var}\,X} = \sqrt{2/3} \approx 0.8165$.
(b) From the lecture notes, we know that when $X \sim \mathcal{P}(\alpha)$, we have $EX = \alpha$ and $\mathrm{Var}\,X = \alpha$. Here, $\alpha = 3$. Therefore, $EX = 3$. Also, because $\mathrm{Var}\,X = 3$, we have $\sigma_X = \sqrt{3}$.

Problem 3. Suppose $X$ is a uniform discrete random variable on $\{-3, -2, -1, 0, 1, 2, 3, 4\}$. Find
(a) $EX$
(b) $E[X^2]$
(c) $\mathrm{Var}\,X$
(d) $\sigma_X$

Solution: All of the calculations in this question are simply plugging numbers into the appropriate formulas.
(a) $EX = 0.5$
(b) $E[X^2] = 5.5$
(c) $\mathrm{Var}\,X = 5.25$
(d) $\sigma_X \approx 2.2913$
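A direct check of these four numbers, which also agrees with the general formulas derived next (base MATLAB):

% Problem 3: uniform pmf on the integers -3,...,4
x = -3:4; n = numel(x); px = ones(1, n) / n;
EX   = sum(x .* px)        % 0.5
EX2  = sum(x.^2 .* px)     % 5.5
VarX = EX2 - EX^2          % 5.25, equal to (n^2 - 1)/12 with n = 8
sigX = sqrt(VarX)          % approx 2.2913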

Alternatively, we can find formulas for the general case of a uniform random variable $X$ on the set of integers from $a$ to $b$. Note that there are $n = b - a + 1$ values that the random variable can take, each with probability $\frac{1}{n}$.
(a) $EX = \sum_{k=a}^{b} k \cdot \frac{1}{n} = \frac{1}{n}\sum_{k=a}^{b} k = \frac{1}{n}\cdot\frac{n(a+b)}{2} = \frac{a+b}{2}$.
(b) First, note that
$$\sum_{k=a}^{b} k(k-1) = \frac{1}{3}\sum_{k=a}^{b}\big((k+1)k(k-1) - k(k-1)(k-2)\big) = \frac{1}{3}\big((b+1)b(b-1) - a(a-1)(a-2)\big),$$
where the last equality comes from the fact that the sum telescopes: every term in the first sum is repeated in the second sum, and hence there are many cancellations. Now,
$$\sum_{k=a}^{b} k^2 = \sum_{k=a}^{b}\big(k(k-1) + k\big) = \frac{1}{3}\big((b+1)b(b-1) - a(a-1)(a-2)\big) + \frac{n(a+b)}{2}.$$
Therefore,
$$E[X^2] = \frac{1}{n}\sum_{k=a}^{b} k^2 = \frac{(b+1)b(b-1) - a(a-1)(a-2)}{3n} + \frac{a+b}{2} = \frac{1}{3}a^2 - \frac{1}{6}a + \frac{1}{3}ab + \frac{1}{6}b + \frac{1}{3}b^2.$$
(c) $\mathrm{Var}\,X = E[X^2] - (EX)^2 = \frac{1}{12}(b-a)(b-a+2) = \frac{1}{12}(n-1)(n+1) = \frac{n^2-1}{12}$.
(d) $\sigma_X = \sqrt{\mathrm{Var}\,X} = \sqrt{\frac{n^2-1}{12}}$.

Problem 4. (Expectation + pmf + Gambling + Effect of miscalculation of probability) In the eighteenth century, the famous French mathematician Jean Le Rond d'Alembert, author of several works on probability, analyzed the toss of two coins. He reasoned that because this experiment has THREE outcomes (the number of heads that turns up in those two tosses can be 0, 1, or 2), the chances of each must be 1 in 3. In other words, if we let $N$ be the number of heads that shows up, d'Alembert would say that [Mlodinow, 2008, pp. 50-51]

p_N(n) = 1/3 for n = 0, 1, 2.    (8.1)

We know that d'Alembert's conclusion was wrong. His three outcomes are not equally likely, and hence the classical probability formula cannot be applied directly. The key is to realize that there are FOUR outcomes which are equally likely. We should not consider 0, 1, or 2 heads as the possible outcomes. There are in fact four equally likely outcomes: (heads, heads), (heads, tails), (tails, heads), and (tails, tails). These are the 4 possibilities that make up the sample space. The actual pmf for $N$ is
$$p_N(n) = \begin{cases} 1/4, & n = 0, 2,\\ 1/2, & n = 1,\\ 0, & \text{otherwise.} \end{cases}$$
Suppose you travel back in time and meet d'Alembert. You could make the following bet with him to gain some easy money. The bet is that if the result of a toss of two coins contains exactly one head, then he would pay you $150. Otherwise, you would pay him $100. Let $R$ be d'Alembert's profit from this bet and $Y$ be your profit from this bet.
(a) Then, $R = -150$ if you win and $R = +100$ otherwise. Use d'Alembert's miscalculated probabilities from (8.1) to determine the pmf of $R$ (from d'Alembert's belief).
(b) Use d'Alembert's miscalculated probabilities from (8.1) (or the corresponding (miscalculated) pmf found in part (a)) to calculate $ER$, the expected profit for d'Alembert. Remark: You should find that $ER > 0$ and hence d'Alembert will be quite happy to accept your bet.
(c) Use the actual probabilities to determine the pmf of $R$.
(d) Use the actual pmf to determine $ER$. Remark: You should find that $ER < 0$ and hence d'Alembert should not accept your bet if he calculates the probabilities correctly.
(e) Note that $Y = +150$ if you win and $Y = -100$ otherwise. Use the actual probabilities to determine the pmf of $Y$.
(f) Use the actual probabilities to determine $EY$. Remark: You should find that $EY > 0$. This is the amount of money that you expect to gain each time you play with d'Alembert. Of course, d'Alembert, who still believes that his calculation is correct, will ask you to play this bet again and again, believing that he will make a profit in the long run. By miscalculating probabilities, one can make wrong decisions (and lose a lot of money)!

Solution:
(a) $P[R = -150] = P[N = 1]$ and $P[R = +100] = P[N \ne 1] = P[N = 0] + P[N = 2]$. So,
$$p_R(r) = \begin{cases} p_N(1), & r = -150,\\ p_N(0) + p_N(2), & r = +100,\\ 0, & \text{otherwise.} \end{cases}$$
Using d'Alembert's miscalculated pmf,
$$p_R(r) = \begin{cases} 1/3, & r = -150,\\ 2/3, & r = +100,\\ 0, & \text{otherwise.} \end{cases}$$
(b) From $p_R(r)$ in part (a), we have $ER = \sum_r r\,p_R(r) = \frac{1}{3}(-150) + \frac{2}{3}(100) = \frac{50}{3} \approx 16.67$.
(c) Again, using the actual pmf,
$$p_R(r) = \begin{cases} p_N(1), & r = -150,\\ p_N(0) + p_N(2), & r = +100,\\ 0, & \text{otherwise} \end{cases} = \begin{cases} \frac{1}{2}, & r = -150,\\ \frac{1}{4} + \frac{1}{4} = \frac{1}{2}, & r = +100,\\ 0, & \text{otherwise.} \end{cases}$$
(d) From $p_R(r)$ in part (c), we have $ER = \sum_r r\,p_R(r) = \frac{1}{2}(-150) + \frac{1}{2}(100) = -25$.
(e) Observe that $Y = -R$. Hence, using the answer from part (c), we have
$$p_Y(y) = \begin{cases} \frac{1}{2}, & y = +150 \text{ or } -100,\\ 0, & \text{otherwise.} \end{cases}$$
(f) Observe that $Y = -R$. Hence, $EY = -ER$. Using the actual probabilities, $ER = -25$ from part (d). Hence, $EY = +25$.
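The two expectations, numerically (base MATLAB):

% Problem 4: d'Alembert's expected profit under both pmfs
r = [-150 100];
ER_believed = sum(r .* [1/3 2/3])   % +50/3, approx 16.67
ER_actual   = sum(r .* [1/2 1/2])   % -25
EY_actual   = -ER_actual            % +25, your expected gain per play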

Extra Questions
Here are some optional questions for those who want more practice.

Problem 5. A random variable $X$ has support containing only two numbers. Its expected value is $EX = 5$. Its variance is $\mathrm{Var}\,X = 3$. Give an example of the pmf of such a random variable.

Solution: We first find $\sigma_X = \sqrt{\mathrm{Var}\,X} = \sqrt{3}$. Recall that this is the average deviation from the mean. Because $X$ takes only two values, we can make them exactly $\pm\sqrt{3}$ from the mean; that is, $x_1 = 5 - \sqrt{3}$ and $x_2 = 5 + \sqrt{3}$. In this case, we automatically have $EX = 5$ and $\mathrm{Var}\,X = 3$. Hence, one example of such a pmf is
$$p_X(x) = \begin{cases} \frac{1}{2}, & x = 5 \pm \sqrt{3},\\ 0, & \text{otherwise.} \end{cases}$$
We can also try to find a general formula for $x_1$ and $x_2$. If we let $p = P[X = x_2]$, then $q = 1 - p = P[X = x_1]$. Given $p$, the values of $x_1$ and $x_2$ must satisfy two conditions: $EX = m$ and $\mathrm{Var}\,X = \sigma^2$. (In our case, $m = 5$ and $\sigma^2 = 3$.) From $EX = m$, we must have
$$x_1 q + x_2 p = m; \tag{8.2}$$
that is, $x_1 = \frac{m}{q} - x_2\frac{p}{q}$. From $\mathrm{Var}\,X = \sigma^2$, we have $E[X^2] = \mathrm{Var}\,X + (EX)^2 = \sigma^2 + m^2$ and hence we must have
$$x_1^2 q + x_2^2 p = \sigma^2 + m^2. \tag{8.3}$$
Substituting $x_1$ from (8.2) into (8.3), we have
$$x_2^2 p - 2x_2 mp + \left(pm^2 - q\sigma^2\right) = 0,$$
whose solutions are
$$x_2 = \frac{2mp \pm \sqrt{4m^2p^2 - 4p\left(pm^2 - q\sigma^2\right)}}{2p} = \frac{2mp \pm 2\sigma\sqrt{pq}}{2p} = m \pm \sigma\sqrt{\frac{q}{p}}.$$
Using (8.2), we have
$$x_1 = \frac{m}{q} - \frac{p}{q}\left(m \pm \sigma\sqrt{\frac{q}{p}}\right) = m \mp \sigma\sqrt{\frac{p}{q}}.$$
Therefore, for any given $p$, there are two pmfs:
$$p_X(x) = \begin{cases} 1 - p, & x = m - \sigma\sqrt{\frac{p}{1-p}},\\ p, & x = m + \sigma\sqrt{\frac{1-p}{p}},\\ 0, & \text{otherwise,} \end{cases}$$
or
$$p_X(x) = \begin{cases} 1 - p, & x = m + \sigma\sqrt{\frac{p}{1-p}},\\ p, & x = m - \sigma\sqrt{\frac{1-p}{p}},\\ 0, & \text{otherwise.} \end{cases}$$

Problem 6. For each of the following families of random variable $X$, find the value(s) of $x$ which maximize $p_X(x)$. (This can be interpreted as the mode of $X$.)
(a) $\mathcal{P}(\alpha)$
(b) $\mathrm{Binomial}(n, p)$
(c) $\mathcal{G}_0(\beta)$
(d) $\mathcal{G}_1(\beta)$

Remark [Y&G, p. 66]: For statisticians, the mode is the most common number in the collection of observations. There are as many or more numbers with that value than any other value. If there are two or more numbers with this property, the collection of observations is called multimodal. In probability theory, a mode of random variable $X$ is a number $x_{\mathrm{mode}}$ satisfying $p_X(x_{\mathrm{mode}}) \ge p_X(x)$ for all $x$. For statisticians, the median is a number in the middle of the set of numbers, in the sense that an equal number of members of the set are below the median and above the median. In probability theory, a median, $X_{\mathrm{median}}$, of random variable $X$ is a number that satisfies $P[X < X_{\mathrm{median}}] = P[X > X_{\mathrm{median}}]$. Neither the mode nor the median of a random variable $X$ need be unique. A random variable can have several modes or medians.

Solution: We first note that when $\alpha > 0$, $p \in (0, 1)$, $n \in \mathbb{N}$, and $\beta \in (0, 1)$, the above pmfs will be strictly positive for some values of $x$. Hence, we can discard those $x$ at which $p_X(x) = 0$. The remaining points are all integers. To compare them, we will evaluate the ratio $\frac{p_X(i+1)}{p_X(i)}$.
(a) For the Poisson pmf, we have
$$\frac{p_X(i+1)}{p_X(i)} = \frac{e^{-\alpha}\frac{\alpha^{i+1}}{(i+1)!}}{e^{-\alpha}\frac{\alpha^i}{i!}} = \frac{\alpha}{i+1}.$$
Notice that

$\frac{p_X(i+1)}{p_X(i)} > 1$ if and only if $i < \alpha - 1$.
$\frac{p_X(i+1)}{p_X(i)} = 1$ if and only if $i = \alpha - 1$.
$\frac{p_X(i+1)}{p_X(i)} < 1$ if and only if $i > \alpha - 1$.
Let $\tau = \alpha - 1$. This implies that $\tau$ is the place where things change. Moving from $i$ to $i + 1$, the probability strictly increases if $i < \tau$. When $i > \tau$, the next probability value (at $i + 1$) will decrease.
(i) Suppose $\alpha \in (0, 1)$. Then $\alpha - 1 < 0$ and hence $i > \alpha - 1$ for all $i$. (Note that the $i$ are nonnegative integers.) This implies that the pmf is a strictly decreasing function, and hence the maximum occurs at the first $i$, which is $i = 0$.
(ii) Suppose $\alpha \in \mathbb{N}$. Then, the pmf will be strictly increasing until we reach $i = \alpha - 1$, at which point the next probability value is the same. Then, as we further increase $i$, the pmf is strictly decreasing. Therefore, the maximum occurs at $\alpha - 1$ and $\alpha$.
(iii) Suppose $\alpha \notin \mathbb{N}$ and $\alpha > 1$. Then there is no integer $i = \alpha - 1$. The pmf will be strictly increasing, where the last increase is from $i = \lfloor\alpha\rfloor - 1$ to $i + 1 = \lceil\alpha - 1\rceil = \lfloor\alpha\rfloor$. After this, the pmf is strictly decreasing. Hence, the maximum occurs at $\lfloor\alpha\rfloor$.
To summarize,
$$\arg\max_x p_X(x) = \begin{cases} 0, & \alpha \in (0, 1),\\ \alpha - 1 \text{ and } \alpha, & \alpha \text{ is an integer},\\ \lfloor\alpha\rfloor, & \alpha > 1 \text{ is not an integer.} \end{cases}$$
(b) For the binomial pmf, we have
$$\frac{p_X(i+1)}{p_X(i)} = \frac{\frac{n!}{(i+1)!(n-i-1)!}p^{i+1}(1-p)^{n-i-1}}{\frac{n!}{i!(n-i)!}p^i(1-p)^{n-i}} = \frac{(n-i)p}{(i+1)(1-p)}.$$
Notice that
$\frac{p_X(i+1)}{p_X(i)} > 1$ if and only if $i < np - 1 + p = (n+1)p - 1$.
$\frac{p_X(i+1)}{p_X(i)} = 1$ if and only if $i = (n+1)p - 1$.
$\frac{p_X(i+1)}{p_X(i)} < 1$ if and only if $i > (n+1)p - 1$.

Let $\tau = (n+1)p - 1$. This implies that $\tau$ is the place where things change. Moving from $i$ to $i + 1$, the probability strictly increases if $i < \tau$. When $i > \tau$, the next probability value (at $i + 1$) will decrease.
(i) Suppose $(n+1)p$ is an integer. The pmf will strictly increase as a function of $i$, and then stay at the same value at $i = \tau = (n+1)p - 1$ and $i + 1 = (n+1)p$. Then, it will strictly decrease. So, the maximum occurs at $(n+1)p - 1$ and $(n+1)p$.
(ii) Suppose $(n+1)p$ is not an integer. Then, there will not be any integer $i$ equal to $\tau$. Therefore, the pmf strictly increases, where the last increase occurs when we go from $i = \lfloor\tau\rfloor$ to $i + 1 = \lfloor\tau\rfloor + 1$. After this, the probability is strictly decreasing. Hence, the maximum is unique and occurs at $\lfloor\tau\rfloor + 1 = \lfloor\tau + 1\rfloor = \lfloor(n+1)p\rfloor$.
To summarize,
$$\arg\max_x p_X(x) = \begin{cases} (n+1)p - 1 \text{ and } (n+1)p, & (n+1)p \text{ is an integer},\\ \lfloor(n+1)p\rfloor, & (n+1)p \text{ is not an integer.} \end{cases}$$
(c) $\frac{p_X(i+1)}{p_X(i)} = \beta < 1$. Hence, $p_X(i)$ is strictly decreasing. The maximum occurs at the smallest value of $i$, which is 0.
(d) $\frac{p_X(i+1)}{p_X(i)} = \beta < 1$. Hence, $p_X(i)$ is strictly decreasing. The maximum occurs at the smallest value of $i$, which is 1.

Problem 7. An article in Information Security Technical Report ["Malicious Software - Past, Present and Future" (2004, Vol. 9, pp. 6-18)] provided the data (shown in Figure 8.1) on the top ten malicious software instances for 2002. The clear leader in the number of registered incidences for the year 2002 was the Internet worm Klez. This virus was first detected on 26 October 2001, and it has held the top spot among malicious software for the longest period in the history of virology.

Figure 8.1: The 10 most widespread malicious programs for 2002 (Source: Kaspersky Labs)

Place  Name                 % Instances
1      I-Worm.Klez          61.22%
2      I-Worm.Lentin        20.52%
3      I-Worm.Tanatos        2.09%
4      I-Worm.BadtransII     1.31%
5      Macro.Word97.Thus     1.19%
6      I-Worm.Hybris         0.60%
7      I-Worm.Bridex         0.32%
8      I-Worm.Magistr        0.30%
9      Win95.CIH             0.27%
10     I-Worm.Sircam         0.24%

Suppose that 20 malicious software instances are reported. Assume that the malicious sources can be assumed to be independent.
(a) What is the probability that at least one instance is Klez?
(b) What is the probability that three or more instances are Klez?
(c) What are the expected value and standard deviation of the number of Klez instances among the 20 reported?

Solution: Let $N$ be the number of instances (among the 20) that are Klez. Then, $N \sim \mathrm{binomial}(n, p)$ where $n = 20$ and $p = 0.6122$.
(a) $P[N \ge 1] = 1 - P[N < 1] = 1 - P[N = 0] = 1 - p_N(0) = 1 - (0.3878)^{20} \approx 1 - 5.9 \times 10^{-9} \approx 1$.
(b) $P[N \ge 3] = 1 - P[N < 3] = 1 - (P[N = 0] + P[N = 1] + P[N = 2]) = 1 - \sum_{k=0}^{2}\binom{20}{k}(0.6122)^k(0.3878)^{20-k} \approx 1 - 3.0 \times 10^{-6} \approx 1$.
(c) $EN = np = 20 \times 0.6122 = 12.244$ and $\sigma_N = \sqrt{\mathrm{Var}\,N} = \sqrt{np(1-p)} = \sqrt{20 \times 0.6122 \times 0.3878} \approx 2.179$.
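Numerically (base MATLAB):

% Problem 7: N ~ Binomial(20, 0.6122)
n = 20; p = 0.6122; k = 0:2;
pk = arrayfun(@(j) nchoosek(n, j) * p^j * (1 - p)^(n - j), k);
Pa = 1 - pk(1)                   % P[N >= 1], approx 1 - 5.9e-9
Pb = 1 - sum(pk)                 % P[N >= 3], approx 1 - 3.0e-6
EN = n * p                       % 12.244
sigN = sqrt(n * p * (1 - p))     % approx 2.179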

ECS 315: Probability and Random Processes 2015/1
HW 9 Due: Nov 11, 9:19 AM (in tutorial session)
Lecturer: Prapun Suksompong, Ph.D.

Instructions
(a) ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected; so you should work on all of them.
(b) It is important that you try to solve all problems. (5 pt) The extra questions at the end are optional.
(c) Late submission will be rejected.

Problem 1 (Modified from Yates and Goodman, 2005, Q3.1.3). The CDF of a random variable $W$ is
$$F_W(w) = \begin{cases} 0, & w < -5,\\ (w+5)/8, & -5 \le w < -3,\\ 1/4, & -3 \le w < 3,\\ 1/4 + 3(w-3)/8, & 3 \le w < 5,\\ 1, & w \ge 5. \end{cases}$$
(a) Is $W$ a continuous random variable?
(b) What is $P[W \le 4]$?
(c) What is $P[-2 < W \le 2]$?
(d) What is $P[W > 0]$?
(e) What is the value of $a$ such that $P[W \le a] = 1/2$?

Problem 2 (Yates and Goodman, 2005, Q3.2.1). The random variable $X$ has probability density function
$$f_X(x) = \begin{cases} cx, & 0 \le x \le 2,\\ 0, & \text{otherwise.} \end{cases}$$
Use the pdf to find
(a) the constant $c$,
(b) $P[0 \le X \le 1]$,
(c) $P[-1/2 \le X \le 1/2]$,
(d) the cdf $F_X(x)$.

Problem 3 (Yates and Goodman, 2005, Q3.2.3). The CDF of random variable $W$ is
$$F_W(w) = \begin{cases} 0, & w < -5,\\ (w+5)/8, & -5 \le w < -3,\\ 1/4, & -3 \le w < 3,\\ 1/4 + 3(w-3)/8, & 3 \le w < 5,\\ 1, & w \ge 5. \end{cases}$$
Find its pdf $f_W(w)$.

Problem 4 (Yates and Goodman, 2005, Q3.3.4). The pdf of random variable $Y$ is
$$f_Y(y) = \begin{cases} y/2, & 0 \le y < 2,\\ 0, & \text{otherwise.} \end{cases}$$
What are $E[Y]$ and $\mathrm{Var}\,Y$?

Problem 5 (Yates and Goodman, 2005, Q3.3.6). The cdf of random variable $V$ is
$$F_V(v) = \begin{cases} 0, & v < -5,\\ (v+5)^2/144, & -5 \le v < 7,\\ 1, & v \ge 7. \end{cases}$$
(a) What is $E[V]$?
(b) What is $\mathrm{Var}[V]$?
(c) What is $E[V^3]$?

[Handwritten solutions for HW 9 follow on scanned pages:]
Y&G Q3.1.3 (ECS HW9 Page 1)
Y&G Q3.2.1 (ECS HW9 Page 2)
Y&G Q3.2.3 (ECS HW9 Page 3)
Y&G Q3.3.4 (ECS HW9 Page 4)
Y&G Q3.3.6 (ECS HW9 Page 5)

ECS 315: Probability and Random Processes 2015/1
HW Solution 10 Due: Nov 18, 8:59 AM
Lecturer: Prapun Suksompong, Ph.D.

Problem 1 (Randomly Phased Sinusoid). Suppose $\Theta$ is a uniform random variable on the interval $(0, 2\pi)$.
(a) Consider another random variable $X$ defined by
$$X = 5\cos(7t + \Theta)$$
where $t$ is some constant. Find $E[X]$.
(b) Consider another random variable $Y$ defined by
$$Y = 5\cos(7t_1 + \Theta) \times 5\cos(7t_2 + \Theta)$$
where $t_1$ and $t_2$ are some constants. Find $E[Y]$.

Solution: First, because $\Theta$ is a uniform random variable on the interval $(0, 2\pi)$, we know that $f_\Theta(\theta) = \frac{1}{2\pi}1_{(0,2\pi)}(\theta)$. Therefore, for any function $g$, we have
$$E[g(\Theta)] = \int_{-\infty}^{\infty} g(\theta) f_\Theta(\theta)\,d\theta.$$
(a) $X$ is a function of $\Theta$. $E[X] = 5E[\cos(7t + \Theta)] = 5\int_0^{2\pi}\frac{1}{2\pi}\cos(7t + \theta)\,d\theta$. Now, we know that integration over a cycle of a sinusoid gives 0. So, $E[X] = 0$.
(b) $Y$ is another function of $\Theta$.
$$E[Y] = E[5\cos(7t_1 + \Theta) \times 5\cos(7t_2 + \Theta)] = \int_0^{2\pi}\frac{1}{2\pi}\,5\cos(7t_1 + \theta)\,5\cos(7t_2 + \theta)\,d\theta = \frac{25}{2\pi}\int_0^{2\pi}\cos(7t_1 + \theta)\cos(7t_2 + \theta)\,d\theta.$$
Recall (footnote 1) the cosine identity
$$\cos(a)\cos(b) = \frac{1}{2}\big(\cos(a+b) + \cos(a-b)\big).$$
Therefore,
$$EY = \frac{25}{4\pi}\int_0^{2\pi}\big(\cos(7(t_1+t_2) + 2\theta) + \cos(7(t_1-t_2))\big)\,d\theta = \frac{25}{4\pi}\left(\int_0^{2\pi}\cos(7(t_1+t_2) + 2\theta)\,d\theta + \int_0^{2\pi}\cos(7(t_1-t_2))\,d\theta\right).$$
The first integral gives 0 because it is an integration over two periods of a sinusoid. The integrand in the second integral is a constant. So,
$$EY = \frac{25}{4\pi}\cos(7(t_1-t_2))\int_0^{2\pi}d\theta = \frac{25}{4\pi}\cos(7(t_1-t_2)) \cdot 2\pi = \frac{25}{2}\cos(7(t_1-t_2)).$$
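The closed form can be checked by numerical integration; t, t1, and t2 below are arbitrary test values of ours, not from the problem (base MATLAB):

% Problem 1: numerical check of E[X] = 0 and E[Y] = (25/2)cos(7(t1 - t2))
t = 0.4; t1 = 1.1; t2 = 0.3;
EX = integral(@(th) 5 * cos(7 * t + th) / (2 * pi), 0, 2 * pi)    % approx 0
EY = integral(@(th) 25 * cos(7 * t1 + th) .* cos(7 * t2 + th) / (2 * pi), 0, 2 * pi)
check = (25 / 2) * cos(7 * (t1 - t2))    % should match EY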

Footnote 1: This identity can be derived easily via Euler's identity:
$$\cos(a)\cos(b) = \frac{e^{ja} + e^{-ja}}{2}\cdot\frac{e^{jb} + e^{-jb}}{2} = \frac{1}{4}\left(e^{j(a+b)} + e^{-j(a+b)} + e^{j(a-b)} + e^{-j(a-b)}\right) = \frac{1}{2}\big(\cos(a+b) + \cos(a-b)\big).$$

Problem 2 (Yates and Goodman, 2005, Q3.4.5). $X$ is a continuous uniform RV on the interval $(-5, 5)$.
(a) What is its pdf $f_X(x)$?
(b) What is its cdf $F_X(x)$?
(c) What is $E[X]$?
(d) What is $E[X^5]$?
(e) What is $E[e^X]$?

Solution: For a uniform random variable $X$ on the interval $(a, b)$, we know that
$$f_X(x) = \begin{cases} \frac{1}{b-a}, & a \le x \le b,\\ 0, & x < a \text{ or } x > b, \end{cases}$$
and
$$F_X(x) = \begin{cases} 0, & x < a,\\ \frac{x-a}{b-a}, & a \le x \le b,\\ 1, & x > b. \end{cases}$$
In this problem, we have $a = -5$ and $b = 5$.

(a) $f_X(x) = \begin{cases} \frac{1}{10}, & -5 \le x \le 5,\\ 0, & \text{otherwise.} \end{cases}$
(b) $F_X(x) = \begin{cases} 0, & x < -5,\\ \frac{x+5}{10}, & -5 \le x \le 5,\\ 1, & x > 5. \end{cases}$
(c) $EX = \int_{-\infty}^{\infty} x f_X(x)\,dx = \int_{-5}^{5} x \cdot \frac{1}{10}\,dx = \frac{1}{10}\left.\frac{x^2}{2}\right|_{-5}^{5} = 0$. In general,
$$EX = \int_a^b x \cdot \frac{1}{b-a}\,dx = \frac{1}{b-a}\left.\frac{x^2}{2}\right|_a^b = \frac{1}{b-a}\cdot\frac{b^2 - a^2}{2} = \frac{a+b}{2}.$$
With $a = -5$ and $b = 5$, we have $EX = 0$.
(d) $E[X^5] = \int_{-\infty}^{\infty} x^5 f_X(x)\,dx = \int_{-5}^{5} x^5 \cdot \frac{1}{10}\,dx = \frac{1}{10}\left.\frac{x^6}{6}\right|_{-5}^{5} = \frac{1}{60}\left(5^6 - (-5)^6\right) = 0$. In general,
$$E[X^5] = \int_a^b x^5 \cdot \frac{1}{b-a}\,dx = \frac{1}{b-a}\left.\frac{x^6}{6}\right|_a^b = \frac{b^6 - a^6}{6(b-a)}.$$
With $a = -5$ and $b = 5$, we have $E[X^5] = 0$.
(e) In general,
$$E[e^X] = \int_a^b e^x \cdot \frac{1}{b-a}\,dx = \frac{1}{b-a}\left.e^x\right|_a^b = \frac{e^b - e^a}{b-a}.$$
With $a = -5$ and $b = 5$, we have $E[e^X] = \frac{e^5 - e^{-5}}{10} \approx 14.84$.

Problem 3. A random variable $X$ is a Gaussian random variable if its pdf is given by
$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}e^{-\frac{1}{2}\left(\frac{x-m}{\sigma}\right)^2},$$
for some constant $m$ and positive number $\sigma$. Furthermore, when a Gaussian random variable has $m = 0$ and $\sigma = 1$, we say that it is a standard Gaussian random variable. There is no closed-form expression for the cdf of the standard Gaussian random variable. The cdf itself is denoted by $\Phi$, and its values (or its complementary values $Q(\cdot) = 1 - \Phi(\cdot)$) are traditionally provided by a table. Suppose $Z$ is a standard Gaussian random variable.
(a) Use the $\Phi$ table to find the following probabilities:
(i) $P[Z < 1.52]$
(ii) $P[Z < -1.52]$
(iii) $P[Z > 1.52]$
(iv) $P[Z > -1.52]$
(v) $P[-1.36 < Z < 1.52]$
(b) Use the $\Phi$ table to find the value of $c$ that satisfies each of the following relations.
(i) $P[Z > c] = 0.14$
(ii) $P[-c < Z < c] = 0.95$

Solution:
(a)
(i) $P[Z < 1.52] = \Phi(1.52) = 0.9357$.
(ii) $P[Z < -1.52] = \Phi(-1.52) = 1 - \Phi(1.52) = 1 - 0.9357 = 0.0643$.
(iii) $P[Z > 1.52] = 1 - P[Z < 1.52] = 1 - \Phi(1.52) = 1 - 0.9357 = 0.0643$.
(iv) It is straightforward to see that the area of $P[Z > -1.52]$ is the same as $P[Z < 1.52] = \Phi(1.52)$. So, $P[Z > -1.52] = 0.9357$. Alternatively, $P[Z > -1.52] = 1 - P[Z \le -1.52] = 1 - \Phi(-1.52) = 1 - (1 - \Phi(1.52)) = \Phi(1.52)$.
(v) $P[-1.36 < Z < 1.52] = \Phi(1.52) - \Phi(-1.36) = \Phi(1.52) - (1 - \Phi(1.36)) = \Phi(1.52) + \Phi(1.36) - 1 = 0.9357 + 0.9131 - 1 = 0.8488$.
(b)
(i) $P[Z > c] = 1 - P[Z \le c] = 1 - \Phi(c)$. So, we need $1 - \Phi(c) = 0.14$, or $\Phi(c) = 1 - 0.14 = 0.86$. In the $\Phi$ table, we do not have exactly 0.86, but we have $\Phi(1.08) = 0.8599$ and $\Phi(1.09) = 0.8621$. Because 0.86 is closer to 0.8599, we answer the value of $c$ whose $\Phi(c) = 0.8599$. Therefore, $c \approx 1.08$.
(ii) $P[-c < Z < c] = \Phi(c) - \Phi(-c) = \Phi(c) - (1 - \Phi(c)) = 2\Phi(c) - 1$. So, we need $2\Phi(c) - 1 = 0.95$, or $\Phi(c) = 0.975$. From the $\Phi$ table, we have $c \approx 1.96$.
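When a table is not at hand, base MATLAB's erf gives the same values (normcdf from the Statistics Toolbox would too):

% Problem 3: standard normal cdf via erf
Phi = @(z) 0.5 * (1 + erf(z / sqrt(2)));
Phi(1.52)                     % approx 0.9357
1 - Phi(1.52)                 % approx 0.0643
Phi(1.52) + Phi(1.36) - 1     % approx 0.8488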

Problem 4. The peak temperature $T$, as measured in degrees Fahrenheit, on a July day in New Jersey is a $\mathcal{N}(85, 100)$ random variable. Remark: Do not forget that, for our class, the second parameter in $\mathcal{N}(\cdot, \cdot)$ is the variance (not the standard deviation).
(a) Express the cdf of $T$ in terms of the $\Phi$ function.
(b) Express each of the following probabilities in terms of $\Phi$ function(s). Make sure that the arguments of the $\Phi$ functions are positive. (Positivity is required so that we can directly use the $\Phi$/Q tables to evaluate the probabilities.)
(i) $P[T > 100]$
(ii) $P[T < 60]$
(iii) $P[70 \le T \le 100]$
(c) Express each of the probabilities in part (b) in terms of Q function(s). Again, make sure that the arguments of the Q functions are positive.
(d) Evaluate each of the probabilities in part (b) using the $\Phi$/Q tables.
(e) Observe that the $\Phi$ table ("Table 4" from the lecture) stops at $z = 2.99$ and the Q table ("Table 5" from the lecture) starts at $z = 3.00$. Why is it better to give a table for $Q(z)$ instead of $\Phi(z)$ when $z$ is large?

Solution:
(a) Recall that when $X \sim \mathcal{N}(m, \sigma^2)$, $F_X(x) = \Phi\left(\frac{x-m}{\sigma}\right)$. Here, $T \sim \mathcal{N}(85, 10^2)$. Therefore,
$$F_T(t) = \Phi\left(\frac{t-85}{10}\right).$$
(b)
(i) $P[T > 100] = 1 - P[T \le 100] = 1 - F_T(100) = 1 - \Phi\left(\frac{100-85}{10}\right) = 1 - \Phi(1.5)$.
(ii) $P[T < 60] = P[T \le 60]$ because $T$ is a continuous random variable and hence $P[T = 60] = 0$. Now, $P[T \le 60] = F_T(60) = \Phi\left(\frac{60-85}{10}\right) = \Phi(-2.5) = 1 - \Phi(2.5)$. Note that, for the last equality, we use the fact that $\Phi(-z) = 1 - \Phi(z)$.
(iii) $P[70 \le T \le 100] = F_T(100) - F_T(70) = \Phi\left(\frac{100-85}{10}\right) - \Phi\left(\frac{70-85}{10}\right) = \Phi(1.5) - \Phi(-1.5) = \Phi(1.5) - (1 - \Phi(1.5)) = 2\Phi(1.5) - 1$.

(c) In this question, we use the fact that $Q(x) = 1 - \Phi(x)$.
(i) $1 - \Phi(1.5) = Q(1.5)$.
(ii) $1 - \Phi(2.5) = Q(2.5)$.
(iii) $2\Phi(1.5) - 1 = 2(1 - Q(1.5)) - 1 = 2 - 2Q(1.5) - 1 = 1 - 2Q(1.5)$.
(d)
(i) $1 - \Phi(1.5) = 1 - 0.9332 = 0.0668$.
(ii) $1 - \Phi(2.5) = 1 - 0.9938 = 0.0062$.
(iii) $2\Phi(1.5) - 1 = 2(0.9332) - 1 = 0.8664$.
(e) When $z$ is large, $\Phi(z)$ will start with 0.99... The first few significant digits will all be the same and hence not quite useful to read from a table.

Extra Questions
Here are some optional questions for those who want more practice.

Problem 5. Let $X$ be a uniform random variable on the interval $[0, 1]$. Set
$$A = \left[0, \tfrac{1}{2}\right), \quad B = \left[0, \tfrac{1}{4}\right) \cup \left[\tfrac{1}{2}, \tfrac{3}{4}\right), \quad \text{and} \quad C = \left[0, \tfrac{1}{8}\right) \cup \left[\tfrac{1}{4}, \tfrac{3}{8}\right) \cup \left[\tfrac{1}{2}, \tfrac{5}{8}\right) \cup \left[\tfrac{3}{4}, \tfrac{7}{8}\right).$$
Are the events $[X \in A]$, $[X \in B]$, and $[X \in C]$ independent?

Solution: Note that
$$P[X \in A] = \int_0^{1/2} dx = \frac{1}{2}, \quad P[X \in B] = \int_0^{1/4} dx + \int_{1/2}^{3/4} dx = \frac{1}{2}, \quad \text{and} \quad P[X \in C] = \int_0^{1/8} dx + \int_{1/4}^{3/8} dx + \int_{1/2}^{5/8} dx + \int_{3/4}^{7/8} dx = \frac{1}{2}.$$
Now, for pairs of events, we have
$$P([X \in A] \cap [X \in B]) = \int_0^{1/4} dx = \frac{1}{4} = P[X \in A]\,P[X \in B], \tag{10.1}$$
$$P([X \in A] \cap [X \in C]) = \int_0^{1/8} dx + \int_{1/4}^{3/8} dx = \frac{1}{4} = P[X \in A]\,P[X \in C], \quad \text{and} \tag{10.2}$$
$$P([X \in B] \cap [X \in C]) = \int_0^{1/8} dx + \int_{1/2}^{5/8} dx = \frac{1}{4} = P[X \in B]\,P[X \in C]. \tag{10.3}$$
Finally,
$$P([X \in A] \cap [X \in B] \cap [X \in C]) = \int_0^{1/8} dx = \frac{1}{8} = P[X \in A]\,P[X \in B]\,P[X \in C]. \tag{10.4}$$
From (10.1), (10.2), (10.3), and (10.4), we can conclude that the events $[X \in A]$, $[X \in B]$, and $[X \in C]$ are independent.

Problem 6 (Q3.5.6). Solve this question using the $\Phi$/Q table. A professor pays 25 cents for each blackboard error made in lecture to the student who points out the error. In a career of $n$ years filled with blackboard errors, the total amount in dollars paid can be approximated by a Gaussian random variable $Y_n$ with expected value $40n$ and variance $100n$.
(a) What is the probability that $Y_{20}$ exceeds 1000?
(b) How many years $n$ must the professor teach in order that $P[Y_n > 1000] > 0.99$?

Solution: We are given (footnote 2) that $Y_n \sim \mathcal{N}(40n, 100n)$. Recall that when $X \sim \mathcal{N}(m, \sigma^2)$,
$$F_X(x) = \Phi\left(\frac{x-m}{\sigma}\right). \tag{10.5}$$
(a) Here $n = 20$. So, we have $Y_n \sim \mathcal{N}(40 \times 20, 100 \times 20) = \mathcal{N}(800, 2000)$. For this random variable, $m = 800$ and $\sigma = \sqrt{2000}$. We want to find $P[Y_{20} > 1000]$, which is the same as $1 - P[Y_{20} \le 1000]$. Expressing this quantity using the cdf, we have
$$P[Y_{20} > 1000] = 1 - F_{Y_{20}}(1000).$$
Apply (10.5) to get
$$P[Y_{20} > 1000] = 1 - \Phi\left(\frac{1000 - 800}{\sqrt{2000}}\right) = 1 - \Phi(4.472) \approx Q(4.47) \approx 3.9 \times 10^{-6}.$$
(b) Here, the value of $n$ is what we want. So, we will need to keep the formula in the general form. Again, from (10.5), for $Y_n \sim \mathcal{N}(40n, 100n)$, we have
$$P[Y_n > 1000] = 1 - F_{Y_n}(1000) = 1 - \Phi\left(\frac{1000 - 40n}{10\sqrt{n}}\right) = 1 - \Phi\left(\frac{100 - 4n}{\sqrt{n}}\right).$$
To find the value of $n$ such that $P[Y_n > 1000] > 0.99$, we will first find the value of $z$ which makes
$$1 - \Phi(z) > 0.99. \tag{10.6}$$
At this point, we may try to solve for the value of $z$ by noting that (10.6) is the same as
$$\Phi(z) < 0.01. \tag{10.7}$$
Unfortunately, the tables that we have start with $\Phi(0) = 0.5$ and increase to something close to 1 when the argument of the $\Phi$ function is large. This means we can't directly find 0.01 in the table. Of course, 0.99 is in there, and therefore we will need to solve (10.6) via another approach.

Footnote 2: Note that the expected value and the variance in this question are proportional to $n$. This naturally occurs when we consider the sum of i.i.d. random variables. The approximation by a Gaussian random variable is a result of the central limit theorem (CLT).

To do this, we use another property of the $\Phi$ function. Recall that $1 - \Phi(z) = \Phi(-z)$. Therefore, (10.6) is the same as
$$\Phi(-z) > 0.99. \tag{10.8}$$
From our table, we can then conclude that (10.7) (which is the same as (10.8)) will happen when $-z > 2.33$, i.e., $z < -2.33$. (If you have MATLAB, then you can get a more accurate answer of $-2.3263$.) Now, plugging in $z = \frac{100 - 4n}{\sqrt{n}}$, we have
$$\frac{100 - 4n}{\sqrt{n}} < -2.33, \quad \text{i.e.,} \quad \frac{4n - 100}{\sqrt{n}} > 2.33.$$
To solve for $n$, we first let $x = \sqrt{n}$, in which case we have $4x^2 - 100 > 2.33x$ or, equivalently, $4x^2 - 2.33x - 100 > 0$. The two roots are $x \approx -4.72$ and $x \approx 5.3$, so we need $x < -4.72$ or $x > 5.3$. Note that $x = \sqrt{n}$ and therefore cannot be negative. So, we only have one case; that is, we need $x > 5.3$. Because $n = x^2$, we then conclude that we need $n > 28.1$ years.
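The same answer can be obtained by root-finding instead of the quadratic; a base-MATLAB sketch:

% Problem 6(b): solve P[Y_n > 1000] = 0.99 for n
Phi = @(z) 0.5 * (1 + erf(z / sqrt(2)));
g = @(n) 1 - Phi((100 - 4 * n) ./ sqrt(n)) - 0.99;
nstar = fzero(g, 30)          % approx 28.1 years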

ECS 315: Probability and Random Processes 2015/1
HW Solution 11 Due: Nov 25, 9:19 AM
Lecturer: Prapun Suksompong, Ph.D.

Problem 1. Suppose that the time to failure (in hours) of fans in a personal computer can be modeled by an exponential distribution with $\lambda = 0.0003$.
(a) What proportion of the fans will last at least 10,000 hours?
(b) What proportion of the fans will last at most 7,000 hours?
[Montgomery and Runger, 2010, Q4-97]

Solution: See handwritten solution.

Problem 2. Consider each random variable $X$ defined below. Let $Y = 1 + 2X$. (i) Find and sketch the pdf of $Y$, and (ii) determine whether $Y$ belongs to any of the (popular) families discussed in class. If so, state the name of the family and find the corresponding parameters.
(a) $X \sim \mathcal{U}(0, 1)$
(b) $X \sim \mathcal{E}(1)$
(c) $X \sim \mathcal{N}(0, 1)$

Solution: See handwritten solution.

Problem 3. Consider each random variable $X$ defined below. Let $Y = 1 - 2X$. (i) Find and sketch the pdf of $Y$, and (ii) determine whether $Y$ belongs to any of the (popular) families discussed in class. If so, state the name of the family and find the corresponding parameters.
(a) $X \sim \mathcal{U}(0, 1)$
(b) $X \sim \mathcal{E}(1)$
(c) $X \sim \mathcal{N}(0, 1)$

Solution: See handwritten solution.
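The handwritten pages are not transcribed here; as a numeric sketch of Problem 1, assuming the stated rate lambda = 0.0003 per hour (base MATLAB):

% Problem 1: X ~ exponential(lambda), lambda = 0.0003 per hour
lambda = 0.0003;
Pa = exp(-lambda * 10000)       % P[X >= 10000], approx 0.0498
Pb = 1 - exp(-lambda * 7000)    % P[X <= 7000],  approx 0.8775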

[Handwritten solutions for HW 11, Problems 1-3, follow on scanned pages:]
HW11 Q1: Exponential RV (ECS HW11 Page 1)
HW11 Q2: Affine Transformation (ECS HW11 Pages 2-3)
HW11 Q3: Affine Transformation (ECS HW11 Pages 4-5)

Problem 4. Let $X \sim \mathcal{E}(5)$ and $Y = 2/X$.
(a) Check that $Y$ is still a continuous random variable.
(b) Find $F_Y(y)$.
(c) Find $f_Y(y)$.
(d) (optional) Find $EY$. Hint: Because
$$\frac{d}{dy}e^{-\frac{10}{y}} = \frac{10}{y^2}e^{-\frac{10}{y}} > 0 \quad \text{for } y \ne 0,$$
we know that $e^{-10/y}$ is an increasing function on our range of integration. In particular, consider $y > 10/\ln(2)$. Then, $e^{-10/y} > \frac{1}{2}$. Hence,
$$\int_0^\infty \frac{10}{y}e^{-\frac{10}{y}}\,dy > \int_{10/\ln 2}^\infty \frac{10}{y}e^{-\frac{10}{y}}\,dy > \int_{10/\ln 2}^\infty \frac{10}{y}\cdot\frac{1}{2}\,dy = \int_{10/\ln 2}^\infty \frac{5}{y}\,dy.$$
Remark: To be technically correct, we should be a little more careful when writing $Y = \frac{2}{X}$ because it is undefined when $X = 0$. Of course, this happens with 0 probability, so it won't create any serious problem. However, to avoid the problem, we may define $Y$ by
$$Y = \begin{cases} 2/X, & X \ne 0,\\ 0, & X = 0. \end{cases}$$

Solution: In this question, we have $Y = g(X)$ where the function $g$ is defined by $g(x) = \frac{2}{x}$.
(a) First, we evaluate $P[Y = y] = P[g(X) = y]$. For each value of $y \ne 0$, there is only one $x$ value that satisfies $y = g(x)$. (That $x$ value is $x = \frac{2}{y}$.) When $y = 0$, there is no $x$ value that satisfies $y = g(x)$. In both cases, for each value of $y$, the number of solutions for $y = g(x)$ is countable. Therefore, we can write
$$P[Y = y] = \sum_{x:\,g(x) = y} P[X = x].$$
Here, $X \sim \mathcal{E}(5)$. Therefore, $X$ is a continuous random variable and $P[X = x] = 0$ for any $x$. Hence,
$$P[Y = y] = \sum_{x:\,g(x) = y} 0 = 0.$$
Because $P[Y = y] = 0$ for all $y$, we conclude that $Y$ is a continuous random variable.

(b) We consider two cases: $y \le 0$ and $y > 0$.
(i) Because $X > 0$, we know that $Y = \frac{2}{X}$ must be $> 0$ and hence $F_Y(y) = 0$ for $y \le 0$. Note that $Y$ cannot be $= 0$: we would need $X = +\infty$ or $-\infty$ to make $Y = 0$, but $\pm\infty$ are not real numbers and therefore not possible $X$ values.
(ii) For $y > 0$,
$$F_Y(y) = P[Y \le y] = P\left[\frac{2}{X} \le y\right] = P\left[X \ge \frac{2}{y}\right].$$
Note that, for the last equality, we can freely move $X$ and $y$ without worrying about flipping the inequality or division by zero because both $X$ and $y$ considered here are strictly positive. Now, for $X \sim \mathcal{E}(\lambda)$ and $x > 0$, we have
$$P[X \ge x] = \int_x^\infty \lambda e^{-\lambda t}\,dt = \left.-e^{-\lambda t}\right|_x^\infty = e^{-\lambda x}.$$
Therefore, $F_Y(y) = e^{-5\left(\frac{2}{y}\right)} = e^{-\frac{10}{y}}$. Combining the two cases above, we have
$$F_Y(y) = \begin{cases} e^{-\frac{10}{y}}, & y > 0,\\ 0, & y \le 0. \end{cases}$$
(c) Because we have already derived the cdf in the previous part, we can find the pdf via $f_Y(y) = \frac{d}{dy}F_Y(y)$. This gives $f_Y$ at all points except at $y = 0$, where we will set $f_Y$ to be 0. (This arbitrary assignment works for a continuous RV. This is why we need to check first that the random variable is actually continuous.) Hence,
$$f_Y(y) = \begin{cases} \frac{10}{y^2}e^{-\frac{10}{y}}, & y > 0,\\ 0, & y \le 0. \end{cases}$$
(d) We can find $EY$ from the $f_Y(y)$ found in the previous part, or we can even use $f_X(x)$.
Method 1:
$$EY = \int_{-\infty}^{\infty} y f_Y(y)\,dy = \int_0^\infty y \cdot \frac{10}{y^2}e^{-\frac{10}{y}}\,dy = \int_0^\infty \frac{10}{y}e^{-\frac{10}{y}}\,dy.$$

From the hint, we have
$$EY > \int_{10/\ln 2}^\infty \frac{10}{y}e^{-\frac{10}{y}}\,dy > \int_{10/\ln 2}^\infty \frac{5}{y}\,dy = \left.5\ln y\right|_{10/\ln 2}^\infty = \infty.$$
Therefore, $EY = \infty$.
Method 2:
$$EY = E\left[\frac{2}{X}\right] = \int_0^\infty \frac{2}{x}f_X(x)\,dx = \int_0^\infty \frac{2}{x}\lambda e^{-\lambda x}\,dx > \int_0^1 \frac{2}{x}\lambda e^{-\lambda x}\,dx > \int_0^1 \frac{2}{x}\lambda e^{-\lambda}\,dx = 2\lambda e^{-\lambda}\lim_{\varepsilon \to 0^+}(\ln 1 - \ln\varepsilon) = \infty,$$
where the last inequality comes from the fact that for $x \in (0, 1)$, $e^{-\lambda x} > e^{-\lambda}$.
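A numeric illustration of the divergence in part (d): truncating the integral for EY at ever larger upper limits b shows the value growing without bound, roughly like 10 ln(b) for large b (base MATLAB; the lower limit is nudged above 0 to avoid a 0/0 at the endpoint):

% Problem 4(d): truncated versions of EY = int_0^inf (10/y) exp(-10/y) dy
f = @(y) (10 ./ y) .* exp(-10 ./ y);
for b = [1e2 1e4 1e6]
    fprintf('b = %8.0f   truncated EY = %.1f\n', b, integral(f, 1e-9, b));
end
% approx 18.2, 63.3, and 132.4: no convergence as b grows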

Problem 5. In wireless communications systems, fading is sometimes modeled by lognormal random variables. We say that a positive random variable $Y$ is lognormal if $\ln Y$ is a normal random variable (say, with expected value $m$ and variance $\sigma^2$).
(a) Check that $Y$ is still a continuous random variable.
(b) Find the pdf of $Y$. Hint: First, recall that ln is the natural log function (log base $e$). Let $X = \ln Y$. Then, because $Y$ is lognormal, we know that $X \sim \mathcal{N}(m, \sigma^2)$. Next, write $Y$ as a function of $X$.

Solution: Because $X = \ln(Y)$, we have $Y = e^X$. So, here, we consider $Y = g(X)$ where the function $g$ is defined by $g(x) = e^x$.
(a) First, we evaluate $P[Y = y] = P[g(X) = y]$. Note that for each value of $y$, there is at most one $x$ value that satisfies $y = g(x)$. (That $x$ value is $x = \ln(y)$.) So, the number of solutions for $y = g(x)$ is countable. Therefore, we can write
$$P[Y = y] = \sum_{x:\,g(x) = y} P[X = x].$$
Here, $X \sim \mathcal{N}(m, \sigma^2)$. Therefore, $X$ is a continuous random variable and $P[X = x] = 0$ for any $x$. Hence,
$$P[Y = y] = \sum_{x:\,g(x) = y} 0 = 0.$$
Because $P[Y = y] = 0$ for all $y$, we conclude that $Y$ is a continuous random variable.
(b) Start with $Y = e^X$. We know that the exponential function gives a strictly positive number. So, $Y$ is always strictly positive. In particular, $F_Y(y) = 0$ for $y \le 0$. Next, for $y > 0$, by definition, $F_Y(y) = P[Y \le y]$. Plugging in $Y = e^X$, we have $F_Y(y) = P[e^X \le y]$. Because the exponential function is strictly increasing, the event $[e^X \le y]$ is the same as the event $[X \le \ln y]$. Therefore, $F_Y(y) = P[X \le \ln y] = F_X(\ln y)$. Combining the two cases above, we have
$$F_Y(y) = \begin{cases} F_X(\ln y), & y > 0,\\ 0, & y \le 0. \end{cases}$$
Finally, we apply $f_Y(y) = \frac{d}{dy}F_Y(y)$. For $y < 0$, we have $f_Y(y) = \frac{d}{dy}0 = 0$. For $y > 0$,
$$f_Y(y) = \frac{d}{dy}F_Y(y) = \frac{d}{dy}F_X(\ln y) = f_X(\ln y)\cdot\frac{d}{dy}\ln y = \frac{1}{y}f_X(\ln y). \tag{11.1}$$
At $y = 0$, because $Y$ is a continuous random variable, we can assign any value, e.g. 0, to $f_Y(0)$. Then
$$f_Y(y) = \begin{cases} \frac{1}{y}f_X(\ln y), & y > 0,\\ 0, & \text{otherwise.} \end{cases}$$
Here, $X \sim \mathcal{N}(m, \sigma^2)$. Therefore,
$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}e^{-\frac{1}{2}\left(\frac{x-m}{\sigma}\right)^2}$$
and
$$f_Y(y) = \begin{cases} \frac{1}{\sqrt{2\pi}\,\sigma\,y}e^{-\frac{1}{2}\left(\frac{\ln(y)-m}{\sigma}\right)^2}, & y > 0,\\ 0, & \text{otherwise.} \end{cases}$$
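As a check of part (b), a histogram of exp(X) for Gaussian samples X matches the derived pdf; m and sigma below are arbitrary test values of ours (base MATLAB):

% Problem 5: simulated lognormal samples vs. the derived pdf
m = 0; sigma = 0.5;
Y = exp(m + sigma * randn(1e6, 1));
histogram(Y, 'Normalization', 'pdf'); hold on;
y = linspace(0.01, 6, 400);
fY = exp(-0.5 * ((log(y) - m) / sigma).^2) ./ (sqrt(2 * pi) * sigma * y);
plot(y, fY, 'LineWidth', 2); hold off;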

Extra Questions

Problem 6. Cholesterol is a fatty substance that is an important part of the outer lining (membrane) of cells in the body of animals. Its normal range for an adult is mg/dl. The Food and Nutrition Institute of the Philippines found that the total cholesterol level for Filipino adults has a mean of mg/dl and that 84.1% of adults have a cholesterol level below 200 mg/dl. Suppose that the cholesterol level in the population is normally distributed.
(a) Determine the standard deviation of this distribution.
(b) What is the value of the cholesterol level that exceeds 90% of the population?
(c) An adult is at moderate risk if his cholesterol level is more than one but less than two standard deviations above the mean. What percentage of the population is at moderate risk according to this criterion?
(d) An adult is thought to be at high risk if his cholesterol level is more than two standard deviations above the mean. What percentage of the population is at high risk?

Solution: See handwritten solution.

Problem 7. Consider a random variable $X$ whose pdf is given by
$$f_X(x) = \begin{cases} cx^2, & x \in (1, 2),\\ 0, & \text{otherwise.} \end{cases}$$
Let Y = 4 X 1.5.
(a) Find $EY$.
(b) Find $f_Y(y)$.

Solution: See handwritten solution.

[Handwritten solutions for HW 11, Problems 6-7, follow on scanned pages:]
HW11 Q6: Gaussian RV - Cholesterol (ECS HW11 Page 6)
HW11 Q7: SISO (ECS HW11 Pages 7-9)

ECS 315: Probability and Random Processes 2015/1
HW Solution 12 Due: Dec 2, 9:19 AM
Lecturer: Prapun Suksompong, Ph.D.

Problem 1. Let $X \sim \mathcal{E}(3)$.
(a) For each of the following functions $g(x)$, indicate whether the random variable $Y = g(X)$ is a continuous random variable.
(i) $g(x) = x^2$.
(ii) $g(x) = \begin{cases} 1, & x \ge 0,\\ 0, & x < 0. \end{cases}$
(iii) $g(x) = \begin{cases} 4e^{-4x}, & x \ge 0,\\ 0, & x < 0. \end{cases}$
(iv) $g(x) = \begin{cases} x, & x \le 5,\\ 5, & x > 5. \end{cases}$
(b) Repeat part (a), but now check whether the random variable $Y = g(X)$ is a discrete random variable.

Hint: As shown in class, to check whether $Y$ is a continuous random variable, we may check whether $P[Y = y] = 0$ for every real number $y$. Because $Y = g(X)$, the probability under consideration is
$$P[Y = y] = P[g(X) = y] = P[X \in \{x : g(x) = y\}].$$
At a particular $y$ value, if there are at most countably many $x$ values that satisfy $g(x) = y$, then
$$P[Y = y] = \sum_{x:\,g(x) = y} P[X = x].$$
When $X$ is a continuous random variable, $P[X = x] = 0$ for any $x$. Therefore, $P[Y = y] = 0$.
Conclusion: Suppose $X$ is a continuous random variable and there are at most countably many $x$ values that satisfy $g(x) = y$ for any $y$. Then, $Y$ is a continuous random variable.

Suppose at a particular $y$ value, we have uncountably many $x$ values that satisfy $g(x) = y$. Then, we can't rewrite $P[X \in \{x : g(x) = y\}]$ as $\sum_{x:\,g(x) = y} P[X = x]$ because axiom P3 of probability is only applicable to countably many disjoint sets (cases). When $X$ is a continuous random variable, we can find the probability by integrating its pdf:
$$P[Y = y] = P[X \in \{x : g(x) = y\}] = \int_{\{x:\,g(x) = y\}} f_X(x)\,dx.$$
If we can find a $y$ value whose integration above is $> 0$, we can conclude right away that $Y$ is not a continuous random variable.

Solution:
(a)
(i) YES. When $y < 0$, there is no $x$ that satisfies $g(x) = y$. When $y = 0$, there is exactly one $x$ ($x = 0$) that satisfies $g(x) = y$. When $y > 0$, there are exactly two $x$ ($x = \pm\sqrt{y}$) that satisfy $g(x) = y$. Therefore, for any $y$, there are at most countably many $x$ values that satisfy $g(x) = y$. Because $X$ is a continuous random variable, we conclude that $Y$ is also a continuous random variable.
(ii) NO. An easy way to see this is that there can be only two values out of the function $g(\cdot)$: 0 or 1. So, $Y = g(X)$ is a discrete random variable. Alternatively, consider $y = 1$. We see that any $x \ge 0$ can make $g(x) = 1$. Therefore, $P[Y = 1] = P[X \ge 0]$. For $X \sim \mathcal{E}(3)$, $P[X \ge 0] = 1 > 0$. Because we found a $y$ with $P[Y = y] > 0$, $Y$ cannot be a continuous random variable.
(iii) YES. The plot of the function $g(x)$ may help you see the following facts:
When $y > 4$ or $y < 0$, there is no $x$ that gives $y = g(x)$.
When $0 < y \le 4$, there is exactly one $x$ that satisfies $y = g(x)$. Because $X$ is a continuous random variable, we can conclude that $P[Y = y]$ is 0 for $y \ne 0$.
When $y = 0$, any $x < 0$ would satisfy $g(x) = y$. So, $P[Y = 0] = P[X < 0]$. However, because $X \sim \mathcal{E}(3)$ is always positive, $P[X < 0] = 0$.

(iv) NO. Consider $y = 5$. We see that any $x \ge 5$ can make $g(x) = 5$. Therefore, $P[Y = 5] = P[X \ge 5]$. For $X \sim \mathcal{E}(3)$,
$$P[X \ge 5] = \int_5^\infty 3e^{-3x}\,dx = e^{-15} > 0.$$
Because $P[Y = 5] > 0$, we conclude that $Y$ can't be a continuous random variable.
(b) To check whether a random variable is discrete, we simply check whether it has a countable support. Also, if we have already checked that a random variable is continuous, then it can't also be discrete.
(i) NO. We checked before that it is a continuous random variable.
(ii) YES, as discussed in part (a).
(iii) NO. We checked before that it is a continuous random variable.
(iv) NO. Because $X$ is positive, $Y = g(X)$ can be any positive number in the interval $(0, 5]$. The interval is uncountable. Therefore, $Y$ is not discrete. We have shown previously that $Y$ is not a continuous random variable. Here, knowing that it is not discrete means that it is of the last type: a mixed random variable.

Problem 2. The input $X$ and output $Y$ of a system subject to random perturbations are described probabilistically by the following joint pmf matrix:

x 1 3 y

Evaluate the following quantities:
(a) The marginal pmf $p_X(x)$
(b) The marginal pmf $p_Y(y)$
(c) $EX$


More information

Probability Theory. Introduction to Probability Theory. Principles of Counting Examples. Principles of Counting. Probability spaces.

Probability Theory. Introduction to Probability Theory. Principles of Counting Examples. Principles of Counting. Probability spaces. Probability Theory To start out the course, we need to know something about statistics and probability Introduction to Probability Theory L645 Advanced NLP Autumn 2009 This is only an introduction; for

More information

ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables

ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Random Variable Discrete Random

More information

Statistics and Econometrics I

Statistics and Econometrics I Statistics and Econometrics I Random Variables Shiu-Sheng Chen Department of Economics National Taiwan University October 5, 2016 Shiu-Sheng Chen (NTU Econ) Statistics and Econometrics I October 5, 2016

More information

18.05 Final Exam. Good luck! Name. No calculators. Number of problems 16 concept questions, 16 problems, 21 pages

18.05 Final Exam. Good luck! Name. No calculators. Number of problems 16 concept questions, 16 problems, 21 pages Name No calculators. 18.05 Final Exam Number of problems 16 concept questions, 16 problems, 21 pages Extra paper If you need more space we will provide some blank paper. Indicate clearly that your solution

More information

(Ch 3.4.1, 3.4.2, 4.1, 4.2, 4.3)

(Ch 3.4.1, 3.4.2, 4.1, 4.2, 4.3) 3 Probability Distributions (Ch 3.4.1, 3.4.2, 4.1, 4.2, 4.3) Probability Distribution Functions Probability distribution function (pdf): Function for mapping random variables to real numbers. Discrete

More information

STAT/MA 416 Midterm Exam 2 Thursday, October 18, Circle the section you are enrolled in:

STAT/MA 416 Midterm Exam 2 Thursday, October 18, Circle the section you are enrolled in: STAT/MA 46 Midterm Exam 2 Thursday, October 8, 27 Name Purdue student ID ( digits) Circle the section you are enrolled in: STAT/MA 46-- STAT/MA 46-2- 9: AM :5 AM 3: PM 4:5 PM REC 4 UNIV 23. The testing

More information

Math/Stats 425, Sec. 1, Fall 04: Introduction to Probability. Final Exam: Solutions

Math/Stats 425, Sec. 1, Fall 04: Introduction to Probability. Final Exam: Solutions Math/Stats 45, Sec., Fall 4: Introduction to Probability Final Exam: Solutions. In a game, a contestant is shown two identical envelopes containing money. The contestant does not know how much money is

More information

DS-GA 1002 Lecture notes 2 Fall Random variables

DS-GA 1002 Lecture notes 2 Fall Random variables DS-GA 12 Lecture notes 2 Fall 216 1 Introduction Random variables Random variables are a fundamental tool in probabilistic modeling. They allow us to model numerical quantities that are uncertain: the

More information

STAT 414: Introduction to Probability Theory

STAT 414: Introduction to Probability Theory STAT 414: Introduction to Probability Theory Spring 2016; Homework Assignments Latest updated on April 29, 2016 HW1 (Due on Jan. 21) Chapter 1 Problems 1, 8, 9, 10, 11, 18, 19, 26, 28, 30 Theoretical Exercises

More information

ECS /1 Part III.2 Dr.Prapun

ECS /1 Part III.2 Dr.Prapun Sirindhorn International Institute of Technology Thammasat University School of Information, Computer and Communication Technology ECS315 2011/1 Part III.2 Dr.Prapun 8.2 Families of Discrete Random Variables

More information

Lectures on Elementary Probability. William G. Faris

Lectures on Elementary Probability. William G. Faris Lectures on Elementary Probability William G. Faris February 22, 2002 2 Contents 1 Combinatorics 5 1.1 Factorials and binomial coefficients................. 5 1.2 Sampling with replacement.....................

More information

18.05 Practice Final Exam

18.05 Practice Final Exam No calculators. 18.05 Practice Final Exam Number of problems 16 concept questions, 16 problems. Simplifying expressions Unless asked to explicitly, you don t need to simplify complicated expressions. For

More information

Probability. Lecture Notes. Adolfo J. Rumbos

Probability. Lecture Notes. Adolfo J. Rumbos Probability Lecture Notes Adolfo J. Rumbos October 20, 204 2 Contents Introduction 5. An example from statistical inference................ 5 2 Probability Spaces 9 2. Sample Spaces and σ fields.....................

More information

Multivariate distributions

Multivariate distributions CHAPTER Multivariate distributions.. Introduction We want to discuss collections of random variables (X, X,..., X n ), which are known as random vectors. In the discrete case, we can define the density

More information

Discrete Random Variable

Discrete Random Variable Discrete Random Variable Outcome of a random experiment need not to be a number. We are generally interested in some measurement or numerical attribute of the outcome, rather than the outcome itself. n

More information

Probability Review. Gonzalo Mateos

Probability Review. Gonzalo Mateos Probability Review Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ September 11, 2018 Introduction

More information

Recitation 2: Probability

Recitation 2: Probability Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions

More information

MAT 271E Probability and Statistics

MAT 271E Probability and Statistics MAT 271E Probability and Statistics Spring 2011 Instructor : Class Meets : Office Hours : Textbook : Supp. Text : İlker Bayram EEB 1103 ibayram@itu.edu.tr 13.30 16.30, Wednesday EEB? 10.00 12.00, Wednesday

More information

Lecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019

Lecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019 Lecture 10: Probability distributions DANIEL WELLER TUESDAY, FEBRUARY 19, 2019 Agenda What is probability? (again) Describing probabilities (distributions) Understanding probabilities (expectation) Partial

More information

FINAL EXAM: Monday 8-10am

FINAL EXAM: Monday 8-10am ECE 30: Probabilistic Methods in Electrical and Computer Engineering Fall 016 Instructor: Prof. A. R. Reibman FINAL EXAM: Monday 8-10am Fall 016, TTh 3-4:15pm (December 1, 016) This is a closed book exam.

More information

ASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata

ASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata ASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA (krzysio@krzysio.net) Errata Effective July 5, 3, only the latest edition of this manual will have its errata

More information

Continuous-time Markov Chains

Continuous-time Markov Chains Continuous-time Markov Chains Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ October 23, 2017

More information

Definition: A random variable X is a real valued function that maps a sample space S into the space of real numbers R. X : S R

Definition: A random variable X is a real valued function that maps a sample space S into the space of real numbers R. X : S R Random Variables Definition: A random variable X is a real valued function that maps a sample space S into the space of real numbers R. X : S R As such, a random variable summarizes the outcome of an experiment

More information

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear

More information

CME 106: Review Probability theory

CME 106: Review Probability theory : Probability theory Sven Schmit April 3, 2015 1 Overview In the first half of the course, we covered topics from probability theory. The difference between statistics and probability theory is the following:

More information

Midterm Exam 1 Solution

Midterm Exam 1 Solution EECS 126 Probability and Random Processes University of California, Berkeley: Fall 2015 Kannan Ramchandran September 22, 2015 Midterm Exam 1 Solution Last name First name SID Name of student on your left:

More information

STAT 418: Probability and Stochastic Processes

STAT 418: Probability and Stochastic Processes STAT 418: Probability and Stochastic Processes Spring 2016; Homework Assignments Latest updated on April 29, 2016 HW1 (Due on Jan. 21) Chapter 1 Problems 1, 8, 9, 10, 11, 18, 19, 26, 28, 30 Theoretical

More information

EE/CpE 345. Modeling and Simulation. Fall Class 5 September 30, 2002

EE/CpE 345. Modeling and Simulation. Fall Class 5 September 30, 2002 EE/CpE 345 Modeling and Simulation Class 5 September 30, 2002 Statistical Models in Simulation Real World phenomena of interest Sample phenomena select distribution Probabilistic, not deterministic Model

More information

Probability Theory and Random Variables

Probability Theory and Random Variables Probability Theory and Random Variables One of the most noticeable aspects of many computer science related phenomena is the lack of certainty. When a job is submitted to a batch oriented computer system,

More information

Fundamental Tools - Probability Theory II

Fundamental Tools - Probability Theory II Fundamental Tools - Probability Theory II MSc Financial Mathematics The University of Warwick September 29, 2015 MSc Financial Mathematics Fundamental Tools - Probability Theory II 1 / 22 Measurable random

More information

Part 3: Parametric Models

Part 3: Parametric Models Part 3: Parametric Models Matthew Sperrin and Juhyun Park August 19, 2008 1 Introduction There are three main objectives to this section: 1. To introduce the concepts of probability and random variables.

More information

Ching-Han Hsu, BMES, National Tsing Hua University c 2015 by Ching-Han Hsu, Ph.D., BMIR Lab. = a + b 2. b a. x a b a = 12

Ching-Han Hsu, BMES, National Tsing Hua University c 2015 by Ching-Han Hsu, Ph.D., BMIR Lab. = a + b 2. b a. x a b a = 12 Lecture 5 Continuous Random Variables BMIR Lecture Series in Probability and Statistics Ching-Han Hsu, BMES, National Tsing Hua University c 215 by Ching-Han Hsu, Ph.D., BMIR Lab 5.1 1 Uniform Distribution

More information

1: PROBABILITY REVIEW

1: PROBABILITY REVIEW 1: PROBABILITY REVIEW Marek Rutkowski School of Mathematics and Statistics University of Sydney Semester 2, 2016 M. Rutkowski (USydney) Slides 1: Probability Review 1 / 56 Outline We will review the following

More information

Random Variable. Discrete Random Variable. Continuous Random Variable. Discrete Random Variable. Discrete Probability Distribution

Random Variable. Discrete Random Variable. Continuous Random Variable. Discrete Random Variable. Discrete Probability Distribution Random Variable Theoretical Probability Distribution Random Variable Discrete Probability Distributions A variable that assumes a numerical description for the outcome of a random eperiment (by chance).

More information

ST 371 (IX): Theories of Sampling Distributions

ST 371 (IX): Theories of Sampling Distributions ST 371 (IX): Theories of Sampling Distributions 1 Sample, Population, Parameter and Statistic The major use of inferential statistics is to use information from a sample to infer characteristics about

More information

Continuous Random Variables

Continuous Random Variables Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability. Often, there is interest in random variables

More information

STAT Chapter 5 Continuous Distributions

STAT Chapter 5 Continuous Distributions STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range

More information

IEOR 3106: Introduction to Operations Research: Stochastic Models. Professor Whitt. SOLUTIONS to Homework Assignment 1

IEOR 3106: Introduction to Operations Research: Stochastic Models. Professor Whitt. SOLUTIONS to Homework Assignment 1 IEOR 3106: Introduction to Operations Research: Stochastic Models Professor Whitt SOLUTIONS to Homework Assignment 1 Probability Review: Read Chapters 1 and 2 in the textbook, Introduction to Probability

More information

Probability distributions. Probability Distribution Functions. Probability distributions (contd.) Binomial distribution

Probability distributions. Probability Distribution Functions. Probability distributions (contd.) Binomial distribution Probability distributions Probability Distribution Functions G. Jogesh Babu Department of Statistics Penn State University September 27, 2011 http://en.wikipedia.org/wiki/probability_distribution We discuss

More information

Chapter 1 Statistical Reasoning Why statistics? Section 1.1 Basics of Probability Theory

Chapter 1 Statistical Reasoning Why statistics? Section 1.1 Basics of Probability Theory Chapter 1 Statistical Reasoning Why statistics? Uncertainty of nature (weather, earth movement, etc. ) Uncertainty in observation/sampling/measurement Variability of human operation/error imperfection

More information

1. If X has density. cx 3 e x ), 0 x < 0, otherwise. Find the value of c that makes f a probability density. f(x) =

1. If X has density. cx 3 e x ), 0 x < 0, otherwise. Find the value of c that makes f a probability density. f(x) = 1. If X has density f(x) = { cx 3 e x ), 0 x < 0, otherwise. Find the value of c that makes f a probability density. 2. Let X have density f(x) = { xe x, 0 < x < 0, otherwise. (a) Find P (X > 2). (b) Find

More information

FINAL EXAM: 3:30-5:30pm

FINAL EXAM: 3:30-5:30pm ECE 30: Probabilistic Methods in Electrical and Computer Engineering Spring 016 Instructor: Prof. A. R. Reibman FINAL EXAM: 3:30-5:30pm Spring 016, MWF 1:30-1:0pm (May 6, 016) This is a closed book exam.

More information

Probability Models. 4. What is the definition of the expectation of a discrete random variable?

Probability Models. 4. What is the definition of the expectation of a discrete random variable? 1 Probability Models The list of questions below is provided in order to help you to prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions

More information

ECS 332: Principles of Communications 2012/1. HW 4 Due: Sep 7

ECS 332: Principles of Communications 2012/1. HW 4 Due: Sep 7 ECS 332: Principles of Communications 2012/1 HW 4 Due: Sep 7 Lecturer: Prapun Suksompong, Ph.D. Instructions (a) ONE part of a question will be graded (5 pt). Of course, you do not know which part will

More information

1 Random Variable: Topics

1 Random Variable: Topics Note: Handouts DO NOT replace the book. In most cases, they only provide a guideline on topics and an intuitive feel. 1 Random Variable: Topics Chap 2, 2.1-2.4 and Chap 3, 3.1-3.3 What is a random variable?

More information

What is a random variable

What is a random variable OKAN UNIVERSITY FACULTY OF ENGINEERING AND ARCHITECTURE MATH 256 Probability and Random Processes 04 Random Variables Fall 20 Yrd. Doç. Dr. Didem Kivanc Tureli didemk@ieee.org didem.kivanc@okan.edu.tr

More information

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416)

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) D. ARAPURA This is a summary of the essential material covered so far. The final will be cumulative. I ve also included some review problems

More information

This does not cover everything on the final. Look at the posted practice problems for other topics.

This does not cover everything on the final. Look at the posted practice problems for other topics. Class 7: Review Problems for Final Exam 8.5 Spring 7 This does not cover everything on the final. Look at the posted practice problems for other topics. To save time in class: set up, but do not carry

More information

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample

More information

Relationship between probability set function and random variable - 2 -

Relationship between probability set function and random variable - 2 - 2.0 Random Variables A rat is selected at random from a cage and its sex is determined. The set of possible outcomes is female and male. Thus outcome space is S = {female, male} = {F, M}. If we let X be

More information

Brief Review of Probability

Brief Review of Probability Brief Review of Probability Nuno Vasconcelos (Ken Kreutz-Delgado) ECE Department, UCSD Probability Probability theory is a mathematical language to deal with processes or experiments that are non-deterministic

More information

ECE 302: Probabilistic Methods in Electrical Engineering

ECE 302: Probabilistic Methods in Electrical Engineering ECE 302: Probabilistic Methods in Electrical Engineering Test I : Chapters 1 3 3/22/04, 7:30 PM Print Name: Read every question carefully and solve each problem in a legible and ordered manner. Make sure

More information

Chapter 2 Random Variables

Chapter 2 Random Variables Stochastic Processes Chapter 2 Random Variables Prof. Jernan Juang Dept. of Engineering Science National Cheng Kung University Prof. Chun-Hung Liu Dept. of Electrical and Computer Eng. National Chiao Tung

More information

BMIR Lecture Series on Probability and Statistics Fall, 2015 Uniform Distribution

BMIR Lecture Series on Probability and Statistics Fall, 2015 Uniform Distribution Lecture #5 BMIR Lecture Series on Probability and Statistics Fall, 2015 Department of Biomedical Engineering and Environmental Sciences National Tsing Hua University s 5.1 Definition ( ) A continuous random

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES Contents 1. Continuous random variables 2. Examples 3. Expected values 4. Joint distributions

More information

MATH Notebook 5 Fall 2018/2019

MATH Notebook 5 Fall 2018/2019 MATH442601 2 Notebook 5 Fall 2018/2019 prepared by Professor Jenny Baglivo c Copyright 2004-2019 by Jenny A. Baglivo. All Rights Reserved. 5 MATH442601 2 Notebook 5 3 5.1 Sequences of IID Random Variables.............................

More information

Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan

Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan 2.4 Random Variables Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan By definition, a random variable X is a function with domain the sample space and range a subset of the

More information

X 1 ((, a]) = {ω Ω : X(ω) a} F, which leads us to the following definition:

X 1 ((, a]) = {ω Ω : X(ω) a} F, which leads us to the following definition: nna Janicka Probability Calculus 08/09 Lecture 4. Real-valued Random Variables We already know how to describe the results of a random experiment in terms of a formal mathematical construction, i.e. the

More information

Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators.

Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators. IE 230 Seat # Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators. Score Exam #3a, Spring 2002 Schmeiser Closed book and notes. 60 minutes. 1. True or false. (for each,

More information

Notes for Math 324, Part 17

Notes for Math 324, Part 17 126 Notes for Math 324, Part 17 Chapter 17 Common discrete distributions 17.1 Binomial Consider an experiment consisting by a series of trials. The only possible outcomes of the trials are success and

More information

Probability Midterm Exam 2:15-3:30 pm Thursday, 21 October 1999

Probability Midterm Exam 2:15-3:30 pm Thursday, 21 October 1999 Name: 2:15-3:30 pm Thursday, 21 October 1999 You may use a calculator and your own notes but may not consult your books or neighbors. Please show your work for partial credit, and circle your answers.

More information

Lecture 2. Binomial and Poisson Probability Distributions

Lecture 2. Binomial and Poisson Probability Distributions Durkin, Lecture 2, Page of 6 Lecture 2 Binomial and Poisson Probability Distributions ) Bernoulli Distribution or Binomial Distribution: Consider a situation where there are only two possible outcomes

More information

Numerical Methods I Monte Carlo Methods

Numerical Methods I Monte Carlo Methods Numerical Methods I Monte Carlo Methods Aleksandar Donev Courant Institute, NYU 1 donev@courant.nyu.edu 1 Course G63.2010.001 / G22.2420-001, Fall 2010 Dec. 9th, 2010 A. Donev (Courant Institute) Lecture

More information

Continuous random variables

Continuous random variables Continuous random variables Continuous r.v. s take an uncountably infinite number of possible values. Examples: Heights of people Weights of apples Diameters of bolts Life lengths of light-bulbs We cannot

More information

3 Multiple Discrete Random Variables

3 Multiple Discrete Random Variables 3 Multiple Discrete Random Variables 3.1 Joint densities Suppose we have a probability space (Ω, F,P) and now we have two discrete random variables X and Y on it. They have probability mass functions f

More information