HW Solution 7 Due: Oct 24


ECS 315: Probability and Random Processes 2014/1
HW Solution 7 Due: Oct 24
Lecturer: Prapun Suksompong, Ph.D.

Instructions
(a) ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected, so you should work on all of them.
(b) It is important that you try to solve all problems. (5 pt) The extra question at the end is optional.
(c) Late submission will be heavily penalized.
(d) Write down all the steps that you have done to obtain your answers. You may not get full credit, even when your answer is correct, without showing how you got your answer.

Problem 1 (M2010). The random variable V has pmf

  p_V(v) = { 1/v^2 + c,  v ∈ {-2, 2, 3},
           { 0,          otherwise.

(a) Find the value of the constant c.
(b) Let W = V^2 - V + 1. Find the pmf of W.
(c) Find EV.
(d) Find E[V^2].
(e) Find Var V.
(f) Find σ_V.
(g) Find EW.

Solution:

7-1

(a) The pmf must sum to 1. Hence,

  (1/(-2)^2 + c) + (1/2^2 + c) + (1/3^2 + c) = 1.

The value of c must be

  c = (1/3)(1 - (1/4 + 1/4 + 1/9)) = 7/54.

Note that this gives

  p_V(-2) = p_V(2) = 1/4 + 7/54 = 41/108  and  p_V(3) = 1/9 + 7/54 = 26/108.

(b) W = V^2 - V + 1. So, when V = -2, 2, 3, we have W = 7, 3, 7, respectively. Hence, W takes only two values, 7 and 3. The corresponding probabilities are

  P[W = 7] = p_V(-2) + p_V(3) = 41/108 + 26/108 = 67/108  and
  P[W = 3] = p_V(2) = 41/108.

Hence, the pmf of W is given by

  p_W(w) = { 41/108, w = 3,        { 0.38, w = 3,
           { 67/108, w = 7,    ≈   { 0.62, w = 7,
           { 0, otherwise          { 0, otherwise.

(c) We apply the formula EV = Σ_v v p_V(v). Here,

  EV = (-2) p_V(-2) + (2) p_V(2) + (3) p_V(3)
     = (-2)(41/108) + (2)(41/108) + (3)(26/108) = 78/108 = 13/18 ≈ 0.72.

(d) We apply the formula E[V^2] = Σ_v v^2 p_V(v). Here,

  E[V^2] = (-2)^2 p_V(-2) + (2)^2 p_V(2) + (3)^2 p_V(3)
         = (4)(41/108) + (4)(41/108) + (9)(26/108) = 562/108 = 281/54 ≈ 5.20.

7-2
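The fraction arithmetic above is easy to double-check with exact rational numbers; here is a quick sketch using Python's `fractions` module (the pmf and the constant c are the ones derived above):

```python
from fractions import Fraction as F

# pmf of V: p_V(v) = 1/v^2 + c on the support {-2, 2, 3}
support = [-2, 2, 3]

# solve sum of (1/v^2 + c) over the support = 1 for c
c = (1 - sum(F(1, v * v) for v in support)) / len(support)
pmf = {v: F(1, v * v) + c for v in support}

# first two moments of V
EV = sum(v * p for v, p in pmf.items())
EV2 = sum(v * v * p for v, p in pmf.items())

# pmf of W = V^2 - V + 1, collecting values of V that map to the same w
pmf_W = {}
for v, p in pmf.items():
    w = v * v - v + 1
    pmf_W[w] = pmf_W.get(w, 0) + p
```

Running this reproduces c = 7/54, p_W(3) = 41/108, p_W(7) = 67/108, EV = 13/18, and E[V^2] = 281/54.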

(e) Var V = E[V^2] - (EV)^2 = 281/54 - (13/18)^2 = 1517/324 ≈ 4.68.

(f) σ_V = sqrt(Var V) ≈ 2.16.

(g) We show two techniques to calculate EW.

Method 1: EW = Σ_w w p_W(w) = (3)(41/108) + (7)(67/108) = 592/108 = 148/27 ≈ 5.48.

Method 2: EW = E[V^2 - V + 1] = E[V^2] - E[V] + 1 = 281/54 - 13/18 + 1 = 148/27 ≈ 5.48.

Problem 2. Suppose X is a uniform discrete random variable on {-3, -2, -1, 0, 1, 2, 3, 4}. Find

(a) EX
(b) E[X^2]
(c) Var X
(d) σ_X

Solution: All of the calculations in this question are simply plugging numbers into the appropriate formulas.

(a) EX = 0.5
(b) E[X^2] = 5.5
(c) Var X = 5.25
(d) σ_X = sqrt(5.25) ≈ 2.29

Alternatively, we can find formulas for the general case of a uniform random variable X on the set of integers from a to b. Note that there are n = b - a + 1 values that the random variable can take. Hence, each of them has probability 1/n.

(a) EX = Σ_{i=a}^{b} i (1/n) = (1/n) Σ_{i=a}^{b} i = (1/n) (n(a+b)/2) = (a+b)/2.

7-3
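The numeric answers for Problem 2, and the general formula EX = (a+b)/2, can be checked with a short brute-force sketch in Python:

```python
from fractions import Fraction as F

def discrete_uniform_moments(a, b):
    """Exact moments of a discrete uniform RV on the integers a, a+1, ..., b."""
    n = b - a + 1                       # number of equally likely values
    values = range(a, b + 1)
    EX = sum(F(v, n) for v in values)
    EX2 = sum(F(v * v, n) for v in values)
    var = EX2 - EX ** 2
    return EX, EX2, var

# the case in Problem 2: integers from -3 to 4
EX, EX2, var = discrete_uniform_moments(-3, 4)
```

The brute-force mean agrees with (a+b)/2 = (-3+4)/2 = 1/2, and the other moments match the values above.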

(b) First, note that

  Σ_{i=a}^{b} i(i-1) = Σ_{i=a}^{b} [ (i+1)i(i-1) - i(i-1)(i-2) ] / 3
    = (1/3) ( Σ_{i=a}^{b} (i+1)i(i-1) - Σ_{i=a}^{b} i(i-1)(i-2) )
    = (1/3) ( (b+1)b(b-1) - a(a-1)(a-2) ),

where the last equality comes from the fact that many terms in the first sum are repeated in the second sum and hence cancel (a telescoping sum). Now,

  Σ_{i=a}^{b} i^2 = Σ_{i=a}^{b} (i(i-1) + i) = Σ_{i=a}^{b} i(i-1) + Σ_{i=a}^{b} i
    = (1/3) ( (b+1)b(b-1) - a(a-1)(a-2) ) + n(a+b)/2.

Therefore,

  E[X^2] = (1/n) Σ_{i=a}^{b} i^2 = (1/(3n)) ( (b+1)b(b-1) - a(a-1)(a-2) ) + (a+b)/2
    = (1/3)a^2 - (1/6)a + (1/3)ab + (1/6)b + (1/3)b^2.

(c) Var X = E[X^2] - (EX)^2 = (1/12)(b-a)(b-a+2) = (1/12)(n-1)(n+1) = (n^2 - 1)/12.

(d) σ_X = sqrt(Var X) = sqrt((n^2 - 1)/12).

Problem 3. (Expectation + pmf + Gambling + Effect of miscalculation of probability) In the eighteenth century, the famous French mathematician Jean Le Rond d'Alembert, author of several works on probability, analyzed the toss of two coins. He reasoned that because this experiment has THREE outcomes (the number of heads that turns up in those two tosses can be 0, 1, or 2), the chances of each must be 1 in 3. In other words, if we let N be the number of heads that shows up, Alembert would say that [Mlodinow, 2008, pp. 50-51]

  p_N(n) = 1/3 for n = 0, 1, 2.  (7.1)

7-4

We know that Alembert's conclusion was wrong. His three outcomes are not equally likely, and hence the classical probability formula cannot be applied directly. The key is to realize that there are FOUR outcomes which are equally likely. We should not consider 0, 1, or 2 heads as the possible outcomes. There are in fact four equally likely outcomes: (heads, heads), (heads, tails), (tails, heads), and (tails, tails). These are the 4 possibilities that make up the sample space. The actual pmf for N is

  p_N(n) = { 1/4, n = 0, 2,
           { 1/2, n = 1,
           { 0,   otherwise.

Suppose you travel back in time and meet Alembert. You could make the following bet with Alembert to gain some easy money. The bet is that if the result of a toss of two coins contains exactly one head, then he would pay you $150. Otherwise, you would pay him $100. Let R be Alembert's profit from this bet and Y be your profit from this bet.

(a) Then, R = -150 if you win and R = +100 otherwise. Use Alembert's miscalculated probabilities from (7.1) to determine the pmf of R (from Alembert's belief).

(b) Use Alembert's miscalculated probabilities from (7.1) (or the corresponding (miscalculated) pmf found in part (a)) to calculate ER, the expected profit for Alembert. Remark: You should find that ER > 0, and hence Alembert will be quite happy to accept your bet.

(c) Use the actual probabilities to determine the pmf of R.

(d) Use the actual pmf to determine ER. Remark: You should find that ER < 0, and hence Alembert should not accept your bet if he calculates the probabilities correctly.

(e) Note that Y = +150 if you win and Y = -100 otherwise. Use the actual probabilities to determine the pmf of Y.

(f) Use the actual probabilities to determine EY. Remark: You should find that EY > 0. This is the amount of money that you expect to gain each time you play with Alembert.
Of course, Alembert, who still believes that his calculation is correct, will ask you to play this bet again and again, believing that he will make a profit in the long run. By miscalculating probabilities, one can make wrong decisions (and lose a lot of money)!

Solution:

7-5

(a) P[R = -150] = P[N = 1] and P[R = +100] = P[N ≠ 1] = P[N = 0] + P[N = 2]. So,

  p_R(r) = { p_N(1),           r = -150,
           { p_N(0) + p_N(2),  r = +100,
           { 0,                otherwise.

Using Alembert's miscalculated pmf,

  p_R(r) = { 1/3, r = -150,
           { 2/3, r = +100,
           { 0,   otherwise.

(b) From p_R(r) in part (a), we have

  ER = Σ_r r p_R(r) = (1/3)(-150) + (2/3)(+100) = 50/3 ≈ 16.67.

(c) Again,

  p_R(r) = { p_N(1),           r = -150,
           { p_N(0) + p_N(2),  r = +100,
           { 0,                otherwise.

Using the actual pmf,

  p_R(r) = { 1/2, r = -150 or +100,
           { 0,   otherwise.

(d) From p_R(r) in part (c), we have

  ER = Σ_r r p_R(r) = (1/2)(-150) + (1/2)(+100) = -25.

(e) Observe that Y = -R. Hence, using the answer from part (c), we have

  p_Y(y) = { 1/2, y = +150 or -100,
           { 0,   otherwise.

(f) Observe that Y = -R. Hence, EY = -ER. Using the actual probabilities, ER = -25 from part (d). Hence, EY = +25.

Extra Questions

Here are some questions for those who want extra practice.

7-6
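The expected-profit calculations for the bet in Problem 3 can be sketched in a few lines of Python (the pmfs are the ones derived in that solution):

```python
from fractions import Fraction as F

def expectation(pmf):
    """Expected value of a discrete pmf given as {value: probability}."""
    return sum(v * p for v, p in pmf.items())

# Alembert's (mis)calculated pmf of his profit R: he loses 150 w.p. 1/3
R_alembert = {-150: F(1, 3), +100: F(2, 3)}

# actual pmf of R, and your profit Y = -R
R_actual = {-150: F(1, 2), +100: F(1, 2)}
Y_actual = {-r: p for r, p in R_actual.items()}
```

Under his own pmf the bet looks favorable to Alembert (ER = 50/3 > 0); under the actual pmf it loses him 25 per play, which is exactly your gain.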

Problem 4. A random variable X has support containing only two numbers. Its expected value is EX = 5. Its variance is Var X = 3. Give an example of the pmf of such a random variable.

Solution: We first find σ_X = sqrt(Var X) = sqrt(3). Recall that this is (roughly) the average deviation from the mean. Because X takes only two values, we can place them at exactly ±sqrt(3) from the mean; that is, x₁ = 5 - sqrt(3) and x₂ = 5 + sqrt(3). In this case, we automatically have EX = 5 and Var X = 3. Hence, one example of such a pmf is

  p_X(x) = { 1/2, x = 5 ± sqrt(3),
           { 0,   otherwise.

We can also try to find a general formula for x₁ and x₂. If we let p = P[X = x₂], then q = 1 - p = P[X = x₁]. Given p, the values of x₁ and x₂ must satisfy two conditions: EX = m and Var X = σ². (In our case, m = 5 and σ² = 3.) From EX = m, we must have

  x₁ q + x₂ p = m;  (7.2)

that is, x₁ = m/q - x₂ (p/q). From Var X = σ², we have E[X²] = Var X + (EX)² = σ² + m², and hence we must have

  x₁² q + x₂² p = σ² + m².  (7.3)

Substituting x₁ from (7.2) into (7.3), we have

  x₂² p - 2 x₂ m p + (p m² - q σ²) = 0,

whose solutions are

  x₂ = ( 2mp ± sqrt(4m²p² - 4p(pm² - qσ²)) ) / (2p) = ( 2mp ± 2σ sqrt(pq) ) / (2p) = m ± σ sqrt(q/p).

Using (7.2), we have

  x₁ = m/q - (p/q)( m ± σ sqrt(q/p) ) = m ∓ σ sqrt(p/q).

Therefore, for any given p, there are two pmfs:

  p_X(x) = { 1 - p, x = m - σ sqrt(p/(1-p)),
           { p,     x = m + σ sqrt((1-p)/p),
           { 0,     otherwise,

7-7

or

  p_X(x) = { 1 - p, x = m + σ sqrt(p/(1-p)),
           { p,     x = m - σ sqrt((1-p)/p),
           { 0,     otherwise.

Problem 5. For each of the following families of random variables X, find the value(s) of x which maximize p_X(x). (This can be interpreted as the mode of X.)

(a) P(α)
(b) Binomial(n, p)
(c) G₀(β)
(d) G₁(β)

Remark [Y&G, p. 66]: For statisticians, the mode is the most common number in the collection of observations: there are as many or more numbers with that value than with any other value. If there are two or more numbers with this property, the collection of observations is called multimodal. In probability theory, a mode of random variable X is a number x_mode satisfying p_X(x_mode) ≥ p_X(x) for all x. For statisticians, the median is a number in the middle of the set of numbers, in the sense that an equal number of members of the set are below the median and above the median. In probability theory, a median, X_median, of random variable X is a number that satisfies P[X < X_median] = P[X > X_median]. Neither the mode nor the median of a random variable X need be unique. A random variable can have several modes or medians.

Solution: We first note that when α > 0, p ∈ (0, 1), n ∈ ℕ, and β ∈ (0, 1), the above pmfs will be strictly positive for some values of x. Hence, we can discard those x at which p_X(x) = 0. The remaining points are all integers. To compare them, we will evaluate the ratio p_X(i+1)/p_X(i).

(a) For the Poisson pmf, we have

  p_X(i+1)/p_X(i) = ( e^{-α} α^{i+1}/(i+1)! ) / ( e^{-α} α^i / i! ) = α/(i+1).

Notice that

7-8

  p_X(i+1)/p_X(i) > 1 if and only if i < α - 1;
  p_X(i+1)/p_X(i) = 1 if and only if i = α - 1;
  p_X(i+1)/p_X(i) < 1 if and only if i > α - 1.

Let τ = α - 1. This implies that τ is the place where things change. Moving from i to i + 1, the probability strictly increases if i < τ. When i > τ, the next probability value (at i + 1) will decrease.

(i) Suppose α ∈ (0, 1). Then α - 1 < 0, and hence i > α - 1 for all i. (Note that the i are nonnegative integers.) This implies that the pmf is a strictly decreasing function, and hence the maximum occurs at the first i, which is i = 0.

(ii) Suppose α ∈ ℕ. Then the pmf will be strictly increasing until we reach i = α - 1, at which point the next probability value (at α) is the same. Then, as we further increase i, the pmf is strictly decreasing. Therefore, the maximum occurs at both α - 1 and α.

(iii) Suppose α ∉ ℕ and α > 1. Then there is no integer i equal to α - 1. The pmf is strictly increasing, where the last increase is from i = ⌊α⌋ - 1 to i + 1 = ⌊α⌋. After this, the pmf is strictly decreasing. Hence, the maximum occurs at ⌊α⌋.

To summarize,

  arg max_x p_X(x) = { 0,            α ∈ (0, 1),
                     { α - 1 and α,  α is an integer,
                     { ⌊α⌋,          α > 1 is not an integer.

(b) For the binomial pmf, we have

  p_X(i+1)/p_X(i) = ( n!/((i+1)!(n-i-1)!) p^{i+1} (1-p)^{n-i-1} ) / ( n!/(i!(n-i)!) p^i (1-p)^{n-i} )
                  = ( (n-i) p ) / ( (i+1)(1-p) ).

Notice that

  p_X(i+1)/p_X(i) > 1 if and only if i < np - 1 + p = (n+1)p - 1;
  p_X(i+1)/p_X(i) = 1 if and only if i = (n+1)p - 1;
  p_X(i+1)/p_X(i) < 1 if and only if i > (n+1)p - 1.

7-9

Let τ = (n+1)p - 1. This implies that τ is the place where things change. Moving from i to i + 1, the probability strictly increases if i < τ. When i > τ, the next probability value (at i + 1) will decrease.

(i) Suppose (n+1)p is an integer. The pmf will strictly increase as a function of i, then stay at the same value at i = τ = (n+1)p - 1 and i + 1 = (n+1)p. Then it will strictly decrease. So, the maximum occurs at both (n+1)p - 1 and (n+1)p.

(ii) Suppose (n+1)p is not an integer. Then there is no integer i equal to τ. The pmf strictly increases, where the last increase occurs when we go from i = ⌊τ⌋ to i + 1 = ⌊τ⌋ + 1. After this, the pmf is strictly decreasing. Hence, the maximum is unique and occurs at ⌊τ⌋ + 1 = ⌊(n+1)p⌋.

To summarize,

  arg max_x p_X(x) = { (n+1)p - 1 and (n+1)p,  (n+1)p is an integer,
                     { ⌊(n+1)p⌋,               (n+1)p is not an integer.

(c) p_X(i+1)/p_X(i) = β < 1. Hence, p_X(i) is strictly decreasing. The maximum occurs at the smallest value of i, which is 0.

(d) p_X(i+1)/p_X(i) = β < 1. Hence, p_X(i) is strictly decreasing. The maximum occurs at the smallest value of i, which is 1.

Problem 6. An article in Information Security Technical Report ["Malicious Software: Past, Present and Future" (2004, Vol. 9, pp. 6-18)] provided the data (shown in Figure 7.1) on the top ten malicious software instances for 2002. The clear leader in the number of registered incidences for the year 2002 was the Internet worm Klez. This virus was first detected on 26 October 2001, and it has held the top spot among malicious software for the longest period in the history of virology. Suppose that 20 malicious software instances are reported. Assume that the malicious sources can be assumed to be independent.

(a) What is the probability that at least one instance is Klez?

(b) What is the probability that three or more instances are Klez?
(c) What are the expected value and standard deviation of the number of Klez instances among the 20 reported? 7-10

Figure 7.1: The 10 most widespread malicious programs for 2002 (Source: Kaspersky Labs)

  Place  Name               % Instances
  1      I-Worm.Klez          61.22%
  2      I-Worm.Lentin        20.52%
  3      I-Worm.Tanatos        2.09%
  4      I-Worm.BadtransII     1.31%
  5      Macro.Word97.Thus     1.19%
  6      I-Worm.Hybris         0.60%
  7      I-Worm.Bridex         0.32%
  8      I-Worm.Magistr        0.30%
  9      Win95.CIH             0.27%
  10     I-Worm.Sircam         0.24%

Solution: Let N be the number of instances (among the 20) that are Klez. Then, N ~ binomial(n, p) where n = 20 and p = 0.6122.

(a) P[N ≥ 1] = 1 - P[N < 1] = 1 - P[N = 0] = 1 - p_N(0) = 1 - (0.3878)^20 ≈ 1.

(b) P[N ≥ 3] = 1 - P[N < 3] = 1 - (P[N = 0] + P[N = 1] + P[N = 2])
    = 1 - Σ_{k=0}^{2} C(20, k) (0.6122)^k (0.3878)^{20-k} ≈ 1.

(c) EN = np = 20 × 0.6122 = 12.244 and

  σ_N = sqrt(Var N) = sqrt(np(1-p)) = sqrt(20 × 0.6122 × 0.3878) ≈ 2.179.

7-11
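The binomial computations for the Klez instances in Problem 6 can be sketched in Python with `math.comb` (n = 20 and p = 0.6122 as above):

```python
from math import comb, sqrt

n, p = 20, 0.6122   # number of reported instances, P[instance is Klez]

def binom_pmf(k):
    """P[N = k] for N ~ binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# (a) at least one Klez instance
p_at_least_one = 1 - binom_pmf(0)

# (b) three or more Klez instances
p_three_or_more = 1 - sum(binom_pmf(k) for k in range(3))

# (c) mean and standard deviation of N
mean = n * p
std = sqrt(n * p * (1 - p))
```

Both probabilities come out extremely close to 1, which is why the worked answers round them to 1.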

ECS 315: Probability and Random Processes 2014/1
HW 8 Due: Nov 6
Lecturer: Prapun Suksompong, Ph.D.

Instructions
(a) ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected, so you should work on all of them.
(b) It is important that you try to solve all problems. (5 pt) The extra question at the end is optional.
(c) Late submission will be heavily penalized.
(d) Write down all the steps that you have done to obtain your answers. You may not get full credit, even when your answer is correct, without showing how you got your answer.

Problem 1 (Yates and Goodman, 2005, Q3.1.3). The CDF of a random variable W is

  F_W(w) = { 0,                 w < -5,
           { (w + 5)/8,         -5 ≤ w < -3,
           { 1/4,               -3 ≤ w < 3,
           { 1/4 + 3(w - 3)/8,  3 ≤ w < 5,
           { 1,                 w ≥ 5.

(a) What is P[W ≤ 4]?
(b) What is P[-2 < W ≤ 2]?
(c) What is P[W > 0]?
(d) What is the value of a such that P[W ≤ a] = 1/2?

Problem 2 (Yates and Goodman, 2005, Q3.2.1). The random variable X has probability density function

  f_X(x) = { cx, 0 ≤ x ≤ 2,
           { 0,  otherwise.

Use the pdf to find

8-1

(a) the constant c,
(b) P[0 ≤ X ≤ 1],
(c) P[-1/2 ≤ X ≤ 1/2],
(d) the cdf F_X(x).

Problem 3 (Yates and Goodman, 2005, Q3.2.3). The CDF of random variable W is

  F_W(w) = { 0,                 w < -5,
           { (w + 5)/8,         -5 ≤ w < -3,
           { 1/4,               -3 ≤ w < 3,
           { 1/4 + 3(w - 3)/8,  3 ≤ w < 5,
           { 1,                 w ≥ 5.

Find its pdf f_W(w).

Problem 4 (Yates and Goodman, 2005, Q3.3.4). The pdf of random variable Y is

  f_Y(y) = { y/2, 0 ≤ y < 2,
           { 0,   otherwise.

What are E[Y] and Var Y?

Problem 5 (Yates and Goodman, 2005, Q3.3.6). The cdf of random variable V is

  F_V(v) = { 0,              v < -5,
           { (v + 5)²/144,   -5 ≤ v < 7,
           { 1,              v ≥ 7.

(a) What is E[V]?
(b) What is Var[V]?
(c) What is E[V³]?

8-2
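As a numerical sanity check for Problem 2 (a sketch only; the worked solutions are on the handwritten pages that follow): normalization forces ∫₀² cx dx = 2c = 1, so c = 1/2, and then P[0 ≤ X ≤ 1] = ∫₀¹ x/2 dx = 1/4. A simple midpoint Riemann sum confirms this without any external libraries:

```python
# Verify that c = 1/2 normalizes f_X(x) = c*x on [0, 2],
# using a plain midpoint Riemann sum (no SciPy required).

def integrate(f, a, b, steps=100_000):
    """Midpoint Riemann sum of f over [a, b]."""
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

c = 0.5
f_X = lambda x: c * x            # the pdf on [0, 2]; zero elsewhere

total = integrate(f_X, 0.0, 2.0)   # should be 1 (valid pdf)
p_01 = integrate(f_X, 0.0, 1.0)    # P[0 <= X <= 1], should be 1/4
```

The midpoint rule is exact for linear integrands, so both results match the closed forms up to floating-point error.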

[Handwritten solutions for HW 8 (ECS HW8 Pages 1-8): Y&G Q3.1.3, Q3.2.1, Q3.2.3, Q3.3.4, and Q3.3.6. The scanned pages are not transcribed here.]

ECS 315: Probability and Random Processes 2014/1
HW Solution 9 Due: Nov 14
Lecturer: Prapun Suksompong, Ph.D.

Instructions
(a) ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected, so you should work on all of them.
(b) It is important that you try to solve all problems. (5 pt) The extra question at the end is optional.
(c) Late submission will be heavily penalized.
(d) Write down all the steps that you have done to obtain your answers. You may not get full credit, even when your answer is correct, without showing how you got your answer.

Problem 1 (Randomly Phased Sinusoid). Suppose Θ is a uniform random variable on the interval (0, 2π).

(a) Consider another random variable X defined by

  X = 5 cos(7t + Θ),

where t is some constant. Find E[X].

(b) Consider another random variable Y defined by

  Y = 5 cos(7t₁ + Θ) · 5 cos(7t₂ + Θ),

where t₁ and t₂ are some constants. Find E[Y].

Solution: First, because Θ is a uniform random variable on the interval (0, 2π), we know that f_Θ(θ) = (1/(2π)) 1_{(0,2π)}(θ). Therefore, for any function g, we have

  E[g(Θ)] = ∫ g(θ) f_Θ(θ) dθ.

9-1

(a) X is a function of Θ. E[X] = 5 E[cos(7t + Θ)] = 5 ∫₀^{2π} (1/(2π)) cos(7t + θ) dθ. Now, we know that integration over a whole number of cycles of a sinusoid gives 0. So, E[X] = 0.

(b) Y is another function of Θ.

  E[Y] = E[ 5 cos(7t₁ + Θ) · 5 cos(7t₂ + Θ) ]
       = ∫₀^{2π} 5 cos(7t₁ + θ) · 5 cos(7t₂ + θ) (1/(2π)) dθ
       = (25/(2π)) ∫₀^{2π} cos(7t₁ + θ) cos(7t₂ + θ) dθ.

Recall¹ the cosine identity

  cos(a) cos(b) = (1/2)( cos(a + b) + cos(a - b) ).

Therefore,

  EY = (25/(4π)) ∫₀^{2π} [ cos(7(t₁ + t₂) + 2θ) + cos(7(t₁ - t₂)) ] dθ
     = (25/(4π)) ( ∫₀^{2π} cos(7(t₁ + t₂) + 2θ) dθ + ∫₀^{2π} cos(7(t₁ - t₂)) dθ ).

The first integral gives 0 because it is an integration over two periods of a sinusoid. The integrand in the second integral is a constant. So,

  EY = (25/(4π)) cos(7(t₁ - t₂)) ∫₀^{2π} dθ = (25/(4π)) cos(7(t₁ - t₂)) (2π) = (25/2) cos(7(t₁ - t₂)).

Problem 2 (Yates and Goodman, 2005, Q3.4.5). X is a continuous uniform RV on the interval (-5, 5).

(a) What is its pdf f_X(x)?
(b) What is its cdf F_X(x)?

¹This identity can be derived easily via Euler's formula:

  cos(a) cos(b) = ((e^{ja} + e^{-ja})/2)((e^{jb} + e^{-jb})/2)
                = (1/4)( e^{j(a+b)} + e^{j(a-b)} + e^{-j(a-b)} + e^{-j(a+b)} )
                = (1/2)( cos(a + b) + cos(a - b) ).

9-2
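Both expectations in Problem 1 can be checked by numerically averaging over θ (a sketch; t, t₁, t₂ below are arbitrary test constants, not values from the problem):

```python
from math import cos, pi

def E_over_theta(g, steps=100_000):
    """E[g(Theta)] for Theta ~ uniform(0, 2*pi), via a midpoint Riemann sum."""
    h = 2 * pi / steps
    return sum(g((i + 0.5) * h) for i in range(steps)) * h / (2 * pi)

t, t1, t2 = 0.3, 0.7, -1.1   # arbitrary constants for the check

EX = E_over_theta(lambda th: 5 * cos(7 * t + th))
EY = E_over_theta(lambda th: 5 * cos(7 * t1 + th) * 5 * cos(7 * t2 + th))

predicted = 12.5 * cos(7 * (t1 - t2))   # (25/2) cos(7(t1 - t2)) from above
```

E[X] comes out at 0 and E[Y] matches (25/2)cos(7(t₁ - t₂)), independent of the particular constants chosen.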

Solution: For a uniform random variable X on the interval (a, b), we know that

  f_X(x) = { 1/(b - a), a ≤ x ≤ b,
           { 0,         otherwise,

and

  F_X(x) = { 0,                x < a,
           { (x - a)/(b - a),  a ≤ x ≤ b,
           { 1,                x > b.

In this problem, we have a = -5 and b = 5.

(a) f_X(x) = { 1/10, -5 ≤ x ≤ 5,
            { 0,    otherwise.

(b) F_X(x) = { 0,           x < -5,
            { (x + 5)/10,  -5 ≤ x ≤ 5,
            { 1,           x > 5.

(c) EX = ∫ x f_X(x) dx = ∫_{-5}^{5} x (1/10) dx = (1/10)(x²/2)|_{-5}^{5} = 0.

In general,

  EX = ∫_a^b x (1/(b-a)) dx = (1/(b-a)) (x²/2)|_a^b = (b² - a²)/(2(b-a)) = (a + b)/2.

With a = -5 and b = 5, we have EX = 0.

(d) E[X⁵] = ∫ x⁵ f_X(x) dx = ∫_{-5}^{5} x⁵ (1/10) dx = (1/10)(x⁶/6)|_{-5}^{5} = (1/60)(5⁶ - (-5)⁶) = 0.

In general,

  E[X⁵] = ∫_a^b x⁵ (1/(b-a)) dx = (1/(b-a))(x⁶/6)|_a^b = (b⁶ - a⁶)/(6(b-a)).

With a = -5 and b = 5, we have E[X⁵] = 0.

(e) In general,

  E[e^X] = ∫_a^b e^x (1/(b-a)) dx = (1/(b-a)) e^x|_a^b = (e^b - e^a)/(b - a).

With a = -5 and b = 5, we have E[e^X] = (e⁵ - e^{-5})/10 ≈ 14.84.

9-3
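Parts (c)-(e) for X ~ uniform(-5, 5) can be checked against the closed forms with a midpoint Riemann sum, sketched below:

```python
from math import exp

a, b = -5.0, 5.0

def E(g, steps=200_000):
    """E[g(X)] for X ~ uniform(a, b), via a midpoint Riemann sum."""
    h = (b - a) / steps
    return sum(g(a + (i + 0.5) * h) for i in range(steps)) * h / (b - a)

EX = E(lambda x: x)            # should be (a+b)/2 = 0
EX5 = E(lambda x: x ** 5)      # odd moment of a symmetric pdf: 0
Ee = E(lambda x: exp(x))       # should match (e^b - e^a)/(b - a)

closed_form = (exp(b) - exp(a)) / (b - a)
```

The odd moments vanish because the midpoints are symmetric about 0, and the numerical E[e^X] agrees with (e⁵ - e^{-5})/10 ≈ 14.84.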

Problem 3. A random variable X is a Gaussian random variable if its pdf is given by

  f_X(x) = (1/(√(2π) σ)) e^{-(1/2)((x - m)/σ)²},

for some constants m and σ. Furthermore, when a Gaussian random variable has m = 0 and σ = 1, we say that it is a standard Gaussian random variable. There is no closed-form expression for the cdf of the standard Gaussian random variable. The cdf itself is denoted by Φ, and its values (or its complementary values Q(·) = 1 - Φ(·)) are traditionally provided by a table. We refer to this kind of table as the Φ table. Examples of such tables are Table 3.1 and Table 3.2 in [Y&G]. Suppose Z is a standard Gaussian random variable.

(a) Use the Φ table to find the following probabilities:
(i) P[Z < 1.52]
(ii) P[Z < -1.52]
(iii) P[Z > 1.52]
(iv) P[Z > -1.52]
(v) P[-1.36 < Z < 1.52]

(b) Use the Φ table to find the value of c that satisfies each of the following relations.
(i) P[Z > c] = 0.14
(ii) P[-c < Z < c] = 0.95

Solution:

(a) (i) P[Z < 1.52] = Φ(1.52) = 0.9357.

(ii) P[Z < -1.52] = Φ(-1.52) = 1 - Φ(1.52) = 1 - 0.9357 = 0.0643.

9-4

(iii) P[Z > 1.52] = 1 - P[Z < 1.52] = 1 - 0.9357 = 0.0643.

(iv) It is straightforward to see (by symmetry) that the area of P[Z > -1.52] is the same as P[Z < 1.52] = Φ(1.52). So, P[Z > -1.52] = 0.9357. Alternatively, P[Z > -1.52] = 1 - P[Z ≤ -1.52] = 1 - Φ(-1.52) = 1 - (1 - Φ(1.52)) = Φ(1.52).

(v) P[-1.36 < Z < 1.52] = Φ(1.52) - Φ(-1.36) = Φ(1.52) - (1 - Φ(1.36)) = Φ(1.52) + Φ(1.36) - 1 = 0.9357 + 0.9131 - 1 = 0.8488.

(b) (i) P[Z > c] = 1 - P[Z ≤ c] = 1 - Φ(c). So, we need 1 - Φ(c) = 0.14, or Φ(c) = 1 - 0.14 = 0.86. In the Φ table, we do not have exactly 0.86, but we have 0.8599 (at 1.08) and 0.8621 (at 1.09). Because 0.86 is closer to 0.8599, we answer the value of c for which Φ(c) = 0.8599. Therefore, c ≈ 1.08.

(ii) P[-c < Z < c] = Φ(c) - Φ(-c) = Φ(c) - (1 - Φ(c)) = 2Φ(c) - 1. So, we need 2Φ(c) - 1 = 0.95, or Φ(c) = 0.975. From the Φ table, we have c ≈ 1.96.

Problem 4. The peak temperature T, as measured in degrees Fahrenheit, on a July day in New Jersey is a N(85, 100) random variable. Remark: Do not forget that, for our class, the second parameter in N(·, ·) is the variance (not the standard deviation).

(a) Express the cdf of T in terms of the Φ function. Hint: Recall that the cdf of a random variable T is given by F_T(t) = P[T ≤ t]. For T ~ N(m, σ²),

  F_T(t) = ∫_{-∞}^{t} (1/(√(2π) σ)) e^{-(1/2)((x - m)/σ)²} dx.

The Φ function, which is the cdf of the standard Gaussian random variable, is given by

  Φ(z) = ∫_{-∞}^{z} (1/√(2π)) e^{-x²/2} dx.

(b) Express each of the following probabilities in terms of the Φ function(s). Make sure that the arguments of the Φ functions are positive. (Positivity is required so that we can directly use the Φ/Q tables to evaluate the probabilities.)

(i) P[T > 100]

9-5

(ii) P[T < 60]
(iii) P[70 ≤ T ≤ 100]

(c) Express each of the probabilities in part (b) in terms of the Q function(s). Again, make sure that the arguments of the Q functions are positive.

(d) Evaluate each of the probabilities in part (b) using the Φ/Q tables.

(e) Observe that Table 3.1 stops at z = 2.99 and Table 3.2 starts at z = 3.00. Why is it better to give a table for Q(z) instead of Φ(z) when z is large?

Solution:

(a) Continuing from the hint, we perform a change of variable using y = (x - m)/σ to get

  F_T(t) = ∫_{-∞}^{(t-m)/σ} (1/√(2π)) e^{-y²/2} dy = Φ((t - m)/σ).

Here, T ~ N(85, 10²). Therefore, F_T(t) = Φ((t - 85)/10).

(b) (i) P[T > 100] = 1 - P[T ≤ 100] = 1 - F_T(100) = 1 - Φ((100 - 85)/10) = 1 - Φ(1.5).

(ii) P[T < 60] = P[T ≤ 60] because T is a continuous random variable and hence P[T = 60] = 0. Now, P[T ≤ 60] = F_T(60) = Φ((60 - 85)/10) = Φ(-2.5) = 1 - Φ(2.5). Note that, for the last equality, we use the fact that Φ(-x) = 1 - Φ(x).

(iii) P[70 ≤ T ≤ 100] = F_T(100) - F_T(70) = Φ((100 - 85)/10) - Φ((70 - 85)/10) = Φ(1.5) - Φ(-1.5) = Φ(1.5) - (1 - Φ(1.5)) = 2Φ(1.5) - 1.

(c) In this question, we use the fact that Q(x) = 1 - Φ(x).

(i) 1 - Φ(1.5) = Q(1.5).
(ii) 1 - Φ(2.5) = Q(2.5).

9-6

(iii) 2Φ(1.5) - 1 = 2(1 - Q(1.5)) - 1 = 1 - 2Q(1.5).

(d) (i) 1 - Φ(1.5) = 1 - 0.9332 = 0.0668.
(ii) 1 - Φ(2.5) = 1 - 0.9938 = 0.0062.
(iii) 2Φ(1.5) - 1 = 2(0.9332) - 1 = 0.8664.

(e) When z is large, Φ(z) will start with 0.999... The first few significant digits will all be the same and hence are not quite useful to tabulate; a Q(z) table shows the informative digits directly.

Extra Questions

Here are optional questions for those who want more practice.

Problem 5. Let X be a uniform random variable on the interval [0, 1]. Set

  A = [0, 1/2),
  B = [0, 1/4) ∪ [1/2, 3/4), and
  C = [0, 1/8) ∪ [1/4, 3/8) ∪ [1/2, 5/8) ∪ [3/4, 7/8).

Are the events [X ∈ A], [X ∈ B], and [X ∈ C] independent?

Solution: Note that

  P[X ∈ A] = ∫_0^{1/2} dx = 1/2,
  P[X ∈ B] = ∫_0^{1/4} dx + ∫_{1/2}^{3/4} dx = 1/2, and
  P[X ∈ C] = ∫_0^{1/8} dx + ∫_{1/4}^{3/8} dx + ∫_{1/2}^{5/8} dx + ∫_{3/4}^{7/8} dx = 1/2.

9-7

Now, for pairs of events, we have

  P([X ∈ A] ∩ [X ∈ B]) = ∫_0^{1/4} dx = 1/4 = P[X ∈ A] P[X ∈ B],  (9.1)

  P([X ∈ A] ∩ [X ∈ C]) = ∫_0^{1/8} dx + ∫_{1/4}^{3/8} dx = 1/4 = P[X ∈ A] P[X ∈ C], and  (9.2)

  P([X ∈ B] ∩ [X ∈ C]) = ∫_0^{1/8} dx + ∫_{1/2}^{5/8} dx = 1/4 = P[X ∈ B] P[X ∈ C].  (9.3)

Finally,

  P([X ∈ A] ∩ [X ∈ B] ∩ [X ∈ C]) = ∫_0^{1/8} dx = 1/8 = P[X ∈ A] P[X ∈ B] P[X ∈ C].  (9.4)

From (9.1), (9.2), (9.3) and (9.4), we can conclude that the events [X ∈ A], [X ∈ B], and [X ∈ C] are independent.

Problem 6 (Q3.5.6). Solve this question using Table 3.1 and/or Table 3.2 from [Yates & Goodman, 2005]: A professor pays 25 cents for each blackboard error made in lecture to the student who points out the error. In a career of n years filled with blackboard errors, the total amount in dollars paid can be approximated by a Gaussian random variable Y_n with expected value 40n and variance 100n.

(a) What is the probability that Y_20 exceeds 1000?
(b) How many years n must the professor teach in order that P[Y_n > 1000] > 0.99?

Solution: We are given² that Y_n ~ N(40n, 100n). Recall that when X ~ N(m, σ²),

  F_X(x) = Φ((x - m)/σ).  (9.5)

²Note that the expected value and the variance in this question are proportional to n. This naturally occurs when we consider the sum of i.i.d. random variables. The approximation by a Gaussian random variable is a result of the central limit theorem (CLT).

9-8

(a) Here n = 20. So, we have Y_20 ~ N(40 × 20, 100 × 20) = N(800, 2000). For this random variable, m = 800 and σ = √2000 ≈ 44.72. We want to find P[Y_20 > 1000], which is the same as 1 - P[Y_20 ≤ 1000]. Expressing this quantity using the cdf, we have

  P[Y_20 > 1000] = 1 - F_{Y_20}(1000).

Apply (9.5) to get

  P[Y_20 > 1000] = 1 - Φ((1000 - 800)/√2000) = 1 - Φ(4.472) = Q(4.47) ≈ 3.9 × 10⁻⁶.

(b) Here, the value of n is what we want. So, we will need to keep the formula in the general form. Again, from (9.5), for Y_n ~ N(40n, 100n), we have

  P[Y_n > 1000] = 1 - F_{Y_n}(1000) = 1 - Φ((1000 - 40n)/(10√n)) = 1 - Φ((100 - 4n)/√n).

To find the value of n such that P[Y_n > 1000] > 0.99, we will first find the value of z which makes

  1 - Φ(z) > 0.99.  (9.6)

At this point, we may try to solve for the value of z by noting that (9.6) is the same as

  Φ(z) < 0.01.  (9.7)

Unfortunately, the tables that we have start with Φ(0) = 0.5 and increase to something close to 1 when the argument of the Φ function is large. This means we can't directly find 0.01 in the table. Of course, 0.99 is in there, and therefore we will need to solve (9.6) via another approach. To do this, we use another property of the Φ function. Recall that 1 - Φ(z) = Φ(-z). Therefore, (9.6) is the same as

  Φ(-z) > 0.99.  (9.8)

From our table, we can then conclude that (9.7) (which is the same as (9.8)) will happen when -z > 2.33, that is, z < -2.33. (If you have MATLAB, then you can get a more accurate value of 2.3263.) Now, plugging in z = (100 - 4n)/√n, we have

  (100 - 4n)/√n < -2.33, that is, (4n - 100)/√n > 2.33.

To solve for n, we first let x = √n. In this case, we have (4x² - 100)/x > 2.33 or, equivalently, 4x² - 2.33x - 100 > 0. The two roots are x ≈ -4.72 and x ≈ 5.30. So, we need x < -4.72 or x > 5.30. Note that x = √n and therefore cannot be negative. So, we only have one case; that is, we need x > 5.30. Because n = x², we then conclude that we need n > 28.1 years.

9-9
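Both parts can be cross-checked in Python by computing Φ from `math.erf`, using Φ(z) = (1 + erf(z/√2))/2 (a sketch of the check, not part of the table-based solution):

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def P_exceeds_1000(n):
    """P[Y_n > 1000] for Y_n ~ N(40n, 100n)."""
    m, sigma = 40 * n, 10 * sqrt(n)
    return 1.0 - Phi((1000 - m) / sigma)

# (a) n = 20: essentially Q(4.47), i.e. on the order of 1e-6
p20 = P_exceeds_1000(20)

# (b) smallest integer n with P[Y_n > 1000] > 0.99
n_star = next(n for n in range(1, 100) if P_exceeds_1000(n) > 0.99)
```

The search confirms the threshold n > 28.1: the condition first holds at the integer n = 29, while n = 28 falls short.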

ECS 315: Probability and Random Processes 2014/1
HW Solution 10 Due: Nov 20
Lecturer: Prapun Suksompong, Ph.D.

Instructions
(a) ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected, so you should work on all of them.
(b) It is important that you try to solve all problems. (5 pt) The extra question at the end is optional.
(c) Late submission will be heavily penalized.
(d) Write down all the steps that you have done to obtain your answers. You may not get full credit, even when your answer is correct, without showing how you got your answer.

Problem 1. Consider each random variable X defined below. Let Y = 1 + 2X. (i) Find and sketch the pdf of Y, and (ii) determine whether Y belongs to any of the (popular) families discussed in class. If so, state the name of the family and find the corresponding parameters.

(a) X ~ U(0, 1)
(b) X ~ E(1)
(c) X ~ N(0, 1)

Solution: See the handwritten solution.

Problem 2. Consider each random variable X defined below. Let Y = 1 - 2X. (i) Find and sketch the pdf of Y, and (ii) determine whether Y belongs to any of the (popular) families discussed in class. If so, state the name of the family and find the corresponding parameters.

(a) X ~ U(0, 1)
(b) X ~ E(1)
(c) X ~ N(0, 1)

Solution: See the handwritten solution.

10-1

Problem 3. Let X ~ E(5) and Y = 2/X. Find

(a) F_Y(y).
(b) f_Y(y).
(c) EY.

Hint: Because (d/dy) e^{-10/y} = (10/y²) e^{-10/y} > 0 for y ≠ 0, we know that e^{-10/y} is an increasing function on our range of integration. In particular, consider y > 10/ln(2). Then, e^{-10/y} > 1/2. Hence,

  ∫_0^∞ (10/y) e^{-10/y} dy > ∫_{10/ln 2}^∞ (10/y) e^{-10/y} dy > ∫_{10/ln 2}^∞ (10/y)(1/2) dy = ∫_{10/ln 2}^∞ (5/y) dy.

Remark: To be technically correct, we should be a little more careful when writing Y = 2/X, because it is undefined when X = 0. Of course, this happens with 0 probability, so it won't create any serious problem. However, to avoid the problem, we may define Y by

  Y = { 2/X, X ≠ 0,
      { 0,   X = 0.

Solution:

(a) We consider two cases: y ≤ 0 and y > 0.

(i) Because X > 0, we know that Y = 2/X must be > 0 and hence F_Y(y) = 0 for y ≤ 0. Note that Y cannot be = 0: we would need X = ±∞ to make Y = 0, but ±∞ are not real numbers and therefore are not possible X values.

(ii) For y > 0,

  F_Y(y) = P[Y ≤ y] = P[2/X ≤ y] = P[X ≥ 2/y].

Note that, for the last equality, we can freely move X and y without worrying about flipping the inequality or dividing by zero, because both X and y considered here are strictly positive. Now, for X ~ E(λ) and x > 0, we have

  P[X ≥ x] = ∫_x^∞ λ e^{-λt} dt = -e^{-λt}|_x^∞ = e^{-λx}.

Therefore,

  F_Y(y) = e^{-5(2/y)} = e^{-10/y}.

10-2

Combining the two cases above, we have

F_Y(y) = { e^(-10/y), y > 0,
           0,         y <= 0.

(b) We show two methods for finding f_Y(y).

Method 1: Because we have already derived the cdf in the previous part, we can find the pdf from it via f_Y(y) = (d/dy) F_Y(y). This gives f_Y at all points except y = 0, where we simply set f_Y to 0. Hence,

f_Y(y) = { (10/y^2) e^(-10/y), y > 0,
           0,                  y <= 0.

Method 2: See the handwritten solution.

(c) We can find EY from the f_Y(y) found in the previous part, or we can work directly with f_X(x).

Method 1:

EY = int_{-inf}^inf y f_Y(y) dy = int_0^inf y (10/y^2) e^(-10/y) dy = int_0^inf (10/y) e^(-10/y) dy.

From the hint, we have

EY > int_{10/ln 2}^inf (10/y) e^(-10/y) dy > int_{10/ln 2}^inf (5/y) dy = 5 ln y |_{10/ln 2}^inf = inf.

Therefore, EY = inf.

Method 2:

EY = E[2/X] = int_0^inf (2/x) lambda e^(-lambda x) dx > int_0^1 (2/x) lambda e^(-lambda x) dx > int_0^1 (2/x) lambda e^(-lambda) dx = 2 lambda e^(-lambda) ln x |_0^1 = inf,

where the second inequality comes from the fact that for x in (0, 1), e^(-lambda x) > e^(-lambda).
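The cdf derived in part (a) can be cross-checked by numerically integrating the exponential density over [2/y, inf). A sketch using a plain midpoint rule and no external libraries:

```python
import math

# Numeric check of F_Y(y) = exp(-10/y) for Y = 2/X with X ~ E(5):
# F_Y(y) = P[X >= 2/y] = integral of 5*exp(-5x) over [2/y, infinity).

def F_Y_numeric(y, upper=50.0, n=200_000):
    """Midpoint-rule approximation; the tail beyond x = 50 is below
    exp(-250) and therefore negligible."""
    lo = 2.0 / y
    h = (upper - lo) / n
    return sum(5.0 * math.exp(-5.0 * (lo + (k + 0.5) * h))
               for k in range(n)) * h

for y in (0.5, 2.0, 10.0):
    exact = math.exp(-10.0 / y)
    assert abs(F_Y_numeric(y) - exact) < 1e-6
```

The divergence of EY cannot be checked this way (no finite integral converges to infinity), which is exactly why the analytic lower bound in the hint is needed.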

Problem 4. In wireless communication systems, fading is sometimes modeled by lognormal random variables. We say that a positive random variable Y is lognormal if ln Y is a normal random variable (say, with expected value m and variance sigma^2). Find the pdf of Y.

Hint: First, recall that ln is the natural log function (log base e). Let X = ln Y. Then, because Y is lognormal, we know that X ~ N(m, sigma^2). Next, write Y as a function of X.

Solution: We show two methods that can be applied here to find f_Y(y).

Method 1: Finding f_Y(y) from F_Y(y). Because X = ln(Y), we have Y = e^X. The exponential function gives strictly positive numbers, so Y is always strictly positive. In particular, F_Y(y) = 0 for y <= 0. Next, for y > 0, by definition, F_Y(y) = P[Y <= y]. Plugging in Y = e^X, we have F_Y(y) = P[e^X <= y]. Because the exponential function is strictly increasing, the event [e^X <= y] is the same as the event [X <= ln y]. Therefore, F_Y(y) = P[X <= ln y] = F_X(ln y). Combining the two cases above, we have

F_Y(y) = { F_X(ln y), y > 0,
           0,         y <= 0.

Finally, we apply f_Y(y) = (d/dy) F_Y(y). For y < 0, we have f_Y(y) = (d/dy) 0 = 0. For y > 0,

f_Y(y) = (d/dy) F_Y(y) = (d/dy) F_X(ln y) = f_X(ln y) (d/dy) ln y = (1/y) f_X(ln y).   (10.1)

Therefore,

f_Y(y) = { (1/y) f_X(ln y), y > 0,
           0,               y < 0.

At y = 0, if we know that Y is a continuous random variable, we can assign any value, e.g. 0, to f_Y(0). Suppose this is the case. Then

f_Y(y) = { (1/y) f_X(ln y), y > 0,
           0,               otherwise.

Method 2: Finding f_Y(y) directly from f_X(x). Here we have Y = g(X) where g(x) = e^x. The exponential function always gives positive numbers, so for y <= 0 there is no (real-valued) x that satisfies g(x) = y. Therefore, f_Y(y) = 0 for y <= 0. For y > 0, there is exactly one x that satisfies g(x) = y, namely x = ln y. At x = ln y, the slope is g'(x) = e^x |_{x = ln y} = e^(ln y) = y. So,

f_Y(y) = f_X(x) / |g'(x)| = f_X(ln y) / y = (1/y) f_X(ln y).

Combining the two cases above, we have

f_Y(y) = { (1/y) f_X(ln y), y > 0,
           0,               otherwise.

Here, X ~ N(m, sigma^2). Therefore,

f_X(x) = (1 / (sqrt(2 pi) sigma)) e^(-(1/2)((x - m)/sigma)^2)

and

f_Y(y) = { (1 / (sqrt(2 pi) sigma y)) e^(-(1/2)((ln y - m)/sigma)^2), y > 0,
           0,                                                         otherwise.

Extra Question

Here is an optional question for those who want extra practice.

Problem 5. Consider a random variable X whose pdf is given by

f_X(x) = { c x^2, x in (1, 2),
           0,     otherwise.

Let Y = 4/X^1.5.

(a) Find EY.
(b) Find f_Y(y).

Solution: See the handwritten solution.
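Equation (10.1) can be verified numerically by differencing F_Y(y) = F_X(ln y). The sketch below builds the normal cdf from the error function; the parameters m = 1 and sigma = 0.5 are arbitrary values chosen only for the check:

```python
import math

# Check f_Y(y) = (1/y) f_X(ln y) for Y = e^X, X ~ N(m, sigma^2),
# against a central difference of F_Y(y) = F_X(ln y).
m, sigma = 1.0, 0.5   # illustrative values, not from the problem

def Phi(z):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def F_X(x):
    return Phi((x - m) / sigma)

def f_Y(y):
    """Lognormal pdf derived in the solution."""
    return (math.exp(-0.5 * ((math.log(y) - m) / sigma) ** 2)
            / (math.sqrt(2.0 * math.pi) * sigma * y))

h = 1e-6
for y in (0.5, 2.0, 5.0):
    numeric = (F_X(math.log(y + h)) - F_X(math.log(y - h))) / (2 * h)
    assert abs(numeric - f_Y(y)) < 1e-5
```

Any other (m, sigma) pair would do equally well; the check exercises only the chain-rule step in (10.1).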

[Handwritten solutions (scanned): Q1 on HW10 pages 1-2, Q2 on pages 3-4, Q3b SISO (Method 2) on pages 5-6, Q5 on pages 7-9.]

ECS 315: Probability and Random Processes 2014/1

HW 11 Due: Nov 14

Lecturer: Prapun Suksompong, Ph.D.

Instructions

(a) ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected, so you should work on all of them.

(b) It is important that you try to solve all problems. (5 pt) The extra question at the end is optional.

(c) Late submission will be heavily penalized.

(d) Write down all the steps you took to obtain your answers. Even when your answer is correct, you may not get full credit if you do not show how you obtained it.

Problem 1. Consider a random variable X whose pdf is given by

f_X(x) = { cx, 0 < x < 1,
           0,  otherwise.

(a) Find c.
(b) Find f_X(0.2) and f_X(0.8).
(c) Find P[0.19 < X < 0.21] and P[0.79 < X < 0.81].
(d) In class, we have seen that we may use f_X(x) Dx to approximate the probability that the random variable X will be inside some small interval of length Dx near or around x. Use this approximation to calculate P[0.19 < X < 0.21] and P[0.79 < X < 0.81]. Compare your answers here with the exact answers in the previous part.

Problem 2. Consider a random variable X whose pdf is given by

f_X(x) = { cx^2, 0 < x < 1,
           0,    otherwise.

(a) Find c.
(b) Find f_X(0.2) and f_X(0.8).
(c) Find P[0.19 < X < 0.21] and P[0.79 < X < 0.81].
(d) In class, we have seen that we may use f_X(x) Dx to approximate the probability that the random variable X will be inside some small interval of length Dx near or around x. Use this approximation to calculate P[0.19 < X < 0.21] and P[0.79 < X < 0.81]. Compare your answers here with the exact answers in the previous part.

Problem 3. Consider a random variable X whose pdf f_X(x) is unknown. From an experiment, suppose we found that P[0.99 < X < 1.03] ~ 0.1 and P[1.98 < X < 2.01]. Use this information to approximate the ratio f_X(2)/f_X(1).

Problem 4. Suppose X ~ U(-1, 4). Define a new random variable Y by Y = g(X), where the function g(x) is plotted in Figure 11.1.

[Figure 11.1: Function g(x) for Problem 4.]

(a) Find P[Y > 2].
(b) Find P[2 < Y < 3].
(c) Find P[1.9 < Y < 2.1].
(d) Find f_Y(y) at y near y = 2.
(e) Find f_Y(y).
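The solutions to this homework are in the handwritten pages, but Problem 1 lends itself to a quick numeric sketch. Normalization forces c = 2 (since the integral of cx over (0, 1) is c/2, which must equal 1), and then the exact interval probabilities can be compared with the f_X(x) Dx approximation from part (d):

```python
# Sketch of Problem 1, assuming c = 2 from normalization:
# exact interval probabilities vs. the f_X(x)*Dx approximation.

def F(x):
    """cdf on [0, 1]: integral of 2t dt from 0 to x = x^2."""
    return x * x

def f(x):
    """pdf f_X(x) = 2x on (0, 1)."""
    return 2 * x if 0 < x < 1 else 0.0

exact_02  = F(0.21) - F(0.19)   # P[0.19 < X < 0.21] = 0.0080
approx_02 = f(0.20) * 0.02      # midpoint pdf times interval length
exact_08  = F(0.81) - F(0.79)   # P[0.79 < X < 0.81] = 0.0320
approx_08 = f(0.80) * 0.02

# For a linear pdf the midpoint approximation is exact:
assert abs(exact_02 - approx_02) < 1e-12
assert abs(exact_08 - approx_08) < 1e-12
```

For Problem 2's quadratic pdf the approximation is no longer exact, but it remains very close on intervals of length 0.02, which is the point of part (d).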

Problem 5. Let a continuous random variable X denote the current measured in a thin copper wire, in milliamperes. Assume that the probability density function of X is

f_X(x) = { 5, 4.9 <= x <= 5.1,
           0, otherwise.

(a) Find the probability that a current measurement is less than 5 milliamperes.
(b) Find and plot the cumulative distribution function of the random variable X.
(c) Find the expected value of X.
(d) Find the variance and the standard deviation of X.
(e) Find the expected value of power when the resistance is 100 ohms.

Extra Question

Here are optional questions for those who want extra practice.

Problem 6. The time until a chemical reaction is complete (in milliseconds) is approximated by the cumulative distribution function

F_X(x) = { 1 - e^(-0.01x), x >= 0,
           0,              otherwise.

(a) Determine the probability density function of X.
(b) What proportion of reactions is complete within 200 milliseconds?

Problem 7. Cholesterol is a fatty substance that is an important part of the outer lining (membrane) of cells in the body of animals. Its normal range for an adult is given in mg/dL. The Food and Nutrition Institute of the Philippines found that the total cholesterol level for Filipino adults has a known mean (in mg/dL) and that 84.1% of adults have a cholesterol level below 200 mg/dL. Suppose that the cholesterol level in the population is normally distributed.

(a) Determine the standard deviation of this distribution.
(b) What is the value of the cholesterol level that exceeds 90% of the population?
(c) An adult is at moderate risk if his cholesterol level is more than one but less than two standard deviations above the mean. What percentage of the population is at moderate risk according to this criterion?

(d) An adult is thought to be at high risk if his cholesterol level is more than two standard deviations above the mean. What percentage of the population is at high risk?

Problem 8. Suppose that the time to failure (in hours) of fans in a personal computer can be modeled by an exponential distribution with parameter lambda.

(a) What proportion of the fans will last at least 10,000 hours?
(b) What proportion of the fans will last at most 7000 hours?

[Montgomery and Runger, 2010, Q4-97]
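Problem 5 above follows entirely from closed-form facts about a uniform density. A quick check, taking "power" to mean 100 X^2 in the problem's units (an assumption based on P = I^2 R with R = 100 ohms):

```python
# Closed-form check for Problem 5: X ~ U(4.9, 5.1), density 5 on the interval.
a, b = 4.9, 5.1

P_less_5 = (5.0 - a) / (b - a)      # part (a): 0.5
EX       = (a + b) / 2              # part (c): 5.0 mA
VarX     = (b - a) ** 2 / 12        # part (d): 0.2^2 / 12 = 1/300
sigmaX   = VarX ** 0.5              # ~ 0.0577 mA
EX2      = VarX + EX ** 2           # E[X^2] = Var X + (EX)^2
E_power  = 100 * EX2                # part (e): E[100 X^2] ~ 2500.33

assert abs(P_less_5 - 0.5) < 1e-12
assert abs(EX - 5.0) < 1e-12
assert abs(VarX - 1 / 300) < 1e-15
assert abs(E_power - (2500 + 1 / 3)) < 1e-9
```

The identity E[X^2] = Var X + (EX)^2 used in the last step is the same second-moment decomposition used throughout these solutions.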

[Handwritten solutions (scanned): Q1 and Q2 (meaning of pdf) on HW11 pages 1-2, Q3 (meaning of pdf) on page 3, Q4 (SISO) on pages 4-8, Q5 (pdf, cdf, expected value, variance: current and power) on pages 9-10, Q6 (pdf and cdf: chemical reaction) on page 11, Q7 (Gaussian RV: cholesterol) on page 12, Q8 (exponential RV) on page 13.]

ECS 315: Probability and Random Processes 2014/1

HW Solution 12 Due: Dec 4

Lecturer: Prapun Suksompong, Ph.D.

Instructions

(a) ONE part of a question will be graded (5 pt). Of course, you do not know which part will be selected, so you should work on all of them.

(b) It is important that you try to solve all problems. (5 pt) The extra questions at the end are optional.

(c) Late submission will be heavily penalized.

(d) Write down all the steps you took to obtain your answers. Even when your answer is correct, you may not get full credit if you do not show how you obtained it.

Problem 1. The input X and output Y of a system subject to random perturbations are described probabilistically by a joint pmf matrix p_X,Y(x, y), with x in {1, 3} and y in {2, 4, 5}. Evaluate the following quantities:

(a) The marginal pmf p_X(x)
(b) The marginal pmf p_Y(y)
(c) EX
(d) Var X
(e) EY
(f) Var Y

(g) P[XY < 6]
(h) P[X = Y]

Solution: The MATLAB code is provided in the file P_XY_marginal.m.

(a) The marginal pmf p_X(x) is found by summing along the rows of the pmf matrix:

p_X(x) = { 0.2, x = 1,
           0.8, x = 3,
           0,   otherwise.

(b) The marginal pmf p_Y(y) is found by summing along the columns of the pmf matrix. In particular, p_Y(2) = 0.1; the values p_Y(4) and p_Y(5) follow from the remaining column sums, and p_Y(y) = 0 otherwise.

(c) EX = sum_x x p_X(x) = (1)(0.2) + (3)(0.8) = 0.2 + 2.4 = 2.6.

(d) E[X^2] = sum_x x^2 p_X(x) = (1^2)(0.2) + (3^2)(0.8) = 0.2 + 7.2 = 7.4. So, Var X = E[X^2] - (EX)^2 = 7.4 - (2.6)^2 = 7.4 - 6.76 = 0.64.

(e) EY = sum_y y p_Y(y), using the marginal pmf from part (b).

(f) E[Y^2] = sum_y y^2 p_Y(y), and then Var Y = E[Y^2] - (EY)^2.

(g) Among the six possible pairs (x, y) shown in the joint pmf matrix, only the pairs (1, 2), (1, 4), (1, 5) satisfy xy < 6. Therefore, [XY < 6] = [X = 1], which implies P[XY < 6] = P[X = 1] = 0.2.

(h) Among the six possible pairs (x, y) shown in the joint pmf matrix, no pair has x = y. Therefore, P[X = Y] = 0.
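As a MATLAB-free cross-check of parts (c) and (d), the same numbers follow from the marginal pmf of X alone:

```python
# Re-deriving parts (c)-(d) from the marginal pmf found in part (a):
# p_X(1) = 0.2, p_X(3) = 0.8.
pX = {1: 0.2, 3: 0.8}

EX   = sum(x * p for x, p in pX.items())       # 2.6
EX2  = sum(x**2 * p for x, p in pX.items())    # 7.4
VarX = EX2 - EX**2                             # 7.4 - 6.76 = 0.64

assert abs(EX - 2.6) < 1e-12
assert abs(EX2 - 7.4) < 1e-12
assert abs(VarX - 0.64) < 1e-12
```

Parts (e) and (f) work the same way once the full marginal p_Y is written down from the column sums.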

Problem 2. The input X and output Y of a system subject to random perturbations are described probabilistically by a joint pmf matrix p_X,Y(x, y), with x in {1, 3} and y in {2, 4, 5}.

(a) Evaluate the following quantities:

(i) E[XY]
(ii) E[(X - 3)(Y - 2)]
(iii) E[X(Y^3 - 11Y^2 + 38Y)]
(iv) Cov[X, Y]
(v) rho_X,Y

Hint: Write down the formulas, then use MATLAB or Excel to compute them.

(b) Find rho_X,X.

(c) Calculate the following quantities using the values of Var X, Cov[X, Y], and rho_X,Y that you got earlier.

(i) Cov[3X + 4, 6Y - 7]
(ii) rho_{3X+4, 6Y-7}
(iii) Cov[X, 6X - 7]
(iv) rho_{X, 6X-7}

Solution: The MATLAB code is provided in the file P_XY_EVarCov.m.

(a) Quantities (i), (ii), (iv), and (v) are computed directly in MATLAB from their defining formulas. For (iii), MATLAB gives E[X(Y^3 - 11Y^2 + 38Y)] = 104.

(b) rho_X,X = Cov[X, X] / (sigma_X sigma_X) = Var X / sigma_X^2 = 1.

(c)

(i) Cov[3X + 4, 6Y - 7] = 3 * 6 * Cov[X, Y] = 18 Cov[X, Y].

(ii) Note that

rho_{aX+b, cY+d} = Cov[aX + b, cY + d] / (sigma_{aX+b} sigma_{cY+d}) = ac Cov[X, Y] / (|a| sigma_X |c| sigma_Y) = (ac / |ac|) rho_X,Y = sign(ac) rho_X,Y.

Hence, rho_{3X+4, 6Y-7} = sign(3 * 6) rho_X,Y = rho_X,Y.

(iii) Cov[X, 6X - 7] = 1 * 6 * Cov[X, X] = 6 Var X.

(iv) rho_{X, 6X-7} = sign(1 * 6) rho_X,X = 1.

Problem 3. Suppose X ~ binomial(5, 1/3), Y ~ binomial(7, 4/5), and X and Y are independent. Evaluate the following quantities.

(a) E[(X - 3)(Y - 2)]
(b) Cov[X, Y]
(c) rho_X,Y
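A remark on part (a)(iii) of Problem 2: since Y takes only the values 2, 4, 5, and (y - 2)(y - 4)(y - 5) = y^3 - 11y^2 + 38y - 40, the random variable X(Y^3 - 11Y^2 + 38Y) equals 40X with probability one, so its expectation is 40 EX = 40(2.6) = 104, matching the MATLAB value. The matrix below is a hypothetical example chosen only to have the right support and the X-marginal from Problem 1; the identity holds for any joint pmf on this support:

```python
# Hypothetical joint pmf with x in {1,3}, y in {2,4,5} and p_X = (0.2, 0.8);
# the entries are illustrative, not the homework's actual matrix.
ys = [2, 4, 5]
P = {(1, 2): 0.05, (1, 4): 0.10, (1, 5): 0.05,
     (3, 2): 0.05, (3, 4): 0.40, (3, 5): 0.35}   # entries sum to 1

EX = sum(x * p for (x, y), p in P.items())

def g(y):
    # y^3 - 11y^2 + 38y = (y-2)(y-4)(y-5) + 40, so g(y) = 40 on {2, 4, 5}
    return y**3 - 11 * y**2 + 38 * y

E_Xg = sum(x * g(y) * p for (x, y), p in P.items())

assert all(g(y) == 40 for y in ys)
assert abs(E_Xg - 40 * EX) < 1e-9     # = 104 since EX = 2.6
assert abs(E_Xg - 104) < 1e-9
```

This is why the MATLAB answer 104 does not depend on how the mass is spread across the matrix, only on EX and the support of Y.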

Solution:

(a) First, because X and Y are independent, we have E[(X - 3)(Y - 2)] = E[X - 3] E[Y - 2]. Recall that E[aX + b] = a E[X] + b. Therefore,

E[X - 3] E[Y - 2] = (E[X] - 3)(E[Y] - 2).

Now, for binomial(n, p), the expected value is np. So,

(E[X] - 3)(E[Y] - 2) = (5 * (1/3) - 3)(7 * (4/5) - 2) = (-4/3)(18/5) = -24/5 = -4.8.

(b) Cov[X, Y] = 0 because X and Y are independent.

(c) rho_X,Y = 0 because Cov[X, Y] = 0.
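The factorization argument can be verified by exact enumeration of the two binomial pmfs (independence lets us multiply the marginals). A small sketch:

```python
from math import comb

# Exact enumeration check of part (a) for X ~ binomial(5, 1/3),
# Y ~ binomial(7, 4/5), X and Y independent.

def binom_pmf(n, p):
    """pmf of binomial(n, p) as a dict {k: P[X = k]}."""
    return {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

pX = binom_pmf(5, 1 / 3)
pY = binom_pmf(7, 4 / 5)

# E[(X-3)(Y-2)] over the product pmf:
E_term = sum((x - 3) * (y - 2) * px * py
             for x, px in pX.items() for y, py in pY.items())

# Matches (EX - 3)(EY - 2) = (-4/3)(18/5) = -24/5:
assert abs(E_term - (-24 / 5)) < 1e-9
```

The same double sum with (x - EX)(y - EY) in place of (x - 3)(y - 2) gives Cov[X, Y] = 0, confirming part (b).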

Problem 4. Suppose we know that sigma_X = sqrt(21)/10, sigma_Y = 4 sqrt(6)/5, and the value of rho_X,Y (which enters below only through Cov[X, Y] = rho_X,Y sigma_X sigma_Y = 2/25).

(a) Find Var[X + Y].

(b) Find E[(Y - 3X + 5)^2]. Assume E[Y - 3X + 5] = 1.

Solution:

(a) First, we know that Var X = sigma_X^2 = 21/100, Var Y = sigma_Y^2 = 96/25, and Cov[X, Y] = rho_X,Y sigma_X sigma_Y = 2/25. Now,

Var[X + Y] = E[((X + Y) - E[X + Y])^2] = E[((X - EX) + (Y - EY))^2]
           = E[(X - EX)^2] + 2 E[(X - EX)(Y - EY)] + E[(Y - EY)^2]
           = Var X + 2 Cov[X, Y] + Var Y = 21/100 + 4/25 + 96/25 = 421/100 = 4.21.

Remark: It is useful to remember that Var[X + Y] = Var X + 2 Cov[X, Y] + Var Y. Note that when X and Y are uncorrelated, Var[X + Y] = Var X + Var Y. This simpler formula also holds when X and Y are independent, because independence is a stronger condition.

(b) First, we write

Y - aX - b = (Y - EY) - a(X - EX) - c,  where c = a EX + b - EY.

Now, using the expansion

(u + v + t)^2 = u^2 + v^2 + t^2 + 2uv + 2ut + 2vt,

we have

(Y - aX - b)^2 = (Y - EY)^2 + a^2 (X - EX)^2 + c^2 - 2a(X - EX)(Y - EY) - 2c(Y - EY) + 2ac(X - EX).

Recall that E[X - EX] = E[Y - EY] = 0. Therefore,

E[(Y - aX - b)^2] = Var Y + a^2 Var X + c^2 - 2a Cov[X, Y].

Plugging back the value of c, we have

E[(Y - aX - b)^2] = Var Y + a^2 Var X + (E[Y - aX - b])^2 - 2a Cov[X, Y].

Here, a = 3 and b = -5. Plugging these values along with the given quantities into the formula gives

E[(Y - 3X + 5)^2] = 96/25 + 9 * (21/100) + 1^2 - 2 * 3 * (2/25) = 3.84 + 1.89 + 1 - 0.48 = 6.25.

Extra Question

Here are optional questions for those who want extra practice.

Problem 5. The input X and output Y of a system subject to random perturbations are described probabilistically by the joint pmf p_X,Y(x, y), where x = 1, 2, 3 and y = 1, 2, 3, 4, 5. Let P denote the joint pmf matrix whose (i, j) entry is p_X,Y(i, j), and suppose that P is given, with every entry an integer multiple of 1/71.

(a) Find the marginal pmfs p_X(x) and p_Y(y).
(b) Find EX.
(c) Find EY.
(d) Find Var X.
(e) Find Var Y.

Solution: All of the calculations in this question amount to plugging numbers into the appropriate formulas. The MATLAB code is provided in the file P_XY_marginal_2.m.

(a) The marginal pmf p_X(x) is found by summing along the rows of the pmf matrix:

p_X(x) = { 26/71, x = 1,
           25/71, x = 2,
           20/71, x = 3,
           0,     otherwise.

The marginal pmf p_Y(y) is found by summing along the columns of the pmf matrix:

p_Y(y) = { 13/71, y = 1,
           8/71,  y = 2,
           21/71, y = 3,
           15/71, y = 4,
           14/71, y = 5,
           0,     otherwise.

(b) EX = 136/71, which is approximately 1.92.

(c) EY = 222/71, which is approximately 3.13.

(d) Var X = 3230/5041, which is approximately 0.64.

(e) Var Y = 9220/5041, which is approximately 1.83.

Problem 6. A webpage server can handle r requests per day. Find the probability that the server gets more than r requests at least once in n days. Assume that the number of requests on day i is X_i ~ P(alpha) and that X_1, ..., X_n are independent.

Solution: [Gubner, 2006, Ex. 2.10]

P[ union_{i=1}^n [X_i > r] ] = 1 - P[ intersection_{i=1}^n [X_i <= r] ] = 1 - prod_{i=1}^n P[X_i <= r] = 1 - ( sum_{k=0}^r alpha^k e^(-alpha) / k! )^n.

Problem 7. Suppose X ~ binomial(5, 1/3), Y ~ binomial(7, 4/5), and X and Y are independent.

(a) A vector describing the pmf of X can be created by the MATLAB expression: x = 0:5; px = binopdf(x,5,1/3). What is the expression that would give py, a corresponding vector describing the pmf of Y?

(b) Using px and py from part (a), how can you create the joint pmf matrix in MATLAB? Do not use a for-loop, while-loop, or if statement. Hint: Multiply them in an appropriate orientation.
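A Python analogue of the MATLAB hint in Problem 7: build the two marginal pmf vectors from the binomial pmf formula (the counterpart of binopdf), then form the joint pmf matrix as their outer product, which is valid precisely because X and Y are independent:

```python
from math import comb

# Marginal pmf vectors, the Python counterpart of MATLAB's
# px = binopdf(0:5, 5, 1/3) and py = binopdf(0:7, 7, 4/5):
px = [comb(5, k) * (1/3)**k * (2/3)**(5 - k) for k in range(6)]
py = [comb(7, k) * (4/5)**k * (1/5)**(7 - k) for k in range(8)]

# Joint pmf matrix by outer product (MATLAB: P = px' * py):
# P[i][j] = P[X = i, Y = j] = px[i] * py[j], using independence.
P = [[pxi * pyj for pyj in py] for pxi in px]

assert abs(sum(map(sum, P)) - 1.0) < 1e-12
# Row sums recover the marginal pmf of X:
assert all(abs(sum(row) - pxi) < 1e-12 for row, pxi in zip(P, px))
```

The orientation in the hint matters only in that the row index must correspond to X and the column index to Y, matching the joint pmf matrix convention used in Problems 1 and 2.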


STAT/MA 416 Answers Homework 6 November 15, 2007 Solutions by Mark Daniel Ward PROBLEMS STAT/MA 4 Answers Homework November 5, 27 Solutions by Mark Daniel Ward PROBLEMS Chapter Problems 2a. The mass p, corresponds to neither of the first two balls being white, so p, 8 7 4/39. The mass p,

More information

18.05 Final Exam. Good luck! Name. No calculators. Number of problems 16 concept questions, 16 problems, 21 pages

18.05 Final Exam. Good luck! Name. No calculators. Number of problems 16 concept questions, 16 problems, 21 pages Name No calculators. 18.05 Final Exam Number of problems 16 concept questions, 16 problems, 21 pages Extra paper If you need more space we will provide some blank paper. Indicate clearly that your solution

More information

E X A M. Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours. Number of pages incl.

E X A M. Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours. Number of pages incl. E X A M Course code: Course name: Number of pages incl. front page: 6 MA430-G Probability Theory and Stochastic Processes Date: December 13, 2016 Duration: 4 hours Resources allowed: Notes: Pocket calculator,

More information

Chapter 4. Chapter 4 sections

Chapter 4. Chapter 4 sections Chapter 4 sections 4.1 Expectation 4.2 Properties of Expectations 4.3 Variance 4.4 Moments 4.5 The Mean and the Median 4.6 Covariance and Correlation 4.7 Conditional Expectation SKIP: 4.8 Utility Expectation

More information

Continuous Expectation and Variance, the Law of Large Numbers, and the Central Limit Theorem Spring 2014

Continuous Expectation and Variance, the Law of Large Numbers, and the Central Limit Theorem Spring 2014 Continuous Expectation and Variance, the Law of Large Numbers, and the Central Limit Theorem 18.5 Spring 214.5.4.3.2.1-4 -3-2 -1 1 2 3 4 January 1, 217 1 / 31 Expected value Expected value: measure of

More information

STAT 516 Midterm Exam 3 Friday, April 18, 2008

STAT 516 Midterm Exam 3 Friday, April 18, 2008 STAT 56 Midterm Exam 3 Friday, April 8, 2008 Name Purdue student ID (0 digits). The testing booklet contains 8 questions. 2. Permitted Texas Instruments calculators: BA-35 BA II Plus BA II Plus Professional

More information

Problem Set 1. MAS 622J/1.126J: Pattern Recognition and Analysis. Due: 5:00 p.m. on September 20

Problem Set 1. MAS 622J/1.126J: Pattern Recognition and Analysis. Due: 5:00 p.m. on September 20 Problem Set MAS 6J/.6J: Pattern Recognition and Analysis Due: 5:00 p.m. on September 0 [Note: All instructions to plot data or write a program should be carried out using Matlab. In order to maintain a

More information

What is a random variable

What is a random variable OKAN UNIVERSITY FACULTY OF ENGINEERING AND ARCHITECTURE MATH 256 Probability and Random Processes 04 Random Variables Fall 20 Yrd. Doç. Dr. Didem Kivanc Tureli didemk@ieee.org didem.kivanc@okan.edu.tr

More information

F X (x) = P [X x] = x f X (t)dt. 42 Lebesgue-a.e, to be exact 43 More specifically, if g = f Lebesgue-a.e., then g is also a pdf for X.

F X (x) = P [X x] = x f X (t)dt. 42 Lebesgue-a.e, to be exact 43 More specifically, if g = f Lebesgue-a.e., then g is also a pdf for X. 10.2 Properties of PDF and CDF for Continuous Random Variables 10.18. The pdf f X is determined only almost everywhere 42. That is, given a pdf f for a random variable X, if we construct a function g by

More information

EXAM. Exam #1. Math 3342 Summer II, July 21, 2000 ANSWERS

EXAM. Exam #1. Math 3342 Summer II, July 21, 2000 ANSWERS EXAM Exam # Math 3342 Summer II, 2 July 2, 2 ANSWERS i pts. Problem. Consider the following data: 7, 8, 9, 2,, 7, 2, 3. Find the first quartile, the median, and the third quartile. Make a box and whisker

More information

ASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata

ASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata ASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA (krzysio@krzysio.net) Errata Effective July 5, 3, only the latest edition of this manual will have its errata

More information

Math 407: Probability Theory 5/10/ Final exam (11am - 1pm)

Math 407: Probability Theory 5/10/ Final exam (11am - 1pm) Math 407: Probability Theory 5/10/2013 - Final exam (11am - 1pm) Name: USC ID: Signature: 1. Write your name and ID number in the spaces above. 2. Show all your work and circle your final answer. Simplify

More information

Jointly Distributed Random Variables

Jointly Distributed Random Variables Jointly Distributed Random Variables CE 311S What if there is more than one random variable we are interested in? How should you invest the extra money from your summer internship? To simplify matters,

More information

1 Random Variable: Topics

1 Random Variable: Topics Note: Handouts DO NOT replace the book. In most cases, they only provide a guideline on topics and an intuitive feel. 1 Random Variable: Topics Chap 2, 2.1-2.4 and Chap 3, 3.1-3.3 What is a random variable?

More information

ECS 332: Principles of Communications 2012/1. HW 4 Due: Sep 7

ECS 332: Principles of Communications 2012/1. HW 4 Due: Sep 7 ECS 332: Principles of Communications 2012/1 HW 4 Due: Sep 7 Lecturer: Prapun Suksompong, Ph.D. Instructions (a) ONE part of a question will be graded (5 pt). Of course, you do not know which part will

More information

Expectation. DS GA 1002 Probability and Statistics for Data Science. Carlos Fernandez-Granda

Expectation. DS GA 1002 Probability and Statistics for Data Science.   Carlos Fernandez-Granda Expectation DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Aim Describe random variables with a few numbers: mean,

More information

6.041/6.431 Fall 2010 Quiz 2 Solutions

6.041/6.431 Fall 2010 Quiz 2 Solutions 6.04/6.43: Probabilistic Systems Analysis (Fall 200) 6.04/6.43 Fall 200 Quiz 2 Solutions Problem. (80 points) In this problem: (i) X is a (continuous) uniform random variable on [0, 4]. (ii) Y is an exponential

More information

Northwestern University Department of Electrical Engineering and Computer Science

Northwestern University Department of Electrical Engineering and Computer Science Northwestern University Department of Electrical Engineering and Computer Science EECS 454: Modeling and Analysis of Communication Networks Spring 2008 Probability Review As discussed in Lecture 1, probability

More information

Chapter 4 continued. Chapter 4 sections

Chapter 4 continued. Chapter 4 sections Chapter 4 sections Chapter 4 continued 4.1 Expectation 4.2 Properties of Expectations 4.3 Variance 4.4 Moments 4.5 The Mean and the Median 4.6 Covariance and Correlation 4.7 Conditional Expectation SKIP:

More information

Statistics 100A Homework 5 Solutions

Statistics 100A Homework 5 Solutions Chapter 5 Statistics 1A Homework 5 Solutions Ryan Rosario 1. Let X be a random variable with probability density function a What is the value of c? fx { c1 x 1 < x < 1 otherwise We know that for fx to

More information

Chapter 5 continued. Chapter 5 sections

Chapter 5 continued. Chapter 5 sections Chapter 5 sections Discrete univariate distributions: 5.2 Bernoulli and Binomial distributions Just skim 5.3 Hypergeometric distributions 5.4 Poisson distributions Just skim 5.5 Negative Binomial distributions

More information

STAT509: Discrete Random Variable

STAT509: Discrete Random Variable University of South Carolina September 16, 2014 Motivation So far, we have already known how to calculate probabilities of events. Suppose we toss a fair coin three times, we know that the probability

More information

Probability Midterm Exam 2:15-3:30 pm Thursday, 21 October 1999

Probability Midterm Exam 2:15-3:30 pm Thursday, 21 October 1999 Name: 2:15-3:30 pm Thursday, 21 October 1999 You may use a calculator and your own notes but may not consult your books or neighbors. Please show your work for partial credit, and circle your answers.

More information

2. Variance and Covariance: We will now derive some classic properties of variance and covariance. Assume real-valued random variables X and Y.

2. Variance and Covariance: We will now derive some classic properties of variance and covariance. Assume real-valued random variables X and Y. CS450 Final Review Problems Fall 08 Solutions or worked answers provided Problems -6 are based on the midterm review Identical problems are marked recap] Please consult previous recitations and textbook

More information

Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators.

Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators. IE 230 Seat # Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators. Score Exam #3a, Spring 2002 Schmeiser Closed book and notes. 60 minutes. 1. True or false. (for each,

More information

STA 256: Statistics and Probability I

STA 256: Statistics and Probability I Al Nosedal. University of Toronto. Fall 2017 My momma always said: Life was like a box of chocolates. You never know what you re gonna get. Forrest Gump. There are situations where one might be interested

More information

Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan

Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan 2.4 Random Variables Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan By definition, a random variable X is a function with domain the sample space and range a subset of the

More information

Class 8 Review Problems 18.05, Spring 2014

Class 8 Review Problems 18.05, Spring 2014 1 Counting and Probability Class 8 Review Problems 18.05, Spring 2014 1. (a) How many ways can you arrange the letters in the word STATISTICS? (e.g. SSSTTTIIAC counts as one arrangement.) (b) If all arrangements

More information

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems

MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems MA/ST 810 Mathematical-Statistical Modeling and Analysis of Complex Systems Review of Basic Probability The fundamentals, random variables, probability distributions Probability mass/density functions

More information

3 Conditional Expectation

3 Conditional Expectation 3 Conditional Expectation 3.1 The Discrete case Recall that for any two events E and F, the conditional probability of E given F is defined, whenever P (F ) > 0, by P (E F ) P (E)P (F ). P (F ) Example.

More information

Outline PMF, CDF and PDF Mean, Variance and Percentiles Some Common Distributions. Week 5 Random Variables and Their Distributions

Outline PMF, CDF and PDF Mean, Variance and Percentiles Some Common Distributions. Week 5 Random Variables and Their Distributions Week 5 Random Variables and Their Distributions Week 5 Objectives This week we give more general definitions of mean value, variance and percentiles, and introduce the first probability models for discrete

More information

Exercises in Probability Theory Paul Jung MA 485/585-1C Fall 2015 based on material of Nikolai Chernov

Exercises in Probability Theory Paul Jung MA 485/585-1C Fall 2015 based on material of Nikolai Chernov Exercises in Probability Theory Paul Jung MA 485/585-1C Fall 2015 based on material of Nikolai Chernov Many of the exercises are taken from two books: R. Durrett, The Essentials of Probability, Duxbury

More information

UC Berkeley Department of Electrical Engineering and Computer Science. EE 126: Probablity and Random Processes. Solutions 5 Spring 2006

UC Berkeley Department of Electrical Engineering and Computer Science. EE 126: Probablity and Random Processes. Solutions 5 Spring 2006 Review problems UC Berkeley Department of Electrical Engineering and Computer Science EE 6: Probablity and Random Processes Solutions 5 Spring 006 Problem 5. On any given day your golf score is any integer

More information

ECE 302 Division 1 MWF 10:30-11:20 (Prof. Pollak) Final Exam Solutions, 5/3/2004. Please read the instructions carefully before proceeding.

ECE 302 Division 1 MWF 10:30-11:20 (Prof. Pollak) Final Exam Solutions, 5/3/2004. Please read the instructions carefully before proceeding. NAME: ECE 302 Division MWF 0:30-:20 (Prof. Pollak) Final Exam Solutions, 5/3/2004. Please read the instructions carefully before proceeding. If you are not in Prof. Pollak s section, you may not take this

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 17: Continuous random variables: conditional PDF Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin

More information

SDS 321: Introduction to Probability and Statistics

SDS 321: Introduction to Probability and Statistics SDS 321: Introduction to Probability and Statistics Lecture 14: Continuous random variables Purnamrita Sarkar Department of Statistics and Data Science The University of Texas at Austin www.cs.cmu.edu/

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.436J/15.085J Fall 2008 Lecture 8 10/1/2008 CONTINUOUS RANDOM VARIABLES Contents 1. Continuous random variables 2. Examples 3. Expected values 4. Joint distributions

More information

IE 230 Probability & Statistics in Engineering I. Closed book and notes. 60 minutes.

IE 230 Probability & Statistics in Engineering I. Closed book and notes. 60 minutes. Closed book and notes. 60 minutes. A summary table of some univariate continuous distributions is provided. Four Pages. In this version of the Key, I try to be more complete than necessary to receive full

More information

p. 6-1 Continuous Random Variables p. 6-2

p. 6-1 Continuous Random Variables p. 6-2 Continuous Random Variables Recall: For discrete random variables, only a finite or countably infinite number of possible values with positive probability (>). Often, there is interest in random variables

More information

3. Probability and Statistics

3. Probability and Statistics FE661 - Statistical Methods for Financial Engineering 3. Probability and Statistics Jitkomut Songsiri definitions, probability measures conditional expectations correlation and covariance some important

More information

Ching-Han Hsu, BMES, National Tsing Hua University c 2015 by Ching-Han Hsu, Ph.D., BMIR Lab. = a + b 2. b a. x a b a = 12

Ching-Han Hsu, BMES, National Tsing Hua University c 2015 by Ching-Han Hsu, Ph.D., BMIR Lab. = a + b 2. b a. x a b a = 12 Lecture 5 Continuous Random Variables BMIR Lecture Series in Probability and Statistics Ching-Han Hsu, BMES, National Tsing Hua University c 215 by Ching-Han Hsu, Ph.D., BMIR Lab 5.1 1 Uniform Distribution

More information

IEOR 3106: Introduction to Operations Research: Stochastic Models. Professor Whitt. SOLUTIONS to Homework Assignment 2

IEOR 3106: Introduction to Operations Research: Stochastic Models. Professor Whitt. SOLUTIONS to Homework Assignment 2 IEOR 316: Introduction to Operations Research: Stochastic Models Professor Whitt SOLUTIONS to Homework Assignment 2 More Probability Review: In the Ross textbook, Introduction to Probability Models, read

More information

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real

More information

Chapter 2. Probability

Chapter 2. Probability 2-1 Chapter 2 Probability 2-2 Section 2.1: Basic Ideas Definition: An experiment is a process that results in an outcome that cannot be predicted in advance with certainty. Examples: rolling a die tossing

More information

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416)

SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) SUMMARY OF PROBABILITY CONCEPTS SO FAR (SUPPLEMENT FOR MA416) D. ARAPURA This is a summary of the essential material covered so far. The final will be cumulative. I ve also included some review problems

More information
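As a quick numerical sanity check on the answers above, the computations for both problems can be reproduced with exact arithmetic. This is an illustrative sketch only (Python and the variable names are not part of the handout); it recomputes the constant c, the pmf of W, and the moments of the uniform random variable X using Python's `fractions` module.

```python
from fractions import Fraction as F

# Problem 1: p_V(v) = 1/v^2 + c on the support {-2, 2, 3}.
support = [-2, 2, 3]

# The pmf must sum to 1, which pins down c.
c = (1 - sum(F(1, v * v) for v in support)) / 3
p = {v: F(1, v * v) + c for v in support}          # p_V(-2), p_V(2), p_V(3)

# W = V^2 - V + 1 maps -2 -> 7, 2 -> 3, 3 -> 7, so W takes only two values.
pW = {}
for v in support:
    w = v * v - v + 1
    pW[w] = pW.get(w, F(0)) + p[v]

EV  = sum(v * p[v] for v in support)               # E[V]
EV2 = sum(v * v * p[v] for v in support)           # E[V^2]
EW  = sum(w * q for w, q in pW.items())            # E[W], method 1
EW_alt = EV2 - EV + 1                              # E[W], method 2

# Problem 2: X uniform on the integers {-3, ..., 4}.
xs = list(range(-3, 5))
pX = F(1, len(xs))                                 # each value has probability 1/8
EX   = sum(x * pX for x in xs)                     # E[X]   = 0.5
EX2  = sum(x * x * pX for x in xs)                 # E[X^2] = 5.5
VarX = EX2 - EX ** 2                               # Var X  = 5.25

print(c, p, pW, EW, EW_alt, EX, EX2, VarX)
```

Both methods for E[W] agree, and the uniform-case results match the closed-form (a+b)/2 formula with a = -3, b = 4.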