MAT 271E Probability and Statistics

MAT 271E Probability and Statistics, Spring 2016

Instructor : İlker Bayram, EEB 1103, ibayram@itu.edu.tr
Class Meets : Wednesday
Office Hours : Monday
Textbook : D. P. Bertsekas, J. N. Tsitsiklis, Introduction to Probability, 2nd Edition, Athena Scientific.
Grading : Homeworks (10%), Midterms (25% each), Final (40%).
Webpage : There is a ninova page. Please log in and check.

Tentative Course Outline

Probability Space: Probability models, conditioning, Bayes' rule, independence.
Discrete Random Variables: Probability mass function, functions of random variables, expectation, joint PMFs, conditioning, independence.
General Random Variables: Probability density function, cumulative distribution function, continuous Bayes' rule, correlation, conditional expectation.
Limit Theorems: Law of large numbers, central limit theorem.
Introduction to Statistics: Parameter estimation, linear regression, hypothesis testing.

MAT 271E Homework 1

1. You have a regular, unbiased coin and a regular, unbiased die. Consider the following two-step experiment. (i) You toss the coin. (ii) If the toss is a Head, you toss the coin again. If the toss is a Tail, you roll the die. Propose a sample space for this experiment.

Solution. Below is one reasonable sample space for this experiment. Here, H represents a Head and T a Tail.
Ω = {(H, H), (H, T), (T, 1), (T, 2), (T, 3), (T, 4), (T, 5), (T, 6)}.

2. You have a regular, unbiased coin. You start tossing the coin. You stop if you observe two Heads before the fourth toss, or once you have tossed the coin four times.
(a) Propose a sample space for this experiment.
(b) For the experiment described above, let us define the events
B_k = {a Head occurs at the k-th toss}, for k = 1, 2, 3, 4,
A = {you toss the coin three times and stop}.
Express the event A in terms of the events B_k, for k = 1, 2, 3, 4. (You can use as many B_k's as you like.)

Solution. (a) A minimal sample space for this experiment is
Ω = {(H, H), (T, H, H), (H, T, H), (H, T, T, H), (H, T, T, T), (T, H, T, H), (T, H, T, T), (T, T, H, H), (T, T, H, T), (T, T, T, H), (T, T, T, T)}.
(b) Notice that A occurs if and only if {(T, H, H), (H, T, H)} occurs. Therefore,
A = (B_1^c ∩ B_2 ∩ B_3) ∪ (B_1 ∩ B_2^c ∩ B_3).

3. You have four coins in your pocket. One of the coins has Heads on both faces. The other three are regular, unbiased coins. You randomly pick one of the coins, toss it and observe a Head. Answer the following questions by taking into account this information.
(a) What is the probability that the other face is also a Head?
(b) Without looking at the other face, you randomly pick one of the remaining coins. What is the probability that this second coin has Heads on both faces?

Solution. (a) Let us define the events
B = {the randomly picked coin has Heads on both faces},
H = {we observe a Head after the toss}.
We are asked to compute P(B | H). We compute this as,
P(B | H) = P(H | B) P(B) / P(H) = P(H | B) P(B) / (P(H | B) P(B) + P(H | B^c) P(B^c)) = (1 · (1/4)) / (1 · (1/4) + (1/2)(3/4)) = (1/4) / (5/8) = 2/5.
(b) Let us define the event S, in addition to H and B, as
S = {the second randomly picked coin has Heads on both faces}.
Notice that we need to compute P(S | H). Note also that P(H) = 5/8 from part (a). We now have
P(S | H) = P(H | S) P(S) / P(H) = (1/2) P(S) / (5/8).
We can compute P(S) by conditioning on B. That is,
P(S) = P(S | B) P(B) + P(S | B^c) P(B^c) = 0 · (1/4) + (1/3)(3/4) = 1/4.
Thus, P(S | H) = 1/5.
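The two conditional probabilities in Problem 3 are easy to sanity-check numerically. The following Monte Carlo sketch in Python is an addition to the original text: it simulates the pick-and-toss experiment directly and conditions on the toss showing a Head.

```python
import random

n = 1_000_000
heads = first_double = second_double = 0
for _ in range(n):
    coins = ['HH', 'HT', 'HT', 'HT']   # one two-headed coin, three fair coins
    random.shuffle(coins)
    picked, rest = coins[0], coins[1:]
    if random.choice(picked) == 'H':   # toss the picked coin, keep Head cases
        heads += 1
        first_double += picked == 'HH'
        second_double += random.choice(rest) == 'HH'
print(first_double / heads)    # ≈ P(B | H) = 2/5
print(second_double / heads)   # ≈ P(S | H) = 1/5
```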

4. Two specialists claim that they can tell whether a painting is original or a reproduction by looking at the painting. However, it is known from past experience that each specialist makes mistakes with probability p. (Note that there are two types of mistakes: (i) the painting is original but the specialist decides it is a reproduction, (ii) the painting is a reproduction but the specialist decides it is original. The probabilities of both types of mistakes are the same and equal to p.) Suppose it is known that there are 3 reproductions and 7 originals in a collection of 10 paintings, but this information is not provided to the specialists. A painting is selected randomly from the collection and is presented to the specialists. Assume that the specialists decide on their own (they do not know what the other decides).
(a) Find the probability that the first specialist decides that the given painting is original.
(b) Given that the first specialist decides the painting is original, find the conditional probability that the selected painting is actually a reproduction.
(c) Given that the first specialist decides the painting is original, find the conditional probability that the second specialist decides the painting is original.

Solution. (a) We are asked to compute the probability of the event
D_1 = {the first specialist decides that the given painting is an original}.
We will compute this probability by conditioning on the event
O = {the selected painting is an original}.
We find,
P(D_1) = P(D_1 | O) P(O) + P(D_1 | O^c) P(O^c) = (1 − p)(7/10) + p (3/10) = (7 − 4p)/10.
(b) We are asked to compute P(O^c | D_1). Observe that
P(O^c | D_1) = P(D_1 | O^c) P(O^c) / P(D_1) = p (3/10) / ((7 − 4p)/10) = 3p / (7 − 4p).
(c) Let us define the event
D_2 = {the second specialist decides that the given painting is an original}.
We are asked to compute P(D_2 | D_1). Note that
P(D_2 | D_1) = P(D_1 ∩ D_2) / P(D_1).
We already computed P(D_1). To compute P(D_1 ∩ D_2), let us condition on O:
P(D_1 ∩ D_2) = P(D_1 ∩ D_2 | O) P(O) + P(D_1 ∩ D_2 | O^c) P(O^c) = (1 − p)^2 (7/10) + p^2 (3/10).
Thus we have,
P(D_2 | D_1) = (7 (1 − p)^2 + 3 p^2) / (7 − 4p) = (7 − 14p + 10p^2) / (7 − 4p).
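As a numerical check of Problem 4 (an addition, not part of the original solution), the sketch below fixes an arbitrary error probability p = 0.2 and estimates P(D_1) and P(D_2 | D_1); both estimates should match the closed-form expressions derived above.

```python
import random

p = 0.2                  # error probability; an arbitrary choice for the check
n = 1_000_000
d1 = d1_and_d2 = 0
for _ in range(n):
    original = random.random() < 0.7              # 7 originals out of 10
    says1 = original != (random.random() < p)     # True means "original"
    says2 = original != (random.random() < p)     # independent second opinion
    d1 += says1
    d1_and_d2 += says1 and says2
print(d1 / n, (7 - 4 * p) / 10)                                # P(D_1)
print(d1_and_d2 / d1, (7 - 14 * p + 10 * p**2) / (7 - 4 * p))  # P(D_2 | D_1)
```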

5. An urn initially contains 5 white balls and 1 black ball. Your friend removes 2 balls from the urn randomly, so that there are now a total of 4 balls.
(a) What is the probability that the black ball is still in the urn?
(b) Suppose you pick randomly one of the remaining balls in the urn. What is the probability that the ball you pick is black?
(Hint: Think in terms of events. You do not need sophisticated counting tricks.)

Solution. (a) We need to compute the probability of the event
B = {the black ball is still in the urn after the removal of two balls}.
Let us define the events
W_1 = {the first removal is white},
W_2 = {the second removal is white}.
Notice that B = W_1 ∩ W_2. Therefore,
P(B) = P(W_1 ∩ W_2) = P(W_2 | W_1) P(W_1) = (4/5)(5/6) = 2/3.
(b) We are asked to compute the probability of the event
C = {the ball you pick is black}.
This is easier to compute if we condition on B:
P(C) = P(C | B) P(B) + P(C | B^c) P(B^c) = (1/4)(2/3) + 0 · (1/3) = 1/6.
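Problem 5 can also be checked by simulating the urn directly; this short Python snippet is an added illustration, not part of the original solution.

```python
import random

n = 1_000_000
still_in = picked_black = 0
for _ in range(n):
    urn = ['W'] * 5 + ['B']
    random.shuffle(urn)
    urn = urn[:4]                     # the friend removes two random balls
    still_in += 'B' in urn
    picked_black += random.choice(urn) == 'B'
print(still_in / n)      # ≈ P(B) = 2/3
print(picked_black / n)  # ≈ P(C) = 1/6
```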

MAT 271E Homework 2

1. Consider a network with four nodes (A, B, C, D) and four links (l_1, l_2, l_3, l_4): l_1 joins A to C, l_2 and l_3 both join B to C, and l_4 joins C to D. Assume that at a particular time, a particular link is operational with probability p. Assume also that the conditions of the links (i.e., whether they are operational or not) are independent of each other. We say that two nodes can communicate if there exists at least one path with operational links between them.
(a) Compute the probability that A can communicate with D.
(b) Compute the probability that B can communicate with C.
(c) Compute the probability that A and B can both communicate with C.
(d) Compute the probability that A and B can both communicate with D.

Solution. Let us define the events
E_k = {l_k is operational}, for k = 1, 2, 3, 4,
to be used in the sequel. We are given that the E_k are independent and P(E_k) = p.
(a) Define the event E_AD = {A can communicate with D}. We are asked to find P(E_AD). Observe that E_AD = E_1 ∩ E_4. By the independence of the E_k's, we have
P(E_AD) = P(E_1) P(E_4) = p^2.
(b) Define the event E_BC = {B can communicate with C}. Observe that E_BC = E_2 ∪ E_3. Note that
P(E_2 ∪ E_3) = P(E_2) + P(E_3) − P(E_2 ∩ E_3) = 2p − p^2.
(c) Define the event E_AC = {A can communicate with C}. We are asked to compute P(E_AC ∩ E_BC). Notice that E_AC ∩ E_BC = E_1 ∩ (E_2 ∪ E_3). Again, by the independence of E_1 from E_2, E_3, we find that
P(E_AC ∩ E_BC) = p (2p − p^2).
(d) Define the events E_CD = {C can communicate with D}, E_AD = {A can communicate with D}, E_BD = {B can communicate with D}. Observe that E_AD ∩ E_BD = E_AC ∩ E_BC ∩ E_CD, and that E_CD = E_4. Thanks to independence, we find
P(E_AD ∩ E_BD) = P(E_AC ∩ E_BC) P(E_CD) = p^2 (2p − p^2).
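The link computations in Problem 1 can be verified by sampling the four link states directly. The sketch below is an added check (not from the original text); it uses an arbitrary p = 0.7 and verifies part (d), and parts (a)-(c) can be checked the same way.

```python
import random

p = 0.7          # probability that a link is operational (arbitrary choice)
n = 500_000
hits = 0
for _ in range(n):
    l1, l2, l3, l4 = (random.random() < p for _ in range(4))
    a_to_d = l1 and l4               # A-C-D path
    b_to_d = (l2 or l3) and l4       # B-C via either parallel link, then C-D
    hits += a_to_d and b_to_d
print(hits / n, p**2 * (2 * p - p**2))   # simulated vs. computed
```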

2. Suppose that a coin is tossed five times. The coin is biased and P(Head) = p. Assume that the tosses are independent.
(a) Consider the event A = {all tosses are Tails}. Compute P(A), the probability of A.
(b) Consider the event B = {at least one Head occurs}. Compute P(B). (Hint: Note that "at least one" means one or more than one. Think of how A and B are related.)
(c) Consider the event C = {at least one Tail occurs}. Given the event B in (b), compute P(C | B), the conditional probability of C given B.

Solution. Let us define the events E_k = {the k-th toss is a Head}, for k = 1, 2, ..., 5. By assumption, the E_k's are independent.
(a) Notice that A = E_1^c ∩ E_2^c ∩ ... ∩ E_5^c. Thanks to independence, we find
P(A) = ∏_{k=1}^{5} P(E_k^c) = (1 − p)^5.
(b) Notice that B = A^c. Therefore, P(B) = 1 − (1 − p)^5.
(c) Notice that B ∩ C = {there is at least one Tail and at least one Head}. The complement of this event is
(B ∩ C)^c = {HHHHH, TTTTT}.
Therefore,
P(B ∩ C) = 1 − P((B ∩ C)^c) = 1 − p^5 − (1 − p)^5.
So,
P(C | B) = P(B ∩ C) / P(B) = (1 − p^5 − (1 − p)^5) / (1 − (1 − p)^5).

3. You toss an unbiased coin until you observe two Heads (in total); once you observe two Heads, you stop tossing. Assume that each toss is independent of the previous tosses.
(a) What is the probability that you stop after two tosses?
(b) What is the probability that you stop after three tosses?
(c) Let X be the number of tosses. Write down the probability mass function (PMF) of X.

Solution. Let us define the events E_k = {you stop after k tosses}, for k ≥ 2.
(a) Notice that E_2 = {HH}. Since the coin is unbiased, P(H) = 1/2. Therefore, by independence, P(E_2) = (1/2)^2.
(b) Notice that E_3 = {HTH, THH}. Therefore, P(E_3) = 2 (1/2)^3.
(c) Observe that the event {X = k} is equivalent to E_k. Notice that E_k contains sequences of H, T that end with an H and that contain a single H in the remaining k − 1 entries. There are k − 1 such sequences and the probability of each sequence is (1/2)^k. Therefore, P(E_k) = (k − 1)(1/2)^k. Thus the PMF of X is given as,
P_X(k) = P(X = k) = (k − 1)(1/2)^k, if k is an integer greater than 1; 0, otherwise.

4. Suppose X is a random variable whose probability mass function (PMF) is given as
P_X(k) = c 2^{−k}, if k is an integer in [−2, 2]; 0, otherwise,
where c is a constant. Suppose also that Y = X^2.
(a) Determine c.
(b) Compute the probability of the event {X ≤ 1} (give your answer in terms of c if you have no answer for part (a)).
(c) Compute the probability of the event {Y ≤ 1}.
(d) Find P_Y, the PMF of Y.

Solution. (a) Recall that P_X sums to one. Therefore, we must have
∑_{k=−2}^{2} c 2^{−k} = c (31/4) = 1 ⟹ c = 4/31.
(b) Notice that {X ≤ 1} = {X = −2} ∪ {X = −1} ∪ {X = 0} ∪ {X = 1}. Thus,
P(X ≤ 1) = ∑_{k=−2}^{1} c 2^{−k} = (15/2) c = 30/31.
(c) Notice that {Y ≤ 1} = {X = −1} ∪ {X = 0} ∪ {X = 1}. Thus,
P(Y ≤ 1) = ∑_{k=−1}^{1} c 2^{−k} = (7/2) c = 14/31.
(d) Observe that {Y = 0} = {X = 0}, {Y = 1} = {X = −1} ∪ {X = 1}, {Y = 4} = {X = −2} ∪ {X = 2}. Therefore,
P(Y = k) = P_X(0) = 4/31, if k = 0; P_X(−1) + P_X(1) = 10/31, if k = 1; P_X(−2) + P_X(2) = 17/31, if k = 4; 0, otherwise.
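The PMF derived in Problem 3(c) is easy to test empirically. The snippet below (an added illustration) tosses a fair coin until two Heads appear and compares the observed frequencies with (k − 1)(1/2)^k.

```python
import random
from collections import Counter

def tosses_until_two_heads():
    heads = count = 0
    while heads < 2:
        heads += random.random() < 0.5
        count += 1
    return count

n = 500_000
counts = Counter(tosses_until_two_heads() for _ in range(n))
for k in range(2, 8):
    print(k, counts[k] / n, (k - 1) * 0.5**k)   # empirical vs. theoretical
```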

MAT 271E Homework 3

1. There are 10 balls in an urn, numbered from 1 to 10. You draw a ball and then, without putting it back, draw another ball. Let X be the number on the first ball. Let Y be the number on the second ball.
(a) Find P_X(k), the probability mass function (PMF) of X.
(b) Find P_Y(k), the PMF of Y.
(c) Let us define a new random variable Z = X + Y. Find the expected value of Z.
(Hint: Recall that ∑_{n=1}^{k} n = k (k + 1)/2.)

Solution. (a) Notice that any number is equally likely. Therefore the PMF of X is
P_X(k) = 1/10, if k ∈ {1, 2, ..., 10}; 0, otherwise.
(b) By symmetry we can say that P(Y = k) = P(Y = j), where k and j are two integers in the range [1, 10]. Therefore P_Y(k) = P_X(k). For a more formal argument, we can condition on X. That is, if k is an integer in the range [1, 10], we have
P(Y = k) = P(Y = k | X = k) P(X = k) + P(Y = k | X ≠ k) P(X ≠ k) = 0 · (1/10) + (1/9)(9/10) = 1/10.
Thus, P_Y(k) = P_X(k).
(c) Note that E(Z) = E(X) + E(Y). First,
E(X) = ∑_{n=1}^{10} n (1/10) = 55/10 = 11/2.
Since P_X(k) = P_Y(k), we have E(X) = E(Y). Therefore, E(Z) = 11.

2. Suppose that the PMF of a random variable X is given as
P_X(k) = (2/3)(1/3)^{k−1}, if k is a positive integer; 0, otherwise.
Also, let Y be another random variable defined as Y = 2^X.
(a) Find the PMF of Y.
(b) Find E(Y), the expected value of Y.
(Note: Recall that ∑_{n=1}^{∞} p^n = p/(1 − p) if |p| < 1.)

Solution. (a) Notice that P(X = k) = P(Y = 2^k). Therefore,
P_Y(n) = (2/3)(1/3)^{log_2(n) − 1}, if log_2(n) is a positive integer; 0, otherwise.
(b) We compute E(Y) as,
E(Y) = ∑_{k=1}^{∞} 2^k (2/3)(1/3)^{k−1} = (4/3) ∑_{k=1}^{∞} (2/3)^{k−1} = (4/3) · 3 = 4.
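A quick simulation (added here as a check, not part of the original solutions) confirms both expected values: E(Z) = 11 in Problem 1 and E(Y) = 4 in Problem 2. Since Y = 2^X is heavy-tailed, its sample mean converges slowly.

```python
import random

n = 500_000
tot_z = tot_y = 0
for _ in range(n):
    a, b = random.sample(range(1, 11), 2)   # two draws without replacement
    tot_z += a + b
    k = 1                                   # P(X = k) = (2/3)(1/3)^(k-1)
    while random.random() >= 2 / 3:
        k += 1
    tot_y += 2 ** k
print(tot_z / n)   # ≈ E(Z) = 11
print(tot_y / n)   # ≈ E(Y) = 4 (noisy: Y has infinite variance)
```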

3. A fair die is rolled once. Based on this experiment, two random variables are defined as
X = 1, if the outcome is even; X = 0, if the outcome is odd,
Y = 1, if the outcome is ≤ 3; Y = 0, if the outcome is > 3.
Also, a random variable Z is defined as Z = X − 2Y.
(a) Write down the joint probability mass function (PMF) of X and Y (in the form of a table, if you like).
(b) Write down the PMF of Z.
(c) Find E(Z), the expected value of Z.

Solution. (a) Notice that for this experiment, the sample space is Ω = {1, 2, 3, 4, 5, 6}. The event (X = 1, Y = 1) is equivalent to {2}. Therefore P_{X,Y}(1, 1) = 1/6. The event (X = 1, Y = 0) is equivalent to {4, 6}, so P_{X,Y}(1, 0) = 2/6. The event (X = 0, Y = 1) is equivalent to {1, 3}, so P_{X,Y}(0, 1) = 2/6. Finally, the event (X = 0, Y = 0) is equivalent to {5}, so P_{X,Y}(0, 0) = 1/6. P_{X,Y}(x, y) = 0 for any other (x, y) pair.
(b) Z can take at most four different values with non-zero probability. Those are −1 (when X = 1, Y = 1), 1 (when X = 1, Y = 0), −2 (when X = 0, Y = 1), 0 (when X = 0, Y = 0). Therefore,
P_Z(k) = P_{X,Y}(1, 1) = 1/6, if k = −1; P_{X,Y}(1, 0) = 2/6, if k = 1; P_{X,Y}(0, 1) = 2/6, if k = −2; P_{X,Y}(0, 0) = 1/6, if k = 0; 0, otherwise.
(c) We can compute E(Z) using P_Z as,
E(Z) = (−1)(1/6) + (1)(2/6) + (−2)(2/6) + (0)(1/6) = −1/2.

4. Consider a square whose corners are labeled c_1, c_2, c_3, c_4, with c_3 and c_4 on top and c_1, c_2 at the bottom, so that c_1 is adjacent to c_2 and c_3, and c_4 is adjacent to c_2 and c_3. A particle is placed on one of the corners and at each step moves from one corner to another connected by an edge. Notice that each corner is connected to only two corners. If the particle reaches c_4, it is trapped and stops moving. Assume that the steps taken by the particle are independent and the particle chooses its next stop randomly (i.e., all possible choices are equally likely). Suppose that the particle is initially placed at c_1. Also, let X denote the total number of steps taken by the particle to reach c_4.
(a) Find the probability that X = 1.
(b) Find the probability that X = 2.
(c) Find the probability that X = 4.
(d) Write down P_X, the probability mass function (PMF) of X.
(e) Compute E(X), the expected value of X.

Solution. A reasonable sample space for this experiment consists of sequences of c_i's that end with c_4, where the only c_4 occurs at the end. In the following, let A_k denote the event {X = k}.
(a) The particle needs at least two steps, therefore A_1 = ∅ and P(X = 1) = 0.
(b) Notice that A_2 = {c_2 c_4, c_3 c_4}. The probability of any element of A_2 is equal to 1/4. Therefore P(A_2) = 1/2.
(c) Notice that A_4 = {c_2 c_1 c_2 c_4, c_2 c_1 c_3 c_4, c_3 c_1 c_2 c_4, c_3 c_1 c_3 c_4}. Each sequence in A_4 can be formed by adding c_2 c_1 or c_3 c_1 in front of a sequence in A_2. Thus the number of sequences is doubled compared to A_2, but the probability of each sequence is now 1/2^4. Therefore, P(A_4) = 4/2^4 = 1/4.
(d) Notice that the pattern observed in part (c) holds for all k. That is, if k is even, then P(A_k) = 2^{k/2} / 2^k = 2^{−k/2}. If k is odd, then P(A_k) = 0. Therefore,
P_X(k) = 2^{−k/2}, if k is a positive even integer; 0, otherwise.
(e) We compute E(X) using the definition as,
E(X) = ∑_{k > 0, k even} k 2^{−k/2} = ∑_{n=1}^{∞} 2n 2^{−n} = 2 ∑_{n=1}^{∞} n (1/2)^n = 2 · (1/2) / (1 − 1/2)^2 = 4.
Note: To evaluate the series, consider differentiating both sides of ∑_k p^k = 1/(1 − p), for |p| < 1.
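The random walk of Problem 4 is simple to simulate; the following added sketch estimates E(X) and P(X = 2) and should reproduce the values 4 and 1/2 derived above.

```python
import random

adj = {1: (2, 3), 2: (1, 4), 3: (1, 4), 4: ()}   # edges of the square

def steps_to_trap():
    pos, steps = 1, 0            # the particle starts at corner c_1
    while pos != 4:              # c_4 traps the particle
        pos = random.choice(adj[pos])
        steps += 1
    return steps

n = 200_000
samples = [steps_to_trap() for _ in range(n)]
print(sum(samples) / n)                   # ≈ E(X) = 4
print(sum(s == 2 for s in samples) / n)   # ≈ P(X = 2) = 1/2
```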

MAT 271E Homework 4

1. Consider a random variable X whose probability density function (pdf) is
f_X(t) = (t + 2)/6, if −2 ≤ t ≤ 0; 1/3, if 0 < t ≤ 2; 0, otherwise.
Let us define the events A and B as
A = {X > 1}, B = {|X| > 1}.
(a) Compute the probability of A.
(b) Compute the probability of B.
(c) Compute the conditional probability P(A | B).
(d) Compute E(X).

Solution. (a) Notice that
P(X > 1) = ∫_1^2 f_X(t) dt = ∫_1^2 (1/3) dt = 1/3.
(b) Notice that
P(|X| > 1) = P(X < −1) + P(X > 1) = ∫_{−2}^{−1} f_X(t) dt + ∫_1^2 f_X(t) dt = 1/12 + 1/3 = 5/12.
(c) We have
P(A | B) = P(A ∩ B) / P(B) = P(X > 1) / P(B) = (1/3) / (5/12) = 4/5.
(d) Using the definition of expected value, we compute
E(X) = ∫ t f_X(t) dt = ∫_{−2}^{0} (t^2/6 + t/3) dt + ∫_0^2 (t/3) dt = −2/9 + 2/3 = 4/9.
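To double-check Problem 1, one can sample from the piecewise pdf by rejection and compare the empirical values of P(A | B) and E(X) with 4/5 and 4/9. This Python sketch is an addition to the original text.

```python
import random

def sample_x():
    # Rejection sampling from f_X on [-2, 2]; the density never exceeds 1/3.
    while True:
        t = random.uniform(-2, 2)
        f = (t + 2) / 6 if t <= 0 else 1 / 3
        if random.uniform(0, 1 / 3) < f:
            return t

n = 400_000
xs = [sample_x() for _ in range(n)]
a = sum(x > 1 for x in xs)
b = sum(abs(x) > 1 for x in xs)
print(a / b)         # ≈ P(A | B) = 4/5
print(sum(xs) / n)   # ≈ E(X) = 4/9
```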

2. Let X be a random variable. Suppose we are given the following information regarding its distribution, for t ≥ 0:
P({X ≥ t}) = (1/2) e^{−t}, P({|X| ≤ t}) = 1 − e^{−t}.
(a) Compute the probability of the event A = {−1 ≤ X ≤ 2}.
(b) Determine F_X(t), the cumulative distribution function of X.
(c) Determine f_X(t), the probability density function of X.

Solution. (a) Notice that we can express A as
A = (−1 ≤ X ≤ 1) ∪ (1 < X ≤ 2) = B ∪ C.
P(B) can be computed easily as P(B) = P(|X| ≤ 1) = 1 − e^{−1}.
For P(C), observe that C ∪ D = (X > 1), where D = (X > 2). Since C and D are disjoint, we obtain
P(C) = P(X > 1) − P(X > 2) = (1/2) e^{−1} − (1/2) e^{−2} = (1/2)(e^{−1} − e^{−2}).
Therefore, P(A) = P(B) + P(C) = 1 − (1/2) e^{−1} − (1/2) e^{−2}.
(b) Observe that if t ≥ 0, P(X ≤ t) = 1 − (1/2) e^{−t}. If t < 0, notice that
(X ≤ t) ∪ (t ≤ X ≤ −t) = (X ≤ −t).
Therefore, since the two events on the left-hand side are disjoint, we can write
P(X ≤ t) = P(X ≤ −t) − P(t ≤ X ≤ −t) = 1 − (1/2) e^{t} − (1 − e^{t}) = (1/2) e^{t}.
Thus the cdf of X is
F_X(t) = (1/2) e^{t}, if t < 0; 1 − (1/2) e^{−t}, if t ≥ 0.
(c) Differentiating the cdf, we find the pdf as
f_X(t) = (1/2) e^{t}, if t < 0; (1/2) e^{−t}, if t ≥ 0.

3. Let X be a random variable whose probability density function (pdf) is given by
f_X(t) = t/2, for t ∈ [0, 2]; 0, for t ∉ [0, 2].
Also, let A be the event A = {X ≥ 1}.
(a) Determine F_X(t) = P({X ≤ t}), the cumulative distribution function (cdf) of X.
(b) Compute E(X), the expected value of X.

Solution. (a) Integrating the pdf, we find the cdf as
F_X(t) = ∫_{−∞}^{t} f_X(s) ds = 0, if t < 0; t^2/4, if 0 ≤ t ≤ 2; 1, if 2 < t.
(b) We compute the expected value as
E(X) = ∫ t f_X(t) dt = ∫_0^2 (t^2/2) dt = 4/3.

4. Let X be a continuous random variable. Suppose we are given that, for t_1 ≤ t_2,
P({t_1 ≤ X ≤ t_2}) = (t_2 − t_1) / ((t_1 + 1)(t_2 + 1)), if 0 ≤ t_1 ≤ t_2; 0, if t_1 ≤ t_2 < 0.
(a) Find the probability of the event {X ≥ 0}.
(b) Find the probability of the event {X ≤ 3}.
(c) Find f_X(t), the probability density function (pdf) of X.

Solution. (a) Notice that
P(X ≥ 0) = lim_{t→∞} P(0 ≤ X ≤ t) = lim_{t→∞} t/(t + 1) = 1.
(b) Observe that
P(X ≤ 3) = P(X < 0) + P(0 ≤ X ≤ 3) = 0 + 3/4 = 3/4.
(c) Observe that P(X < 0) = 0. Therefore, if t ≥ 0, we have
P(X ≤ t) = P(0 ≤ X ≤ t) = t/(t + 1).
From this, we obtain the cdf of X as
F_X(t) = 0, if t < 0; t/(t + 1), if t ≥ 0.
Differentiating the cdf, we obtain the pdf as
f_X(t) = 0, if t < 0; 1/(t + 1)^2, if t ≥ 0.
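Problem 4's cdf F_X(t) = t/(t + 1) is convenient for inverse-cdf sampling: solving u = t/(t + 1) gives t = u/(1 − u). The added sketch below uses this to verify the interval probability formula numerically.

```python
import random

n = 400_000
# Plug uniform draws into the inverse cdf t = u / (1 - u).
xs = [u / (1 - u) for u in (random.random() for _ in range(n))]
t1, t2 = 1.0, 3.0
print(sum(t1 <= x <= t2 for x in xs) / n)   # empirical probability
print((t2 - t1) / ((t1 + 1) * (t2 + 1)))    # formula, here 1/4
```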

MAT 271E Homework 5

1. Consider a triangle of unit height. Suppose we choose a point randomly inside the triangle (i.e., any point inside the triangle is equally likely to be chosen). Let H denote the vertical distance of the point to the base of the triangle.
(a) Specify f_H(t), the probability density function (pdf) of H.
(b) Compute E(H), the expected value of H.

Solution. (a) Consider the cdf F_H(t) = P(H ≤ t). Observe that 0 ≤ H ≤ 1 always. Therefore,
F_H(t) = 0, if t < 0; 1, if 1 < t.
Finally, for 0 ≤ t ≤ 1, since the point is chosen randomly from the triangle, and the part of the triangle above height t is a similar triangle whose area is a fraction (1 − t)^2 of the whole, we have
P(H ≤ t) = 1 − (1 − t)^2.
Differentiating the cdf, we find the pdf as
f_H(t) = 2 (1 − t), if 0 ≤ t ≤ 1; 0, otherwise.
(b) Using the definition of expected value, we can now compute
E(H) = ∫ t f_H(t) dt = ∫_0^1 2 t (1 − t) dt = 1/3.
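The height density in Problem 1 can be reproduced by throwing uniform points into a triangle. The sketch below is an addition; the particular triangle with vertices (0,0), (1,0), (0,1) is an arbitrary choice of unit height, which is all that matters for the distribution of H.

```python
import random

def sample_height():
    # Rejection sampling: a uniform point in the triangle with vertices
    # (0,0), (1,0), (0,1); its height H then has density 2(1 - t) on [0, 1].
    while True:
        x, h = random.random(), random.random()
        if x <= 1 - h:
            return h

n = 500_000
print(sum(sample_height() for _ in range(n)) / n)   # ≈ E(H) = 1/3
```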

2. Suppose we break a unit-length stick into two pieces randomly (i.e., we choose the break point randomly on the stick). Let X be the length of the shorter piece and Y be the length of the longer piece.
(a) Compute E(X). (Hint: If you cannot compute this directly, do part (b) first.)
(b) Find the pdf of X.
(c) Find the pdf and compute the expected value of Y.

Solution. (a) Suppose Z is uniformly distributed on [0, 1]. Then X = min(Z, 1 − Z). We can compute the expected value of X as
E(X) = ∫ min(t, 1 − t) f_Z(t) dt = ∫_0^1 min(t, 1 − t) dt = ∫_0^{1/2} t dt + ∫_{1/2}^1 (1 − t) dt = 1/8 + 1/8 = 1/4,
where we used the observation
min(t, 1 − t) = t, if t ≤ 1/2; 1 − t, if 1/2 < t.
(b) We will make use of the cdf. Notice that for Z defined as in (a), the event (X ≤ t) can be written equivalently as
(min(Z, 1 − Z) ≤ t) = (Z ≤ t) ∪ (1 − Z ≤ t) = E_1 ∪ E_2.
Notice that E_2 = (1 − t ≤ Z). Recall that P(E_1 ∪ E_2) = P(E_1) + P(E_2) − P(E_1 ∩ E_2). Observe,
P(E_1) = 0, if t < 0; t, if 0 ≤ t ≤ 1; 1, if 1 < t.
Check that P(E_2) = P(E_1). Finally, observe that
P(E_1 ∩ E_2) = P(1 − t ≤ Z ≤ t) = 0, if t < 1/2; 2t − 1, if 1/2 ≤ t ≤ 1; 1, if 1 < t.
Thus, we obtain the cdf of X as
F_X(t) = P(E_1) + P(E_2) − P(E_1 ∩ E_2) = 0, if t < 0; 2t, if 0 ≤ t ≤ 1/2; 1, if 1/2 < t.
Differentiating, we obtain the pdf as
f_X(t) = 2, if 0 ≤ t ≤ 1/2; 0, otherwise.
Note: An alternative is to observe that E_1 ∪ E_2 = Ω − (E_1^c ∩ E_2^c) and work with an intersection instead of a union like I did above. This in fact leads to a shorter derivation. Try it!
(c) Notice that Y = 1 − X. Therefore, E(Y) = E(1 − X) = 1 − E(X) = 3/4. Also,
F_Y(t) = P(Y ≤ t) = P(1 − X ≤ t) = P(1 − t ≤ X) = 1 − P(X ≤ 1 − t) = 1 − F_X(1 − t).
Differentiating, we find the pdf of Y as
f_Y(t) = f_X(1 − t) = 2, if 1/2 ≤ t ≤ 1; 0, otherwise.

3. Consider a disk with unknown radius R, centered around the origin O. Suppose your friend picks a point p on this disk randomly (i.e., any point is equally likely). Also, let the distance of p to the origin be denoted as X.
(a) Find the probability of the event {X ≤ t} (in terms of R) for 0 ≤ t ≤ R.
(b) Find f_X(t), the pdf of X.

Solution. (a) Notice that the ratio of the set of points inside the disk that are closer to the origin than t to the whole disk is given by t^2/R^2. Therefore, P(X ≤ t) = t^2/R^2.
(b) It follows from the observation in (a) that
F_X(t) = 0, if t < 0; t^2/R^2, if 0 ≤ t ≤ R; 1, if R < t.
Differentiating, we obtain the pdf as
f_X(t) = 2t/R^2, if 0 ≤ t ≤ R; 0, otherwise.
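Both problems on this page admit short simulations; the following added snippet checks E(X) = 1/4 for the stick of Problem 2 and P(X ≤ t) = t^2/R^2 for the disk of Problem 3 (with an arbitrary R = 2 for the check).

```python
import math, random

n = 400_000
# Problem 2: the shorter piece of a broken unit stick has mean 1/4.
short = sum(min(z, 1 - z) for z in (random.random() for _ in range(n))) / n
print(short)   # ≈ 1/4

# Problem 3: uniform points in a disk of radius R, by rejection from a square.
R = 2.0
ds = []
while len(ds) < n:
    x, y = random.uniform(-R, R), random.uniform(-R, R)
    if x * x + y * y <= R * R:
        ds.append(math.hypot(x, y))
t = 1.3
print(sum(d <= t for d in ds) / n, t * t / (R * R))   # ≈ t^2 / R^2
```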

MAT 271E Homework 6

1. Let X and Y be independent random variables. Assume that both X and Y are uniformly distributed on [0, 1].
(a) Compute the probability of the event A = {X ≤ Y}.
(b) Given the event A = {X ≤ Y}, compute the probability that X ≤ 1/2.
(c) Given the event A = {X ≤ Y}, determine F_{X|A}(t), the conditional cumulative distribution function (cdf) of X.
(d) Given the event A = {X ≤ Y}, find the conditional expectation of X.

Solution. (a) Recall that
P(A) = ∫ P(A | Y = t) f_Y(t) dt = ∫_0^1 P(A | Y = t) dt.
But for 0 ≤ t ≤ 1, thanks to independence of X and Y, we have
P(A | Y = t) = P(X ≤ Y | Y = t) = P(X ≤ t | Y = t) = P(X ≤ t) = t.
Therefore,
P(A) = ∫_0^1 t dt = 1/2.
(b) Let B be the event that X ≤ 1/2. Recall that P(B | A) = P(B ∩ A)/P(A). Let C = B ∩ A. Then,
P(C) = ∫ P(C | Y = t) f_Y(t) dt = ∫_0^1 P(C | Y = t) dt.
But thanks to independence of X and Y, for 0 ≤ t ≤ 1, we have
P(C | Y = t) = P((X ≤ t) ∩ (X ≤ 1/2) | Y = t) = P((X ≤ t) ∩ (X ≤ 1/2)) = t, if t ≤ 1/2; 1/2, if 1/2 < t.
Therefore,
P(C) = ∫_0^1 P(C | Y = t) dt = ∫_0^{1/2} t dt + ∫_{1/2}^1 (1/2) dt = 3/8.
It follows that P(B | A) = P(C)/P(A) = 3/4.
(c) For 0 ≤ s ≤ 1, let us define the event C = (X ≤ s) ∩ A. Observe that
P(C | Y = t) = P((X ≤ t) ∩ (X ≤ s) | Y = t) = P((X ≤ t) ∩ (X ≤ s)) = t, if t ≤ s; s, if s < t.
Therefore,
P(C) = ∫_0^1 P(C | Y = t) dt = ∫_0^s t dt + ∫_s^1 s dt = s^2/2 + (1 − s) s = s − s^2/2.
Thus,
F_{X|A}(s) = P(X ≤ s | A) = P((X ≤ s) ∩ A) / P(A) = 0, if s < 0; 2s − s^2, if 0 ≤ s ≤ 1; 1, if 1 < s.
(d) Notice that
f_{X|A}(t) = F'_{X|A}(t) = 2 − 2t, if 0 ≤ t ≤ 1; 0, otherwise.
Therefore,
E(X | A) = ∫ t f_{X|A}(t) dt = ∫_0^1 t (2 − 2t) dt = 1/3.
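Conditioning on A = {X ≤ Y} is straightforward to imitate by simulation: keep only the trials where the event occurs. This added sketch checks parts (b) and (d).

```python
import random

n = 1_000_000
cnt = s = b = 0
for _ in range(n):
    x, y = random.random(), random.random()
    if x <= y:                  # keep only trials where A occurs
        cnt += 1
        s += x
        b += x <= 0.5
print(b / cnt)   # ≈ P(X <= 1/2 | A) = 3/4
print(s / cnt)   # ≈ E(X | A) = 1/3
```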

2. Let X be a random variable, uniformly distributed on [−1, 1]. Also, let Y = max(X, X^2).
(a) Compute the probability of the event A = {Y ≤ 1/2}.
(b) Find the cumulative distribution function (cdf) of Y.
(c) Find E(Y), the expected value of Y.

Solution. (a) Notice that
P(Y ≤ 1/2) = P(max(X, X^2) ≤ 1/2) = P((X ≤ 1/2) ∩ (X^2 ≤ 1/2)) = P((X ≤ 1/2) ∩ (−1/√2 ≤ X ≤ 1/√2)) = P(−1/√2 ≤ X ≤ 1/2) = (1/2)(1/2 + 1/√2) = (1 + √2)/4.
(b) By a similar reasoning, we find that for 0 ≤ s ≤ 1,
P(Y ≤ s) = P((X ≤ s) ∩ (X^2 ≤ s)) = P(−√s ≤ X ≤ s) = (s + √s)/2.
It follows that
F_Y(s) = P(Y ≤ s) = 0, if s < 0; (s + √s)/2, if 0 ≤ s ≤ 1; 1, if 1 < s.
(c) Since Y is a function of X, we can use f_X. We have,
E(Y) = ∫ max(t, t^2) f_X(t) dt = (1/2) ∫_{−1}^{1} max(t, t^2) dt = (1/2) (∫_{−1}^{0} t^2 dt + ∫_0^1 t dt) = 5/12,
where we used max(t, t^2) = t^2, if t < 0; t, if 0 ≤ t ≤ 1.

3. Let Y be a random variable, uniformly distributed on [0, X]. Let Z be a random variable, uniformly distributed on [0, X^2]. Suppose that, given X, the random variables Y and Z are independent.
(a) For X = 1/2, compute the probability of the event {Y ≤ Z}.
(b) Assume that X is a random variable, uniformly distributed on [0, 1]. Compute the probability of the event {Y ≤ Z}.

Solution. (a) For X = 1/2, Y is uniformly distributed on [0, 1/2] and Z is uniformly distributed on [0, 1/4]. By the independence of Y and Z, we have
P(Y ≤ Z) = ∫ P(Y ≤ Z | Z = t) f_Z(t) dt = ∫_0^{1/4} P(Y ≤ t) · 4 dt = ∫_0^{1/4} (2t) · 4 dt = 1/4.
(b) Given X = s with 0 ≤ s ≤ 1, Y is uniformly distributed on [0, s] and Z is uniformly distributed on [0, s^2]. Noting that s^2 ≤ s, we then have
P(Y ≤ Z | X = s) = ∫ P(Y ≤ Z | Z = t) f_Z(t) dt = ∫_0^{s^2} (t/s)(1/s^2) dt = s/2.
Therefore,
P(Y ≤ Z) = ∫ P(Y ≤ Z | X = s) f_X(s) ds = ∫_0^1 (s/2) ds = 1/4.
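A short simulation (added here) covers both problems above: it estimates E(max(X, X^2)) for Problem 2(c) and P(Y ≤ Z) for Problem 3(b).

```python
import random

n = 1_000_000
hits = 0
tot = 0.0
for _ in range(n):
    u = random.uniform(-1, 1)      # Problem 2: X uniform on [-1, 1]
    tot += max(u, u * u)
    x = random.random()            # Problem 3: X uniform on [0, 1]
    y = random.uniform(0, x)       # Y | X uniform on [0, X]
    z = random.uniform(0, x * x)   # Z | X uniform on [0, X^2]
    hits += y <= z
print(tot / n)    # ≈ E(Y) = 5/12
print(hits / n)   # ≈ P(Y <= Z) = 1/4
```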

4. Let X be a discrete random variable with PMF
P_X(k) = 1/10, if k ∈ {1, 2, ..., 10}; 0, otherwise.
Also, given X, let Y be a random variable with conditional PMF
P_{Y|X}(k) = 1/X, if k ∈ {1, ..., X}; 0, otherwise.
(a) Find the probability of the event {Y ≤ 9}.
(b) Find the probability of the event {Y ≤ 8}.

Solution. (a) Notice that if (X ≤ 9) occurs, then (Y ≤ 9) occurs with probability one. Observe also that the events (X ≤ 9) and (X = 10) partition the sample space. Using these, we compute,
P(Y ≤ 9) = P(Y ≤ 9 | X ≤ 9) P(X ≤ 9) + P(Y ≤ 9 | X = 10) P(X = 10) = 1 · (9/10) + (9/10)(1/10) = 99/100.
(b) Reasoning similarly,
P(Y ≤ 8) = P(Y ≤ 8 | X ≤ 8) P(X ≤ 8) + P(Y ≤ 8 | X = 9) P(X = 9) + P(Y ≤ 8 | X = 10) P(X = 10) = 1 · (8/10) + (8/9)(1/10) + (8/10)(1/10) = 218/225.

5. Suppose we roll a fair die until we observe a 6. Assume that the rolls are independent. Let X be the total sum of the rolls.
(a) Find the probability that we roll n times, where n ≥ 1.
(b) Let A_2 be the event that we roll twice. Compute E(X | A_2).
(c) Compute E(X).

Solution. (a) Observe that we roll n times if we do not observe a 6 for n − 1 rolls, followed by a 6. Thanks to independence of the rolls, the probability of this event (denoted by A_n) is given by
P(A_n) = (5/6)^{n−1} (1/6).
(b) Given that a certain roll is not a 6, the roll is uniformly distributed over the integers in [1, 5]. Therefore, its expected value is (1/5) ∑_{k=1}^{5} k = 3. Using this observation,
E(X | A_2) = E(first roll | A_2) + E(second roll | A_2) = 3 + 6 = 9.
(c) Since the events A_1, A_2, ... partition the sample space, we have
E(X) = ∑_{n=1}^{∞} E(X | A_n) P(A_n).
Noting that E(X | A_n) = 3(n − 1) + 6 = 3n + 3, we thus compute
E(X) = ∑_{n=1}^{∞} (3n + 3)(5/6)^{n−1}(1/6) = 3 · 6 + 3 = 21,
where we used the fact that the expected number of rolls, ∑_n n P(A_n), equals 6.
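Both answers above have simple empirical counterparts. The added sketch below estimates P(Y ≤ 9) from Problem 4 and E(X) from Problem 5.

```python
import random

n = 500_000
y_le_9 = 0
total = 0
for _ in range(n):
    x = random.randint(1, 10)
    y = random.randint(1, x)   # Y | X uniform on {1, ..., X}
    y_le_9 += y <= 9
    s = 0                      # Problem 5: roll a die until a 6, sum the rolls
    while True:
        r = random.randint(1, 6)
        s += r
        if r == 6:
            break
    total += s
print(y_le_9 / n)   # ≈ P(Y <= 9) = 99/100
print(total / n)    # ≈ E(X) = 21
```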

MAT 271E Homework 7

1. Let X be a random variable, uniformly distributed on [0, 1]. Also, let Z be another random variable defined as Z = X^α, where α is an unknown constant.
(a) Compute f_Z(t), the pdf of Z, in terms of α. (You can assume that α ≥ 1.)
(b) Suppose we are given two independent observations of Z as z_1, z_2. Find the maximum likelihood (ML) estimate of α, in terms of z_1, z_2.
(c) Evaluate the ML estimate you found in part (b) for z_1 = e^{−3}, z_2 = e^{−4}.

Solution. (a) Note that P(Z < 0) = 0 and P(Z > 1) = 0. For 0 ≤ t ≤ 1, we have
P(Z ≤ t) = P(X^α ≤ t) = P(X ≤ t^{1/α}) = t^{1/α}.
Differentiating, we find the pdf of Z as
f_Z(t) = (1/α) t^{1/α − 1}, if t ∈ [0, 1]; 0, if t ∉ [0, 1].
(b) The likelihood function for α is L(α) = (1/α^2)(z_1 z_2)^{1/α − 1}. Note that z_1 z_2 = e^c for c = ln(z_1 z_2). Setting the derivative of L(α) to zero, we find that the ML estimate should satisfy
L'(α) = −2 α^{−3} e^{c/α − c} − c α^{−4} e^{c/α − c} = 0.
Solving this, we find the ML estimate as
α_ML = −c/2 = −(1/2) ln(z_1 z_2).
(c) Plugging in the values for z_1 and z_2, we find the ML estimate as α_ML = 7/2.

2. Consider a disk with unknown radius R, centered around the origin O. Suppose your friend picks a point p on this disk randomly (i.e., any point is equally likely). Also, let the distance of p to the origin be denoted as X.
(a) Find the probability of the event {X ≤ t} (in terms of R) for 0 ≤ t ≤ R.
(b) Find f_X(t), the pdf of X.
(c) Consider R̂ = 3X/2 as an estimator for R. Is R̂ biased or unbiased? If it is biased, propose an unbiased estimator for R. If it is unbiased, explain why.

Solution. (a) For 0 ≤ t ≤ R, we find P(X ≤ t) by computing the ratio of the areas of disks with radius t and R, as
P(X ≤ t) = (π t^2) / (π R^2) = t^2/R^2.
(b) Differentiating the cdf, we find
f_X(t) = 0, if t < 0; 2t/R^2, if 0 ≤ t ≤ R; 0, if R < t.
(c) We compute
E(R̂) = (3/2) ∫_0^R t f_X(t) dt = (3/2) ∫_0^R (2 t^2 / R^2) dt = R.
The estimator is unbiased. (Something to think about: Consider the same question for a sphere instead of a disk.)
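The ML estimate in Problem 1 generalizes in the obvious way: with n independent observations, the same derivation gives α_ML = −(1/n) ∑ ln z_i (this generalization is an observation added here, not part of the original problem). The sketch below checks it against data generated with a known α.

```python
import math, random

true_alpha = 3.5   # arbitrary ground truth for the check
n = 100_000
zs = [random.random() ** true_alpha for _ in range(n)]
est = -sum(math.log(z) for z in zs) / n   # -(1/n) * sum of ln(z_i)
print(est)   # ≈ true_alpha
```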

3. Suppose X is a random variable, uniformly distributed on [0, 1]. That is, the probability density function (pdf) of X is given by
f_X(t) = 1, if t ∈ [0, 1]; 0, if t ∉ [0, 1].
Let Y = θ X, where θ is a constant such that θ > 1.
(a) Find f_Y(t), the pdf of Y (possibly in terms of θ).
(b) Compute the probability of the event {Y − 1 ≤ θ ≤ Y + 1} (possibly in terms of θ).

Solution. (a) Observe that Y can take values in the range [0, θ]. Now if 0 ≤ t ≤ θ, then we have P(Y ≤ t) = P(X ≤ t/θ) = t/θ. Differentiating this, we find
f_Y(t) = 1/θ, if t ∈ [0, θ]; 0, if t ∉ [0, θ].
(b) We rewrite the event as
(Y − 1 ≤ θ ≤ Y + 1) = (Y − 1 ≤ θ) ∩ (θ ≤ Y + 1) = (Y ≤ θ + 1) ∩ (θ − 1 ≤ Y) = (θ − 1 ≤ Y ≤ θ + 1).
Therefore, since Y ≤ θ always,
P(Y − 1 ≤ θ ≤ Y + 1) = P(θ − 1 ≤ Y ≤ θ + 1) = ∫_{θ−1}^{θ+1} f_Y(t) dt = ∫_{θ−1}^{θ} (1/θ) dt = 1/θ.

4. Suppose X is a continuous random variable uniformly distributed on [−1, 1]. Note that the pdf of X is given by
f_X(t) = 1/2, if −1 ≤ t ≤ 1; 0, otherwise.
Also, let Y = θ X, where θ is an unknown non-negative constant (so that |θ| = θ).
(a) Compute E(Y^2) and E(|Y|), possibly in terms of the unknown θ.
(b) Find some c (possibly in terms of θ) such that P({|Y| − c ≤ θ ≤ |Y| + c}) = 1/3.
(c) Find an unbiased estimator for θ in terms of Y.

Solution. (a) We compute these expected values using the pdf of X:
E(Y^2) = θ^2 E(X^2) = θ^2 ∫_{−1}^{1} (t^2/2) dt = θ^2/3,
E(|Y|) = θ E(|X|) = θ (∫_{−1}^{0} (−t/2) dt + ∫_0^1 (t/2) dt) = θ/2.
(b) We rewrite the event as
(|Y| − c ≤ θ ≤ |Y| + c) = (|Y| − c ≤ θ) ∩ (θ ≤ |Y| + c) = (|Y| ≤ θ + c) ∩ (θ − c ≤ |Y|) = (θ − c ≤ |Y| ≤ θ + c).
This can be expressed in terms of X as
{θ − c ≤ |Y| ≤ θ + c} = {1 − c/θ ≤ |X| ≤ 1 + c/θ}.
Therefore, since |X| ≤ 1 always,
P(|Y| − c ≤ θ ≤ |Y| + c) = P(1 − c/θ ≤ |X| ≤ 1) = c/θ.
For this probability to be 1/3, we must have c = θ/3.
(c) By part (a), we know E(|Y|) = θ/2. Therefore an unbiased estimator for θ is θ̂ = 2|Y|.
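Finally, the claims of Problem 4 can be verified numerically: the estimator 2|Y| should average to θ, and the interval [|Y| − c, |Y| + c] with c = θ/3 should contain θ about a third of the time. In this added sketch, θ = 2.5 is an arbitrary choice.

```python
import random

theta = 2.5
c = theta / 3
n = 1_000_000
est_sum = cover = 0
for _ in range(n):
    y = theta * random.uniform(-1, 1)
    est_sum += 2 * abs(y)                       # the estimator 2|Y|
    cover += abs(y) - c <= theta <= abs(y) + c
print(est_sum / n)   # ≈ theta
print(cover / n)     # ≈ 1/3
```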


More information

Discrete Mathematics for CS Spring 2006 Vazirani Lecture 22

Discrete Mathematics for CS Spring 2006 Vazirani Lecture 22 CS 70 Discrete Mathematics for CS Spring 2006 Vazirani Lecture 22 Random Variables and Expectation Question: The homeworks of 20 students are collected in, randomly shuffled and returned to the students.

More information

Introduction to Probability. Ariel Yadin. Lecture 1. We begin with an example [this is known as Bertrand s paradox]. *** Nov.

Introduction to Probability. Ariel Yadin. Lecture 1. We begin with an example [this is known as Bertrand s paradox]. *** Nov. Introduction to Probability Ariel Yadin Lecture 1 1. Example: Bertrand s Paradox We begin with an example [this is known as Bertrand s paradox]. *** Nov. 1 *** Question 1.1. Consider a circle of radius

More information

Chapter 4 Multiple Random Variables

Chapter 4 Multiple Random Variables Chapter 4 Multiple Random Variables Chapter 41 Joint and Marginal Distributions Definition 411: An n -dimensional random vector is a function from a sample space S into Euclidean space n R, n -dimensional

More information

Midterm Exam 1 (Solutions)

Midterm Exam 1 (Solutions) EECS 6 Probability and Random Processes University of California, Berkeley: Spring 07 Kannan Ramchandran February 3, 07 Midterm Exam (Solutions) Last name First name SID Name of student on your left: Name

More information

More on Distribution Function

More on Distribution Function More on Distribution Function The distribution of a random variable X can be determined directly from its cumulative distribution function F X. Theorem: Let X be any random variable, with cumulative distribution

More information

EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015.

EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015. EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015 Midterm Exam Last name First name SID Rules. You have 80 mins (5:10pm - 6:30pm)

More information

Single Maths B: Introduction to Probability

Single Maths B: Introduction to Probability Single Maths B: Introduction to Probability Overview Lecturer Email Office Homework Webpage Dr Jonathan Cumming j.a.cumming@durham.ac.uk CM233 None! http://maths.dur.ac.uk/stats/people/jac/singleb/ 1 Introduction

More information

Randomized Algorithms

Randomized Algorithms Randomized Algorithms Prof. Tapio Elomaa tapio.elomaa@tut.fi Course Basics A new 4 credit unit course Part of Theoretical Computer Science courses at the Department of Mathematics There will be 4 hours

More information

Lecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019

Lecture 10: Probability distributions TUESDAY, FEBRUARY 19, 2019 Lecture 10: Probability distributions DANIEL WELLER TUESDAY, FEBRUARY 19, 2019 Agenda What is probability? (again) Describing probabilities (distributions) Understanding probabilities (expectation) Partial

More information

Chapter 2: Probability Part 1

Chapter 2: Probability Part 1 Engineering Probability & Statistics (AGE 1150) Chapter 2: Probability Part 1 Dr. O. Phillips Agboola Sample Space (S) Experiment: is some procedure (or process) that we do and it results in an outcome.

More information

RYERSON UNIVERSITY DEPARTMENT OF MATHEMATICS MTH 514 Stochastic Processes

RYERSON UNIVERSITY DEPARTMENT OF MATHEMATICS MTH 514 Stochastic Processes RYERSON UNIVERSITY DEPARTMENT OF MATHEMATICS MTH 514 Stochastic Processes Midterm 2 Assignment Last Name (Print):. First Name:. Student Number: Signature:. Date: March, 2010 Due: March 18, in class. Instructions:

More information