Problem Solutions Chapter 4

Size: px
Start display at page:

Download "Problem Solutions Chapter 4"

Transcription

1 Problem Solutions Chapter 4 Problem 4.. Solution (a) The probability P [, 3] can be found be evaluating the joint CDF F, (x, y) at x andy 3. This yields P [, 3] F, (, 3) ( e )( e 3 ) () (b) To find the marginal CDF of, F (x), we simply evaluate the joint CDF at y. F (x) F, (x, ) { e x x () (c) Likewise for the marginal CDF of, we evaluate the joint CDF at. F (y) F, (,y) { e y y (3) Problem 4.. Solution (a) Because the probability that any random variable is less than is zero, we have F, (x, ) P [ x, ] P [ ] () (b) The probability that any random variable is less than infinity is always one. F, (x, ) P [ x, ]P [ x] F (x) () (c) Although P [ ],P [ ]. Therefore the following is true. F, (, ) P [, ] P [ ] (3) (d) Part (d) follows the same logic as that of part (a). F, (,y)p [, y] P [ ] (4) (e) Analogous to Part (b), we find that F, (,y)p [, y] P [ y] F (y) (5) 4

2 Problem 4..3 Solution We wish to find P [x x ]orp[y y ]. B {y y } so that We define events A {y y } and P [A B] P [A]+P [B] P [AB] () Keep in mind that the intersection of events A and B are all the outcomes such that both A and B occur, specifically, AB {x x,y y }. It follows that By Theorem 4.5, P [A B] P [x x ]+P [y y ] P [x x,y y ]. () P [x x,y y ] F, (x,y ) F, (x,y ) F, (x,y )+F, (x,y ). (3) Expressed in terms of the marginal and joint CDFs, P [A B] F (x ) F (x )+F (y ) F (y ) (4) F, (x,y )+F, (x,y )+F, (x,y ) F, (x,y ) (5) Problem 4..4 Solution Its easy to show that the properties of Theorem 4. are satisfied. However, those properties are necessary but not sufficient to show F (x, y) is a CDF. To convince ourselves that F (x, y) is a valid CDF, we show that for all x x and y y, In this case, for x x and y y, Theorem 4.5 yields P [x < x,y < y ] () P [x < x,y < y ]F(x,y ) F (x,y ) F (x,y )+F(x,y ) () F (x ) F (y ) F (x ) F (y ) (3) F (x ) F (y )+F (x ) F (y ) (4) [F (x ) F (x )][F (y ) F (y )] (5) (6) Hence, F (x)f (y) is a valid joint CDF. Problem 4..5 Solution In this problem, we prove Theorem 4.5 which states P [x < x,y < y ]F, (x,y ) F, (x,y ) () F, (x,y )+F, (x,y ) () (a) The events A, B, andc are y y y y y y (3) x x x x x A B C 5

3 (b) In terms of the joint CDF F, (x, y), we can write P [A] F, (x,y ) F, (x,y ) (4) P [B] F, (x,y ) F, (x,y ) (5) P [A B C] F, (x,y ) F, (x,y ) (6) (c) Since A, B, andc are mutually exclusive, P [A B C] P [A]+P [B]+P [C] (7) However, since we want to express P [C] P [x < x,y < y ] (8) in terms of the joint CDF F, (x, y), we write P [C] P [A B C] P [A] P [B] (9) F, (x,y ) F, (x,y ) F, (x,y )+F, (x,y ) () which completes the proof of the theorem. Problem 4..6 Solution The given function is F, (x, y) { e (x+y) x, y First, we find the CDF F (x) andf (y). { x F (x) F, (x, ) { y F (y) F, (,y) () () (3) Hence, for any x ory, For x andy, this implies However, P [ >x] P [ > y] (4) P [{ >x} { > y}] P [ >x]+p [ > y] (5) P [{ >x} { > y}] P [ x, y] ( e (x+y) )e (x+y) (6) Thus, we have the contradiction that e (x+y) for all x, y. We can conclude that the given function is not a valid CDF. 6

4 Problem 4.. Solution In this problem, it is helpful to label points with nonzero probability on the, plane: y P, (x, y) 4 3 3c 6c c c c 4c x 3 4 (a) We must choose c so the PMF sums to one: P, (x, y) c x y () x,,4 y,3 x,,4 y,3 c [(+3)+(+3)+4(+3)]8c () Thus c /8. (b) The event { < } has probability P [ < ] P, (x, y) x,,4 y<x () + () + 4( + 3) (3) (c) The event { > } has probability P [ > ] P, (x, y) x,,4 y>x (3) + (3) + 4() (4) (d) There are two ways to solve this part. The direct way is to calculate P [ ] x,,4 yx P, (x, y) () + () 8 8 (5) The indirect way is to use the previous results and the observation that P [ ] P [ < ] P [ > ]( 8/8 9/8) /8 (6) (e) P [ 3] P, (x, 3) x,,4 ()(3) + ()(3) + (4)(3) (7) Problem 4.. Solution On the, plane, the joint PMF is 7

5 P, (x, y) c c 3c y c c 3c c c x (a) To find c, we sum the PMF over all possible values of and. We choose c so the sum equals one. P, (x, y) c x + y 6c +c +6c 4c () x y x,, y,, Thus c /4. (b) P [ <]P, (, ) + P, (, ) + P, (, ) + P, (, ) () c + c +c +3c 7c / (3) (c) P [ > ]P, (, ) + P, (, ) + P, (, ) + P, (, ) (4) 3c +c + c + c 7c / (5) (d) From the sketch of P, (x, y) given above, P [ ]. (e) P [ <] P, (, ) + P, (, ) + P, (, ) + P, (, ) + P, (, ) (6) 8c 8/4. (7) Problem 4..3 Solution Let r (reject) and a (accept) denote the result of each test. rr, ra, ar, aa. The sample tree is There are four possible outcomes: p r p a p r p a p r p a rr p ra p( p) ar p( p) aa ( p) 8

6 Now we construct a table that maps the sample outcomes to values of and. outcome P [ ] rr p ra p( p) ar p( p) aa ( p) This table is esentially the joint PMF P, (x, y). P, (x, y) p x,y p( p) x,y p( p) x,y ( p) x,y () () Problem 4..4 Solution The sample space is the set S {hh, ht, th, tt} and each sample point has probability /4. Each sample outcome specifies the values of and as given in the following table outcome hh ht th tt () The joint PMF can represented by the table P, (x, y) y y x /4 x /4 /4 x /4 () Problem 4..5 Solution As the problem statement says, reasonable arguments can be made for the labels being and or x and y. As we see in the arguments below, the lowercase choice of the text is somewhat arbitrary. Lowercase axis labels: For the lowercase labels, we observe that we are depicting the masses associated with the joint PMF P, (x, y) whose arguments are x and y. Since the PMF function is defined in terms of x and y, the axis labels should be x and y. Uppercase axis labels: On the other hand, we are depicting the possible outcomes (labeled with their respective probabilities) of the pair of random variables and. The corresponding axis labels should be and just as in Figure 4.. The fact that we have labeled the possible outcomes by their probabilities is irrelevant. Further, since the expression for the PMF P, (x, y) given in the figure could just as well have been written P, (, ), it is clear that the lowercase x and y are not what matter. 9

7 Problem 4..6 Solution As the problem statement indicates, y<nif and only if A: the first y tests are acceptable, and B: testy + is a rejection. Thus P [ y] P [AB]. Note that since the number of acceptable tests before the first failure cannot exceed the number of acceptable circuits. Moreover, given the occurrence of AB, the event x<noccurs if and only if there are x y acceptable circuits in the remaining n y tests. Since events A, B and C depend on disjoint sets of tests, they are independent events. Thus, for y x<n, P, (x, y) P [ x, y] P [ABC] () P [A] P [B] P [C] () ( ) n y p y ( p) p x y ( p) n y (x y) (3) }{{}}{{} x y P [A] P [B] }{{} P [C] ( ) n y p x ( p) n x (4) x y The case y x n occurs when all n tests are acceptable and thus P, (n, n) p n. Problem 4..7 Solution The joint PMF of and K is P K, (k, x) P [K k, x], which is the probability that K k and x. This means that both events must be satisfied. The approach we use is similar to that used in finding the Pascal PMF in Example.5. Since can take on only the two values and, let s consider each in turn. When that means that a rejection occurred on the last test and that the other k rejections must have occurred in the previous n tests.thus, ( ) n P K, (k, ) ( p) k p n (k ) ( p) k,...,n () k When the last test was acceptable and therefore we know that the K k n tails must have occurred in the previous n tests. In this case, ( ) n P K, (k, ) ( p) k p n k p k,...,n () k We can combine these cases into a single complete expression for the joint PMF. ( n k ) ( p) k p n k x,k,,...,n ( P K, (k, x) n ) k ( p) k p n k x,k,,...,n (3) Problem 4..8 Solution Each circuit test produces an acceptable circuit with probability p. Let K denote the number of rejected circuits that occur in n tests and is the number of acceptable circuits before the first reject. The joint PMF, P K, (k, x) P [K k, x] can be found by realizing that {K k, x} occurs if and only if the following events occur: 3

8 A The first x tests must be acceptable. B Test x+ must be a rejection since otherwise we would have x+ acceptable at the beginnning. C The remaining n x tests must contain k rejections. Since the events A, B and C are independent, the joint PMF for x + k r, x andk is ( ) n x P K, (k, x) p x ( p) ( p) k p n x (k ) () }{{}}{{} k P [A] P [B] }{{} P [C] After simplifying, a complete expression for the joint PMF is { ( n x ) P K, (k, x) k p n k ( p) k x + k n, x,k () Problem 4.3. Solution On the, plane, the joint PMF P, (x, y) is 4 3 y 3c c P, (x, y) 6c c c 4c 3 4 x By choosing c /8, the PMF sums to one. (a) The marginal PMFs of and are P (x) 4/8 x 8/8 x P, (x, y) 6/8 x 4 y,3 P (y) x,,4 P, (x, y) 7/8 y /8 y 3 () () (b) The expected values of and are E [] xp (x) (4/8) + (8/8) + 4(6/8) 3 (3) x,,4 E [ ] yp (y) 7/8 + 3(/8) 5/ (4) y,3 3

9 (c) The second moments are E [ ] xp (x) (4/8) + (8/8) + 4 (6/8) 73/7 (5) x,,4 E [ ] yp (y) (7/8) + 3 (/8) 7 (6) y,3 The variances are Var[] E [ ] (E []) /7 Var[ ]E [ ] (E [ ]) 3/4 (7) The standard deviations are σ /7 andσ 3/4. Problem 4.3. Solution On the, plane, the joint PMF is P, (x, y) c c 3c y c 3c c c c x The PMF sums to one when c /4. (a) The marginal PMFs of and are P (x) P (y) y,, x,, P, (x, y) P, (x, y) 6/4 x, /4 x 5/4 y, 4/4 y () () (b) The expected values of and are E [] xp (x) (6/4) + (6/4) (3) E [ ] x,, y,, yp (y) (5/4) + (5/4) (4) (c) Since and both have zero mean, the variances are Var[] E [ ] x P (x) ( ) (6/4) + (6/4) 4/7 (5) Var[ ]E [ ] x,, y,, The standard deviations are σ 4/7 andσ 5/7. y P (y) ( ) (5/4) + (5/4) 5/7 (6) 3

10 Problem Solution We recognize that the given joint PMF is written as the product of two marginal PMFs P N (n) and P K (k) where P N (n) P N,K (n, k) P K (k) k P N,K (n, k) n { n e n! n,,... ) p k ( p) k k,,..., {( k () () Problem Solution The joint PMF of N,K is ( p) n p/n k,,...,n P N,K (n, k) n,... o otherwise () For n, the marginal PMF of N is P N (n) n P N,K (n, k) k n ( p) n p/n ( p) n p () k The marginal PMF of K is found by summing P N,K (n, k) over all possible N. NotethatifK k, then N k. Thus, P K (k) n ( p)n p (3) Unfortunately, this sum cannot be simplified. Problem Solution For n,,..., the marginal PMF of N is nk P N (n) k P N,K (n, k) n k n e (n +)! n e n! () For k,,..., the marginal PMF of K is P K (k) nk n e (n +)! nk n+ e (n +)! () P N (n +) nk (3) P [N >k] / (4) Problem 4.4. Solution 33

11 (a) The joint PDF of and is + { c x+ y,x,y f, (x, y) () To find the constant c we integrate over the region shown. This gives x cdydx cx cx c () Therefore c. (b) To find the P [ ] we look to integrate over the area indicated by the graph / x P [ ] dy dx (3) / x ( 4x) dx (4) / (5) (c) The probability P [ + /] can be seen in the figure. Here we can set up the following integrals + / / x P [ + /] dy dx (6) +½ / ( x) dx (7) / /4 /4 (8) Problem 4.4. Solution Given the joint PDF f, (x, y) { cxy x, y () (a) To find the constant c integrate f, (x, y) over the all possible values of and to get Therefore c 6. cxy dx dy c/6 () 34

12 (b) The probability P [ ] is the integral of the joint PDF f, (x, y) over the indicated shaded region. P [ ] x 6xy dy dx (3) x 4 dx (4) /5 (5) Similarly, to find P [ ] we can integrate over the region shown in the figure. P [ ] x 6xy dy dx (6) /4 (7) (c) Here we can choose to either integrate f, (x, y) over the lighter shaded region, which would require the evaluation of two integrals, or we can perform one integral over the darker region by recognizing min(,) <½ min(,) >½ P [min(, ) /] P [min(, ) > /] (8) / / / 9y 4 6xy dx dy (9) dy 3 (d) The probability P [max(, ) 3/4] can be found be integrating over the shaded region shown below. max(,) <¾ P [max(, ) 3/4] P [ 3/4, 3/4] () Problem Solution The joint PDF of and is () 6xy dx dy () ( x )( 3/4 y 3 ) 3/4 (3) (3/4) 5.37 (4) f, (x, y) { 6e (x+3y) x,y,. () 35

13 (a) The probability that is: x P [ ] 6e (x+3y) dy dx () ( e x e 3y ) yx dx (3) y [e x e 5x ] dx 3/5 (4) The P [ + ] is found by integrating over the region where + + P [ + ] x 6e (x+3y) dy dx (5) e x [ e 3y y x y ] dx (6) e x [ e 3( x)] dx (7) e x e x 3 (8) +e 3 3e (9) (b) The event min(, ) is the same as the event {, }. Thus, P [min(, ) ] 6e (x+3y) dy dx e (+3) () (c) The event max(, ) is the same as the event {, } so that P [max(, ) ] 6e (x+3y) dy dx ( e )( e 3 ) () Problem Solution The only difference between this problem and Example 4.5 is that in this problem we must integrate the joint PDF over the regions to find the probabilities. Just as in Example 4.5, there are five cases. We will use variable u and v as dummy variables for x and y. x<ory< y x In this case, the region of integration doesn t overlap the region of nonzero probability and F, (x, y) y x f, (u, v) du dv () 36

14 <y x In this case, the region where the integral has a nonzero contribution is y x F, (x, y) y x y x y v f, (u, v) dy dx () 8uv du dv (3) 4(x v )vdv (4) x v v 4 vy v x y y 4 (5) <x y and x y x F, (x, y) y x x u x f, (u, v) dv du (6) 8uv dv du (7) 4u 3 du x 4 (8) <y andx y x F, (x, y) y x y y v f, (u, v) dv du (9) 8uv du dv () 4v( v ) dv () y y 4 () x andy 37

15 y In this case, the region of integration completely covers the region of nonzero probability and x F, (x, y) y x f, (u, v) du dv (3) (4) The complete answer for the joint CDF is x<ory< x y y 4 <y x F, (x, y) x 4 x y, x y y 4 y,x x,y (5) Problem 4.5. Solution (a) The joint PDF (and the corresponding region of nonzero probability) are f, (x, y) { / x y () - (b) P [ >] x x dy dx dx /4 () This result can be deduced by geometry. The shaded triangle of the, plane corresponding to the event >is/4 of the total shaded area. (c) For x>orx<, f (x). For x, f (x) f, (x, y) dy The complete expression for the marginal PDF is x dy ( x)/. (3) f (x) { ( x)/ x,. (4) 38

16 (d) From the marginal PDF f (x), the expected value of is E [] xf (x) dx x( x) dx (5) x 4 x (6) Problem 4.5. Solution { x + y,x,y f, (x, y) () + Using the figure to the left we can find the marginal PDFs by integrating over the appropriate regions. x { ( x) x f (x) dy () Likewise for f (y): f (y) y dx { ( y) y (3) Problem Solution Random variables and have joint PDF { /(πr f, (x, y) ) x + y r (a) The marginal PDF of is { r x f (x) r x πr dy (b) Similarly, for random variable, r y f (y) r y πr dx { r x πr r x r,. r y πr r y r,. () () (3) Problem Solution The joint PDF of and and the region of nonzero probability are { 5x f, (x, y) / x, y x - We can find the appropriate marginal PDFs by integrating the joint PDF. () 39

17 (a) The marginal PDF of is f (x) x 5x dy { 5x 4 / x () (b) Note that f (y) fory>ory<. For y, f (y) f, (x, y) dx (3) y y 5x dx + 5x dx (4) y - - y y 5( y 3/ )/3 (5) The complete expression for the marginal CDF of is f (y) { 5( y 3/ )/3 y (6) Problem Solution In this problem, the joint PDF is { xy /r 4 x f, (x, y) + y r () (a) Since xy x y, for r x r, we can write f (x) f, (x, y) dy x r 4 r x Since y is symmetric about the origin, we can simplify the integral to f (x) 4 x r 4 r x ydy x r 4 y r x r x y dy () x (r x ) r 4 (3) Note that for x >r, f (x). Hence the complete expression for the PDF of is f (x) { x (r x ) r 4 r x r (4) (b) Note that the joint PDF is symmetric in x and y so that f (y) f (y). Problem Solution 4

18 (a) The joint PDF of and and the region of nonzero probability are f, (x, y) { cy y x () (b) To find the value of the constant, c, we integrate the joint PDF over all x and y. Thus c 6. f, (x, y) dx dy x cy dy dx cx dx cx3 6 c 6. () (c) We can find the CDF F (x) P [ x] by integrating the joint PDF over the event x. For x<, F (x). Forx>, F (x). For x, ( F (x) f, x,y ) dy dx (3) x x x x 6y dy dx (4) x x 3(x ) dx x 3. (5) The complete expression for the joint CDF is x< F (x) x 3 x (6) x (d) Similarly, we find the CDF of by integrating f, (x, y) over the event y. For y<, F (y) and for y>, F (y). For y, ( F (y) f, x,y ) dy dx (7) y y y 6y y dx dy (8) y y 6y ( y ) dy (9) 3(y ) (y ) 3 y 3y y 3. () The complete expression for the CDF of is y< F (y) 3y y 3 y () y> 4

19 (e) To find P [ /], we integrate the joint PDF f, (x, y) over the region y x/. ½ P [ /] x/ 6ydydx () 3y x/ dx (3) 3x dx /4 (4) 4 Problem 4.6. Solution In this problem, it is helpful to label possible points, along with the corresponding values of W. From the statement of Problem 4.6., y 4 P, (x, y) 3 W 3/8 W 6/8 W /8 W /8 W /8 W 3 4/8 3 4 x (a) To find the PMF of W, we simply add the probabilities associated with each possible value of W : P W ( ) P, (, 3) 3/8 P W ( ) P, (, 3) 6/8 () P W () P, (, ) /8 P W () P, (, ) + P, (4, 3) () P W (3) P, (4, ) 4/8 4/8 (3) Forallothervaluesofw, P W (w). (b) The expected value of W is E [W ] w wp W (w) (4) (c) P [W >] P W () + P W (3) 8/8. (3/8) + (6/8) + (/8) + (4/8) + 3(4/8) / (5) 4

20 Problem 4.6. Solution c W c W 3c W 4 y P, (x, y) c 3c W W 4 c W x c c W W In Problem 4.., the joint PMF P, (x, y) is given in terms of the parameter c. For this problem, we first need to find c. Before doing so, it is convenient to label each possible, point with the corresponding value of W +. To find c, we sum the PMF over all possible values of and. We choose c so the sum equals one. P, (x, y) c x + y () x y x,, y,, 6c +c +6c 4c () Thus c /4. Now we can solve the actual problem. (a) From the above graph, we can calculate the probability of each possible value of w. P W ( 4) P, (, ) 3c (3) P W ( ) P, (, ) + P, (, ) 3c (4) P W () P, (, ) + P, (, ) c (5) P W () P, (, ) + P, (, ) 3c (6) P W (4) P, (, ) 3c (7) With c /4, we can summarize the PMF as 3/4 w 4,,, 4 P W (w) /4 w (8) (b) The expected value is now straightforward: (c) Lastly, P [W >] P W () + P W (4) 3/7. E [W ] 3 4 ( )+. (9) 4 Problem Solution We observe that when x, wemusthave w x in order for W w. Thatis, P W (w) x P [ x, w x] x P, (x, w x) () 43

21 Problem Solution w W>w w To find the PMF of W, we observe that for w,...,, The x, y pairs with nonzero probability are shown in the figure. For w,,...,, we observe that P [W >w]p [min(, ) >w] () P [ >w, >w] ().( w) (3) P W (w) P [W >w ] P [W >w] (4).[( w ) ( w) ].( w) (5) The complete expression for the PMF of W is P W (w) {.( w) w,,..., (6) Problem Solution v V<v v To find the PMF of V, we observe that for v,...,, The complete expression for the PMF of V is The x, y pairs with nonzero probability are shown in the figure. For v,...,, we observe that P [V <v]p [max(, ) <v] () P [ <v, <v] ().(v ) (3) P V (v) P [V <v+] P [V <v] (4).[v (v ) ] (5).(v ) (6) P V (v) {.(v ) v,,..., (7) 44

22 Problem Solution (a) The minimum value of W is W, which occurs when and. The maximum value of W is W, which occurs when or. The range of W is S W {w w }. (b) For w, the CDF of W is F w W<w W (w) P [max(, ) w] () P [ w, w] () w w f, (x, y) dy dx (3) w Substituting f, (x, y) x + y yields w w F W (w) (x + y) dy dx (4) ( w ) xy + y yw w dx (wx + w /) dx w 3 (5) y The complete expression for the CDF is w< F W (w) w 3 w otherwise (6) The PDF of W is found by differentiating the CDF. f W (w) df W (w) dw { 3w w (7) Problem Solution (a) Since the joint PDF f, (x, y) is nonzero only for y x, we observe that W since. In addition, the most negative value of W occurs when and andw. Hence the range of W is S W {w w }. (b) For w<, F W (w). Forw>, F W (w). For w, the CDF of W is ½ +w -w F W (w) P [ w] () w w x+w 6ydydx () 3(x + w) dx (3) (x + w) 3 w (+w)3 (4) 45

23 Therefore, the complete CDF of W is w< F W (w) ( + w) 3 w w> (5) By taking the derivative of f W (w) withrespecttow, we obtain the PDF f W (w) { 3(w +) w (6) Problem Solution Random variables and have joint PDF f, (x, y) { y x () (a) Since and are both nonnegative, W /. Since, W /. Note that W can occur if. Thus the range of W is S W {w w }. (b) For w, the CDF of W is F W (w) P [/ w] P [ w] w () The complete expression for the CDF is w P[<w] w< F W (w) w w< w By taking the derivative of the CDF, we find that the PDF of W is { w< f W (w) (3) (4) We see that W has a uniform PDF over [, ]. Thus E[W ]/. Problem Solution Random variables and have joint PDF f, (x, y) { y x () 46

24 (a) Since f, (x, y) fory>x, we can conclude that and that W /. Since can be arbitrarily small but positive, W can be arbitrarily large. Hence the range of W is S W {w w }. (b) For w, the CDF of W is P[</w] /w F W (w) P [/ w] () P [/ > w] (3) P [ < /w] (4) /w (5) Note that we have used the fact that P [ < /w]equals/ times the area of the corresponding triangle. The complete CDF is { w< F W (w) /w w (6) The PDF of W is found by differentiating the CDF. f W (w) df { W (w) /w w dw (7) To find the expected value E[W ], we write dw E [W ] wf W (w) dw w. (8) However, the integral diverges and E[W ] is undefined. Problem 4.6. Solution The position of the mobile phone is equally likely to be anywhere in the area of a circle with radius 6 km. Let and denote the position of the mobile. Since we are given that the cell has a radius of 4 km, we will measure and in kilometers. Assuming the base station is at the origin of the, plane, the joint PDF of and is { f, (x, y) 6π x + y 6 () Since the mobile s radial distance from the base station is R +,thecdfofr is F R (r) P [R r] P [ + r ] () By changing to polar coordinates, we see that for r 4km, π r r F R (r) 6π dr dθ r /6 (3) So r< F R (r) r /6 r<4 (4) r 4 Then by taking the derivative with respect to r we arrive at the PDF { r/8 r 4 f R (r) (5) 47

25 Problem 4.6. Solution Following the hint, we observe that either or, or, equivalently, (/) or (/ ). Hence, W. To find the CDF F W (w), we know that F W (w) forw<. For w, we solve F W (w) P [max[(/ ), (/)] w] P [(/ ) w, (/) w] P [ /w, w] P [/w w] w a/w a /w a/w a This implies We note that in the middle of the above steps, nonnegativity of and was essential. We can depict the given set {/w w} as the dark region on the, plane. Because the PDF is uniform over the square, it is easier to use geometry to calculate the probability. In particular, each of the lighter triangles that are not part of the region of interest has area a /w. The final expression for the CDF of W is P [/w w] a /w + a /w a F W (w) By taking the derivative, we obtain the PDF f W (w) { w< /w w { w< /w w w () () (3) Problem 4.7. Solution 4 3 y P, (x, y) 3/8 6/8 /8 /8 / /8 x In Problem 4.., we found the joint PMF P, (x, y) as shown. Also the expected values and variances were E [] 3 Var[] /7 () E [ ]5/ Var[ ]3/4 () We use these results now to solve this problem. (a) Random variable W / has expected value E [/] y x P, (x, y) (3) x,,4 y, /4 (4)

26 (b) The correlation of and is r, xyp, (x, y) (5) x,,4 y, (6) /8 5/ (7) Recognizing that P, (x, y) xy/8 yields the faster calculation r, E [ ] x,,4 y,3 8 x,,4 (xy) 8 x y,3 (8) y (9) 8 ( + +4 )( +3 ) 8 5 () (c) The covariance of and is Cov [, ]E [ ] E [] E [ ] 5 35 () (d) Since and have zero covariance, the correlation coefficient is ρ, Cov [, ] Var[]Var[ ]. () (e) Since and are uncorrelated, the variance of + is Var[ + ]Var[]+Var[ ] 6 8. (3) Problem 4.7. Solution P, (x, y) /4 y /4 3/4 In Problem 4.., we found the joint PMF P, (x, y) shown here. The expected values and variances were found to be /4 3/4 /4 /4 /4 x E [] Var[] 4/7 () E [ ] Var[ ]5/7 () We need these results to solve this problem. 49

27 (a) Random variable W has expected value E [ ] xy P, (x, y) (3) x,, y,, ( ) () 4 + () 4 +( ) 4 +() 4 (4) + ( ) 4 +() 4 +() 4 (5) 6/8 (6) (b) The correlation of and is r, x,, y,, (c) The covariance of and is (d) The correlation coefficient is xyp, (x, y) (7) ( )(3) + ()() + ()() + ( )() + ()() + ()(3) (8) 4/7 (9) Cov [, ]E [ ] E [] E [ ]4/7 () ρ, Cov [, ] () Var[]Var[ ] 3 (e) By Theorem 4.6, Var [ + ]Var[]+Var[ ]+Cov[, ] () (3) Problem Solution In the solution to Quiz 4.3, the joint PMF and the marginal PMFs are P H,B (h, b) b b b 4 P H (h) h.4..6 h... h... P B (b)..5.3 () From the joint PMF, the correlation coefficient is r H,B E [HB] hbp H,B (h, b) () h b,,4 ()(.4) + ()(.) + (4)(.) + (4)() (3).4 (4) 5

28 since only terms in which both h and b are nonzero make a contribution. PMFs, the expected values of and are Using the marginal E [H] hp H (h) (.6) + (.) + (.). (5) E [B] h b,,4 bp B (b) (.) + (.5) + 4(.3). (6) The covariance is Cov [H, B] E [HB] E [H] E [B].4 (.)(.).96 (7) Problem Solution From the joint PMF, P (x), found in Example 4.3, we can find the marginal PMF for or by summing over the columns or rows of the joint PMF. P (y) (a) The expected values are 5/48 y 3/48 y 7/48 y 3 3/48 y 4 E [ ] E [] 4 y P (x) { /4 x,, 3, 4 yp (y) /4 () 48 4 xp (x) (++3+4)5/ (3) 4 x () (b) To find the variances, we first find the second moments. E [ ] E [ ] Now the variances are 4 y y P (y) / (4) 48 4 x P (x) ( ) 5/ (5) 4 x Var[ ]E [ ] (E [ ]) 47/ (7/4) 4/48 (6) Var[] E [ ] (E []) 5/ (5/) 5/4 (7) 5

29 (c) To find the correlation, we evaluate the product over all values of and. Specifically, r, E [ ] (d) The covariance of and is (e) The correlation coefficient is 4 x y x xyp, (x, y) (8) (9) 5 () Cov [, ]E [ ] E [] E [ ]5 (7/4)(/4) /6 () ρ, Cov [W, V ] Var[W ]Var[V ] /6 (4/48)(5/4).65 () Problem Solution For integers x 5, the marginal PMF of is P (x) y P, (x, y) x (/) x + y () Similarly, for integers y 5, the marginal PMF of is P (y) x P, (x, y) 5 (/) 6 y xy () The complete expressions for the marginal PMFs are { (x +)/ x,...,5 P (x) { (6 y)/ y,...,5 P (y) (3) (4) The expected values are E [] 5 x x x E [ ] 5 y y 6 y (5) To find the covariance, we first find the correlation E [ ] 5 x x y The covariance of and is xy 5 x x x y y 4 5 x x (x +) Cov [, ]E [ ] E [] E [ ] (6) (7) 5

30 Problem Solution 4 3 y P, (x, y) /8 W V / W 3 V 3 / W V 3 /6 W 4 V 4 /6 W 3 V 4 /6 W V 4 To solve this problem, we identify the values of W min(, )and V max(, ) for each possible pair x, y. Here we observe that W and V. This is a result of the underlying experiment in that given x, each {,,...,x} is equally likely. Hence. This implies min(, ) and max(, ). /4 W V /8 W V / W V 3 /6 W V x Using the results from Problem 4.7.4, we have the following answers. (a) The expected values are E [W ]E [ ]7/4 E [V ]E [] 5/ () (b) The variances are (c) The correlation is Var[W ]Var[ ]4/48 Var[V ]Var[] 5/4 () r W,V E [WV]E [ ]r, 5 (3) (d) The covariance of W and V is (e) The correlation coefficient is Cov [W, V ]Cov[, ]/6 (4) ρ W,V ρ, /6 (4/48)(5/4).65 (5) Problem Solution First, we observe that has mean μ aμ + b and variance Var[ ]a Var[]. The covariance of and is The correlation coefficient is ρ, Cov [, ]E[( μ )(a + b aμ b)] () ae [ ( μ ) ] () a Var[] (3) Cov [, ] Var[] Var[ ] When a>, ρ,. Whena<, ρ,. 53 a Var[] Var[] a Var[] a a (4)

31 Problem Solution The joint PDF of and is { (x + y)/3 x, y f, (x, y) Before calculating moments, we first find the marginal PDFs of and.for x, x + y f (x) f, (x, y) dy dy xy y y x + 6 y 3 For y, ( x f (y) f, (x, y) dx 3 + y dx 3) x 6 + xy x y + 3 x 6 Complete expressions for the marginal PDFs are { x+ { f (x) 3 x y+ f (y) 6 y (a) The expected value of is x + E [] xf (x) dx x dx x x The second moment of is E [ ] x f (x) dx x + x dx x x3 9 Thevarianceof is Var[] E[ ] (E[]) 7/8 (5/9) 3/6. (b) The expected value of is E [ ] yf (y) dy y + y dy y 6 + y3 9 The second moment of is E [ ] y y + f (y) dy y dy y y4 Thevarianceof is Var[ ]E[ ] (E[ ]) 3/ () () (3) (4) (5) (6) (7) (8) (c) The correlation of and is E [ ] xyf, (x, y) dx dy (9) ( ) x + y xy dy dx () 3 ( x y ) + xy3 y 6 9 dx () y ( x 3 + 8x ) dx x x 9 () 3 ThecovarianceisCov[, ] E[ ] E[]E[ ] /8. 54

32 (d) The expected value of and is (e) By Theorem 4.5, E [ + ]E []+E [ ]5/9+/9 6/9 (3) Var[ + ]Var[]+Var[ ]+Cov[, ] (4) Problem Solution (a) The first moment of is E [] The second moment of is E [ ] 4x ydydx 4x 3 ydydx x dx 3 x 3 dx () () Thevarianceof is Var[] E[ ] (E[]) / (/3) /8. (b) The mean of is The second moment of is E [ ] E [ ] 4xy dy dx 4xy 3 dy dx 4x 3 dx 3 xdx (3) (4) Thevarianceof is Var[ ]E[ ] (E[ ]) / (/3) /8. (c) To find the covariance, we first find the correlation E [ ] Thecovarianceisthus 4x y dy dx 4x 3 dx 4 9 (5) Cov [, ]E [ ] E [] E [ ] 4 9 ( 3) (6) (d) E[ + ]E[]+E[ ]/3+/3 4/3. (e) By Theorem 4.5, the variance of + is Var[]+Var[ ]+Cov[, ]/8 + /8 + /9 (7) 55

33 Problem 4.7. Solution The joint PDF of and and the region of nonzero probability are - { 5x f, (x, y) / x, y x () (a) The first moment of is E [] x x 5x dy dx 5x 5 dx 5x6 Since E[], the variance of and the second moment are both Var[] E [ ] x x 5x 5x7 dy dx 4 () 4 (3) (b) The first and second moments of are E [ ] E [ ] x Therefore, Var[ ]5/6 (5/4).576. y 5x x y 5x dy dx 5 4 dy dx 5 6 (4) (5) (c) Since E[], Cov[, ] E[ ] E[]E[ ] E[ ]. Thus, Cov [, ]E [ ] x xy 5x dy dx 5x 7 dx (6) 4 (d) The expected value of the sum + is E [ + ]E []+E [ ] 5 4 (7) (e) By Theorem 4.5, the variance of + is Var[ + ]Var[]+Var[ ]+Cov[, ]5/ (8) Problem 4.7. Solution Random variables and have joint PDF f, (x, y) { y x () 56

34 Before finding moments, it is helpful to first find the marginal PDFs. For x, f (x) f, (x, y) dy Note that f (x) forx<orx>. For y, f (y) f, (x, y) dx y x dy x () dx ( y) (3) Also, for y<ory>, f (y). Complete expressions for the marginal PDFs are f (x) { { x x ( y) y f (y) (4) (a) The first two moments of are E [] E [ ] xf (x) dx x f (x) dx Thevarianceof is Var[] E[ ] (E[]) / 4/9 /8. x dx /3 (5) x 3 dx / (6) (b) The expected value and second moment of are E [ ] yf (y) dy y( y) dy y y3 3 /3 (7) E [ ] y f (y) dy y ( y) dy y3 3 y4 /6 (8) Thevarianceof is Var[ ]E[ ] (E[ ]) /6 /9 /8. (c) Before finding the covariance, we find the correlation Thecovarianceis E [ ] x xy dy dx x 3 dx /4 (9) Cov [, ]E [ ] E [] E [ ]/36. () (d) E[ + ]E[]+E[ ]/3+/3 (e) By Theorem 4.5, Var[ + ]Var[]+Var[ ]+Cov[, ]/6. () 57

35 Problem 4.7. Solution - Random variables and have joint PDF { / x y f, (x, y) The region of possible pairs (x, y) is shown with the joint PDF. The rest of this problem is just calculus. xy E [ ] x dy dx x( x ) dx x 4 8 x4 6 () E [ e + ] ex e y dy dx (3) x e x (e e x ) dx (4) e+x 4 ex e 4 + e 4 () (5) Problem Solution For this problem, calculating the marginal PMF of K is not easy. However, the marginal PMF of N is easy to find. For n,,..., P N (n) n k ( p) n p n ( p) n p () That is, N has a geometric PMF. From Appendix A, we note that E [N] p Var[N] p p () We can use these facts to find the second moment of N. E [ N ] Var[N]+(E [N]) p p (3) Now we can calculate the moments of K. E [K] Since n k k n(n +)/, E [K] n n k n k ( p)n p n n n + ( p) n p E ( p) n p n [ ] N + n k (4) k p + (5) We now can calculate the sum of the moments. E [N + K] E [N]+E [K] 3 p + (6) 58

36 The second moment of K is E [ K ] n n k k ( p)n p n n ( p) n p n n k (7) k Using the identity n k k n(n + )(n +)/6, we obtain E [ K ] n (n + )(n +) ( p) n p E 6 [ ] (N + )(N +) 6 (8) Applying the values of E[N] ande[n ] found above, we find that E [ K ] E [ N ] 3 + E [N] + 6 3p + 6p + 6 (9) Thus, we can calculate the variance of K. Var[K] E [ K ] (E [K]) 5 p 3p + 5 () To find the correlation of N and K, E [NK] n n k nk ( p)n p n ( p) n p n k n k () Since n k k n(n +)/, Finally, the covariance is E [NK] n n(n +) ( p) n p E [ ] N(N +) p () Cov [N,K] E [NK] E [N] E [K] p p (3) Problem 4.8. Solution The event A occurs iff >5and > 5 and has probability From Theorem 4.9, P [A] P [ >5, >5] x6 y6..5 () { P, (x,y) P [A] (x, y) A P, A (x, y) {.4 x 6,...,; y 6,..., () (3) 59

37 Problem 4.8. Solution The event B occurs iff 5and 5 and has probability From Theorem 4.9, P [B] P [ 5, 5] 5 x y 5..5 () { P, (x,y) P [B] (x, y) A P, B (x, y) {.4 x,...,5; y,...,5 () (3) Problem Solution Given the event A { + }, we wish to find f, A (x, y). First we find So then P [A] x { f, A (x, y) 6e (x+3y) dy dx 3e +e 3 () 6e (x+3y) 3e +e 3 x + y,x,y () Problem Solution First we observe that for n,,..., the marginal PMF of N satisfies P N (n) n P N,K (n, k) ( p) n p k Thus, the event B has probability P [B] From Theorem 4.9, n k n ( p)n p () P N (n) ( p) 9 p[ + ( p)+( p) + ]( p) 9 () n { PN,K (n,k) P [B] n, k B P N,K B (n, k) { ( p) n p/n n,,...; k,...,n (3) (4) The conditional PMF P N B (n b) could be found directly from P N (n) using Theorem.7. However, we can also find it just by summing the conditional joint PMF. P N B (n) n P N,K B (n, k) k { ( p) n p n,,... (5) 6

38 From the conditional PMF P N B (n), we can calculate directly the conditional moments of N given B. Instead, however, we observe that given B, N N 9 has a geometric PMF with mean /p. That is, for n,,..., P N B (n) P [N n +9 B] P N B (n +9)( p) n p (6) Hence, given B, N N + 9 and we can calculate the conditional expectations E [N B] E [ N +9 B ] E [ N B ] +9/p +9 (7) Var[N B] Var[N +9 B] Var[N B] ( p)/p (8) Note that further along in the problem we will need E[N B] which we now calculate. E [ N B ] Var[N B]+(E [N B]) (9) p + 7 p + 8 () For the conditional moments of K, we work directly with the conditional PMF P N,K B (n, k). E [K B] Since n k k n(n +)/, E [K B] n n k n k ( p)n p n n We now can calculate the conditional expectation of the sum. ( p) n p n n k () k n + ( p) n p E [N + B] p + 5 () E [N + K B] E [N B]+E[K B] /p +9+/(p) (3) p The conditional second moment of K is E [ K B ] n n k k ( p)n p n n ( p) n p n n k (4) k Using the identity n k k n(n + )(n +)/6, we obtain E [ K B ] n (n + )(n +) ( p) n p E [(N + )(N +) B] (5) 6 6 Applying the values of E[N B] ande[n B] found above, we find that E [ K B ] E [ N B ] + E [N B] p p +3 3 (6) Thus, we can calculate the conditional variance of K. Var[K B] E [ K B ] (E [K B]) 5 p 7 6p +6 3 (7) 6

39 To find the conditional correlation of N and K, n E [NK B] nk ( p)n p n Since n k k n(n +)/, E [NK B] n n k ( p) n p n n k (8) n(n +) ( p) n p E [N(N +) B] p (9) p Problem Solution The joint PDF of and is { (x + y)/3 x, y f, (x, y) () (a) The probability that is P [A] P [ ] f, (x, y) dx dy () y x + y 3 k dy dx (3) ) 3 + y y 6 dx (4) y dx x 6 + x 6 (5) 3 ( xy x + 6 (b) By Definition 4., the conditional joint PDF of and given A is f, A (x, y) { f, (x,y) P [A] (x, y) A { x + y x, y (6) From f, A (x, y), we find the conditional marginal PDF f A (x). For x, f A (x) f, A (x, y) dy (x + y) dy xy + y y x + (7) y The complete expression is { x +/ x f A (x) (8) For y, the conditional marginal PDF of is f A (y) f, A (x, y) dx (x + y) dx x x + xy y +/ (9) x The complete expression is { y +/ y f A (y) () 6

40 Problem Solution Random variables and have joint PDF { (4x +y)/3 x, y f, (x, y) () (a) The probability of event A { /} is P [A] f, (x, y) dy dx y / / 4x +y 3 dy dx. () With some calculus, P [A] 4xy + y 3 y/ y dx x +/4 3 dx x 3 + x 5. (3) (b) The conditional joint PDF of and given A is { f, (x,y) P [A] (x, y) A f, A (x, y) { 8(x + y)/5 x, y / (4) (5) For x, the PDF of given A is f A (x) The complete expression is f, A (x, y) dy / (x + y) dy (6) (xy + y ) y/ y 8x + 5 (7) f A (x) { (8x +)/5 x (8) For y /, the conditional marginal PDF of given A is f A (y) The complete expression is f, A (x, y) dx 8 5 8x +8xy 5 (x + y) dx (9) x x 8y +8 5 () f A (y) { (8y +8)/5 y / () 63

41 Problem Solution (a) The event A { /4} has probability ¼ - -½ This implies </4 ½ P [A] / x + / /4 / { f, (x, y) /P [A] (x, y) A 5x dy dx () 5x dy dx 5x 4 5x dx + dx / 4 () x 5 / +5x 3 / / (3) 9/48 (4) f, A (x, y) { x /9 x, y x,y /4 (5) (6) (b) f A (y) x { 8 f, A (x, y) dx dx 9 ( y3/ ) y /4 y 9 (c) The conditional expectation of given A is ( ) /4 E [ A] y 8 9 ( y3/ ) dy 8 y 9 y7/ 7 / (7) (8) (d) To find f A (x), we can write f A (x) f, A(x, y) dy. However, when we substitute f, A (x, y), the limits will depend on the value of x. When x /, f A (x) When x / or/ x, x /4 x 9 dy x4 9 x f A (x) dy 3x 9 9 The complete expression for the conditional PDF of given A is 3x /9 x / x f A (x) 4 /9 / x / 3x /9 / x (9) () () 64

42 (e) The conditional mean of given A is E [ A] / 3x 3 / 9 dx + x 5 3x 3 dx + dx () / 9 / 9 Problem 4.9. Solution The main part of this problem is just interpreting the problem statement. necessary. Since a trip is equally likely to last, 3 or 4 days, { /3 d, 3, 4 P D (d) No calculations are () Given a trip lasts d days, the weight change is equally likely to be any value between d and d pounds. Thus, { /(d +) w d, d +,...,d P W D (w d) () The joint PMF is simply P D,W (d, w) P W D (w d) P D (d) (3) { /(6d +3) d, 3, 4; w d,...,d (4) Problem 4.9. Solution We can make a table of the possible outcomes and the corresponding values of W and outcome P [ ] W hh p ht p( p) th p( p) tt ( p) () In the following table, we write the joint PMF P W, (w, y) along with the marginal PMFs P (y) and P W (w). P W, (w, y) w w w P (y) y ( p) ( p) y p( p) p( p) p( p) () y p p P W (w) p( p) p +p p( p) Using the definition P W (w y) P W, (w, y)/p (y), we can find the conditional PMFs of W given. { w P W (w ) { w P W (w ) P W (w ) { / w, (3) (4) 65

43 Similarly, the conditional PMFs of given W are P W (y ) P W (y ) { y { y ( p) y p+p P W (y ) p y p+p (5) (6) Problem Solution f, (x, y) { (x + y) x, y () (a) The conditional PDF f (x y) is defined for all y such that y. For y, f (x) f, (x, y) f (x) { (x + y) (x+y) x+/ x (x + y) dy () (b) The conditional PDF f (y x) is defined for all values of x in the interval [, ]. For x, f (y) f, (x, y) f (y) { (x + y) (x+y) y+/ y (x + y) dx (3) Problem Solution Random variables and have joint PDF f, (x, y) { y x () For y, f (y) f, (x, y) dx y dx ( y) () Also, for y<ory>, f (y). The complete expression for the marginal PDF is f (y) { ( y) y (3) By Theorem 4.4, the conditional PDF of given is f (x y) f, (x, y) f (y) { y y x (4) 66

44 That is, since, is uniform over [y, ] when y. The conditional expectation of given y can be calculated as E [ y] y xf (x y) dx (5) ( y) x y dx x y +y In fact, since we know that the conditional PDF of is uniform over [y, ] when y, itwasn t really necessary to perform the calculation. (6) Problem Solution Random variables and have joint PDF f, (x, y) { y x () For x, the marginal PDF for satisfies x f (x) f, (x, y) dy dy x () Note that f (x) forx<orx>. Hence the complete expression for the marginal PDF of is { x x f (x) (3) The conditional PDF of given x is f (y x) f, (x, y) f (x) { /x y x (4) Given x, has a uniform PDF over [,x] and thus has conditional expected value E[ x] x/. Another way to obtain this result is to calculate yf (y x) dy. Problem Solution We are told in the problem statement that if we know r, the number of feet a student sits from the blackboard, then we also know that that student s grade is a Gaussian random variable with mean 8 r and standard deviation r. Thisisexactly f R (x r) πr e (x [8 r]) /r () Problem Solution 67

45 (a) First we observe that A takes on the values S A {, } while B takes on values from S B {, }. To construct a table describing P A,B (a, b) we build a table for all possible values of pairs (A, B). The general form of the entries is P A,B (a, b) b b a P B A ( ) P A ( ) P B A ( ) P A ( ) a P B A ( ) P A () P B A ( ) P A () () Now we fill in the entries using the conditional PMFs P B A (b a) and the marginal PMF P A (a). This yields P A,B (a, b) b b a (/3)(/3) (/3)(/3) a (/)(/3) (/)(/3) which simplifies to P A,B (a, b) b b a /9 /9 a /3 /3 () (b) Since P A () P A,B (, ) + P A,B (, ) /3, P B A (b ) P A,B (,b) P A () If A, the conditional expectation of B is { / b,,. (3) E [B A ] bp B A (b ) P B A ( ) /. (4) b (c) Before finding the conditional PMF P A B (a ), we first sum the columns of the joint PMF table to find { 4/9 b P B (b) (5) 5/9 b The conditional PMF of A given B is P A B (a ) P A,B (a, ) P B () { /5 a 3/5 a (6) (d) Now that we have the conditional PMF P A B (a ), calculating conditional expectations is easy. E [A B ] ap A B (a ) (/5) + (3/5) /5 (7) E [ A B ] a, a, a P A B (a ) /5+3/5 (8) The conditional variance is then Var[A B ]E [ A B ] (E [A B ]) (/5) 4/5 (9) 68

46 (e) To calculate the covariance, we need E [A] ap A (a) (/3) + (/3) /3 () E [B] E [AB] a, bp B (b) (4/9) + (5/9) 5/9 () b a, b abp A,B (a, b) () ()(/9) + ()(/9) + ()(/3) + ()(/3) /9 (3) Thecovarianceisjust Cov [A, B] E [AB] E [A] E [B] /9 (/3)(5/9) /7 (4) Problem Solution First we need to find the conditional expectations E [B A ] E [B A ] bp B A (b ) (/3) + (/3) /3 () b bp B A (b ) (/) + (/) / () b Keep in mind that E[B A] is a random variable that is a function of A. that is we can write E [B A] g(a) { /3 A / A (3) We see that the range of U is S U {/, /3}. Inparticular, The complete PMF of U is P U (/) P A () /3 P U (/3) P A ( ) /3 (4) P U (u) Note that E [E [B A]] E [U] u ou can check that E[U] E[B]. { /3 u / /3 u /3 (5) up U (u) (/)(/3) + (/3)(/3) 5/9 (6) Problem Solution Random variables N and K have the joint PMF n e k,,...,n; P N,K (n, k) (n+)! n,,... () 69

47 We can find the marginal PMF for N by summing over all possible K. Forn, P N (n) n k n e (n +)! n e n! () We see that N has a Poisson PMF with expected value. For n, the conditional PMF of K given N n is P K N (k n) P { N,K (n, k) /(n +) k,,...,n (3) P N (n) That is, given N n, K has a discrete uniform PMF over {,,...,n}. Thus, E [K N n] n k/(n +)n/ (4) k We can conclude that E[K N] N/. Thus, by Theorem 4.5, E [K] E [E [K N]] E [N/] 5. (5) Problem 4.9. Solution This problem is fairly easy when we use conditional PMF s. In particular, given that N n pizzas were sold before noon, each of those pizzas has mushrooms with probability /3. The conditional PMF of M given N is the binomial distribution ( n ) P M N (m n) { m (/3) m (/3) n m m,,...,n () The other fact we know is that for each of the pizzas sold, the pizza is sold before noon with probability /. Hence, N has the binomial PMF P N (n) ) (/) n (/) n n,,..., {( n () The joint PMF of N and M is for integers m, n, P M,N (m, n) P M N (m n) P N (n) (3) {( n )( ) m n (/3) m (/3) n m (/) m n (4) Problem 4.9. Solution Random variables and have joint PDF - f, (x, y) { / x y () 7

48 (a) For y, the marginal PDF of is f (y) y f, (x, y) dx The complete expression for the marginal PDF of is dx (y +)/ () f (y) { (y +)/ y (3) (b) The conditional PDF of given is f (x y) f, (x, y) f (y) { +y x y (4) (c) Given y, the conditional PDF of is uniform over [,y]. expected value is E[ y] (y )/. Hence the conditional Problem 4.9. Solution We are given that the joint PDF of and is { /(πr f, (x, y) ) x + y r (a) The marginal PDF of is { r x f (x) πr dy r x πr r x r () () The conditional PDF of given is f (y x) f, (x, y) f (x) { /( r x ) y r x (3) (b) Given x, we observe that over the interval [ r x, r x ], has a uniform PDF. Since the conditional PDF f (y x) is symmetric about y, E [ x] (4) Problem Solution The key to solving this problem is to find the joint PMF of M and N. Note that N M. For n>m, the joint event {M m, N n} has probability m n m calls calls {}}{{}}{ P [M m, N n] P [ dd dv dd dv] () ( p) m p( p) n m p () ( p) n p (3) 7

49 A complete expression for the joint PMF of M and N is { ( p) P M,N (m, n) n p m,,...,n ; n m +,m+,... (4) The marginal PMF of N satisfies n P N (n) ( p) n p (n )( p) n p, n, 3,... (5) m Similarly, for m,,..., the marginal PMF of M satisfies P M (m) ( p) n p (6) nm+ p [( p) m +( p) m + ] (7) ( p) m p (8) The complete expressions for the marginal PMF s are { ( p) P M (m) m p m,,... { (n )( p) P N (n) n p n, 3,... (9) () Not surprisingly, if we view each voice call as a successful Bernoulli trial, M has a geometric PMF since it is the number of trials up to and including the first success. Also, N has a Pascal PMF since it is the number of trials required to see successes. The conditional PMF s are now easy to find. P N M (n m) P { M,N (m, n) ( p) n m p n m +,m+,... () P M (m) The interpretation of the conditional PMF of N given M is that given M m, N m + N where N has a geometric PMF with mean /p. The conditional PMF of M given N is P M N (m n) P M,N (m, n) P N (n) { /(n ) m,...,n () Given that call N n was the second voice call, the first voice call is equally likely to occur in any of the previous n calls. Problem Solution (a) The number of buses, N, must be greater than zero. Also, the number of minutes that pass cannot be less than the number of buses. Thus, P [N n, T t] > for integers n, t satisfying n t. 7

50 (b) First, we find the joint PMF of N and T by carefully considering the possible sample paths. In particular, P N,T (n, t) P [ABC] P [A]P [B]P [C] where the events A, B and C are A {n buses arrive in the first t minutes} () B {none of the first n buses are boarded} () C {at time t a bus arrives and is boarded} (3) These events are independent since each trial to board a bus is independent of when the buses arrive. These events have probabilities ( ) t P [A] p n ( p) t (n ) (4) n P [B] ( q) n (5) P [C] pq (6) Consequently, the joint PMF of N and T is { ( t P N,T (n, t) n ) p n ( p) t n ( q) n pq n,t n (7) (c) It is possible to find the marginal PMF s by summing the joint PMF. However, it is much easier to obtain the marginal PMFs by consideration of the experiment. Specifically, when a bus arrives, it is boarded with probability q. Moreover, the experiment ends when a bus is boarded. By viewing whether each arriving bus is boarded as an independent trial, N is the number of trials until the first success. Thus, N has the geometric PMF { ( q) P N (n) n q n,,... (8) To find the PMF of T, suppose we regard each minute as an independent trial in which a success occurs if a bus arrives and that bus is boarded. In this case, the success probability is pq and T is the number of minutes up to and including the first success. The PMF of T is also geometric. P N T (n t) P N,T (n, t) P T (t) P T (t) { ( pq) t pq t,,... (d) Once we have the marginal PMFs, the conditional PMFs are easy to find. { ( t ) ( p( q) n pq ) n ( ) t (n ) p pq n,,...,t That is, given you depart at time T t, the number of buses that arrive during minutes,...,t has a binomial PMF since in each minute a bus arrives with probability p. Similarly, the conditional PMF of T given N is P T N (t n) P N,T (n, t) P N (n) { ( t n ) p n ( p) t n t n, n +,... This result can be explained. Given that you board bus N n, thetimet when you leave is the time for n buses to arrive. If we view each bus arrival as a success of an independent trial, the time for n buses to arrive has the above Pascal PMF. (9) () () 73

51 Problem Solution If you construct a tree describing what type of call (if any) that arrived in any millisecond period, it will be apparent that a fax call arrives with probability α pqr or no fax arrives with probability α. That is, whether a fax message arrives each millisecond is a Bernoulli trial with success probability α. Thus, the time required for the first success has the geometric PMF { ( α) P T (t) t α t,,... () Note that N is the number of trials required to observe successes. Moreover, the number of trials needed to observe successes is N T + N where N is the number of trials needed to observe successes through. Since N is just the number of trials needed to observe 99 successes, it has the Pascal (k 99,p)PMF ( n P N (n) 98 ) α 99 ( α) n 99. () Since the trials needed to generate successes though are independent of the trials that yield the first success, N and T are independent. Hence P N T (n t) P N T (n t t) P N (n t). (3) Applying the PMF of N found above, we have ( ) n t P N T (n t) α 99 ( α) n t 99. (4) 98 Finally the joint PMF of N and T is P N,T (n, t) P N T (n t) P T (t) (5) {( n t ) 98 α ( α) n t,,...; n 99+t, + t,... (6) This solution can also be found a consideration of the sample sequence of Bernoulli trials in which we either observe or do not observe a fax message. To find the conditional PMF P T N (t n), we first must recognize that N is simply the number of trials needed to observe successes and thus has the Pascal PMF ( ) n P N (n) α ( α) n (7) 99 Hence for any integer n, the conditional PMF is P T N (t n) P N,T (n, t) ( n t 98 ) t,,...,n 99 ( n 99 ) (8) P N (n). Problem 4.. Solution Flip a fair coin times and let be the number of heads in the first 75 flips and be the number of heads in the last 5 flips. We know that and are independent and can find their PMFs easily. P (x) ( ) 75 (/) 75 P (y) x ( ) 5 (/) 5 () y 74

52 The joint PMF of and N can be expressed as the product of the marginal PMFs because we know that and are independent. ( )( ) 75 5 P, (x, y) (/) () x y Problem 4.. Solution Using the following probability model P (k) P (k) We can calculate the requested moments. 3/4 k /4 k E [] 3/4 +/4 5 () Var[] 3/4 ( 5) +/4 ( 5) 75 (3) E [ + ]E[]+E[] E [] (4) Since and are independent, Theorem 4.7 yields Var[ + ]Var[]+Var[ ]Var[] 5 (5) Since and are independent, P, (x, y) P (x)p (y) and E [ ] P, (x, y) ()() () P () P () (6) Problem 4..3 Solution x, y, ().75 (7) (a) Normally, checking independence requires the marginal PMFs. However, in this problem, the zeroes in the table of the joint PMF P, (x, y) allows us to verify very quickly that and are dependent. In particular, P ( ) /4 andp () 4/48 but P, (, ) P ( ) P () () (b) To fill in the tree diagram, we need the marginal PMF P (x) and the conditional PMFs P (y x). By summing the rows on the table for the joint PMF, we obtain P, (x, y) y y y P (x) x 3/6 /6 /4 x /6 /6 /6 / x /8 /8 /4 Now we use the conditional PMF P (y x) P, (x, y)/p (x) towrite 3/4 y { /3 y,, P (y ) /4 y P (y ) { / y, P (y ) () (3) (4) 75

53 Now we can us these probabilities to label the tree. The generic solution and the specific solution with the exact values are P P ( ) ( ) P () P () P ( ) P ( ) P ( ) P ( ) P ( ) P ( ) /4 / /4 3/4 /4 /3 /3 /3 / / Problem 4..4 Solution In the solution to Problem 4.9., we found that the conditional PMF of M given N is ( n ) P M N (m n) { m (/3) m (/3) n m m,,...,n () Since P M N (m n) depends on the event N n, weseethatm and N are dependent. Problem 4..5 Solution We can solve this problem for the general case when the probability of heads is p. For the fair coin, p /. Viewing each flip as a Bernoulli trial in which heads is a success, the number of flips until heads is the number of trials needed for the first success which has the geometric PMF P (x) { ( p) x p x,,... () Similarly, no matter how large may be, the number of additional flips for the second heads is the same experiment as the number of flips needed for the first occurrence of heads. That is, P (x) P (x). Moreover, the flips needed to generate the second occurrence of heads are independent of the flips that yield the first heads. Hence, it should be apparent that and are independent and { ( p) x +x P, (x,x )P (x ) P (x ) p x,,...; x,,... () However, if this independence is not obvious, it can be derived by examination of the sample path. When x andx, the event { x, x } occurs iff we observe the sample sequence tt t }{{} x times h tt t }{{} h (3) x times The above sample sequence has probability ( p) x p( p) x p which in fact equals P, (x,x ) given earlier. 76

Problem Y is an exponential random variable with parameter λ = 0.2. Given the event A = {Y < 2},

Problem Y is an exponential random variable with parameter λ = 0.2. Given the event A = {Y < 2}, ECE32 Spring 25 HW Solutions April 6, 25 Solutions to HW Note: Most of these solutions were generated by R. D. Yates and D. J. Goodman, the authors of our textbook. I have added comments in italics where

More information

Probability and Stochastic Processes: A Friendly Introduction for Electrical and Computer Engineers Roy D. Yates and David J.

Probability and Stochastic Processes: A Friendly Introduction for Electrical and Computer Engineers Roy D. Yates and David J. Probability and Stochastic Processes: A Friendly Introduction for Electrical and Computer Engineers Roy D. Yates and David J. Goodman Problem Solutions : Yates and Goodman,.6.3.7.7.8..9.6 3.. 3.. and 3..

More information

Section 8.1. Vector Notation

Section 8.1. Vector Notation Section 8.1 Vector Notation Definition 8.1 Random Vector A random vector is a column vector X = [ X 1 ]. X n Each Xi is a random variable. Definition 8.2 Vector Sample Value A sample value of a random

More information

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample

More information

Section 9.1. Expected Values of Sums

Section 9.1. Expected Values of Sums Section 9.1 Expected Values of Sums Theorem 9.1 For any set of random variables X 1,..., X n, the sum W n = X 1 + + X n has expected value E [W n ] = E [X 1 ] + E [X 2 ] + + E [X n ]. Proof: Theorem 9.1

More information

Bivariate distributions

Bivariate distributions Bivariate distributions 3 th October 017 lecture based on Hogg Tanis Zimmerman: Probability and Statistical Inference (9th ed.) Bivariate Distributions of the Discrete Type The Correlation Coefficient

More information

ECE 302 Division 1 MWF 10:30-11:20 (Prof. Pollak) Final Exam Solutions, 5/3/2004. Please read the instructions carefully before proceeding.

ECE 302 Division 1 MWF 10:30-11:20 (Prof. Pollak) Final Exam Solutions, 5/3/2004. Please read the instructions carefully before proceeding. NAME: ECE 302 Division MWF 0:30-:20 (Prof. Pollak) Final Exam Solutions, 5/3/2004. Please read the instructions carefully before proceeding. If you are not in Prof. Pollak s section, you may not take this

More information

Chapter 2. Some Basic Probability Concepts. 2.1 Experiments, Outcomes and Random Variables
