Conditional Expectation and Independence Exercises
Exercise 3.1. Show the four following properties: for any random variables $X$ and $Y$ and for any real numbers $a$ and $b$,

(E1) $E^P[aX + bY] = a\,E^P[X] + b\,E^P[Y]$;
(E2) if $X(\omega) \le Y(\omega)$ for all $\omega \in \Omega$, then $E^P[X] \le E^P[Y]$;
(E3) in general, $E^P[XY] \ne E^P[X]\,E^P[Y]$;
(E4) if $X$ and $Y$ are independent, then $E^P[XY] = E^P[X]\,E^P[Y]$.

Exercise 3.2. Show that for any random variables $X$ and $Y$ and any real numbers $a$ and $b$,

(V1) $\operatorname{Var}^P[X] \ge 0$;
(V2) $\operatorname{Var}^P[X] = E^P[X^2] - \left(E^P[X]\right)^2$;
(V3) $\forall a \in \mathbb{R}$, $\operatorname{Var}^P[aX + b] = a^2 \operatorname{Var}^P[X]$;
(V4) $\operatorname{Var}^P[X + Y] = \operatorname{Var}^P[X] + \operatorname{Var}^P[Y] + 2\operatorname{Cov}^P[X, Y]$.

Exercise 3.3. Show that for any random variables $X$, $X_1$, $X_2$ and $Y$ and for any real numbers $a$ and $b$,

(C1) $\operatorname{Cov}^P[X, Y] = E^P[XY] - E^P[X]\,E^P[Y]$;
(C2) if $X$ and $Y$ are independent, then $\operatorname{Cov}^P[X, Y] = 0$;
(C3) $\forall a, b \in \mathbb{R}$, $\operatorname{Cov}^P[aX_1 + bX_2, Y] = a\operatorname{Cov}^P[X_1, Y] + b\operatorname{Cov}^P[X_2, Y]$.
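The identities above can be sanity-checked by brute-force enumeration on a small finite probability space. Here is a sketch in Python; the joint pmf is made up purely for illustration, and exact arithmetic with fractions avoids rounding issues. It checks (E1), (V2) and (C1):

```python
from fractions import Fraction as F

# A hypothetical joint pmf for (X, Y) on {0, 1} x {1, 2, 3}.
pmf = {(0, 1): F(1, 6), (0, 2): F(1, 12), (0, 3): F(1, 4),
       (1, 1): F(1, 6), (1, 2): F(1, 4),  (1, 3): F(1, 12)}
assert sum(pmf.values()) == 1

def E(g):
    """Expectation of g(X, Y) under the joint pmf."""
    return sum(g(x, y) * p for (x, y), p in pmf.items())

a, b = F(2), F(-3)
EX, EY = E(lambda x, y: x), E(lambda x, y: y)

# (E1) linearity of expectation
assert E(lambda x, y: a * x + b * y) == a * EX + b * EY
# (V2) Var[X] = E[X^2] - (E[X])^2
var_X = E(lambda x, y: (x - EX) ** 2)
assert var_X == E(lambda x, y: x * x) - EX ** 2
# (C1) Cov[X, Y] = E[XY] - E[X] E[Y]
cov = E(lambda x, y: (x - EX) * (y - EY))
assert cov == E(lambda x, y: x * y) - EX * EY
```

Since here $\operatorname{Cov}[X,Y] = -\frac{1}{12} \ne 0$, the same pmf also illustrates (E3): $X$ and $Y$ need not satisfy $E[XY] = E[X]E[Y]$.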
Exercise 3.4. Let us revisit a problem treated in Chapter 2. Today is Monday and you have one dollar in your piggy bank. Starting tomorrow, every morning until Friday (inclusively), you toss a coin. If the result is tails you withdraw one dollar (if possible) from the piggy bank; if the result is heads you deposit one dollar in it. What is the conditional expectation of the amount of money in the bank on Friday noon, given the information available by Wednesday noon?

Exercise 3.5. Recall Exercise 2.4 on the gambler's ruin. Calculate $E[X_4 \mid \mathcal{F}_0]$, $E[X_4 \mid \mathcal{F}_2]$, $E[Y_4 \mid \mathcal{G}_2]$ and $E[Y_2 \mid \mathcal{G}_4]$ for an arbitrary $p$ and for the specific case $p = \frac{1}{2}$.

Exercise 3.6. Justify all your answers. Let the stochastic process $X = \{X_t : t \in \{0, 1, 2, 3, 4\}\}$ represent the price of a share through time; each time step covers six months. All trajectories start at $X_0 = 11$. The table below gives $X_2$, $X_4$ and $P(\omega)$ for the sixteen outcomes (the $t = 1$ and $t = 3$ prices are omitted, and within each block the labelling of the individual $\omega$'s is one consistent choice):

ω       X2   X4   P(ω)
ω1      15   16   0.06
ω2      15   14   0.065
ω3      15   16   0.06
ω4      15   14   0.065
ω5      15   16   0.06
ω6      15   14   0.065
ω7      12   14   0.0625
ω8      12   18   0.08
ω9      12   12   0.04
ω10     12   11   0.0675
ω11     12   14   0.0625
ω12     12   12   0.0625
ω13     12   12   0.0625
ω14     11   20   0.0625
ω15     11   12   0.0625
ω16     11   12   0.0625

a) What filtration is generated by the process?
b) Give the conditional distribution of $X_4$, knowing $X_2$.
c) Calculate the conditional expectation of $X_4$, knowing $X_2$.
d) Give the conditional distribution of $X_4$, knowing $\mathcal{F}_2$.
e) Calculate the conditional expectation of $X_4$, knowing $\mathcal{F}_2$.

Problem 3.7. Let us remain in the context of Exercise 3.6. Two investors, call them A and B, buy a share today (we could say a hundred thousand shares, but that would only add a bunch of zeros to our calculations) which they can sell at the end of this year ($t = 2$) or at the end of next year ($t = 4$). If an investor chooses to sell his share at time $t = 2$, he will then place the money in a bank account yielding 15% interest per year (the interest is compounded annually, at times $t = 2$ and $t = 4$). If instead he chooses not to sell the share at time $t = 2$, he will have to sell it at time $t = 4$. The two investors will be away for the next five years because they are taking a (possibly metaphorical, this is up to you) trip to Venus. Hence they both appoint prosecutors (talented and trustworthy) to manage the operations in their place. But they are out of luck: both prosecutors are unfamiliar with money matters (did we say talented?), so the two investors must give precise instructions today to their respective prosecutors on what to do over the next year, depending on how the share price will have evolved during that period. It goes without saying that both investors aim at maximizing their profit. The prosecutor of investor A will check the price of the stock at time $t = 1$, but the prosecutor of investor B will not (he might not be the most suited candidate for this job, don't you think?). In other words, the observation times for prosecutor A are $t = 1$ and $t = 2$, while for prosecutor B the only observation time is $t = 2$.

a) What instructions does investor A give to his prosecutor?
b) What instructions does investor B give to his prosecutor?

To facilitate comparisons, use the holding period yield (HPY). The HPY of a share over a given time interval (say we buy the share at the beginning of period $t_1$ and sell it at the end of period $t_2$) is $R_{t_1, t_2} = X_{t_2} / X_{t_1}$.

Problem 3.8. This problem uses notation defined in Problems 3.6 and 3.7.
a) Let $\tau_A$ and $\tau_B$ represent the random times at which the shares are sold, respectively by prosecutors A and B. Are $\tau_A$ and $\tau_B$ stopping times?
b) Say we replace the probability measure $P$ by $Q$, where $Q(\omega) = \frac{1}{16}$ for every $\omega \in \Omega$. Are $\tau_A$ and $\tau_B$ stopping times in this case?

Problem 3.9.
Let $X_1$, $X_2$, $X_3$ be independent random variables such that
$$P(X_i = 1) = P(X_i = -1) = \tfrac{1}{2}.$$
Define $A_1 = \{X_2 = X_3\}$, $A_2 = \{X_1 = X_3\}$ and $A_3 = \{X_1 = X_2\}$. Show that the events $A_1$, $A_2$ and $A_3$ are pairwise independent, but not mutually independent.

Problem 3.10. Let $X$ and $Y$ be two independent random variables, both with the standard normal distribution. Let $\varepsilon$ be a random variable independent of $Y$ and such that
$$P\{\varepsilon = 1\} = P\{\varepsilon = -1\} = \tfrac{1}{2}.$$
a) Show that $Z = \varepsilon Y$ is also normal, but that $Y + Z$ is not.
b) Show that $Y$ and $Z$ are not independent, although $\operatorname{Cov}(Z, Y) = 0$.

Problem 3.11. In the case where $X$ and $Y$ are independent and identically distributed random variables, show that $E[X - Y \mid X + Y] = 0$.
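For Problem 3.9 the whole sample space has only 8 equally likely outcomes, so the claim can be checked by direct enumeration. A sketch (exact arithmetic with fractions):

```python
from fractions import Fraction
from itertools import product

outcomes = list(product([-1, 1], repeat=3))   # (X1, X2, X3), each +/-1
p = Fraction(1, 8)                            # all 8 outcomes equally likely

def prob(event):
    """Probability of the set of outcomes where event(w) is true."""
    return sum(p for w in outcomes if event(w))

A1 = lambda w: w[1] == w[2]   # {X2 = X3}
A2 = lambda w: w[0] == w[2]   # {X1 = X3}
A3 = lambda w: w[0] == w[1]   # {X1 = X2}

# pairwise independent: P(Ai ∩ Aj) = P(Ai) P(Aj) = 1/4 for every pair
for A, B in [(A1, A2), (A1, A3), (A2, A3)]:
    assert prob(lambda w: A(w) and B(w)) == prob(A) * prob(B) == Fraction(1, 4)

# but not mutually independent: P(A1 ∩ A2 ∩ A3) = 1/4, not 1/8
triple = prob(lambda w: A1(w) and A2(w) and A3(w))
assert triple == Fraction(1, 4) != prob(A1) * prob(A2) * prob(A3)
```

The intersection of any two of the events already forces all three variables to be equal, which is why the triple intersection has probability $\frac{2}{8} = \frac{1}{4}$ rather than $\frac{1}{8}$.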
Solutions

1 Exercise 3.1

Proof of (E1): $E[aX + bY] = aE[X] + bE[Y]$.

Since the events $\{X = x\} \cap \{Y = y\}$, $y \in S_Y$, are disjoint,
$$\sum_{y \in S_Y} f_{X,Y}(x, y) = \sum_{y \in S_Y} P[\{X = x\} \cap \{Y = y\}] = P\Big[\{X = x\} \cap \bigcup_{y \in S_Y} \{Y = y\}\Big] = P[\{X = x\} \cap \Omega] = P[X = x] = f_X(x), \tag{1}$$
and symmetrically $\sum_{x \in S_X} f_{X,Y}(x, y) = f_Y(y)$. Hence
$$E[aX + bY] = \sum_{x \in S_X} \sum_{y \in S_Y} (ax + by) f_{X,Y}(x, y) = a \sum_{x \in S_X} x \sum_{y \in S_Y} f_{X,Y}(x, y) + b \sum_{y \in S_Y} y \sum_{x \in S_X} f_{X,Y}(x, y)$$
$$= a \sum_{x \in S_X} x f_X(x) + b \sum_{y \in S_Y} y f_Y(y) = aE[X] + bE[Y].$$

Proof of (E2): $X \le Y \Rightarrow E[X] \le E[Y]$.
To begin with, note that if $W$ is a non-negative random variable then $E[W] \ge 0$. Indeed,
$$E[W] = \sum_{x \in S_W} \underbrace{x}_{\ge 0} \, \underbrace{f_W(x)}_{\ge 0} \ge 0.$$
Let $W = Y - X$. Since $X \le Y$, we have $W \ge 0$, hence $E[W] \ge 0$. But using (E1),
$$E[W] = E[Y - X] = E[Y] - E[X].$$
Therefore $0 \le E[Y] - E[X]$, i.e. $E[X] \le E[Y]$.

2 Exercise 3.3

Proof of (C1): $\operatorname{Cov}[X, Y] = E[XY] - E[X]E[Y]$.
$$\operatorname{Cov}[X, Y] = \sum_{x \in S_X} \sum_{y \in S_Y} (x - E[X])(y - E[Y]) f_{X,Y}(x, y) = \sum_{x \in S_X} \sum_{y \in S_Y} \big(xy - xE[Y] - yE[X] + E[X]E[Y]\big) f_{X,Y}(x, y).$$
Splitting the sum and using the marginalization identity (1),
$$\operatorname{Cov}[X, Y] = \sum_{x}\sum_{y} xy\, f_{X,Y}(x, y) - E[Y] \sum_{x} x f_X(x) - E[X] \sum_{y} y f_Y(y) + E[X]E[Y] \underbrace{\sum_{x}\sum_{y} f_{X,Y}(x, y)}_{=1 \text{ by definition of a density function}}$$
$$= E[XY] - E[Y]E[X] - E[X]E[Y] + E[X]E[Y] = E[XY] - E[X]E[Y].$$

Proof of (C2): if $X$ and $Y$ are independent then $\operatorname{Cov}[X, Y] = 0$.

Since independence of $X$ and $Y$ implies $E[XY] = E[X]E[Y]$, to obtain the result we only have to invoke (C1):
$$\operatorname{Cov}[X, Y] = E[XY] - E[X]E[Y] = E[X]E[Y] - E[X]E[Y] = 0.$$
Proof of (C3): $\operatorname{Cov}[a_1 X_1 + a_2 X_2, Y] = a_1 \operatorname{Cov}[X_1, Y] + a_2 \operatorname{Cov}[X_2, Y]$.

By (C1), we have that
$$\operatorname{Cov}[a_1 X_1 + a_2 X_2, Y] = E[(a_1 X_1 + a_2 X_2) Y] - E[a_1 X_1 + a_2 X_2] E[Y]$$
$$= E[a_1 X_1 Y + a_2 X_2 Y] - (a_1 E[X_1] + a_2 E[X_2]) E[Y] \quad \text{by (E1)}$$
$$= a_1 E[X_1 Y] + a_2 E[X_2 Y] - a_1 E[X_1]E[Y] - a_2 E[X_2]E[Y] \quad \text{by (E1)}$$
$$= a_1 \big(E[X_1 Y] - E[X_1]E[Y]\big) + a_2 \big(E[X_2 Y] - E[X_2]E[Y]\big) = a_1 \operatorname{Cov}[X_1, Y] + a_2 \operatorname{Cov}[X_2, Y].$$

Proof of (V4): $\operatorname{Var}[X + Y] = \operatorname{Var}[X] + \operatorname{Var}[Y] + 2\operatorname{Cov}[X, Y]$.

$$\operatorname{Var}[X + Y] = E\big[(X + Y)^2\big] - \big(E[X + Y]\big)^2 \quad \text{by (V2)}$$
$$= E\big[X^2 + 2XY + Y^2\big] - \big(E[X] + E[Y]\big)^2 \quad \text{by (E1)}$$
$$= E[X^2] + 2E[XY] + E[Y^2] - (E[X])^2 - 2E[X]E[Y] - (E[Y])^2$$
$$= \big(E[X^2] - (E[X])^2\big) + \big(E[Y^2] - (E[Y])^2\big) + 2\big(E[XY] - E[X]E[Y]\big) = \operatorname{Var}[X] + \operatorname{Var}[Y] + 2\operatorname{Cov}[X, Y],$$
using (V2) and (C1).
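The bilinearity (C3) and the decomposition (V4) can also be confirmed numerically. Here is a small sketch on a made-up five-point sample space (the weights and the values of $X_1$, $X_2$, $Y$ are arbitrary; exact arithmetic with fractions):

```python
from fractions import Fraction as F

# A hypothetical five-point sample space with unequal weights.
P = [F(1, 5), F(1, 10), F(3, 10), F(1, 4), F(3, 20)]
assert sum(P) == 1

# Arbitrary random variables, given by their values on each outcome.
X1 = [1, -2, 0, 3, 1]
X2 = [2, 1, -1, 0, 2]
Y  = [0, 1, 2, -1, 1]

E   = lambda Z: sum(p * z for p, z in zip(P, Z))
cov = lambda U, V: E([u * v for u, v in zip(U, V)]) - E(U) * E(V)
var = lambda Z: cov(Z, Z)

# (C3) bilinearity in the first argument
a1, a2 = F(2), F(-5)
lhs = cov([a1 * u + a2 * v for u, v in zip(X1, X2)], Y)
assert lhs == a1 * cov(X1, Y) + a2 * cov(X2, Y)

# (V4) variance of a sum
XplusY = [u + y for u, y in zip(X1, Y)]
assert var(XplusY) == var(X1) + var(Y) + 2 * cov(X1, Y)
```

Both assertions hold for any choice of values, since they are algebraic identities; running the check on concrete numbers simply guards against transcription errors in the formulas.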
3 Exercise 3.4

The blocks of $\mathcal{F}_2$ are
$$B_1 = \{TTTT, TTTH, TTHT, TTHH\}, \quad B_2 = \{THTT, THTH, THHT, THHH\},$$
$$B_3 = \{HTTT, HTTH, HTHT, HTHH\}, \quad B_4 = \{HHTT, HHTH, HHHT, HHHH\}.$$

The probability and the Friday balance $X_4$ of each outcome are:

ω      P(ω)            X4(ω)        ω      P(ω)            X4(ω)
TTTT   p^4             0            HTTT   p^3(1-p)        0
TTTH   p^3(1-p)        1            HTTH   p^2(1-p)^2      1
TTHT   p^3(1-p)        0            HTHT   p^2(1-p)^2      1
TTHH   p^2(1-p)^2      2            HTHH   p(1-p)^3        3
THTT   p^3(1-p)        0            HHTT   p^2(1-p)^2      1
THTH   p^2(1-p)^2      1            HHTH   p(1-p)^3        3
THHT   p^2(1-p)^2      1            HHHT   p(1-p)^3        3
THHH   p(1-p)^3        3            HHHH   (1-p)^4         5

Hence
$$P(B_1) = p^4 + 2p^3(1-p) + p^2(1-p)^2 = p^2,$$
$$P(B_2) = p^3(1-p) + 2p^2(1-p)^2 + p(1-p)^3 = p - p^2 = p(1-p),$$
$$P(B_3) = p^3(1-p) + 2p^2(1-p)^2 + p(1-p)^3 = p - p^2 = p(1-p),$$
$$P(B_4) = p^2(1-p)^2 + 2p(1-p)^3 + (1-p)^4 = 1 - 2p + p^2 = (1-p)^2.$$

Recall that if $\omega \in B_i$, then
$$E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{1}{P(B_i)} \sum_{\omega' \in B_i} X_4(\omega') P(\omega').$$
Therefore:

If $\omega \in B_1$, then
$$E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{1}{P(B_1)} \sum_{\omega' \in B_1} X_4(\omega') P(\omega') = \frac{1}{p^2} \big( 0 \cdot p^4 + 1 \cdot p^3(1-p) + 0 \cdot p^3(1-p) + 2 \cdot p^2(1-p)^2 \big) = (1-p)(2-p).$$

If $\omega \in B_2$, then
$$E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{1}{p(1-p)} \big( 0 \cdot p^3(1-p) + 1 \cdot p^2(1-p)^2 + 1 \cdot p^2(1-p)^2 + 3 \cdot p(1-p)^3 \big) = (3-p)(1-p).$$

If $\omega \in B_3$, the same computation gives $E^P[X_4 \mid \mathcal{F}_2](\omega) = (3-p)(1-p)$.

If $\omega \in B_4$, then
$$E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{1}{(1-p)^2} \big( 1 \cdot p^2(1-p)^2 + 3 \cdot p(1-p)^3 + 3 \cdot p(1-p)^3 + 5 \cdot (1-p)^4 \big) = 5 - 4p.$$
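These closed forms can be cross-checked by enumerating all 16 toss sequences directly. A sketch (with $p$ the probability of tails; $p = 0.3$ is an arbitrary test value):

```python
from itertools import product

def balance(seq, start=1):
    """Piggy-bank balance after the tosses in seq: H deposits $1, T withdraws $1 if possible."""
    b = start
    for c in seq:
        b = b + 1 if c == "H" else max(b - 1, 0)
    return b

def cond_exp_given_F2(first_two, p):
    """E[X4 | F2] on the block of sequences starting with first_two; p = P(tails)."""
    num = den = 0.0
    for seq in product("TH", repeat=4):
        if seq[:2] == first_two:
            prob = (p ** seq.count("T")) * ((1 - p) ** seq.count("H"))
            num += balance(seq) * prob
            den += prob
    return num / den

p = 0.3
for block, formula in [(("T", "T"), (1 - p) * (2 - p)),   # B1
                       (("T", "H"), (3 - p) * (1 - p)),   # B2
                       (("H", "T"), (3 - p) * (1 - p)),   # B3
                       (("H", "H"), 5 - 4 * p)]:          # B4
    assert abs(cond_exp_given_F2(block, p) - formula) < 1e-12
```

Replacing `p = 0.3` by `0.5` reproduces the $p = \frac{1}{2}$ values given next.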
In the specific case $p = \frac{1}{2}$:

If $\omega \in B_1$, then $E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{3}{4}$.
If $\omega \in B_2$, then $E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{5}{4}$.
If $\omega \in B_3$, then $E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{5}{4}$.
If $\omega \in B_4$, then $E^P[X_4 \mid \mathcal{F}_2](\omega) = 3$.

4 Exercise 3.5

$$E[X_4 \mid \mathcal{F}_0] = 2(1-p)^4 + 4 \cdot 4p(1-p)^3 + 6 \cdot 6p^2(1-p)^2 + 8 \cdot 4p^3(1-p) + 10 \cdot p^4 = 2 + 8p.$$

If $p = \frac{1}{2}$, then $E[X_4 \mid \mathcal{F}_0] = 6$.
If $\omega \in B_1 = \{TTTT, TTTH, TTHT, TTHH\}$, then
$$E[X_4 \mid \mathcal{F}_2](\omega) = \frac{10\,p^4 + 8\,p^3(1-p) + 8\,p^3(1-p) + 6\,p^2(1-p)^2}{p^4 + p^3(1-p) + p^3(1-p) + p^2(1-p)^2} = 4p + 6.$$

If $\omega \in B_2 = \{THTT, THTH, THHT, THHH\}$, then
$$E[X_4 \mid \mathcal{F}_2](\omega) = \frac{8\,p^3(1-p) + 6\,p^2(1-p)^2 + 6\,p^2(1-p)^2 + 4\,p(1-p)^3}{p^3(1-p) + p^2(1-p)^2 + p^2(1-p)^2 + p(1-p)^3} = 4p + 4.$$

If $\omega \in B_3 = \{HTTT, HTTH, HTHT, HTHH\}$, the same computation gives $E[X_4 \mid \mathcal{F}_2](\omega) = 4p + 4$.

If $\omega \in B_4 = \{HHTT, HHTH, HHHT, HHHH\}$, then
$$E[X_4 \mid \mathcal{F}_2](\omega) = \frac{6\,p^2(1-p)^2 + 4\,p(1-p)^3 + 4\,p(1-p)^3 + 2\,(1-p)^4}{p^2(1-p)^2 + p(1-p)^3 + p(1-p)^3 + (1-p)^4} = 4p + 2.$$

If $p = \frac{1}{2}$, then
$$E[X_4 \mid \mathcal{F}_2](\omega) = \begin{cases} 8 & \text{if } \omega \in B_1 = \{TTTT, TTTH, TTHT, TTHH\} \\ 6 & \text{if } \omega \in B_2 = \{THTT, THTH, THHT, THHH\} \\ 6 & \text{if } \omega \in B_3 = \{HTTT, HTTH, HTHT, HTHH\} \\ 4 & \text{if } \omega \in B_4 = \{HHTT, HHTH, HHHT, HHHH\} \end{cases}$$
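The same enumeration idea works for the $X$-part of Exercise 3.5. Assuming, consistently with the values used above, that $X_0 = 6$ and $X$ moves up one unit on tails and down one on heads, a sketch:

```python
from itertools import product

def cond_exp_X4(first_two, p):
    """E[X4 | F2] on the block fixed by the first two tosses.

    Assumes X0 = 6, each tail adds 1 and each head subtracts 1, p = P(tails).
    """
    num = den = 0.0
    for seq in product("TH", repeat=4):
        if seq[:2] == first_two:
            prob = (p ** seq.count("T")) * ((1 - p) ** seq.count("H"))
            x4 = 6 + seq.count("T") - seq.count("H")
            num += x4 * prob
            den += prob
    return num / den

p = 0.5
assert abs(cond_exp_X4(("T", "T"), p) - (4 * p + 6)) < 1e-12   # B1 -> 8
assert abs(cond_exp_X4(("T", "H"), p) - (4 * p + 4)) < 1e-12   # B2 -> 6
assert abs(cond_exp_X4(("H", "T"), p) - (4 * p + 4)) < 1e-12   # B3 -> 6
assert abs(cond_exp_X4(("H", "H"), p) - (4 * p + 2)) < 1e-12   # B4 -> 4
```

The assertions also pass for any other `p` in $(0, 1)$, matching the general formulas $4p+6$, $4p+4$ and $4p+2$.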
If $\omega \in B_1 = \{TTTT, TTTH, TTHT, TTHH, THTT, THTH, THHT, THHH\}$, then $E[Y_4 \mid \mathcal{G}_2](\omega) = 10$.

If $\omega \in B_3 = \{HTTT, HTTH, HTHT, HTHH\}$, then
$$E[Y_4 \mid \mathcal{G}_2](\omega) = \frac{10\,p^3(1-p) + 6\,p^2(1-p)^2 + 0 \cdot p^2(1-p)^2 + 0 \cdot p(1-p)^3}{p^3(1-p) + p^2(1-p)^2 + p^2(1-p)^2 + p(1-p)^3} = 2p(2p + 3).$$

If $\omega \in B_4 = \{HHTT, HHTH, HHHT, HHHH\}$, then $E[Y_4 \mid \mathcal{G}_2](\omega) = 0$.

If $p = \frac{1}{2}$, then
$$E[Y_4 \mid \mathcal{G}_2](\omega) = \begin{cases} 10 & \text{if } \omega \in B_1 = \{TTTT, TTTH, TTHT, TTHH, THTT, THTH, THHT, THHH\} \\ 4 & \text{if } \omega \in B_3 = \{HTTT, HTTH, HTHT, HTHH\} \\ 0 & \text{if } \omega \in B_4 = \{HHTT, HHTH, HHHT, HHHH\} \end{cases}$$

For $E[Y_2 \mid \mathcal{G}_4]$:

If $\omega \in C_1 = \{TTTT, TTTH, TTHT, TTHH, THTT, THTH, THHT, THHH\}$, then $E[Y_2 \mid \mathcal{G}_4](\omega) = 10$.
If $\omega \in C_2 = \{HTTT\}$, then $E[Y_2 \mid \mathcal{G}_4](\omega) = 4$.
If $\omega \in C_3 = \{HTTH\}$, then $E[Y_2 \mid \mathcal{G}_4](\omega) = 4$.
If $\omega \in C_4 = \{HTHT, HTHH\}$, then $E[Y_2 \mid \mathcal{G}_4](\omega) = 4$.
If $\omega \in C_5 = \{HHTT, HHTH, HHHT, HHHH\}$, then $E[Y_2 \mid \mathcal{G}_4](\omega) = 0$.
5 Exercise 3.6

a) The filtration generated by the process is given by the following partitions (each $\mathcal{F}_t$ is the σ-algebra generated by the blocks listed):
$$\mathcal{F}_0 = \{\varnothing, \Omega\};$$
$$\mathcal{F}_1 : \{\omega_1, \ldots, \omega_{10}\}, \{\omega_{11}, \ldots, \omega_{16}\};$$
$$\mathcal{F}_2 : \{\omega_1, \ldots, \omega_6\}, \{\omega_7, \ldots, \omega_{10}\}, \{\omega_{11}, \omega_{12}, \omega_{13}\}, \{\omega_{14}, \omega_{15}, \omega_{16}\};$$
$$\mathcal{F}_3 : \{\omega_1, \omega_2\}, \{\omega_3, \omega_4\}, \{\omega_5, \omega_6\}, \{\omega_7, \ldots, \omega_{10}\}, \{\omega_{11}, \omega_{12}\}, \{\omega_{13}\}, \{\omega_{14}, \omega_{15}\}, \{\omega_{16}\};$$
$$\mathcal{F}_4 : \{\omega_1\}, \{\omega_2\}, \ldots, \{\omega_{16}\}.$$

b) The conditional distribution of $X_4$, knowing $X_2$:
$$P[X_4 = x \mid X_2 = 15] = \begin{cases} \dfrac{3 \cdot 0.06}{3 \cdot 0.06 + 3 \cdot 0.065} = 0.48 & \text{if } x = 16 \\[4pt] \dfrac{3 \cdot 0.065}{3 \cdot 0.06 + 3 \cdot 0.065} = 0.52 & \text{if } x = 14 \\[4pt] 0 & \text{otherwise} \end{cases}$$
$$P[X_4 = x \mid X_2 = 12] = \begin{cases} \dfrac{0.08}{0.4375} = 0.18286 & \text{if } x = 18 \\[4pt] \dfrac{2 \cdot 0.0625}{0.4375} = 0.28571 & \text{if } x = 14 \\[4pt] \dfrac{0.04 + 2 \cdot 0.0625}{0.4375} = 0.37714 & \text{if } x = 12 \\[4pt] \dfrac{0.0675}{0.4375} = 0.15429 & \text{if } x = 11 \\[4pt] 0 & \text{otherwise} \end{cases}$$
$$P[X_4 = x \mid X_2 = 11] = \begin{cases} \dfrac{0.0625}{3 \cdot 0.0625} = \tfrac{1}{3} & \text{if } x = 20 \\[4pt] \dfrac{2 \cdot 0.0625}{3 \cdot 0.0625} = \tfrac{2}{3} & \text{if } x = 12 \\[4pt] 0 & \text{otherwise} \end{cases}$$
where $0.4375 = 4 \cdot 0.0625 + 0.08 + 0.04 + 0.0675 = P[X_2 = 12]$.
c) The conditional expectation of $X_4$, knowing $\sigma(X_2)$:

If $\omega \in \{\omega_1, \ldots, \omega_6\}$ (i.e. $X_2 = 15$),
$$E^P[X_4 \mid \sigma(X_2)](\omega) = \frac{3 \cdot 0.06 \cdot 16 + 3 \cdot 0.065 \cdot 14}{3 \cdot 0.06 + 3 \cdot 0.065} = \frac{5.61}{0.375} = 14.96.$$

If $\omega \in \{\omega_7, \ldots, \omega_{13}\}$ (i.e. $X_2 = 12$),
$$E^P[X_4 \mid \sigma(X_2)](\omega) = 0.18286 \cdot 18 + 0.28571 \cdot 14 + 0.37714 \cdot 12 + 0.15429 \cdot 11 = 13.514.$$

If $\omega \in \{\omega_{14}, \omega_{15}, \omega_{16}\}$ (i.e. $X_2 = 11$),
$$E^P[X_4 \mid \sigma(X_2)](\omega) = \frac{20 + 12 + 12}{3} = 14.667.$$
d) The conditional distribution of $X_4$, knowing $\mathcal{F}_2$:
$$P[X_4 = x \mid \{\omega_1, \ldots, \omega_6\}] = \begin{cases} 0.48 & \text{if } x = 16 \\ 0.52 & \text{if } x = 14 \\ 0 & \text{otherwise} \end{cases}$$
$$P[X_4 = x \mid \{\omega_7, \ldots, \omega_{10}\}] = \begin{cases} 0.08/0.25 = 0.32 & \text{if } x = 18 \\ 0.0625/0.25 = 0.25 & \text{if } x = 14 \\ 0.04/0.25 = 0.16 & \text{if } x = 12 \\ 0.0675/0.25 = 0.27 & \text{if } x = 11 \\ 0 & \text{otherwise} \end{cases}$$
$$P[X_4 = x \mid \{\omega_{11}, \omega_{12}, \omega_{13}\}] = \begin{cases} 1/3 & \text{if } x = 14 \\ 2/3 & \text{if } x = 12 \\ 0 & \text{otherwise} \end{cases}$$
$$P[X_4 = x \mid \{\omega_{14}, \omega_{15}, \omega_{16}\}] = \begin{cases} 1/3 & \text{if } x = 20 \\ 2/3 & \text{if } x = 12 \\ 0 & \text{otherwise} \end{cases}$$

e) The conditional expectation of $X_4$, knowing $\mathcal{F}_2$:

If $\omega \in \{\omega_1, \ldots, \omega_6\}$: $E^P[X_4 \mid \mathcal{F}_2](\omega) = 0.48 \cdot 16 + 0.52 \cdot 14 = 14.96$.
If $\omega \in \{\omega_7, \ldots, \omega_{10}\}$: $E^P[X_4 \mid \mathcal{F}_2](\omega) = 0.32 \cdot 18 + 0.25 \cdot 14 + 0.16 \cdot 12 + 0.27 \cdot 11 = 14.15$.
If $\omega \in \{\omega_{11}, \omega_{12}, \omega_{13}\}$: $E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{14 + 12 + 12}{3} = 12.667$.
If $\omega \in \{\omega_{14}, \omega_{15}, \omega_{16}\}$: $E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{20 + 12 + 12}{3} = 14.667$.
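The numbers in c) and e) follow mechanically from the outcome table of Exercise 3.6. A sketch that recomputes them; the assignment of $X_4$ values and probabilities to individual $\omega$'s within each block is one consistent choice reconstructed from the conditional distributions, so only block-level quantities should be read off:

```python
# (X4(w), P(w)) for w1..w16; F2-blocks marked by index ranges below.
table = [(16, .06), (14, .065), (16, .06), (14, .065), (16, .06), (14, .065),  # w1..w6
         (14, .0625), (18, .08), (12, .04), (11, .0675),                       # w7..w10
         (14, .0625), (12, .0625), (12, .0625),                                # w11..w13
         (20, .0625), (12, .0625), (12, .0625)]                                # w14..w16

blocks_F2 = [range(0, 6), range(6, 10), range(10, 13), range(13, 16)]

def cond_exp(block):
    """Conditional expectation of X4 on a block: weighted average over its outcomes."""
    num = sum(table[i][0] * table[i][1] for i in block)
    den = sum(table[i][1] for i in block)
    return num / den

# e): the four values of E[X4 | F2]
for block, expected in zip(blocks_F2, [14.96, 14.15, 38 / 3, 44 / 3]):
    assert abs(cond_exp(block) - expected) < 1e-9

# c): knowing sigma(X2) only, w7..w13 merge into a single block
assert abs(cond_exp(range(6, 13)) - 13.514) < 1e-3
```

The only difference between c) and e) is the partition used: $\sigma(X_2)$ cannot separate $\{\omega_7, \ldots, \omega_{10}\}$ from $\{\omega_{11}, \omega_{12}, \omega_{13}\}$, while $\mathcal{F}_2$ can.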
6 Problem 3.7

The conditional expectations of the holding period yields are:
$$E^P[R_{2,4} \mid \{\omega_1, \ldots, \omega_6\}] = \frac{14.96}{15} = 0.9973, \qquad E^P[R_{2,4} \mid \{\omega_7, \ldots, \omega_{10}\}] = \frac{14.15}{12} = 1.1792,$$
$$E^P[R_{2,4} \mid \{\omega_{11}, \omega_{12}, \omega_{13}\}] = \frac{12.667}{12} = 1.0556, \qquad E^P[R_{2,4} \mid \{\omega_{14}, \omega_{15}, \omega_{16}\}] = \frac{14.667}{11} = 1.3333,$$
$$E^P[R_{2,4} \mid \{\omega_7, \ldots, \omega_{13}\}] = \frac{13.514}{12} = 1.1262.$$

The possible values of the random variable $R_{2,4}$ are:
from $X_2 = 15$: $\frac{16}{15} = 1.0667$ and $\frac{14}{15} = 0.9333$;
from $X_2 = 12$: $\frac{18}{12} = 1.5$, $\frac{14}{12} = 1.1667$, $\frac{12}{12} = 1$ and $\frac{11}{12} = 0.9167$;
from $X_2 = 11$: $\frac{20}{11} = 1.8182$ and $\frac{12}{11} = 1.0909$.
Recall that the bank account yields a factor of 1.15 from $t = 2$ to $t = 4$. Then
$$P[R_{2,4} < 1.15 \mid \{\omega_1, \ldots, \omega_6\}] = P[X_4 = 16 \text{ or } X_4 = 14 \mid \{\omega_1, \ldots, \omega_6\}] = 1,$$
$$P[R_{2,4} < 1.15 \mid \{\omega_7, \ldots, \omega_{10}\}] = P[X_4 = 12 \text{ or } X_4 = 11 \mid \{\omega_7, \ldots, \omega_{10}\}] = 0.43,$$
$$P[R_{2,4} < 1.15 \mid \{\omega_{11}, \omega_{12}, \omega_{13}\}] = P[X_4 = 12 \mid \{\omega_{11}, \omega_{12}, \omega_{13}\}] = 0.6667,$$
$$P[R_{2,4} < 1.15 \mid \{\omega_{14}, \omega_{15}, \omega_{16}\}] = P[X_4 = 12 \mid \{\omega_{14}, \omega_{15}, \omega_{16}\}] = 0.6667,$$
$$P[R_{2,4} < 1.15 \mid \{\omega_7, \ldots, \omega_{13}\}] = P[X_4 = 12 \text{ or } X_4 = 11 \mid \{\omega_7, \ldots, \omega_{13}\}] = 0.37714 + 0.15429 = 0.53143.$$

a) What instructions does investor A give to his prosecutor? Justify your answer.

If we observe the process from time $t = 0$, at time $t = 2$ we will be able to determine which of the following four events happened: $\{\omega_1, \ldots, \omega_6\}$, $\{\omega_7, \ldots, \omega_{10}\}$, $\{\omega_{11}, \ldots, \omega_{13}\}$, $\{\omega_{14}, \omega_{15}, \omega_{16}\}$.

If event $\{\omega_1, \ldots, \omega_6\}$ happened (the price of the share followed the trajectory 11, 13, 15), the probability of obtaining a holding period yield (for the period from $t = 2$ to $t = 4$) superior to that of the bank account is zero. Hence in that case we sell the share and invest the 15 dollars in the bank account; the value of the portfolio at time $t = 4$ is then $15 \times 1.15 = 17.25$.

If event $\{\omega_7, \ldots, \omega_{10}\}$ happened (trajectory 11, 13, 12), the probability of obtaining an HPY (from $t = 2$ to $t = 4$) inferior to that of the bank account is about 43%. However, the expected HPY is 1.1792, which is superior to that of the bank account. Hence we keep the share.

If event $\{\omega_{11}, \ldots, \omega_{13}\}$ happened (trajectory 11, 12, 12), the probability of obtaining an HPY (from $t = 2$ to $t = 4$) inferior to that of the bank account is about 67%. In addition, the expected HPY is 1.0556, inferior to the 1.15 of the bank account. Hence we sell the share and invest the 12 dollars in the bank account; the value of the portfolio at time $t = 4$ is then $12 \times 1.15 = 13.80$.
If event $\{\omega_{14}, \omega_{15}, \omega_{16}\}$ happened (trajectory 11, 11, 11), then the probability of obtaining an HPY (for the period from $t = 2$ to $t = 4$) inferior to that
of the bank account is about 67%. However, the expected HPY is 1.3333, which is superior to the 1.15 of the bank account. This is because when the HPY exceeds the yield of the bank account, it does so by a huge margin (1.8182 versus 1.15). People with a taste for risky investment would choose to keep the share, especially because even when the HPY is inferior to the yield of the bank account, it is not drastically inferior (1.0909 versus 1.15).

The instructions of investor A are summarized below (the probabilities shown are unconditional):

Trajectory    Decision at t=2   Portfolio at t=4 if sold   Portfolio at t=4 if kept
11, 13, 15    sell the share    17.25                      16 w.p. 18%, 14 w.p. 19.5%
11, 13, 12    keep the share    13.80                      18 w.p. 8%, 14 w.p. 6.25%, 12 w.p. 4%, 11 w.p. 6.75%
11, 12, 12    sell the share    13.80                      14 w.p. 6.25%, 12 w.p. 12.5%
11, 11, 11    keep the share    12.65                      20 w.p. 6.25%, 12 w.p. 12.5%

b) What instructions does investor B give to his prosecutor? Justify your answer.

If we observe the process only at time $t = 2$, we will be able to determine which of the following three events happened: $\{\omega_1, \ldots, \omega_6\}$, $\{\omega_7, \ldots, \omega_{13}\}$, $\{\omega_{14}, \omega_{15}, \omega_{16}\}$.

If event $\{\omega_1, \ldots, \omega_6\}$ happened (the price of the share at time $t = 2$ is 15), the probability of obtaining an HPY (from $t = 2$ to $t = 4$) superior to that of the bank account is zero. Hence in that case we sell the share and invest the 15 dollars in the bank account; the value of the portfolio at time $t = 4$ is then $15 \times 1.15 = 17.25$.

If event $\{\omega_7, \ldots, \omega_{13}\}$ happened (the price of the share at time $t = 2$ is 12), the probability of obtaining an HPY (from $t = 2$ to $t = 4$) inferior to that of the bank account is about 53% ($0.37714 + 0.15429$). In addition, the expected HPY is 1.1262, which is lower than the yield of the bank account. Hence in that case we sell the share and invest the 12 dollars in the bank account.
The value of the portfolio at time $t = 4$ is then $12 \times 1.15 = 13.80$.

If event $\{\omega_{14}, \omega_{15}, \omega_{16}\}$ happened (the price of the share at time $t = 2$ is 11), the probability of obtaining an HPY (from $t = 2$ to $t = 4$) inferior to that of the bank account is about 67%. However, the expected HPY is 1.3333, which is superior to the 1.15 of the bank account: when the HPY exceeds the yield of the bank account, it does so by a huge margin (1.8182 versus 1.15). People with a taste for risky investment would
choose to keep the share, especially because even when the HPY is inferior to the yield of the bank account, it is not drastically inferior (1.0909 versus 1.15).

The instructions of investor B are summarized below (the probabilities shown are unconditional):

Value of X2   Decision at t=2   Portfolio at t=4 if sold   Portfolio at t=4 if kept
15            sell the share    17.25                      16 w.p. 18%, 14 w.p. 19.5%
12            sell the share    13.80                      18 w.p. 8%, 14 w.p. 12.5%, 12 w.p. 16.5%, 11 w.p. 6.75%
11            keep the share    12.65                      20 w.p. 6.25%, 12 w.p. 12.5%

7 Problem 3.8

a) Let $\tau_A$ and $\tau_B$ represent the random times at which the shares are sold, respectively by prosecutors A and B. Are $\tau_A$ and $\tau_B$ stopping times? Justify your answer.

$\tau_A$ is a stopping time because
$$\{\omega \in \Omega : \tau_A(\omega) = 0\} = \varnothing \in \mathcal{F}_0,$$
$$\{\omega \in \Omega : \tau_A(\omega) = 1\} = \varnothing \in \mathcal{F}_1,$$
$$\{\omega \in \Omega : \tau_A(\omega) = 2\} = \{\omega_1, \ldots, \omega_6\} \cup \{\omega_{11}, \ldots, \omega_{13}\} \in \mathcal{F}_2,$$
$$\{\omega \in \Omega : \tau_A(\omega) = 3\} = \varnothing \in \mathcal{F}_3,$$
$$\{\omega \in \Omega : \tau_A(\omega) = 4\} = \{\omega_7, \ldots, \omega_{10}\} \cup \{\omega_{14}, \omega_{15}, \omega_{16}\} \in \mathcal{F}_4.$$

$\tau_B$ is a stopping time because
$$\{\omega \in \Omega : \tau_B(\omega) = 0\} = \varnothing \in \mathcal{F}_0, \qquad \{\omega \in \Omega : \tau_B(\omega) = 1\} = \varnothing \in \mathcal{F}_1,$$
$$\{\omega \in \Omega : \tau_B(\omega) = 2\} = \{\omega_1, \ldots, \omega_{13}\} \in \mathcal{F}_2, \qquad \{\omega \in \Omega : \tau_B(\omega) = 3\} = \varnothing \in \mathcal{F}_3,$$
$$\{\omega \in \Omega : \tau_B(\omega) = 4\} = \{\omega_{14}, \omega_{15}, \omega_{16}\} \in \mathcal{F}_4.$$
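This verification amounts to checking, for each $t$, that the event $\{\tau = t\}$ is a union of blocks of the partition generating $\mathcal{F}_t$. A sketch that encodes the filtration of Exercise 3.6 this way (the dictionary representation and the helper `is_stopping_time` are our own encoding, not notation from the course):

```python
# Partitions (blocks) generating F0..F4; outcomes are labelled 1..16.
filtration = {
    0: [set(range(1, 17))],
    1: [set(range(1, 11)), set(range(11, 17))],
    2: [set(range(1, 7)), set(range(7, 11)), set(range(11, 14)), set(range(14, 17))],
    3: [{1, 2}, {3, 4}, {5, 6}, set(range(7, 11)), {11, 12}, {13}, {14, 15}, {16}],
    4: [{w} for w in range(1, 17)],
}

def is_stopping_time(tau):
    """tau maps each outcome to a time; check that {tau = t} lies in F_t for every t."""
    for t, blocks in filtration.items():
        event = {w for w, s in tau.items() if s == t}
        # event is F_t-measurable iff every block is inside it or disjoint from it
        if not all(b <= event or not (b & event) for b in blocks):
            return False
    return True

tau_A = {w: 2 if w in set(range(1, 7)) | set(range(11, 14)) else 4 for w in range(1, 17)}
tau_B = {w: 2 if w <= 13 else 4 for w in range(1, 17)}
assert is_stopping_time(tau_A) and is_stopping_time(tau_B)

# a counterexample: selling at t = 1 only on outcome 1 would require
# information that F1 does not contain, so it is not a stopping time
tau_bad = {w: 1 if w == 1 else 4 for w in range(1, 17)}
assert not is_stopping_time(tau_bad)
```

Note that the probability measure appears nowhere in `is_stopping_time`, which is exactly the point of part b).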
b) Say we replace the probability measure $P$ by $Q$, where $Q(\omega) = \frac{1}{16}$ for every $\omega \in \Omega$. Are $\tau_A$ and $\tau_B$ stopping times in this case? Justify your answer.

Yes, since being (or not being) a stopping time does not depend in any way on the probability measure: it is a property of the random time and the filtered measurable space alone.
Fundamentals of Probability CE 311S OUTLINE Review Elementary set theory Probability fundamentals: outcomes, sample spaces, events Outline ELEMENTARY SET THEORY Basic probability concepts can be cast in
More informationNotes 1 Autumn Sample space, events. S is the number of elements in the set S.)
MAS 108 Probability I Notes 1 Autumn 2005 Sample space, events The general setting is: We perform an experiment which can have a number of different outcomes. The sample space is the set of all possible
More information18.05 Practice Final Exam
No calculators. 18.05 Practice Final Exam Number of problems 16 concept questions, 16 problems. Simplifying expressions Unless asked to explicitly, you don t need to simplify complicated expressions. For
More informationCalculus I. Activity Collection. Featuring real-world contexts: by Frank C. Wilson
Calculus I by Frank C. ilson Activity Collection Featuring real-world contexts: Growing Money High School Students Infant Growth Rates Living with AIDS Movie Ticket Prices PreK - Grade 8 Students Shipping
More informationMath 180B Homework 4 Solutions
Math 80B Homework 4 Solutions Note: We will make repeated use of the following result. Lemma. Let (X n ) be a time-homogeneous Markov chain with countable state space S, let A S, and let T = inf { n 0
More informationLecture 9. Expectations of discrete random variables
18.440: Lecture 9 Expectations of discrete random variables Scott Sheffield MIT 1 Outline Defining expectation Functions of random variables Motivation 2 Outline Defining expectation Functions of random
More informationStat 366 A1 (Fall 2006) Midterm Solutions (October 23) page 1
Stat 366 A1 Fall 6) Midterm Solutions October 3) page 1 1. The opening prices per share Y 1 and Y measured in dollars) of two similar stocks are independent random variables, each with a density function
More informationCISC 1100/1400 Structures of Comp. Sci./Discrete Structures Chapter 7 Probability. Outline. Terminology and background. Arthur G.
CISC 1100/1400 Structures of Comp. Sci./Discrete Structures Chapter 7 Probability Arthur G. Werschulz Fordham University Department of Computer and Information Sciences Copyright Arthur G. Werschulz, 2017.
More informationProperties of Logarithms 2
Properties of Logarithms 2 Write as a single logarithm. 1. 3log a x + 1 2 log a y 3log a z 2. log 2 x log 2 6 + 2log 2 5 log 8 27 Solve for x. 3. log 5 6 log 5 (x 2) = log 5 3 4. 6log 27 x = 1 5. 3 6 x
More informationQuadratic Programming
Quadratic Programming Quadratic programming is a special case of non-linear programming, and has many applications. One application is for optimal portfolio selection, which was developed by Markowitz
More informationSTATISTICAL INDEPENDENCE AND AN INVITATION TO THE Art OF CONDITIONING
STATISTICAL INDEPENDENCE AND AN INVITATION TO THE Art OF CONDITIONING Tutorial 2 STAT1301 Fall 2010 28SEP2010, MB103@HKU By Joseph Dong Look, imagine a remote village where there has been a long drought.
More informationIntroduction to Stochastic Processes
18.445 Introduction to Stochastic Processes Lecture 1: Introduction to finite Markov chains Hao Wu MIT 04 February 2015 Hao Wu (MIT) 18.445 04 February 2015 1 / 15 Course description About this course
More information2a) Graph: y x 3 2b) Graph: y 2x. 3a) Graph: y 4 3 x
Advanced Algebra Name Date: Semester Final Exam Review # ) Given f x x, determine the average rate of change between and 6? fb fa f x 6 6 f 9 Completed Example: ba f 4 7 6, 9, 7 9 7 6 f x 6 8 Unit : Linear
More informationChance, too, which seems to rush along with slack reins, is bridled and governed by law (Boethius, ).
Chapter 2 Probability Chance, too, which seems to rush along with slack reins, is bridled and governed by law (Boethius, 480-524). Blaise Pascal (1623-1662) Pierre de Fermat (1601-1665) Abraham de Moivre
More informationIntroduction to Stochastic Processes
Stat251/551 (Spring 2017) Stochastic Processes Lecture: 1 Introduction to Stochastic Processes Lecturer: Sahand Negahban Scribe: Sahand Negahban 1 Organization Issues We will use canvas as the course webpage.
More informationProblem Set 1. Due: Start of class on February 11.
Massachusetts Institute of Technology Handout 1 6.042J/18.062J: Mathematics for Computer Science February 4, 2003 Professors Charles Leiserson and Srini Devadas Problem Set 1 Due: Start of class on February
More informationProbabilities & Statistics Revision
Probabilities & Statistics Revision Christopher Ting Christopher Ting http://www.mysmu.edu/faculty/christophert/ : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036 January 6, 2017 Christopher Ting QF
More informationM378K In-Class Assignment #1
The following problems are a review of M6K. M7K In-Class Assignment # Problem.. Complete the definition of mutual exclusivity of events below: Events A, B Ω are said to be mutually exclusive if A B =.
More informationProbability: Terminology and Examples Class 2, Jeremy Orloff and Jonathan Bloom
1 Learning Goals Probability: Terminology and Examples Class 2, 18.05 Jeremy Orloff and Jonathan Bloom 1. Know the definitions of sample space, event and probability function. 2. Be able to organize a
More informationAnnouncements. Lecture 5: Probability. Dangling threads from last week: Mean vs. median. Dangling threads from last week: Sampling bias
Recap Announcements Lecture 5: Statistics 101 Mine Çetinkaya-Rundel September 13, 2011 HW1 due TA hours Thursday - Sunday 4pm - 9pm at Old Chem 211A If you added the class last week please make sure to
More information17 Exponential and Logarithmic Functions
17 Exponential and Logarithmic Functions Concepts: Exponential Functions Power Functions vs. Exponential Functions The Definition of an Exponential Function Graphing Exponential Functions Exponential Growth
More informationMATH 1101 Exam 1 Review. Spring 2018
MATH 1101 Exam 1 Review Spring 2018 Topics Covered Section 2.1 Functions in the Real World Section 2.2 Describing the Behavior of Functions Section 2.3 Representing Functions Symbolically Section 2.4 Mathematical
More informationStochastic integral. Introduction. Ito integral. References. Appendices Stochastic Calculus I. Geneviève Gauthier.
Ito 8-646-8 Calculus I Geneviève Gauthier HEC Montréal Riemann Ito The Ito The theories of stochastic and stochastic di erential equations have initially been developed by Kiyosi Ito around 194 (one of
More informationSHORT ANSWER. Write the word or phrase that best completes each statement or answers the question.
Exam Name SHORT ANSWER. Write the word or phrase that best completes each statement or answers the question. You are planning on purchasing a new car and have your eye on a specific model. You know that
More informationDistribusi Binomial, Poisson, dan Hipergeometrik
Distribusi Binomial, Poisson, dan Hipergeometrik CHAPTER TOPICS The Probability of a Discrete Random Variable Covariance and Its Applications in Finance Binomial Distribution Poisson Distribution Hypergeometric
More informationLecture 7. Simple Dynamic Games
Lecture 7. Simple Dynamic Games 1. Two-Stage Games of Complete and Perfect Information Two-Stages dynamic game with two players: player 1 chooses action a 1 from the set of his feasible actions A 1 player
More informationMath489/889 Stochastic Processes and Advanced Mathematical Finance Solutions for Homework 7
Math489/889 Stochastic Processes and Advanced Mathematical Finance Solutions for Homework 7 Steve Dunbar Due Mon, November 2, 2009. Time to review all of the information we have about coin-tossing fortunes
More information6 th Grade - TNREADY REVIEW Q3 Expressions, Equations, Functions, and Inequalities
6 th Grade - TNREADY REVIEW Q3 Expressions, Equations, Functions, and Inequalities INSTRUCTIONS: Read through the following notes. Fill in shaded areas and highlight important reminders. Then complete
More informationMATH 445/545 Homework 1: Due February 11th, 2016
MATH 445/545 Homework 1: Due February 11th, 2016 Answer the following questions Please type your solutions and include the questions and all graphics if needed with the solution 1 A business executive
More informationn N CHAPTER 1 Atoms Thermodynamics Molecules Statistical Thermodynamics (S.T.)
CHAPTER 1 Atoms Thermodynamics Molecules Statistical Thermodynamics (S.T.) S.T. is the key to understanding driving forces. e.g., determines if a process proceeds spontaneously. Let s start with entropy
More informationMathematical studies Standard level Paper 2
Mathematical studies Standard level Paper 2 Friday 5 May 2017 (morning) 1 hour 30 minutes Instructions to candidates ydo not open this examination paper until instructed to do so. ya graphic display calculator
More informationLecture 6 Random walks - advanced methods
Lecture 6: Random wals - advanced methods 1 of 11 Course: M362K Intro to Stochastic Processes Term: Fall 2014 Instructor: Gordan Zitovic Lecture 6 Random wals - advanced methods STOPPING TIMES Our last
More informationMATHEMATICS 154, SPRING 2009 PROBABILITY THEORY Outline #11 (Tail-Sum Theorem, Conditional distribution and expectation)
MATHEMATICS 154, SPRING 2009 PROBABILITY THEORY Outline #11 (Tail-Sum Theorem, Conditional distribution and expectation) Last modified: March 7, 2009 Reference: PRP, Sections 3.6 and 3.7. 1. Tail-Sum Theorem
More informationNormal Random Variables and Probability
Normal Random Variables and Probability An Undergraduate Introduction to Financial Mathematics J. Robert Buchanan 2015 Discrete vs. Continuous Random Variables Think about the probability of selecting
More informationLecture 3 - Axioms of Probability
Lecture 3 - Axioms of Probability Sta102 / BME102 January 25, 2016 Colin Rundel Axioms of Probability What does it mean to say that: The probability of flipping a coin and getting heads is 1/2? 3 What
More informationP [(E and F )] P [F ]
CONDITIONAL PROBABILITY AND INDEPENDENCE WORKSHEET MTH 1210 This worksheet supplements our textbook material on the concepts of conditional probability and independence. The exercises at the end of each
More information18.440: Lecture 25 Covariance and some conditional expectation exercises
18.440: Lecture 25 Covariance and some conditional expectation exercises Scott Sheffield MIT Outline Covariance and correlation Outline Covariance and correlation A property of independence If X and Y
More informationNonlinear Programming (NLP)
Natalia Lazzati Mathematics for Economics (Part I) Note 6: Nonlinear Programming - Unconstrained Optimization Note 6 is based on de la Fuente (2000, Ch. 7), Madden (1986, Ch. 3 and 5) and Simon and Blume
More information18.600: Lecture 24 Covariance and some conditional expectation exercises
18.600: Lecture 24 Covariance and some conditional expectation exercises Scott Sheffield MIT Outline Covariance and correlation Paradoxes: getting ready to think about conditional expectation Outline Covariance
More informationConditional Probability, Independence and Bayes Theorem Class 3, Jeremy Orloff and Jonathan Bloom
Conditional Probability, Independence and Bayes Theorem Class 3, 18.05 Jeremy Orloff and Jonathan Bloom 1 Learning Goals 1. Know the definitions of conditional probability and independence of events. 2.
More information18.05 Final Exam. Good luck! Name. No calculators. Number of problems 16 concept questions, 16 problems, 21 pages
Name No calculators. 18.05 Final Exam Number of problems 16 concept questions, 16 problems, 21 pages Extra paper If you need more space we will provide some blank paper. Indicate clearly that your solution
More informationP (A) = P (B) = P (C) = P (D) =
STAT 145 CHAPTER 12 - PROBABILITY - STUDENT VERSION The probability of a random event, is the proportion of times the event will occur in a large number of repititions. For example, when flipping a coin,
More informationSolution: Solution: Solution:
Chapter 5: Exponential and Logarithmic Functions a. The exponential growth function is y = f(t) = ab t, where a = 2000 because the initial population is 2000 squirrels The annual growth rate is 3% per
More informationLESSON 2 PRACTICE PROBLEMS KEY
LESSON PRACTICE PROBLEMS KEY 1)If x -11= 1, then x = 4 d) 16 x 11 1 4 x 1 4 4 4 x 1 4 x 16 ) If 7x + 6y = 15 and 4x 6y = 18, what is the value of x? a) Line the equations up vertically: 7x 6y 15 4x 6y
More informationMock Exam - 2 hours - use of basic (non-programmable) calculator is allowed - all exercises carry the same marks - exam is strictly individual
Mock Exam - 2 hours - use of basic (non-programmable) calculator is allowed - all exercises carry the same marks - exam is strictly individual Question 1. Suppose you want to estimate the percentage of
More informationMATH1231 Algebra, 2017 Chapter 9: Probability and Statistics
MATH1231 Algebra, 2017 Chapter 9: Probability and Statistics A/Prof. Daniel Chan School of Mathematics and Statistics University of New South Wales danielc@unsw.edu.au Daniel Chan (UNSW) MATH1231 Algebra
More informationRVs and their probability distributions
RVs and their probability distributions RVs and their probability distributions In these notes, I will use the following notation: The probability distribution (function) on a sample space will be denoted
More informationExpected Value and Variance
Expected Value and Variance This handout, reorganizes the material in. of the text. It also adds to it slightly (Theorem on p.) and presents two results from chapter of the text, because they fit in here
More informationSee animations and interactive applets of some of these at. Fall_2009/Math123/Notes
MA123, Chapter 7 Word Problems (pp. 125-153) Chapter s Goal: In this chapter we study the two main types of word problems in Calculus. Optimization Problems. i.e., max - min problems Related Rates See
More information