Conditional Expectation and Independence Exercises

Exercise 3.1. Show the four following properties: for any random variables $X$ and $Y$ and for any real numbers $a$ and $b$,

(E1) $E^P[aX + bY] = a E^P[X] + b E^P[Y]$;
(E2) if $X(\omega) \le Y(\omega)$ for all $\omega \in \Omega$, then $E^P[X] \le E^P[Y]$;
(E3) in general, $E^P[XY] \ne E^P[X] \, E^P[Y]$;
(E4) if $X$ and $Y$ are independent, then $E^P[XY] = E^P[X] \, E^P[Y]$.

Exercise 3.2. Show that for any random variables $X$ and $Y$ and any real numbers $a$ and $b$,

(V1) $\mathrm{Var}^P[X] \ge 0$;
(V2) $\mathrm{Var}^P[X] = E^P[X^2] - (E^P[X])^2$;
(V3) $\forall a, b \in \mathbb{R}$, $\mathrm{Var}^P[aX + b] = a^2 \, \mathrm{Var}^P[X]$;
(V4) $\mathrm{Var}^P[X + Y] = \mathrm{Var}^P[X] + \mathrm{Var}^P[Y] + 2 \, \mathrm{Cov}^P[X, Y]$.

Exercise 3.3. Show that for any random variables $X$, $X_1$, $X_2$ and $Y$ and for any real numbers $a$ and $b$,

(C1) $\mathrm{Cov}^P[X, Y] = E^P[XY] - E^P[X] \, E^P[Y]$;
(C2) if $X$ and $Y$ are independent, then $\mathrm{Cov}^P[X, Y] = 0$;
(C3) $\forall a, b \in \mathbb{R}$, $\mathrm{Cov}^P[aX_1 + bX_2, Y] = a \, \mathrm{Cov}^P[X_1, Y] + b \, \mathrm{Cov}^P[X_2, Y]$.
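As a quick sanity check (not part of the exercises), the identities (E1), (E3)/(E4) and (C1)/(C2) can be verified numerically on a small finite joint distribution; the supports, coefficients and random seed below are arbitrary choices made for the illustration.

```python
# Sketch (illustrative only): numerical check of (E1), (E3)/(E4) and (C1)/(C2)
# on a small finite joint distribution.
import numpy as np

rng = np.random.default_rng(0)

xs = np.array([-1.0, 0.0, 2.0])                     # support of X
ys = np.array([1.0, 3.0, 4.0])                      # support of Y
joint = rng.random((3, 3)); joint /= joint.sum()    # a (generally dependent) joint law f_{X,Y}
fx, fy = joint.sum(axis=1), joint.sum(axis=0)       # marginals f_X, f_Y

EX, EY = float(xs @ fx), float(ys @ fy)
EXY = float(np.sum(np.outer(xs, ys) * joint))

# (E1): E[aX + bY] = aE[X] + bE[Y], here with a = 2, b = 3
lhs = float(np.sum((2 * xs[:, None] + 3 * ys[None, :]) * joint))
print(np.isclose(lhs, 2 * EX + 3 * EY))             # True

# (E3)/(C1): for this dependent law the gap E[XY] - E[X]E[Y] is Cov[X, Y] (typically nonzero)
print(EXY - EX * EY)

# (E4)/(C2): under independence (product of the marginals) the gap vanishes
indep = np.outer(fx, fy)
print(np.isclose(np.sum(np.outer(xs, ys) * indep), EX * EY))   # True
```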

Exercise 3.4. Let us revisit a problem treated in Chapter 2. Today is Monday and you have one dollar in your piggy bank. Starting tomorrow, every morning until Friday (inclusively), you toss a coin. If the result is tails, you withdraw one dollar (if possible) from the piggy bank. If the result is heads, you deposit one dollar in it. What is the conditional expectation of the amount of money in the bank Friday noon, given the information available by Wednesday noon?

Exercise 3.5. Recall Exercise 2.4 on the gambler's ruin. Calculate $E[X_4 \mid \mathcal{F}_0]$, $E[X_4 \mid \mathcal{F}_2]$, $E[Y_4 \mid \mathcal{G}_2]$ and $E[Y_2 \mid \mathcal{G}_4]$ for an arbitrary $p$ and for the specific case $p = \frac{1}{2}$.

Exercise 3.6. Justify all your answers. Let the stochastic process $X = \{X_t : t \in \{0, 1, 2, 3, 4\}\}$ represent the price of a share through time. We consider a time interval of 6 months.

        X_0(ω)  X_1(ω)  X_2(ω)  X_3(ω)  X_4(ω)   P(ω)
ω_1       11      13      15      16      16     0.06
ω_2       11      13      15      16      14     0.065
ω_3       11      13      15      15      16     0.06
ω_4       11      13      15      15      14     0.065
ω_5       11      13      15      14      16     0.06
ω_6       11      13      15      14      14     0.065
ω_7       11      13      12      13      14     0.0625
ω_8       11      13      12      13      18     0.08
ω_9       11      13      12      13      12     0.04
ω_10      11      13      12      13      11     0.0675
ω_11      11      11      12      13      14     0.0625
ω_12      11      11      12      13      12     0.0625
ω_13      11      11      12      12      12     0.0625
ω_14      11      11      11      14      20     0.0625
ω_15      11      11      11      14      12     0.0625
ω_16      11      11      11      12      12     0.0625

a) What filtration is generated by the process?
b) Give the conditional distribution of $X_4$, knowing $X_2$.
c) Calculate the conditional expectation of $X_4$, knowing $X_2$.
d) Give the conditional distribution of $X_4$, knowing $\mathcal{F}_2$.
e) Calculate the conditional expectation of $X_4$, knowing $\mathcal{F}_2$.

Problem 3.7. Let us remain in the context of Problem 3.6. Two investors, call them A and B, buy a share today (we could say a hundred thousand shares, but it would only add a bunch of zeros to our calculations) which they can sell at the end of this year ($t = 2$) or

at the end of the next year ($t = 4$). If an investor chooses to sell his share at time $t = 2$, he then places the money in a bank account yielding 15% interest per year (the interest is compounded annually, at times $t = 2$ and $t = 4$). If instead he chooses not to sell the share at time $t = 2$, he will have to sell it at time $t = 4$. These two investors will be away for the next five years because they are taking a (possibly metaphorical, this is up to you) trip to Venus. Hence they both appoint agents (talented and trustworthy) to manage the operations in their place. But they are out of luck: both agents are unfamiliar with money matters (we said talented?), so the two investors must give precise instructions today to their respective agents on what to do over the next year, depending on how the share price will have evolved during that period. It goes without saying that both investors aim at maximizing their profit. The agent of investor A will check the price of the stock at time $t = 1$, but the agent of investor B will not (he might not be the most suitable candidate for this job, don't you think?). Said otherwise, the observation times for agent A are $t = 1$ and $t = 2$, while for agent B the only observation time is $t = 2$.

a) What instructions does investor A give to his agent?
b) What instructions does investor B give to his agent?

To facilitate comparisons, use the holding period yield (HPY). The HPY of a share on a given time interval (say we buy the share at the beginning of period $t_1$ and sell it at the end of period $t_2$) is $R_{t_1, t_2} = X_{t_2} / X_{t_1}$.

Problem 3.8. This problem uses the notation defined in Problems 3.6 and 3.7.

a) Let $\tau_A$ and $\tau_B$ represent the random times at which the shares are sold, respectively by agents A and B. Are $\tau_A$ and $\tau_B$ stopping times?
b) Say we replace the probability measure $P$ by $Q$, where $Q(\omega) = \frac{1}{16}$ for every $\omega \in \Omega$. Are $\tau_A$ and $\tau_B$ stopping times in this case?

Problem 3.9. Let $X_1, X_2, X_3$ be independent random variables such that $P(X_i = 1) = P(X_i = -1) = \frac{1}{2}$. Let us define $A_1 = \{X_2 = X_3\}$, $A_2 = \{X_1 = X_3\}$ and $A_3 = \{X_1 = X_2\}$. Show that the events $A_1$, $A_2$ and $A_3$ are pairwise independent, but not mutually independent.

Problem 3.10. Let $X$ and $Y$ be two independent random variables, both with standard normal distribution. Let $\varepsilon$ be a random variable independent of $Y$ and such that $P\{\varepsilon = 1\} = P\{\varepsilon = -1\} = \frac{1}{2}$.

a) Show that $Z = \varepsilon Y$ is also normal but that $Y + Z$ is not.
b) Show that $Y$ and $Z$ are not independent, although $\mathrm{Cov}(Z, Y) = 0$.

Problem 3.11. In the case where $X$ and $Y$ are independent and identically distributed random variables, show that $E[X - Y \mid X + Y] = 0$.
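As an aside (not part of the original problem set), Problem 3.9 can be checked by brute force before proving it: the sketch below enumerates the eight equally likely outcomes of $(X_1, X_2, X_3)$ and compares the pairwise and triple intersection probabilities with the corresponding products.

```python
# Sketch (a numerical check of Problem 3.9, not the requested proof): enumerate
# the 8 equally likely outcomes of (X_1, X_2, X_3).
from itertools import product

outcomes = list(product((-1, 1), repeat=3))        # each outcome has probability 1/8

def prob(event):
    return sum(1 for w in outcomes if event(w)) / len(outcomes)

A = [lambda w: w[1] == w[2],     # A_1 = {X_2 = X_3}
     lambda w: w[0] == w[2],     # A_2 = {X_1 = X_3}
     lambda w: w[0] == w[1]]     # A_3 = {X_1 = X_2}

for i in range(3):
    for j in range(i + 1, 3):
        both = prob(lambda w: A[i](w) and A[j](w))
        print(i + 1, j + 1, both, prob(A[i]) * prob(A[j]))   # 0.25 = 0.5 * 0.5 (pairwise independent)

all_three = prob(lambda w: all(a(w) for a in A))
print(all_three, prob(A[0]) * prob(A[1]) * prob(A[2]))       # 0.25 != 0.125 (not mutually independent)
```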

Solutions

1 Exercise 3.1

Proof of (E1): $E[aX + bY] = a E[X] + b E[Y]$.

Since
$$\sum_{y \in S_Y} f_{X,Y}(x, y) = \sum_{y \in S_Y} P[X = x \text{ and } Y = y]
= P\Big[\bigcup_{y \in S_Y} \{X = x \text{ and } Y = y\}\Big] \quad \text{because those events are disjoint}$$
$$= P\Big[\bigcup_{y \in S_Y} \big(\{X = x\} \cap \{Y = y\}\big)\Big]
= P\Big[\{X = x\} \cap \Big(\bigcup_{y \in S_Y} \{Y = y\}\Big)\Big]
= P[\{X = x\} \cap \Omega] = P[\{X = x\}] = f_X(x), \tag{1}$$

hence
$$E[aX + bY] = \sum_{x \in S_X} \sum_{y \in S_Y} (ax + by) \, f_{X,Y}(x, y)
= \sum_{x \in S_X} \sum_{y \in S_Y} ax \, f_{X,Y}(x, y) + \sum_{x \in S_X} \sum_{y \in S_Y} by \, f_{X,Y}(x, y)$$
$$= a \sum_{x \in S_X} x \sum_{y \in S_Y} f_{X,Y}(x, y) + b \sum_{y \in S_Y} y \sum_{x \in S_X} f_{X,Y}(x, y)
= a \sum_{x \in S_X} x f_X(x) + b \sum_{y \in S_Y} y f_Y(y) = a E[X] + b E[Y].$$

Proof of (E2): $X \le Y \Rightarrow E[X] \le E[Y]$.

To begin with, note that if $W$ is a non-negative random variable, then $E[W] \ge 0$. Indeed,
$$E[W] = \sum_{x \in S_W} \underbrace{x}_{\ge 0} \, \underbrace{f_W(x)}_{\ge 0} \ge 0.$$
Let $W = Y - X$. Since $X \le Y$, then $W \ge 0$, hence $E[W] \ge 0$. But using (E1), we have that
$$E[W] = E[Y - X] = E[Y] - E[X].$$
Therefore,
$$0 \le E[Y] - E[X] \iff E[X] \le E[Y].$$

2 Exercise 3.3

Proof of (C1): $\mathrm{Cov}[X, Y] = E[XY] - E[X] \, E[Y]$.

$$\mathrm{Cov}[X, Y] = \sum_{x \in S_X} \sum_{y \in S_Y} (x - E[X])(y - E[Y]) \, f_{X,Y}(x, y)$$
$$= \sum_{x \in S_X} \sum_{y \in S_Y} \big(xy - x E[Y] - y E[X] + E[X] E[Y]\big) \, f_{X,Y}(x, y)$$
$$= \sum_{x \in S_X} \sum_{y \in S_Y} xy \, f_{X,Y}(x, y) - E[Y] \sum_{x \in S_X} x \sum_{y \in S_Y} f_{X,Y}(x, y) - E[X] \sum_{y \in S_Y} y \sum_{x \in S_X} f_{X,Y}(x, y) + E[X] E[Y] \sum_{x \in S_X} \sum_{y \in S_Y} f_{X,Y}(x, y)$$
$$= \sum_{x \in S_X} \sum_{y \in S_Y} xy \, f_{X,Y}(x, y) - E[Y] \sum_{x \in S_X} x f_X(x) - E[X] \sum_{y \in S_Y} y f_Y(y) + E[X] E[Y] \underbrace{\sum_{x \in S_X} f_X(x)}_{=1 \text{ by definition of a density function}}$$
$$= E[XY] - E[Y] E[X] - E[X] E[Y] + E[X] E[Y] = E[XY] - E[X] E[Y].$$

Proof of (C2): If $X$ and $Y$ are independent, then $\mathrm{Cov}[X, Y] = 0$.

Since independence of $X$ and $Y$ implies $E[XY] = E[X] E[Y]$, to obtain the result we only have to invoke (C1):
$$\mathrm{Cov}[X, Y] = E[XY] - E[X] E[Y] = E[X] E[Y] - E[X] E[Y] = 0.$$

Proof of (C3): $\mathrm{Cov}[a_1 X_1 + a_2 X_2, Y] = a_1 \mathrm{Cov}[X_1, Y] + a_2 \mathrm{Cov}[X_2, Y]$.

By (C1), we have that
$$\mathrm{Cov}[a_1 X_1 + a_2 X_2, Y] = E[(a_1 X_1 + a_2 X_2) Y] - E[a_1 X_1 + a_2 X_2] \, E[Y]$$
$$= E[a_1 X_1 Y + a_2 X_2 Y] - (a_1 E[X_1] + a_2 E[X_2]) E[Y] \quad \text{by (E1)}$$
$$= a_1 E[X_1 Y] + a_2 E[X_2 Y] - a_1 E[X_1] E[Y] - a_2 E[X_2] E[Y] \quad \text{by (E1)}$$
$$= a_1 \big(E[X_1 Y] - E[X_1] E[Y]\big) + a_2 \big(E[X_2 Y] - E[X_2] E[Y]\big)$$
$$= a_1 \mathrm{Cov}[X_1, Y] + a_2 \mathrm{Cov}[X_2, Y].$$

Proof of (V4): $\mathrm{Var}[X + Y] = \mathrm{Var}[X] + \mathrm{Var}[Y] + 2 \, \mathrm{Cov}[X, Y]$.

$$\mathrm{Var}[X + Y] = E\big[(X + Y)^2\big] - (E[X + Y])^2 \quad \text{by (V2)}$$
$$= E\big[X^2 + 2XY + Y^2\big] - (E[X] + E[Y])^2 \quad \text{by (E1)}$$
$$= E[X^2] + 2 E[XY] + E[Y^2] - (E[X])^2 - 2 E[X] E[Y] - (E[Y])^2$$
$$= E[X^2] - (E[X])^2 + E[Y^2] - (E[Y])^2 + 2 \big(E[XY] - E[X] E[Y]\big)$$
$$= \mathrm{Var}[X] + \mathrm{Var}[Y] + 2 \, \mathrm{Cov}[X, Y] \quad \text{using (V2) and (C1)}.$$
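A quick numerical sanity check of (C3) and (V4), not part of the original solutions: random variables defined on a small uniform probability space, with arbitrary coefficients $a_1$, $a_2$.

```python
# Sketch (illustrative): check the bilinearity (C3) and the variance
# decomposition (V4) on random variables defined on a finite uniform space.
import numpy as np

rng = np.random.default_rng(1)
n = 8                                    # Omega = {0, ..., 7}, uniform weights 1/8
X1, X2, Y = rng.integers(-3, 4, n), rng.integers(-3, 4, n), rng.integers(-3, 4, n)

E = lambda Z: Z.mean()                   # expectation under the uniform measure
cov = lambda U, V: E(U * V) - E(U) * E(V)
var = lambda U: cov(U, U)

a1, a2 = 2.0, -5.0
print(np.isclose(cov(a1 * X1 + a2 * X2, Y), a1 * cov(X1, Y) + a2 * cov(X2, Y)))  # (C3)
print(np.isclose(var(X1 + Y), var(X1) + var(Y) + 2 * cov(X1, Y)))                # (V4)
```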

3 Exercise 3.4

The blocks of $\mathcal{F}_2$ are
$$B_1 = \{TTTT, TTTH, TTHT, TTHH\}, \quad B_2 = \{THTT, THTH, THHT, THHH\},$$
$$B_3 = \{HTTT, HTTH, HTHT, HTHH\}, \quad B_4 = \{HHTT, HHTH, HHHT, HHHH\}.$$

Since

ω       P(ω)            X_4(ω)
TTTT    p^4               0
TTTH    p^3(1-p)          1
TTHT    p^3(1-p)          0
TTHH    p^2(1-p)^2        2
THTT    p^3(1-p)          0
THTH    p^2(1-p)^2        1
THHT    p^2(1-p)^2        1
THHH    p(1-p)^3          3
HTTT    p^3(1-p)          0
HTTH    p^2(1-p)^2        1
HTHT    p^2(1-p)^2        1
HTHH    p(1-p)^3          3
HHTT    p^2(1-p)^2        1
HHTH    p(1-p)^3          3
HHHT    p(1-p)^3          3
HHHH    (1-p)^4           5

then

$$P(B_1) = p^4 + 2p^3(1-p) + p^2(1-p)^2 = p^2$$
$$P(B_2) = p^3(1-p) + 2p^2(1-p)^2 + p(1-p)^3 = p - p^2 = p(1-p)$$
$$P(B_3) = p^3(1-p) + 2p^2(1-p)^2 + p(1-p)^3 = p - p^2 = p(1-p)$$
$$P(B_4) = p^2(1-p)^2 + 2p(1-p)^3 + (1-p)^4 = 1 - 2p + p^2 = (1-p)^2$$

Recall that if $\omega \in B_i$, then
$$E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{1}{P(B_i)} \sum_{\omega' \in B_i} X_4(\omega') P(\omega').$$

Therefore,

If $\omega \in B_1$ then
$$E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{1}{P(B_1)} \sum_{\omega' \in B_1} X_4(\omega') P(\omega')
= \frac{1}{p^2} \big( 0 \cdot p^4 + 1 \cdot p^3(1-p) + 0 \cdot p^3(1-p) + 2 \cdot p^2(1-p)^2 \big) = (1-p)(2-p).$$

If $\omega \in B_2$ then
$$E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{1}{P(B_2)} \sum_{\omega' \in B_2} X_4(\omega') P(\omega')
= \frac{1}{p(1-p)} \big( 0 \cdot p^3(1-p) + 1 \cdot p^2(1-p)^2 + 1 \cdot p^2(1-p)^2 + 3 \cdot p(1-p)^3 \big) = (3-p)(1-p).$$

If $\omega \in B_3$ then
$$E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{1}{P(B_3)} \sum_{\omega' \in B_3} X_4(\omega') P(\omega')
= \frac{1}{p(1-p)} \big( 0 \cdot p^3(1-p) + 1 \cdot p^2(1-p)^2 + 1 \cdot p^2(1-p)^2 + 3 \cdot p(1-p)^3 \big) = (3-p)(1-p).$$

If $\omega \in B_4$ then
$$E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{1}{P(B_4)} \sum_{\omega' \in B_4} X_4(\omega') P(\omega')
= \frac{1}{(1-p)^2} \big( 1 \cdot p^2(1-p)^2 + 3 \cdot p(1-p)^3 + 3 \cdot p(1-p)^3 + 5 \cdot (1-p)^4 \big) = 5 - 4p.$$
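The block-by-block results above can be checked by brute force. The sketch below (not from the original text) enumerates the sixteen coin-toss paths of Exercise 3.4, assuming, as in the table above, that $p$ is the probability of tails; $p = 1/2$ is chosen so the output matches the values given next.

```python
# Sketch: brute-force check of E[X_4 | F_2] for the piggy-bank problem,
# assuming p is the probability of tails on each toss.
from itertools import product
from fractions import Fraction

p = Fraction(1, 2)   # reproduces 3/4, 5/4, 5/4, 3 below; any other p works too

def friday_amount(path):
    """Amount in the piggy bank after the four tosses, starting from 1 dollar."""
    amount = 1
    for toss in path:
        if toss == "T":
            amount = max(amount - 1, 0)   # withdraw only if possible
        else:
            amount += 1                   # deposit on heads
    return amount

paths = map("".join, product("TH", repeat=4))
prob = {w: p ** w.count("T") * (1 - p) ** w.count("H") for w in paths}

# The blocks of F_2 are determined by the first two tosses.
for prefix in ("TT", "TH", "HT", "HH"):
    block = [w for w in prob if w.startswith(prefix)]
    pb = sum(prob[w] for w in block)
    cond_exp = sum(friday_amount(w) * prob[w] for w in block) / pb
    print(prefix, cond_exp)

# General-p formulas from the solution, for comparison:
print((1 - p) * (2 - p), (3 - p) * (1 - p), 5 - 4 * p)
```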

In the specific case of $p = \frac{1}{2}$,

If $\omega \in B_1$ then $E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{3}{4}$
If $\omega \in B_2$ then $E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{5}{4}$
If $\omega \in B_3$ then $E^P[X_4 \mid \mathcal{F}_2](\omega) = \frac{5}{4}$
If $\omega \in B_4$ then $E^P[X_4 \mid \mathcal{F}_2](\omega) = 3$

4 Exercise 3.5

$$E[X_4 \mid \mathcal{F}_0] = 2 (1-p)^4 + 4 \cdot 4p(1-p)^3 + 6 \cdot 6p^2(1-p)^2 + 8 \cdot 4p^3(1-p) + 10 \cdot p^4 = 2 + 8p.$$

If $p = \frac{1}{2}$, then $E[X_4 \mid \mathcal{F}_0] = 6$.

If $\omega \in B_1 = \{TTTT, TTTH, TTHT, TTHH\}$, then
$$E[X_4 \mid \mathcal{F}_2](\omega) = \frac{10 p^4 + 8 p^3(1-p) + 8 p^3(1-p) + 6 p^2(1-p)^2}{p^4 + p^3(1-p) + p^3(1-p) + p^2(1-p)^2} = 4p + 6.$$

If $\omega \in B_2 = \{THTT, THTH, THHT, THHH\}$, then
$$E[X_4 \mid \mathcal{F}_2](\omega) = \frac{8 p^3(1-p) + 6 p^2(1-p)^2 + 6 p^2(1-p)^2 + 4 p(1-p)^3}{p^3(1-p) + p^2(1-p)^2 + p^2(1-p)^2 + p(1-p)^3} = 4p + 4.$$

If $\omega \in B_3 = \{HTTT, HTTH, HTHT, HTHH\}$, then
$$E[X_4 \mid \mathcal{F}_2](\omega) = \frac{8 p^3(1-p) + 6 p^2(1-p)^2 + 6 p^2(1-p)^2 + 4 p(1-p)^3}{p^3(1-p) + p^2(1-p)^2 + p^2(1-p)^2 + p(1-p)^3} = 4p + 4.$$

If $\omega \in B_4 = \{HHTT, HHTH, HHHT, HHHH\}$, then
$$E[X_4 \mid \mathcal{F}_2](\omega) = \frac{6 p^2(1-p)^2 + 4 p(1-p)^3 + 4 p(1-p)^3 + 2 (1-p)^4}{p^2(1-p)^2 + p(1-p)^3 + p(1-p)^3 + (1-p)^4} = 4p + 2.$$

If $p = \frac{1}{2}$, then
$$E[X_4 \mid \mathcal{F}_2](\omega) = \begin{cases} 8 & \text{if } \omega \in B_1 = \{TTTT, TTTH, TTHT, TTHH\} \\ 6 & \text{if } \omega \in B_2 = \{THTT, THTH, THHT, THHH\} \\ 6 & \text{if } \omega \in B_3 = \{HTTT, HTTH, HTHT, HTHH\} \\ 4 & \text{if } \omega \in B_4 = \{HHTT, HHTH, HHHT, HHHH\} \end{cases}$$

If $\omega \in B_1 = \{TTTT, TTTH, TTHT, TTHH, THTT, THTH, THHT, THHH\}$, then $E[Y_4 \mid \mathcal{G}_2](\omega) = 10$.

If $\omega \in B_3 = \{HTTT, HTTH, HTHT, HTHH\}$, then
$$E[Y_4 \mid \mathcal{G}_2](\omega) = \frac{10 p^3(1-p) + 6 p^2(1-p)^2 + 0 \cdot p^2(1-p)^2 + 0 \cdot p(1-p)^3}{p^3(1-p) + p^2(1-p)^2 + p^2(1-p)^2 + p(1-p)^3} = 2p(2p + 3).$$

If $\omega \in B_4 = \{HHTT, HHTH, HHHT, HHHH\}$, then $E[Y_4 \mid \mathcal{G}_2](\omega) = 0$.

If $p = \frac{1}{2}$, then
$$E[Y_4 \mid \mathcal{G}_2](\omega) = \begin{cases} 10 & \text{if } \omega \in B_1 = \{TTTT, TTTH, TTHT, TTHH, THTT, THTH, THHT, THHH\} \\ 4 & \text{if } \omega \in B_3 = \{HTTT, HTTH, HTHT, HTHH\} \\ 0 & \text{if } \omega \in B_4 = \{HHTT, HHTH, HHHT, HHHH\} \end{cases}$$

If $\omega \in C_1 = \{TTTT, TTTH, TTHT, TTHH, THTT, THTH, THHT, THHH\}$, then $E[Y_2 \mid \mathcal{G}_4](\omega) = 10$.
If $\omega \in C_2 = \{HTTT\}$, then $E[Y_2 \mid \mathcal{G}_4](\omega) = 4$.
If $\omega \in C_3 = \{HTTH\}$, then $E[Y_2 \mid \mathcal{G}_4](\omega) = 4$.
If $\omega \in C_4 = \{HTHT, HTHH\}$, then $E[Y_2 \mid \mathcal{G}_4](\omega) = 4$.
If $\omega \in C_5 = \{HHTT, HHTH, HHHT, HHHH\}$, then $E[Y_2 \mid \mathcal{G}_4](\omega) = 0$.

5 Exercise 3.6

a) What filtration is generated by the process?

$\mathcal{F}_0 = \{\emptyset, \Omega\}$
$\mathcal{F}_1$ is generated by the blocks $\{\omega_1, \ldots, \omega_{10}\}$, $\{\omega_{11}, \ldots, \omega_{16}\}$
$\mathcal{F}_2$ is generated by the blocks $\{\omega_1, \ldots, \omega_6\}$, $\{\omega_7, \ldots, \omega_{10}\}$, $\{\omega_{11}, \ldots, \omega_{13}\}$, $\{\omega_{14}, \omega_{15}, \omega_{16}\}$
$\mathcal{F}_3$ is generated by the blocks $\{\omega_1, \omega_2\}$, $\{\omega_3, \omega_4\}$, $\{\omega_5, \omega_6\}$, $\{\omega_7, \ldots, \omega_{10}\}$, $\{\omega_{11}, \omega_{12}\}$, $\{\omega_{13}\}$, $\{\omega_{14}, \omega_{15}\}$, $\{\omega_{16}\}$
$\mathcal{F}_4$ is generated by the blocks $\{\omega_1\}, \{\omega_2\}, \ldots, \{\omega_{16}\}$ (all the singletons)

b) Give the conditional distribution of $X_4$, knowing $X_2$.

$$P[X_4 = x \mid X_2 = 15] = \begin{cases} \dfrac{3 \cdot 0.06}{3 \cdot 0.06 + 3 \cdot 0.065} = 0.48 & \text{if } x = 16 \\ \dfrac{3 \cdot 0.065}{3 \cdot 0.06 + 3 \cdot 0.065} = 0.52 & \text{if } x = 14 \\ 0 & \text{otherwise} \end{cases}$$

$$P[X_4 = x \mid X_2 = 12] = \begin{cases} \dfrac{0.08}{4 \cdot 0.0625 + 0.08 + 0.04 + 0.0675} = 0.18286 & \text{if } x = 18 \\ \dfrac{2 \cdot 0.0625}{4 \cdot 0.0625 + 0.08 + 0.04 + 0.0675} = 0.28571 & \text{if } x = 14 \\ \dfrac{0.04 + 2 \cdot 0.0625}{4 \cdot 0.0625 + 0.08 + 0.04 + 0.0675} = 0.37714 & \text{if } x = 12 \\ \dfrac{0.0675}{4 \cdot 0.0625 + 0.08 + 0.04 + 0.0675} = 0.15429 & \text{if } x = 11 \\ 0 & \text{otherwise} \end{cases}$$

$$P[X_4 = x \mid X_2 = 11] = \begin{cases} \dfrac{0.0625}{3 \cdot 0.0625} = \dfrac{1}{3} & \text{if } x = 20 \\ \dfrac{2 \cdot 0.0625}{3 \cdot 0.0625} = \dfrac{2}{3} & \text{if } x = 12 \\ 0 & \text{otherwise} \end{cases}$$

c) Calculate the conditional expectation of $X_4$, knowing $X_2$.

If $\omega \in \{\omega_1, \ldots, \omega_6\}$:
$$E^P[X_4 \mid \sigma(X_2)](\omega) = 16 \cdot \frac{3 \cdot 0.06}{3 \cdot 0.06 + 3 \cdot 0.065} + 14 \cdot \frac{3 \cdot 0.065}{3 \cdot 0.06 + 3 \cdot 0.065} = 14.96$$

If $\omega \in \{\omega_7, \ldots, \omega_{13}\}$:
$$E^P[X_4 \mid \sigma(X_2)](\omega) = 18 \cdot \frac{0.08}{4 \cdot 0.0625 + 0.08 + 0.04 + 0.0675} + 14 \cdot \frac{2 \cdot 0.0625}{4 \cdot 0.0625 + 0.08 + 0.04 + 0.0675} + 12 \cdot \frac{0.04 + 2 \cdot 0.0625}{4 \cdot 0.0625 + 0.08 + 0.04 + 0.0675} + 11 \cdot \frac{0.0675}{4 \cdot 0.0625 + 0.08 + 0.04 + 0.0675} = 13.514$$

If $\omega \in \{\omega_{14}, \omega_{15}, \omega_{16}\}$:
$$E^P[X_4 \mid \sigma(X_2)](\omega) = 20 \cdot \frac{1}{3} + 12 \cdot \frac{2}{3} = 14.667$$

d) Give the conditional distribution of $X_4$, knowing $\mathcal{F}_2$.

$$P[X_4 = x \mid \{\omega_1, \ldots, \omega_6\}] = \begin{cases} \dfrac{3 \cdot 0.06}{3 \cdot 0.06 + 3 \cdot 0.065} = 0.48 & \text{if } x = 16 \\ \dfrac{3 \cdot 0.065}{3 \cdot 0.06 + 3 \cdot 0.065} = 0.52 & \text{if } x = 14 \\ 0 & \text{otherwise} \end{cases}$$

$$P[X_4 = x \mid \{\omega_7, \ldots, \omega_{10}\}] = \begin{cases} \dfrac{0.08}{0.0625 + 0.08 + 0.04 + 0.0675} = 0.32 & \text{if } x = 18 \\ \dfrac{0.0625}{0.0625 + 0.08 + 0.04 + 0.0675} = 0.25 & \text{if } x = 14 \\ \dfrac{0.04}{0.0625 + 0.08 + 0.04 + 0.0675} = 0.16 & \text{if } x = 12 \\ \dfrac{0.0675}{0.0625 + 0.08 + 0.04 + 0.0675} = 0.27 & \text{if } x = 11 \\ 0 & \text{otherwise} \end{cases}$$

$$P[X_4 = x \mid \{\omega_{11}, \ldots, \omega_{13}\}] = \begin{cases} \dfrac{0.0625}{3 \cdot 0.0625} = \dfrac{1}{3} & \text{if } x = 14 \\ \dfrac{2 \cdot 0.0625}{3 \cdot 0.0625} = \dfrac{2}{3} & \text{if } x = 12 \\ 0 & \text{otherwise} \end{cases}$$

$$P[X_4 = x \mid \{\omega_{14}, \omega_{15}, \omega_{16}\}] = \begin{cases} \dfrac{0.0625}{3 \cdot 0.0625} = \dfrac{1}{3} & \text{if } x = 20 \\ \dfrac{2 \cdot 0.0625}{3 \cdot 0.0625} = \dfrac{2}{3} & \text{if } x = 12 \\ 0 & \text{otherwise} \end{cases}$$

e) Calculate the conditional expectation of $X_4$, knowing $\mathcal{F}_2$.

If $\omega \in \{\omega_1, \ldots, \omega_6\}$:
$$E^P[X_4 \mid \mathcal{F}_2](\omega) = 16 \cdot \frac{3 \cdot 0.06}{3 \cdot 0.06 + 3 \cdot 0.065} + 14 \cdot \frac{3 \cdot 0.065}{3 \cdot 0.06 + 3 \cdot 0.065} = 14.96$$

If $\omega \in \{\omega_7, \ldots, \omega_{10}\}$:
$$E^P[X_4 \mid \mathcal{F}_2](\omega) = 18 \cdot \frac{0.08}{0.0625 + 0.08 + 0.04 + 0.0675} + 14 \cdot \frac{0.0625}{0.0625 + 0.08 + 0.04 + 0.0675} + 12 \cdot \frac{0.04}{0.0625 + 0.08 + 0.04 + 0.0675} + 11 \cdot \frac{0.0675}{0.0625 + 0.08 + 0.04 + 0.0675} = 14.15$$

If $\omega \in \{\omega_{11}, \omega_{12}, \omega_{13}\}$:
$$E^P[X_4 \mid \mathcal{F}_2](\omega) = 14 \cdot \frac{1}{3} + 12 \cdot \frac{2}{3} = 12.667$$

If $\omega \in \{\omega_{14}, \omega_{15}, \omega_{16}\}$:
$$E^P[X_4 \mid \mathcal{F}_2](\omega) = 20 \cdot \frac{1}{3} + 12 \cdot \frac{2}{3} = 14.667$$
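The figures in parts c) and e) can be reproduced mechanically. Below is a sketch (not part of the original solution) that rebuilds the 16-scenario table of Exercise 3.6 and recomputes the conditional expectations, grouping either by the full trajectory $(X_0, X_1, X_2)$, which identifies the blocks of $\mathcal{F}_2$, or by the value of $X_2$ alone.

```python
# Sketch: recompute E[X_4 | F_2] and E[X_4 | sigma(X_2)] directly from the
# 16-scenario table of Exercise 3.6.
from collections import defaultdict

# (X_0, X_1, X_2, X_3, X_4, P(omega)) for omega_1, ..., omega_16
table = [
    (11, 13, 15, 16, 16, 0.06),   (11, 13, 15, 16, 14, 0.065),
    (11, 13, 15, 15, 16, 0.06),   (11, 13, 15, 15, 14, 0.065),
    (11, 13, 15, 14, 16, 0.06),   (11, 13, 15, 14, 14, 0.065),
    (11, 13, 12, 13, 14, 0.0625), (11, 13, 12, 13, 18, 0.08),
    (11, 13, 12, 13, 12, 0.04),   (11, 13, 12, 13, 11, 0.0675),
    (11, 11, 12, 13, 14, 0.0625), (11, 11, 12, 13, 12, 0.0625),
    (11, 11, 12, 12, 12, 0.0625), (11, 11, 11, 14, 20, 0.0625),
    (11, 11, 11, 14, 12, 0.0625), (11, 11, 11, 12, 12, 0.0625),
]

def conditional_expectations(key):
    """Group scenarios by `key`, then average X_4 with the renormalised weights."""
    groups = defaultdict(list)
    for row in table:
        groups[key(row)].append(row)
    return {k: sum(r[4] * r[5] for r in rows) / sum(r[5] for r in rows)
            for k, rows in groups.items()}

# Blocks of F_2 are the distinct trajectories (X_0, X_1, X_2); sigma(X_2) only sees X_2.
print(conditional_expectations(lambda r: r[:3]))  # 14.96, 14.15, 12.667, 14.667
print(conditional_expectations(lambda r: r[2]))   # 14.96, 13.514, 14.667
```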

6 Problem 3.7

The conditional expectations of the yields are:
$$E^P[R_{2,4} \mid \{\omega_1, \ldots, \omega_6\}] = \frac{14.96}{15} = 0.99733$$
$$E^P[R_{2,4} \mid \{\omega_7, \ldots, \omega_{10}\}] = \frac{14.15}{12} = 1.1792$$
$$E^P[R_{2,4} \mid \{\omega_{11}, \omega_{12}, \omega_{13}\}] = \frac{12.67}{12} = 1.0558$$
$$E^P[R_{2,4} \mid \{\omega_{14}, \omega_{15}, \omega_{16}\}] = \frac{14.667}{11} = 1.3334$$
$$E^P[R_{2,4} \mid \{\omega_7, \ldots, \omega_{13}\}] = \frac{13.514}{12} = 1.1262$$

The possible values of the random variable $R_{2,4}$ are:
$$\frac{16}{15} = 1.0667 \text{ and } \frac{14}{15} = 0.93333;$$
$$\frac{18}{12} = 1.5, \quad \frac{14}{12} = 1.1667, \quad \frac{12}{12} = 1 \text{ and } \frac{11}{12} = 0.91667;$$
$$\frac{20}{11} = 1.8182 \text{ and } \frac{12}{11} = 1.0909.$$
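To double-check the decision analysis that follows, here is a brief sketch (not from the original text) computing, on each block of $\mathcal{F}_2$, the expected HPY and the probability that the HPY falls below the 1.15 yield of the bank account; it assumes, as above, that $R_{2,4} = X_4 / X_2$.

```python
# Sketch: conditional expected holding period yield and P[R_{2,4} < 1.15]
# on each block of F_2, from the scenario table of Exercise 3.6.
from collections import defaultdict

# (trajectory (X_0, X_1, X_2), X_4, P(omega)) for omega_1, ..., omega_16
rows = [
    ((11, 13, 15), 16, 0.06),   ((11, 13, 15), 14, 0.065),
    ((11, 13, 15), 16, 0.06),   ((11, 13, 15), 14, 0.065),
    ((11, 13, 15), 16, 0.06),   ((11, 13, 15), 14, 0.065),
    ((11, 13, 12), 14, 0.0625), ((11, 13, 12), 18, 0.08),
    ((11, 13, 12), 12, 0.04),   ((11, 13, 12), 11, 0.0675),
    ((11, 11, 12), 14, 0.0625), ((11, 11, 12), 12, 0.0625),
    ((11, 11, 12), 12, 0.0625), ((11, 11, 11), 20, 0.0625),
    ((11, 11, 11), 12, 0.0625), ((11, 11, 11), 12, 0.0625),
]

blocks = defaultdict(list)
for traj, x4, prob in rows:
    blocks[traj].append((x4 / traj[2], prob))     # holding period yield and weight

for traj, pairs in blocks.items():
    total = sum(p for _, p in pairs)
    exp_yield = sum(r * p for r, p in pairs) / total
    prob_under = sum(p for r, p in pairs if r < 1.15) / total
    print(traj, round(exp_yield, 4), round(prob_under, 5))
# Expected: 0.9973 / 1.0,  1.1792 / 0.43,  1.0558 / 0.66667,  1.3334 / 0.66667
```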

Recall that the bank account has a yield of 1.15 for the period from $t = 2$ to $t = 4$.

$$P[R_{2,4} < 1.15 \mid \{\omega_1, \ldots, \omega_6\}] = P[X_4 = 16 \text{ or } X_4 = 14 \mid \{\omega_1, \ldots, \omega_6\}] = 1$$
$$P[R_{2,4} < 1.15 \mid \{\omega_7, \ldots, \omega_{10}\}] = P[X_4 = 12 \text{ or } X_4 = 11 \mid \{\omega_7, \ldots, \omega_{10}\}] = 0.43$$
$$P[R_{2,4} < 1.15 \mid \{\omega_{11}, \omega_{12}, \omega_{13}\}] = P[X_4 = 12 \mid \{\omega_{11}, \omega_{12}, \omega_{13}\}] = 0.66667$$
$$P[R_{2,4} < 1.15 \mid \{\omega_{14}, \omega_{15}, \omega_{16}\}] = P[X_4 = 12 \mid \{\omega_{14}, \omega_{15}, \omega_{16}\}] = 0.66667$$
$$P[R_{2,4} < 1.15 \mid \{\omega_7, \ldots, \omega_{13}\}] = P[X_4 = 12 \text{ or } X_4 = 11 \mid \{\omega_7, \ldots, \omega_{13}\}] = 0.37714 + 0.15429 = 0.53143$$

a) What instructions does investor A give to his agent? Justify your answer.

If we observe the process from time $t = 0$, then at time $t = 2$ we will be able to determine which of the following four events happened: $\{\omega_1, \ldots, \omega_6\}$, $\{\omega_7, \ldots, \omega_{10}\}$, $\{\omega_{11}, \ldots, \omega_{13}\}$, $\{\omega_{14}, \omega_{15}, \omega_{16}\}$.

If the event $\{\omega_1, \ldots, \omega_6\}$ happened (the price of the share followed the trajectory 11, 13, 15), then the probability of obtaining a holding period yield (for the period from $t = 2$ to $t = 4$) superior to that of the bank account is zero. Hence, in that case, we would sell the share and invest the 15 dollars in the bank account. The value of our portfolio at time $t = 4$ would then be $15 \times 1.15 = 17.25$.

If the event $\{\omega_7, \ldots, \omega_{10}\}$ happened (the price of the share followed the trajectory 11, 13, 12), then the probability of obtaining an HPY (for the period from $t = 2$ to $t = 4$) inferior to that of the bank account is around 43%. However, the expected HPY is 1.18, which is superior to that of the bank account. Hence we would keep the share.

If the event $\{\omega_{11}, \ldots, \omega_{13}\}$ happened (the price of the share followed the trajectory 11, 11, 12), then the probability of obtaining an HPY (for the period from $t = 2$ to $t = 4$) inferior to that of the bank account is around 67%. In addition, the expected HPY is 1.0558, which is inferior to the 1.15 of the bank account. Hence we would sell the share and invest the 12 dollars in the bank account. The value of our portfolio at time $t = 4$ would then be $12 \times 1.15 = 13.8$.

If the event $\{\omega_{14}, \omega_{15}, \omega_{16}\}$ happened (the price of the share followed the trajectory 11, 11, 11), then the probability of obtaining an HPY (for the period from $t = 2$ to $t = 4$) inferior to that of the bank account is around 67%. However, the expected HPY is 1.3334, which is superior to the 1.15 of the bank account. This is because when the HPY exceeds the yield of the bank account, it does so by a huge margin (1.82 versus 1.15). People with a taste for risky investment would choose to keep the share, especially because even when the HPY is inferior to the yield of the bank account, it is not drastically inferior (1.09 versus 1.15).

In summary, the instructions given to agent A are:

Trajectory 11, 13, 15: sell the share at $t = 2$. Value of the portfolio at $t = 4$ if the share is sold: 17.25; if it is kept: 16 with prob. 18%, 14 with prob. 19.5%.
Trajectory 11, 13, 12: keep the share. Value of the portfolio at $t = 4$ if the share is sold: 13.80; if it is kept: 18 with prob. 8%, 14 with prob. 6.25%, 12 with prob. 4%, 11 with prob. 6.75%.
Trajectory 11, 11, 12: sell the share at $t = 2$. Value of the portfolio at $t = 4$ if the share is sold: 13.80; if it is kept: 14 with prob. 6.25%, 12 with prob. 12.5%.
Trajectory 11, 11, 11: keep the share. Value of the portfolio at $t = 4$ if the share is sold: 12.65; if it is kept: 20 with prob. 6.25%, 12 with prob. 12.5%.

b) What instructions does investor B give to his agent? Justify your answer.

If we observe the process only at time $t = 2$, we will be able to determine which of the following three events happened: $\{\omega_1, \ldots, \omega_6\}$, $\{\omega_7, \ldots, \omega_{13}\}$, $\{\omega_{14}, \omega_{15}, \omega_{16}\}$.

If the event $\{\omega_1, \ldots, \omega_6\}$ happened (the price of the share at time $t = 2$ is 15), then the probability of obtaining an HPY (for the period from $t = 2$ to $t = 4$) superior to that of the bank account is zero. Hence, in that case, we would sell the share and invest the 15 dollars in the bank account. The value of our portfolio at time $t = 4$ would then be $15 \times 1.15 = 17.25$.

If the event $\{\omega_7, \ldots, \omega_{13}\}$ happened (the price of the share at time $t = 2$ is 12), then the probability of obtaining an HPY (for the period from $t = 2$ to $t = 4$) inferior to that of the bank account is around 53% ($0.37714 + 0.15429$). In addition, the expected HPY is 1.1262, which is lower than the yield of the bank account. Hence, in that case, we would sell the share and invest the 12 dollars in the bank account. The value of our portfolio at time $t = 4$ would then be $12 \times 1.15 = 13.8$.

If the event $\{\omega_{14}, \omega_{15}, \omega_{16}\}$ happened (the price of the share at time $t = 2$ is 11), then the probability of obtaining an HPY (for the period from $t = 2$ to $t = 4$) inferior to that of the bank account is around 67%. However, the expected HPY is 1.3334, which is superior to the 1.15 of the bank account. This is because when the HPY exceeds the yield of the bank account, it does so by a huge margin (1.82 versus 1.15).

People with a taste for risky investment would choose to keep the share, especially because even when the HPY is inferior to the yield of the bank account, it is not drastically inferior (1.09 versus 1.15).

In summary, the instructions given to agent B are:

Value of $X_2$ = 15: sell the share at $t = 2$. Value of the portfolio at $t = 4$ if the share is sold: 17.25; if it is kept: 16 with prob. 18%, 14 with prob. 19.5%.
Value of $X_2$ = 12: sell the share at $t = 2$. Value of the portfolio at $t = 4$ if the share is sold: 13.80; if it is kept: 18 with prob. 8%, 14 with prob. 12.5%, 12 with prob. 16.5%, 11 with prob. 6.75%.
Value of $X_2$ = 11: keep the share. Value of the portfolio at $t = 4$ if the share is sold: 12.65; if it is kept: 20 with prob. 6.25%, 12 with prob. 12.5%.

7 Problem 3.8

a) Let $\tau_A$ and $\tau_B$ represent the random times at which the shares are sold, respectively by agents A and B. Are $\tau_A$ and $\tau_B$ stopping times? Justify your answer.

$\tau_A$ is a stopping time because
$$\{\omega \in \Omega : \tau_A(\omega) = 0\} = \emptyset \in \mathcal{F}_0$$
$$\{\omega \in \Omega : \tau_A(\omega) = 1\} = \emptyset \in \mathcal{F}_1$$
$$\{\omega \in \Omega : \tau_A(\omega) = 2\} = \{\omega_1, \ldots, \omega_6\} \cup \{\omega_{11}, \ldots, \omega_{13}\} \in \mathcal{F}_2$$
$$\{\omega \in \Omega : \tau_A(\omega) = 3\} = \emptyset \in \mathcal{F}_3$$
$$\{\omega \in \Omega : \tau_A(\omega) = 4\} = \{\omega_7, \ldots, \omega_{10}\} \cup \{\omega_{14}, \ldots, \omega_{16}\} \in \mathcal{F}_4$$

$\tau_B$ is a stopping time because
$$\{\omega \in \Omega : \tau_B(\omega) = 0\} = \emptyset \in \mathcal{F}_0$$
$$\{\omega \in \Omega : \tau_B(\omega) = 1\} = \emptyset \in \mathcal{F}_1$$
$$\{\omega \in \Omega : \tau_B(\omega) = 2\} = \{\omega_1, \ldots, \omega_{13}\} \in \mathcal{F}_2$$
$$\{\omega \in \Omega : \tau_B(\omega) = 3\} = \emptyset \in \mathcal{F}_3$$
$$\{\omega \in \Omega : \tau_B(\omega) = 4\} = \{\omega_{14}, \ldots, \omega_{16}\} \in \mathcal{F}_4$$

b) Say we replace the probability measure $P$ by $Q$, where $Q(\omega) = \frac{1}{16}$ for every $\omega \in \Omega$. Are $\tau_A$ and $\tau_B$ stopping times in this case? Justify your answer.

Yes, since being (or not being) a stopping time does not depend in any way on the probability measure used on the filtered measurable space.
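As a complement (not part of the original solution), the stopping-time property can also be checked mechanically: represent each $\mathcal{F}_t$ by the partition found in Exercise 3.6 a) and verify that each level set $\{\tau = t\}$ is a union of blocks of $\mathcal{F}_t$. Encoding $\omega_1, \ldots, \omega_{16}$ as the integers 1 to 16 is just a convenience.

```python
# Sketch: check the stopping-time property of Problem 3.8 by representing each
# F_t as the partition of Omega it generates and testing whether {tau = t} is a
# union of blocks of F_t.
OMEGA = set(range(1, 17))                # omega_1, ..., omega_16 identified with 1..16

# Partitions generating F_0, ..., F_4 (from Exercise 3.6 a)
F = [
    [OMEGA],
    [set(range(1, 11)), set(range(11, 17))],
    [set(range(1, 7)), set(range(7, 11)), set(range(11, 14)), set(range(14, 17))],
    [{1, 2}, {3, 4}, {5, 6}, set(range(7, 11)), {11, 12}, {13}, {14, 15}, {16}],
    [{i} for i in range(1, 17)],
]

# Selling times read off from the instructions of Problem 3.7:
# A sells at t = 2 on {w_1..w_6, w_11..w_13}, otherwise at t = 4; B sells at t = 2 on {w_1..w_13}.
sell_A_at_2 = set(range(1, 7)) | set(range(11, 14))
tau_A = {w: 2 if w in sell_A_at_2 else 4 for w in OMEGA}
tau_B = {w: 2 if w <= 13 else 4 for w in OMEGA}

def is_stopping_time(tau):
    """tau is a stopping time iff every level set {tau = t} is F_t-measurable."""
    for t, partition in enumerate(F):
        level_set = {w for w, s in tau.items() if s == t}
        # measurable w.r.t. F_t <=> union of blocks: no block is only partially inside
        if any(block & level_set and not block <= level_set for block in partition):
            return False
    return True

print(is_stopping_time(tau_A), is_stopping_time(tau_B))   # True True
```

Note that the check never uses $P$ or $Q$, which is exactly the point made in part b). Replacing $\tau_A$, for instance, by a rule that sells at $t = 2$ only on $\{\omega_1\}$ would make the check fail, since $\{\omega_1\}$ is not a union of blocks of $\mathcal{F}_2$.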