Generalized Gaussian Bridges of Prediction-Invertible Processes

Generalized Gaussian Bridges of Prediction-Invertible Processes

Tommi Sottinen¹ and Adil Yazigi
University of Vaasa, Finland

Modern Stochastics: Theory and Applications III
September 10, 2012, Kyiv, Ukraine

¹ tsottine/

Abstract

A generalized bridge is the law of a stochastic process that is conditioned on multiple linear functionals of its path. We consider canonical representations of the bridges. In the canonical representation the linear spaces L_t(X) = span{X_s; s ≤ t} coincide for all t < T for both the original process and its bridge representation.

A Gaussian process X = (X_t), t ∈ [0,T], is prediction-invertible if it can be recovered (in law, at least) from its prediction martingale:

    X_t = ∫_0^t p^{-1}(t,s) dE[X_T | F^X_s].

In discrete time all non-degenerate Gaussian processes are prediction-invertible. In continuous time this is most probably not true.

References

This work, S. and Yazigi (2012), combines and extends the results of Alili (2002) and Gasbarra, S. and Valkeila (2007).

Alili. Canonical decompositions of certain generalized Brownian bridges. Electron. Comm. Probab. 7, 2002.

Gasbarra, S. and Valkeila. Gaussian bridges. Stochastic analysis and applications, Abel Symp. 2, 2007.

S. and Yazigi. Generalized Gaussian Bridges. Preprint arXiv: v1, May 2012.

Outline

1 Generalized (linearly conditioned) Bridges
2 Orthogonal Representation
3 Canonical Representation for Martingales
4 Prediction-Invertible Processes
5 Canonical Representation for Prediction-Invertible Processes
6 Open Questions

Generalized (linearly conditioned) Bridges

Let X = (X_t), t ∈ [0,T], be a continuous Gaussian process with positive definite covariance function R, mean function m of bounded variation, and X_0 = m(0). We consider the conditioning, or bridging, of X on N linear functionals G_T = [G_T^i]_{i=1}^N of its paths:

    G_T(X) = ∫_0^T g(t) dX_t = [ ∫_0^T g_i(t) dX_t ]_{i=1}^N.

Remark (On Linear Independence of Conditionings)
We assume, without any loss of generality, that the functions g_i are linearly independent. Indeed, if this is not the case, then the linearly dependent, or redundant, components of g can simply be removed from the conditioning (1) below without changing it.

Generalized (linearly conditioned) Bridges

Informally, the generalized (Gaussian) bridge X^{g;y} is (the law of) the (Gaussian) process X conditioned on the set

    { ∫_0^T g(t) dX_t = y } = ⋂_{i=1}^N { ∫_0^T g_i(t) dX_t = y_i }.   (1)

The rigorous definition is given below.

Remark (Canonical Space Framework)
For the sake of convenience, we work on the canonical filtered probability space (Ω, F, F, P), where Ω = C[0,T] and P corresponds to the Gaussian coordinate process X_t(ω) = ω(t): P = P[X ∈ ·]. The filtration F = (F_t), t ∈ [0,T], is the intrinsic filtration of the coordinate process X, augmented with the null sets and made right-continuous, and F = F_T.

Generalized (linearly conditioned) Bridges

Definition (Generalized Gaussian Bridge)
The generalized bridge measure P^{g;y} is the regular conditional law

    P^{g;y} = P^{g;y}[X ∈ ·] = P[ X ∈ · | ∫_0^T g(t) dX_t = y ].

A representation of the generalized Gaussian bridge is any process X^{g;y} satisfying

    P[X^{g;y} ∈ ·] = P^{g;y}[X ∈ ·] = P[ X ∈ · | ∫_0^T g(t) dX_t = y ].

Generalized (linearly conditioned) Bridges

Remark (Technical Observations)
1 Note that the conditioning on the P-null set is not a problem, since the canonical space of continuous processes is small enough to admit regular conditional laws.
2 As a measure, the generalized Gaussian bridge P^{g;y} is unique, but it has several different representations X^{g;y}. Indeed, any representation of the bridge can be combined with any P-measure-preserving transformation to yield a new representation.

Orthogonal Representation

Denote by ⟨⟨g⟩⟩ the matrix

    ⟨⟨g⟩⟩_ij := ⟨⟨g_i, g_j⟩⟩ := Cov[ ∫_0^T g_i(t) dX_t, ∫_0^T g_j(t) dX_t ].

Remark (⟨⟨g⟩⟩ is invertible and depends on g and R only)
1 Note that ⟨⟨g⟩⟩ depends neither on the mean of X nor on the conditioned values y: ⟨⟨g⟩⟩ depends only on the conditioning functions g = [g_i]_{i=1}^N and the covariance R.
2 Since the g_i are linearly independent and R is positive definite, the matrix ⟨⟨g⟩⟩ is invertible.
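As a concrete illustration (not from the slides), the matrix ⟨⟨g⟩⟩ can be computed numerically in the simplest setting. The assumption here is that X is standard Brownian motion on [0,T], so that ⟨⟨g_i, g_j⟩⟩ = ∫_0^T g_i(t) g_j(t) dt; the conditioning functions g_1 = 1, g_2(t) = t are an arbitrary illustrative choice.

```python
import numpy as np

# Numerical sketch (assumption: X is standard Brownian motion on [0, T],
# so <<g_i, g_j>> = int_0^T g_i(t) g_j(t) dt).
T = 1.0
t = np.linspace(0.0, T, 100_001)

# Two linearly independent conditioning functions: g_1 = 1, g_2(t) = t.
g = np.vstack([np.ones_like(t), t])

# Gram matrix <<g>>_ij = int_0^T g_i g_j dt via the trapezoidal rule.
gram = np.array([[np.trapz(g[i] * g[j], t) for j in range(2)]
                 for i in range(2)])

print(gram)                   # ~ [[1, 1/2], [1/2, 1/3]]
print(np.linalg.det(gram))    # ~ 1/12 > 0: <<g>> is invertible
```

The positive determinant confirms the remark: linearly independent g_i and a positive definite covariance give an invertible ⟨⟨g⟩⟩.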

Orthogonal Representation

Theorem (Orthogonal Representation)
The generalized Gaussian bridge X^{g;y} can be represented as

    X^{g;y}_t = X_t - ⟨⟨1_t, g⟩⟩^⊤ ⟨⟨g⟩⟩^{-1} ( ∫_0^T g(u) dX_u - y ),   (2)

where ⟨⟨1_t, g⟩⟩ is the vector with entries ⟨⟨1_t, g_i⟩⟩ = Cov[ X_t, ∫_0^T g_i(u) dX_u ].

Moreover, any generalized Gaussian bridge X^{g;y} is a Gaussian process with

    E[ X^{g;y}_t ] = m(t) - ⟨⟨1_t, g⟩⟩^⊤ ⟨⟨g⟩⟩^{-1} ( ∫_0^T g(u) dm(u) - y ),

    Cov[ X^{g;y}_t, X^{g;y}_s ] = ⟨⟨1_t, 1_s⟩⟩ - ⟨⟨1_t, g⟩⟩^⊤ ⟨⟨g⟩⟩^{-1} ⟨⟨1_s, g⟩⟩.
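A minimal sketch of formula (2), not from the slides: assume X is standard Brownian motion, N = 1, g ≡ 1 and y = 0. Then ⟨⟨1_t, g⟩⟩ = Cov(X_t, X_T) = t and ⟨⟨g⟩⟩ = T, so (2) reduces to the classical Brownian bridge X^g_t = X_t - (t/T) X_T, and the conditioning ∫_0^T dX^g = X^g_T = 0 holds pathwise.

```python
import numpy as np

# Sketch (assumptions: X is standard Brownian motion, N = 1, g == 1, y = 0).
rng = np.random.default_rng(0)
T, n = 1.0, 1000
t = np.linspace(0.0, T, n + 1)
dW = rng.normal(0.0, np.sqrt(T / n), size=n)
W = np.concatenate([[0.0], np.cumsum(dW)])   # one Brownian path

# Orthogonal representation (2) in this special case:
bridge = W - (t / T) * W[-1]

print(bridge[0], bridge[-1])                 # both 0: the conditioning holds
```

Note that the construction uses W[-1] = X_T, i.e. the whole path up to T; this is exactly the drawback that motivates the canonical representation later.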

Orthogonal Representation

Corollary (Mean-Conditioning Invariance)
Let X be a centered Gaussian process with X_0 = 0 and let m be a function of bounded variation. Let X^g := X^{g;0} be the bridge where the conditional functionals are conditioned to zero. Then

    (X + m)^{g;y}_t = X^g_t + m(t) - ⟨⟨1_t, g⟩⟩^⊤ ⟨⟨g⟩⟩^{-1} ∫_0^T g(u) dm(u) + ⟨⟨1_t, g⟩⟩^⊤ ⟨⟨g⟩⟩^{-1} y.

Remark (Normalization)
Because of the corollary above we can, and will, assume in what follows that m = 0 and y = 0.
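The corollary can be checked pathwise in the Brownian special case (an illustration, not from the slides; assumptions: X standard Brownian motion, N = 1, g ≡ 1, and an arbitrary smooth mean m(t) = t²): bridging the shifted process X + m with formula (2) agrees exactly with shifting the zero-conditioned bridge X^g as the corollary states.

```python
import numpy as np

# Deterministic identity check (assumptions: Brownian motion, N = 1, g == 1,
# m(t) = t^2 chosen arbitrarily, y = 0.7 chosen arbitrarily).
rng = np.random.default_rng(1)
T, n, y = 1.0, 1000, 0.7
t = np.linspace(0.0, T, n + 1)
W = np.concatenate([[0.0], np.cumsum(rng.normal(0, np.sqrt(T / n), n))])
m = t**2

# Left side: formula (2) applied directly to the shifted process X + m.
lhs = (W + m) - (t / T) * ((W[-1] + m[-1]) - y)

# Right side: the corollary, with X^g = W - (t/T) W_T and
# int_0^T g dm = m(T) - m(0).
bridge0 = W - (t / T) * W[-1]
rhs = bridge0 + m - (t / T) * (m[-1] - m[0]) + (t / T) * y

print(np.max(np.abs(lhs - rhs)))   # zero up to rounding
```

This is why the normalization m = 0, y = 0 in the remark loses nothing: the mean and the conditioned values only enter through a deterministic shift.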

Canonical Representation for Martingales

The problem with the orthogonal bridge representation (2) of X^{g;y} is that, in order to construct it at any point t ∈ [0,T), one needs the whole path of the underlying process X up to time T. In this section and the following sections we construct a bridge representation that is canonical in the following sense:

Definition (Canonical Representation)
The bridge X^{g;y} is of canonical representation if, for all t ∈ [0,T),

    X^{g;y}_t ∈ L_t(X) and X_t ∈ L_t(X^{g;y}).

Here L_t(Y) is the closed linear subspace of L²(Ω, F, P) generated by the random variables Y_s, s ≤ t.

Canonical Representation for Martingales

Remark (Gaussian Specialities)
1 Since the conditional laws of Gaussian processes are Gaussian and Gaussian spaces are linear, the assumptions X^{g;y}_t ∈ L_t(X) and X_t ∈ L_t(X^{g;y}) are the same as assuming that X^{g;y}_t is F^X_t-measurable and X_t is F^{X^{g;y}}_t-measurable (and, consequently, F^X_t = F^{X^{g;y}}_t). This fact is very special to Gaussian processes. Indeed, in general, conditioned processes such as generalized bridges are not linear transformations of the underlying process.
2 Another Gaussian fact here is that the bridges are Gaussian. In general, conditioned processes do not belong to the same family of distributions as the original process.

Canonical Representation for Martingales

We shall require that the restricted measures P^{g;y}_t := P^{g;y}|F_t and P_t := P|F_t are equivalent for all t < T (they are obviously singular for t = T). To this end we assume that the matrix

    ⟨⟨g⟩⟩_ij(t) := E[ ( G^i_T(X) - G^i_t(X) ) ( G^j_T(X) - G^j_t(X) ) ]
                 = E[ ∫_t^T g_i(s) dX_s ∫_t^T g_j(s) dX_s ]

is invertible for all t < T.

Remark (On Notation)
In the previous section we considered the matrix ⟨⟨g⟩⟩, but from now on we consider the function ⟨⟨g⟩⟩(·). Their connection is of course ⟨⟨g⟩⟩ = ⟨⟨g⟩⟩(0). We hope that this overloading of notation does not cause confusion for the reader.

Canonical Representation for Martingales

Let now M be a Gaussian martingale with strictly increasing bracket ⟨M⟩ and M_0 = 0.

Remark (Bracket and Non-Degeneracy)
Note that the bracket is strictly increasing if and only if the covariance R is positive definite. Indeed, for Gaussian martingales we have R(t,s) = Var(M_{t∧s}) = ⟨M⟩_{t∧s}.

Define a Volterra kernel

    l_g(t,s) := g^⊤(t) ⟨⟨g⟩⟩^{-1}(t) g(s).   (3)

Note that the kernel l_g depends on the process M through its covariance ⟨M⟩, and in the Gaussian martingale case we have

    ⟨⟨g⟩⟩_ij(t) = ∫_t^T g_i(s) g_j(s) d⟨M⟩_s.
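The objects above can be made concrete in a small sketch (not from the slides; assumptions: M is standard Brownian motion so d⟨M⟩_s = ds, T = 1, and g = (1, s) as an illustrative pair of conditioning functions). Then ⟨⟨g⟩⟩(t) has the closed form used below, its determinant is positive for t < T and vanishes as t → T, and l_g(t,s) is finite for t < T.

```python
import numpy as np

# Sketch (assumptions: M is Brownian motion, d<M> = ds, T = 1, g = (1, s)).
T = 1.0

def gram(t):
    # Closed-form entries of <<g>>(t) = int_t^T g g^T ds for g = (1, s).
    return np.array([[T - t,              (T**2 - t**2) / 2],
                     [(T**2 - t**2) / 2,  (T**3 - t**3) / 3]])

def l_g(t, s):
    # Kernel (3): l_g(t, s) = g(t)^T <<g>>^{-1}(t) g(s).
    gt, gs = np.array([1.0, t]), np.array([1.0, s])
    return gt @ np.linalg.solve(gram(t), gs)

for u in [0.0, 0.5, 0.9, 0.99]:
    print(u, np.linalg.det(gram(u)))   # positive for t < T, -> 0 as t -> T

print(l_g(0.5, 0.0))                   # finite for t < T (here exactly 20)
```

The vanishing determinant at t = T is the reason the bridge measure is singular w.r.t. P at the terminal time.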

Canonical Representation for Martingales

The following lemma is the key observation in finding the canonical generalized bridge representation. Actually, it is a multivariate version of Proposition 6 of Gasbarra, S. and Valkeila (2007).

Lemma (Radon-Nikodym for Bridges)
Let l_g be given by (3) and let M be a continuous Gaussian martingale with strictly increasing bracket ⟨M⟩ and M_0 = 0. Then

    log (dP^g_t / dP_t) = - ∫_0^t ∫_0^s l_g(s,u) dM_u dM_s - (1/2) ∫_0^t ( ∫_0^s l_g(s,u) dM_u )² d⟨M⟩_s.

Canonical Representation for Martingales

Proof.
Let p(·; μ, Σ) be the Gaussian density on R^N and let

    α^g_t(dy) := P[ G_T(M) ∈ dy | F^M_t ].

By the Bayes formula and the martingale property,

    dP^g_t / dP_t = (dα^g_t / dα^g_0)(0) = p(0; G_t(M), ⟨⟨g⟩⟩(t)) / p(0; G_0(M), ⟨⟨g⟩⟩(0)).

Denote F(t, M_t) := -(1/2) G_t^⊤ ⟨⟨g⟩⟩^{-1}(t) G_t, and the claim follows by using the Itô formula.

Canonical Representation for Martingales

Corollary (Semimartingale Decomposition)
The canonical bridge representation M^g satisfies the stochastic differential equation

    dM_t = dM^g_t + ( ∫_0^t l_g(t,s) dM^g_s ) d⟨M⟩_t,   (4)

where l_g is given by (3). Moreover, ⟨M⟩ = ⟨M^g⟩.

Proof.
The claim follows by using Girsanov's theorem.
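A Monte Carlo sketch of equation (4), not from the slides. Assumptions: M is standard Brownian motion, N = 1, g ≡ 1, y = 0, so l_g(t,s) = 1/(T-t), ∫_0^t l_g(t,s) dM^g_s = M^g_t/(T-t), and (4) rearranges to the familiar Brownian bridge SDE dM^g_t = dM_t - M^g_t/(T-t) dt. The drift uses only the past of M^g, which is what makes the representation canonical. An Euler scheme should then reproduce the bridge variance t(T-t)/T and pin the endpoint at zero.

```python
import numpy as np

# Euler scheme for dM^g_t = dM_t - M^g_t / (T - t) dt (assumptions above).
rng = np.random.default_rng(2)
T, n, paths = 1.0, 500, 20_000
dt = T / n
Mg = np.zeros(paths)     # current bridge values, one per simulated path
var_mid = 0.0
for k in range(n):
    t = k * dt
    dM = rng.normal(0.0, np.sqrt(dt), size=paths)
    Mg = Mg + dM - Mg / (T - t) * dt
    if k + 1 == n // 2:
        var_mid = Mg.var()   # empirical variance at t = T/2

print(var_mid)    # ~ 0.25 = t(T - t)/T at t = 1/2
print(Mg.var())   # ~ 0: the bridge is pinned at T
```

The tolerances in the checks below are loose because both Euler discretization bias and Monte Carlo error are present.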

Canonical Representation for Martingales

Remark (Equivalence and Singularity)
Note that, for t < T,

    ∫_0^t ∫_0^s l_g(s,u)² d⟨M⟩_u d⟨M⟩_s < ∞.

In view of (4) this means that M and M^g are equivalent on [0,T). Indeed, equation (4) can be viewed as the Hitsuda representation between two equivalent Gaussian processes. Also note that

    ∫_0^T ∫_0^s l_g(s,u)² d⟨M⟩_u d⟨M⟩_s = ∞,

meaning, by the Hitsuda representation theorem, that M and M^g are singular on [0,T].

Canonical Representation for Martingales

Next we solve the stochastic differential equation (4) of the corollary above. In general, solving a Volterra-Stieltjes equation like (4) in closed form is difficult. Of course, the general theory of Volterra equations suggests that the solution will be of the form (6) of the next theorem, where l*_g is the resolvent kernel of l_g determined by the resolvent equation (7) given below. Also, the general theory suggests that the resolvent kernel can be calculated implicitly by using the Neumann series. In our case the kernel l_g is a quadratic form that factorizes in its arguments. This allows us to calculate the resolvent l*_g explicitly as (5) below.

Canonical Representation for Martingales

Theorem (Solution to an SDE)
Let s ≤ t ∈ [0,T]. Define the Volterra kernel

    l*_g(t,s) := g^⊤(t) ⟨⟨g⟩⟩^{-1}(s) g(s)   (5)

(in the scalar case N = 1 this reads l*_g(t,s) = l_g(t,s) ⟨⟨g⟩⟩(t)/⟨⟨g⟩⟩(s)). Then the bridge M^g has the canonical representation

    dM^g_t = dM_t - ( ∫_0^t l*_g(t,s) dM_s ) d⟨M⟩_t,   (6)

i.e., (6) is the solution to (4).

Canonical Representation for Martingales

Proof.
Equation (6) is the solution to (4) if the kernel l*_g satisfies the resolvent equation

    l*_g(t,s) + ∫_s^t l_g(t,u) l*_g(u,s) d⟨M⟩_u = l_g(t,s).   (7)

Indeed, suppose (6) is the solution to (4). This means that

    dM_t = dM_t - ( ∫_0^t l*_g(t,s) dM_s ) d⟨M⟩_t
                + ( ∫_0^t l_g(t,s) dM_s ) d⟨M⟩_t
                - ( ∫_0^t l_g(t,s) ( ∫_0^s l*_g(s,u) dM_u ) d⟨M⟩_s ) d⟨M⟩_t.

Canonical Representation for Martingales

Proof (continued).
In integral form, by using the Fubini theorem, this means that

    M_t = M_t - ∫_0^t ∫_s^t l*_g(u,s) d⟨M⟩_u dM_s
              + ∫_0^t ∫_s^t l_g(u,s) d⟨M⟩_u dM_s
              - ∫_0^t ∫_s^t ∫_s^u l_g(u,v) l*_g(v,s) d⟨M⟩_v d⟨M⟩_u dM_s.

The resolvent criterion (7) follows by identifying the integrands in the d⟨M⟩_u dM_s-integrals above. Now it is straightforward, but tedious, to check that the resolvent equation holds. We omit the details.
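The resolvent relation between the kernels (3) and (5) can be verified numerically (a sketch, not from the slides; assumptions: M is standard Brownian motion so d⟨M⟩ = ds, T = 1, and g = (1, s) as an illustrative pair). The check below evaluates both sides of (7) at one point (t,s) = (0.5, 0) by quadrature.

```python
import numpy as np

# Numerical check of the resolvent equation (7) for the kernels (3) and (5)
# (assumptions: M is Brownian motion, d<M> = ds, T = 1, g = (1, s)).
T = 1.0

def gram(t):                               # <<g>>(t), closed form
    return np.array([[T - t,              (T**2 - t**2) / 2],
                     [(T**2 - t**2) / 2,  (T**3 - t**3) / 3]])

g = lambda t: np.array([1.0, t])

def l_g(t, s):                             # kernel (3)
    return g(t) @ np.linalg.solve(gram(t), g(s))

def l_star(t, s):                          # resolvent kernel (5)
    return g(t) @ np.linalg.solve(gram(s), g(s))

t, s = 0.5, 0.0
u = np.linspace(s, t, 20_001)
integrand = np.array([l_g(t, ui) * l_star(ui, s) for ui in u])
rhs = np.trapz(integrand, u)

print(l_g(t, s) - l_star(t, s), rhs)       # both ~ 19: (7) holds
```

For this example l_g(0.5, 0) = 20 and l*_g(0.5, 0) = 1 exactly, so both sides of (7) equal 19.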

Prediction-Invertible Processes

Let us now consider a Gaussian process X that is not a martingale. For a (Gaussian) process X, its prediction martingale is the process X̂ defined as

    X̂_t = E[ X_T | F^X_t ].

Since for Gaussian processes X̂_t ∈ L_t(X), we may write, at least formally, that

    X̂_t = ∫_0^t p(t,s) dX_s,

where the abstract kernel p depends also on T (since X̂ depends on T). In the next definition we assume, among other things, that the kernel p exists as a real, and not only formal, function. We also assume that the kernel p is invertible.

Prediction-Invertible Processes

Definition (Prediction-Invertibility)
A Gaussian process X is prediction-invertible if there exists a kernel p such that its prediction martingale X̂ is continuous, can be represented as

    X̂_t = ∫_0^t p(t,s) dX_s,

and there exists an inverse kernel p^{-1} such that, for all t ∈ [0,T], p^{-1}(t,·) ∈ L²([0,T], d⟨X̂⟩) and X can be recovered from X̂ by

    X_t = ∫_0^t p^{-1}(t,s) dX̂_s.

Prediction-Invertible Processes

Remark
In general it seems to be a difficult problem to determine whether a Gaussian process is prediction-invertible or not. In the discrete-time non-degenerate case all Gaussian processes are prediction-invertible. In continuous time the situation is more difficult, as the next example illustrates. Nevertheless, we can immediately see that we must have

    R(t,s) = ∫_0^{t∧s} p^{-1}(t,u) p^{-1}(s,u) d⟨X̂⟩_u,

where ⟨X̂⟩_u = Var( E[ X_T | F_u ] ). However, this criterion does not seem to be very helpful in practice.
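As a sanity check of the covariance criterion (an illustration, not from the slides), take X to be standard Brownian motion on [0,T]. Then E[X_T | F_u] = X_u, so the prediction kernels are trivial, p(t,s) = p^{-1}(t,s) = 1, and ⟨X̂⟩_u = Var(X_u) = u. The criterion then reads R(t,s) = ∫_0^{t∧s} du = t∧s, which is indeed the Brownian covariance.

```python
import numpy as np

# Covariance criterion for Brownian motion (assumption: X standard BM,
# so p = p^{-1} = 1 and d<X^> = du).
results = []
for (t, s) in [(0.3, 0.8), (0.5, 0.5), (0.9, 0.2)]:
    u = np.linspace(0.0, min(t, s), 10_001)
    lhs = np.trapz(np.ones_like(u), u)     # int_0^{t^s} p^{-1} p^{-1} d<X^>
    results.append((lhs, min(t, s)))
    print(lhs, min(t, s))                  # agree: criterion satisfied
```

So martingales are trivially prediction-invertible; the difficulty the remark points to concerns genuinely non-martingale Gaussian processes.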

Prediction-Invertible Processes

Example (Fine Points of Prediction-Invertibility)
Consider the Gaussian slope X_t = tξ, t ∈ [0,T], where ξ is a standard normal random variable. Now, if we consider the raw filtration G^X_t = σ(X_s; s ≤ t), then X is not prediction-invertible. Indeed, then X̂_0 = 0 but X̂_t = X_T if t ∈ (0,T]. So, X̂ is not continuous. On the other hand, the augmented filtration is simply F^X_t = σ(ξ) for all t ∈ [0,T]. So, X̂_t = X_T for all t. Note, however, that in both cases the slope X can be recovered from the prediction martingale: X_t = (t/T) X̂_t.

Prediction-Invertible Processes

We want to represent integrals w.r.t. X as integrals w.r.t. X̂, and vice versa.

Definition (Linear Extensions of Kernels)
Let X be prediction-invertible. Let P and P^{-1} extend the relations

    P[1_t] = p(t,·) and P^{-1}[1_t] = p^{-1}(t,·)

linearly.

Lemma (Abstract vs. Concrete Wiener Integrals)
For f and ĝ nice enough,

    ∫_0^T f(t) dX_t = ∫_0^T P^{-1}[f](t) dX̂_t,
    ∫_0^T ĝ(t) dX̂_t = ∫_0^T P[ĝ](t) dX_t.

Canonical Representation for Prediction-Invertible Processes

The next (our main) theorem follows basically by rewriting

    ∫_0^T g(t) dX_t = ∫_0^T P^{-1}[g](t) dX̂_t = 0.

Theorem (Canonical Representation)
Let X be a prediction-invertible Gaussian process. Then

    X^g_t = X_t - ∫_0^t ( ∫_s^t p^{-1}(t,u) P[ l̂*_ĝ(u,·) ](s) d⟨X̂⟩_u ) dX_s,

where ĝ = P^{-1}[g] and l̂*_ĝ is the analogue of the resolvent kernel (5) for the prediction martingale, i.e., it is built from ĝ and the matrix ⟨⟨ĝ⟩⟩_X̂ with entries

    ⟨⟨ĝ⟩⟩_X̂,ij(v) = ∫_v^T ĝ_i(s) ĝ_j(s) d⟨X̂⟩_s.

Canonical Representation for Prediction-Invertible Processes

Proof.
On the prediction-martingale level we have

    dX̂^ĝ_s = dX̂_s - ( ∫_0^s l̂*_ĝ(s,u) dX̂_u ) d⟨X̂⟩_s.

Now, by operating with p^{-1} and P, we get

    X^g_t = X_t - ∫_0^t p^{-1}(t,s) ( ∫_0^s l̂*_ĝ(s,u) dX̂_u ) d⟨X̂⟩_s
          = X_t - ∫_0^t p^{-1}(t,s) ( ∫_0^s P[ l̂*_ĝ(s,·) ](u) dX_u ) d⟨X̂⟩_s.

Finally, the claim follows by using the Fubini theorem.

Open Questions

1 What kind of condition (on the covariance) would imply prediction-invertibility?
2 How to calculate the prediction-inversion kernel p^{-1}(t,s)?
3 What would be an example of a continuous Gaussian process that is not prediction-invertible?
4 What would ensure that L_s(X) ⊂ L_t(X) for s < t, that L_s(X) = ⋂_{t>s} L_t(X), and that L_t(X) = L_t(X^g)? Would this be enough for prediction-invertibility?

The End

Thank you for your attention!


More information

CONDITIONAL FULL SUPPORT OF GAUSSIAN PROCESSES WITH STATIONARY INCREMENTS

CONDITIONAL FULL SUPPORT OF GAUSSIAN PROCESSES WITH STATIONARY INCREMENTS J. Appl. Prob. 48, 561 568 (2011) Printed in England Applied Probability Trust 2011 CONDITIONAL FULL SUPPOT OF GAUSSIAN POCESSES WITH STATIONAY INCEMENTS DAIO GASBAA, University of Helsinki TOMMI SOTTINEN,

More information

I forgot to mention last time: in the Ito formula for two standard processes, putting

I forgot to mention last time: in the Ito formula for two standard processes, putting I forgot to mention last time: in the Ito formula for two standard processes, putting dx t = a t dt + b t db t dy t = α t dt + β t db t, and taking f(x, y = xy, one has f x = y, f y = x, and f xx = f yy

More information

1 Brownian Local Time

1 Brownian Local Time 1 Brownian Local Time We first begin by defining the space and variables for Brownian local time. Let W t be a standard 1-D Wiener process. We know that for the set, {t : W t = } P (µ{t : W t = } = ) =

More information

Basic Definitions: Indexed Collections and Random Functions

Basic Definitions: Indexed Collections and Random Functions Chapter 1 Basic Definitions: Indexed Collections and Random Functions Section 1.1 introduces stochastic processes as indexed collections of random variables. Section 1.2 builds the necessary machinery

More information

A NOTE ON STOCHASTIC INTEGRALS AS L 2 -CURVES

A NOTE ON STOCHASTIC INTEGRALS AS L 2 -CURVES A NOTE ON STOCHASTIC INTEGRALS AS L 2 -CURVES STEFAN TAPPE Abstract. In a work of van Gaans (25a) stochastic integrals are regarded as L 2 -curves. In Filipović and Tappe (28) we have shown the connection

More information

Filtrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition

Filtrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition Filtrations, Markov Processes and Martingales Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition David pplebaum Probability and Statistics Department,

More information

p 1 ( Y p dp) 1/p ( X p dp) 1 1 p

p 1 ( Y p dp) 1/p ( X p dp) 1 1 p Doob s inequality Let X(t) be a right continuous submartingale with respect to F(t), t 1 P(sup s t X(s) λ) 1 λ {sup s t X(s) λ} X + (t)dp 2 For 1 < p

More information

A Concise Course on Stochastic Partial Differential Equations

A Concise Course on Stochastic Partial Differential Equations A Concise Course on Stochastic Partial Differential Equations Michael Röckner Reference: C. Prevot, M. Röckner: Springer LN in Math. 1905, Berlin (2007) And see the references therein for the original

More information

{σ x >t}p x. (σ x >t)=e at.

{σ x >t}p x. (σ x >t)=e at. 3.11. EXERCISES 121 3.11 Exercises Exercise 3.1 Consider the Ornstein Uhlenbeck process in example 3.1.7(B). Show that the defined process is a Markov process which converges in distribution to an N(0,σ

More information

Convergence at first and second order of some approximations of stochastic integrals

Convergence at first and second order of some approximations of stochastic integrals Convergence at first and second order of some approximations of stochastic integrals Bérard Bergery Blandine, Vallois Pierre IECN, Nancy-Université, CNRS, INRIA, Boulevard des Aiguillettes B.P. 239 F-5456

More information

The Pedestrian s Guide to Local Time

The Pedestrian s Guide to Local Time The Pedestrian s Guide to Local Time Tomas Björk, Department of Finance, Stockholm School of Economics, Box 651, SE-113 83 Stockholm, SWEDEN tomas.bjork@hhs.se November 19, 213 Preliminary version Comments

More information

OPTIMAL SOLUTIONS TO STOCHASTIC DIFFERENTIAL INCLUSIONS

OPTIMAL SOLUTIONS TO STOCHASTIC DIFFERENTIAL INCLUSIONS APPLICATIONES MATHEMATICAE 29,4 (22), pp. 387 398 Mariusz Michta (Zielona Góra) OPTIMAL SOLUTIONS TO STOCHASTIC DIFFERENTIAL INCLUSIONS Abstract. A martingale problem approach is used first to analyze

More information

Maximum Likelihood Drift Estimation for Gaussian Process with Stationary Increments

Maximum Likelihood Drift Estimation for Gaussian Process with Stationary Increments Austrian Journal of Statistics April 27, Volume 46, 67 78. AJS http://www.ajs.or.at/ doi:.773/ajs.v46i3-4.672 Maximum Likelihood Drift Estimation for Gaussian Process with Stationary Increments Yuliya

More information

On continuous time contract theory

On continuous time contract theory Ecole Polytechnique, France Journée de rentrée du CMAP, 3 octobre, 218 Outline 1 2 Semimartingale measures on the canonical space Random horizon 2nd order backward SDEs (Static) Principal-Agent Problem

More information

Independence of some multiple Poisson stochastic integrals with variable-sign kernels

Independence of some multiple Poisson stochastic integrals with variable-sign kernels Independence of some multiple Poisson stochastic integrals with variable-sign kernels Nicolas Privault Division of Mathematical Sciences School of Physical and Mathematical Sciences Nanyang Technological

More information

Stochastic Processes II/ Wahrscheinlichkeitstheorie III. Lecture Notes

Stochastic Processes II/ Wahrscheinlichkeitstheorie III. Lecture Notes BMS Basic Course Stochastic Processes II/ Wahrscheinlichkeitstheorie III Michael Scheutzow Lecture Notes Technische Universität Berlin Sommersemester 218 preliminary version October 12th 218 Contents

More information

arxiv: v2 [math.pr] 22 Aug 2009

arxiv: v2 [math.pr] 22 Aug 2009 On the structure of Gaussian random variables arxiv:97.25v2 [math.pr] 22 Aug 29 Ciprian A. Tudor SAMOS/MATISSE, Centre d Economie de La Sorbonne, Université de Panthéon-Sorbonne Paris, 9, rue de Tolbiac,

More information

9 Radon-Nikodym theorem and conditioning

9 Radon-Nikodym theorem and conditioning Tel Aviv University, 2015 Functions of real variables 93 9 Radon-Nikodym theorem and conditioning 9a Borel-Kolmogorov paradox............. 93 9b Radon-Nikodym theorem.............. 94 9c Conditioning.....................

More information

Nonparametric Drift Estimation for Stochastic Differential Equations

Nonparametric Drift Estimation for Stochastic Differential Equations Nonparametric Drift Estimation for Stochastic Differential Equations Gareth Roberts 1 Department of Statistics University of Warwick Brazilian Bayesian meeting, March 2010 Joint work with O. Papaspiliopoulos,

More information

(B(t i+1 ) B(t i )) 2

(B(t i+1 ) B(t i )) 2 ltcc5.tex Week 5 29 October 213 Ch. V. ITÔ (STOCHASTIC) CALCULUS. WEAK CONVERGENCE. 1. Quadratic Variation. A partition π n of [, t] is a finite set of points t ni such that = t n < t n1

More information

GENERALIZED COVARIATION FOR BANACH SPACE VALUED PROCESSES, ITÔ FORMULA AND APPLICATIONS

GENERALIZED COVARIATION FOR BANACH SPACE VALUED PROCESSES, ITÔ FORMULA AND APPLICATIONS Di Girolami, C. and Russo, F. Osaka J. Math. 51 (214), 729 783 GENERALIZED COVARIATION FOR BANACH SPACE VALUED PROCESSES, ITÔ FORMULA AND APPLICATIONS CRISTINA DI GIROLAMI and FRANCESCO RUSSO (Received

More information

Tools from Lebesgue integration

Tools from Lebesgue integration Tools from Lebesgue integration E.P. van den Ban Fall 2005 Introduction In these notes we describe some of the basic tools from the theory of Lebesgue integration. Definitions and results will be given

More information

µ X (A) = P ( X 1 (A) )

µ X (A) = P ( X 1 (A) ) 1 STOCHASTIC PROCESSES This appendix provides a very basic introduction to the language of probability theory and stochastic processes. We assume the reader is familiar with the general measure and integration

More information

Stochastic Calculus February 11, / 33

Stochastic Calculus February 11, / 33 Martingale Transform M n martingale with respect to F n, n =, 1, 2,... σ n F n (σ M) n = n 1 i= σ i(m i+1 M i ) is a Martingale E[(σ M) n F n 1 ] n 1 = E[ σ i (M i+1 M i ) F n 1 ] i= n 2 = σ i (M i+1 M

More information

ELEMENTS OF STOCHASTIC CALCULUS VIA REGULARISATION. A la mémoire de Paul-André Meyer

ELEMENTS OF STOCHASTIC CALCULUS VIA REGULARISATION. A la mémoire de Paul-André Meyer ELEMENTS OF STOCHASTIC CALCULUS VIA REGULARISATION A la mémoire de Paul-André Meyer Francesco Russo (1 and Pierre Vallois (2 (1 Université Paris 13 Institut Galilée, Mathématiques 99 avenue J.B. Clément

More information

Fast-slow systems with chaotic noise

Fast-slow systems with chaotic noise Fast-slow systems with chaotic noise David Kelly Ian Melbourne Courant Institute New York University New York NY www.dtbkelly.com May 12, 215 Averaging and homogenization workshop, Luminy. Fast-slow systems

More information

Some SDEs with distributional drift Part I : General calculus. Flandoli, Franco; Russo, Francesco; Wolf, Jochen

Some SDEs with distributional drift Part I : General calculus. Flandoli, Franco; Russo, Francesco; Wolf, Jochen Title Author(s) Some SDEs with distributional drift Part I : General calculus Flandoli, Franco; Russo, Francesco; Wolf, Jochen Citation Osaka Journal of Mathematics. 4() P.493-P.54 Issue Date 3-6 Text

More information

Brownian Motion on Manifold

Brownian Motion on Manifold Brownian Motion on Manifold QI FENG Purdue University feng71@purdue.edu August 31, 2014 QI FENG (Purdue University) Brownian Motion on Manifold August 31, 2014 1 / 26 Overview 1 Extrinsic construction

More information

Generalized Hypothesis Testing and Maximizing the Success Probability in Financial Markets

Generalized Hypothesis Testing and Maximizing the Success Probability in Financial Markets Generalized Hypothesis Testing and Maximizing the Success Probability in Financial Markets Tim Leung 1, Qingshuo Song 2, and Jie Yang 3 1 Columbia University, New York, USA; leung@ieor.columbia.edu 2 City

More information

An Introduction to Malliavin Calculus. Denis Bell University of North Florida

An Introduction to Malliavin Calculus. Denis Bell University of North Florida An Introduction to Malliavin Calculus Denis Bell University of North Florida Motivation - the hypoellipticity problem Definition. A differential operator G is hypoelliptic if, whenever the equation Gu

More information

Nonlinear representation, backward SDEs, and application to the Principal-Agent problem

Nonlinear representation, backward SDEs, and application to the Principal-Agent problem Nonlinear representation, backward SDEs, and application to the Principal-Agent problem Ecole Polytechnique, France April 4, 218 Outline The Principal-Agent problem Formulation 1 The Principal-Agent problem

More information

Martingale Problems. Abhay G. Bhatt Theoretical Statistics and Mathematics Unit Indian Statistical Institute, Delhi

Martingale Problems. Abhay G. Bhatt Theoretical Statistics and Mathematics Unit Indian Statistical Institute, Delhi s Abhay G. Bhatt Theoretical Statistics and Mathematics Unit Indian Statistical Institute, Delhi Lectures on Probability and Stochastic Processes III Indian Statistical Institute, Kolkata 20 24 November

More information

Continuous Time Finance

Continuous Time Finance Continuous Time Finance Lisbon 2013 Tomas Björk Stockholm School of Economics Tomas Björk, 2013 Contents Stochastic Calculus (Ch 4-5). Black-Scholes (Ch 6-7. Completeness and hedging (Ch 8-9. The martingale

More information

ON A SDE DRIVEN BY A FRACTIONAL BROWNIAN MOTION AND WITH MONOTONE DRIFT

ON A SDE DRIVEN BY A FRACTIONAL BROWNIAN MOTION AND WITH MONOTONE DRIFT Elect. Comm. in Probab. 8 23122 134 ELECTRONIC COMMUNICATIONS in PROBABILITY ON A SDE DRIVEN BY A FRACTIONAL BROWNIAN MOTION AND WIT MONOTONE DRIFT BRAIM BOUFOUSSI 1 Cadi Ayyad University FSSM, Department

More information

Man Kyu Im*, Un Cig Ji **, and Jae Hee Kim ***

Man Kyu Im*, Un Cig Ji **, and Jae Hee Kim *** JOURNAL OF THE CHUNGCHEONG MATHEMATICAL SOCIETY Volume 19, No. 4, December 26 GIRSANOV THEOREM FOR GAUSSIAN PROCESS WITH INDEPENDENT INCREMENTS Man Kyu Im*, Un Cig Ji **, and Jae Hee Kim *** Abstract.

More information

Metric spaces and metrizability

Metric spaces and metrizability 1 Motivation Metric spaces and metrizability By this point in the course, this section should not need much in the way of motivation. From the very beginning, we have talked about R n usual and how relatively

More information

Lecture 4: Numerical Solution of SDEs, Itô Taylor Series, Gaussian Process Approximations

Lecture 4: Numerical Solution of SDEs, Itô Taylor Series, Gaussian Process Approximations Lecture 4: Numerical Solution of SDEs, Itô Taylor Series, Gaussian Process Approximations Simo Särkkä Aalto University Tampere University of Technology Lappeenranta University of Technology Finland November

More information

On the submartingale / supermartingale property of diffusions in natural scale

On the submartingale / supermartingale property of diffusions in natural scale On the submartingale / supermartingale property of diffusions in natural scale Alexander Gushchin Mikhail Urusov Mihail Zervos November 13, 214 Abstract Kotani 5 has characterised the martingale property

More information

Rough paths methods 4: Application to fbm

Rough paths methods 4: Application to fbm Rough paths methods 4: Application to fbm Samy Tindel Purdue University University of Aarhus 2016 Samy T. (Purdue) Rough Paths 4 Aarhus 2016 1 / 67 Outline 1 Main result 2 Construction of the Levy area:

More information

Stochastic Differential Equations

Stochastic Differential Equations Chapter 5 Stochastic Differential Equations We would like to introduce stochastic ODE s without going first through the machinery of stochastic integrals. 5.1 Itô Integrals and Itô Differential Equations

More information

On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem

On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem Koichiro TAKAOKA Dept of Applied Physics, Tokyo Institute of Technology Abstract M Yor constructed a family

More information

STAT331. Combining Martingales, Stochastic Integrals, and Applications to Logrank Test & Cox s Model

STAT331. Combining Martingales, Stochastic Integrals, and Applications to Logrank Test & Cox s Model STAT331 Combining Martingales, Stochastic Integrals, and Applications to Logrank Test & Cox s Model Because of Theorem 2.5.1 in Fleming and Harrington, see Unit 11: For counting process martingales with

More information

Strong uniqueness for stochastic evolution equations with possibly unbounded measurable drift term

Strong uniqueness for stochastic evolution equations with possibly unbounded measurable drift term 1 Strong uniqueness for stochastic evolution equations with possibly unbounded measurable drift term Enrico Priola Torino (Italy) Joint work with G. Da Prato, F. Flandoli and M. Röckner Stochastic Processes

More information

Quasi-sure Stochastic Analysis through Aggregation

Quasi-sure Stochastic Analysis through Aggregation Quasi-sure Stochastic Analysis through Aggregation H. Mete Soner Nizar Touzi Jianfeng Zhang March 7, 21 Abstract This paper is on developing stochastic analysis simultaneously under a general family of

More information

1. Geometry of the unit tangent bundle

1. Geometry of the unit tangent bundle 1 1. Geometry of the unit tangent bundle The main reference for this section is [8]. In the following, we consider (M, g) an n-dimensional smooth manifold endowed with a Riemannian metric g. 1.1. Notations

More information

Functional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals

Functional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals Functional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals Noèlia Viles Cuadros BCAM- Basque Center of Applied Mathematics with Prof. Enrico

More information

6. Brownian Motion. Q(A) = P [ ω : x(, ω) A )

6. Brownian Motion. Q(A) = P [ ω : x(, ω) A ) 6. Brownian Motion. stochastic process can be thought of in one of many equivalent ways. We can begin with an underlying probability space (Ω, Σ, P) and a real valued stochastic process can be defined

More information

Part II Probability and Measure

Part II Probability and Measure Part II Probability and Measure Theorems Based on lectures by J. Miller Notes taken by Dexter Chua Michaelmas 2016 These notes are not endorsed by the lecturers, and I have modified them (often significantly)

More information

Integral representations in models with long memory

Integral representations in models with long memory Integral representations in models with long memory Georgiy Shevchenko, Yuliya Mishura, Esko Valkeila, Lauri Viitasaari, Taras Shalaiko Taras Shevchenko National University of Kyiv 29 September 215, Ulm

More information

The Wiener Itô Chaos Expansion

The Wiener Itô Chaos Expansion 1 The Wiener Itô Chaos Expansion The celebrated Wiener Itô chaos expansion is fundamental in stochastic analysis. In particular, it plays a crucial role in the Malliavin calculus as it is presented in

More information

Gaussian vectors and central limit theorem

Gaussian vectors and central limit theorem Gaussian vectors and central limit theorem Samy Tindel Purdue University Probability Theory 2 - MA 539 Samy T. Gaussian vectors & CLT Probability Theory 1 / 86 Outline 1 Real Gaussian random variables

More information

A connection between the stochastic heat equation and fractional Brownian motion, and a simple proof of a result of Talagrand

A connection between the stochastic heat equation and fractional Brownian motion, and a simple proof of a result of Talagrand A connection between the stochastic heat equation and fractional Brownian motion, and a simple proof of a result of Talagrand Carl Mueller 1 and Zhixin Wu Abstract We give a new representation of fractional

More information

Lecture 4: Numerical Solution of SDEs, Itô Taylor Series, Gaussian Approximations

Lecture 4: Numerical Solution of SDEs, Itô Taylor Series, Gaussian Approximations Lecture 4: Numerical Solution of SDEs, Itô Taylor Series, Gaussian Approximations Simo Särkkä Aalto University, Finland November 18, 2014 Simo Särkkä (Aalto) Lecture 4: Numerical Solution of SDEs November

More information

EE731 Lecture Notes: Matrix Computations for Signal Processing

EE731 Lecture Notes: Matrix Computations for Signal Processing EE731 Lecture Notes: Matrix Computations for Signal Processing James P. Reilly c Department of Electrical and Computer Engineering McMaster University September 22, 2005 0 Preface This collection of ten

More information

Gaussian Processes. 1. Basic Notions

Gaussian Processes. 1. Basic Notions Gaussian Processes 1. Basic Notions Let T be a set, and X : {X } T a stochastic process, defined on a suitable probability space (Ω P), that is indexed by T. Definition 1.1. We say that X is a Gaussian

More information

Weak solutions of mean-field stochastic differential equations

Weak solutions of mean-field stochastic differential equations Weak solutions of mean-field stochastic differential equations Juan Li School of Mathematics and Statistics, Shandong University (Weihai), Weihai 26429, China. Email: juanli@sdu.edu.cn Based on joint works

More information

arxiv: v2 [math.pr] 27 Oct 2015

arxiv: v2 [math.pr] 27 Oct 2015 A brief note on the Karhunen-Loève expansion Alen Alexanderian arxiv:1509.07526v2 [math.pr] 27 Oct 2015 October 28, 2015 Abstract We provide a detailed derivation of the Karhunen Loève expansion of a stochastic

More information

Stochastic Integration and Continuous Time Models

Stochastic Integration and Continuous Time Models Chapter 3 Stochastic Integration and Continuous Time Models 3.1 Brownian Motion The single most important continuous time process in the construction of financial models is the Brownian motion process.

More information

Kolmogorov equations in Hilbert spaces IV

Kolmogorov equations in Hilbert spaces IV March 26, 2010 Other types of equations Let us consider the Burgers equation in = L 2 (0, 1) dx(t) = (AX(t) + b(x(t))dt + dw (t) X(0) = x, (19) where A = ξ 2, D(A) = 2 (0, 1) 0 1 (0, 1), b(x) = ξ 2 (x

More information

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS PROBABILITY: LIMIT THEOREMS II, SPRING 218. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please

More information

On the Markovian projection in the Brunick-Shreve mimicking result

On the Markovian projection in the Brunick-Shreve mimicking result On the Markovian projection in the Brunick-Shreve mimicking result Martin Forde Dept. Mathematics King s College London, London WC2R 2LS 2th June 213 Abstract: For a one-dimensional Itô process X t = σ

More information

Dynkin (λ-) and π-systems; monotone classes of sets, and of functions with some examples of application (mainly of a probabilistic flavor)

Dynkin (λ-) and π-systems; monotone classes of sets, and of functions with some examples of application (mainly of a probabilistic flavor) Dynkin (λ-) and π-systems; monotone classes of sets, and of functions with some examples of application (mainly of a probabilistic flavor) Matija Vidmar February 7, 2018 1 Dynkin and π-systems Some basic

More information

Stochastic Differential Equations.

Stochastic Differential Equations. Chapter 3 Stochastic Differential Equations. 3.1 Existence and Uniqueness. One of the ways of constructing a Diffusion process is to solve the stochastic differential equation dx(t) = σ(t, x(t)) dβ(t)

More information

4 Derivations of the Discrete-Time Kalman Filter

4 Derivations of the Discrete-Time Kalman Filter Technion Israel Institute of Technology, Department of Electrical Engineering Estimation and Identification in Dynamical Systems (048825) Lecture Notes, Fall 2009, Prof N Shimkin 4 Derivations of the Discrete-Time

More information

Time-Consistent Mean-Variance Portfolio Selection in Discrete and Continuous Time

Time-Consistent Mean-Variance Portfolio Selection in Discrete and Continuous Time ime-consistent Mean-Variance Portfolio Selection in Discrete and Continuous ime Christoph Czichowsky Department of Mathematics EH Zurich BFS 21 oronto, 26th June 21 Christoph Czichowsky (EH Zurich) Mean-variance

More information

A linear algebra proof of the fundamental theorem of algebra

A linear algebra proof of the fundamental theorem of algebra A linear algebra proof of the fundamental theorem of algebra Andrés E. Caicedo May 18, 2010 Abstract We present a recent proof due to Harm Derksen, that any linear operator in a complex finite dimensional

More information

ON THE STRUCTURE OF GAUSSIAN RANDOM VARIABLES

ON THE STRUCTURE OF GAUSSIAN RANDOM VARIABLES ON THE STRUCTURE OF GAUSSIAN RANDOM VARIABLES CIPRIAN A. TUDOR We study when a given Gaussian random variable on a given probability space Ω, F,P) is equal almost surely to β 1 where β is a Brownian motion

More information

Lecture 4: Harmonic forms

Lecture 4: Harmonic forms Lecture 4: Harmonic forms Jonathan Evans 29th September 2010 Jonathan Evans () Lecture 4: Harmonic forms 29th September 2010 1 / 15 Jonathan Evans () Lecture 4: Harmonic forms 29th September 2010 2 / 15

More information

IEOR 4701: Stochastic Models in Financial Engineering. Summer 2007, Professor Whitt. SOLUTIONS to Homework Assignment 9: Brownian motion

IEOR 4701: Stochastic Models in Financial Engineering. Summer 2007, Professor Whitt. SOLUTIONS to Homework Assignment 9: Brownian motion IEOR 471: Stochastic Models in Financial Engineering Summer 27, Professor Whitt SOLUTIONS to Homework Assignment 9: Brownian motion In Ross, read Sections 1.1-1.3 and 1.6. (The total required reading there

More information

Putzer s Algorithm. Norman Lebovitz. September 8, 2016

Putzer s Algorithm. Norman Lebovitz. September 8, 2016 Putzer s Algorithm Norman Lebovitz September 8, 2016 1 Putzer s algorithm The differential equation dx = Ax, (1) dt where A is an n n matrix of constants, possesses the fundamental matrix solution exp(at),

More information

Convoluted Brownian motions: a class of remarkable Gaussian processes

Convoluted Brownian motions: a class of remarkable Gaussian processes Convoluted Brownian motions: a class of remarkable Gaussian processes Sylvie Roelly Random models with applications in the natural sciences Bogotá, December 11-15, 217 S. Roelly (Universität Potsdam) 1

More information

Convergence of Particle Filtering Method for Nonlinear Estimation of Vortex Dynamics

Convergence of Particle Filtering Method for Nonlinear Estimation of Vortex Dynamics Convergence of Particle Filtering Method for Nonlinear Estimation of Vortex Dynamics Meng Xu Department of Mathematics University of Wyoming February 20, 2010 Outline 1 Nonlinear Filtering Stochastic Vortex

More information

Stochastic Calculus (Lecture #3)

Stochastic Calculus (Lecture #3) Stochastic Calculus (Lecture #3) Siegfried Hörmann Université libre de Bruxelles (ULB) Spring 2014 Outline of the course 1. Stochastic processes in continuous time. 2. Brownian motion. 3. Itô integral:

More information

Geometric projection of stochastic differential equations

Geometric projection of stochastic differential equations Geometric projection of stochastic differential equations John Armstrong (King s College London) Damiano Brigo (Imperial) August 9, 2018 Idea: Projection Idea: Projection Projection gives a method of systematically

More information

Microlocal Methods in X-ray Tomography

Microlocal Methods in X-ray Tomography Microlocal Methods in X-ray Tomography Plamen Stefanov Purdue University Lecture I: Euclidean X-ray tomography Mini Course, Fields Institute, 2012 Plamen Stefanov (Purdue University ) Microlocal Methods

More information

Optimum CUSUM Tests for Detecting Changes in Continuous Time Processes

Optimum CUSUM Tests for Detecting Changes in Continuous Time Processes Optimum CUSUM Tests for Detecting Changes in Continuous Time Processes George V. Moustakides INRIA, Rennes, France Outline The change detection problem Overview of existing results Lorden s criterion and

More information

Quasi-invariant Measures on Path Space. Denis Bell University of North Florida

Quasi-invariant Measures on Path Space. Denis Bell University of North Florida Quasi-invariant Measures on Path Space Denis Bell University of North Florida Transformation of measure under the flow of a vector field Let E be a vector space (or a manifold), equipped with a finite

More information