Generalized Gaussian Bridges of Prediction-Invertible Processes


Generalized Gaussian Bridges of Prediction-Invertible Processes. Tommi Sottinen¹ and Adil Yazigi, University of Vaasa, Finland. Modern Stochastics: Theory and Applications III, September 10, 2012, Kyiv, Ukraine. ¹ http://www.uwasa.fi/~tsottine/

Abstract

A generalized bridge is the law of a stochastic process that is conditioned on multiple linear functionals of its path. We consider canonical representations of the bridges. In the canonical representation the linear spaces $L_t(X) = \overline{\mathrm{span}}\{X_s;\ s\le t\}$ coincide for all $t < T$ for both the original process and its bridge representation. A Gaussian process $X = (X_t)_{t\in[0,T]}$ is prediction-invertible if it can be recovered (in law, at least) from its prediction martingale: $X_t = \int_0^t p_T^{-1}(t,s)\,\mathrm{d}\mathrm{E}[X_T\mid\mathcal{F}^X_s]$. In discrete time all non-degenerate Gaussian processes are prediction-invertible. In continuous time this is most probably not true.

References

This work, S. and Yazigi (2012), combines and extends the results of Alili (2002) and Gasbarra, S. and Valkeila (2007).

Alili. Canonical decompositions of certain generalized Brownian bridges. Electron. Comm. Probab., 7, 2002, 27–36.

Gasbarra, S. and Valkeila. Gaussian bridges. Stochastic analysis and applications, Abel Symp. 2, 2007, 361–382.

S. and Yazigi. Generalized Gaussian Bridges. Preprint arXiv:1205.3405v1, May 2012.

Outline

1. Generalized (linearly conditioned) Bridges
2. Orthogonal Representation
3. Canonical Representation for Martingales
4. Prediction-Invertible Processes
5. Canonical Representation for Prediction-Invertible Processes
6. Open Questions

Generalized (linearly conditioned) Bridges

Let $X = (X_t)_{t\in[0,T]}$ be a continuous Gaussian process with positive definite covariance function $R$, mean function $m$ of bounded variation, and $X_0 = m(0)$. We consider the conditioning, or bridging, of $X$ on $N$ linear functionals $G_T = [G_T^i]_{i=1}^N$ of its paths:

$$G_T(X) = \int_0^T g(t)\,\mathrm{d}X_t = \left[\int_0^T g_i(t)\,\mathrm{d}X_t\right]_{i=1}^N.$$

Remark (On Linear Independence of Conditionings)
We assume, without any loss of generality, that the functions $g_i$ are linearly independent. Indeed, if this is not the case, then the linearly dependent, or redundant, components of $g$ can simply be removed from the conditioning (1) below without changing it.
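Example (Ordinary Bridge as a Special Case)
For concreteness, take $N = 1$ and $g \equiv 1$. Then
$$G_T(X) = \int_0^T \mathrm{d}X_t = X_T - X_0,$$
so conditioning on $G_T(X) = y$ is the classical bridge conditioning on the endpoint $X_T = X_0 + y$. With several functionals one conditions simultaneously on, say, the endpoint and weighted averages of the path.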

Generalized (linearly conditioned) Bridges

Informally, the generalized (Gaussian) bridge $X^{g;y}$ is (the law of) the (Gaussian) process $X$ conditioned on the set
$$\left\{\int_0^T g(t)\,\mathrm{d}X_t = y\right\} = \bigcap_{i=1}^N \left\{\int_0^T g_i(t)\,\mathrm{d}X_t = y_i\right\}. \quad (1)$$
The rigorous definition is given on the next slide.

Remark (Canonical Space Framework)
For the sake of convenience, we will work on the canonical filtered probability space $(\Omega, \mathcal{F}, \mathbb{F}, \mathbb{P})$, where $\Omega = C[0,T]$ and $\mathbb{P}$ corresponds to the Gaussian coordinate process $X_t(\omega) = \omega(t)$: $\mathbb{P} = \mathbb{P}[X \in \cdot\,]$. The filtration $\mathbb{F} = (\mathcal{F}_t)_{t\in[0,T]}$ is the intrinsic filtration of the coordinate process $X$, augmented with the null sets and made right-continuous, and $\mathcal{F} = \mathcal{F}_T$.

Generalized (linearly conditioned) Bridges

Definition (Generalized Gaussian Bridge)
The generalized bridge measure $\mathbb{P}^{g;y}$ is the regular conditional law
$$\mathbb{P}^{g;y} = \mathbb{P}^{g;y}[X\in\cdot\,] = \mathbb{P}\left[X\in\cdot\ \Big|\ \int_0^T g(t)\,\mathrm{d}X_t = y\right].$$
A representation of the generalized Gaussian bridge is any process $X^{g;y}$ satisfying
$$\mathbb{P}[X^{g;y}\in\cdot\,] = \mathbb{P}^{g;y}[X\in\cdot\,] = \mathbb{P}\left[X\in\cdot\ \Big|\ \int_0^T g(t)\,\mathrm{d}X_t = y\right].$$

Generalized (linearly conditioned) Bridges

Remark (Technical Observations)
1. Note that the conditioning on the $\mathbb{P}$-null set is not a problem, since the canonical space of continuous processes is small enough to admit regular conditional laws.
2. As a measure $\mathbb{P}^{g;y}$ the generalized Gaussian bridge is unique, but it has several different representations $X^{g;y}$. Indeed, any representation of the bridge can be combined with any $\mathbb{P}$-measure-preserving transformation to get a new representation.


Orthogonal Representation

Denote by $\langle\!\langle g\rangle\!\rangle$ the matrix
$$\langle\!\langle g\rangle\!\rangle_{ij} := \langle g_i, g_j\rangle := \mathrm{Cov}\left[\int_0^T g_i(t)\,\mathrm{d}X_t,\ \int_0^T g_j(t)\,\mathrm{d}X_t\right].$$

Remark ($\langle\!\langle g\rangle\!\rangle$ is invertible and depends on $g$ and $R$ only)
1. Note that $\langle\!\langle g\rangle\!\rangle$ does not depend on the mean of $X$ nor on the conditioned values $y$: $\langle\!\langle g\rangle\!\rangle$ depends only on the conditioning functions $g = [g_i]_{i=1}^N$ and the covariance $R$.
2. Since the $g_i$'s are linearly independent and $R$ is positive definite, the matrix $\langle\!\langle g\rangle\!\rangle$ is invertible.
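Example (Gram Matrix for Brownian Motion)
For standard Brownian motion $W$ the Itô isometry gives the covariance of the Wiener integrals above, so $\langle\!\langle g\rangle\!\rangle$ is simply the Gram matrix in $L^2[0,T]$:
$$\langle\!\langle g\rangle\!\rangle_{ij} = \mathrm{Cov}\left[\int_0^T g_i\,\mathrm{d}W,\ \int_0^T g_j\,\mathrm{d}W\right] = \int_0^T g_i(t)\,g_j(t)\,\mathrm{d}t,$$
which is invertible precisely when the $g_i$ are linearly independent in $L^2[0,T]$.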

Orthogonal Representation

Theorem (Orthogonal Representation)
The generalized Gaussian bridge $X^{g;y}$ can be represented as
$$X^{g;y}_t = X_t - \langle \mathbf{1}_t, g\rangle^\top \langle\!\langle g\rangle\!\rangle^{-1}\left(\int_0^T g(u)\,\mathrm{d}X_u - y\right). \quad (2)$$
Moreover, any generalized Gaussian bridge $X^{g;y}$ is a Gaussian process with
$$\mathrm{E}\big[X^{g;y}_t\big] = m(t) - \langle \mathbf{1}_t, g\rangle^\top \langle\!\langle g\rangle\!\rangle^{-1}\left(\int_0^T g(u)\,\mathrm{d}m(u) - y\right),$$
$$\mathrm{Cov}\big[X^{g;y}_t, X^{g;y}_s\big] = \langle \mathbf{1}_t, \mathbf{1}_s\rangle - \langle \mathbf{1}_t, g\rangle^\top \langle\!\langle g\rangle\!\rangle^{-1} \langle \mathbf{1}_s, g\rangle.$$
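Example (Classical Brownian Bridge from (2))
As a sanity check, let $X = W$ be a standard Brownian motion, $N = 1$ and $g \equiv 1$. Then $\langle\!\langle g\rangle\!\rangle = \mathrm{Var}(W_T) = T$ and $\langle \mathbf{1}_t, g\rangle = \mathrm{Cov}(W_t, W_T) = t$, so (2) reads
$$W^{g;y}_t = W_t - \frac{t}{T}\,(W_T - y),$$
the classical anticipative representation of the Brownian bridge from $0$ to $y$. Note that it indeed uses the final value $W_T$, i.e. the whole path up to time $T$.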

Orthogonal Representation

Corollary (Mean-Conditioning Invariance)
Let $X$ be a centered Gaussian process with $X_0 = 0$ and let $m$ be a function of bounded variation. Let $X^g := X^{g;0}$ be a bridge where the conditioning functionals are conditioned to zero. Then
$$(X+m)^{g;y}_t = X^g_t + m(t) - \langle \mathbf{1}_t, g\rangle^\top \langle\!\langle g\rangle\!\rangle^{-1}\int_0^T g(u)\,\mathrm{d}m(u) + \langle \mathbf{1}_t, g\rangle^\top \langle\!\langle g\rangle\!\rangle^{-1}\, y.$$

Remark (Normalization)
Because of the corollary above we can, and will, assume in what follows that $m = 0$ and $y = 0$.


Canonical Representation for Martingales

The problem with the orthogonal bridge representation (2) of $X^{g;y}$ is that in order to construct it at any point $t \in [0,T)$ one needs the whole path of the underlying process $X$ up to time $T$. In this section and the following sections we construct a bridge representation that is canonical in the following sense:

Definition (Canonical Representation)
The bridge $X^{g;y}$ is of canonical representation if, for all $t \in [0,T)$, $X^{g;y}_t \in L_t(X)$ and $X_t \in L_t(X^{g;y})$. Here $L_t(Y)$ is the closed linear subspace of $L^2(\Omega, \mathcal{F}, \mathbb{P})$ generated by the random variables $Y_s$, $s \le t$.

Canonical Representation for Martingales

Remark (Gaussian Specialities)
1. Since the conditional laws of Gaussian processes are Gaussian and Gaussian spaces are linear, the assumptions $X^{g;y}_t \in L_t(X)$ and $X_t \in L_t(X^{g;y})$ are the same as assuming that $X^{g;y}_t$ is $\mathcal{F}^X_t$-measurable and $X_t$ is $\mathcal{F}^{X^{g;y}}_t$-measurable (and, consequently, $\mathcal{F}^X_t = \mathcal{F}^{X^{g;y}}_t$). This fact is very special to Gaussian processes. Indeed, in general conditioned processes such as generalized bridges are not linear transformations of the underlying process.
2. Another Gaussian fact here is that the bridges are Gaussian. In general, conditioned processes do not belong to the same family of distributions as the original process.

Canonical Representation for Martingales

We shall require that the restricted measures $\mathbb{P}^{g;y}_t := \mathbb{P}^{g;y}|_{\mathcal{F}_t}$ and $\mathbb{P}_t := \mathbb{P}|_{\mathcal{F}_t}$ are equivalent for all $t < T$ (they are obviously singular for $t = T$). To this end we assume that the matrix
$$\langle\!\langle g\rangle\!\rangle_{ij}(t) := \mathrm{E}\left[\big(G^i_T(X) - G^i_t(X)\big)\big(G^j_T(X) - G^j_t(X)\big)\right] = \mathrm{E}\left[\int_t^T g_i(s)\,\mathrm{d}X_s \int_t^T g_j(s)\,\mathrm{d}X_s\right]$$
is invertible for all $t < T$.

Remark (On Notation)
In the previous section we considered the matrix $\langle\!\langle g\rangle\!\rangle$, but from now on we consider the function $\langle\!\langle g\rangle\!\rangle(\cdot)$. Their connection is of course $\langle\!\langle g\rangle\!\rangle = \langle\!\langle g\rangle\!\rangle(0)$. We hope that this overloading of notation does not cause confusion to the reader.

Canonical Representation for Martingales

Let now $M$ be a Gaussian martingale with strictly increasing bracket $\langle M\rangle$ and $M_0 = 0$.

Remark (Bracket and Non-Degeneracy)
Note that the bracket is strictly increasing if and only if the covariance $R$ is positive definite. Indeed, for Gaussian martingales we have $R(t,s) = \mathrm{Var}(M_{t\wedge s}) = \langle M\rangle_{t\wedge s}$.

Define a Volterra kernel
$$\ell_g(t,s) := g^\top(t)\,\langle\!\langle g\rangle\!\rangle^{-1}(t)\,g(s). \quad (3)$$
Note that the kernel $\ell_g$ depends on the process $M$ through its covariance, and in the Gaussian martingale case we have
$$\langle\!\langle g\rangle\!\rangle_{ij}(t) = \int_t^T g_i(s)\,g_j(s)\,\mathrm{d}\langle M\rangle_s.$$
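Example (Kernel for the Brownian Bridge)
For standard Brownian motion ($\langle M\rangle_t = t$) with $N = 1$ and $g \equiv 1$ we get $\langle\!\langle g\rangle\!\rangle(t) = \int_t^T \mathrm{d}s = T - t$, and hence
$$\ell_g(t,s) = \frac{1}{T-t}.$$
We use this kernel below to check that the general formulas reproduce the classical Brownian bridge.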

Canonical Representation for Martingales

The following lemma is the key observation in finding the canonical generalized bridge representation. Actually, it is a multivariate version of Proposition 6 of Gasbarra, S. and Valkeila (2007).

Lemma (Radon–Nikodym for Bridges)
Let $\ell_g$ be given by (3) and let $M$ be a continuous Gaussian martingale with strictly increasing bracket $\langle M\rangle$ and $M_0 = 0$. Then
$$\log\frac{\mathrm{d}\mathbb{P}^g_t}{\mathrm{d}\mathbb{P}_t} = -\int_0^t\int_0^s \ell_g(s,u)\,\mathrm{d}M_u\,\mathrm{d}M_s - \frac12\int_0^t\left(\int_0^s \ell_g(s,u)\,\mathrm{d}M_u\right)^2 \mathrm{d}\langle M\rangle_s.$$
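Example (Checking the Lemma for the Brownian Bridge)
With $M = W$ and $\ell_g(s,u) = 1/(T-s)$ the inner integral is $W_s/(T-s)$, and an application of Itô's formula to $W_s^2/(T-s)$ turns the right-hand side into
$$-\int_0^t \frac{W_s}{T-s}\,\mathrm{d}W_s - \frac12\int_0^t \frac{W_s^2}{(T-s)^2}\,\mathrm{d}s = \frac12\log\frac{T}{T-t} - \frac{W_t^2}{2(T-t)},$$
which is indeed the log-likelihood $\log\big(\sqrt{T/(T-t)}\,e^{-W_t^2/(2(T-t))}\big)$ of the Brownian bridge to zero against the Wiener measure on $\mathcal{F}_t$.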

Canonical Representation for Martingales

Proof.
Let $p(\cdot\,;\mu,\Sigma)$ be the Gaussian density on $\mathbb{R}^N$ and let
$$\alpha^g_t(\mathrm{d}y) := \mathbb{P}\left[G_T(M)\in\mathrm{d}y\ \big|\ \mathcal{F}^M_t\right].$$
By the Bayes formula and the martingale property,
$$\frac{\mathrm{d}\mathbb{P}^g_t}{\mathrm{d}\mathbb{P}_t} = \frac{\mathrm{d}\alpha^g_t}{\mathrm{d}\alpha^g_0}(0) = \frac{p\big(0;\ G_t(M),\ \langle\!\langle g\rangle\!\rangle(t)\big)}{p\big(0;\ G_0(M),\ \langle\!\langle g\rangle\!\rangle(0)\big)}.$$
Denote $F(t, M_t) := -\frac12\, G_t(M)^\top \langle\!\langle g\rangle\!\rangle^{-1}(t)\, G_t(M)$ and the claim follows by using the Itô formula.

Canonical Representation for Martingales

Corollary (Semimartingale Decomposition)
The canonical bridge representation $M^g$ satisfies the stochastic differential equation
$$\mathrm{d}M_t = \mathrm{d}M^g_t + \left(\int_0^t \ell_g(t,s)\,\mathrm{d}M^g_s\right)\mathrm{d}\langle M\rangle_t, \quad (4)$$
where $\ell_g$ is given by (3). Moreover, $\langle M\rangle = \langle M^g\rangle$.

Proof.
The claim follows by using Girsanov's theorem.
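Example (Brownian Bridge Dynamics from (4))
With $\ell_g(t,s) = 1/(T-t)$, equation (4) reads $\mathrm{d}W_t = \mathrm{d}W^g_t + \frac{W^g_t}{T-t}\,\mathrm{d}t$, i.e.,
$$\mathrm{d}W^g_t = \mathrm{d}W_t - \frac{W^g_t}{T-t}\,\mathrm{d}t,$$
which is the classical (non-anticipative) SDE of the Brownian bridge to zero, driven by the Brownian motion $W$.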

Canonical Representation for Martingales

Remark (Equivalence and Singularity)
Note that, for all $t < T$,
$$\int_0^t\int_0^s \ell_g(s,u)^2\,\mathrm{d}\langle M\rangle_u\,\mathrm{d}\langle M\rangle_s < \infty.$$
In view of (4) this means that $M$ and $M^g$ are equivalent on $[0,T)$. Indeed, equation (4) can be viewed as the Hitsuda representation between two equivalent Gaussian processes. Also note that
$$\int_0^T\int_0^s \ell_g(s,u)^2\,\mathrm{d}\langle M\rangle_u\,\mathrm{d}\langle M\rangle_s = \infty,$$
meaning, by the Hitsuda representation theorem, that $M$ and $M^g$ are singular on $[0,T]$.
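Example (Blow-Up for the Brownian Bridge)
For the Brownian bridge kernel $\ell_g(s,u) = 1/(T-s)$,
$$\int_0^T\int_0^s \frac{\mathrm{d}u\,\mathrm{d}s}{(T-s)^2} = \int_0^T \frac{s}{(T-s)^2}\,\mathrm{d}s = \infty,$$
while the same integral over $[0,t]$ is finite for every $t < T$: the bridge is equivalent to Brownian motion on every $[0,t]$, $t < T$, but singular to it at the terminal time.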

Canonical Representation for Martingales

Next we solve the stochastic differential equation (4) of Corollary 6. In general, solving a Volterra–Stieltjes equation like (4) in closed form is difficult. Of course, the general theory of Volterra equations suggests that the solution will be of the form (6) of the next theorem, where $\ell^*_g$ is the resolvent kernel of $\ell_g$ determined by the resolvent equation (7) given below. Also, the general theory suggests that the resolvent kernel can be calculated implicitly by using the Neumann series. In our case the kernel $\ell_g$ is a quadratic form that factorizes in its arguments. This allows us to calculate the resolvent $\ell^*_g$ explicitly as (5) below.

Canonical Representation for Martingales

Theorem (Solution to an SDE)
Let $s \le t \in [0,T]$. Define the Volterra kernel
$$\ell^*_g(t,s) := \ell_g(t,s)\,\frac{|\langle\!\langle g\rangle\!\rangle(t)|}{|\langle\!\langle g\rangle\!\rangle(s)|} = \frac{|\langle\!\langle g\rangle\!\rangle(t)|\; g^\top(t)\,\langle\!\langle g\rangle\!\rangle^{-1}(t)\,g(s)}{|\langle\!\langle g\rangle\!\rangle(s)|}, \quad (5)$$
where $|\cdot|$ denotes the determinant. Then the bridge $M^g$ has the canonical representation
$$\mathrm{d}M^g_t = \mathrm{d}M_t - \left(\int_0^t \ell^*_g(t,s)\,\mathrm{d}M_s\right)\mathrm{d}\langle M\rangle_t, \quad (6)$$
i.e., (6) is the solution to (4).
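Example (Canonical Representation of the Brownian Bridge)
For the Brownian bridge, $\ell^*_g(t,s) = \frac{1}{T-t}\cdot\frac{T-t}{T-s} = \frac{1}{T-s}$, so (6) gives
$$W^g_t = W_t - \int_0^t\!\int_0^s \frac{\mathrm{d}W_u}{T-u}\,\mathrm{d}s = (T-t)\int_0^t \frac{\mathrm{d}W_s}{T-s},$$
where the last equality follows by integration by parts. This is the well-known canonical (Volterra) representation of the Brownian bridge.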

Canonical Representation for Martingales

Proof.
Equation (6) is the solution to (4) if the kernel $\ell^*_g$ satisfies the resolvent equation
$$\ell_g(t,s) - \ell^*_g(t,s) = \int_s^t \ell_g(t,u)\,\ell^*_g(u,s)\,\mathrm{d}\langle M\rangle_u. \quad (7)$$
Indeed, suppose (6) is the solution to (4). This means that
$$\mathrm{d}M_t = \mathrm{d}M_t - \left(\int_0^t \ell^*_g(t,s)\,\mathrm{d}M_s\right)\mathrm{d}\langle M\rangle_t + \left(\int_0^t \ell_g(t,s)\left[\mathrm{d}M_s - \left(\int_0^s \ell^*_g(s,u)\,\mathrm{d}M_u\right)\mathrm{d}\langle M\rangle_s\right]\right)\mathrm{d}\langle M\rangle_t.$$

Canonical Representation for Martingales

Proof (continued).
In integral form, by using Fubini's theorem, this means that
$$M_t = M_t - \int_0^t\!\int_s^t \ell^*_g(u,s)\,\mathrm{d}\langle M\rangle_u\,\mathrm{d}M_s + \int_0^t\!\int_s^t \ell_g(u,s)\,\mathrm{d}\langle M\rangle_u\,\mathrm{d}M_s - \int_0^t\!\int_s^t\!\int_s^u \ell_g(u,v)\,\ell^*_g(v,s)\,\mathrm{d}\langle M\rangle_v\,\mathrm{d}\langle M\rangle_u\,\mathrm{d}M_s.$$
The resolvent criterion (7) follows by identifying the integrands in the $\mathrm{d}\langle M\rangle_u\,\mathrm{d}M_s$-integrals above. Now it is straightforward, but tedious, to check that the resolvent equation holds. We omit the details.
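Example (Resolvent Check for the Brownian Bridge)
For $\ell_g(t,s) = 1/(T-t)$ and $\ell^*_g(t,s) = 1/(T-s)$ the resolvent equation (7) is elementary to verify:
$$\int_s^t \frac{1}{T-t}\cdot\frac{1}{T-s}\,\mathrm{d}u = \frac{t-s}{(T-t)(T-s)} = \frac{1}{T-t} - \frac{1}{T-s} = \ell_g(t,s) - \ell^*_g(t,s).$$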


Prediction-Invertible Processes

Let us now consider a Gaussian process $X$ that is not a martingale. For a (Gaussian) process $X$ its prediction martingale is the process $\hat X$ defined as
$$\hat X_t = \mathrm{E}\left[X_T\ \big|\ \mathcal{F}^X_t\right].$$
Since for Gaussian processes $\hat X_t \in L_t(X)$, we may write, at least formally, that
$$\hat X_t = \int_0^t p(t,s)\,\mathrm{d}X_s,$$
where the abstract kernel $p$ depends also on $T$ (since $\hat X$ depends on $T$). In the next definition we assume, among other things, that the kernel $p$ exists as a real, and not only formal, function. We also assume that the kernel $p$ is invertible.

Prediction-Invertible Processes

Definition (Prediction-Invertibility)
A Gaussian process $X$ is prediction-invertible if there exists a kernel $p$ such that its prediction martingale $\hat X$ is continuous, can be represented as
$$\hat X_t = \int_0^t p(t,s)\,\mathrm{d}X_s,$$
and there exists an inverse kernel $p^{-1}$ such that, for all $t\in[0,T]$, $p^{-1}(t,\cdot)\in L^2([0,T],\mathrm{d}\langle\hat X\rangle)$ and $X$ can be recovered from $\hat X$ by
$$X_t = \int_0^t p^{-1}(t,s)\,\mathrm{d}\hat X_s.$$
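Example (Martingales are Trivially Prediction-Invertible)
If $X = M$ is itself a continuous Gaussian martingale with $M_0 = 0$, then $\hat X_t = \mathrm{E}[M_T\mid\mathcal{F}^M_t] = M_t$, so one can take $p(t,s) = p^{-1}(t,s) = 1$, and the definition is satisfied trivially.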

Prediction-Invertible Processes

Remark
In general it seems to be a difficult problem to determine whether a Gaussian process is prediction-invertible or not. In the discrete-time non-degenerate case all Gaussian processes are prediction-invertible. In continuous time the situation is more difficult, as the next example illustrates. Nevertheless, we can immediately see that we must have
$$R(t,s) = \int_0^{t\wedge s} p^{-1}(t,u)\,p^{-1}(s,u)\,\mathrm{d}\langle\hat X\rangle_u,$$
where $\langle\hat X\rangle_u = \mathrm{Var}\big(\mathrm{E}[X_T\mid\mathcal{F}_u]\big)$. However, this criterion does not seem to be very helpful in practice.
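Example (Covariance Criterion in the Martingale Case)
For a Gaussian martingale $M$ with $p^{-1} \equiv 1$ the criterion above reduces to
$$R(t,s) = \int_0^{t\wedge s} \mathrm{d}\langle\hat M\rangle_u = \mathrm{Var}\big(\mathrm{E}[M_T\mid\mathcal{F}_{t\wedge s}]\big) = \mathrm{Var}(M_{t\wedge s}),$$
which is just the usual covariance of a martingale.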

Prediction-Invertible Processes

Example (Fine Points of Prediction-Invertibility)
Consider the Gaussian slope $X_t = t\xi$, $t\in[0,T]$, where $\xi$ is a standard normal random variable. Now, if we consider the raw filtration $\mathcal{G}^X_t = \sigma(X_s;\ s\le t)$, then $X$ is not prediction-invertible. Indeed, then $\hat X_0 = 0$ but $\hat X_t = X_T$ if $t\in(0,T]$. So, $\hat X$ is not continuous. On the other hand, the augmented filtration is simply $\mathcal{F}^X_t = \sigma(\xi)$ for all $t\in[0,T]$. So, $\hat X_t = X_T$ for all $t$. Note, however, that in both cases the slope $X$ can be recovered from the prediction martingale: $X_t = \frac{t}{T}\,\hat X_t$.

Prediction-Invertible Processes

We want to represent integrals with respect to $X$ as integrals with respect to $\hat X$, and vice versa.

Definition (Linear Extensions of Kernels)
Let $X$ be prediction-invertible. Let $\mathrm{P}$ and $\mathrm{P}^{-1}$ extend the relations
$$\mathrm{P}[\mathbf{1}_t] = p(t,\cdot) \quad\text{and}\quad \mathrm{P}^{-1}[\mathbf{1}_t] = p^{-1}(t,\cdot)$$
linearly.

Lemma (Abstract vs. Concrete Wiener Integrals)
For $f$ and $\hat g$ nice enough,
$$\int_0^T f(t)\,\mathrm{d}X_t = \int_0^T \mathrm{P}^{-1}[f](t)\,\mathrm{d}\hat X_t, \qquad \int_0^T \hat g(t)\,\mathrm{d}\hat X_t = \int_0^T \mathrm{P}[\hat g](t)\,\mathrm{d}X_t.$$
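Example (Why the Lemma Holds for Indicators)
The lemma is built into the definition: for $f = \mathbf{1}_t$,
$$\int_0^T \mathbf{1}_t(s)\,\mathrm{d}X_s = X_t = \int_0^t p^{-1}(t,s)\,\mathrm{d}\hat X_s = \int_0^T \mathrm{P}^{-1}[\mathbf{1}_t](s)\,\mathrm{d}\hat X_s,$$
and the general case follows by linearity and approximation.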


Canonical Representation for Prediction-Invertible Processes

The next (our main) theorem follows basically by rewriting the conditioning as
$$\int_0^T g(t)\,\mathrm{d}X_t = \int_0^T \mathrm{P}^{-1}[g](t)\,\mathrm{d}\hat X_t = 0.$$

Theorem (Canonical Representation)
Let $X$ be a prediction-invertible Gaussian process. Then
$$X^g_t = X_t - \int_0^t\!\int_s^t p^{-1}(t,u)\,\mathrm{P}\big[\hat\ell^*_{\hat g}(u,\cdot)\big](s)\,\mathrm{d}\langle\hat X\rangle_u\,\mathrm{d}X_s,$$
where $\hat g = \mathrm{P}^{-1}[g]$ and
$$\hat\ell^*_{\hat g}(u,v) = \frac{\big|\langle\!\langle\hat g\rangle\!\rangle_{\langle\hat X\rangle}(u)\big|\;\hat g^\top(u)\,\langle\!\langle\hat g\rangle\!\rangle^{-1}_{\langle\hat X\rangle}(u)\,\hat g(v)}{\big|\langle\!\langle\hat g\rangle\!\rangle_{\langle\hat X\rangle}(v)\big|},$$
with $\langle\!\langle\hat g\rangle\!\rangle_{\langle\hat X\rangle,ij}(t) = \int_t^T \hat g_i(s)\,\hat g_j(s)\,\mathrm{d}\langle\hat X\rangle_s$.
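Remark (Consistency with the Martingale Case)
If $X$ is itself a Gaussian martingale, then $\hat X = X$ and $p = p^{-1} = \mathrm{P} = \mathrm{P}^{-1} = \mathrm{id}$, and the theorem collapses to the canonical representation (6) in integral form:
$$X^g_t = X_t - \int_0^t\!\int_s^t \hat\ell^*_{\hat g}(u,s)\,\mathrm{d}\langle X\rangle_u\,\mathrm{d}X_s.$$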

Canonical Representation for Prediction-Invertible Processes

Proof.
On the prediction-martingale level we have
$$\mathrm{d}\hat X^{\hat g}_s = \mathrm{d}\hat X_s - \left(\int_0^s \hat\ell^*_{\hat g}(s,u)\,\mathrm{d}\hat X_u\right)\mathrm{d}\langle\hat X\rangle_s.$$
Now, by operating with $p^{-1}$ and $\mathrm{P}$ we get
$$X^g_t = X_t - \int_0^t p^{-1}(t,s)\left(\int_0^s \hat\ell^*_{\hat g}(s,u)\,\mathrm{d}\hat X_u\right)\mathrm{d}\langle\hat X\rangle_s = X_t - \int_0^t p^{-1}(t,s)\left(\int_0^s \mathrm{P}\big[\hat\ell^*_{\hat g}(s,\cdot)\big](u)\,\mathrm{d}X_u\right)\mathrm{d}\langle\hat X\rangle_s.$$
Finally, the claim follows by using Fubini's theorem.


Open Questions

1. What kind of condition (for the covariance) would imply prediction-invertibility?
2. How to calculate the prediction-inversion kernel $p^{-1}(t,s)$?
3. What would be an example of a continuous Gaussian process that is not prediction-invertible?
4. What would ensure that $L_s(X) \subsetneq L_t(X)$ for $s < t$, that $L_s(X) = \bigcap_{t>s} L_t(X)$, and that $L_t(X) = L_t(X^g)$? Would this be enough for prediction-invertibility?

The End. Thank you for your attention!