Discrete Time Approximation and Monte-Carlo Simulation of Backward Stochastic Differential Equations


Bruno Bouchard
Université Paris VI, PMA, and CREST, Paris, France
bouchard@ccr.jussieu.fr

Nizar Touzi
CREST, Paris, France
touzi@ensae.fr

October 2002

Abstract

We suggest a discrete-time approximation for decoupled forward-backward stochastic differential equations. The $L^p$ norm of the error is shown to be of the order of the time step. Given a simulation-based estimator of the conditional expectation operator, we then suggest a backward simulation scheme, and we study the induced $L^p$ error. This estimate is further investigated in the context of the Malliavin approach for the approximation of conditional expectations. Extensions to the reflected case are also considered.

Key words: Monte-Carlo methods for reflected forward-backward SDE's, Malliavin calculus, regression estimation.

MSC 1991 subject classifications: 65C05, 60H07, 62G08.

1 Introduction

In this paper, we are interested in the problem of discretization and simulation of the decoupled forward-backward stochastic differential equation (SDE, hereafter) on the time interval $[0,1]$:

$$dX_t = b(X_t)\,dt + \sigma(X_t)\,dW_t, \qquad dY_t = -f(t, X_t, Y_t, Z_t)\,dt + Z_t\cdot dW_t,$$
$$X_0 = x \quad \text{and} \quad Y_1 = g(X_1),$$

where $W$ is a standard Brownian motion, and $b$, $\sigma$ and $f$ are valued respectively in $\mathbb{R}^n$, $M^n$ and $\mathbb{R}$. The analysis of this paper extends easily to the case of reflected backward SDE's with $z$-independent generator $f$. This extension is presented in the last section of this paper.

Notice that the problem of discretization and simulation of the forward component $X$ is well understood, see e.g. [18], and we are mainly interested in the backward component $Y$. Given a partition $\pi : 0 = t_0 < \dots < t_n = 1$ of the interval $[0,1]$, we consider first the naive Euler discretization of the backward SDE:

$$\tilde Y_{t_i} - \tilde Y_{t_{i-1}} = -f(t_{i-1}, X_{t_{i-1}}, \tilde Y_{t_{i-1}}, \tilde Z_{t_{i-1}})(t_i - t_{i-1}) + \tilde Z_{t_{i-1}}\cdot(W_{t_i} - W_{t_{i-1}}),$$

together with the final data $\tilde Y_{t_n} = g(X_{t_n})$. Of course, given $(\tilde Y_{t_i}, \tilde Z_{t_i})$, there are no $\mathcal{F}_{t_{i-1}}$-measurable random variables $(\tilde Y_{t_{i-1}}, \tilde Z_{t_{i-1}})$ which satisfy the above equation. A workable backward induction scheme is obtained by taking conditional expectations. This suggests naturally the following backward procedure for the definition of the discrete-time approximation $(Y^\pi, Z^\pi)$:

$$Y^\pi_{t_n} = g(X^\pi_{t_n}),$$
$$Z^\pi_{t_{i-1}} = (t_i - t_{i-1})^{-1}\, E\big[ Y^\pi_{t_i}\,(W_{t_i} - W_{t_{i-1}}) \,\big|\, \mathcal{F}_{t_{i-1}} \big],$$
$$Y^\pi_{t_{i-1}} = E\big[ Y^\pi_{t_i} \,\big|\, \mathcal{F}_{t_{i-1}} \big] + f(t_{i-1}, X^\pi_{t_{i-1}}, Y^\pi_{t_{i-1}}, Z^\pi_{t_{i-1}})(t_i - t_{i-1}),$$

for all $i = 1, \dots, n$. Here $\{\mathcal{F}_t\}$ is the completed filtration of the Brownian motion $W$. Our first main result, Theorem 3.1, is an estimate of the error $Y^\pi - Y$ of the order of $|\pi|^{1/2}$. A similar error estimate was obtained by [27], but with a slightly different, and less natural, discretization scheme.

The key ingredient for the simulation of the backward component $Y$ is the following well-known result: under standard Lipschitz conditions, the backward component and the associated control $(Y_t, Z_t)$, which solve the backward SDE, can be expressed as a function of $X_t$, i.e. $(Y_t, Z_t) = (u(t, X_t), v(t, X_t))$, $t \le 1$, for some deterministic functions $u$ and $v$. Then, the conditional expectations involved in the above discretization scheme reduce to the regression of $Y^\pi_{t_i}$ and $Y^\pi_{t_i}(W_{t_i} - W_{t_{i-1}})$ on the random variable $X^\pi_{t_{i-1}}$. For instance, one can use the classical kernel regression estimation, as in [9], the basis projection method suggested by [21], see also [11], or the Malliavin approach introduced in [15], and further developed in [7], see also [19]. Given a simulation-based approximation $\hat E_{t_{i-1}}$ of $E[\,\cdot\,|\mathcal{F}_{t_{i-1}}]$, we then analyse the backward simulation scheme

$$\hat Y^\pi_{t_n} = g(X^\pi_{t_n}),$$
$$\hat Z^\pi_{t_{i-1}} = (t_i - t_{i-1})^{-1}\, \hat E_{t_{i-1}}\big[ \hat Y^\pi_{t_i}\,(W_{t_i} - W_{t_{i-1}}) \big],$$
$$\hat Y^\pi_{t_{i-1}} = \hat E_{t_{i-1}}\big[ \hat Y^\pi_{t_i} \big] + f(t_{i-1}, X^\pi_{t_{i-1}}, \hat Y^\pi_{t_{i-1}}, \hat Z^\pi_{t_{i-1}})(t_i - t_{i-1}).$$

Let $\eta$ denote the maximum simulation error of $(\hat E_{t_{i-1}} - E_{t_{i-1}})[\hat Y^\pi_{t_i}]$ and $(\hat E_{t_{i-1}} - E_{t_{i-1}})[\hat Y^\pi_{t_i}(W_{t_i} - W_{t_{i-1}})]$. Observe that $\eta$ depends both on the number of simulated paths and on the time step. Also, given a number $N$ of simulated paths for the regression approximation, the best estimate that one can expect for $\eta$ is $N^{-1/2}$, the classical Monte Carlo error deduced from the Central Limit Theorem. Our second main result, Theorem 4.1, states that the $L^p$ norm of the error due to the regression estimation is of the order $|\pi|^{-1}\eta$. This rate of convergence is easily understood in the case of a regular grid, as the scheme involves $|\pi|^{-1}$ steps, each of them requiring some regression approximation. As a consequence of this result, for $|\pi| = n^{-1}$, we see that in order to achieve the rate $n^{-1/2}$, one needs to use at least $N = n^3$ simulated paths for the regression estimation.

We next investigate in more detail the errors $(\hat E_{t_{i-1}} - E_{t_{i-1}})[\hat Y^\pi_{t_i}]$ and $(\hat E_{t_{i-1}} - E_{t_{i-1}})[\hat Y^\pi_{t_i}(W_{t_i} - W_{t_{i-1}})]$. More precisely, we examine a difficulty common to the kernel and the Malliavin regression estimation methods: in both methods the regression estimator is the ratio of two statistics, which is not guaranteed to be integrable. We solve this difficulty by introducing a truncation procedure along the above backward simulation scheme. In Theorem 5.1, we show that this reduces the error to the analysis of the integrated standard deviation of the regression estimator. This quantity is estimated for the Malliavin regression estimator in Section 6. The results of this section imply an estimate of the $L^p$ error $\hat Y - Y^\pi$ of the order of $|\pi|^{-1-d/(4p)} N^{-1/(2p)}$, where $N$ is the number of simulated paths for the regression estimation, see Theorem 6.2. In order to better understand this result, let $|\pi| = n^{-1}$, i.e. $n$ time steps; then in order to achieve an error estimate of the order $n^{-1/2}$, one needs to use $N = n^{3p + d/2}$ simulated paths for the regression estimation at each step. In the limit case $p = 1$, this reduces to $N = n^{3 + d/2}$. Unfortunately, we have not been able to obtain the best expected number $N = n^3$ of simulated paths.

We conclude this introductory section with some references to the existing alternative numerical methods for backward SDE's. First, the four step algorithm was developed by [23] to solve a class of more general forward-backward SDE's, see also [13]. Their method is based on the finite difference approximation of the associated PDE, which unfortunately cannot be managed in high dimension. Recently, a quantization technique was suggested by [3] and [4] for the resolution of reflected backward SDE's when the generator $f$ does not depend on the control variable $z$. This method is based on the approximation of the continuous-time processes on a finite grid, and requires a further estimation of the transition probabilities on the grid. Discrete-time schemes based on the approximation of the Brownian motion by some discrete process have been considered in [10], [12], [8], [1] and [22]. This technique allows one to simplify the computation of the conditional expectations involved at each time step. However, the implementation of these schemes in high dimension is questionable. We finally refer to [2] for a random time scheme, which requires a further approximation of conditional expectations to give an implementation.

Notations: We shall denote by $M^{n,d}$ the set of all $n \times d$ matrices with real coefficients. We simply denote $\mathbb{R}^n := M^{n,1}$ and $M^n := M^{n,n}$. We shall denote by $|a| := (\sum_{i,j} a_{i,j}^2)^{1/2}$ the Euclidean norm on $M^{n,d}$, $a^*$ the transpose of $a$, and $a^k$ the $k$-th column of $a$, or the $k$-th

component if $a \in \mathbb{R}^d$. Finally, we denote by $x \cdot y := x^* y$ the scalar product on $\mathbb{R}^n$.

2 The simulation and discretization problem

Let $(\Omega, \{\mathcal{F}_t\}_{0\le t\le 1}, P)$ be a filtered probability space equipped with a $d$-dimensional standard Brownian motion $\{W_t\}_{0\le t\le 1}$. Consider two functions $b : \mathbb{R}^d \to \mathbb{R}^d$ and $\sigma : \mathbb{R}^d \to M^d$ satisfying the Lipschitz condition:

$$|b(u) - b(v)| + |\sigma(u) - \sigma(v)| \;\le\; K |u - v| \qquad(2.1)$$

for some constant $K$ independent of $u, v \in \mathbb{R}^d$. Then, it is well-known that, for any initial condition $x \in \mathbb{R}^d$, the forward stochastic differential equation

$$X_t = x + \int_0^t b(X_s)\,ds + \int_0^t \sigma(X_s)\,dW_s \qquad(2.2)$$

has a unique $\{\mathcal{F}_t\}$-adapted solution $\{X_t\}_{0\le t\le 1}$ satisfying

$$E\Big[ \sup_{0\le t\le 1} |X_t|^2 \Big] < \infty,$$

see e.g. [17]. Next, let $f : [0,1] \times \mathbb{R}^d \times \mathbb{R} \times \mathbb{R}^d \to \mathbb{R}$ and $g : \mathbb{R}^d \to \mathbb{R}$ be two functions satisfying the Lipschitz condition

$$|g(u) - g(v)| + |f(\xi) - f(\zeta)| \;\le\; K\big( |u - v| + |\xi - \zeta| \big) \qquad(2.3)$$

for some constant $K$ independent of $u, v \in \mathbb{R}^d$ and $\xi, \zeta \in [0,1] \times \mathbb{R}^d \times \mathbb{R} \times \mathbb{R}^d$. Consider the backward stochastic differential equation:

$$Y_t = g(X_1) + \int_t^1 f(s, X_s, Y_s, Z_s)\,ds - \int_t^1 Z_s\cdot dW_s, \qquad t \le 1. \qquad(2.4)$$

The Lipschitz condition (2.3) ensures the existence and uniqueness of an adapted solution $(Y, Z)$ to (2.4) satisfying

$$E\Big[ \sup_{0\le t\le 1} |Y_t|^2 + \int_0^1 |Z_t|^2\,dt \Big] < \infty,$$

see e.g. [24]. Equations (2.2)-(2.4) define a decoupled system of forward-backward stochastic differential equations. The purpose of this paper is to study the problem of discretization and simulation of the components $(X, Y)$ of the solution of (2.2)-(2.4).

Remark 2.1 Under the Lipschitz conditions (2.1)-(2.3), it is easily checked that

$$|Y_t| \;\le\; a_0 + a_1 |X_t|, \qquad 0 \le t \le 1,$$

for some parameters $a_0$ and $a_1$ depending on $K$, $b(0)$, $\sigma(0)$, $g(0)$ and $f(\cdot,0,0,0)$. In the subsequent paragraph, we shall derive a similar bound on the discrete-time approximation of $Y$. The a priori knowledge of such a bound will be of crucial importance for the simulation scheme suggested in this paper.

3 Discrete-time approximation error

In order to approximate the solution of the above BSDE, we introduce the following discretized version. Let $\pi : 0 = t_0 < t_1 < \dots < t_n = 1$ be a partition of the time interval $[0,1]$ with mesh $|\pi| := \max_{1\le i\le n}(t_i - t_{i-1})$. Throughout this paper, we shall use the notations

$$\Delta_i := t_i - t_{i-1} \quad \text{and} \quad \Delta W_i := W_{t_i} - W_{t_{i-1}}, \qquad i = 1, \dots, n.$$

The forward component $X$ will be approximated by the classical Euler scheme:

$$X^\pi_{t_0} = X_0, \qquad X^\pi_{t_i} = X^\pi_{t_{i-1}} + b(X^\pi_{t_{i-1}})\,\Delta_i + \sigma(X^\pi_{t_{i-1}})\,\Delta W_i, \quad i = 1, \dots, n, \qquad(3.1)$$

and we set $X^\pi_t := X^\pi_{t_{i-1}} + b(X^\pi_{t_{i-1}})(t - t_{i-1}) + \sigma(X^\pi_{t_{i-1}})(W_t - W_{t_{i-1}})$ for $t \in [t_{i-1}, t_i)$. We shall denote by $\{\mathcal{F}^\pi_{t_i}\}_{0\le i\le n}$ the associated discrete-time filtration:

$$\mathcal{F}^\pi_{t_i} := \sigma\big( X^\pi_{t_j},\ j \le i \big).$$

Under the Lipschitz conditions on $b$ and $\sigma$, the following $L^p$ estimate for the error due to the Euler scheme is well-known:

$$\limsup_{|\pi|\to 0}\ |\pi|^{-1/2}\, E\Big[ \sup_{0\le t\le 1} |X_t - X^\pi_t|^p + \max_{1\le i\le n} \sup_{t_{i-1}\le t\le t_i} |X_t - X_{t_{i-1}}|^p \Big]^{1/p} < \infty, \qquad(3.2)$$

for all $p \ge 1$, see e.g. [18]. We next consider the following natural discrete-time approximation of the backward component $Y$:

$$Y^\pi_{t_n} = g(X^\pi_{t_n}),$$
$$Z^\pi_{t_{i-1}} = \Delta_i^{-1}\, E_{t_{i-1}}\big[ Y^\pi_{t_i}\,\Delta W_i \big], \qquad(3.3)$$
$$Y^\pi_{t_{i-1}} = E_{t_{i-1}}\big[ Y^\pi_{t_i} \big] + f(t_{i-1}, X^\pi_{t_{i-1}}, Y^\pi_{t_{i-1}}, Z^\pi_{t_{i-1}})\,\Delta_i, \qquad 1 \le i \le n, \qquad(3.4)$$

where $E_{t_i} := E[\,\cdot\,|\mathcal{F}^\pi_{t_i}]$. The above conditional expectations are well-defined at each step of the algorithm. Indeed, by a backward induction argument, it is easily checked that $Y^\pi_{t_i} \in L^2$ for all $i$.
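The scheme (3.1) together with (3.3)-(3.4) translates directly into a backward Monte-Carlo recursion once a regression routine is supplied. The following Python sketch is illustrative only and is not part of the original paper: the helper names `euler_paths` and `regress` are our own, `regress` stands for any estimator of the conditional expectation given $X^\pi_{t_{i-1}}$ (kernel, least-squares or Malliavin based, see Section 4), and the implicit equation (3.4) is solved by a small fixed-point loop, which is one possible choice since $f$ is Lipschitz and the time step is small.

```python
import numpy as np

def euler_paths(x0, b, sigma, t, n_paths, rng):
    """Simulate n_paths paths of the Euler scheme (3.1) on the grid t[0..n]."""
    n, d = len(t) - 1, len(x0)
    X = np.empty((n + 1, n_paths, d))
    dW = np.empty((n, n_paths, d))
    X[0] = x0
    for i in range(n):
        dt = t[i + 1] - t[i]
        dW[i] = rng.normal(0.0, np.sqrt(dt), size=(n_paths, d))
        drift = np.array([b(x) for x in X[i]])
        diffusion = np.array([sigma(x) @ w for x, w in zip(X[i], dW[i])])
        X[i + 1] = X[i] + drift * dt + diffusion
    return X, dW

def backward_scheme(X, dW, t, f, g, regress, n_fixed_point=3):
    """Backward induction (3.3)-(3.4).  regress(x_points, targets) must return
    an estimate of E[targets | X^pi_{t_{i-1}} = x] at each x in x_points;
    targets may be scalar- or vector-valued per path."""
    n = len(t) - 1
    Y = g(X[n])                              # terminal condition Y_{t_n} = g(X_{t_n})
    for i in range(n, 0, -1):
        dt = t[i] - t[i - 1]
        Z = regress(X[i - 1], Y[:, None] * dW[i - 1]) / dt   # (3.3)
        EY = regress(X[i - 1], Y)                            # E[ Y_{t_i} | X_{t_{i-1}} ]
        Ynew = EY.copy()
        for _ in range(n_fixed_point):       # fixed point for the implicit step (3.4)
            Ynew = EY + f(t[i - 1], X[i - 1], Ynew, Z) * dt
        Y = Ynew
    return Y                                 # pathwise approximation at time 0
```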

Remark 3.1 Using an induction argument, it is easily seen that the random variables $Y^\pi_{t_i}$ and $Z^\pi_{t_i}$ are deterministic functions of $X^\pi_{t_i}$ for each $i = 0, \dots, n$. From the Markov feature of the process $X^\pi$, it then follows that the conditional expectations involved in (3.3)-(3.4) can be replaced by the corresponding regressions:

$$E_{t_{i-1}}\big[ Y^\pi_{t_i} \big] = E\big[ Y^\pi_{t_i} \,\big|\, X^\pi_{t_{i-1}} \big] \quad \text{and} \quad E_{t_{i-1}}\big[ Y^\pi_{t_i}\,\Delta W_i \big] = E\big[ Y^\pi_{t_i}\,\Delta W_i \,\big|\, X^\pi_{t_{i-1}} \big].$$

For later use, we observe that the same argument shows that:

$$E\big[ Y^\pi_{t_i} \,\big|\, \mathcal{F}_{t_{i-1}} \big] = E\big[ Y^\pi_{t_i} \,\big|\, X^\pi_{t_{i-1}} \big] \quad \text{and} \quad E\big[ Y^\pi_{t_i}\,\Delta W_i \,\big|\, \mathcal{F}_{t_{i-1}} \big] = E\big[ Y^\pi_{t_i}\,\Delta W_i \,\big|\, X^\pi_{t_{i-1}} \big],$$

where $\{\mathcal{F}_t\}$ is the completed Brownian filtration, so that the same notation $E_{t_{i-1}}$ may be used for both conditionings.

Notice that $(Y^\pi, Z^\pi)$ differs from the approximation scheme suggested in [27], which involves the computation of $2d + 1$ conditional expectations at each step.

For later use, we need to introduce a continuous-time approximation of $(Y^\pi, Z^\pi)$. Since $Y^\pi_{t_i} \in L^2$ for all $1 \le i \le n$, we deduce from the classical martingale representation theorem that there exists some square integrable process $Z^\pi$ such that

$$Y^\pi_{t_{i+1}} = E\big[ Y^\pi_{t_{i+1}} \,\big|\, \mathcal{F}_{t_i} \big] + \int_{t_i}^{t_{i+1}} Z^\pi_s\cdot dW_s = E_{t_i}\big[ Y^\pi_{t_{i+1}} \big] + \int_{t_i}^{t_{i+1}} Z^\pi_s\cdot dW_s. \qquad(3.5)$$

We then define:

$$Y^\pi_t := Y^\pi_{t_i} - f(t_i, X^\pi_{t_i}, Y^\pi_{t_i}, Z^\pi_{t_i})(t - t_i) + \int_{t_i}^t Z^\pi_s\cdot dW_s, \qquad t_i \le t < t_{i+1}.$$

The following property of $Z^\pi$ is needed for the proof of the main result of this section.

Lemma 3.1 For all $1 \le i \le n$, we have:

$$Z^\pi_{t_{i-1}} = \Delta_i^{-1}\, E_{t_{i-1}}\Big[ \int_{t_{i-1}}^{t_i} Z^\pi_s\,ds \Big].$$

Proof. Since $Y^\pi_{t_i} \in L^2$, there is a sequence $(\xi^k)_k$ of random variables in $\mathbb{D}^{1,2}$ converging to $Y^\pi_{t_i}$ in $L^2$. Then, it follows from the Clark-Ocone formula that, for all $k$:

$$\xi^k = E_{t_{i-1}}[\xi^k] + \int_{t_{i-1}}^{t_i} \zeta^k_s\cdot dW_s, \quad \text{where } \zeta^k_s := E[D_s \xi^k \,|\, \mathcal{F}_s], \ t_{i-1} \le s \le t_i.$$

Using Remark 3.1, we now compute that:

$$Z^\pi_{t_{i-1}} = \Delta_i^{-1} E_{t_{i-1}}\big[ Y^\pi_{t_i}\,\Delta W_i \big] = \lim_k \Delta_i^{-1} E_{t_{i-1}}\big[ \xi^k\,\Delta W_i \big] = \lim_k \Delta_i^{-1} E_{t_{i-1}}\Big[ \int_{t_{i-1}}^{t_i} D_s \xi^k\,ds \Big] = \lim_k \Delta_i^{-1} E_{t_{i-1}}\Big[ \int_{t_{i-1}}^{t_i} \zeta^k_s\,ds \Big], \qquad(3.6)$$

by the Malliavin integration by parts formula and the tower property of conditional expectations.

We then estimate that:

$$\Big| E_{t_{i-1}}\Big[ \int_{t_{i-1}}^{t_i} (\zeta^k_s - Z^\pi_s)\,ds \Big] \Big| \;\le\; \Delta_i^{1/2}\, E_{t_{i-1}}\Big[ \int_{t_{i-1}}^{t_i} |\zeta^k_s - Z^\pi_s|^2\,ds \Big]^{1/2} = \Delta_i^{1/2}\, E_{t_{i-1}}\Big[ \big| (\xi^k - E_{t_{i-1}}[\xi^k]) - (Y^\pi_{t_i} - E_{t_{i-1}}[Y^\pi_{t_i}]) \big|^2 \Big]^{1/2} \;\le\; 2\,\Delta_i^{1/2}\, E_{t_{i-1}}\big[ |Y^\pi_{t_i} - \xi^k|^2 \big]^{1/2}.$$

Since $\xi^k$ converges to $Y^\pi_{t_i}$ in $L^2$, the last inequality together with (3.6) provides the required result.

We also need the following estimate, proved in Theorem 3.4.3 of [27].

Lemma 3.2 For each $1 \le i \le n$, define

$$\bar Z_{t_{i-1}} := \Delta_i^{-1}\, E\Big[ \int_{t_{i-1}}^{t_i} Z_s\,ds \,\Big|\, \mathcal{F}_{t_{i-1}} \Big].$$

Then:

$$\limsup_{|\pi|\to 0}\ |\pi|^{-1} \Big\{ \max_{1\le i\le n} \sup_{t_{i-1}\le t\le t_i} E|Y_t - Y_{t_{i-1}}|^2 + \sum_{i=1}^n \int_{t_{i-1}}^{t_i} E|Z_t - \bar Z_{t_{i-1}}|^2\,dt \Big\} < \infty.$$

We are now ready to state our first result, which provides an error estimate of the approximation scheme (3.3)-(3.4) of the same order as in [27].

Theorem 3.1

$$\limsup_{|\pi|\to 0}\ |\pi|^{-1} \Big\{ \sup_{0\le t\le 1} E|Y_t - Y^\pi_t|^2 + \int_0^1 E|Z_t - Z^\pi_t|^2\,dt \Big\} < \infty.$$

Proof. In the following, $C > 0$ will denote a generic constant independent of $i$ and $n$ that may take different values from line to line. Let $i \in \{0, \dots, n-1\}$ be fixed, and set $\delta Y := Y - Y^\pi$, $\delta Z := Z - Z^\pi$ and $\delta f_t := f(t, X_t, Y_t, Z_t) - f(t_i, X^\pi_{t_i}, Y^\pi_{t_i}, Z^\pi_{t_i})$ for $t \in [t_i, t_{i+1})$. By Itô's Lemma, we compute that

$$A_t := E|\delta Y_t|^2 + E\int_t^{t_{i+1}} |\delta Z_s|^2\,ds - E|\delta Y_{t_{i+1}}|^2 = E\int_t^{t_{i+1}} 2\,\delta Y_s\,\delta f_s\,ds, \qquad t_i \le t \le t_{i+1}.$$

Let $\alpha > 0$ be a constant to be chosen later on. From the Lipschitz property of $f$, together with the inequality $ab \le \alpha a^2 + b^2/\alpha$, this provides:

$$A_t \;\le\; C\, E\int_t^{t_{i+1}} |\delta Y_s| \Big( \Delta_{i+1} + |X_s - X^\pi_{t_i}| + |Y_s - Y^\pi_{t_i}| + |Z_s - Z^\pi_{t_i}| \Big)\,ds$$
$$\;\le\; \alpha\, E\int_t^{t_{i+1}} |\delta Y_s|^2\,ds + \frac{C}{\alpha}\, E\int_t^{t_{i+1}} \Big( \Delta_{i+1}^2 + |X_s - X^\pi_{t_i}|^2 + |Y_s - Y^\pi_{t_i}|^2 + |Z_s - Z^\pi_{t_i}|^2 \Big)\,ds. \qquad(3.7)$$

Now observe that:

$$E|X_s - X^\pi_{t_i}|^2 \;\le\; C|\pi|, \qquad(3.8)$$
$$E|Y_s - Y^\pi_{t_i}|^2 \;\le\; 2\big\{ E|Y_s - Y_{t_i}|^2 + E|\delta Y_{t_i}|^2 \big\} \;\le\; C\big\{ |\pi| + E|\delta Y_{t_i}|^2 \big\}, \qquad(3.9)$$

by (3.2) and the estimate of Lemma 3.2. Also, with the notation of Lemma 3.2, it follows from Lemma 3.1 that:

$$E|Z_s - Z^\pi_{t_i}|^2 \;\le\; 2\Big\{ E|Z_s - \bar Z_{t_i}|^2 + E\Big| \Delta_{i+1}^{-1} \int_{t_i}^{t_{i+1}} E[\delta Z_r \,|\, \mathcal{F}_{t_i}]\,dr \Big|^2 \Big\} \;\le\; 2\Big\{ E|Z_s - \bar Z_{t_i}|^2 + \Delta_{i+1}^{-1} \int_{t_i}^{t_{i+1}} E|\delta Z_r|^2\,dr \Big\}, \qquad(3.10)$$

by Jensen's inequality. We now plug (3.8)-(3.9)-(3.10) into (3.7) to obtain:

$$A_t \;\le\; \alpha\, E\int_t^{t_{i+1}} |\delta Y_s|^2\,ds + \frac{C}{\alpha} \int_t^{t_{i+1}} E\big[ |\pi| + |\delta Y_{t_i}|^2 + |Z_s - \bar Z_{t_i}|^2 \big]\,ds + \frac{C}{\alpha}\,\Delta_{i+1}^{-1} \int_t^{t_{i+1}} \int_{t_i}^{t_{i+1}} E|\delta Z_r|^2\,dr\,ds \qquad(3.11)$$
$$\;\le\; \alpha\, E\int_t^{t_{i+1}} |\delta Y_s|^2\,ds + \frac{C}{\alpha} \int_{t_i}^{t_{i+1}} E\big[ |\pi| + |\delta Y_{t_i}|^2 + |Z_s - \bar Z_{t_i}|^2 \big]\,ds + \frac{C}{\alpha} \int_{t_i}^{t_{i+1}} E|\delta Z_r|^2\,dr. \qquad(3.12)$$

From the definition of $A_t$ and (3.12), we see that, for $t_i \le t < t_{i+1}$,

$$E|\delta Y_t|^2 + E\int_t^{t_{i+1}} |\delta Z_s|^2\,ds \;\le\; \alpha \int_t^{t_{i+1}} E|\delta Y_s|^2\,ds + B_i, \qquad(3.13)$$

where

$$B_i := E|\delta Y_{t_{i+1}}|^2 + \frac{C}{\alpha} \Big\{ \Delta_{i+1}\big( |\pi| + E|\delta Y_{t_i}|^2 \big) + \int_{t_i}^{t_{i+1}} E|\delta Z_r|^2\,dr + \int_{t_i}^{t_{i+1}} E|Z_s - \bar Z_{t_i}|^2\,ds \Big\}.$$

By Gronwall's Lemma, this shows that $E|\delta Y_t|^2 \le B_i\,e^{\alpha \Delta_{i+1}}$ for $t_i \le t < t_{i+1}$, which, plugged into the second inequality of (3.13), provides:

$$E|\delta Y_t|^2 + E\int_t^{t_{i+1}} |\delta Z_s|^2\,ds \;\le\; B_i\big( 1 + \alpha \Delta_{i+1} e^{\alpha \Delta_{i+1}} \big) \;\le\; B_i\big( 1 + C\alpha \Delta_{i+1} \big) \qquad(3.14)$$

for small $|\pi|$. For $t = t_i$ and $\alpha$ sufficiently larger than $C$, we deduce from this inequality that:

$$E|\delta Y_{t_i}|^2 + \frac{1}{2}\, E\int_{t_i}^{t_{i+1}} |\delta Z_s|^2\,ds \;\le\; (1 + C\Delta_{i+1}) \Big\{ E|\delta Y_{t_{i+1}}|^2 + E\int_{t_i}^{t_{i+1}} \big( |\pi| + |Z_s - \bar Z_{t_i}|^2 \big)\,ds \Big\},$$

for small $|\pi|$.

Iterating the last inequality, we get:

$$E|\delta Y_{t_i}|^2 + \frac{1}{2}\, E\int_{t_i}^{t_{i+1}} |\delta Z_s|^2\,ds \;\le\; (1 + C|\pi|)^{1/|\pi|} \Big\{ E|\delta Y_{t_n}|^2 + |\pi| + \sum_{j=1}^{n} E\int_{t_{j-1}}^{t_j} |Z_s - \bar Z_{t_{j-1}}|^2\,ds \Big\}.$$

Using the estimate of Lemma 3.2, together with the Lipschitz property of $g$ and (3.2), this provides:

$$E|\delta Y_{t_i}|^2 + \frac{1}{2}\, E\int_{t_i}^{t_{i+1}} |\delta Z_s|^2\,ds \;\le\; C|\pi|, \qquad(3.15)$$

for small $|\pi|$. Summing up the inequality (3.14) with $t = t_i$, we get:

$$\int_0^1 E|\delta Z_s|^2\,ds \;\le\; E|\delta Y_{t_n}|^2 - E|\delta Y_0|^2 + \frac{C}{\alpha} \int_0^1 E|\delta Z_s|^2\,ds + C \sum_{i=0}^{n-1} \int_{t_i}^{t_{i+1}} \Big\{ |\pi| + E|\delta Y_{t_i}|^2 + \frac{1}{\alpha} E|Z_s - \bar Z_{t_i}|^2 \Big\}\,ds.$$

For $\alpha$ sufficiently larger than $C$, it follows from (3.15) and Lemma 3.2 that:

$$\int_0^1 E|\delta Z_s|^2\,ds \;\le\; C|\pi|.$$

Together with Lemma 3.2 and (3.15), this shows that $B_i \le C|\pi|$, and therefore

$$\sup_{0\le t\le 1} E|\delta Y_t|^2 \;\le\; C|\pi|,$$

by taking the supremum over $t$ in (3.14). This completes the proof of the theorem.

We end this section with the following bound on the $Y^\pi_{t_i}$'s, which will be used in the simulation-based approximation of the discrete-time conditional expectation operators $E_{t_i}$, $0 \le i \le n-1$.

Lemma 3.3 Assume that

$$|g(0)| + \sup_{0\le t\le 1} |f(t,0,0,0)| + |b|_\infty + |\sigma|_\infty \;\le\; K \qquad(3.16)$$

for some $K \ge 1$, and define the sequences

$$\alpha^\pi_n := 2K, \qquad \beta^\pi_n := K,$$
$$\alpha^\pi_i := (1 - K|\pi|)^{-1} \Big[ (1 + K^2|\pi|)^{1/2} \big( \alpha^\pi_{i+1} + 4K^2 \beta^\pi_{i+1} \Delta_{i+1} \big) + 3K\Delta_{i+1} \Big],$$
$$\beta^\pi_i := (1 - K|\pi|)^{-1} \Big[ (1 + K^2|\pi|)^{1/2} (1 + 2K\Delta_{i+1})\, \beta^\pi_{i+1} + 3K\Delta_{i+1} \Big], \qquad 0 \le i \le n-1.$$

Then, for all $0 \le i \le n$,

$$|Y^\pi_{t_i}| \;\le\; \alpha^\pi_i + \beta^\pi_i\, |X^\pi_{t_i}|^2. \qquad(3.17)$$

Moreover, for any random variable $\hat Y_{t_i}$ satisfying the bound (3.17), i.e. $|\hat Y_{t_i}| \le \alpha^\pi_i + \beta^\pi_i |X^\pi_{t_i}|^2$,

$$E_{t_{i-1}}\big[ |\hat Y_{t_i}| \big] \;\le\; E_{t_{i-1}}\big[ |\hat Y_{t_i}|^2 \big]^{1/2} \;\le\; \alpha^\pi_i + \beta^\pi_i \big\{ (1 + 2K|\pi|)\,|X^\pi_{t_{i-1}}|^2 + 4K^2|\pi| \big\}, \qquad(3.18)$$
$$E_{t_{i-1}}\big[ |\hat Y_{t_i}\,\Delta W_i| \big] \;\le\; \Delta_i^{1/2} \Big[ \alpha^\pi_i + \beta^\pi_i \big\{ (1 + 2K|\pi|)\,|X^\pi_{t_{i-1}}|^2 + 4K^2|\pi| \big\} \Big]. \qquad(3.19)$$

Finally,

$$\limsup_{|\pi|\to 0}\ \max_{0\le i\le n}\ \big( \alpha^\pi_i + \beta^\pi_i \big) < \infty.$$

Proof. We first observe that the bound (3.18) is a by-product of the proof of (3.17). The bound (3.19) follows directly from (3.18) together with the Cauchy-Schwarz inequality. In order to prove (3.17), we use a backward induction argument. First, since $g$ is $K$-Lipschitz and $g(0)$ is bounded by $K$, we have:

$$|Y^\pi_{t_n}| \;\le\; K(1 + |X^\pi_{t_n}|) \;\le\; K(2 + |X^\pi_{t_n}|^2) = \alpha^\pi_n + \beta^\pi_n |X^\pi_{t_n}|^2. \qquad(3.20)$$

We next assume that

$$|Y^\pi_{t_{i+1}}| \;\le\; \alpha^\pi_{i+1} + \beta^\pi_{i+1}\, |X^\pi_{t_{i+1}}|^2 \qquad(3.21)$$

for some fixed $0 \le i \le n-1$. From the Lipschitz property of $f$, there exists an $\mathbb{R} \times \mathbb{R}^d \times \mathbb{R} \times \mathbb{R}^d$-valued $\mathcal{F}_{t_i}$-measurable random variable $(\tau_i, \xi_i, \nu_i, \zeta_i)$, essentially bounded by $K$, such that:

$$f\big( t_i, X^\pi_{t_i}, Y^\pi_{t_i}, \Delta_{i+1}^{-1} E_{t_i}[Y^\pi_{t_{i+1}}\Delta W_{i+1}] \big) = \tau_i + \xi_i\cdot X^\pi_{t_i} + \nu_i\, Y^\pi_{t_i} + \Delta_{i+1}^{-1}\, \zeta_i\cdot E_{t_i}\big[ Y^\pi_{t_{i+1}}\,\Delta W_{i+1} \big].$$

By the definition of $Y^\pi$ in (3.3)-(3.4), this provides

$$Y^\pi_{t_i} = E_{t_i}\big[ Y^\pi_{t_{i+1}} (1 + \zeta_i\cdot\Delta W_{i+1}) \big] + \Delta_{i+1}\big( \tau_i + \xi_i\cdot X^\pi_{t_i} + \nu_i Y^\pi_{t_i} \big).$$

Then, it follows from the Cauchy-Schwarz inequality and the inequality $|x| \le 1 + |x|^2$ that, for $|\pi| \le 1$,

$$(1 - K|\pi|)\,|Y^\pi_{t_i}| \;\le\; E_{t_i}\big[ |Y^\pi_{t_{i+1}}|^2 \big]^{1/2}\, E_{t_i}\big[ |1 + \zeta_i\cdot\Delta W_{i+1}|^2 \big]^{1/2} + K(3 + |X^\pi_{t_i}|^2)\,\Delta_{i+1}. \qquad(3.22)$$

Now, since $\zeta_i$ is $\mathcal{F}_{t_i}$-measurable and bounded by $K$, observe that:

$$E_{t_i}\big[ |1 + \zeta_i\cdot\Delta W_{i+1}|^2 \big] \;\le\; 1 + K^2\Delta_{i+1}.$$

We then get from (3.22):

$$(1 - K|\pi|)\,|Y^\pi_{t_i}| \;\le\; (1 + K^2\Delta_{i+1})^{1/2}\, E_{t_i}\big[ |Y^\pi_{t_{i+1}}|^2 \big]^{1/2} + K(3 + |X^\pi_{t_i}|^2)\,\Delta_{i+1}. \qquad(3.23)$$

Using (3.21), we now write that:

$$E_{t_i}\big[ |Y^\pi_{t_{i+1}}|^2 \big]^{1/2} \;\le\; \alpha^\pi_{i+1} + \beta^\pi_{i+1}\, E_{t_i}\Big[ \big| X^\pi_{t_i} + b(X^\pi_{t_i})\Delta_{i+1} + \sigma(X^\pi_{t_i})\Delta W_{i+1} \big|^4 \Big]^{1/2}, \qquad(3.24)$$

where, by (3.16) and the assumption $K \ge 1$, a direct computation leads to:

$$E_{t_i}\Big[ \big| X^\pi_{t_i} + b(X^\pi_{t_i})\Delta_{i+1} + \sigma(X^\pi_{t_i})\Delta W_{i+1} \big|^4 \Big]^{1/2} \;\le\; (1 + 2K\Delta_{i+1})\,|X^\pi_{t_i}|^2 + 4K^2\Delta_{i+1}. \qquad(3.25)$$

Together with (3.23)-(3.24)-(3.25), this implies that:

$$(1 - K|\pi|)\,|Y^\pi_{t_i}| \;\le\; (1 + K^2|\pi|)^{1/2} \Big[ \alpha^\pi_{i+1} + \beta^\pi_{i+1} \big\{ (1 + 2K\Delta_{i+1})|X^\pi_{t_i}|^2 + 4K^2\Delta_{i+1} \big\} \Big] + 3K\Delta_{i+1}\big( 1 + |X^\pi_{t_i}|^2 \big).$$

It follows that $|Y^\pi_{t_i}| \le \alpha^\pi_i + \beta^\pi_i |X^\pi_{t_i}|^2$. Now observe that, solving the above linear induction explicitly, $\beta^\pi_i$ is a sum of at most $|\pi|^{-1}$ terms of order $K\Delta_{j+1}$, each multiplied by a product of factors of the form $(1 - K|\pi|)^{-1}(1 + K^2|\pi|)^{1/2}(1 + 2K|\pi|)$ raised to a power at most $|\pi|^{-1}$, plus the terminal value $\beta^\pi_n = K$ multiplied by such a product; hence $\max_{0\le i\le n} \beta^\pi_i$ is bounded uniformly in $\pi$. The same argument shows that $\max_{0\le i\le n} \alpha^\pi_i$ is bounded uniformly in $\pi$.

4 Error due to the regression approximation

In this section, we focus on the problem of simulating the approximation $(X^\pi, Y^\pi)$ of the components $(X, Y)$ of the solution of the decoupled forward-backward stochastic differential equation (2.2)-(2.4). The forward component $X^\pi$ defined by (3.1) can of course be simulated on the time grid defined by the partition $\pi$ by the classical Monte-Carlo method. Then, we are reduced to the problem of simulating the approximation $Y^\pi$ defined in (3.3)-(3.4), given the approximation $X^\pi$ of $X$.

Notice that each step of the backward induction (3.3)-(3.4) requires the computation of $d + 1$ conditional expectations. In practice, one can only hope to have an approximation $\hat E_{t_i}$ of the conditional expectation operator $E_{t_i}$. Therefore, the main idea for the definition of an approximation of $Y^\pi$, and therefore of $Z^\pi$, is to replace the conditional expectation $E_{t_i}$ by $\hat E_{t_i}$ in the backward scheme (3.3)-(3.4).

However, we would like to improve the efficiency of the approximation scheme of $(Y^\pi, Z^\pi)$ when it is known to lie in some given domain. Let then $\Lambda = \{(\Lambda^l_i, \Lambda^u_i)\}_{0\le i\le n}$ be a sequence of pairs of maps from $\mathbb{R}^d$ into $\mathbb{R} \cup \{-\infty, +\infty\}$ satisfying:

$$\Lambda^l_i(X^\pi_{t_i}) \;\le\; Y^\pi_{t_i} \;\le\; \Lambda^u_i(X^\pi_{t_i}) \quad \text{for all } i = 0, \dots, n, \qquad(4.1)$$

i.e. $(\Lambda^l_i, \Lambda^u_i)$ are some given a priori known bounds on $Y^\pi_{t_i}$ for each $i$. For instance, one can define $\Lambda$ by the bounds derived in Lemma 3.3. When no bounds on $Y^\pi$ are known, one may take $\Lambda^l_i = -\infty$ and $\Lambda^u_i = +\infty$. Given a random variable $\zeta$ valued in $\mathbb{R}$, we shall use the notation:

$$T^\Lambda_i(\zeta) := \Lambda^l_i(X^\pi_{t_i}) \vee \zeta \wedge \Lambda^u_i(X^\pi_{t_i}),$$

where $\vee$ and $\wedge$ denote respectively the binary maximum and minimum operators. Since the backward scheme (3.3)-(3.4) involves the computation of the conditional expectations $E_{t_{i-1}}[Y^\pi_{t_i}]$ and $E_{t_{i-1}}[Y^\pi_{t_i}\Delta W_i]$, we shall also need to introduce the sequences $R = \{(R^l_i, R^u_i)\}_{0\le i\le n}$ and $I = \{(I^l_i, I^u_i)\}_{0\le i\le n}$ of pairs of maps from $\mathbb{R}^d$ into $\mathbb{R} \cup \{-\infty, +\infty\}$ satisfying:

$$R^l_{i-1}(X^\pi_{t_{i-1}}) \;\le\; E_{t_{i-1}}\big[ Y^\pi_{t_i} \big] \;\le\; R^u_{i-1}(X^\pi_{t_{i-1}}),$$
$$I^l_{i-1}(X^\pi_{t_{i-1}}) \;\le\; E_{t_{i-1}}\big[ Y^\pi_{t_i}\,\Delta W_i \big] \;\le\; I^u_{i-1}(X^\pi_{t_{i-1}}),$$

for all $i = 1, \dots, n$. The corresponding operators $T^R_i$ and $T^I_i$ are defined similarly to $T^\Lambda_i$. An example of such sequences is given by Lemma 3.3.

Now, given an approximation $\hat E_{t_i}$ of $E_{t_i}$, we define the process $(\hat Y, \hat Z)$ by the backward induction scheme:

$$\hat Y_{t_n} = Y^\pi_{t_n} = g(X^\pi_{t_n}), \qquad(4.2)$$
$$\check Y_{t_{i-1}} = \hat E_{t_{i-1}}\big[ \hat Y_{t_i} \big] + f(t_{i-1}, X^\pi_{t_{i-1}}, \check Y_{t_{i-1}}, \hat Z_{t_{i-1}})\,\Delta_i, \qquad \hat Y_{t_{i-1}} = T^\Lambda_{i-1}\big( \check Y_{t_{i-1}} \big), \qquad(4.3)$$
$$\hat Z_{t_{i-1}} = \Delta_i^{-1}\, \hat E_{t_{i-1}}\big[ \hat Y_{t_i}\,\Delta W_i \big], \qquad(4.4)$$

for all $1 \le i \le n$. Recall from Remark 3.1 that the conditional expectations involved in (3.3)-(3.4) are in fact regression functions. This simplifies considerably the problem of approximating $E_{t_i}$.

Example 4.1 (Non-parametric regression) Let $\zeta$ be an $\mathcal{F}_{t_{i+1}}$-measurable random variable, and $(X^{\pi,j}, \zeta^j)_{j=1}^N$ be $N$ independent copies of $((X^\pi_{t_1}, \dots, X^\pi_{t_n}), \zeta)$. The non-parametric kernel estimator of the regression operator $E_{t_i}$ is defined by:

$$\tilde E_{t_i}[\zeta] := \frac{\sum_{j=1}^N \zeta^j\, \kappa\big( h_N^{-1}(X^{\pi,j}_{t_i} - X^\pi_{t_i}) \big)}{\sum_{j=1}^N \kappa\big( h_N^{-1}(X^{\pi,j}_{t_i} - X^\pi_{t_i}) \big)},$$

where $\kappa$ is a kernel function and $h_N$ is a bandwidth matrix converging to $0$ as $N \to \infty$. We send the reader to [5] for details on the analysis of the error $\tilde E - E$. The above regression estimator can be improved in our context by using the a priori bounds on $Y^\pi$:

$$\hat E_{t_{i-1}}\big[ \hat Y_{t_i} \big] = T^R_{i-1}\big( \tilde E_{t_{i-1}}[\hat Y_{t_i}] \big) \quad \text{and} \quad \hat E_{t_{i-1}}\big[ \hat Y_{t_i}\,\Delta W_i \big] = T^I_{i-1}\big( \tilde E_{t_{i-1}}[\hat Y_{t_i}\,\Delta W_i] \big).$$
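As an illustration of Example 4.1, a minimal Nadaraya-Watson estimator with a Gaussian kernel and a scalar bandwidth, combined with the truncation by the a priori bounds, could look as follows. This sketch is ours and is not prescribed by the paper: the function names, the choice of kernel and bandwidth are illustrative, and `r_lower`, `r_upper` play the role of the bounds $R^l_{i-1}$, $R^u_{i-1}$ (or $I^l_{i-1}$, $I^u_{i-1}$).

```python
import numpy as np

def gaussian_kernel(u):
    """Product Gaussian kernel kappa(u) for u of shape (..., d)."""
    return np.exp(-0.5 * np.sum(u ** 2, axis=-1))

def kernel_regression(x_eval, x_sample, y_sample, h):
    """Nadaraya-Watson estimate of E[Y | X = x] at each row of x_eval,
    from the sample (x_sample, y_sample); h is a scalar bandwidth."""
    u = (x_eval[:, None, :] - x_sample[None, :, :]) / h
    w = gaussian_kernel(u)                    # shape (n_eval, n_sample)
    denom = w.sum(axis=1)
    denom = np.where(denom > 0.0, denom, 1.0)  # guard against empty neighbourhoods
    return w @ y_sample / denom

def truncated_kernel_regression(x_eval, x_sample, y_sample, h, r_lower, r_upper):
    """Kernel estimate improved by the a priori bounds, as in Example 4.1:
    the raw estimate is clipped to [r_lower(x), r_upper(x)]."""
    raw = kernel_regression(x_eval, x_sample, y_sample, h)
    return np.clip(raw, r_lower(x_eval), r_upper(x_eval))
```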

Example 4.2 (Malliavin regression approach) Let $\phi$ be a mapping from $\mathbb{R}^d$ into $\mathbb{R}$, and $(X^{\pi,j})_{j=1}^N$ be $N$ independent copies of $(X^\pi_{t_1}, \dots, X^\pi_{t_n})$. The Malliavin regression estimator of the operator $E_{t_i}$ is defined by:

$$\tilde E_{t_i}\big[ \phi(X^\pi_{t_{i+1}}) \big] := \frac{\sum_{j=1}^N \phi(X^{\pi,j}_{t_{i+1}})\, H(X^\pi_{t_i} - X^{\pi,j}_{t_i})\, S^j}{\sum_{j=1}^N H(X^\pi_{t_i} - X^{\pi,j}_{t_i})\, S^j},$$

where $H(x)$ is the Heaviside function, $H(x - y) = \prod_{k=1}^d \mathbf{1}_{x^k \ge y^k}$, and the $S^j$ are independent copies of some random variable $S$ whose precise definition is given in Section 6 below. Notice that the practical implementation of this approximation procedure in the backward induction (4.2)-(4.4) requires a slight extension of this estimator. This issue will be discussed precisely in Section 6. As in the previous example, we use the bounds on $Y^\pi$ to define the approximations: the above regression estimator can be improved in our context by using the a priori bounds on $Y^\pi$:

$$\hat E_{t_{i-1}}\big[ \hat Y_{t_i} \big] = T^R_{i-1}\big( \tilde E_{t_{i-1}}[\hat Y_{t_i}] \big) \quad \text{and} \quad \hat E_{t_{i-1}}\big[ \hat Y_{t_i}\,\Delta W_i \big] = T^I_{i-1}\big( \tilde E_{t_{i-1}}[\hat Y_{t_i}\,\Delta W_i] \big).$$

Remark 4.1 The use of a priori bounds on the conditional expectation to be computed is a crucial step in our analysis. This is due to the fact that, in general, natural estimators $\tilde E$, as in Examples 4.1 and 4.2, produce random variables which are not necessarily integrable.

We now turn to the main result of this section, which provides an $L^p$ estimate of the error $\hat Y - Y^\pi$ in terms of the regression errors $\hat E - E$.

Theorem 4.1 Let $p > 1$ be given, and let $\Lambda$ be a sequence of pairs of maps valued in $\mathbb{R} \cup \{-\infty, +\infty\}$ satisfying (4.1). Then, there is a constant $C > 0$, which only depends on $(K, p)$, such that:

$$\big\| \hat Y_{t_i} - Y^\pi_{t_i} \big\|_{L^p} \;\le\; \frac{C}{|\pi|}\ \max_{0\le j\le n-1} \Big\{ \big\| (\hat E_{t_j} - E_{t_j})[\hat Y_{t_{j+1}}] \big\|_{L^p} + \big\| (\hat E_{t_j} - E_{t_j})[\hat Y_{t_{j+1}}\,\Delta W_{j+1}] \big\|_{L^p} \Big\}$$

for all $0 \le i \le n$.

Proof. In the following, $C > 0$ will denote a generic constant, which only depends on $(K, p)$, that may change from line to line. Let $0 \le i \le n-1$ be fixed. We first compute that:

$$Y^\pi_{t_i} - \check Y_{t_i} = \epsilon_i + E_{t_i}\big[ Y^\pi_{t_{i+1}} - \hat Y_{t_{i+1}} \big] + \Delta_{i+1} \Big\{ f\big( t_i, X^\pi_{t_i}, Y^\pi_{t_i}, \Delta_{i+1}^{-1} E_{t_i}[Y^\pi_{t_{i+1}}\Delta W_{i+1}] \big) - f\big( t_i, X^\pi_{t_i}, \check Y_{t_i}, \Delta_{i+1}^{-1} E_{t_i}[\hat Y_{t_{i+1}}\Delta W_{i+1}] \big) \Big\}, \qquad(4.5)$$

where

$$\epsilon_i := (E_{t_i} - \hat E_{t_i})\big[ \hat Y_{t_{i+1}} \big] + \Delta_{i+1} \Big\{ f\big( t_i, X^\pi_{t_i}, \check Y_{t_i}, \Delta_{i+1}^{-1} \hat E_{t_i}[\hat Y_{t_{i+1}}\Delta W_{i+1}] \big) - f\big( t_i, X^\pi_{t_i}, \check Y_{t_i}, \Delta_{i+1}^{-1} E_{t_i}[\hat Y_{t_{i+1}}\Delta W_{i+1}] \big) \Big\}.$$

From the Lipschitz property of $f$, we have:

$$\|\epsilon_i\|_{L^p} \;\le\; \eta_i := C \Big\{ \big\| (\hat E_{t_i} - E_{t_i})[\hat Y_{t_{i+1}}] \big\|_{L^p} + \big\| (\hat E_{t_i} - E_{t_i})[\hat Y_{t_{i+1}}\,\Delta W_{i+1}] \big\|_{L^p} \Big\}.$$

Again, from the Lipschitz property of $f$, there exists an $\mathbb{R} \times \mathbb{R}^d$-valued $\mathcal{F}_{t_i}$-measurable random variable $(\nu_i, \zeta_i)$, essentially bounded by $K$, such that:

$$f\big( t_i, X^\pi_{t_i}, Y^\pi_{t_i}, \Delta_{i+1}^{-1} E_{t_i}[Y^\pi_{t_{i+1}}\Delta W_{i+1}] \big) - f\big( t_i, X^\pi_{t_i}, \check Y_{t_i}, \Delta_{i+1}^{-1} E_{t_i}[\hat Y_{t_{i+1}}\Delta W_{i+1}] \big) = \nu_i\big( Y^\pi_{t_i} - \check Y_{t_i} \big) + \Delta_{i+1}^{-1}\, \zeta_i\cdot E_{t_i}\big[ (Y^\pi_{t_{i+1}} - \hat Y_{t_{i+1}})\,\Delta W_{i+1} \big].$$

Then, it follows from (4.5) and the Hölder inequality that:

$$(1 - K|\pi|)\, |Y^\pi_{t_i} - \check Y_{t_i}| \;\le\; |\epsilon_i| + \big| E_{t_i}\big[ (Y^\pi_{t_{i+1}} - \hat Y_{t_{i+1}})(1 + \zeta_i\cdot\Delta W_{i+1}) \big] \big|$$
$$\;\le\; |\epsilon_i| + E_{t_i}\big[ |Y^\pi_{t_{i+1}} - \hat Y_{t_{i+1}}|^p \big]^{1/p}\, E_{t_i}\big[ |1 + \zeta_i\cdot\Delta W_{i+1}|^q \big]^{1/q}$$
$$\;\le\; |\epsilon_i| + E_{t_i}\big[ |Y^\pi_{t_{i+1}} - \hat Y_{t_{i+1}}|^p \big]^{1/p}\, E_{t_i}\big[ |1 + \zeta_i\cdot\Delta W_{i+1}|^{2k} \big]^{1/(2k)},$$

where $q$ is the conjugate of $p$ and $k \ge q/2$ is an arbitrary integer. Recalling that $\hat Y_{t_i} = T^\Lambda_i(\check Y_{t_i})$ and $Y^\pi_{t_i} = T^\Lambda_i(Y^\pi_{t_i})$ by (4.1), this provides

$$(1 - K|\pi|)\, |Y^\pi_{t_i} - \hat Y_{t_i}| \;\le\; |\epsilon_i| + E_{t_i}\big[ |Y^\pi_{t_{i+1}} - \hat Y_{t_{i+1}}|^p \big]^{1/p}\, E_{t_i}\big[ |1 + \zeta_i\cdot\Delta W_{i+1}|^{2k} \big]^{1/(2k)} \qquad(4.6)$$

by the 1-Lipschitz property of $T^\Lambda_i$. Now, since $\zeta_i$ is $\mathcal{F}_{t_i}$-measurable and bounded by $K$, and the odd conditional moments of $\zeta_i\cdot\Delta W_{i+1}$ vanish, observe that:

$$E_{t_i}\big[ |1 + \zeta_i\cdot\Delta W_{i+1}|^{2k} \big] = \sum_{j=0}^{2k} \binom{2k}{j} E_{t_i}\big[ (\zeta_i\cdot\Delta W_{i+1})^j \big] = \sum_{j=0}^{k} \binom{2k}{2j} E_{t_i}\big[ (\zeta_i\cdot\Delta W_{i+1})^{2j} \big] \;\le\; 1 + C\Delta_{i+1}.$$

We then get from (4.6):

$$(1 - K|\pi|)\, \big\| Y^\pi_{t_i} - \hat Y_{t_i} \big\|_{L^p} \;\le\; \|\epsilon_i\|_{L^p} + (1 + C|\pi|)^{1/(2k)}\, \big\| Y^\pi_{t_{i+1}} - \hat Y_{t_{i+1}} \big\|_{L^p} \;\le\; \eta_i + (1 + C|\pi|)^{1/(2k)}\, \big\| Y^\pi_{t_{i+1}} - \hat Y_{t_{i+1}} \big\|_{L^p}. \qquad(4.7)$$

For small $|\pi|$, each step of this inequality multiplies the error at the next date by $(1 - K|\pi|)^{-1}(1 + C|\pi|)^{1/(2k)} \le 1 + C|\pi|$ and adds at most $C\eta_i$; iterating it at most $|\pi|^{-1}$ times, it follows that

$$\big\| Y^\pi_{t_i} - \hat Y_{t_i} \big\|_{L^p} \;\le\; (1 + C|\pi|)^{1/|\pi|}\, \sum_{j=i}^{n-1} \eta_j \;\le\; \frac{C}{|\pi|}\ \max_{0\le j\le n-1} \eta_j.$$

Remark 4.2 In the particular case where the generator $f$ does not depend on the control variable $z$, Theorem 4.1 is valid for $p = 1$. This is easily checked by noticing that, in this case, $\zeta_i = 0$ in the above proof.

5 Regression error estimate

In this section, we focus on the regression procedure. Let $(R, S)$ be a pair of random variables. In both Examples 4.1 and 4.2, the regression estimator is based on the observation that the regression function can be written in the form

$$r(x) := E\big[ R \,\big|\, S = x \big] = \frac{q^R(x)}{q^1(x)},$$

where $q^R(x) := E[R\, \varepsilon_x(S)]$, and $\varepsilon_x$ denotes the Dirac measure at the point $x$. Then, the regression estimation problem is reduced to the problem of estimating separately $q^R(x)$ and $q^1(x)$, and the main difficulty lies in the presence of the Dirac measure inside the expectation operator. While the kernel estimator is based on approximating the Dirac measure by a kernel function with bandwidth shrinking to zero, the Malliavin estimator is suggested by an alternative representation of $q^R(x)$ obtained by integrating up the Dirac measure to the Heaviside function, see Section 6. In both cases, one defines an estimator:

$$r_N(x, \omega) := \frac{\hat q^R_N(x, \omega)}{\hat q^1_N(x, \omega)}, \qquad \omega \in \Omega, \ x \in \mathbb{R}^d,$$

where $\hat q^R_N(x, \omega)$ and $\hat q^1_N(x, \omega)$ are defined as the means of a sample of $N$ independent copies $\{A^i(x, \omega), B^i(x, \omega)\}_{1\le i\le N}$ of some corresponding random variables $\{A(x, \omega), B(x, \omega)\}$:

$$\hat q^R_N(x, \omega) := \frac{1}{N} \sum_{i=1}^N A^i(x, \omega) \quad \text{and} \quad \hat q^1_N(x, \omega) := \frac{1}{N} \sum_{i=1}^N B^i(x, \omega).$$

In the Malliavin approach these random variables $\{A(x, \omega), B(x, \omega)\}$ have expectation equal to $\{q^R(x), q^1(x)\}$, see Theorem 6.1 below. Using the above definitions, it follows that:

$$V^R(x) := N\, \mathrm{Var}\big[ \hat q^R_N(x) \big] = \mathrm{Var}\big[ A(x) \big], \qquad V^1(x) := N\, \mathrm{Var}\big[ \hat q^1_N(x) \big] = \mathrm{Var}\big[ B(x) \big]. \qquad(5.1)$$

In order to prepare for the results of Section 6, we shall now concentrate on the case where $E[A(x)] = q^R(x)$ and $E[B(x)] = q^1(x)$, so that (5.1) holds. A similar analysis can be performed for the kernel approach, see Remark 5.1 below.

In view of Theorem 4.1, the $L^p$ error estimate of $\hat Y - Y^\pi$ is related to the $L^p$ error on the regression estimator. As mentioned in Remark 4.1, the regression error $r_N(S) - r(S)$ is not guaranteed to be integrable. Indeed, the classical central limit theorem indicates that the denominator $\hat q^1_N(x)$ of $r_N(x)$ has a Gaussian asymptotic distribution, which induces integrability problems on the ratio $\hat q^R_N(x)/\hat q^1_N(x)$. We solve this difficulty by introducing a priori bounds $\{\rho^l(x), \rho^u(x)\}$ on $r(x)$:

$$\rho^l(x) \;\le\; r(x) \;\le\; \rho^u(x) \quad \text{for all } x,$$

and we define the truncated estimator:

$$\hat r_N(x) := T^\rho\big( r_N(x) \big) := \rho^l(x) \vee r_N(x) \wedge \rho^u(x).$$

We are now ready for the main result of this section.

Theorem 5.1 Let $R$ and $S$ be random variables valued respectively in $\mathbb{R}$ and $\mathbb{R}^d$. Assume that $S$ has a density $q^1 > 0$ with respect to the Lebesgue measure on $\mathbb{R}^d$. For all $x \in \mathbb{R}^d$, let $\{A^i(x)\}_{1\le i\le N}$ and $\{B^i(x)\}_{1\le i\le N}$ be two families of independent and identically distributed random variables on $\mathbb{R}$ satisfying $E[A^i(x)] = q^R(x)$ and $E[B^i(x)] = q^1(x)$ for all $1 \le i \le N$. Set $\gamma(x) := (\rho^u - \rho^l)(x)$, and assume that

$$\Gamma(r, \gamma, V^1, V^R) := \int_{\mathbb{R}^d} \gamma(x)^{p-1} \Big[ V^R(x)^{1/2} + \big( |r(x)| + \gamma(x) \big)\, V^1(x)^{1/2} \Big]\,dx < \infty \qquad(5.2)$$

for some $p \ge 1$. Then:

$$\big\| \hat r_N(S) - r(S) \big\|_{L^p} \;\le\; \Big( 2\,\Gamma(r, \gamma, V^1, V^R)\, N^{-1/2} \Big)^{1/p}.$$

Proof. We first estimate that:

$$\big\| \hat r_N(S) - r(S) \big\|^p_{L^p} \;\le\; E\Big[ |r_N(S) - r(S)|^p \wedge \gamma(S)^p \Big] = E\Big[ \frac{|\epsilon^R_N(S) - r(S)\,\epsilon^1_N(S)|^p}{|\hat q^1_N(S)|^p} \wedge \gamma(S)^p \Big], \qquad(5.3)$$

where $\epsilon^R_N(x, \omega) := \hat q^R_N(x, \omega) - q^R(x)$ and $\epsilon^1_N(x, \omega) := \hat q^1_N(x, \omega) - q^1(x)$.

For later use, we observe that:

$$\| \epsilon^R_N(x) \|_{L^1} \le \| \epsilon^R_N(x) \|_{L^2} \le N^{-1/2}\, V^R(x)^{1/2}, \qquad \| \epsilon^1_N(x) \|_{L^1} \le \| \epsilon^1_N(x) \|_{L^2} \le N^{-1/2}\, V^1(x)^{1/2},$$

by (5.1). Next, for all $x \in \mathbb{R}^d$, we consider the event set

$$M(x) := \big\{ \omega \in \Omega : |\hat q^1_N(x, \omega) - q^1(x)| \le 2^{-1} q^1(x) \big\},$$

and observe that, for a.e. $\omega \in \Omega$,

$$\frac{|\epsilon^R_N(x, \omega) - r(x)\,\epsilon^1_N(x, \omega)|^p}{|\hat q^1_N(x, \omega)|^p} \wedge \gamma(x)^p \;\le\; \Big( \frac{2\,|\epsilon^R_N(x, \omega) - r(x)\,\epsilon^1_N(x, \omega)|}{q^1(x)} \Big)^p \wedge \gamma(x)^p\, \mathbf{1}_{M(x)}(\omega) + \gamma(x)^p\, \mathbf{1}_{M(x)^c}(\omega). \qquad(5.4)$$

As for the first term on the right hand-side, we directly compute that, for $p \ge 1$:

$$\Big( \frac{2\,|\epsilon^R_N(x) - r(x)\,\epsilon^1_N(x)|}{q^1(x)} \Big)^p \wedge \gamma(x)^p \;\le\; \frac{2\,|\epsilon^R_N(x) - r(x)\,\epsilon^1_N(x)|}{q^1(x)}\, \gamma(x)^{p-1},$$

so that

$$E\Big[ \Big( \frac{2\,|\epsilon^R_N(S) - r(S)\,\epsilon^1_N(S)|}{q^1(S)} \Big)^p \wedge \gamma(S)^p\, \mathbf{1}_{M(S)} \Big] \;\le\; 2 \int_{\mathbb{R}^d} E\big| \epsilon^R_N(x) - r(x)\,\epsilon^1_N(x) \big|\, \gamma(x)^{p-1}\,dx$$
$$\;\le\; 2 \int_{\mathbb{R}^d} \Big( \| \epsilon^R_N(x) \|_{L^2} + |r(x)|\, \| \epsilon^1_N(x) \|_{L^2} \Big)\, \gamma(x)^{p-1}\,dx = 2 N^{-1/2} \int_{\mathbb{R}^d} \Big( V^R(x)^{1/2} + |r(x)|\, V^1(x)^{1/2} \Big)\, \gamma(x)^{p-1}\,dx. \qquad(5.5)$$

The second term on the right hand-side of (5.4) is estimated by means of the Tchebychev inequality:

$$E\big[ \gamma(S)^p\, \mathbf{1}_{M(S)^c} \big] = E\big[ \gamma(S)^p\, P[M(S)^c \,|\, S] \big] = E\big[ \gamma(S)^p\, P\big[ 2|\hat q^1_N(S) - q^1(S)| > q^1(S) \,\big|\, S \big] \big]$$
$$\;\le\; 2\, E\big[ \gamma(S)^p\, q^1(S)^{-1}\, E\big[ |\hat q^1_N(S) - q^1(S)| \,\big|\, S \big] \big] \;\le\; 2\, E\big[ \gamma(S)^p\, q^1(S)^{-1}\, \mathrm{Var}\big[ \hat q^1_N(S) \,\big|\, S \big]^{1/2} \big] = 2 N^{-1/2} \int_{\mathbb{R}^d} \gamma(x)^p\, V^1(x)^{1/2}\,dx. \qquad(5.6)$$

The required result follows by plugging inequalities (5.4), (5.5) and (5.6) into (5.3).

Remark 5.1 Observe from the above proof that the error estimate of Theorem 5.1 could have been written in terms of $\| \epsilon^R_N(x) \|_{L^1}$ and $\| \epsilon^1_N(x) \|_{L^1}$ instead of $N^{-1/2} V^R(x)^{1/2}$ and $N^{-1/2} V^1(x)^{1/2}$. In that case, the estimate of Theorem 5.1 reads:

$$\big\| \hat r_N(S) - r(S) \big\|^p_{L^p} \;\le\; 2 \int_{\mathbb{R}^d} \gamma(x)^{p-1} \Big[ \| \epsilon^R_N(x) \|_{L^1} + \big( |r(x)| + \gamma(x) \big)\, \| \epsilon^1_N(x) \|_{L^1} \Big]\,dx,$$

a result which does not require the assumption $E[A(x)] = q^R(x)$, $E[B(x)] = q^1(x)$. In the kernel approach, $\| \epsilon^R_N(x) \|_{L^1}$ and $\| \epsilon^1_N(x) \|_{L^1}$ will typically go to zero as $N$ tends to infinity. A detailed study of the above quantity is left to future research.
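The truncation device of this section can be phrased generically in code: given samples of the numerator and denominator variables $A(x)$ and $B(x)$, the estimator is the ratio of empirical means clipped to the a priori bounds. The sketch below is ours, not the paper's; the sample-generating callables `draw_A` and `draw_B` and the bound functions `rho_lower`, `rho_upper` are placeholders for whichever construction (kernel or Malliavin) is used.

```python
import numpy as np

def truncated_ratio_estimator(x, draw_A, draw_B, rho_lower, rho_upper, N, rng):
    """Estimate r(x) = q^R(x) / q^1(x) by the ratio of empirical means of
    N i.i.d. copies of A(x) and B(x), then truncate to [rho_lower(x), rho_upper(x)]
    as in Theorem 5.1; the truncation restores integrability of the estimator."""
    A = np.array([draw_A(x, rng) for _ in range(N)])
    B = np.array([draw_B(x, rng) for _ in range(N)])
    q_R_hat = A.mean()
    q_1_hat = B.mean()
    raw = q_R_hat / q_1_hat if q_1_hat != 0.0 else np.inf
    return min(max(raw, rho_lower(x)), rho_upper(x))
```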

6 Malliavin calculus based regression approximation

In this section, we concentrate on the Malliavin approach for the approximation of the conditional expectation $E_{t_i}$, as introduced in Example 4.2. We shall assume throughout this section that

$$b, \sigma \in C_b^\infty \quad \text{and} \quad \inf\big\{ |\sigma(x)\xi| : \xi \in \mathbb{R}^d,\ |\xi| = 1 \big\} > 0 \ \text{for all } x \in \mathbb{R}^d. \qquad(6.1)$$

For the sake of simplicity, we shall also restrict the presentation to the case of regular sampling: $\Delta_i = t_i - t_{i-1} = |\pi|$ for all $1 \le i \le n$.

6.1 Alternative representation of conditional expectation

We start by introducing some notations. Throughout this section, we shall denote by $\mathcal{J}_k$ the subset of $\mathbb{N}^k$ whose elements $I = (i_1, \dots, i_k)$ satisfy $1 \le i_1 < \dots < i_k \le d$. We extend this definition to $k = 0$ by setting $\mathcal{J}_0 = \emptyset$. Let $I = (i_1, \dots, i_m)$ and $J = (j_1, \dots, j_n)$ be two arbitrary elements in $\mathcal{J}_m$ and $\mathcal{J}_n$. Then $\{i_1, \dots, i_m\} \cup \{j_1, \dots, j_n\} = \{k_1, \dots, k_p\}$ for some $\max\{n, m\} \le p \le \min\{d, m+n\}$, and $1 \le k_1 < \dots < k_p \le d$. We then denote $I \vee J := (k_1, \dots, k_p) \in \mathcal{J}_p$.

Given a matrix-valued process $h$, with columns denoted by $h^i$, and a random variable $F$, we denote

$$S^{h^i}(F) := \int_0^1 F\, h^i_t\cdot dW_t \quad \text{for } i = 1, \dots, d, \qquad \text{and} \qquad S^h_I(F) := S^{h^{i_1}}\big( \cdots S^{h^{i_k}}(F) \cdots \big) \quad \text{for } I = (i_1, \dots, i_k) \in \mathcal{J}_k,$$

whenever these stochastic integrals exist in the Skorohod sense. We extend this definition to $k = 0$ by setting $S^h_\emptyset(F) := F$. Similarly, for $I \in \mathcal{J}_k$, we set $S^h_{-I}(F) := S^h_{\bar I}(F)$, where $\bar I \in \mathcal{J}_{d-k}$ and $I \vee \bar I$ is the unique element of $\mathcal{J}_d$.

Let $\varphi$ be a $C^0_b(\mathbb{R}^d_+)$ mapping, i.e. continuous and bounded, from $\mathbb{R}^d_+$ into $\mathbb{R}$. We say that $\varphi$ is a smooth localizing function if $\varphi(0) = 1$ and $\partial_I \varphi \in C^0_b(\mathbb{R}^d_+)$ for all $k = 0, \dots, d$ and $I \in \mathcal{J}_k$. Here, $\partial_I \varphi = \partial^k \varphi / \partial x_{i_1} \cdots \partial x_{i_k}$. For $k = 0$, $\mathcal{J}_k = \emptyset$, and we set $\partial_\emptyset \varphi := \varphi$. We denote by $\mathcal{L}$ the collection of all smooth localizing functions.

With these notations, we introduce the set $\mathcal{H}_i(X^\pi)$, $1 \le i \le n-1$, as the collection of all matrix-valued square integrable processes $h$ satisfying

$$\int_0^1 D_t X^\pi_{t_i}\, h_t\,dt = I_d \quad \text{and} \quad \int_0^1 D_t X^\pi_{t_{i+1}}\, h_t\,dt = 0 \qquad(6.2)$$

(here $I_d$ denotes the identity matrix of $M^d$), and such that, for any affine function $a : \mathbb{R}^d \to \mathbb{R}$,

$$S^h_I\big( a(\Delta W_{i+1})\, \varphi(X^\pi_{t_i} - \cdot) \big) \text{ is well-defined in } \mathbb{D}^{1,2} \quad \text{for all } I \in \mathcal{J}_k,\ k \le d,\ \varphi \in \mathcal{L}. \qquad(6.3)$$

For later use, we observe that, by a straightforward extension of Remark 3.4 in [7], we have the representation:

$$S^h\big( a(\Delta W_{i+1})\, \varphi(X^\pi_{t_i} - x) \big) = \sum_{j=0}^d (-1)^j \sum_{J \in \mathcal{J}_j} \partial_J \varphi(X^\pi_{t_i} - x)\, S^h_{-J}\big( a(\Delta W_{i+1}) \big), \qquad(6.4)$$

for any affine function $a : \mathbb{R}^d \to \mathbb{R}$. Moreover, it follows from (6.1) that $X^\pi_{t_i} \in \mathbb{D}^\infty$ for each $i \in \{1, \dots, n\}$, where the Malliavin derivatives can be computed recursively as follows:

$$D_t X^\pi_{t_1} = \sigma(X^\pi_{t_0})\, \mathbf{1}_{t \le t_1},$$
$$D_t X^\pi_{t_i} = D_t X^\pi_{t_{i-1}} + \nabla b(X^\pi_{t_{i-1}})\, D_t X^\pi_{t_{i-1}}\, |\pi| + \sum_{j=1}^d \nabla\sigma^j(X^\pi_{t_{i-1}})\, D_t X^\pi_{t_{i-1}}\, \Delta W^j_i + \sigma(X^\pi_{t_{i-1}})\, \mathbf{1}_{t_{i-1} < t \le t_i}.$$

In particular, for $t \in (t_{i-1}, t_i]$, we obtain that $D_t X^\pi_{t_i} = \sigma(X^\pi_{t_{i-1}})$. Let $\hat h^i$ be the $M^d$-valued process defined by:

$$\hat h^i_t := |\pi|^{-1}\, \sigma(X^\pi_{t_{i-1}})^{-1} \quad \text{on } (t_{i-1}, t_i],$$
$$\hat h^i_t := -|\pi|^{-1}\, \sigma(X^\pi_{t_i})^{-1} \Big[ I_d + \nabla b(X^\pi_{t_i})\, |\pi| + \sum_{j=1}^d \nabla\sigma^j(X^\pi_{t_i})\, \Delta W^j_{i+1} \Big] \quad \text{on } (t_i, t_{i+1}],$$
$$\hat h^i_t := 0 \quad \text{elsewhere}. \qquad(6.5)$$

Since $b, \sigma, \sigma^{-1} \in C_b^\infty$ by (6.1), one easily checks that $\hat h^i$ satisfies (6.2)-(6.3), and therefore lies in $\mathcal{H}_i(X^\pi)$.

Remark 6.1 For later use, let us observe that, for all $s_1, \dots, s_l \in (t_{i-1}, t_{i+1}]$, it follows from (6.1) that:

$$\sup_{t_{i-1} < t \le t_{i+1}} \Big\{ \| \hat h^i_t \|_{L^p} + \| D_{s_1,\dots,s_l} \hat h^i_t \|_{L^p} \Big\} \;\le\; C_p\, |\pi|^{-1}, \qquad(6.6)$$

for some constant $C_p$ which does not depend on $\pi$. Moreover, for any affine function $a : \mathbb{R}^d \to \mathbb{R}$ with constant gradient $\nabla a$, and all $1 \le k \le d$,

$$S^{\hat h^i}_{\{k\}}\big( a(\Delta W_{i+1}) \big) = a(\Delta W_{i+1}) \int_{t_{i-1}}^{t_i} \hat h^{i,k}_t\cdot dW_t + a(\Delta W_{i+1}) \int_{t_i}^{t_{i+1}} \hat h^{i,k}_t\cdot dW_t - a(\Delta W_{i+1}) \int_{t_i}^{t_{i+1}} \mathrm{Trace}\big[ D_t \hat h^{i,k}_t \big]\,dt - \int_{t_i}^{t_{i+1}} \nabla a\cdot \hat h^{i,k}_t\,dt,$$

so that we also deduce the estimates:

$$\big\| S^{\hat h^i}_{\{k\}}\big( a(\Delta W_{i+1}) \big) \big\|_{L^p} \le C_p\, |\pi|^{-1/2}, \qquad \big\| D_{s_1,\dots,s_l}\, S^{\hat h^i}_{\{k\}}\big( a(\Delta W_{i+1}) \big) \big\|_{L^p} \le C_p\, |\pi|^{-1}. \qquad(6.7)$$

We now provide a slight extension of Corollary 3.1 in [7], which will be the starting point for our conditional expectation estimator.

Theorem 6.1 Let $\varrho$ be a real-valued mapping and $\xi$ be a vector random variable independent of $\sigma(X^\pi_{t_i},\ 1 \le i \le n)$, with $R := \varrho(X^\pi_{t_{i+1}}, \xi)\, a(\Delta W_{i+1}) \in L^2$ for some affine function $a : \mathbb{R}^d \to \mathbb{R}$. Then, for all localizing functions $\varphi, \psi \in \mathcal{L}$:

$$E\big[ R \,\big|\, X^\pi_{t_i} = x \big] = \frac{E\big[ Q^R_{\hat h^i, \varphi}(x) \big]}{E\big[ Q^1_{\hat h^i, \psi}(x) \big]}, \qquad(6.8)$$

where

$$Q^R_{\hat h^i, \varphi}(x) := H(x - X^\pi_{t_i})\, \varrho(X^\pi_{t_{i+1}}, \xi)\, S^{\hat h^i}\big( a(\Delta W_{i+1})\, \varphi(X^\pi_{t_i} - x) \big), \qquad \text{and} \quad S^{\hat h^i} = S^{\hat h^i}_{(1,\dots,d)}.$$

Moreover, if $q$ denotes the density of $X^\pi_{t_i}$, then:

$$q(x) = E\big[ Q^1_{\hat h^i, \varphi}(x) \big].$$

Remark 6.2 The above theorem holds for any random variable $F \in \mathbb{D}^\infty$ instead of the particular affine transformation $a(\Delta W_{i+1})$ of the Brownian increments. One only has to change the definition of $\mathcal{L}$ accordingly in order to ensure that the involved Skorohod integrals are well-defined. However, we shall see in Section 6.2 that we only need this characterization for the affine functions $a_k(x) = \mathbf{1}_{k=0} + x^k\, \mathbf{1}_{k\ge 1}$, $0 \le k \le d$. Indeed, writing $\hat Y_{t_{i+1}}$ as $\varrho(X^\pi_{t_{i+1}}, \zeta)$, we are interested in computing

$$E_{t_i}\big[ \hat Y_{t_{i+1}} \big] = E_{t_i}\big[ \hat Y_{t_{i+1}}\, a_0(\Delta W_{i+1}) \big] \quad \text{and} \quad E_{t_i}\big[ \hat Y_{t_{i+1}}\, \Delta W^k_{i+1} \big] = E_{t_i}\big[ \hat Y_{t_{i+1}}\, a_k(\Delta W_{i+1}) \big], \quad 1 \le k \le d,$$

see also the definition of $R^j_i$ after (6.12).

6.2 Application to the estimation of $\hat E_{t_i}$

The algorithm is inspired by the work of [9] and [21]. We consider $nN$ copies $(X^{\pi,1}, \dots, X^{\pi,nN})$ of the discrete-time process $X^\pi$ on the grid $\pi$, where $N$ is some positive integer. Set $N_i := \{(i-1)N + 1, \dots, iN\}$, $1 \le i \le n$. For ease of notation, we write $X^{\pi,0}$ for $X^\pi$. We consider the approximation scheme (4.2)-(4.3)-(4.4) with an approximation of the conditional expectation operator $E_{t_i}$ suggested by Theorem 6.1. At each time step $i$ of this algorithm, we shall make use of the subset of simulated paths

$\{X^{\pi,j},\ j \in N_i\}$. The independence of the $X^{\pi,j}$'s is crucial, as explained in Remark 6.3 below.

Initialization: For $j \in \{0\} \cup N_n$, we set:

$$\hat Y^j_{t_n} := Y^{\pi,j}_{t_n} = g(X^{\pi,j}_{t_n}). \qquad(6.9)$$

Backward induction: For $i = n, \dots, 2$, we set, for $j \in \{0\} \cup N_{i-1}$:

$$\check Y^j_{t_{i-1}} = \hat E^j_{t_{i-1}}\big[ \hat Y^j_{t_i} \big] + f(t_{i-1}, X^{\pi,j}_{t_{i-1}}, \check Y^j_{t_{i-1}}, \hat Z^j_{t_{i-1}})\,|\pi|, \qquad \hat Y^j_{t_{i-1}} := T^{\Lambda,j}_{i-1}\big( \check Y^j_{t_{i-1}} \big), \qquad(6.10)$$
$$\hat Z^j_{t_{i-1}} = |\pi|^{-1}\, \hat E^j_{t_{i-1}}\big[ \hat Y^j_{t_i}\, \Delta W^j_i \big]. \qquad(6.11)$$

The approximations $\hat E^j_{t_{i-1}}$ of the conditional expectations are obtained as follows.

1. We first compute the estimator suggested by Theorem 6.1:

$$\tilde E^j_{t_{i-1}}\big[ R^j_i \big] := \frac{\hat Q^{R_i}_{\hat h, \varphi_\pi}(X^{\pi,j}_{t_{i-1}})}{\hat Q^{1}_{\hat h, \varphi_\pi}(X^{\pi,j}_{t_{i-1}})}, \qquad(6.12)$$

where, for $R^l_i = \hat Y^l_{t_i}\, a(\Delta W^l_i)$ and $a : \mathbb{R}^d \to \mathbb{R}$ an affine function,

$$\hat Q^{R_i}_{\hat h, \varphi_\pi}(X^{\pi,j}_{t_{i-1}}) := \frac{1}{N} \sum_{l \in N_i} H\big( X^{\pi,j}_{t_{i-1}} - X^{\pi,l}_{t_{i-1}} \big)\, \hat Y^l_{t_i}\, S^{\hat h^l}\big( a(\Delta W^l_i)\, \varphi_\pi(X^{\pi,l}_{t_{i-1}} - X^{\pi,j}_{t_{i-1}}) \big),$$
$$\hat Q^{1}_{\hat h, \varphi_\pi}(X^{\pi,j}_{t_{i-1}}) := \frac{1}{N} \sum_{l \in N_i} H\big( X^{\pi,j}_{t_{i-1}} - X^{\pi,l}_{t_{i-1}} \big)\, S^{\hat h^l}\big( \varphi_\pi(X^{\pi,l}_{t_{i-1}} - X^{\pi,j}_{t_{i-1}}) \big),$$

with $\hat h^l$ denoting the process (6.5) computed along the $l$-th simulated path.

2. We next use the sequence $\Lambda$ of a priori bounds on $Y^\pi$, see Section 4, together with the induced sequences $R$ and $I$, to improve the above estimator:

$$\hat E^j_{t_{i-1}}\big[ \hat Y^j_{t_i} \big] := T^{R,j}_{i-1}\big( \tilde E^j_{t_{i-1}}[\hat Y^j_{t_i}] \big) \quad \text{and} \quad \hat E^j_{t_{i-1}}\big[ \hat Y^j_{t_i}\,\Delta W^j_i \big] := T^{I,j}_{i-1}\big( \tilde E^j_{t_{i-1}}[\hat Y^j_{t_i}\,\Delta W^j_i] \big).$$

Final step: For $i = 1$, the conditional expectations $\hat E_{t_{i-1}} = \hat E_{t_0}$ are computed by the usual empirical mean:

$$\hat E^0_{t_0}\big[ R^0_1 \big] := \frac{1}{N} \sum_{l \in N_1} R^l_1. \qquad(6.13)$$

Remark 6.3 Notice that, by construction, for each $i = 1, \dots, n-1$ and $k \in N_i$, $(\hat Y^k_{t_i}, \hat Z^k_{t_i})$ can be written as a square integrable function of $X^{\pi,k}_{t_i}$ and $\zeta_i := (\Delta W^l_j,\ j \ge i+1,\ l \in N_j)$. This is precisely the reason why our simulation scheme uses $n$ independent sets of $N$ simulated paths. Indeed, this ensures that the above random variable $\zeta_i$ is independent of $\mathcal{F}^k_{t_{i-1}}$, and therefore we fall in the context of Theorem 6.1.
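Structurally, the algorithm of this subsection is a single backward sweep over $n$ independent blocks of $N$ simulated paths. The Python sketch below shows only that organization and is not the paper's implementation: `regress` is a placeholder for the estimator (6.12) (or any other conditional expectation estimator), `lam_lo`/`lam_hi` stand for the bounds of Lemma 3.3, the implicit equation (6.10) is replaced by a single explicit evaluation for brevity, and all helper names are ours.

```python
import numpy as np

def backward_simulation(blocks, t, f, g, regress, lam_lo, lam_hi):
    """One backward sweep of the scheme (6.9)-(6.13).

    blocks[i] (i = 1..n) holds the i-th independent set of N simulated paths:
    blocks[i]['X'][k] is X^pi at t_k and blocks[i]['dW'][k] is W_{t_{k+1}} - W_{t_k}.
    regress(x_eval, x_sample, targets) estimates E[targets | X^pi = x] at x_eval."""
    n = len(t) - 1
    Y = g(blocks[n]['X'][n])                      # (6.9): terminal values on block n
    for i in range(n, 1, -1):
        dt = t[i] - t[i - 1]
        x_sample = blocks[i]['X'][i - 1]          # block i at t_{i-1}
        dW = blocks[i]['dW'][i - 1]               # its increments over (t_{i-1}, t_i]
        x_eval = blocks[i - 1]['X'][i - 1]        # block i-1 at t_{i-1} (next step's sample)
        EY = regress(x_eval, x_sample, Y)
        EYdW = regress(x_eval, x_sample, Y[:, None] * dW)
        Z = EYdW / dt                             # (6.11)
        Ycheck = EY + f(t[i - 1], x_eval, EY, Z) * dt   # explicit surrogate for (6.10)
        Y = np.clip(Ycheck, lam_lo(i - 1, x_eval), lam_hi(i - 1, x_eval))
    # Final step (6.13): at t_0 all paths share the same starting point, so the
    # conditional expectations reduce to plain empirical means over block 1.
    dt = t[1] - t[0]
    Z0 = (Y[:, None] * blocks[1]['dW'][0]).mean(axis=0) / dt
    Y0 = Y.mean()
    for _ in range(3):                            # small fixed-point loop at t_0
        Y0 = Y.mean() + f(t[0], blocks[1]['X'][0][0], Y0, Z0) * dt
    return Y0
```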

The following provides an estimate of the simulation error in the above algorithm.

Theorem 6.2 Let $p > 1$ and $\varphi \in \mathcal{L}$ satisfy

$$\sum_{k=0}^d \sum_{I \in \mathcal{J}_k} \int_{\mathbb{R}^d} |u|^{4p+2}\, |\partial_I \varphi(u)|^2\,du < \infty.$$

Consider the function $\varphi_\pi(x) = \varphi(|\pi|^{-1/2} x)$ as a localizing function in (4.2)-(4.3)-(4.4)-(6.12), and let $(\Lambda, R, I)$ be the bounds defined by Lemma 3.3. Then:

$$\limsup_{|\pi|\to 0}\ \max_{0\le i\le n}\ |\pi|^{p+d/4}\, N^{1/2}\, \big\| \hat Y_{t_i} - Y^\pi_{t_i} \big\|^p_{L^p} < \infty.$$

The above estimate is obtained in two steps. First, Theorem 4.1 reduces the problem to the analysis of the regression simulation error. Next, for $1 \le i \le n$, the result follows from Theorem 6.3, which is the main object of the subsequent paragraph. The case $i = 0$ is trivial, as the regression estimator (6.13) is the classical empirical mean.

Remark 6.4 In the particular case where the generator $f$ does not depend on the control variable $z$, the above proposition is valid with $p = 1$. This follows from Remark 4.2.

Remark 6.5 In the previous proposition, we have introduced the normalized localizing function $\varphi_\pi(x) = \varphi(|\pi|^{-1/2} x)$. This normalization is necessary for the control of the error estimate as $|\pi|$ tends to zero. An interesting observation is that, in the case where $R$ is of the form $\varrho(X^\pi_{t_{i+1}}, \zeta)\, a(\Delta W_{i+1})$ for some affine function $a : \mathbb{R}^d \to \mathbb{R}$, this normalization is in agreement with [7], who showed that the minimal integrated variance within the class of separable localizing functions is achieved by an exponential localizing function $\hat\varphi(x) = \exp(-\hat\eta\cdot x)$, where each parameter $\hat\eta_j$, $1 \le j \le d$, is the square root of a ratio of two expectations of sums of squared Skorohod integrals $S^{\hat h^i}_{-(I\vee j)}(a(\Delta W_{i+1}))$ and $S^{\hat h^i}_{-I}(a(\Delta W_{i+1}))$, $I \in \mathcal{J}_k$, $j \notin I$ (see [7] for the exact expression). Indeed, we will show in Lemma 6.1 below that $S^{\hat h^i}_I(a(\Delta W_{i+1}))$ is of order $|\pi|^{-|I|/2}$, and therefore the above ratio is of order $|\pi|^{-1/2}$.

6.3 Analysis of the regression error

According to Theorem 5.1, the $L^p$ estimate of the regression error depends on the integrated standard deviation $\Gamma$ defined in (5.2). In order to analyze this term, we start with the following.

Lemma 6.1 For any integer $m = 0, \dots, d$, $I \in \mathcal{J}_m$, and any affine function $a : \mathbb{R}^d \to \mathbb{R}$, we have:

$$\limsup_{|\pi|\to 0}\ |\pi|^{m/2}\ \max_{1\le i\le n-1}\ \big\| S^{\hat h^i}_I\big( a(\Delta W_{i+1}) \big) \big\|_{L^p} < \infty, \quad \text{for all } p \ge 1.$$

Proof. Let $d \ge j_1 > \dots > j_m \ge 1$ be $m$ integers, and define $I_k := (j_k, \dots, j_1) \in \mathcal{J}_k$, so that $I_m = I$. For ease of presentation, we introduce the process $h := \hat h^i$, and we shall write $S^h_{I_k}$ for $S^h_{I_k}(a(\Delta W_{i+1}))$. In order to prove the required result, we intend to show that, for all integers $l \ge 0$, $1 \le k \le m$, and all $\tau := (s_1, \dots, s_l) \in (t_{i-1}, t_{i+1}]^l$,

$$\big\| D_\tau S^h_{I_k} \big\|_{L^p} \;\le\; C_p\, |\pi|^{-(k+l)/2}, \qquad(6.14)$$

where $C_p$ is a constant which does not depend on $\pi$, and, for $l = 0$, $\tau = \emptyset$ and $D_\tau F = F$. We shall use an induction argument on the variable $k$. First, for $k = 1$, (6.14) follows from (6.7). We next assume that (6.14) holds for some $1 \le k \le m-1$, and we intend to extend it to $k+1$. We first need to introduce some notations. Let $\mathcal{S}_l$ be the collection of all permutations $\sigma$ of the set $\{1, \dots, l\}$. For $\sigma \in \mathcal{S}_l$ and some integer $u \le l$, we set $\tau_\sigma(u) := (s_{\sigma(1)}, \dots, s_{\sigma(u)})$ and $\tau^\sigma(u) := (s_{\sigma(u+1)}, \dots, s_{\sigma(l)})$, with the convention $\tau_\sigma(0) = \tau^\sigma(l) = \emptyset$. Writing $S^h_{I_{k+1}} = S^{h^{j_{k+1}}}\big( S^h_{I_k} \big)$ and using the product rule for the Malliavin derivative of a Skorohod integral, we see by direct computation that

$$\big\| D_\tau S^h_{I_{k+1}} \big\|_{L^p} \;\le\; \sum_{\sigma \in \mathcal{S}_l} \sum_{u=0}^{l} \Big\{ \big\| D_{\tau_\sigma(u)} S^h_{I_k} \big\|_{L^{2p}}\, \big\| D_{\tau^\sigma(u)} S^{h^{j_{k+1}}}\big( a(\Delta W_{i+1}) \big) \big\|_{L^{2p}} + \int_{t_{i-1}}^{t_{i+1}} \big\| D_{\tau_\sigma(u)} D_t S^h_{I_k} \big\|_{L^{2p}}\, \big\| D_{\tau^\sigma(u)} h^{j_{k+1}}_t \big\|_{L^{2p}}\,dt \Big\},$$

by the Cauchy-Schwarz inequality. By (6.6), (6.7) and the induction hypothesis (6.14), each term in the above sum is of order $|\pi|^{-(k+1+l)/2}$, so that

$$\big\| D_\tau S^h_{I_{k+1}} \big\|_{L^p} \;\le\; C\, |\pi|^{-(k+1+l)/2},$$

where $C$ is a generic constant, independent of $\pi$ and $i$, with different values from line to line. This completes the induction, and the lemma follows by taking $k = m$ and $l = 0$.

Lemma 6.2 Let $\mu$ be a map from $\mathbb{R}^d$ into $\mathbb{R}_+$ with polynomial growth:

$$\sup_{x \in \mathbb{R}^d} \mu(x)\,(1 + |x|^m)^{-1} < \infty, \quad \text{for some } m \ge 1.$$

Let $\varphi \in \mathcal{L}$ be such that:

$$\sum_{k=0}^d \sum_{I \in \mathcal{J}_k} \int_{\mathbb{R}^d} |u|^{m}\, |\partial_I \varphi(u)|^2\,du < \infty.$$

Let $R_{i+1} := \varrho(X^\pi_{t_{i+1}}, \zeta)\, a(\Delta W_{i+1})$ for some deterministic function $\varrho$, some affine function $a : \mathbb{R}^d \to \mathbb{R}$, and some random variable $\zeta$ independent of $\mathcal{F}_{t_{i+1}}$. Assume that $R_{i+1} \in L^{2+\varepsilon}$

for some $\varepsilon > 0$. Then,

$$\limsup_{|\pi|\to 0}\ \max_{1\le i\le n}\ |\pi|^{d/2}\, \| R_{i+1} \|^{-2}_{L^{2+\varepsilon}} \int_{\mathbb{R}^d} \mu(x)\, V_{i,R_{i+1}}(x)\,dx < \infty,$$

where

$$V_{i,R_{i+1}}(x) = \mathrm{Var}\big[ Q^{R_{i+1}}_{\hat h^i, \varphi_\pi}(x) \big] \quad \text{and} \quad \varphi_\pi(x) := \varphi(|\pi|^{-1/2} x).$$

Proof. We shall write $S^{\hat h^i}_{-J}$ for $S^{\hat h^i}_{-J}(a(\Delta W_{i+1}))$. We first estimate that:

$$V_{i,R_{i+1}}(x) \;\le\; E\big[ |Q^{R_{i+1}}_{\hat h^i, \varphi_\pi}(x)|^2 \big] = E\Big[ \Big\{ H(x - X^\pi_{t_i})\, R_{i+1}\, S^{\hat h^i}\big( a(\Delta W_{i+1})\, \varphi_\pi(X^\pi_{t_i} - x) \big) \Big\}^2 \Big] \;\le\; 2^d \sum_{j=0}^d \sum_{J \in \mathcal{J}_j} E\Big[ \Big\{ H(x - X^\pi_{t_i})\, R_{i+1}\, \partial_J \varphi_\pi(X^\pi_{t_i} - x)\, S^{\hat h^i}_{-J} \Big\}^2 \Big],$$

where we used (6.4). For ease of notation, we introduce the parameter $\eta > 0$ such that $2(1+\eta)^2 = 2+\varepsilon$, and $\eta' := 1 + 1/\eta$ is the conjugate of $1+\eta$. Applying twice the Hölder inequality, we see that:

$$\int_{\mathbb{R}^d} \mu(x)\, V_{i,R_{i+1}}(x)\,dx \;\le\; 2^d \sum_{j=0}^d \sum_{J \in \mathcal{J}_j} E \int_{\mathbb{R}^d} \Big\{ H(x - X^\pi_{t_i})\, R_{i+1}\, \partial_J \varphi_\pi(X^\pi_{t_i} - x)\, S^{\hat h^i}_{-J} \Big\}^2 \mu(x)\,dx \;\le\; 2^d\, \| R_{i+1} \|^2_{L^{2+\varepsilon}} \sum_{j=0}^d \sum_{J \in \mathcal{J}_j} \big\| S^{\hat h^i}_{-J} \big\|^2_{L^{2\eta'(1+\eta)}}\, \| A_J \|_{L^{\eta'}}, \qquad(6.15)$$

where

$$A_J := \int_{\mathbb{R}^d} H(x - X^\pi_{t_i})\, \mu(x)\, |\partial_J \varphi_\pi(X^\pi_{t_i} - x)|^2\,dx.$$

By definition of $\varphi_\pi$, we observe that $\partial_J \varphi_\pi(x) = |\pi|^{-|J|/2}\, (\partial_J \varphi)(|\pi|^{-1/2} x)$. It then follows from a direct change of variables, together with the polynomial growth condition on $\mu$, that:

$$A_J = |\pi|^{d/2 - |J|} \int_{\mathbb{R}^d} \mu\big( X^\pi_{t_i} - |\pi|^{1/2} x \big)\, |\partial_J \varphi(x)|^2\,dx \;\le\; C\, |\pi|^{d/2 - |J|} \int_{\mathbb{R}^d} \big( 1 + |X^\pi_{t_i}|^m + |\pi|^{m/2}\, |x|^m \big)\, |\partial_J \varphi(x)|^2\,dx \;\le\; C\, |\pi|^{d/2 - |J|} \big( 1 + |X^\pi_{t_i}|^m \big) \int_{\mathbb{R}^d} (1 + |x|^m)\, |\partial_J \varphi(x)|^2\,dx.$$

Notice that the right hand-side integral is finite by our assumption on the localizing function. Since $\max_{1\le i\le n} \| X^\pi_{t_i} \|_{L^{m\eta'}}$ is bounded uniformly in $\pi$ by (3.2), this proves that $\| A_J \|_{L^{\eta'}} \le C\, |\pi|^{d/2 - |J|}$. Plugging this into (6.15), we obtain:

$$\int_{\mathbb{R}^d} \mu(x)\, V_{i,R_{i+1}}(x)\,dx \;\le\; C\, \| R_{i+1} \|^2_{L^{2+\varepsilon}} \sum_{j=0}^d \sum_{J \in \mathcal{J}_j} |\pi|^{d/2 - |J|}\, \big\| S^{\hat h^i}_{-J} \big\|^2_{L^{2\eta'(1+\eta)}},$$

and the required result follows from Lemma 6.1 (recall that $-J$ has $d - |J|$ elements, see the definitions in Section 6.1).

Theorem 6.3 Let $R_{i+1} := \varrho(X^\pi_{t_{i+1}}, \zeta)\, a(\Delta W_{i+1})$ for some deterministic function $\varrho$, some affine function $a : \mathbb{R}^d \to \mathbb{R}$, and some random variable $\zeta$ independent of $\mathcal{F}_{t_{i+1}}$. Assume that

$$\rho^l_i(X^\pi_{t_i}) \;\le\; r_i(X^\pi_{t_i}) := E\big[ R_{i+1} \,\big|\, X^\pi_{t_i} \big] \;\le\; \rho^u_i(X^\pi_{t_i})$$

for some $\rho = (\rho^l, \rho^u)$ with polynomial growth:

$$\sup_{x \in \mathbb{R}^d}\ \max_{1\le i\le n}\ \big( |\rho^l_i(x)| + |\rho^u_i(x)| \big)\,(1 + |x|^m)^{-1} < \infty, \quad \text{for some } m \ge 0.$$

Let $p \ge 1$ be arbitrary, consider some localizing function $\varphi \in \mathcal{L}$ satisfying

$$\sum_{k=0}^d \sum_{I \in \mathcal{J}_k} \int_{\mathbb{R}^d} |u|^{2pm+2}\, |\partial_I \varphi(u)|^2\,du < \infty,$$

and set $\varphi_\pi(x) := \varphi(|\pi|^{-1/2} x)$. Let $\tilde E_{t_i}[R_{i+1}]$ be defined as in (6.12), with localizing function $\varphi_\pi$, and consider the truncated regression estimator $\hat E_{t_i}[R_{i+1}] := T^\rho\big( \tilde E_{t_i}[R_{i+1}] \big)$. Then,

$$\limsup_{|\pi|\to 0}\ \max_{1\le i\le n}\ |\pi|^{d/4}\, N^{1/2}\, \big\| (\hat E_{t_i} - E_{t_i})[R_{i+1}] \big\|^p_{L^p} < \infty.$$

Proof. Set $\gamma := \rho^u - \rho^l$ and observe that $\gamma$ inherits the polynomial growth of $\rho$. With the notations of Lemma 6.2, it follows from Theorem 5.1 that:

$$\big\| (\hat E_{t_i} - E_{t_i})[R_{i+1}] \big\|^p_{L^p} \;\le\; 2\, N^{-1/2}\, \Gamma\big( r_i, \gamma, V_{i,1}, V_{i,R_{i+1}} \big),$$

provided that the right hand-side is finite. The rest of this proof is dedicated to the estimation of this term. From the polynomial growth condition on $\rho$, we estimate that:

$$\Gamma\big( r_i, \gamma, V_{i,1}, V_{i,R_{i+1}} \big) \;\le\; C \int_{\mathbb{R}^d} (1 + |x|^{mp})\, V_{i,R_{i+1}}(x)^{1/2}\,dx + C \int_{\mathbb{R}^d} (1 + |x|^{mp})\, V_{i,1}(x)^{1/2}\,dx.$$

We only consider the first term on the right hand-side, as the second one is treated similarly. In order to prove the required result, it is sufficient to show that:

$$\limsup_{|\pi|\to 0}\ \max_{1\le i\le n}\ |\pi|^{d/4} \int_{\mathbb{R}^d} (1 + |x|^{mp})\, V_{i,R_{i+1}}(x)^{1/2}\,dx < \infty.$$

Let $\phi(x) = C_\phi\,(1 + |x|^2)^{-1}$, with $C_\phi$ such that $\int_{\mathbb{R}^d} \phi(x)\,dx = 1$. By the Jensen inequality, we get:

$$\int_{\mathbb{R}^d} (1 + |x|^{mp})\, V_{i,R_{i+1}}(x)^{1/2}\,dx = \int_{\mathbb{R}^d} \phi(x)\, \big[ \phi(x)^{-1}\,(1 + |x|^{mp})\, V_{i,R_{i+1}}(x)^{1/2} \big]\,dx \;\le\; \Big( \int_{\mathbb{R}^d} \phi(x)^{-1}\,(1 + |x|^{mp})^2\, V_{i,R_{i+1}}(x)\,dx \Big)^{1/2} \;\le\; C \Big( \int_{\mathbb{R}^d} (1 + |x|^{2mp+2})\, V_{i,R_{i+1}}(x)\,dx \Big)^{1/2}.$$

The proof is completed by appealing to Lemma 6.2.

7 Extension to reflected backward SDE's

The purpose of this section is to extend our analysis to reflected backward SDE's in the case where the generator $f$ does not depend on the $z$ variable. We then consider $K$-Lipschitz functions $f : [0,1] \times \mathbb{R}^d \times \mathbb{R} \to \mathbb{R}$ and $g : \mathbb{R}^d \to \mathbb{R}$, for some $K > 0$, and we let $(Y, Z, A)$ be the unique solution of:

$$Y_t = g(X_1) + \int_t^1 f(s, X_s, Y_s)\,ds - \int_t^1 Z_s\cdot dW_s + A_1 - A_t, \qquad(7.1)$$
$$Y_t \;\ge\; g(X_t), \qquad 0 \le t \le 1, \qquad(7.2)$$

such that $Y_t \in L^2$ for all $0 \le t \le 1$, $Z \in L^2([0,1] \times \Omega)$, and $A$ is a non-decreasing càdlàg process satisfying:

$$\int_0^1 \big( Y_t - g(X_t) \big)\,dA_t = 0.$$

We refer to [14] for the existence and uniqueness issue.

7.1 Discrete-time approximation

It is well-known that $Y$ admits a Snell envelope type representation. We therefore introduce the discrete-time counterpart of this representation:

$$Y^\pi_{t_n} = g(X^\pi_{t_n}), \qquad(7.3)$$
$$Y^\pi_{t_{i-1}} = \max\Big\{ g(X^\pi_{t_{i-1}}),\ E_{t_{i-1}}\big[ Y^\pi_{t_i} \big] + f(t_{i-1}, X^\pi_{t_{i-1}}, Y^\pi_{t_{i-1}})\,\Delta_i \Big\}, \qquad 1 \le i \le n. \qquad(7.4)$$

Observe that our scheme differs from [4], who consider the backward scheme defined by

$$\tilde Y_{t_{i-1}} = \max\Big\{ g(X^\pi_{t_{i-1}}),\ E_{t_{i-1}}\big[ \tilde Y_{t_i} + f(t_i, X^\pi_{t_i}, \tilde Y_{t_i})\,\Delta_i \big] \Big\}, \qquad 1 \le i \le n,$$

instead of (7.4).
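In code, the only change with respect to the non-reflected scheme is the comparison with the obstacle at every date. A minimal sketch follows, with the same hypothetical `regress` oracle as in the earlier sketches and a small fixed-point loop for the implicit equation (7.4); these choices are ours and are not prescribed by the paper.

```python
import numpy as np

def reflected_backward_scheme(X, t, f, g, regress, n_fixed_point=3):
    """Discrete-time scheme (7.3)-(7.4) for the reflected BSDE with a
    z-independent generator: at each date the continuation value is
    compared with the obstacle g(X_{t_{i-1}})."""
    n = len(t) - 1
    Y = g(X[n])                                   # (7.3)
    for i in range(n, 0, -1):
        dt = t[i] - t[i - 1]
        EY = regress(X[i - 1], Y)                 # E[ Y_{t_i} | X_{t_{i-1}} ]
        Ynew = EY.copy()
        for _ in range(n_fixed_point):            # fixed point for the implicit equation (7.4)
            Ynew = np.maximum(g(X[i - 1]), EY + f(t[i - 1], X[i - 1], Ynew) * dt)
        Y = Ynew
    return Y
```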

By direct adaptation of the proofs of [4], we obtain the following estimate of the discretization error. Notice that it is of the same order as in the non-reflected case.

Theorem 7.1 For all $p \ge 1$,

$$\limsup_{|\pi|\to 0}\ |\pi|^{-1/2}\ \sup_{0\le i\le n}\ \big\| Y_{t_i} - Y^\pi_{t_i} \big\|_{L^p} < \infty.$$

Proof. For $0 \le i \le n$, we denote by $\Theta_i$ the set of stopping times with values in $\{t_i, \dots, t_n = 1\}$, and we define:

$$R_{t_i} := \operatorname{ess\,sup}_{\tau \in \Theta_i}\ E_{t_i}\Big[ g(X_\tau) + \sum_{j=i}^{n-1} \mathbf{1}_{\tau > t_j}\, f(t_j, X_{t_j}, Y_{t_j})\,\Delta_{j+1} \Big],$$
$$L_{t_i} := \operatorname{ess\,sup}_{\tau \in \Theta_i}\ E_{t_i}\Big[ g(X_\tau) + \int_{t_i}^\tau f(s, X_s, Y_s)\,ds \Big], \qquad 0 \le i \le n.$$

From Lemma 2(a) in [4], we have:

$$\limsup_{|\pi|\to 0}\ |\pi|^{-1/2}\ \max_{0\le i\le n}\ \| L_{t_i} - Y_{t_i} \|_{L^p} < \infty. \qquad(7.5)$$

A straightforward adaptation of Lemma 5 in [4] leads to

$$\limsup_{|\pi|\to 0}\ |\pi|^{-1/2}\ \max_{0\le i\le n}\ \| R_{t_i} - L_{t_i} \|_{L^p} < \infty. \qquad(7.6)$$

In order to complete the proof, we shall show that:

$$\limsup_{|\pi|\to 0}\ |\pi|^{-1/2}\ \max_{0\le i\le n}\ \| Y^\pi_{t_i} - R_{t_i} \|_{L^p} < \infty. \qquad(7.7)$$

We first write $Y^\pi_{t_i}$ in its Snell envelope type representation:

$$Y^\pi_{t_i} := \operatorname{ess\,sup}_{\tau \in \Theta_i}\ E_{t_i}\Big[ g(X^\pi_\tau) + \sum_{j=i}^{n-1} \mathbf{1}_{\tau > t_j}\, f(t_j, X^\pi_{t_j}, Y^\pi_{t_j})\,\Delta_{j+1} \Big].$$

By the Lipschitz conditions on $f$ and $g$, we then estimate that:

$$\big| R_{t_i} - Y^\pi_{t_i} \big| \;\le\; C\, \operatorname{ess\,sup}_{\tau \in \Theta_i}\ E_{t_i}\big[ |X_\tau - X^\pi_\tau| \big] + C\, E_{t_i}\Big[ \max_{i\le j\le n} |X_{t_j} - X^\pi_{t_j}| + \sum_{j=i}^{n-1} \big( |Y_{t_j} - L_{t_j}| + |L_{t_j} - R_{t_j}| + |R_{t_j} - Y^\pi_{t_j}| \big)\,\Delta_{j+1} \Big].$$

It follows from the arbitrariness of $i \le n$ that, for each integer $i \le l \le n$:

$$E\big| R_{t_l} - Y^\pi_{t_l} \big| \;\le\; C\, E\Big[ \max_{i\le j\le n} |X_{t_j} - X^\pi_{t_j}| \Big] + C \sum_{j=l}^{n-1} \Delta_{j+1}\, E\big[ |Y_{t_j} - L_{t_j}| + |L_{t_j} - R_{t_j}| + |R_{t_j} - Y^\pi_{t_j}| \big].$$

Using the discrete-time version of Gronwall's Lemma, we therefore obtain:

$$E\big| R_{t_i} - Y^\pi_{t_i} \big| \;\le\; C\, E\Big[ \max_{i\le j\le n} |X_{t_j} - X^\pi_{t_j}| \Big] + C \sum_{j=i}^{n-1} \Delta_{j+1}\, E\big[ |Y_{t_j} - L_{t_j}| + |L_{t_j} - R_{t_j}| \big],$$

for some constant $C$ independent of $\pi$; the same estimate holds in $L^p$. (7.7) is then obtained by using (3.2)-(7.5)-(7.6).

7.2 A priori bounds on the discrete-time approximation

Consider the sequence of maps $\Lambda^u_i(x) = \alpha^\pi_i + \beta^\pi_i |x|^2$, $0 \le i \le n$, where the $(\alpha^\pi_i, \beta^\pi_i)$ are defined as in Lemma 3.3. Observe that $\Lambda^u_{i+1} \ge g$, $0 \le i \le n-1$. Let $\Lambda = \{(g, \Lambda^u_i)\}_{0\le i\le n}$. Then, it follows from the same arguments as in Lemma 3.3 that:

$$T^\Lambda_i\big( Y^\pi_{t_i} \big) = Y^\pi_{t_i}, \qquad 0 \le i \le n.$$

In particular, this induces similar bounds on $E_{t_{i-1}}[Y^\pi_{t_i}]$ by direct computations.

7.3 Simulation

As in the non-reflected case, we define the approximation $\hat Y$ of $Y^\pi$ by:

$$\hat Y_{t_n} = g(X^\pi_{t_n}),$$
$$\check Y_{t_{i-1}} = \hat E_{t_{i-1}}\big[ \hat Y_{t_i} \big] + f(t_{i-1}, X^\pi_{t_{i-1}}, \check Y_{t_{i-1}})\,\Delta_i, \qquad(7.8)$$
$$\hat Y_{t_{i-1}} = T^\Lambda_{i-1}\big( \check Y_{t_{i-1}} \big) = g(X^\pi_{t_{i-1}}) \vee \check Y_{t_{i-1}} \wedge \Lambda^u_{i-1}(X^\pi_{t_{i-1}}), \qquad 1 \le i \le n,$$

where $\hat E$ is some approximation of $E$. With this construction, the estimation of the regression error of Theorem 4.1 immediately extends to the context of the reflected backward SDE approximation. In particular, we obtain the same $L^p$ error estimate of the regression approximation as in the non-reflected case:

Theorem 7.2 Let $p \ge 1$ be given. Then, there is a constant $C > 0$, which only depends on $(K, p)$, such that:

$$\big\| \hat Y_{t_i} - Y^\pi_{t_i} \big\|_{L^p} \;\le\; \frac{C}{|\pi|}\ \max_{0\le j\le n-1}\ \big\| (\hat E_{t_j} - E_{t_j})[\hat Y_{t_{j+1}}] \big\|_{L^p}$$

for all $0 \le i \le n$.

From this theorem, we can now deduce an estimate of the $L^p$ error $\hat Y - Y^\pi$ in the case where $\hat E$ is defined as in Section 6.2. Let $\varphi \in \mathcal{L}$ satisfy

$$\sum_{k=0}^d \sum_{I \in \mathcal{J}_k} \int_{\mathbb{R}^d} |u|^{4p+2}\, |\partial_I \varphi(u)|^2\,du < \infty,$$

for some $p \ge 1$. Consider the approximation $\hat Y$ obtained by the above simulation scheme, where $\hat E$ is defined as in Section 6.2 with normalized localizing function $\varphi_\pi(x) = \varphi(|\pi|^{-1/2} x)$. Then, we have the following $L^p$ estimate of the error due to the regression approximation:

$$\limsup_{|\pi|\to 0}\ \max_{0\le i\le n}\ |\pi|^{p+d/4}\, N^{1/2}\, \big\| \hat Y_{t_i} - Y^\pi_{t_i} \big\|^p_{L^p} < \infty.$$

References

[1] F. Antonelli and A. Kohatsu-Higa (2000). Filtration stability of backward SDEs, Stochastic Analysis and Its Applications, 18, 11-37.
[2] V. Bally (1997). An approximation scheme for BSDEs and applications to control and nonlinear PDEs, Pitman Research Notes in Mathematics Series, 364, Longman.
[3] V. Bally and G. Pagès (2001). A quantization algorithm for solving multidimensional optimal stopping problems, preprint.
[4] V. Bally and G. Pagès (2002). Error analysis of the quantization algorithm for obstacle problems, preprint.
[5] D. Bosq (1998). Nonparametric Statistics for Stochastic Processes, Springer Verlag, New York.
[6] B. Bouchard (2000). Contrôle Stochastique Appliqué à la Finance, PhD thesis, Université Paris Dauphine.
[7] B. Bouchard, I. Ekeland, and N. Touzi (2002). On the Malliavin approach to Monte Carlo approximation of conditional expectations, preprint.
[8] Ph. Briand, B. Delyon, and J. Mémin (2001). Donsker-type theorem for BSDEs, Electronic Communications in Probability, 6, 1-14.
[9] J. F. Carriere (1996). Valuation of the Early-Exercise Price for Options using Simulations and Nonparametric Regression, Insurance: Mathematics and Economics, 19, 19-30.
[10] D. Chevance (1997). Numerical Methods for Backward Stochastic Differential Equations, in Numerical Methods in Finance, Eds. L.C.G. Rogers and D. Talay, Cambridge University Press, 232-244.
[11] E. Clément, D. Lamberton, and P. Protter (2002). An analysis of a least squares regression method for American option pricing, Finance and Stochastics, 6, 449-472.
[12] F. Coquet, V. Mackevičius, and J. Mémin (1998). Stability in D of martingales and backward equations under discretization of filtration, Stochastic Processes and their Applications, 75, 235-248.
[13] J. Douglas Jr., J. Ma, and P. Protter (1996). Numerical Methods for Forward-Backward Stochastic Differential Equations, Annals of Applied Probability, 6, 940-968.
[14] N. El Karoui, C. Kapoudjian, E. Pardoux, S. Peng, and M.C. Quenez (1997). Reflected solutions of backward stochastic differential equations and related obstacle problems for PDEs, Annals of Probability, 25, 702-737.
[15] E. Fournié, J.-M. Lasry, J. Lebuchoux, and P.-L. Lions (2001). Applications of Malliavin calculus to Monte Carlo methods in finance II, Finance and Stochastics, 5, 201-236.
[16] E. Fournié, J.-M. Lasry, J. Lebuchoux, P.-L. Lions, and N. Touzi (1999). Applications of Malliavin calculus to Monte Carlo methods in finance, Finance and Stochastics, 3, 391-412.
[17] I. Karatzas and S.E. Shreve (1988). Brownian Motion and Stochastic Calculus, Graduate Texts in Mathematics, 113, Springer-Verlag, New York.
[18] P.E. Kloeden and E. Platen (1992). Numerical Solution of Stochastic Differential Equations, Springer-Verlag.
[19] A. Kohatsu-Higa and R. Pettersson (2001). Variance reduction methods for simulation of densities on Wiener space, SIAM Journal on Numerical Analysis, to appear.
[20] P.-L. Lions and H. Régnier (2001). Calcul du prix et des sensibilités d'une option américaine par une méthode de Monte Carlo, preprint.
[21] F.A. Longstaff and E.S. Schwartz (2001). Valuing American Options By Simulation: A Simple Least-Squares Approach, Review of Financial Studies, 14, 113-147.
[22] J. Ma, P. Protter, J. San Martin, and S. Torres (2002). Numerical Method for Backward Stochastic Differential Equations, Annals of Applied Probability, 12(1), 302-316.
[23] J. Ma, P. Protter, and J. Yong (1994). Solving forward-backward stochastic differential equations explicitly - a four step scheme, Probability Theory and Related Fields, 98, 339-359.
[24] J. Ma and J. Yong (1999). Forward-Backward Stochastic Differential Equations and Their Applications, Lecture Notes in Mathematics, 1702, Springer.
[25] D. Nualart (1995). The Malliavin Calculus and Related Topics, Springer Verlag, Berlin.
[26] G. Pagès and H. Pham (2002). A quantization algorithm for multidimensional stochastic control problems, preprint.
[27] J. Zhang (2001). Some fine properties of backward stochastic differential equations, PhD thesis, Purdue University.
[28] J. Zhang (2001). A Numerical Scheme for Backward Stochastic Differential Equations: Approximation by Step Processes, preprint.

15 E Fourné, J-M Lasry, J Lebuchoux, and P-L Lons 2001 Applcaons of Mallavn calculus o Mone Carlo mehods n fnance II, Fnance and Sochascs, 5, 201-236 16 E Fourné, J-M Lasry, J Lebuchoux, P-L Lons, and N Touz 1999 Applcaons of Mallavn calculus o Mone Carlo mehods n fnance, Fnance and Sochascs, 3, 391-412 17 I Karazas and SE Shreve 1988 Brownan Moon and Sochasc Calculus Graduae Texs n Mahemacs, 113, Sprnger-Verlag, New York 18 PE Kloeden and E Plaen 1992 Numercal Soluon of Sochasc Dfferenal Equaons Sprnger-Verlag 19 A Kohasu-Hga and R Peersson 2001 Varance reducon mehods for smulaon of denses on Wener space, SIAM Journal on Numercal Analyss, o appear 20 P-L Lons and H Regner 2001 Calcul du prx e des sensblés d une opon amércane par une méhode de Mone Carlo, preprn 21 F A Longsaff and R S Schwarz 2001 Valung Amercan Opons By Smulaon : A smple Leas-Square Approach, Revew of Fnancal Sudes, 14, 113-147 22 J Ma, P Proer, J San Marn, and S Torres 2002 Numercal Mehod for Backward Sochasc Dfferenal Equaons, Annals of Appled Probably, 12 1, 302-316 23 J Ma, P Proer, and J Yong 1994 Solvng forward-backward sochasc dfferenal equaons explcly - a four sep scheme, Probably Theory and Relaed Felds, 98, 339-359 24 J Ma and J Yong 1999 Forward-Backward Sochasc Dfferenal Equaons and Ther Applcaons, Lecure Noes n Mah, 1702, Sprnger 25 D Nualar 1995 The Mallavn Calculus and Relaed Topcs Sprnger Verlag, Berln 26 G Pagès and H Pham 2002 A quanzaon algorhm for muldmensonal sochasc conrol problems, preprn 27 J Zhang 2001 Some fne properes of backward sochasc dfferenal equaons, PhD hess, Purdue Unversy 28 J Zhang 2001 A Numercal Scheme for Backward Sochasc Dfferenal Equaons : Approxmaon by Sep Processes, preprn 30