Two new developments in Multilevel Monte Carlo methods

Two new developments in Multilevel Monte Carlo methods

Mike Giles, Abdul-Lateef Haji-Ali, Oliver Sheridan-Methven
Mathematical Institute, University of Oxford

Rob Scheichl's Farewell Conference
University of Bath, November 13th, 2018

MLMC: the early days

Jan 2006: the idea for MLMC for SDEs came while on holiday; the paper was written by July 2006, with an MCQMC conference presentation in September.

Summer 2006: already thinking about PDEs, so talked to Tom Hou (Caltech), but that didn't lead to anything (his postdoc wasn't interested); also talked to Ian Sloan (UNSW), which led to a visit and a collaboration on MCQMC.

By 2007: very keen to pursue MLMC for PDEs, so approached Rob at ICIAM 2007 in Zurich. A year later we started working together, obtained EPSRC funding with Andrew Cliffe, then Aretha joined in 2009, and the rest is history.

My current MLMC research

Matteo Croci, Patrick Farrell: coupled generation of stochastic fields on non-nested grids by solving an elliptic PDE driven by spatial white noise,
$$(-\nabla^2 + k^2)^q\, u = \dot{W}.$$
This is an extension of nested-grid research by Rob, based on prior research which Chris Farmer told us about. It is working well for MLMC and is currently being extended to MLQMC.

Abdul-Lateef Haji-Ali: nested expectation.

Oliver Sheridan-Methven: approximate random variables and reduced precision computing.

Nested expectations

The concern here is estimation of $\mathbb{E}\big[\, f\big( \mathbb{E}[X \mid Y] \big) \,\big]$ where
- $Y$ is multi-dimensional,
- $X$ can be either scalar or multi-dimensional.

We will assume that $X \mid Y$ can be simulated exactly at average $O(1)$ cost; in complex cases this requires use of Rhee & Glynn's randomised MLMC for unbiased estimation.

Applications differ in the smoothness of $f$:
- $C^2$: the nice case, e.g. $x^2$
- continuous, piecewise $C^2$: the more typical case, e.g. $\max(x, 0)$
- discontinuous: the nasty case, e.g. the Heaviside function $H(x)$

Nested simulation

MLMC for the nice case is quite natural:
- on level $l$, use $2^l$ samples to compute $X^{(f)} \approx \mathbb{E}[X \mid Y]$
- on the coupled level $l-1$, split these into two subsets of $2^{l-1}$ and average each to get $X^{(c1)}, X^{(c2)}$
- the MLMC antithetic correction estimator is then
$$Z = f(X^{(f)}) - \tfrac{1}{2}\big( f(X^{(c1)}) + f(X^{(c2)}) \big)$$
- if $f(x)$ is linear then $Z = 0$; with a bounded second derivative,
$$Z = O\big( (X^{(c1)} - X^{(c2)})^2 \big) = O(2^{-l}),$$
so the variance is $O(2^{-2l})$ while the cost is $O(2^l)$; hence $\beta = 2$, $\gamma = 1$ in the MLMC theorem, giving $O(\varepsilon^{-2})$ complexity.
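As a concrete illustration (an assumption-laden sketch rather than the talk's code), the following Python function returns one level-$l$ antithetic correction sample; `f`, `sample_Y` and `sample_X_given_Y` are hypothetical placeholders for the payoff, the outer sampler and the conditional sampler.

```python
import numpy as np

def mlmc_correction_sample(l, f, sample_Y, sample_X_given_Y, rng):
    """One antithetic MLMC correction sample Z on level l (l >= 1).

    Fine estimator: average of 2^l conditional samples of X given Y.
    Coarse estimators: averages of the two halves of the same samples.
    """
    y = sample_Y(rng)
    x = sample_X_given_Y(y, 2**l, rng)        # array of 2^l conditional samples
    xf = x.mean()                             # fine average over 2^l samples
    xc1 = x[: 2**(l-1)].mean()                # first half, 2^(l-1) samples
    xc2 = x[2**(l-1):].mean()                 # second half, 2^(l-1) samples
    return f(xf) - 0.5*(f(xc1) + f(xc2))      # antithetic correction Z

# Illustrative usage with f(x) = x^2 (the "nice" C^2 case):
rng = np.random.default_rng(0)
sample_Y = lambda rng: rng.normal()
sample_X_given_Y = lambda y, n, rng: y + rng.normal(size=n)
Z = [mlmc_correction_sample(4, lambda x: x**2, sample_Y, sample_X_given_Y, rng)
     for _ in range(1000)]
print(np.mean(Z), np.var(Z))
```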

Nested simulation

What if $f(x) = \max(x, 0)$?
- $Z = 0$ if $X^{(c1)}, X^{(c2)}$ have the same sign
- if $\mathbb{E}[X \mid Y]$ has a bounded density near 0, then there is an $O(2^{-l/2})$ probability of $X^{(c1)}, X^{(c2)}$ having different signs, and in that case $Z = O(2^{-l/2})$
- the MLMC correction variance is $O(2^{-3l/2})$ while the cost is $O(2^l)$; hence $\beta = 3/2$, $\gamma = 1$ in the MLMC theorem, giving $O(\varepsilon^{-2})$ complexity.

Nested simulation

Three major applications:

Financial credit modelling (Bujok, Hambly, Reisinger, 2013): continuous, piecewise linear $f(x)$; first proof of $O(2^{-3l/2})$ variance.

Expected Value of Partial Perfect Information (G, Goda, 2018):
$$\mathbb{E}\Big[ \max_d\, \mathbb{E}[f_d(X) \mid Y] \Big] - \max_d\, \mathbb{E}[f_d(X)]$$

Conditional value-at-risk (G, Haji-Ali, 2018):
$$\min_x \Big\{ x + \tfrac{1}{\eta}\, \mathbb{E}\big[ \max(X - x,\, 0) \big] \Big\}$$
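For the conditional value-at-risk formulation above, a plain (single-level) Monte Carlo sketch of the minimisation over $x$ looks as follows; the loss distribution and tail level $\eta$ are illustrative assumptions, not values from the talk.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Minimal sketch of the CVaR objective min_x { x + E[max(X - x, 0)] / eta },
# estimated with plain Monte Carlo; the loss samples X are placeholders.
rng = np.random.default_rng(1)
X = rng.normal(size=100_000)          # placeholder loss samples
eta = 0.05                            # tail probability level (assumed)

def objective(x):
    return x + np.mean(np.maximum(X - x, 0.0)) / eta

res = minimize_scalar(objective)      # minimiser approximates VaR, minimum approximates CVaR
print("CVaR estimate:", res.fun, "VaR estimate:", res.x)
```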

Nested simulation

Finally, the nasty case $f(x) = H(x)$:
- $Z = 0$ if $X^{(c1)}, X^{(c2)}$ have the same sign
- if $\mathbb{E}[X \mid Y]$ has a bounded density near 0, then there is an $O(2^{-l/2})$ probability of $X^{(c1)}, X^{(c2)}$ having different signs, and in that case $Z = O(1)$
- the MLMC correction variance is $O(2^{-l/2})$ while the cost is still $O(2^l)$; hence $\beta = 1/2$, $\gamma = 1$ in the MLMC theorem; additional analysis gives $\alpha = 1$, so $O(\varepsilon^{-5/2})$ complexity.

By adaptively changing the number of inner samples (with minimum $2^l$ and maximum $2^{2l}$ on level $l$), this can be improved to $\beta = 1$, $\gamma = 1$ and $O(\varepsilon^{-2} |\log \varepsilon|^2)$ complexity.

The analysis was challenging but satisfying (G, Haji-Ali, 2018).

Approximate random variables

In some applications, generating the random numbers can be a significant cost, especially with QMC when inverting the CDF, e.g. the Poisson distribution, increments of a Lévy process, or the non-central chi-squared distribution (CIR model).

Even with Normal random variables, the cost of converting a uniform r.v. to a Normal is non-trivial for vector implementations.

This has led to previous research by Hefter, Mayer and Ritter at TU Kaiserslautern.

Approximate random variables

Simplest example: Euler-Maruyama approximation of a scalar SDE,
$$\hat X_{t_{m+1}} = \hat X_{t_m} + a(\hat X_{t_m})\, h + b(\hat X_{t_m})\, \sqrt{h}\, Z_m,$$
where $Z_m$ is a unit Normal r.v. generated as $Z_m = \Phi^{-1}(U_m)$, with $U_m$ a uniform $(0,1)$ r.v.; this leads to a Monte Carlo approximation of $\mathbb{E}[f(X_T)]$.

Suppose instead we use approximate Normals $\tilde Z_m$ generated by $\tilde Z_m = Q(U_m)$, where $Q$ is a piecewise constant approximation to $\Phi^{-1}$ on $2^q$ uniformly-sized sub-intervals.

Approximate random variables

[Figure: the exact inverse CDF $\Phi^{-1}(U)$ versus its piecewise constant approximation $Q(U)$, plotted for $U \in (0, 1)$ and $Z \in (-3, 3)$.]

Note: only the first $q$ bits of $U_m$ and a lookup table are needed to calculate $\tilde Z_m$, and analysis proves that
$$\mathbb{E}[(Z - \tilde Z)^2] = O(q^{-1}\, 2^{-q}).$$
(Alternatively, one could use a polynomial approximation to $\Phi^{-1}$.)
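A minimal Python sketch of this construction (my own illustration, using midpoint table values rather than whatever the talk's implementation uses): build a $2^q$-entry table for $Q$ and index it with the first $q$ bits of $U$.

```python
import numpy as np
from scipy.stats import norm

q = 8                                     # number of bits, table size 2^q
n_bins = 2**q

# Table entry k approximates Phi^{-1} on [k/2^q, (k+1)/2^q); here we use the
# midpoint value (the conditional mean on each bin is another natural choice).
u_mid = (np.arange(n_bins) + 0.5) / n_bins
table = norm.ppf(u_mid)

def approx_normal(u):
    """Piecewise constant approximation Q(u) of Phi^{-1}(u) via table lookup."""
    k = np.minimum((u * n_bins).astype(int), n_bins - 1)   # first q bits of u
    return table[k]

# Empirical check of E[(Z - Ztilde)^2]:
rng = np.random.default_rng(0)
u = rng.random(10**6)
z_exact, z_approx = norm.ppf(u), approx_normal(u)
print("MSE:", np.mean((z_exact - z_approx)**2))
```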

Approximate random variables

For an SDE approximation with a specified number of timesteps, a two-level MLMC approach gives
$$\mathbb{E}[\hat P] = \mathbb{E}[\tilde P] + \mathbb{E}[\hat P - \tilde P].$$

Further analysis proves that, for a Lipschitz function $P \equiv f(X_T)$,
$$\mathbb{E}[(\hat P - \tilde P)^2] = O\big( \mathbb{E}[(Z - \tilde Z)^2] \big),$$
so this can lead to big savings if $\mathbb{E}[(Z - \tilde Z)^2] \ll 1$ and $Q(U_m)$ is much cheaper to evaluate than $\Phi^{-1}(U_m)$.

Approximate random variables

How does this work in combination with standard MLMC? Answer: nested MLMC,
$$\mathbb{E}[P_L] = \mathbb{E}[P_0] + \sum_{l=1}^{L} \mathbb{E}[P_l - P_{l-1}]$$
$$= \mathbb{E}[\tilde P_0] + \mathbb{E}[P_0 - \tilde P_0] + \sum_{l=1}^{L} \Big\{ \mathbb{E}[\tilde P_l - \tilde P_{l-1}] + \mathbb{E}\big[ (P_l - P_{l-1}) - (\tilde P_l - \tilde P_{l-1}) \big] \Big\}.$$

The pair $(\tilde P_l, \tilde P_{l-1})$ is generated in the same way as $(P_l, P_{l-1})$, just replacing the exact $Z_m$ by the approximate $\tilde Z_m$.
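To illustrate the structure of the mixed level-$l$ term above, here is a hedged Python sketch (my own illustration, not the talk's implementation): exact and approximate normals are built from the same uniforms, and the fine/coarse paths share Brownian increments in the usual MLMC way. The drift `a`, diffusion `b`, payoff and precision parameter `q` are placeholder choices.

```python
import numpy as np
from scipy.stats import norm

def coupled_euler(x0, a, b, T, nf, z):
    """Fine (nf steps) and coarse (nf/2 steps) Euler-Maruyama paths driven by
    the same fine-level normals z; coarse increments sum pairs of fine ones."""
    hf = T / nf
    dw = np.sqrt(hf) * z
    xf = x0
    for m in range(nf):
        xf = xf + a(xf)*hf + b(xf)*dw[m]
    xc = x0
    for m in range(nf // 2):
        xc = xc + a(xc)*2*hf + b(xc)*(dw[2*m] + dw[2*m+1])
    return xf, xc

def mixed_difference_sample(l, f, a, b, x0, T, q, rng):
    """One sample of (P_l - P_{l-1}) - (Ptilde_l - Ptilde_{l-1}), with exact
    and approximate normals generated from the same uniforms."""
    nf = 2**l
    u = rng.random(nf)
    z_exact = norm.ppf(u)
    table = norm.ppf((np.arange(2**q) + 0.5) / 2**q)             # lookup table for Q
    z_approx = table[np.minimum((u * 2**q).astype(int), 2**q - 1)]
    pf, pc = (f(x) for x in coupled_euler(x0, a, b, T, nf, z_exact))
    pf_a, pc_a = (f(x) for x in coupled_euler(x0, a, b, T, nf, z_approx))
    return (pf - pc) - (pf_a - pc_a)

# Illustrative usage with geometric Brownian motion and a call-style payoff:
rng = np.random.default_rng(0)
a = lambda x: 0.05*x
b = lambda x: 0.2*x
samples = [mixed_difference_sample(5, lambda x: max(x - 1.0, 0.0),
                                   a, b, 1.0, 1.0, 10, rng) for _ in range(2000)]
print("variance of mixed difference:", np.var(samples))
```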

Approximate random variables

Numerical analysis: a work-in-progress. Path differences?
$$X_l - X_{l-1} \approx h^{1/2}, \qquad \tilde X_l - \tilde X_{l-1} \approx h^{1/2},$$
$$X_l - \tilde X_l \approx \big(\mathbb{E}[(Z - \tilde Z)^2]\big)^{1/2},$$
$$(X_l - X_{l-1}) - (\tilde X_l - \tilde X_{l-1}) \approx h^{1/2}\, \big(\mathbb{E}[(Z - \tilde Z)^2]\big)^{1/2}.$$

Approximate random variables

MLMC variances? For smooth payoff functions $f(x)$, it looks possible to prove
$$\mathbb{V}\big[(P_l - P_{l-1}) - (\tilde P_l - \tilde P_{l-1})\big] \lesssim h\, \mathbb{E}[(Z - \tilde Z)^2].$$

For continuous, piecewise smooth functions (e.g. put/call payoffs) the analysis is much tougher. Numerical results suggest
$$\mathbb{V}\big[(P_l - P_{l-1}) - (\tilde P_l - \tilde P_{l-1})\big] \lesssim \min\Big\{ h^{1/2}\, \mathbb{E}[(Z - \tilde Z)^2],\ h\, \big(\mathbb{E}[(Z - \tilde Z)^2]\big)^{1/2} \Big\},$$
but it looks tough to prove better than
$$\mathbb{V}\big[(P_l - P_{l-1}) - (\tilde P_l - \tilde P_{l-1})\big] \lesssim \min\Big\{ \mathbb{E}[(Z - \tilde Z)^2],\ h\, \big(\mathbb{E}[(Z - \tilde Z)^2]\big)^{1/3} \Big\}.$$

Approximate random variables: numerical results

[Figure: "Convergence between QMLMC levels". Dimensionless variances of $\hat P_l - \hat P_{l-1}$ and of $(\hat P_l - \hat P_{l-1}) - (\tilde P_l - \tilde P_{l-1})$ plotted against level $l$ (0 to 7) for decreasing precision parameter $p$. Ordinary-least-squares fits give rates of roughly $O(2^{-(2.0 \pm 0.004)l})$ and $O(2^{-(1.46 \pm 0.05)l})$, with individual curves ranging from about $O(2^{-(0.6 \pm 0.6)l})$ to $O(2^{-(1.72 \pm 0.03)l})$.]

Reduced precision arithmetic

Further computational savings can be achieved by using reduced precision arithmetic. Previous research at TU Kaiserslautern (Korn, Ritter, Wehn and others) and Imperial College (Luk and others) has used FPGAs with complete control over the precision, but I prefer GPUs.

In the latest NVIDIA GPUs, half precision (FP16) is twice as fast as single precision (FP32), which in turn is 2-8 times faster than double precision (FP64).

In most cases, single precision is perfectly sufficient for calculating $P$; half precision can be used for $\tilde P$. MC averaging should probably be done in double precision in both cases.

Reduced precision arithmetic

Very important: to ensure that the telescoping sum is respected, exactly the same computations must be performed for $\tilde P$ whether it appears on its own or in calculating $P - \tilde P$.

The effect of half-precision arithmetic can be modelled as
$$\hat X_{t_{m+1}} = \hat X_{t_m} + a(\hat X_{t_m})\, h + b(\hat X_{t_m})\, \sqrt{h}\, Z_m + \delta\, \hat X_{t_m}\, V_m,$$
where $\delta \approx 10^{-3}$ and the $V_m$ are i.i.d. unit variance random variables.

Overall, this leads to an $O(\delta^2 / h)$ increase in the variance; if this increases it too much, reduced precision should not be used.
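To get a feel for the size of this effect, here is a hedged sketch (not the talk's GPU code) that couples the same Euler-Maruyama path in double and half precision in NumPy; the geometric Brownian motion parameters are placeholders.

```python
import numpy as np

# Couple identical Euler-Maruyama paths in float64 and float16, driven by the
# same normals, to observe the size of the rounding perturbation.
rng = np.random.default_rng(2)
mu, sigma, T, n_steps, n_paths = 0.05, 0.2, 1.0, 256, 10_000
h = T / n_steps
z = rng.standard_normal((n_paths, n_steps))

x64 = np.full(n_paths, 1.0, dtype=np.float64)
x16 = np.full(n_paths, 1.0, dtype=np.float16)
for m in range(n_steps):
    dw = np.sqrt(h) * z[:, m]
    x64 = x64 + mu*x64*h + sigma*x64*dw
    x16 = (x16 + x16*np.float16(mu*h)
               + x16*np.float16(sigma)*dw.astype(np.float16)).astype(np.float16)

diff = x64 - x16.astype(np.float64)
print("RMS difference between precisions:", np.sqrt(np.mean(diff**2)))
print("variance of the precision correction:", np.var(diff))
```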

Conclusions

I continue to look for new ways in which to use MLMC.

I'm very happy with the new nested simulation MLMC; Abdul will continue this line of research at Heriot-Watt, and I think there's more to be done (probably by Abdul) with adaptive sampling within MLMC in other contexts.

I'm now excited about the new research using approximate distributions; I think this has interesting potential for the future.

References

M. Croci, MBG, M.E. Rognes, P.E. Farrell. Efficient white noise sampling and coupling for multilevel Monte Carlo with non-nested meshes. To appear in SIAM/ASA Journal on Uncertainty Quantification, 2018/9.

MBG, A.-L. Haji-Ali. Multilevel nested simulation for efficient risk estimation. arXiv preprint 1802.05016.

MBG, T. Goda. Decision-making under uncertainty: using MLMC for efficient estimation of EVPPI. To appear in Statistics and Computing, 2018/9.

MBG, M. Hefter, L. Mayer, K. Ritter. Random bit quadrature and approximation of distributions on Hilbert spaces. Foundations of Computational Mathematics, online March 2018.

MBG, M. Hefter, L. Mayer, K. Ritter. Random bit multilevel algorithms for stochastic differential equations. arXiv preprint 1808.10623.