
Analysis of the stability and accuracy of multivariate polynomial approximation by discrete least squares with evaluations in random or low-discrepancy point sets

Giovanni Migliorati, MATHICSE-CSQI, École Polytechnique Fédérale de Lausanne

Analysis with random points: joint work with Fabio Nobile (EPFL), Raul Tempone (KAUST), Albert Cohen (UPMC), Abdellah Chkifa (UPMC) and Erik von Schwerin (KTH).
Analysis with low-discrepancy points: joint work with Fabio Nobile.

ICERM - Brown University, Providence - September 23rd, 2014

Outline
1 Discrete least squares on multivariate polynomial spaces
2 Stability and accuracy with evaluations in random points
3 Stability and accuracy with evaluations in low-discrepancy point sets
4 Conclusions

1 Discrete least squares on multivariate polynomial spaces

Notation and definitions

For any $d \ge 1$, let $\Gamma := [-1,1]^d$ and, for any real numbers $\alpha, \beta > -1$, define
$$\rho(y) := B(\alpha,\beta)^{-d} \prod_{i=1}^{d} (1-y_i)^{\alpha} (1+y_i)^{\beta}, \qquad y \in \Gamma,$$
where $B(\alpha,\beta)$ is the normalization constant making $\rho$ a probability density, and
$$\langle f_1, f_2\rangle_{L^2_\rho(\Gamma)} := \int_\Gamma f_1(y)\, f_2(y)\, \rho(y)\, dy, \qquad \langle f_1, f_2\rangle_M := \frac{1}{M} \sum_{m=1}^{M} f_1(y^m)\, f_2(y^m),$$
$$\|\cdot\|_{L^2_\rho} := \langle\cdot,\cdot\rangle_{L^2_\rho}^{1/2}, \qquad \|\cdot\|_M := \langle\cdot,\cdot\rangle_M^{1/2},$$
with $y^1,\ldots,y^M$ being any points in $\Gamma$, either realizations of i.i.d. random variables $Y^1,\ldots,Y^M \sim \rho$ or deterministically given (e.g. low-discrepancy point sets).

Given univariate $L^2_\rho$-orthonormal polynomials $(\varphi_k)_{k\ge 0}$ and a multi-index set $\Lambda \subset \mathbb{N}_0^d$, for any $\nu \in \Lambda$ we define
$$\psi_\nu(y) := \prod_{i=1}^{d} \varphi_{\nu_i}(y_i), \quad y \in \Gamma, \qquad \mathbb{P}_\Lambda := \mathrm{span}\{\psi_\nu : \nu \in \Lambda\}.$$

Markov and Nikolskii inequalities for multivariate polynomials with downward closed multi-index sets

Definition (Downward closed multi-index set). $\Lambda$ is downward closed if ($\nu \in \Lambda$ and $\nu' \le \nu$) $\Rightarrow$ $\nu' \in \Lambda$.

Lemma (M. 2014). In any dimension, for any downward closed $\Lambda$ and any $\alpha, \beta \in \mathbb{N}_0$ it holds
$$\|u\|^2_{L^\infty(\Gamma)} \le (\#\Lambda)^{2\max\{\alpha,\beta\}+2}\, \|u\|^2_{L^2_\rho(\Gamma)}, \qquad u \in \mathbb{P}_\Lambda(\Gamma).$$

Lemma (M. 2014). In any dimension and for any downward closed $\Lambda$, when $\alpha = \beta = 0$ (Legendre polynomials) it holds
$$\left\|\frac{\partial^d u}{\partial y_1 \cdots \partial y_d}\right\|^2_{L^2_\rho(\Gamma)} \le 4^d\, (\#\Lambda)^4\, \|u\|^2_{L^2_\rho(\Gamma)}, \qquad u \in \mathbb{P}_\Lambda(\Gamma).$$
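
To make the downward-closedness condition concrete, here is a small Python sketch, not from the talk (the function name and the test sets are illustrative), that checks whether a finite multi-index set, given as a list of tuples, is downward closed.

```python
# Small utility sketch, not from the talk (names illustrative): checking that a
# finite multi-index set, given as a list of d-tuples, is downward closed, i.e.
# nu in Lambda and nu' <= nu componentwise imply nu' in Lambda.
def is_downward_closed(Lambda):
    index_set = set(Lambda)
    for nu in Lambda:
        # it suffices to check the immediate predecessors nu - e_i
        for i, k in enumerate(nu):
            if k > 0:
                pred = nu[:i] + (k - 1,) + nu[i + 1:]
                if pred not in index_set:
                    return False
    return True

print(is_downward_closed([(0, 0), (1, 0), (0, 1), (1, 1)]))   # True
print(is_downward_closed([(0, 0), (2, 0)]))                   # False: (1, 0) is missing
```

Checking only the immediate predecessors suffices, since any smaller multi-index can be reached from $\nu$ by unit decrements that all stay inside a downward closed set.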

Discrete least squares on polynomial spaces

For any smooth (analytic) real-valued (or Hilbert-valued) function $\phi : \Gamma \to \mathbb{R}$, we define its continuous and discrete $L^2$ projections onto $\mathbb{P}_\Lambda$ as
$$\Pi_\Lambda \phi := \operatorname*{argmin}_{v \in \mathbb{P}_\Lambda} \|\phi - v\|_{L^2_\rho}, \qquad \Pi^M_\Lambda \phi := \operatorname*{argmin}_{v \in \mathbb{P}_\Lambda} \|\phi - v\|_M.$$

Algebraic formulation: design matrix $[D]_{ij} = \psi_j(y^i)$, right-hand side $[b]_i = \phi(y^i)$, for $i = 1,\ldots,M$ and $j = 1,\ldots,\#\Lambda$. Normal equations: $D^\top D \beta = D^\top b$, with $\beta$ containing the coefficients of the expansion $\Pi^M_\Lambda \phi = \sum_{\nu\in\Lambda} \beta_\nu \psi_\nu$. We also define the matrix $G := D^\top D / M$.
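
The algebraic formulation above translates directly into a few lines of linear algebra. The following Python sketch is not part of the talk; the function names, the index set Lambda and the test function phi are illustrative choices. It assembles the design matrix for a tensorized orthonormal Legendre basis, forms $G = D^\top D/M$, and solves the normal equations.

```python
# Minimal numerical sketch, not from the talk: the discrete least-squares projection
# described above for d = 2, with a tensorized Legendre basis orthonormal w.r.t. the
# uniform density on [-1, 1]^2. Function names, the index set Lambda and the test
# function phi are illustrative choices.
import numpy as np
from numpy.polynomial.legendre import legval

def legendre_orthonormal(k, x):
    """Degree-k Legendre polynomial, normalized in L^2_rho with rho = 1/2 on [-1, 1]."""
    c = np.zeros(k + 1)
    c[k] = 1.0
    return legval(x, c) * np.sqrt(2 * k + 1)

def design_matrix(points, Lambda):
    """[D]_{ij} = psi_j(y^i), with psi_nu the tensorized orthonormal basis."""
    D = np.empty((points.shape[0], len(Lambda)))
    for j, nu in enumerate(Lambda):
        D[:, j] = np.prod(
            [legendre_orthonormal(k, points[:, i]) for i, k in enumerate(nu)], axis=0)
    return D

Lambda = [(i, j) for i in range(4) for j in range(4 - i)]   # total degree <= 3, d = 2
rng = np.random.default_rng(0)
M = 20 * len(Lambda)                        # generous oversampling (illustrative)
Y = rng.uniform(-1.0, 1.0, size=(M, 2))     # i.i.d. samples from the uniform density
phi = lambda y: np.exp(y[:, 0] * y[:, 1])   # smooth test function (illustrative)

D = design_matrix(Y, Lambda)
b = phi(Y)
G = D.T @ D / M                             # G = D^T D / M
beta = np.linalg.solve(D.T @ D, D.T @ b)    # normal equations D^T D beta = D^T b
print("cond(G) =", np.linalg.cond(G))
print("discrete residual:", np.linalg.norm(D @ beta - b) / np.sqrt(M))
```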

Optimality of discrete least squares in the $L^2_\rho$ norm

In any dimension, with any index set $\Lambda$ and any $\rho$ with bounded support:

Proposition (M., Nobile, von Schwerin and Tempone, FoCM 2014). For any (random or deterministic) choice of $M$ points in $\Gamma$ it holds
$$\|\phi - \Pi^M_\Lambda \phi\|_{L^2_\rho} \le \left(1 + \sqrt{\|G^{-1}\|}\right) \inf_{v\in\mathbb{P}_\Lambda} \|\phi - v\|_{L^\infty}.$$

Theorem (M., Nobile, von Schwerin and Tempone, FoCM 2014). Given $M$ points in $\Gamma$, being realizations of random variables independent and identically distributed w.r.t. $\rho$, it holds
$$\lim_{M\to+\infty} \|G^{-1}\| = \lim_{M\to+\infty} \|G\| = 1, \qquad \text{almost surely}.$$

Proposition (M., Nobile, von Schwerin and Tempone, FoCM 2014). $\mathrm{cond}(G) = \|G\|\, \|G^{-1}\|$.

Norm equivalence on $\mathbb{P}_\Lambda$ (case of random points)

Find $\delta \in (0,1)$ such that, with high probability,
$$(1-\delta)\, \|v\|^2_{L^2_\rho} \le \|v\|^2_M \le (1+\delta)\, \|v\|^2_{L^2_\rho}, \qquad \forall\, v \in \mathbb{P}_\Lambda.$$

Since $\|v\|^2_M = M^{-1} \langle D\mathbf{v}, D\mathbf{v}\rangle_{\mathbb{R}^{M}} = \langle G\mathbf{v}, \mathbf{v}\rangle_{\mathbb{R}^{\#\Lambda}}$ and $\|v\|^2_{L^2_\rho} = \langle \mathbf{v}, \mathbf{v}\rangle_{\mathbb{R}^{\#\Lambda}}$ (with $\mathbf{v}\in\mathbb{R}^{\#\Lambda}$ the coefficient vector of $v$ in the basis $(\psi_\nu)_{\nu\in\Lambda}$), the matrix $G$ satisfies
$$\|G\| = \sup_{v\in\mathbb{P}_\Lambda\setminus\{0\}} \frac{\|v\|^2_M}{\|v\|^2_{L^2_\rho}}, \qquad \|G^{-1}\| = \sup_{v\in\mathbb{P}_\Lambda\setminus\{0\}} \frac{\|v\|^2_{L^2_\rho}}{\|v\|^2_M}.$$

Hence, norm equivalence on $\mathbb{P}_\Lambda$ holds w.h.p. iff the concentration bounds
$$1-\delta \le \|G\| \le 1+\delta, \qquad \frac{1}{1+\delta} \le \|G^{-1}\| \le \frac{1}{1-\delta}, \qquad \|G - I\| \le \delta$$
hold, again with high probability.
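
As a numerical illustration of this norm equivalence (again an illustrative sketch, not from the slides), one can watch $\|G - I\|$ shrink as $M$ grows for i.i.d. uniform points and a univariate orthonormal Legendre basis:

```python
# Illustrative check, not from the slides: for i.i.d. uniform points on [-1, 1] and
# the orthonormal Legendre basis of degree <= 10 (d = 1, #Lambda = 11), the spectral
# norm ||G - I|| shrinks as M grows, i.e. the discrete norm ||.||_M becomes
# equivalent to ||.||_{L^2_rho} on P_Lambda with a delta that tends to 0.
import numpy as np
from numpy.polynomial.legendre import legvander

n = 11                                    # number of basis functions (#Lambda)
scale = np.sqrt(2 * np.arange(n) + 1)     # orthonormalization w.r.t. rho = 1/2
rng = np.random.default_rng(1)

for M in [10**2, 10**3, 10**4, 10**5]:
    y = rng.uniform(-1.0, 1.0, size=M)
    D = legvander(y, n - 1) * scale       # [D]_{ij} = psi_j(y^i)
    G = D.T @ D / M
    delta = np.linalg.norm(G - np.eye(n), ord=2)
    print(f"M = {M:6d}   ||G - I||_2 = {delta:.3f}")
```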

2 Stability and accuracy with evaluations in random points

Given any $L^2_\rho$-orthonormal polynomial basis $(\psi_\nu)_{\nu\in\Lambda}$ of $\mathbb{P}_\Lambda$, define
$$K(\Lambda) := \sup_{y\in\Gamma} \sum_{\nu\in\Lambda} |\psi_\nu(y)|^2 = \sup_{v\in\mathbb{P}_\Lambda} \frac{\|v\|^2_{L^\infty}}{\|v\|^2_{L^2_\rho}}.$$

Lemma (Chkifa, Cohen, M., Nobile and Tempone, 2013). In any dimension and for any downward closed $\Lambda$ it holds $K(\Lambda) \le (\#\Lambda)^{\ln 3/\ln 2}$ with tensorized Chebyshev polynomials of the first kind.

Lemma (M. 2014). In any dimension, for any downward closed $\Lambda$ and any $\alpha, \beta \in \mathbb{N}_0$ it holds $K(\Lambda) \le (\#\Lambda)^{2\max\{\alpha,\beta\}+2}$ with tensorized Jacobi polynomials.

These bounds are quite general, and set the ground for adaptive polynomial approximation based on discrete least squares.
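
For tensorized Legendre polynomials ($\alpha=\beta=0$), $K(\Lambda)$ can be evaluated exactly, since each univariate orthonormal Legendre polynomial attains its maximum modulus $\sqrt{2k+1}$ at $y=1$; the short sketch below (illustrative, not from the talk) compares $K(\Lambda)$ with the bound $(\#\Lambda)^2$ for a total-degree set.

```python
# Small check, not from the slides: for tensorized Legendre polynomials
# (alpha = beta = 0), each univariate orthonormal polynomial attains its maximum
# modulus sqrt(2k + 1) at y = 1, so K(Lambda) is attained at the corner (1, ..., 1)
# and has the closed form below; it is compared with the bound (#Lambda)^2.
import numpy as np

def K_legendre(Lambda):
    """K(Lambda) = sum over nu of prod_i (2 nu_i + 1) in the Legendre case."""
    return sum(int(np.prod([2 * k + 1 for k in nu])) for nu in Lambda)

Lambda = [(i, j) for i in range(4) for j in range(4 - i)]   # total degree <= 3, d = 2
print("K(Lambda)   =", K_legendre(Lambda))    # 70
print("(#Lambda)^2 =", len(Lambda) ** 2)      # 100, so the bound holds
```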

Assume that $|\phi| \le \tau$ almost surely w.r.t. $\rho$ and define
$$T_\tau(t) := \mathrm{sign}(t)\, \min\{\tau, |t|\}, \qquad \widetilde{\Pi}^M_\Lambda \phi := T_\tau(\Pi^M_\Lambda \phi).$$

Theorem (Chkifa, Cohen, M., Nobile and Tempone, 2013). For any $\gamma > 0$ and any downward closed $\Lambda$, if $M$ is such that
$$K(\Lambda) \le \frac{0.15}{1+\gamma}\, \frac{M}{\ln M}$$
then, for any $\phi \in L^\infty(\Gamma)$ with $\|\phi\|_{L^\infty} \le \tau$, it holds that
$$\Pr\big(\mathrm{cond}(G) \le 3\big) \ge 1 - 2 M^{-\gamma},$$
$$\Pr\left( \|\phi - \Pi^M_\Lambda \phi\|_{L^2_\rho} \le \big(1+\sqrt{2}\big) \inf_{v\in\mathbb{P}_\Lambda} \|\phi - v\|_{L^\infty} \right) \ge 1 - 2 M^{-\gamma},$$
$$\mathbb{E}\left[ \|\phi - \widetilde{\Pi}^M_\Lambda \phi\|^2_{L^2_\rho} \right] \le \left(1 + \frac{0.6}{(1+\gamma)\ln M}\right) \|\phi - \Pi_\Lambda \phi\|^2_{L^2_\rho} + 8\tau^2 M^{-\gamma}.$$

($\delta = 1/2$ everywhere!)
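
The sampling condition of the theorem is easy to evaluate in practice. The sketch below is illustrative and not from the talk; it replaces $K(\Lambda)$ by its upper bound $(\#\Lambda)^2$ (Legendre case) and searches for the smallest $M$ satisfying the condition.

```python
# Illustrative sketch, not from the slides: the smallest M satisfying the sampling
# condition K(Lambda) <= 0.15 M / ((1 + gamma) ln M), found by direct search.
# Here K(Lambda) is replaced by its upper bound (#Lambda)^2 (Legendre case).
import math

def min_samples(K, gamma=1.0):
    M = 2
    while K > 0.15 * M / ((1 + gamma) * math.log(M)):
        M += 1
    return M

n_Lambda = 10                        # #Lambda of the earlier 2-D total-degree example
print(min_samples(n_Lambda ** 2))    # stability/accuracy with probability >= 1 - 2/M
```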

3 Stability and accuracy with evaluations in low-discrepancy point sets

Discrete least squares with deterministic points: multivariate case with the Chebyshev density in $[-1,1]^d$

Deterministic points introduced by Zhou, Narayan and Xu:
$$y^j = \cos\left(\frac{2\pi}{M}\,(j, 2j, \ldots, jd)\right) \in [-1,1]^d, \qquad j = 1,\ldots,M,$$
asymptotically distributed according to the Chebyshev density.

Theorem (Zhou, Narayan and Xu, arXiv 2014). In any dimension $d$ and with the Chebyshev density, if $M$ is a prime number and $M \ge 4^{d+1} d^2 (\#\Lambda)^2$ then it holds that
$$\|\phi - \Pi^M_\Lambda \phi\|_{L^2_\rho} \le \left(1 + \frac{4 d^2}{\#\Lambda}\right) \inf_{v\in\mathbb{P}_\Lambda} \|\phi - v\|_{L^\infty}.$$

The proof uses arguments from number theory.

Discrete least squares with deterministic points: the multivariate case with uniform density in $[0,1]^d$

Given any set of $M$ points $y^1,\ldots,y^M \in [0,1]^d$ and any nonempty subset $U \subseteq \{1,\ldots,d\}$, we define its local discrepancy
$$\Delta_U(t_U, \mathbf{1}) := \frac{1}{M} \sum_{i=1}^{M} \prod_{q\in U} \mathbb{I}_{[0,t_q]}(y^i_q) - \prod_{q\in U} t_q, \qquad t \in [0,1]^{|U|},$$
and its star discrepancy
$$D^*_U := \sup_{t\in[0,1]^{|U|}} \big|\Delta_U(t_U, \mathbf{1})\big|.$$
The values of the components in $\{1,\ldots,d\}\setminus U$ are frozen to 1.
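
For small point sets the local and star discrepancies can be approximated by brute force. The sketch below is illustrative and not from the talk; it only evaluates the local discrepancy at boxes anchored at the point coordinates, which is a heuristic approximation of the supremum, not its exact value.

```python
# Illustrative sketch, not from the slides: a brute-force approximation of the star
# discrepancy D*_U, evaluating the local discrepancy only at boxes anchored at the
# point coordinates (a heuristic, not the exact supremum). Names are illustrative.
import itertools
import numpy as np

def local_discrepancy(points, t, U):
    """Delta_U(t_U, 1): empirical measure of the box prod_{q in U} [0, t_q]
    minus its volume; coordinates outside U are frozen to 1 (i.e. ignored)."""
    inside = np.all(points[:, U] <= t, axis=1)
    return inside.mean() - np.prod(t)

def star_discrepancy_approx(points, U):
    coords = [np.unique(points[:, q]) for q in U]
    return max(abs(local_discrepancy(points, np.array(t), U))
               for t in itertools.product(*coords))

rng = np.random.default_rng(0)
P = rng.uniform(size=(64, 2))                # 64 random points in [0, 1]^2
print(star_discrepancy_approx(P, U=[0]))     # discrepancy of the projection onto x_1
print(star_discrepancy_approx(P, U=[0, 1]))  # full 2-D star discrepancy (approximate)
```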

Example: $d = 2$, $U = \{2\}$, $\{1,2\}\setminus U = \{1\}$ (figure from J. Dick, F. Pillichshammer: Digital Nets and Sequences, 2010).

Let $t \ge 0$, $m \ge 1$, $d \ge 1$ and $b \ge 2$ be integers with $t \le m$. A $(t,m,d)$-net in base $b$ is a point set consisting of $b^m$ points in $[0,1)^d$ such that every elementary interval of the form
$$\prod_{j=1}^{d} \left[\frac{a_j}{b^{h_j}},\, \frac{a_j+1}{b^{h_j}}\right), \qquad h_j \ge 0, \quad 0 \le a_j < b^{h_j}, \quad h_1 + \cdots + h_d = m - t,$$
contains exactly $b^t$ points.

Example: $(0,4,2)$-net in base $b = 2$ (Hammersley points).
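
The $(0,4,2)$-net in base 2 mentioned above can be generated in a few lines. The sketch below is not from the talk; it builds the 2-D Hammersley set from the base-2 van der Corput radical inverse.

```python
# Minimal sketch, not from the slides: the (0, 4, 2)-net in base 2 mentioned above,
# i.e. the 2-D Hammersley point set (i / 2^m, phi_2(i)) with phi_2 the base-2
# van der Corput radical inverse.
def van_der_corput(i, base=2):
    """Radical inverse: reflect the base-b digits of i about the radix point."""
    x, denom = 0.0, 1.0
    while i > 0:
        denom *= base
        i, digit = divmod(i, base)
        x += digit / denom
    return x

m = 4
net = [(i / 2**m, van_der_corput(i)) for i in range(2**m)]
print(net[:4])
# Every elementary interval of volume 2^(t - m) = 1/16, e.g. [a/16, (a+1)/16) x [0, 1),
# contains exactly b^t = 1 of these 16 points.
```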

Our analysis uses a Koksma-Hlawka-type inequality and low-discrepancy point sets. Starting point:

Lemma (Hlawka-Zaremba identity). Given $M$ points $y^1,\ldots,y^M \in [0,1]^d$, for any $f$ with continuous mixed derivatives it holds
$$\int_{[0,1]^d} f(y)\, dy - \frac{1}{M}\sum_{i=1}^{M} f(y^i) = \sum_{\emptyset\ne U\subseteq\{1,\ldots,d\}} (-1)^{|U|} \int_{[0,1]^{|U|}} \Delta_U(y_U, \mathbf{1})\, \frac{\partial^{|U|} f}{\partial y_U}(y_U, \mathbf{1})\, dy_U.$$

Lemma (Standard Koksma-Hlawka inequality).
$$\left| \int_{[0,1]^d} f(y)\, dy - \frac{1}{M}\sum_{i=1}^{M} f(y^i) \right| \le D^*_{\{1,\ldots,d\}}\, \|f\|_{HK}.$$

Three main ingredients in our approach:

1) We prove a variant of the standard Koksma-Hlawka inequality, starting from the Hlawka-Zaremba identity:

Lemma (M., Nobile 2014).
$$\left| \|f\|^2_{L^2_\rho} - \|f\|^2_M \right| \le \sum_{\emptyset\ne U\subseteq\{1,\ldots,d\}} D^*_U \sum_{T\subseteq U} \left\| \frac{\partial^{|T|} f}{\partial y_T}(y_U, \mathbf{1}) \right\|_{L^2([0,1]^{|U|})} \left\| \frac{\partial^{|U\setminus T|} f}{\partial y_{U\setminus T}}(y_U, \mathbf{1}) \right\|_{L^2([0,1]^{|U|})}.$$

2) Markov-type and Nikolskii-type multivariate inequalities for polynomials associated with downward closed multi-index sets (M. 2014).

3) Upper bounds for the star discrepancy of $(t,m,d)$-nets and $(t,d)$-sequences (e.g. Faure-Kritzer, Monatsh. Math. 2013).

Consider any $(t,m,d)$-net in base $b \ge 2$ with quality parameter $t \ge 0$.

Theorem (M., Nobile 2014). In any dimension $d$, with the uniform density and with anisotropic tensor product spaces $\mathbb{P}_\Lambda$, if
$$1 > \delta > 0.7\, b^{t}\, \sqrt{2}\, \exp\left\{\frac{b-1}{\ln b} + O(1)\right\}\, \frac{(1+2\ln M)^{d}}{M}\, (\#\Lambda)^2$$
then it holds that
$$\mathrm{cond}(G) \le \frac{1+\delta}{1-\delta}, \qquad \|\phi - \Pi^M_\Lambda \phi\|_{L^2_\rho} \le \left(1 + \frac{1}{\sqrt{1-\delta}}\right) \inf_{v\in\mathbb{P}_\Lambda} \|\phi - v\|_{L^\infty}.$$

A similar theorem holds also for $(t,d)$-sequences (M., Nobile 2014).

Given $\Lambda$ downward closed and $U \subseteq \{1,\ldots,d\}$, we define its sections by
$$\Lambda_U := \left\{\nu \in \mathbb{N}_0^{|U|} : \exists\, \mu = (\mu_U, \mu_{\{1,\ldots,d\}\setminus U}) \in \Lambda \text{ with } \nu = \mu_U\right\}.$$

In general, for any downward closed multi-index set the condition becomes
$$1 > \delta > \min\left\{ (\#\Lambda)^4 \sum_{\emptyset\ne U\subseteq\{1,\ldots,d\}} D^*_U, \;\; \sum_{\emptyset\ne U\subseteq\{1,\ldots,d\}} D^*_U\, \big(\#\Lambda_{\{1,\ldots,d\}\setminus U}\big)^2 \sum_{T\subseteq U} (\#\Lambda_T)^2\, \big(\#\Lambda_{U\setminus T}\big)^2 \right\}.$$

This is nonoptimal when $\Lambda$ is sparser than an anisotropic tensor product, compared to $M \approx (\#\Lambda)^2$ with random points and any downward closed $\Lambda$.
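
The sections $\Lambda_U$ are simply coordinate projections of the multi-index set; a minimal sketch, not from the talk, with illustrative names:

```python
# Small sketch, not from the slides (names illustrative): the sections Lambda_U are
# the projections of the multi-index set onto the coordinates in U.
def sections(Lambda, U):
    return {tuple(nu[q] for q in U) for nu in Lambda}

Lambda = [(0, 0), (1, 0), (2, 0), (0, 1), (1, 1)]   # downward closed, d = 2
print(sections(Lambda, U=(0,)))     # {(0,), (1,), (2,)}
print(sections(Lambda, U=(0, 1)))   # all of Lambda, as tuples
```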

4 Conclusions

Conclusions (theoretical analysis)

RANDOM POINTS: analysis w.r.t. $M$, $d$, $\Lambda$, $\rho$, smoothness of $\phi$. In any dimension $d$, stability and accuracy are proven provided that
- $M/\ln M \ge C_1\, (\dim \mathbb{P}_\Lambda)^{\ln 3/\ln 2}$ with the Chebyshev density,
- $M/\ln M \ge C_2\, (\dim \mathbb{P}_\Lambda)^{2}$ with the uniform density,
- $M/\ln M \ge C_3\, (\dim \mathbb{P}_\Lambda)^{2\max\{\alpha,\beta\}+2}$ with the beta$(\alpha+1,\beta+1)$ density, $\alpha, \beta \in \mathbb{N}_0$,
with the constants $C_1$, $C_2$, $C_3$ independent of $d$.

DETERMINISTIC POINTS: analysis w.r.t. $M$, $d$, $\Lambda$, smoothness of $\phi$. In any dimension $d$, stability and accuracy are proven provided that
- $M \ge \widehat{C}_1(d)\, (\dim \mathbb{P}_\Lambda)^2$ with the Chebyshev density and any $\Lambda$ (Zhou et al.),
- $M/(1+2\ln M)^d \ge \widehat{C}_2\, (\dim \mathbb{P}_\Lambda)^2$ with the uniform density and anisotropic tensor product $\Lambda$,
- $M/(1+2\ln M)^d \ge \widehat{C}_3\, (\dim \mathbb{P}_\Lambda)^{\gamma}$, $2 \le \gamma \le 4$, with the uniform density and any downward closed $\Lambda$,
with the constant $\widehat{C}_1$ depending on $d$, and $\widehat{C}_2$, $\widehat{C}_3$ depending on the parameters of the $(t,m,d)$-net or $(t,d)$-sequence.
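
To compare the exponents in the random-point conditions concretely, the sketch below (illustrative, not from the talk; the unknown constants $C_1$, $C_2$, $C_3$ are set to 1 purely as an assumption) computes the sample sizes they suggest for $\dim \mathbb{P}_\Lambda = 100$.

```python
# Illustrative comparison, not from the slides: sample sizes suggested by the three
# random-point conditions above for dim(P_Lambda) = 100, with the unknown constants
# C_1 = C_2 = C_3 set to 1 (an assumption made only to compare the exponents).
import math

n = 100
for label, exponent in [("Chebyshev density", math.log(3) / math.log(2)),
                        ("uniform density", 2.0),
                        ("beta(2, 2) density (alpha = beta = 1)", 4.0)]:
    target = n ** exponent           # required value of M / ln M (constant omitted)
    M = 2
    while M / math.log(M) < target:
        M = int(M * 1.01) + 1
    print(f"{label:38s} M/ln M >= {target:12.0f}  ->  M ~ {M:.2e}")
```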

Conclusions (experience from numerics)

In high dimensions and with smooth functions, with both random and deterministic points, taking $M$ proportional to $\dim \mathbb{P}_\Lambda$ seems to be enough to achieve the optimal convergence rate, up to a threshold. There is a lot of numerical evidence, but no formal proof yet.

Deterministic points CAN outperform random points in low dimensions. What about high dimensions?

Discrete least squares is a promising approximation tool for multivariate functions of random variables and for PDEs with stochastic data.

References on discrete least squares with RANDOM points

- A. Cohen, M. Davenport, D. Leviatan: On the stability and accuracy of least squares approximations. Foundations of Computational Mathematics, 2013.
- G. Migliorati, F. Nobile, E. von Schwerin, R. Tempone: Analysis of discrete L^2 projection on polynomial spaces with random evaluations. Foundations of Computational Mathematics, 2014.
- A. Chkifa, A. Cohen, G. Migliorati, F. Nobile, R. Tempone: Discrete least squares polynomial approximation with random evaluations; application to parametric and stochastic elliptic PDEs. Submitted. Available as MATHICSE report 35-2013.
- G. Migliorati, F. Nobile, E. von Schwerin, R. Tempone: Approximation of Quantities of Interest in stochastic PDEs by the random discrete L^2 projection on polynomial spaces. SIAM J. Sci. Comput., 2013.
- G. Migliorati: Multivariate Markov-type and Nikolskii-type inequalities for polynomials associated with downward closed multi-index sets. Submitted. Available as MATHICSE report 1-2014.
- G. Migliorati: Polynomial approximation by the random discrete L^2 projection and application to inverse problems for PDEs with stochastic data. PhD thesis, Department of Mathematics at Politecnico di Milano and Centre de Mathématiques Appliquées at École Polytechnique, 2013.

References on discrete least squares with DETERMINISTIC points

- T. Zhou, A. Narayan, Z. Xu: Multivariate discrete least-squares approximations with a new type of collocation grid. arXiv:1401.0894, 2014.
- G. Migliorati, F. Nobile: Analysis of discrete least squares on multivariate polynomial spaces with evaluations in low-discrepancy point sets. Submitted. Available as MATHICSE report 25-2014.

Thank you for your attention!