An introduction to particle simulation of rare events
Slide 1 — An introduction to particle simulation of rare events
P. Del Moral, Centre INRIA de Bordeaux - Sud Ouest
Workshop OPUS, June 29, 2010, IHP, Paris
P. Del Moral (INRIA), INRIA Bordeaux-Sud Ouest — 33 slides
Slide 2 — Outline
1. Introduction, motivations
2. The heuristic of particle methods
3. Particle Feynman-Kac models
4. Stochastic models and methods
5. Some convergence results
6. Some references
Slide 3 — Summary
1. Introduction, motivations: some rare event problems; stochastic models; importance sampling techniques
2. The heuristic of particle methods
3. Particle Feynman-Kac models
4. Stochastic models and methods
5. Some convergence results
6. Some references
Slide 4 — Rare event analysis
Stochastic process X, rare event A: compute Proba(X ∈ A) and Law((X_0, ..., X_t) | X ∈ A).
Applications in engineering/physics/biology/economics/finance:
- Finance: ruin/default probabilities, financial crashes, economic crises, ...
- Engineering: network overloads, breakdowns, engine failures, ...
- Physics: climate models, directed polymer conformations, particles in absorbing media, ground states of Schroedinger models.
- Statistics: tail probabilities, extreme random values.
- Combinatorics: complex enumeration problems.
Process strategies leading to the rare event — control and prediction: for X_t = F_t(X_{t-1}, W_t), compute Law((W_0, ..., W_t) | X ∈ A).
Slide 5 — Stochastic models
Only 2 ingredients:
1. A physical/biological/financial process: queuing network, portfolio, volatility process, stock market evolutions, interacting/exchange economic models, ...
2. A potential function (energy type, indicator, restrictions): critical level crossings, penalty functions, constraint subsets, performance levels, long range dependence, ...
Objectives:
- Estimation of the probability of the rare event.
- Computing the full distribution of the path of the process evolving in the critical regime (prediction and control).
Slide 6 — Twisted Monte Carlo methods
P(X ∈ A) = ∫ 1_A(x) P(dx) = ∫ (dP/dQ)(x) 1_A(x) Q(dx). Find a twisted measure Q s.t. Q(A) ≈ 1.
Elementary Monte Carlo estimate, with X^i i.i.d. ∼ Q:
P^N(A) := (1/N) ∑_{1≤i≤N} (dP/dQ)(X^i) 1_A(X^i)
Variance drawbacks (ex. cross-entropy: Q ∈ {Q_a, a ∈ A}):
- Huge variance if Q is badly chosen; the optimal choice is Q(dx) ∝ 1_A(x) P(dx).
- Need to twist the original reference process X. For a stochastic evolution X = (X_0, ..., X_n),
dP_n/dQ_n(X) := ∏_{k=0}^n p_k(X_k | X_{k-1}) / q_k(X_k | X_{k-1})
is a degenerate product martingale w.r.t. n.
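A minimal sketch of the twisting idea above. The example (a standard Gaussian tail probability P(X > a) with a mean-shifted proposal Q = N(a, 1)) is our own choice, not from the slides:

```python
import random, math

def is_estimate(a, mu_q, n=100_000, seed=1):
    """Importance-sampling estimate of P(X > a) for X ~ N(0, 1),
    using the twisted proposal Q = N(mu_q, 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu_q, 1.0)          # X^i i.i.d. ~ Q
        if x > a:                         # indicator 1_A(x)
            # dP/dQ for two unit-variance Gaussians:
            # exp(-x^2/2) / exp(-(x - mu_q)^2/2) = exp(mu_q^2/2 - mu_q * x)
            total += math.exp(0.5 * mu_q ** 2 - mu_q * x)
    return total / n

a = 4.0
exact = 0.5 * math.erfc(a / math.sqrt(2))   # P(N(0,1) > 4), approx 3.2e-5
naive = is_estimate(a, mu_q=0.0)            # Q = P: almost no hits in A
twisted = is_estimate(a, mu_q=a)            # proposal centered on the rare set
```

With Q = P (the naive choice) almost no sample lands in A; centering Q on the rare set makes nearly half the samples hit A, each carrying a small, well-behaved weight.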
Slide 7 — Summary
1. Introduction, motivations
2. The heuristic of particle methods: flow of optimal twisted measures; 5 examples of flows of target measures; 3 types of occupation measures; some equivalent stochastic algorithms
3. Particle Feynman-Kac models
4. Stochastic models and methods
5. Some convergence results
6. Some references
Slide 8 — Flow of measures with increasing sampling complexity
Rare event = cascade/series of intermediate, less rare events (energy levels, physical gateways, index level crossings).
Conditional probability flow = flow of optimal twisted measures:
η_n = Law(process | series of n intermediate events)
Rare event probabilities = normalizing constants.
Particle methods (sampling a genealogical-type default tree model — % success or default):
- Exploration/local search propositions of the solution space.
- Branching/selection of individuals in critical regimes.
Slide 9 — 5 examples of flows of target measures
1. η_n = Law((X_0, ..., X_n) | ∀ 0 ≤ p ≤ n, X_p ∈ A_p)
2. η_n(dx) ∝ e^{−β_n V(x)} λ(dx) with β_n ↑
3. η_n(dx) ∝ 1_{A_n}(x) λ(dx) with A_n ↓
4. η_n = Law_{K,π}((X_0, ..., X_n) | X_n = x_n)
5. η_n = Law(X hits B_n | X hits B_n before A)
5 particle heuristics:
1. M_n-local moves → individual selections in A_n, i.e. G_n = 1_{A_n}
2. MCMC local moves η_n = η_n M_n → individual selections G_n = e^{−(β_{n+1}−β_n) V}
3. MCMC local moves η_n = η_n M_n → individual selections G_n = 1_{A_{n+1}}
4. M-local moves → selection G(x_1, x_2) = π(dx_2) K(x_2, dx_1) / (π(dx_1) M(x_1, dx_2))
5. M_n-local moves → selection/branching on upper/lower levels B_n.
Slide 10 — Interaction/branching process: 3 types of occupation measures (pictured with N = 3)
- Current population: (1/N) ∑_{i=1}^N δ_{ξ^i_n}, with ξ^i_n the i-th individual at time n.
- Genealogical tree model: (1/N) ∑_{i=1}^N δ_{(ξ^i_{0,n}, ξ^i_{1,n}, ..., ξ^i_{n,n})}, with (ξ^i_{0,n}, ..., ξ^i_{n,n}) the i-th ancestral line.
- Complete genealogical tree model: (1/N) ∑_{i=1}^N δ_{(ξ^i_0, ξ^i_1, ..., ξ^i_n)}.
Empirical mean potentials [success % when G_n = 1_A]: (1/N) ∑_{i=1}^N G_n(ξ^i_n).
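The bookkeeping behind these occupation measures can be sketched in a few lines. This is an illustrative toy (the Gaussian moves, the potential G_n = 1_{x > 0}, and all parameters are our own stand-ins, not from the slides):

```python
import random

rng = random.Random(0)
N = 3
# each particle carries its full ancestral line (xi^i_0, ..., xi^i_n)
paths = [[rng.gauss(0.0, 1.0)] for _ in range(N)]

for step in range(4):
    # empirical mean potential: success fraction (1/N) sum_i G_n(xi^i_n)
    G = [1.0 if p[-1] > 0 else 0.0 for p in paths]
    if sum(G) == 0:
        break  # whole population killed
    # selection: resample entire ancestral lines with weights prop. to G_n
    weights = [g / sum(G) for g in G]
    idx = rng.choices(range(N), weights=weights, k=N)
    paths = [list(paths[i]) for i in idx]
    # mutation: extend each selected line by one local move
    for p in paths:
        p.append(p[-1] + rng.gauss(0.0, 1.0))

current_population = [p[-1] for p in paths]   # supports (1/N) sum delta_{xi^i_n}
ancestral_lines = paths                       # supports the genealogical tree measure
# (the *complete* genealogical tree would additionally keep every particle
#  ever simulated, including the killed ones)
```

Resampling whole paths, not just terminal values, is exactly what makes the genealogical tree measure available for free.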
Slide 11 — Equivalent stochastic algorithms
- Genetic and evolutionary type algorithms.
- Spatial branching models, subset splitting techniques.
- Sequential Monte Carlo methods, population Monte Carlo models.
- Diffusion Monte Carlo (DMC), Quantum Monte Carlo (QMC), ...
Some botanical names = application domain areas: bootstrapping, selection, pruning-enrichment, reconfiguration, cloning, go-with-the-winner, spawning, condensation, grouping, rejuvenation, harmony searches, biomimetics, splitting, [(meta)heuristics].
1996 → Feynman-Kac mean field particle model.
Slide 12 — Summary
1. Introduction, motivations
2. The heuristic of particle methods
3. Particle Feynman-Kac models: some notation; asymptotic analysis; some worked-out examples; sensitivity analysis
4. Stochastic models and methods
5. Some convergence results
6. Some references
Slide 13 — Notation
E a measurable state space, P(E) the probability measures on E, B(E) the bounded measurable functions on E.
For (µ, f) ∈ P(E) × B(E): µ(f) = ∫ µ(dx) f(x).
For an integral operator M(x, dy) over E: M(f)(x) = ∫ M(x, dy) f(y) and [µM](dy) = ∫ µ(dx) M(x, dy), so that [µM](f) = µ[M(f)].
Bayes-Boltzmann-Gibbs transformation: for G : E → [0, ∞) with µ(G) > 0,
Ψ_G(µ)(dx) = (1/µ(G)) G(x) µ(dx)
E finite → matrix notation: µ = [µ(1), ..., µ(d)] and f = [f(1), ..., f(d)].
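In the finite case mentioned on the slide, all of this notation reduces to ordinary linear algebra. A small sketch (the particular µ, f, M, G are our own illustrative values):

```python
# Finite state space E = {0, 1, 2}: measures and functions are vectors,
# integral operators are matrices.

mu = [0.5, 0.3, 0.2]                  # probability measure on E
f = [1.0, 2.0, 4.0]                   # bounded function on E
G = [1.0, 0.5, 0.0]                   # nonnegative potential with mu(G) > 0
M = [[0.9, 0.1, 0.0],                 # Markov transition matrix M(x, y)
     [0.2, 0.7, 0.1],
     [0.0, 0.3, 0.7]]

def mu_f(mu, f):
    """mu(f) = sum_x mu(x) f(x)"""
    return sum(m * v for m, v in zip(mu, f))

def M_f(M, f):
    """M(f)(x) = sum_y M(x, y) f(y): a function on E"""
    return [sum(row[y] * f[y] for y in range(len(f))) for row in M]

def mu_M(mu, M):
    """[mu M](y) = sum_x mu(x) M(x, y): a measure on E"""
    return [sum(mu[x] * M[x][y] for x in range(len(mu))) for y in range(len(M[0]))]

def psi_G(G, mu):
    """Bayes-Boltzmann-Gibbs: Psi_G(mu)(x) = G(x) mu(x) / mu(G)"""
    z = mu_f(mu, G)
    return [g * m / z for g, m in zip(G, mu)]

# the compatibility relation [mu M](f) = mu[M(f)] is just associativity:
lhs = mu_f(mu_M(mu, M), f)
rhs = mu_f(mu, M_f(M, f))
post = psi_G(G, mu)   # a new probability measure on E
```

Ψ_G is nothing more than a Bayes update: multiply by the likelihood G pointwise, then renormalize.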
Slide 14 — Infinite population N = ∞: Feynman-Kac measures (G_n, M_n)
With [potential functions G_n] and [Markov chain X_n with transitions M_n], set
γ_n(f) := E( f(X_n) ∏_{0≤p<n} G_p(X_p) )
Current population: η^N_n(f) := (1/N) ∑_{i=1}^N f(ξ^i_n) →_{N→∞} η_n(f) := γ_n(f)/γ_n(1).
Genealogical tree occupation measures: (1/N) ∑_{i=1}^N δ_{(ξ^i_{0,n}, ξ^i_{1,n}, ..., ξ^i_{n,n})} →_{N→∞} dQ_n := (1/Z_n) ∏_{0≤p<n} G_p(X_p) dP_n.
Normalizing constants: ∏_{0≤p<n} η^N_p(G_p) →_{N→∞} Z_n = γ_n(1).
Slide 15 — Feynman-Kac models ⊃ all of the above heuristics
Example: G_n = 1_{A_n} and X_n a Markov chain.
- Normalizing constants: γ_n(1) = P(n-th rare event) = P(X_p ∈ A_p, 0 ≤ p < n).
- n-th time marginals: η_n = Law(X_n | n-th rare event) = Law(X_n | X_p ∈ A_p, 0 ≤ p < n).
- Path space measures: Q_n = Law((X_p)_{0≤p≤n} | n-th rare event) = Law((X_p)_{0≤p≤n} | X_p ∈ A_p, p < n).
Note: including the n-th rare event level set → updated measures γ̂_n, η̂_n, Q̂_n with 0 ≤ p ≤ n.
Slide 16 — Another detailed static example
Restriction of measures, A_n ↓ (ex.: A_n = [a_n, ∞) for tail probabilities):
η_n(dx) = (1/Z_n) 1_{A_n}(x) λ(dx)
Feynman-Kac representation: η_n(f) := γ_n(f)/γ_n(1) with
γ_n(f) := E( f(X_n) ∏_{0≤p<n} 1_{A_{p+1}}(X_p) )
and P(X_n ∈ dx_n | X_{n-1} = x_{n-1}) = M_n(x_{n-1}, dx_n) chosen so that η_n = η_n M_n.
Note (with A_0 = E):
Z_n = λ(A_n) = [λ(1_{A_n} 1_{A_{n-1}}) / λ(1_{A_{n-1}})] × Z_{n-1} = η_{n-1}(1_{A_n}) × Z_{n-1} = ∏_{0≤p<n} η_p(1_{A_{p+1}})
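The product formula Z_n = ∏ η_p(1_{A_{p+1}}) is what makes a tiny tail probability estimable as a product of moderate conditional probabilities. A minimal, idealized sketch: λ is the N(0, 1) law, the levels are our own choice, and each restricted measure η_p is sampled by brute rejection rather than by a particle system:

```python
import random, math

def tail_by_splitting(levels, n=20_000, seed=2):
    """Estimate lambda(A_final) = P(N(0,1) > a_final) as the product of
    the conditional probabilities eta_p(1_{A_{p+1}}) over nested levels
    A_p = (a_p, infinity)."""
    rng = random.Random(seed)
    prob = 1.0
    for a_prev, a_next in zip(levels[:-1], levels[1:]):
        hits, count = 0, 0
        while count < n:
            x = rng.gauss(0.0, 1.0)
            if x > a_prev:            # an (idealized) draw from eta_p
                count += 1
                if x > a_next:        # reaches the next level A_{p+1}
                    hits += 1
        prob *= hits / n              # eta_p(1_{A_{p+1}})
    return prob

# P(N(0,1) > 2) ~ 0.0228, decomposed across four intermediate levels
est = tail_by_splitting([-math.inf, 0.5, 1.0, 1.5, 2.0])
```

Each factor here is around 0.3–0.5 and is easy to estimate; the real particle methods of the following slides replace the rejection step by selection/mutation of an interacting population.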
Slide 17 — Sensitivity analysis
Parametric model: M^θ_n(x_{n-1}, dx_n) = p^θ_n(x_{n-1}, x_n) λ_{X_n}(dx_n) and G_n(x) = G^θ_n(x).
Parametric Feynman-Kac models: (G_n, Q_n, γ_n, η_n, ...) → (G^θ_n, Q^θ_n, γ^θ_n, η^θ_n, ...).
∂_θ log γ^θ_n(1) = Q^θ_n(F^θ_n) and ∂_θ Q^θ_n(ϕ_n) = Q^θ_n( F^θ_n [ϕ_n − Q^θ_n(ϕ_n)] )
with the additive functional
F^θ_n(x_0, ..., x_n) := ∑_{0≤k≤n} f^θ_k(x_{k-1}, x_k), where f^θ_k(x_{k-1}, x_k) := ∂_θ log G^θ_k(x_k) + ∂_θ log p^θ_k(x_{k-1}, x_k).
Slide 18 — Additive functionals (with Doucet & Singh, HAL-INRIA, July 09)
Path space models: P_n := Law(X_0, ..., X_n) and dQ_n := (1/Z_n) ∏_{0≤p<n} G_p(X_p) dP_n.
Hypothesis: M_n(x_{n-1}, dx_n) = H_n(x_{n-1}, x_n) λ_n(dx_n). Then
Q_n(d(x_0, ..., x_n)) = η_n(dx_n) M_{n,η_{n-1}}(x_n, dx_{n-1}) ... M_{1,η_0}(x_1, dx_0)
with the backward transitions M_{p+1,η}(x, dx') ∝ G_p(x') H_{p+1}(x', x) η(dx').
Particle estimates (ancestral tree or the complete genealogical tree):
Q^N_n(d(x_0, ..., x_n)) = η^N_n(dx_n) M_{n,η^N_{n-1}}(x_n, dx_{n-1}) ... M_{1,η^N_0}(x_1, dx_0)
Slide 19 — Summary
1. Introduction, motivations
2. The heuristic of particle methods
3. Particle Feynman-Kac models
4. Stochastic models and methods: McKean distribution models; mean field particle interpretations
5. Some convergence results
6. Some references
Slide 20 — Nonlinear Markov models: flows of Feynman-Kac measures
A two-step correction/prediction model:
η_n →(updating/correction) η̂_n = Ψ_{G_n}(η_n) →(prediction/Markov transport) η_{n+1} = η̂_n M_{n+1}
Selection → nonlinear transport formulae: Ψ_{G_n}(η_n) = η_n S_{n,η_n} with, for any ε_n = ε_n(η_n) ∈ [0, 1] s.t. ε_n G_n ≤ 1,
S_{n,η_n}(x, ·) := ε_n G_n(x) δ_x + (1 − ε_n G_n(x)) Ψ_{G_n}(η_n)
⇒ η_{n+1} = η_n (S_{n,η_n} M_{n+1}) := η_n K_{n+1,η_n}
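On a finite state space the two-step flow can be iterated exactly, which also checks the product formula γ_n(1) = ∏ η_p(G_p) against a brute-force path sum. A sketch with our own 3-state chain and potential (not from the slides):

```python
import itertools

M = [[0.8, 0.2, 0.0],          # Markov transition matrix M_{n+1}(x, y)
     [0.1, 0.8, 0.1],
     [0.0, 0.2, 0.8]]
G = [1.0, 0.6, 0.1]            # time-homogeneous potential G_n
eta = [1.0, 0.0, 0.0]          # initial law eta_0: start in state 0
Z = 1.0                        # running normalizing constant gamma_n(1)
n_steps = 5

for _ in range(n_steps):
    z = sum(e * g for e, g in zip(eta, G))            # eta_n(G_n)
    Z *= z
    eta_hat = [e * g / z for e, g in zip(eta, G)]     # updating/correction
    eta = [sum(eta_hat[x] * M[x][y] for x in range(3))
           for y in range(3)]                         # prediction/transport

# brute-force check: gamma_n(1) = E( prod_{p<n} G(X_p) ), summed over paths
gamma = 0.0
for path in itertools.product(range(3), repeat=n_steps - 1):
    states = (0,) + path                              # X_0 = 0 a.s.
    p = 1.0
    for a, b in zip(states[:-1], states[1:]):
        p *= M[a][b]                                  # path probability
    w = 1.0
    for s in states:
        w *= G[s]                                     # Feynman-Kac weight
    gamma += p * w
```

The flow η_n → η_{n+1} is nonlinear in η_n (the correction step divides by η_n(G_n)), which is exactly why a particle approximation, rather than a plain Markov chain simulation, is needed in the general case.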
Slide 21 — Nonlinear Markov chains
η_n = Law(X_n) = perfect sampling algorithm.
Nonlinear transport formulae: η_{n+1} = η_n K_{n+1,η_n} with the collection of Markov probability transitions K_{n+1,η_n} = S_{n,η_n} M_{n+1}.
Local transitions: P(X_n ∈ dx_n | X_{n-1}) = K_{n,η_{n-1}}(X_{n-1}, dx_n) with η_{n-1} = Law(X_{n-1}).
McKean measures (canonical process):
P_n(d(x_0, ..., x_n)) = η_0(dx_0) K_{1,η_0}(x_0, dx_1) ... K_{n,η_{n-1}}(x_{n-1}, dx_n)
Slide 22 — Mean field particle interpretations
Sampling problem → mean field particle interpretation: a Markov chain ξ_n = (ξ^1_n, ..., ξ^N_n) ∈ E^N_n s.t.
η^N_n := (1/N) ∑_{1≤i≤N} δ_{ξ^i_n} →_{N→∞} η_n
Approximated local transitions (1 ≤ i ≤ N): ξ^i_{n-1} → ξ^i_n ∼ K_{n,η^N_{n-1}}(ξ^i_{n-1}, dx_n).
Slide 23 — Schematic picture: ξ_n ∈ E^N_n → ξ_{n+1} ∈ E^N_{n+1}
Each particle ξ^i_n moves to ξ^i_{n+1} through the transition K_{n+1,η^N_n}.
Rationale: η^N_n →_{N→∞} η_n implies K_{n+1,η^N_n} →_{N→∞} K_{n+1,η_n}, so the particles ξ^i become asymptotically i.i.d. copies of X.
Particle McKean measures: (1/N) ∑_{i=1}^N δ_{(ξ^i_0, ..., ξ^i_n)} →_{N→∞} Law(X_0, ..., X_n).
Slide 24 — Feynman-Kac models → genetic type stochastic algorithm
ξ^i_n →(selection S_{n,η^N_n}) ξ̂^i_n →(mutation M_{n+1}) ξ^i_{n+1}
Acceptance/rejection-selection [geometric type clocks]:
S_{n,η^N_n}(ξ^i_n, dx) := ε_n G_n(ξ^i_n) δ_{ξ^i_n}(dx) + (1 − ε_n G_n(ξ^i_n)) ∑_{j=1}^N [ G_n(ξ^j_n) / ∑_{k=1}^N G_n(ξ^k_n) ] δ_{ξ^j_n}(dx)
Ex.: G_n = 1_A → G_n(ξ^i_n) = 1_A(ξ^i_n).
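The selection/mutation loop above, in the indicator case G_n = 1_A with ε_n = 0 (killed particles are replaced by copies of survivors), can be sketched end to end. The example — a ±1 random walk that must stay ≥ 1, with our own starting point and horizon — is a stand-in, and the exact value is recovered by dynamic programming for comparison:

```python
import random

def exact_prob(x0, n):
    """gamma_n(1) = P(X_p >= 1 for 0 <= p < n), +-1 random walk from x0,
    by propagating the sub-probability mass with G = 1_{x >= 1}."""
    dist = {x0: 1.0}
    for _ in range(n):
        dist = {x: m for x, m in dist.items() if x >= 1}   # selection G_p
        new = {}
        for x, m in dist.items():
            for y in (x - 1, x + 1):
                new[y] = new.get(y, 0.0) + 0.5 * m         # mutation M_{p+1}
        dist = new
    return sum(dist.values())

def particle_prob(x0, n, N=10_000, seed=3):
    """Genetic-type particle estimate: Z^N = prod_p eta^N_p(G_p)."""
    rng = random.Random(seed)
    xs = [x0] * N
    Z = 1.0
    for _ in range(n):
        alive = [x for x in xs if x >= 1]       # selection: G_p = 1_A
        Z *= len(alive) / N                     # success fraction eta^N_p(G_p)
        if not alive:
            return 0.0                          # whole population absorbed
        xs = [rng.choice(alive) for _ in range(N)]          # branching
        xs = [x + rng.choice((-1, 1)) for x in xs]          # mutation
    return Z

ex = exact_prob(2, 10)
est = particle_prob(2, 10)
```

The product of the per-step survival fractions is exactly the normalizing-constant estimate ∏ η^N_p(G_p) of slide 14.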
Slide 25 — Some key advantages
Mean field models = stochastic linearization/perturbation technique:
η^N_n = η^N_{n-1} K_{n,η^N_{n-1}} + (1/√N) W^N_n, with W^N_n → W_n centered Gaussian fields.
- If the limiting flow η_n = η_{n-1} K_{n,η_{n-1}} is stable → no propagation of local sampling errors = uniform control w.r.t. the time horizon.
- No burn-in, no need to study the stability of MCMC models.
- Stochastic adaptive grid approximation.
- Nonlinear system → positive/beneficial interactions.
- Simple and natural sampling algorithm.
Slide 26 — Summary
1. Introduction, motivations
2. The heuristic of particle methods
3. Particle Feynman-Kac models
4. Stochastic models and methods
5. Some convergence results: non-asymptotic theorems; unnormalized models; additive functionals
6. Some references
Slide 27 — Asymptotic theorems (CLT, LDP, MDP) for fixed (n, N); some examples:
- Empirical processes: sup_{n≥0} sup_{N≥1} √N E( ||η^N_n − η_n||^p_{F_n} ) < ∞
- Concentration inequalities, uniform w.r.t. time: sup_{n≥0} P( |η^N_n(f_n) − η_n(f_n)| > ε ) ≤ c exp( −Nε²/(2σ²) )
  [+ Guionnet (IHP 01), + Ledoux (JTP 00), + Rio (AAP 10)]
- Propagation of chaos expansions (+ Patras, Rubenthaler (AAP 09)):
P^N_{n,q} := Law(ξ^1_n, ..., ξ^q_n) = η^{⊗q}_n + (1/N) ∂¹P_{n,q} + ... + (1/N^k) ∂^k P_{n,q} + (1/N^{k+1}) ∂^{k+1} P^N_{n,q}
with sup_N ||∂^{k+1} P^N_{n,q}||_tv < ∞ and sup_{n≥0} ||∂¹P_{n,q}||_tv ≤ c q².
Slide 28 — Unnormalized models
Unbiased particle approximation measures:
γ^N_n(f_n) := η^N_n(f_n) ∏_{0≤p<n} η^N_p(G_p)
Asymptotic theory (fluctuations & deviations): + A. Guionnet (AAP 99, SPA 98), + L. Miclo (SP 00), + D. Dawson (01).
Non-asymptotic theory (bias and variance estimates):
1. Taylor type expansions (+ Patras & Rubenthaler (AAP 09)):
E( (γ^N_n)^{⊗q}(F) ) =: Q^N_{n,q}(F) = γ^{⊗q}_n(F) + ∑_{1≤k≤(q−1)(n+1)} (1/N^k) ∂^k Q_{n,q}(F)
2. Variance estimates (+ Cerou & Guyader, HAL-INRIA 08 & IHP 2010):
E( [γ^N_n(f_n) − γ_n(f_n)]² ) ≤ c (n/N) γ_n(1)²
Slide 29 — Additive functionals
F_n(x_0, ..., x_n) = (1/(n+1)) ∑_{p≤n} f_p(x_p)
Bias estimate + uniform L_p-bounds + variance of order 1/(nN):
N E( [(Q^N_n − Q_n)(F_n)]² ) ≤ c (1/n + 1/N), the 1/N term coming from the bias.
Uniform exponential concentration:
(1/N) log sup_{n≥0} P( |[Q^N_n − Q_n](F_n)| ≥ b/N + ε ) ≤ −ε²/(2b²)
Slide 30 — Summary
1. Introduction, motivations
2. The heuristic of particle methods
3. Particle Feynman-Kac models
4. Stochastic models and methods
5. Some convergence results
6. Some references
Slide 31 — Some references (+ http links)
Particle methods & Sequential Monte Carlo:
- Feynman-Kac formulae. Genealogical and interacting particle systems, Springer (2004).
- (with L. Miclo) Branching and Interacting Particle Systems Approximations of Feynman-Kac Formulae. Sém. de Proba. (2000).
- (with A. Doucet and A. Jasra) Sequential Monte Carlo Samplers. JRSS B (2006).
- (with A. Doucet) On a class of genealogical and interacting Metropolis models. Sém. de Proba. 37 (2003).
- (with F. Patras and S. Rubenthaler) Coalescent tree based functional representations for some Feynman-Kac particle models. AAP (2009).
- (with A. Doucet and S. S. Singh) A Backward Particle Interpretation of Feynman-Kac Formulae (HAL-INRIA 2009), to appear in M2AN (2010).
- (with F. Cerou and A. Guyader) A non asymptotic variance theorem for unnormalized Feynman-Kac particle models (HAL-INRIA 2008), to appear in the Annals of IHP (2010).
Slide 32 — Rare event analysis
Particle simulation of twisted measures:
- (with J. Garnier) Genealogical Particle Analysis of Rare Events. Annals of Applied Probab., 15-4 (2005).
- (with J. Garnier) Simulations of rare events in fiber optics by interacting particle systems. Optics Communications, Vol. 267 (2006).
Branching processes:
- (with P. Lezaud) Branching and interacting particle interpretations of rare event probabilities. In Stochastic Hybrid Systems: Theory and Safety Critical Applications, eds. H. Blom and J. Lygeros, Springer (2006).
- (with F. Cerou, F. Le Gland, P. Lezaud) Genealogical Models in Entrance Times Rare Event Analysis. Alea, Vol. I (2006).
Proceedings Conf. RESIM 2006:
- (with A. M. Johansen and A. Doucet) Sequential Monte Carlo Samplers for Rare Events.
- (with F. Cerou, A. Guyader, F. Le Gland, P. Lezaud and H. Topart) Some recent improvements to importance splitting.
Slide 33 — Absorption models
- (with L. Miclo) Particle Approximations of Lyapunov Exponents Connected to Schrodinger Operators and Feynman-Kac Semigroups. ESAIM Probability & Statistics, vol. 7 (2003).
- (with A. Doucet) Particle Motions in Absorbing Medium with Hard and Soft Obstacles. Stochastic Analysis and Applications, vol. 22 (2004).
Recent preprints:
- (with A. Doucet and S. S. Singh) Forward Smoothing Using Sequential Monte Carlo. Technical Report 638, Cambridge University Engineering Department (2009).
- (with A. Doucet and A. Jasra) On Adaptive Resampling Procedures for Sequential Monte Carlo Methods (HAL-INRIA 2008).
Pattern Recognition and Machine Learning Chapter 11: Sampling Methods Elise Arnaud Jakob Verbeek May 22, 2008 Outline of the chapter 11.1 Basic Sampling Algorithms 11.2 Markov Chain Monte Carlo 11.3 Gibbs
More informationThere are self-avoiding walks of steps on Z 3
There are 7 10 26 018 276 self-avoiding walks of 38 797 311 steps on Z 3 Nathan Clisby MASCOS, The University of Melbourne Institut für Theoretische Physik Universität Leipzig November 9, 2012 1 / 37 Outline
More information27 : Distributed Monte Carlo Markov Chain. 1 Recap of MCMC and Naive Parallel Gibbs Sampling
10-708: Probabilistic Graphical Models 10-708, Spring 2014 27 : Distributed Monte Carlo Markov Chain Lecturer: Eric P. Xing Scribes: Pengtao Xie, Khoa Luu In this scribe, we are going to review the Parallel
More informationApril 20th, Advanced Topics in Machine Learning California Institute of Technology. Markov Chain Monte Carlo for Machine Learning
for for Advanced Topics in California Institute of Technology April 20th, 2017 1 / 50 Table of Contents for 1 2 3 4 2 / 50 History of methods for Enrico Fermi used to calculate incredibly accurate predictions
More informationPractical unbiased Monte Carlo for Uncertainty Quantification
Practical unbiased Monte Carlo for Uncertainty Quantification Sergios Agapiou Department of Statistics, University of Warwick MiR@W day: Uncertainty in Complex Computer Models, 2nd February 2015, University
More informationCalibration of Stochastic Volatility Models using Particle Markov Chain Monte Carlo Methods
Calibration of Stochastic Volatility Models using Particle Markov Chain Monte Carlo Methods Jonas Hallgren 1 1 Department of Mathematics KTH Royal Institute of Technology Stockholm, Sweden BFS 2012 June
More informationNonparametric Drift Estimation for Stochastic Differential Equations
Nonparametric Drift Estimation for Stochastic Differential Equations Gareth Roberts 1 Department of Statistics University of Warwick Brazilian Bayesian meeting, March 2010 Joint work with O. Papaspiliopoulos,
More informationOn Reparametrization and the Gibbs Sampler
On Reparametrization and the Gibbs Sampler Jorge Carlos Román Department of Mathematics Vanderbilt University James P. Hobert Department of Statistics University of Florida March 2014 Brett Presnell Department
More informationMonte Carlo Methods. Handbook of. University ofqueensland. Thomas Taimre. Zdravko I. Botev. Dirk P. Kroese. Universite de Montreal
Handbook of Monte Carlo Methods Dirk P. Kroese University ofqueensland Thomas Taimre University ofqueensland Zdravko I. Botev Universite de Montreal A JOHN WILEY & SONS, INC., PUBLICATION Preface Acknowledgments
More informationMonte Carlo Methods. Geoff Gordon February 9, 2006
Monte Carlo Methods Geoff Gordon ggordon@cs.cmu.edu February 9, 2006 Numerical integration problem 5 4 3 f(x,y) 2 1 1 0 0.5 0 X 0.5 1 1 0.8 0.6 0.4 Y 0.2 0 0.2 0.4 0.6 0.8 1 x X f(x)dx Used for: function
More informationInteracting and Annealing Particle Filters: Mathematics and a Recipe for Applications. Christoph Schnörr Bodo Rosenhahn Hans-Peter Seidel
Interacting and Annealing Particle Filters: Mathematics and a Recipe for Applications Jürgen Gall Jürgen Potthoff Christoph Schnörr Bodo Rosenhahn Hans-Peter Seidel MPI I 2006 4 009 September 2006 Authors
More informationComputer Intensive Methods in Mathematical Statistics
Computer Intensive Methods in Mathematical Statistics Department of mathematics johawes@kth.se Lecture 16 Advanced topics in computational statistics 18 May 2017 Computer Intensive Methods (1) Plan of
More informationComputer Intensive Methods in Mathematical Statistics
Computer Intensive Methods in Mathematical Statistics Department of mathematics johawes@kth.se Lecture 7 Sequential Monte Carlo methods III 7 April 2017 Computer Intensive Methods (1) Plan of today s lecture
More informationUniversity of Toronto Department of Statistics
Norm Comparisons for Data Augmentation by James P. Hobert Department of Statistics University of Florida and Jeffrey S. Rosenthal Department of Statistics University of Toronto Technical Report No. 0704
More informationMarkov Chain Monte Carlo (MCMC)
Markov Chain Monte Carlo (MCMC Dependent Sampling Suppose we wish to sample from a density π, and we can evaluate π as a function but have no means to directly generate a sample. Rejection sampling can
More informationHigh-dimensional Problems in Finance and Economics. Thomas M. Mertens
High-dimensional Problems in Finance and Economics Thomas M. Mertens NYU Stern Risk Economics Lab April 17, 2012 1 / 78 Motivation Many problems in finance and economics are high dimensional. Dynamic Optimization:
More informationLecture Particle Filters
FMS161/MASM18 Financial Statistics November 29, 2010 Monte Carlo filters The filter recursions could only be solved for HMMs and for linear, Gaussian models. Idea: Approximate any model with a HMM. Replace
More informationBasic math for biology
Basic math for biology Lei Li Florida State University, Feb 6, 2002 The EM algorithm: setup Parametric models: {P θ }. Data: full data (Y, X); partial data Y. Missing data: X. Likelihood and maximum likelihood
More informationRare event simulation for a static distribution
Rare event simulation for a static distribution Frédéric Cérou, Pierre Del Moral, Teddy Furon, Arnaud Guyader To cite this version: Frédéric Cérou, Pierre Del Moral, Teddy Furon, Arnaud Guyader. Rare event
More informationPart 1: Expectation Propagation
Chalmers Machine Learning Summer School Approximate message passing and biomedicine Part 1: Expectation Propagation Tom Heskes Machine Learning Group, Institute for Computing and Information Sciences Radboud
More informationCondensed Table of Contents for Introduction to Stochastic Search and Optimization: Estimation, Simulation, and Control by J. C.
Condensed Table of Contents for Introduction to Stochastic Search and Optimization: Estimation, Simulation, and Control by J. C. Spall John Wiley and Sons, Inc., 2003 Preface... xiii 1. Stochastic Search
More informationMonte Carlo methods for sampling-based Stochastic Optimization
Monte Carlo methods for sampling-based Stochastic Optimization Gersende FORT LTCI CNRS & Telecom ParisTech Paris, France Joint works with B. Jourdain, T. Lelièvre, G. Stoltz from ENPC and E. Kuhn from
More informationThe Particle Filter. PD Dr. Rudolph Triebel Computer Vision Group. Machine Learning for Computer Vision
The Particle Filter Non-parametric implementation of Bayes filter Represents the belief (posterior) random state samples. by a set of This representation is approximate. Can represent distributions that
More informationAn overview of importance splitting for rare event simulation
Home Search Collections Journals About Contact us My IOPscience An overview of importance splitting for rare event simulation This article has been downloaded from IOPscience. Please scroll down to see
More informationSequential Monte Carlo samplers
J. R. Statist. Soc. B (2006) 68, Part 3, pp. 411 436 Sequential Monte Carlo samplers Pierre Del Moral, Université Nice Sophia Antipolis, France Arnaud Doucet University of British Columbia, Vancouver,
More informationIntroduction to Machine Learning CMU-10701
Introduction to Machine Learning CMU-10701 Markov Chain Monte Carlo Methods Barnabás Póczos & Aarti Singh Contents Markov Chain Monte Carlo Methods Goal & Motivation Sampling Rejection Importance Markov
More informationUPPER DEVIATIONS FOR SPLIT TIMES OF BRANCHING PROCESSES
Applied Probability Trust 7 May 22 UPPER DEVIATIONS FOR SPLIT TIMES OF BRANCHING PROCESSES HAMED AMINI, AND MARC LELARGE, ENS-INRIA Abstract Upper deviation results are obtained for the split time of a
More informationOnline appendix to On the stability of the excess sensitivity of aggregate consumption growth in the US
Online appendix to On the stability of the excess sensitivity of aggregate consumption growth in the US Gerdie Everaert 1, Lorenzo Pozzi 2, and Ruben Schoonackers 3 1 Ghent University & SHERPPA 2 Erasmus
More informationStochastic Proximal Gradient Algorithm
Stochastic Institut Mines-Télécom / Telecom ParisTech / Laboratoire Traitement et Communication de l Information Joint work with: Y. Atchade, Ann Arbor, USA, G. Fort LTCI/Télécom Paristech and the kind
More informationStochastic Simulation
Stochastic Simulation Idea: probabilities samples Get probabilities from samples: X count x 1 n 1. x k total. n k m X probability x 1. n 1 /m. x k n k /m If we could sample from a variable s (posterior)
More informationAn Introduction to Malliavin calculus and its applications
An Introduction to Malliavin calculus and its applications Lecture 3: Clark-Ocone formula David Nualart Department of Mathematics Kansas University University of Wyoming Summer School 214 David Nualart
More informationTwo viewpoints on measure valued processes
Two viewpoints on measure valued processes Olivier Hénard Université Paris-Est, Cermics Contents 1 The classical framework : from no particle to one particle 2 The lookdown framework : many particles.
More informationState-Space Methods for Inferring Spike Trains from Calcium Imaging
State-Space Methods for Inferring Spike Trains from Calcium Imaging Joshua Vogelstein Johns Hopkins April 23, 2009 Joshua Vogelstein (Johns Hopkins) State-Space Calcium Imaging April 23, 2009 1 / 78 Outline
More informationLearning Static Parameters in Stochastic Processes
Learning Static Parameters in Stochastic Processes Bharath Ramsundar December 14, 2012 1 Introduction Consider a Markovian stochastic process X T evolving (perhaps nonlinearly) over time variable T. We
More informationData assimilation as an optimal control problem and applications to UQ
Data assimilation as an optimal control problem and applications to UQ Walter Acevedo, Angwenyi David, Jana de Wiljes & Sebastian Reich Universität Potsdam/ University of Reading IPAM, November 13th 2017
More informationMonte-Carlo MMD-MA, Université Paris-Dauphine. Xiaolu Tan
Monte-Carlo MMD-MA, Université Paris-Dauphine Xiaolu Tan tan@ceremade.dauphine.fr Septembre 2015 Contents 1 Introduction 1 1.1 The principle.................................. 1 1.2 The error analysis
More information