Introduction to Hamiltonian Monte Carlo Method
1 Introduction to Hamiltonian Monte Carlo Method
Mingwei Tang, Department of Statistics, University of Washington
November 14
2 Hamiltonian System
Notation: $q \in \mathbb{R}^d$: position vector, $p \in \mathbb{R}^d$: momentum vector; Hamiltonian $H(p, q): \mathbb{R}^{2d} \to \mathbb{R}$.
Evolution equations of the Hamiltonian system:
$$\frac{dq}{dt} = \frac{\partial H}{\partial p}, \qquad \frac{dp}{dt} = -\frac{\partial H}{\partial q} \tag{1}$$
3 Potential and Kinetic
Decompose the Hamiltonian: $H(p, q) = U(q) + K(p)$.
$U(q)$: potential energy, depends on position.
$K(p)$: kinetic energy, depends on momentum.
Motivating example: free fall.
$U(q) = mgq$, $K(p) = \frac{1}{2} m v^2 = \frac{p^2}{2m}$, so $H(p, q) = mgq + \frac{p^2}{2m}$ is the total energy.
Velocity: $v = \frac{dq}{dt} = \frac{\partial H}{\partial p} = p/m$.
Force: $F = \frac{dp}{dt} = -\frac{\partial H}{\partial q} = -mg$.
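A quick numerical check of the free-fall example, using the closed-form solution of Hamilton's equations; the mass, gravity, and initial condition values below are made up for illustration. The printed Hamiltonian stays constant along the trajectory, illustrating the conservation property discussed on the next slide.

```python
import numpy as np

# Free fall: U(q) = m*g*q, K(p) = p^2 / (2m).  The exact solution of Hamilton's
# equations is q(t) = q0 + (p0/m) t - g t^2 / 2 and p(t) = p0 - m g t.
m, g = 1.0, 9.8
q0, p0 = 10.0, 0.0                     # made-up initial height and momentum

def H(p, q):
    """Total energy of the falling particle."""
    return m * g * q + p ** 2 / (2 * m)

for t in np.linspace(0.0, 1.0, 5):
    q = q0 + (p0 / m) * t - 0.5 * g * t ** 2
    p = p0 - m * g * t
    print(f"t={t:.2f}  q={q:7.3f}  p={p:7.3f}  H={H(p, q):.6f}")  # H is constant
```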
4 Properties of the Hamiltonian system
1. Reversibility: the mapping $T_s: (q(t), p(t)) \mapsto (q(t+s), p(t+s))$ is one-to-one. Its inverse $T_{-s}$ is obtained by negating $p$, applying $T_s$, then negating $p$ again.
2. Conservation (the Hamiltonian is invariant):
$$\frac{dH}{dt} = \frac{dq}{dt}\frac{\partial H}{\partial q} + \frac{dp}{dt}\frac{\partial H}{\partial p} = \frac{\partial H}{\partial p}\frac{\partial H}{\partial q} - \frac{\partial H}{\partial q}\frac{\partial H}{\partial p} = 0,$$
so $H(p, q)$ is constant over time $t$.
3. Volume preservation: the map $T_s$ preserves volume in $(p, q)$ space; for small $\delta$, the Jacobian $\det\!\left(\frac{\partial T_\delta}{\partial (p, q)}\right) \approx 1$.
5 Idea of HMC
$D$: observed data, $q$: parameters (latent variables), $\pi(q)$: prior distribution.
Likelihood function $L(D \mid q)$; posterior distribution $\Pr(q \mid D) \propto L(D \mid q)\,\pi(q)$.
Position = parameters; potential $U(q)$ = negative log-posterior: $U(q) = -\log[L(D \mid q)\pi(q)]$.
Introduce an ancillary (momentum) variable $p$ for the kinetic energy,
$$K(p) = \sum_{i=1}^{d} \frac{p_i^2}{2 m_i} = -\log \mathcal{N}(p; 0, M) + \text{const},$$
with $p$ and $q$ independent.
Hamiltonian: $H(p, q) = U(q) + K(p)$.
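To make the correspondence concrete, here is a minimal sketch for a hypothetical toy model (a normal likelihood with a normal prior); the data values, prior scale, and mass value are invented for illustration and are not from the slides.

```python
import numpy as np

# Hypothetical toy model: observations y_i ~ N(q, 1) with prior q ~ N(0, 10^2).
y = np.array([1.2, 0.7, -0.3, 2.1])   # made-up observations
prior_sd = 10.0

def U(q):
    """Potential energy: negative log-posterior, up to an additive constant."""
    log_lik = -0.5 * np.sum((y - q) ** 2)
    log_prior = -0.5 * (q / prior_sd) ** 2
    return -(log_lik + log_prior)

M = 1.0  # "mass", i.e. the variance of the Gaussian momentum

def K(p):
    """Kinetic energy: negative log of a N(0, M) density, up to a constant."""
    return 0.5 * p ** 2 / M

def H(p, q):
    """Total energy of the fictitious physical system."""
    return U(q) + K(p)
```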
6 Idea of HMC: Cont.
Now that $U(q)$ and $K(p)$ are defined, relate them to a distribution.
Canonical distribution:
$$\Pr(p, q) = \frac{1}{Z} \exp\!\big(-H(p, q)/T\big) = \frac{1}{Z} \exp\!\big(-U(q)/T\big)\,\exp\!\big(-K(p)/T\big), \tag{2}$$
where $T$ is the temperature and $Z$ the normalizing constant.
Usually we set $T = 1$, so $\Pr(p, q) \propto$ (posterior distribution of $q$) $\times$ (multivariate Gaussian for $p$).
Goal: sample $(p, q)$ jointly from the canonical distribution.
7 Ideal HMC
Specify a variance matrix $M$ and a time $s > 0$. For $i = 1, \ldots, N$:
1. Sample $p^{(i)}$ from $\mathcal{N}(0, M)$.
2. Starting from the current $(p^{(i)}, q^{(i-1)})$, integrate the Hamiltonian system for time $s$: $(p^*, q^*) \leftarrow T_s(p^{(i)}, q^{(i-1)})$ (leaves $H(\cdot, \cdot)$ invariant).
3. Set $q^{(i)} \leftarrow q^*$, $p^{(i)} \leftarrow p^*$.
Output $q^{(1)}, \ldots, q^{(N)}$ as posterior samples.
Problem: the Hamiltonian system may not have a closed-form solution, so a numerical method for the ODE system is needed.
8 Numerical ODE integrator
Target problem:
$$\frac{dq}{dt} = \frac{\partial H}{\partial p} = M^{-1} p, \qquad \frac{dp}{dt} = -\frac{\partial H}{\partial q} = \nabla_q \log\!\big[L(D \mid q)\pi(q)\big].$$
Leap-frog method, for a small step size $\epsilon > 0$:
$$p(t + \epsilon/2) = p(t) - \frac{\epsilon}{2}\, \frac{\partial U}{\partial q}\big(q(t)\big),$$
$$q(t + \epsilon) = q(t) + \epsilon\, M^{-1} p(t + \epsilon/2),$$
$$p(t + \epsilon) = p(t + \epsilon/2) - \frac{\epsilon}{2}\, \frac{\partial U}{\partial q}\big(q(t + \epsilon)\big).$$
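The update above translates almost line for line into code. Below is a minimal sketch of a leap-frog routine that runs L steps of size ε; `grad_U`, the mass `M`, and all names are placeholders assumed for illustration.

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, L, M=1.0):
    """Run L leap-frog steps of size eps from (q, p) and return the endpoint.

    grad_U(q) must return dU/dq; M plays the role of a (diagonal) mass matrix,
    here a scalar or an array broadcastable against p.
    """
    q = np.array(q, dtype=float)
    p = np.array(p, dtype=float)
    p = p - 0.5 * eps * grad_U(q)      # initial half step for momentum
    for i in range(L):
        q = q + eps * p / M            # full step for position
        if i < L - 1:
            p = p - eps * grad_U(q)    # full step for momentum (all but last)
    p = p - 0.5 * eps * grad_U(q)      # final half step for momentum
    return q, p

# Example usage with a standard-normal potential U(q) = q^2 / 2:
# q_new, p_new = leapfrog(q=1.0, p=0.5, grad_U=lambda q: q, eps=0.1, L=20)
```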
9 Numerical stability for Hamiltonian system
10 Properties of leap-frog
Time reversibility: integrating $n$ steps forward and then $n$ steps backward returns to the same starting position.
Symplectic property: conserves a (slightly modified) energy.
11 Ideal HMC: review
Specify a variance matrix $M$ and a time $s > 0$. For $i = 1, \ldots, N$:
1. Sample $p^{(i)}$ from $\mathcal{N}(0, M)$.
2. Starting from the current $(p^{(i)}, q^{(i-1)})$, integrate the Hamiltonian system for time $s$: $(p^*, q^*) \leftarrow T_s(p^{(i)}, q^{(i-1)})$.
3. Set $q^{(i)} \leftarrow q^*$, $p^{(i)} \leftarrow p^*$.
Output $q^{(1)}, \ldots, q^{(N)}$ as posterior samples.
The numerical method does not leave $H(p, q)$ unchanged during integration: $H(p^*, q^*) \neq H(p^{(i)}, q^{(i-1)})$. This needs to be corrected.
12 HMC in practice
Specify a variance matrix $M$, a step size $\epsilon > 0$, and $L$: the number of leap-frog steps. For $i = 1, \ldots, N$:
1. Sample $p^{(i)}$ from $\mathcal{N}(0, M)$.
2. Starting from the current $(p^{(i)}, q^{(i-1)})$, run the leap-frog integrator: $(p^*, q^*) \leftarrow \mathrm{Leapfrog}(p^{(i)}, q^{(i-1)}, \epsilon, L)$.
3. Metropolis-Hastings correction: with probability
$$\alpha = \min\left\{1, \frac{\Pr(p^*, q^*)}{\Pr(p^{(i)}, q^{(i-1)})}\right\}$$
set $q^{(i)} \leftarrow q^*$, $p^{(i)} \leftarrow p^*$; otherwise keep $q^{(i)} \leftarrow q^{(i-1)}$. (This leaves the canonical distribution invariant.)
Output $q^{(1)}, \ldots, q^{(N)}$ as posterior samples.
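Combining momentum resampling, the leap-frog integrator, and the Metropolis-Hastings correction gives the sketch below, written for a toy one-dimensional standard-normal target ($U(q) = q^2/2$). It is an illustrative implementation of the algorithm on this slide, not Stan's; all names and default values are assumptions for the example.

```python
import numpy as np

def U(q):      return 0.5 * q ** 2     # toy potential: standard normal target
def grad_U(q): return q

def hmc(n_samples, q0, eps, L, M=1.0, seed=0):
    """Basic HMC with a Metropolis-Hastings correction (scalar q for simplicity)."""
    rng = np.random.default_rng(seed)
    q = q0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        p = rng.normal(0.0, np.sqrt(M))               # 1. resample momentum
        q_star, p_star = q, p                         # 2. leap-frog trajectory
        p_star = p_star - 0.5 * eps * grad_U(q_star)
        for step in range(L):
            q_star = q_star + eps * p_star / M
            if step < L - 1:
                p_star = p_star - eps * grad_U(q_star)
        p_star = p_star - 0.5 * eps * grad_U(q_star)
        # 3. accept or reject based on the change in total energy H = U + K
        H_old = U(q) + 0.5 * p ** 2 / M
        H_new = U(q_star) + 0.5 * p_star ** 2 / M
        if H_old - H_new > 0 or rng.random() < np.exp(H_old - H_new):
            q = q_star                                # accept; otherwise keep q
        samples[i] = q
    return samples

# e.g. draws = hmc(n_samples=2000, q0=0.0, eps=0.2, L=20)
```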
13 Comparison with random-walk Metropolis-Hastings
HMC: proposal based on Hamiltonian dynamics, not a random walk.
Random-walk Metropolis-Hastings (RWMH) needs more steps to get an independent sample.
Optimal acceptance rate: HMC (65%), RWMH (23%).
Computation ($d$ = dimension): number of iterations per independent sample: HMC $O(d^{1/4})$ vs. RWMH $O(d)$; total computation: HMC $O(d^{5/4})$ vs. RWMH $O(d^{2})$.
See (Roberts et al. 2001) and (Neal 2011) for more details.
14 Tuning parameters
Step size $\epsilon$:
Large $\epsilon$: low acceptance rate.
Small $\epsilon$: wasted computation; random-walk behavior if the trajectory length $\epsilon L$ is too small.
Different regions might need different $\epsilon$, e.g. choose $\epsilon$ at random.
Number of leap-frog steps $L$:
Trajectory length is crucial for exploring the state space systematically.
The target may be much more constrained in some directions than in others.
U-turns appear in long trajectories.
15 NUTS
Solution: the No-U-Turn Sampler (NUTS) (Hoffman et al. 2014).
Adaptive way to select the number of leap-frog steps $L$.
Adaptive way to select the step size $\epsilon$.
The exact algorithm behind Stan!
16 NUTS: selecting L
Criterion for a U-turn:
$$\frac{d}{dt}\, \frac{\|q_t - q_0\|^2}{2} = (q_t - q_0)^T p_t < 0. \tag{3}$$
Start from $(p^{(i)}, q^{(i-1)})$:
1. Run leap-frog steps until (3) happens; this yields a candidate set $B$ of $(p, q)$ pairs.
2. Select a subset $C \subseteq B$ that satisfies the detailed-balance condition.
3. Randomly select $q^{(i)}$ from $C$.
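The U-turn criterion (3) itself is a one-line check; the sketch below shows just that check, not the recursive tree-building that the full NUTS algorithm wraps around it. The function name is illustrative.

```python
import numpy as np

def u_turn(q0, q_t, p_t):
    """Criterion (3): the trajectory has started to move back toward its start
    when (q_t - q_0)^T p_t < 0."""
    return float(np.dot(q_t - q0, p_t)) < 0.0
```

In the full sampler this test is applied between the endpoints of a trajectory that is repeatedly doubled, rather than naively after every single leap-frog step.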
17 Selecting the step size $\epsilon$
Warm-up phase of $M_{\text{adapt}}$ iterations.
Let $H_t$ be the acceptance probability at the $t$-th iteration, e.g.
$$H_t = \min\left\{1, \frac{\Pr(p^*, q^*)}{\Pr(p^{(t)}, q^{(t-1)})}\right\},$$
and let $h(\epsilon) = \mathbb{E}_t[H_t \mid \epsilon]$.
One step of dual averaging per iteration to solve $h(\epsilon) = \delta$, where $\delta$ is the optimal acceptance rate (for HMC, $\delta = 0.65$).
The final $\epsilon$ is fixed after $M_{\text{adapt}}$ iterations.
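A rough sketch of the dual-averaging recursion that drives the average acceptance probability toward δ. The update equations and the constants (γ = 0.05, t₀ = 10, κ = 0.75, μ = log(10 ε₀)) follow the defaults reported in Hoffman and Gelman (2014); treat the exact values, and the function name, as assumptions for illustration.

```python
import numpy as np

def adapt_step_size(alpha_history, eps0, delta=0.65,
                    gamma=0.05, t0=10.0, kappa=0.75):
    """Dual-averaging sketch: given the acceptance probabilities observed during
    warm-up (alpha_history), return the adapted step size."""
    mu = np.log(10.0 * eps0)      # shrink log(eps) toward log(10 * eps0)
    h_bar = 0.0                   # running average of (delta - alpha_t)
    log_eps_bar = np.log(eps0)    # averaged iterate (the value we keep)
    for t, alpha in enumerate(alpha_history, start=1):
        eta = 1.0 / (t + t0)
        h_bar = (1.0 - eta) * h_bar + eta * (delta - alpha)
        log_eps = mu - np.sqrt(t) / gamma * h_bar
        w = t ** (-kappa)
        log_eps_bar = w * log_eps + (1.0 - w) * log_eps_bar
    return np.exp(log_eps_bar)
```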
18 Summary
HMC: an MCMC algorithm that makes use of Hamiltonian dynamics.
Parameters as position, negative log-posterior as potential energy.
Propose new states based on Hamiltonian dynamics.
Leap-frog for numerical simulation; sensitive to tuning.
NUTS: HMC with adaptive tuning of $(L, \epsilon)$ for more efficient proposals.
$L$: avoid U-turns.
$\epsilon$: dual-averaging optimization to bring the acceptance rate close to the optimum.
19 Implement your own HMC
Review the Hamiltonian dynamics:
$$\frac{dq}{dt} = \frac{\partial H}{\partial p} = M^{-1} p, \qquad \frac{dp}{dt} = -\frac{\partial H}{\partial q} = \nabla_q \log\!\big[L(D \mid q)\pi(q)\big].$$
Need the gradient $\nabla_q \log[L(D \mid q)\pi(q)] = \nabla_q \log L(D \mid q) + \nabla_q \log \pi(q)$.
Stan: automatic gradient calculation.
Gradient: Stan can also do gradient-based optimization (quasi-Newton method, L-BFGS).
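For implementing this gradient by hand, here is a small sketch for a hypothetical normal model with a normal prior (the same flavor of toy example used earlier); the data and hyperparameters are made up, and in Stan this gradient would instead come from automatic differentiation.

```python
import numpy as np

# Hypothetical model: y_i ~ N(q, sigma^2) with sigma known, prior q ~ N(0, tau^2).
y = np.array([0.5, 1.3, -0.2, 0.9])   # made-up data
sigma, tau = 1.0, 5.0

def grad_log_posterior(q):
    """grad_q log L(D|q) + grad_q log pi(q) for the toy model above."""
    grad_log_lik = np.sum(y - q) / sigma ** 2
    grad_log_prior = -q / tau ** 2
    return grad_log_lik + grad_log_prior

# The potential-energy gradient used by HMC is the negative of this:
grad_U = lambda q: -grad_log_posterior(q)
```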