Large Scale Bayesian Inference
Large Scale Bayesian Inference in Cosmology. Jens Jasche, Garching, 11 September 2012.
Introduction. Cosmography: 3D density and velocity fields; power spectra and bispectra; Dark Energy, Dark Matter, and gravity; cosmological parameters. Large-scale Bayesian inference is high dimensional (~10^7 parameters), state-of-the-art technology, and on the verge of numerical feasibility.
Introduction. Why do we need Bayesian inference? Inference of signals is an ill-posed problem: noise, incomplete observations, and systematics mean that no unique recovery is possible!
Introduction. What are the possible signals compatible with the observations? The object of interest is the signal posterior distribution. With it we can do science: model comparison, parameter studies, statistical summaries, and non-linear, non-Gaussian error propagation.
Markov Chain Monte Carlo. Problems: high dimensionality (~10^7 parameters), a large number of correlated parameters, no possible reduction of the problem size, and complex posterior distributions. A numerical approximation is needed; beyond a handful of dimensions, MCMC (e.g. Metropolis-Hastings) is the tool of choice.
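The Metropolis-Hastings recipe can be sketched in a few lines. This is a minimal, illustrative random-walk implementation (function names, step size, and the toy target are our own choices, not the tuned samplers discussed in the talk):

```python
import numpy as np

def metropolis_hastings(log_post, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal."""
    rng = np.random.default_rng(seed)
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    lp = log_post(x)
    samples = np.empty((n_samples, x.size))
    for i in range(n_samples):
        prop = x + step * rng.standard_normal(x.size)
        lp_prop = log_post(prop)
        # Accept with probability min(1, p(prop)/p(x)); the symmetric
        # proposal cancels in the Hastings ratio.
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples[i] = x
    return samples

# Toy usage: sample a 1D standard normal, starting away from the mode.
samples = metropolis_hastings(lambda x: -0.5 * np.sum(x**2), [3.0], 20000)
```

In ~10^7 dimensions such a plain random walk mixes far too slowly, which is precisely what motivates the Hamiltonian methods that follow.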
Hamiltonian sampling. Parameter space exploration via Hamiltonian sampling: interpret the negative log-posterior as a potential and introduce a Gaussian auxiliary momentum variable. The resulting joint posterior distribution is separable in position and momentum, and marginalization over the momenta recovers the target posterior.
Hamiltonian sampling. Idea: use Hamiltonian dynamics to explore the posterior; solve the Hamiltonian system to obtain a new sample. Because Hamiltonian dynamics conserve the Hamiltonian, the Metropolis acceptance probability is unity: ideally, all samples are accepted.
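A minimal sketch of one HMC transition (step size and trajectory length are hypothetical tuning choices). One caveat worth making explicit: a numerical integrator conserves the Hamiltonian only approximately, so in practice an accept-reject step on the energy error keeps the chain exact; with exact integration the acceptance probability would indeed be unity.

```python
import numpy as np

def hmc_step(q, log_post, grad_log_post, eps=0.1, n_leap=20, rng=None):
    """One Hamiltonian Monte Carlo transition using a leapfrog integrator."""
    if rng is None:
        rng = np.random.default_rng()
    p0 = rng.standard_normal(q.size)        # Gaussian auxiliary momentum
    q_new, p = q.copy(), p0.copy()
    # Leapfrog integration of dq/dt = p, dp/dt = -dU/dq with U = -log posterior.
    p += 0.5 * eps * grad_log_post(q_new)
    for _ in range(n_leap - 1):
        q_new += eps * p
        p += eps * grad_log_post(q_new)
    q_new += eps * p
    p += 0.5 * eps * grad_log_post(q_new)
    # Accept-reject on the change of the total energy H = U + p^2/2.
    h0 = -log_post(q) + 0.5 * p0 @ p0
    h1 = -log_post(q_new) + 0.5 * p @ p
    return q_new if np.log(rng.random()) < h0 - h1 else q

# Toy usage: standard normal target, log p = -q^2/2, gradient = -q.
rng = np.random.default_rng(1)
q = np.array([3.0])
chain = []
for _ in range(2000):
    q = hmc_step(q, lambda x: -0.5 * np.sum(x**2), lambda x: -x, rng=rng)
    chain.append(q[0])
chain = np.array(chain)
```

The gradient of the log-posterior replaces the blind random walk with guided, energy-conserving trajectories, which is what makes the method viable in very high dimensions.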
Hamiltonian sampling. Example: the Wiener posterior, the multivariate normal distribution obtained from a Gaussian prior and a Gaussian likelihood. Its equations of motion are those of a coupled harmonic oscillator.
Hamiltonian sampling. How to set the mass matrix? It constitutes a large number of tunable parameters and determines the efficiency of the sampler. The mass matrix aims at decoupling the system; in practice a diagonal approximation is used, and the quality of this approximation determines the sampler efficiency. In the non-Gaussian case, Taylor expand the potential to find a mass matrix.
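One hedged way to realize the diagonal approximation: take M_ii as the curvature of the potential at a fiducial point, here estimated by central differences of the gradient (the fiducial point, step size, and helper names are illustrative, not the talk's prescription):

```python
import numpy as np

def diagonal_mass(grad_U, s0, h=1e-5):
    """Diagonal mass matrix M_ii ~ d^2 U / d s_i^2 at a fiducial point s0."""
    n = s0.size
    M = np.empty(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        # Central difference of the i-th gradient component.
        M[i] = (grad_U(s0 + e)[i] - grad_U(s0 - e)[i]) / (2 * h)
    return M

# For a quadratic potential U = 0.5 * sum(a_i s_i^2), grad U = a*s, and the
# exact curvature (hence the ideal mass) is a_i itself.
a = np.array([1.0, 4.0, 9.0])
M = diagonal_mass(lambda s: a * s, np.zeros(3))
```

Drawing momenta from N(0, M) then rescales each direction so that stiff and soft modes evolve on comparable time scales, which is the decoupling the slide refers to.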
HMC in action: inference of non-linear density fields in cosmology. The non-linear density field gets a log-normal prior (see e.g. Coles & Jones 1991; Kayo et al. 2001); the galaxy distribution follows a Poisson likelihood with signal-dependent noise (image credit: M. Blanton and the Sloan Digital Sky Survey). The problem, non-Gaussian sampling in high dimensions, is addressed by HADES (HAmiltonian Density Estimation and Sampling; Jasche & Kitaura 2010).
LSS inference with the SDSS: application of HADES to SDSS DR7. A cubic, equidistant box with side length 750 Mpc and ~3 Mpc grid resolution yields ~10^7 volume elements / parameters. Goal: provide a representation of the SDSS density posterior, 3D cosmographic descriptions, and quantified uncertainties of the density distribution. Jasche, Kitaura, Li & Enßlin (2010).
(Figure slides: LSS inference with the SDSS.)
Multiple block sampling. What if HMC is not an option? The problem is the design of good proposal distributions; poor proposals yield high rejection rates. Multiple block sampling (see e.g. Hastings 1970) breaks the problem down into subproblems. This simplifies the design of conditional proposal distributions and raises the average acceptance rate, but it requires serial processing.
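In its simplest form the block idea is Gibbs sampling: update each block from its conditional distribution given the others. A toy two-block example on a correlated bivariate Gaussian (the target and its conditionals are our illustration, not the talk's posterior):

```python
import numpy as np

# Target: (x, y) bivariate normal, zero mean, unit variances, correlation rho.
rho = 0.8
rng = np.random.default_rng(2)
x, y = 0.0, 0.0
xs, ys = [], []
for _ in range(20000):
    # Block 1: x | y ~ N(rho*y, 1 - rho^2)
    x = rho * y + np.sqrt(1 - rho**2) * rng.standard_normal()
    # Block 2: y | x ~ N(rho*x, 1 - rho^2)
    y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal()
    xs.append(x)
    ys.append(y)
xs, ys = np.array(xs), np.array(ys)
```

Each conditional draw is always accepted, which is what lifts the average acceptance rate; the price is that block 2 must wait for block 1, i.e. serial processing.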
Multiple block sampling. Can we boost block sampling? Sometimes it is easier to explore the full joint PDF. A block sampler whose blocks are processed in parallel permits efficient sampling for numerically expensive posteriors.
Photometric redshift sampling. Photometric surveys observe millions of galaxies (~10^7-10^8) with low redshift accuracy (~100 Mpc along the line of sight). To infer accurate redshifts, sample from the joint distribution of density field and galaxy redshifts with a block sampler: process the galaxy-position blocks in parallel, and draw the density field with an HMC sampler.
Photometric redshift sampling: application to artificial photometric data with noise, systematics, and position uncertainties of ~100 Mpc. ~10^7 density amplitudes and ~2x10^7 radial galaxy positions give ~3x10^7 parameters in total.
(Figure slides: photometric redshift sampling; deviation from the truth before and after sampling; raw data versus density estimate. Jasche & Wandelt 2012.)
4D physical inference. Physical motivation: the complex final state of the cosmic matter distribution evolved from a simple initial state through gravity.
4D physical inference. The ideal scenario of evaluating the full gravitational dynamics inside the sampler would require a very, very large computer. Not practical, even with approximations!
4D physical inference: BORG (Bayesian Origin Reconstruction from Galaxies) combines HMC with second-order Lagrangian perturbation theory as the dynamical model. Jasche & Wandelt (2012).
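To give a flavor of such a dynamical model, here is a first-order (Zel'dovich) displacement in 1D, the lowest-order cousin of the second-order Lagrangian perturbation theory used in BORG; the grid, amplitude, and growth factor are illustrative, and real applications work in 3D at second order.

```python
import numpy as np

# Zel'dovich approximation in 1D: particles at Lagrangian positions q move to
# x = q + D * psi(q), with displacement psi = -d(phi)/dq and phi'' = delta.
n, L = 256, 1.0
q = np.arange(n) * L / n
k0 = 2 * np.pi * 3 / L                  # a single-mode initial density field
delta = 0.1 * np.sin(k0 * q)

# Solve for the displacement in Fourier space: psi_k = i * delta_k / k (1D).
k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
delta_k = np.fft.fft(delta)
psi_k = np.zeros_like(delta_k)
nz = k != 0
psi_k[nz] = 1j * delta_k[nz] / k[nz]
psi = np.real(np.fft.ifft(psi_k))

D = 1.0                                 # linear growth factor (illustrative)
x = q + D * psi
```

For the single sine mode the analytic answer is psi(q) = (0.1/k0) * cos(k0*q), so particles flow from underdense toward overdense regions, sharpening the initially smooth field exactly as the "simple initial state, complex final state" picture suggests.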
(Figure slides: 4D physical inference results.)
Summary & Conclusion. Large-scale Bayesian inference means inference in high dimensions from incomplete observations, subject to noise, systematic effects, survey geometry, selection effects, and biases. We need to quantify uncertainties, i.e. explore the posterior distribution, with Markov Chain Monte Carlo methods: Hamiltonian sampling (exploit symmetries, decouple the system) and multiple block sampling (break the problem down into subproblems). Three high-dimensional examples (>10^7 parameters): non-linear density inference, photometric redshift and density inference, and 4D physical inference.
The End. Thank you!
More informationActive Learning for Fitting Simulation Models to Observational Data
Active Learning for Fitting Simulation Models to Observational Data Jeff Schneider Example: Active Learning for Cosmology Model Fitting NASA / ESA 8 parameter space!" true parameters" real universe" 7
More informationGravitational Wave Astronomy s Next Frontier in Computation
Gravitational Wave Astronomy s Next Frontier in Computation Chad Hanna - Penn State University Penn State Physics Astronomy & Astrophysics Outline 1. Motivation 2. Gravitational waves. 3. Birth of gravitational
More informationIntroduction to Machine Learning
Introduction to Machine Learning Brown University CSCI 1950-F, Spring 2012 Prof. Erik Sudderth Lecture 25: Markov Chain Monte Carlo (MCMC) Course Review and Advanced Topics Many figures courtesy Kevin
More informationMarkov Chain Monte Carlo methods
Markov Chain Monte Carlo methods Tomas McKelvey and Lennart Svensson Signal Processing Group Department of Signals and Systems Chalmers University of Technology, Sweden November 26, 2012 Today s learning
More informationEco517 Fall 2013 C. Sims MCMC. October 8, 2013
Eco517 Fall 2013 C. Sims MCMC October 8, 2013 c 2013 by Christopher A. Sims. This document may be reproduced for educational and research purposes, so long as the copies contain this notice and are retained
More informationA Review of Pseudo-Marginal Markov Chain Monte Carlo
A Review of Pseudo-Marginal Markov Chain Monte Carlo Discussed by: Yizhe Zhang October 21, 2016 Outline 1 Overview 2 Paper review 3 experiment 4 conclusion Motivation & overview Notation: θ denotes the
More informationGalaxy Bias from the Bispectrum: A New Method
Galaxy Bias from the Bispectrum: A New Method Jennifer E. Pollack University of Bonn Thesis Supervisor: Cristiano Porciani Collaborator: Robert E. Smith Motivation 1) Main Objective of state-of-the-art
More informationTutorial on Probabilistic Programming with PyMC3
185.A83 Machine Learning for Health Informatics 2017S, VU, 2.0 h, 3.0 ECTS Tutorial 02-04.04.2017 Tutorial on Probabilistic Programming with PyMC3 florian.endel@tuwien.ac.at http://hci-kdd.org/machine-learning-for-health-informatics-course
More informationSequential Monte Carlo Samplers for Applications in High Dimensions
Sequential Monte Carlo Samplers for Applications in High Dimensions Alexandros Beskos National University of Singapore KAUST, 26th February 2014 Joint work with: Dan Crisan, Ajay Jasra, Nik Kantas, Alex
More informationMorphology and Topology of the Large Scale Structure of the Universe
Morphology and Topology of the Large Scale Structure of the Universe Stephen Appleby KIAS Research Fellow Collaborators Changbom Park, Juhan Kim, Sungwook Hong The 6th Survey Science Group Workshop 28th
More informationBAYESIAN METHODS FOR VARIABLE SELECTION WITH APPLICATIONS TO HIGH-DIMENSIONAL DATA
BAYESIAN METHODS FOR VARIABLE SELECTION WITH APPLICATIONS TO HIGH-DIMENSIONAL DATA Intro: Course Outline and Brief Intro to Marina Vannucci Rice University, USA PASI-CIMAT 04/28-30/2010 Marina Vannucci
More informationBayesian Inference for Discretely Sampled Diffusion Processes: A New MCMC Based Approach to Inference
Bayesian Inference for Discretely Sampled Diffusion Processes: A New MCMC Based Approach to Inference Osnat Stramer 1 and Matthew Bognar 1 Department of Statistics and Actuarial Science, University of
More informationBayesian model selection in graphs by using BDgraph package
Bayesian model selection in graphs by using BDgraph package A. Mohammadi and E. Wit March 26, 2013 MOTIVATION Flow cytometry data with 11 proteins from Sachs et al. (2005) RESULT FOR CELL SIGNALING DATA
More informationLikelihood-free MCMC
Bayesian inference for stable distributions with applications in finance Department of Mathematics University of Leicester September 2, 2011 MSc project final presentation Outline 1 2 3 4 Classical Monte
More informationBayesian Phylogenetics:
Bayesian Phylogenetics: an introduction Marc A. Suchard msuchard@ucla.edu UCLA Who is this man? How sure are you? The one true tree? Methods we ve learned so far try to find a single tree that best describes
More informationLecture 7 and 8: Markov Chain Monte Carlo
Lecture 7 and 8: Markov Chain Monte Carlo 4F13: Machine Learning Zoubin Ghahramani and Carl Edward Rasmussen Department of Engineering University of Cambridge http://mlg.eng.cam.ac.uk/teaching/4f13/ Ghahramani
More informationVCMC: Variational Consensus Monte Carlo
VCMC: Variational Consensus Monte Carlo Maxim Rabinovich, Elaine Angelino, Michael I. Jordan Berkeley Vision and Learning Center September 22, 2015 probabilistic models! sky fog bridge water grass object
More informationBayesian PalaeoClimate Reconstruction from proxies:
Bayesian PalaeoClimate Reconstruction from proxies: Framework Bayesian modelling of space-time processes General Circulation Models Space time stochastic process C = {C(x,t) = Multivariate climate at all
More informationExercises Tutorial at ICASSP 2016 Learning Nonlinear Dynamical Models Using Particle Filters
Exercises Tutorial at ICASSP 216 Learning Nonlinear Dynamical Models Using Particle Filters Andreas Svensson, Johan Dahlin and Thomas B. Schön March 18, 216 Good luck! 1 [Bootstrap particle filter for
More informationStatistical Searches in Astrophysics and Cosmology
Statistical Searches in Astrophysics and Cosmology Ofer Lahav University College London, UK Abstract We illustrate some statistical challenges in Astrophysics and Cosmology, in particular noting the application
More informationProbabilistic Graphical Models
10-708 Probabilistic Graphical Models Homework 3 (v1.1.0) Due Apr 14, 7:00 PM Rules: 1. Homework is due on the due date at 7:00 PM. The homework should be submitted via Gradescope. Solution to each problem
More informationGaussian Process Approximations of Stochastic Differential Equations
Gaussian Process Approximations of Stochastic Differential Equations Cédric Archambeau Centre for Computational Statistics and Machine Learning University College London c.archambeau@cs.ucl.ac.uk CSML
More informationLeïla Haegel University of Geneva
Picture found at: http://www.ps.uci.edu/~tomba/sk/tscan/pictures.html Leïla Haegel University of Geneva Seminar at the University of Sheffield April 27th, 2017 1 Neutrinos are: the only particle known
More informationA tool to test galaxy formation theories. Joe Wolf (UC Irvine)
A tool to test galaxy formation theories Joe Wolf (UC Irvine) SF09 Cosmology Summer Workshop July 7 2009 Team Irvine: Greg Martinez James Bullock Manoj Kaplinghat Frank Avedo KIPAC: Louie Strigari Haverford:
More informationF denotes cumulative density. denotes probability density function; (.)
BAYESIAN ANALYSIS: FOREWORDS Notation. System means the real thing and a model is an assumed mathematical form for the system.. he probability model class M contains the set of the all admissible models
More informationUncertainty quantification for inverse problems with a weak wave-equation constraint
Uncertainty quantification for inverse problems with a weak wave-equation constraint Zhilong Fang*, Curt Da Silva*, Rachel Kuske** and Felix J. Herrmann* *Seismic Laboratory for Imaging and Modeling (SLIM),
More informationST 740: Markov Chain Monte Carlo
ST 740: Markov Chain Monte Carlo Alyson Wilson Department of Statistics North Carolina State University October 14, 2012 A. Wilson (NCSU Stsatistics) MCMC October 14, 2012 1 / 20 Convergence Diagnostics:
More informationTime Delay Estimation: Microlensing
Time Delay Estimation: Microlensing Hyungsuk Tak Stat310 15 Sep 2015 Joint work with Kaisey Mandel, David A. van Dyk, Vinay L. Kashyap, Xiao-Li Meng, and Aneta Siemiginowska Introduction Image Credit:
More informationBayesian Methods and Uncertainty Quantification for Nonlinear Inverse Problems
Bayesian Methods and Uncertainty Quantification for Nonlinear Inverse Problems John Bardsley, University of Montana Collaborators: H. Haario, J. Kaipio, M. Laine, Y. Marzouk, A. Seppänen, A. Solonen, Z.
More informationRisk Estimation and Uncertainty Quantification by Markov Chain Monte Carlo Methods
Risk Estimation and Uncertainty Quantification by Markov Chain Monte Carlo Methods Konstantin Zuev Institute for Risk and Uncertainty University of Liverpool http://www.liv.ac.uk/risk-and-uncertainty/staff/k-zuev/
More informationCSC 2541: Bayesian Methods for Machine Learning
CSC 2541: Bayesian Methods for Machine Learning Radford M. Neal, University of Toronto, 2011 Lecture 10 Alternatives to Monte Carlo Computation Since about 1990, Markov chain Monte Carlo has been the dominant
More informationAn introduction to Bayesian reasoning in particle physics
An introduction to Bayesian reasoning in particle physics Graduiertenkolleg seminar, May 15th 2013 Overview Overview A concrete example Scientific reasoning Probability and the Bayesian interpretation
More informationBayesian Inference in Astronomy & Astrophysics A Short Course
Bayesian Inference in Astronomy & Astrophysics A Short Course Tom Loredo Dept. of Astronomy, Cornell University p.1/37 Five Lectures Overview of Bayesian Inference From Gaussians to Periodograms Learning
More informationSupplementary Note on Bayesian analysis
Supplementary Note on Bayesian analysis Structured variability of muscle activations supports the minimal intervention principle of motor control Francisco J. Valero-Cuevas 1,2,3, Madhusudhan Venkadesan
More informationUncertainty quantification for Wavefield Reconstruction Inversion
Uncertainty quantification for Wavefield Reconstruction Inversion Zhilong Fang *, Chia Ying Lee, Curt Da Silva *, Felix J. Herrmann *, and Rachel Kuske * Seismic Laboratory for Imaging and Modeling (SLIM),
More informationMarkov chain Monte Carlo methods in atmospheric remote sensing
1 / 45 Markov chain Monte Carlo methods in atmospheric remote sensing Johanna Tamminen johanna.tamminen@fmi.fi ESA Summer School on Earth System Monitoring and Modeling July 3 Aug 11, 212, Frascati July,
More informationWavelets Applied to CMB Analysis
Wavelets Applied to CMB Analysis CMB non-gaussianity Constraining the Primordial Power Spectrum IPAM, Nov 2004 Pia Mukherjee, Astronomy Centre, University of Sussex The Universe.. is a perturbed Robertson
More informationWALLABY. Jeremy Mould WALLABY Science Meeting December 2, 2010
WALLABY Jeremy Mould WALLABY Science Meeting December 2, 2010 Jeremy Mould Seminar: Monash University School of Physics October 7, 2010 Dark Matter Detected in regions where M/L > 100 Outer parts of spiral
More informationThe University of Auckland Applied Mathematics Bayesian Methods for Inverse Problems : why and how Colin Fox Tiangang Cui, Mike O Sullivan (Auckland),
The University of Auckland Applied Mathematics Bayesian Methods for Inverse Problems : why and how Colin Fox Tiangang Cui, Mike O Sullivan (Auckland), Geoff Nicholls (Statistics, Oxford) fox@math.auckland.ac.nz
More informationNumerical Analysis for Statisticians
Kenneth Lange Numerical Analysis for Statisticians Springer Contents Preface v 1 Recurrence Relations 1 1.1 Introduction 1 1.2 Binomial CoefRcients 1 1.3 Number of Partitions of a Set 2 1.4 Horner's Method
More informationMarkov Chain Monte Carlo Using the Ratio-of-Uniforms Transformation. Luke Tierney Department of Statistics & Actuarial Science University of Iowa
Markov Chain Monte Carlo Using the Ratio-of-Uniforms Transformation Luke Tierney Department of Statistics & Actuarial Science University of Iowa Basic Ratio of Uniforms Method Introduced by Kinderman and
More informationBayesian networks: approximate inference
Bayesian networks: approximate inference Machine Intelligence Thomas D. Nielsen September 2008 Approximative inference September 2008 1 / 25 Motivation Because of the (worst-case) intractability of exact
More information