Parameter Selection, Model Calibration, and Uncertainty Propagation for Physical Models
1 Parameter Selection, Model Calibration, and Uncertainty Propagation for Physical Models
Ralph C. Smith, Department of Mathematics, North Carolina State University

Experimental Setup

Heat Model:
$$\frac{d^2 T_s}{dx^2} = \frac{2(a+b)}{ab}\,\frac{h}{k}\,[T_s(x) - T_{\text{amb}}]$$
$$\frac{dT_s}{dx}(0) = -\frac{\Phi}{k}, \qquad \frac{dT_s}{dx}(L) = \frac{h}{k}\,[T_{\text{amb}} - T_s(L)]$$

2 Experimental Setup and Data: Heat Model Example

[Figure: aluminum rod data, temperature (°C) vs. location (cm)]

Steady State Model:
$$\frac{d^2 T_s}{dx^2} = \frac{2(a+b)}{ab}\,\frac{h}{k}\,[T_s(x) - T_{\text{amb}}]$$
$$\frac{dT_s}{dx}(0) = -\frac{\Phi}{k}, \qquad \frac{dT_s}{dx}(L) = \frac{h}{k}\,[T_{\text{amb}} - T_s(L)]$$

Objectives: Employ Bayesian analysis for
- Model calibration
- Uncertainty propagation
- Experimental design

Note: The parameter set q = [h, k, Φ] is not identifiable.
3 Statistical Inference

Goal: The goal in statistical inference is to make conclusions about a phenomenon based on observed data.

Frequentist: Observations made in the past are analyzed with a specified model, and the result is regarded as confidence about the state of the real world. Probabilities are defined as the frequencies with which an event occurs if the experiment is repeated many times.
- Parameter estimation relies on estimators derived from different data sets and a specific sampling distribution.
- Parameters may be unknown, but they are fixed and deterministic.

Bayesian: The interpretation of probability is subjective and can be updated with new data.
- Parameter estimation: parameters are considered to be random variables having associated densities.
4 Bayesian Model Calibration

Bayes' Theorem:
$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}$$

Example: Coin flip. Encode each outcome as
$$\upsilon_i(\omega) = \begin{cases} 0, & \omega = T \\ 1, & \omega = H \end{cases}$$

Likelihood:
$$\pi(\upsilon \mid q) = \prod_{i=1}^{N} q^{\upsilon_i}(1-q)^{1-\upsilon_i} = q^{N_1}(1-q)^{N_0}$$
where N_1 and N_0 are the numbers of heads and tails and N = N_0 + N_1.

Bayesian Model Calibration: Parameters are assumed to be random variables, with posterior
$$\pi(q \mid \upsilon) = \frac{\pi(\upsilon \mid q)\,\pi_0(q)}{\int_{\mathbb{R}^p} \pi(\upsilon \mid q)\,\pi_0(q)\,dq}$$

Posterior with noninformative prior π_0(q) = 1:
$$\pi(q \mid \upsilon) = \frac{q^{N_1}(1-q)^{N_0}}{\int_0^1 q^{N_1}(1-q)^{N_0}\,dq} = \frac{(N+1)!}{N_0!\,N_1!}\,q^{N_1}(1-q)^{N_0}$$
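As a check on the closed form above, a short Python sketch (the flip counts are invented) evaluates the posterior q^{N_1}(1-q)^{N_0} with its (N+1)!/(N_0! N_1!) normalization and verifies that it integrates to one:

```python
from math import factorial

def coin_posterior(n1, n0):
    """Posterior density of the heads probability q given N1 heads and
    N0 tails under the flat prior pi_0(q) = 1: a Beta(N1+1, N0+1) density."""
    n = n1 + n0
    norm = factorial(n + 1) / (factorial(n1) * factorial(n0))
    return lambda q: norm * q**n1 * (1 - q)**n0

post = coin_posterior(7, 3)                    # hypothetical data: 7 heads, 3 tails
m = 20000                                      # midpoint-rule check that the
total = sum(post((i + 0.5) / m) for i in range(m)) / m   # density integrates to ~1
```

The density peaks at the empirical frequency N_1/N = 0.7, as expected for a flat prior.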
5 Bayesian Model Calibration: Parameters are considered to be random variables with associated densities:
$$\pi(q \mid \upsilon) = \frac{\pi(\upsilon \mid q)\,\pi_0(q)}{\int_{\mathbb{R}^p} \pi(\upsilon \mid q)\,\pi_0(q)\,dq}$$

Problem: This often requires high-dimensional integration; e.g.,
- p = 18 for the MFC model
- p = thousands to millions for some models

Strategies:
- Sampling methods
- Sparse grid quadrature techniques
6 Markov Chain Techniques

Markov Chain: A sequence of events in which the current state depends only on the previous value.

Baseball: The states are S = {win, lose}, with initial state p_0 = [0.8, 0.2]. Assume that a team which won its last game has a 70% chance of winning and a 30% chance of losing its next game, and that a losing team wins 40% and loses 60% of its next games. This gives the transition matrix
$$P = \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix}$$

The percentage of teams that win/lose their next game is
$$p_1 = [0.8, 0.2]\,P = [0.64, 0.36]$$

Question: Does the following limit exist?
$$\lim_{n \to \infty} p_n = \lim_{n \to \infty} [0.8, 0.2]\,P^n$$
7 Markov Chain Techniques

Baseball Example: Solve the constrained relation
$$\pi = \pi P, \qquad \sum_i \pi_i = 1$$
to obtain the stationary distribution
$$\pi = [\pi_{\text{win}}, \pi_{\text{lose}}] = [0.5714, 0.4286], \qquad \pi_{\text{win}} + \pi_{\text{lose}} = 1$$
8 Markov Chain Techniques

Baseball Example: Solve the constrained relation π = πP, Σ_i π_i = 1 to obtain [π_win, π_lose] = [0.5714, 0.4286].

Alternative: Iterate p_n = p_{n-1} P to compute the solution:

 n   p_n                  n   p_n                  n    p_n
 0   [0.8000, 0.2000]     4   [0.5733, 0.4267]     8    [0.5714, 0.4286]
 1   [0.6400, 0.3600]     5   [0.5720, 0.4280]     9    [0.5714, 0.4286]
 2   [0.5920, 0.4080]     6   [0.5716, 0.4284]     10   [0.5714, 0.4286]
 3   [0.5776, 0.4224]     7   [0.5715, 0.4285]

Notes:
- This forms the basis for Markov Chain Monte Carlo (MCMC) techniques.
- Goal: construct chains whose stationary distribution is the posterior density.
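The iteration in the table is just repeated multiplication by the transition matrix; a minimal Python sketch reproduces the limit [0.5714, 0.4286] = [4/7, 3/7]:

```python
# Power iteration for the baseball chain: p_n = p_{n-1} P.
P = [[0.7, 0.3],
     [0.4, 0.6]]
p = [0.8, 0.2]
for n in range(50):
    p = [p[0] * P[0][0] + p[1] * P[1][0],
         p[0] * P[0][1] + p[1] * P[1][1]]
print([round(x, 4) for x in p])   # -> [0.5714, 0.4286]
```

Because the second-largest eigenvalue of P is 0.3, the error shrinks by a factor of 0.3 per step, which is why the table converges in under ten iterations.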
9 Markov Chain Monte Carlo Methods

Strategy: Markov chain simulation is used when it is impossible, or computationally prohibitive, to sample q directly from
$$\pi(q \mid \upsilon) = \frac{\pi(\upsilon \mid q)\,\pi_0(q)}{\int_{\mathbb{R}^p} \pi(\upsilon \mid q)\,\pi_0(q)\,dq}$$

Note: The idea is to create a Markov process whose stationary distribution is π(q|υ). In Markov chain theory, we are given a Markov chain P and we construct its equilibrium distribution. In MCMC theory, we are given a distribution and we want to construct a Markov chain that is reversible with respect to it.
10 Model Calibration Problem

Assumption: Assume that the measurement errors are iid with ε_i ~ N(0, σ²).

Likelihood:
$$\pi(\upsilon \mid q) = L(q, \sigma \mid \upsilon) = \frac{1}{(2\pi\sigma^2)^{n/2}}\,e^{-SS_q/2\sigma^2}$$
where
$$SS_q = \sum_{i=1}^{n} [\upsilon_i - f_i(q)]^2$$
is the sum of squares error.
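In computations one typically works with the logarithm of this likelihood, so that SS_q enters directly and no large exponentials are evaluated; a minimal Python sketch (the residuals and σ² are made-up values):

```python
import math

def gaussian_log_likelihood(residuals, sigma2):
    """log pi(v|q) = -(n/2) log(2 pi sigma^2) - SS_q / (2 sigma^2),
    where the residuals are v_i - f_i(q)."""
    n = len(residuals)
    ss_q = sum(r * r for r in residuals)
    return -0.5 * n * math.log(2 * math.pi * sigma2) - ss_q / (2 * sigma2)

ll = gaussian_log_likelihood([0.1, -0.2, 0.05], 0.04)   # invented residuals
```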
11 Markov Chain Monte Carlo Methods

General Strategy:
- Current value: X_{k-1} = q^{k-1}
- Propose a candidate q* ~ J(q* | q^{k-1}) from the proposal (jumping) distribution
- With probability α(q*, q^{k-1}), accept q*; i.e., X_k = q*
- Otherwise, stay where you are: X_k = q^{k-1}

Intuition: Recall that
$$\pi(q \mid \upsilon) = \frac{\pi(\upsilon \mid q)\,\pi_0(q)}{\int_{\mathbb{R}^p} \pi(\upsilon \mid q)\,\pi_0(q)\,dq}, \qquad \pi(\upsilon \mid q) = \frac{1}{(2\pi\sigma^2)^{n/2}}\,e^{-\sum_{i=1}^n [\upsilon_i - f_i(q)]^2/2\sigma^2} = \frac{1}{(2\pi\sigma^2)^{n/2}}\,e^{-SS_q/2\sigma^2}$$

[Figure: likelihood π(υ|q) and sum of squares SS_q plotted against q, with the candidate q* and current value q^{k-1} marked]
12 Markov Chain Monte Carlo Methods

Intuition: [Figure: likelihood π(υ|q) and SS_q versus q, with q* and q^{k-1} marked] Consider the ratio
$$r(q^* \mid q^{k-1}) = \frac{\pi(q^* \mid \upsilon)}{\pi(q^{k-1} \mid \upsilon)} = \frac{\pi(\upsilon \mid q^*)\,\pi_0(q^*)}{\pi(\upsilon \mid q^{k-1})\,\pi_0(q^{k-1})}$$

- If r < 1, so that π(υ|q*) < π(υ|q^{k-1}), accept q* with probability α = r.
- If r ≥ 1, accept q* with probability α = 1.

Note: A narrower proposal distribution yields a higher probability of acceptance.
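The accept/reject rule α = min(1, r) is usually implemented with log densities, which avoids under- and overflow when the ratio is extreme; a short Python sketch of just this step (not the toolbox's implementation):

```python
import math, random

def metropolis_accept(log_post_cand, log_post_curr, rng):
    """Accept the candidate with probability alpha = min(1, r), where
    r = pi(q*|v) / pi(q^{k-1}|v); the normalizing integral cancels in r."""
    log_alpha = min(0.0, log_post_cand - log_post_curr)
    return math.log(rng.random()) < log_alpha

rng = random.Random(0)
always = metropolis_accept(-1.0, -100.0, rng)   # r >> 1: certain acceptance
rarely = metropolis_accept(-100.0, -1.0, rng)   # r = e^-99: essentially certain rejection
```

Note that only the unnormalized posterior is needed, since the denominator of Bayes' rule cancels in r.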
13 Markov Chain Monte Carlo Methods

Note: A narrower proposal distribution yields a higher probability of acceptance.

[Figures: two proposal distributions J(q* | q^{k-1}) and the resulting chains (parameter value vs. chain iteration), illustrating how accepted candidates move the chain while rejected candidates repeat the current value]
14 Proposal Distribution

Proposal Distribution: Significantly affects mixing.
- Too wide: too many points are rejected and the chain stays still for long periods.
- Too narrow: the acceptance ratio is high, but the algorithm is slow to explore the parameter space.
- Ideally, it should have a shape similar to the posterior distribution.

[Figure: (a) anisotropic posterior in (q_1, q_2) with an isotropic proposal J(q* | q^{k-1}); (b) proposal shaped to the posterior]

Problem: With an anisotropic posterior and an isotropic proposal, efficiency is nonuniform across the parameters.
Result: A proposal shaped to the posterior recovers the efficiency of the univariate case.
15 Proposal Distribution

Proposal Distribution: Two basic approaches:
- Choose a fixed proposal function (independent Metropolis)
- Random walk (local Metropolis): q* = q^{k-1} + Rz, z ~ N(0, I_p)

Two (of several) choices:
(i) V = cI, so that q* ~ N(q^{k-1}, cI)
(ii) R = chol(V), so that q* ~ N(q^{k-1}, V), where
$$V = \sigma^2_{OLS}\,[\mathcal{X}^T(q_{OLS})\,\mathcal{X}(q_{OLS})]^{-1}, \qquad \sigma^2_{OLS} = \frac{1}{n-p}\sum_{i=1}^{n}[\upsilon_i - f_i(q_{OLS})]^2$$
and the sensitivity matrix is
$$\mathcal{X}_{ik}(q_{OLS}) = \frac{\partial f_i(q_{OLS})}{\partial q_k}$$
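Choice (ii) can be illustrated on a model whose sensitivities are available in closed form; the sketch below (data values invented) builds V = σ²_OLS [𝒳ᵀ𝒳]⁻¹ for the linear model f_i(q) = q_1 + q_2 x_i, whose sensitivity rows are simply [1, x_i]:

```python
# OLS-based proposal covariance for f_i(q) = q_1 + q_2 x_i.
# Invented data; sensitivities are df/dq1 = 1, df/dq2 = x_i.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
v  = [1.1, 2.9, 5.2, 7.1, 8.8]
n, p = len(xs), 2

# Normal equations (X^T X) q = X^T v, solved with the explicit 2x2 inverse.
sx, sxx = sum(xs), sum(x * x for x in xs)
sv, sxv = sum(v), sum(x * y for x, y in zip(xs, v))
det = n * sxx - sx * sx
inv = [[sxx / det, -sx / det], [-sx / det, n / det]]      # [X^T X]^{-1}
q_ols = [inv[0][0] * sv + inv[0][1] * sxv,
         inv[1][0] * sv + inv[1][1] * sxv]

ss = sum((y - (q_ols[0] + q_ols[1] * x)) ** 2 for x, y in zip(xs, v))
s2 = ss / (n - p)                                         # sigma^2_OLS
V = [[s2 * e for e in row] for row in inv]                # proposal covariance
```

For a nonlinear f the sensitivity matrix would instead be evaluated analytically or by finite differences at q_OLS.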
16 Random Walk Metropolis Algorithm for Parameter Estimation

1. Set the number of chain elements M and the design parameters n_s, σ_s²
2. Determine q⁰ = arg min_q Σ_{i=1}^N [υ_i − f_i(q)]²
3. Set SS_{q⁰} = Σ_{i=1}^N [υ_i − f_i(q⁰)]²
4. Compute the initial variance estimate s₀² = SS_{q⁰}/(n − p)
5. Construct the covariance estimate V = s₀² [𝒳^T(q⁰) 𝒳(q⁰)]^{-1} and R = chol(V)
6. For k = 1, …, M
   (a) Sample z_k ~ N(0, I)
   (b) Construct the candidate q* = q^{k−1} + R z_k
   (c) Sample u_α ~ U(0, 1)
   (d) Compute SS_{q*} = Σ_{i=1}^N [υ_i − f_i(q*)]²
   (e) Compute α(q* | q^{k−1}) = min(1, e^{−[SS_{q*} − SS_{q^{k−1}}]/2s²_{k−1}})
   (f) If u_α < α, set q^k = q*, SS_{q^k} = SS_{q*}; else set q^k = q^{k−1}, SS_{q^k} = SS_{q^{k−1}}
   (g) Update s_k² ~ Inv-Gamma(a_val, b_val), where a_val = 0.5(n_s + n), b_val = 0.5(n_s σ_s² + SS_{q^k})

[Figure: resulting chain, parameter value vs. chain iteration]
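Steps 6(a)–(f) can be sketched in Python. This is a simplified illustration, not the posted course code: the observation variance s² is held fixed rather than Gibbs-updated as in step (g), and the scalar model and data are invented:

```python
import math, random

def random_walk_metropolis(f, v, q0, R, M, s2, seed=1):
    """Steps 6(a)-(f): propose q* = q^{k-1} + R z, accept with
    probability min(1, exp(-[SS_q* - SS_q]/(2 s^2)))."""
    rng = random.Random(seed)
    def ss(q):
        return sum((vi - fi) ** 2 for vi, fi in zip(v, f(q)))
    p = len(q0)
    q, ss_q = list(q0), ss(q0)
    chain = [list(q0)]
    for _ in range(M):
        z = [rng.gauss(0.0, 1.0) for _ in range(p)]                    # (a)
        cand = [q[i] + sum(R[i][j] * z[j] for j in range(p))
                for i in range(p)]                                     # (b)
        ss_cand = ss(cand)                                             # (d)
        log_alpha = min(0.0, -(ss_cand - ss_q) / (2.0 * s2))           # (e)
        if math.log(rng.random()) < log_alpha:                         # (c), (f)
            q, ss_q = cand, ss_cand
        chain.append(list(q))
    return chain

# Invented scalar example: f_i(q) = q x_i with observations generated near q = 2.
xs = [1.0, 2.0, 3.0, 4.0]
obs = [2.1, 3.9, 6.2, 7.8]
chain = random_walk_metropolis(lambda q: [q[0] * x for x in xs],
                               obs, [0.0], [[0.1]], 5000, s2=0.01)
post_mean = sum(c[0] for c in chain[1000:]) / len(chain[1000:])
```

For this linear model the posterior is Gaussian with mean Συ_i x_i / Σx_i² = 1.99, so the chain mean after burn-in should land close to that value.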
17 Delayed Rejection Adaptive Metropolis (DRAM)

Adaptive Metropolis: Update the chain covariance matrix as chain values are accepted. Diminishing adaptation and bounded convergence are required, since the process is no longer a Markov chain.

Employ the recursive relations
$$V_k = s_p\,\text{cov}(q^0, q^1, \dots, q^{k-1}) + \varepsilon I_p$$
$$\bar q^k = \frac{1}{k+1}\sum_{i=0}^{k} q^i = \frac{k}{k+1}\,\bar q^{k-1} + \frac{1}{k+1}\,q^k$$
$$V_{k+1} = \frac{k-1}{k}\,V_k + \frac{s_p}{k}\left[ k\,\bar q^{k-1}(\bar q^{k-1})^T - (k+1)\,\bar q^k(\bar q^k)^T + q^k(q^k)^T + \varepsilon I_p \right]$$
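The recursions can be checked numerically against a direct sample-covariance computation; the sketch below uses s_p = 1, ε = 0, and invented chain values:

```python
# Numerical check of the recursive mean/covariance relations above
# (s_p = 1, epsilon = 0; the chain values are invented).
qs = [[0.5, 1.0], [0.7, 0.9], [0.4, 1.2], [0.6, 1.1], [0.55, 0.95]]
p = 2

def outer(a, b):
    return [[ai * bj for bj in b] for ai in a]

def combo(A, B, ca, cb):
    """Entrywise ca*A + cb*B for p x p matrices."""
    return [[ca * x + cb * y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

qbar = list(qs[0])                     # qbar^0 = q^0
V = [[0.0] * p for _ in range(p)]
for k in range(1, len(qs)):
    qk = qs[k]
    qprev = list(qbar)
    # qbar^k = (k/(k+1)) qbar^{k-1} + (1/(k+1)) q^k
    qbar = [(k * a + b) / (k + 1) for a, b in zip(qprev, qk)]
    # V_{k+1} = ((k-1)/k) V_k
    #         + (1/k)[k qbar^{k-1}(qbar^{k-1})^T - (k+1) qbar^k(qbar^k)^T + q^k(q^k)^T]
    bracket = combo(combo(outer(qprev, qprev), outer(qbar, qbar), k, -(k + 1)),
                    outer(qk, qk), 1.0, 1.0)
    V = combo(V, bracket, (k - 1) / k, 1.0 / k)
```

After the loop, qbar equals the mean of all five points and V equals their unbiased sample covariance, confirming that the recursion updates the covariance at O(p²) cost per step instead of recomputing it from the whole chain.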
18 Chain Convergence (Burn-In)

Techniques:
- Visually check chains
- Statistical tests (often abused in the literature)

[Figures: parameter value vs. chain iteration for a chain that has not converged, and for a chain for a nonidentifiable parameter]
19 Delayed Rejection Adaptive Metropolis (DRAM)

Websites: Examples on using the toolbox for some statistical problems.
20 Delayed Rejection Adaptive Metropolis (DRAM)

We fit the Monod model
$$y = \frac{\theta_1 x}{\theta_2 + x} + \varepsilon, \qquad \varepsilon \sim N(0, I\sigma^2)$$
to observations of x (mg/L COD) and y (1/h).

First clear some variables from possible previous runs:

    clear data model options

Next, create a data structure for the observations and control variables. Typically one makes a structure data that contains fields xdata and ydata:

    data.xdata = [ ]';   % x (mg / L COD)
    data.ydata = [ ]';   % y (1 / h)

Construct the model:

    modelfun = @(x,theta) theta(1)*x./(theta(2)+x);
    ssfun = @(theta,data) sum((data.ydata-modelfun(data.xdata,theta)).^2);
    model.ssfun = ssfun;
    model.sigma2 = 0.01^2;
21 Delayed Rejection Adaptive Metropolis (DRAM)

Input the parameters

    params = {
        {'theta1', tmin(1), 0}
        {'theta2', tmin(2), 0}
        };

and set the options:

    options.nsimu = 4000;
    options.updatesigma = 1;
    options.qcov = tcov;

Run the code:

    [res,chain,s2chain] = mcmcrun(model,data,params,options);
22 Delayed Rejection Adaptive Metropolis (DRAM)

Plot the results:

    figure(2); clf
    mcmcplot(chain,[],res,'chainpanel');
    figure(3); clf
    mcmcplot(chain,[],res,'pairs');

Examples: Several are available in MCMC_EXAMPLES; an ODE solver is illustrated in the algae example.
23 Delayed Rejection Adaptive Metropolis (DRAM)

Construct credible and prediction intervals:

    figure(5); clf
    out = mcmcpred(res,chain,[],x,modelfun);
    mcmcpredplot(out);
    hold on
    plot(data.xdata,data.ydata,'s'); % add data points to the plot
    xlabel('x [mg/L COD]'); ylabel('y [1/h]');
    hold off
    title('Predictive envelopes of the model')
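The MATLAB snippets above rely on the mcmcstat toolbox. As a rough cross-check, the same workflow can be prototyped in a few dozen lines of Python; the sketch below uses synthetic data with assumed true parameters θ = (0.15, 50) and a plain random-walk Metropolis sampler rather than DRAM:

```python
import math, random

rng = random.Random(42)
theta_true = (0.15, 50.0)                       # assumed values for synthetic data
xdata = [20.0 * i for i in range(1, 11)]        # invented x (mg/L COD)
ydata = [theta_true[0] * x / (theta_true[1] + x) + rng.gauss(0.0, 0.005)
         for x in xdata]                        # y (1/h) with N(0, 0.005^2) noise

def modelfun(x, theta):
    """Monod model y = theta1 * x / (theta2 + x)."""
    return theta[0] * x / (theta[1] + x)

def ssfun(theta):
    return sum((y - modelfun(x, theta)) ** 2 for x, y in zip(xdata, ydata))

sigma2 = 0.005 ** 2
theta = [0.1, 30.0]
ss = ssfun(theta)
chain = []
for _ in range(20000):
    cand = [theta[0] + rng.gauss(0.0, 0.005), theta[1] + rng.gauss(0.0, 3.0)]
    if cand[0] <= 0.0 or cand[1] <= 0.0:        # lower bounds of 0, as in params
        chain.append(list(theta))
        continue
    ss_cand = ssfun(cand)
    if math.log(rng.random()) < min(0.0, -(ss_cand - ss) / (2.0 * sigma2)):
        theta, ss = cand, ss_cand
    chain.append(list(theta))

burned = chain[5000:]
theta1_mean = sum(c[0] for c in burned) / len(burned)
theta2_mean = sum(c[1] for c in burned) / len(burned)
```

Posterior means computed from the post-burn-in chain should land near the assumed true values; in a real study one would also inspect chain panels and pairwise plots, as mcmcplot does above.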
24 DRAM for Heat Example

Steady State Model:
$$\frac{d^2 T_s}{dx^2} = \frac{2(a+b)}{ab}\,\frac{h}{k}\,[T_s(x) - T_{\text{amb}}]$$
$$\frac{dT_s}{dx}(0) = -\frac{\Phi}{k}, \qquad \frac{dT_s}{dx}(L) = \frac{h}{k}\,[T_{\text{amb}} - T_s(L)]$$

Note: The parameter set q = [Φ, h, k] is not identifiable.

[Figure: aluminum rod data, temperature (°C) vs. location (cm)]
25 DRAM for Heat Model with 3 Parameters: Results

Notes:
- The parameter set q = [Φ, h, k] is not identifiable.
- cond(V) ≈ 1e+35
- The data and model are not informing the priors.
26 Assignment

2 Parameter Heat Model:
$$\frac{d^2 T_s}{dx^2} = \frac{2(a+b)}{ab}\,\frac{h}{k}\,[T_s(x) - T_{\text{amb}}]$$
$$\frac{dT_s}{dx}(0) = -\frac{\Phi}{k}, \qquad \frac{dT_s}{dx}(L) = \frac{h}{k}\,[T_{\text{amb}} - T_s(L)]$$

Notes:
- Set k = 2.37 W/cm·°C, which is the physical value for aluminum.
- The parameter set q = [h, Φ] is now identifiable.

Assignment:
- Modify the posted 3-parameter code for the 2-parameter model. How do your chains and results compare?
- Consider various chain lengths to establish burn-in.
More informationThe Jackknife-Like Method for Assessing Uncertainty of Point Estimates for Bayesian Estimation in a Finite Gaussian Mixture Model
Thai Journal of Mathematics : 45 58 Special Issue: Annual Meeting in Mathematics 207 http://thaijmath.in.cmu.ac.th ISSN 686-0209 The Jackknife-Like Method for Assessing Uncertainty of Point Estimates for
More informationStatistical Methods in Particle Physics Lecture 1: Bayesian methods
Statistical Methods in Particle Physics Lecture 1: Bayesian methods SUSSP65 St Andrews 16 29 August 2009 Glen Cowan Physics Department Royal Holloway, University of London g.cowan@rhul.ac.uk www.pp.rhul.ac.uk/~cowan
More informationRisk Estimation and Uncertainty Quantification by Markov Chain Monte Carlo Methods
Risk Estimation and Uncertainty Quantification by Markov Chain Monte Carlo Methods Konstantin Zuev Institute for Risk and Uncertainty University of Liverpool http://www.liv.ac.uk/risk-and-uncertainty/staff/k-zuev/
More informationForward Problems and their Inverse Solutions
Forward Problems and their Inverse Solutions Sarah Zedler 1,2 1 King Abdullah University of Science and Technology 2 University of Texas at Austin February, 2013 Outline 1 Forward Problem Example Weather
More informationBayesian statistics. DS GA 1002 Statistical and Mathematical Models. Carlos Fernandez-Granda
Bayesian statistics DS GA 1002 Statistical and Mathematical Models http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall15 Carlos Fernandez-Granda Frequentist vs Bayesian statistics In frequentist statistics
More informationIntroduction to Machine Learning CMU-10701
Introduction to Machine Learning CMU-10701 Markov Chain Monte Carlo Methods Barnabás Póczos & Aarti Singh Contents Markov Chain Monte Carlo Methods Goal & Motivation Sampling Rejection Importance Markov
More informationPhylogenetics: Bayesian Phylogenetic Analysis. COMP Spring 2015 Luay Nakhleh, Rice University
Phylogenetics: Bayesian Phylogenetic Analysis COMP 571 - Spring 2015 Luay Nakhleh, Rice University Bayes Rule P(X = x Y = y) = P(X = x, Y = y) P(Y = y) = P(X = x)p(y = y X = x) P x P(X = x 0 )P(Y = y X
More informationSession 3A: Markov chain Monte Carlo (MCMC)
Session 3A: Markov chain Monte Carlo (MCMC) John Geweke Bayesian Econometrics and its Applications August 15, 2012 ohn Geweke Bayesian Econometrics and its Session Applications 3A: Markov () chain Monte
More informationPhysics 403. Segev BenZvi. Numerical Methods, Maximum Likelihood, and Least Squares. Department of Physics and Astronomy University of Rochester
Physics 403 Numerical Methods, Maximum Likelihood, and Least Squares Segev BenZvi Department of Physics and Astronomy University of Rochester Table of Contents 1 Review of Last Class Quadratic Approximation
More informationSpatially Adaptive Smoothing Splines
Spatially Adaptive Smoothing Splines Paul Speckman University of Missouri-Columbia speckman@statmissouriedu September 11, 23 Banff 9/7/3 Ordinary Simple Spline Smoothing Observe y i = f(t i ) + ε i, =
More informationCS242: Probabilistic Graphical Models Lecture 7B: Markov Chain Monte Carlo & Gibbs Sampling
CS242: Probabilistic Graphical Models Lecture 7B: Markov Chain Monte Carlo & Gibbs Sampling Professor Erik Sudderth Brown University Computer Science October 27, 2016 Some figures and materials courtesy
More informationBayesian Analysis. Bayesian Analysis: Bayesian methods concern one s belief about θ. [Current Belief (Posterior)] (Prior Belief) x (Data) Outline
Bayesian Analysis DuBois Bowman, Ph.D. Gordana Derado, M. S. Shuo Chen, M. S. Department of Biostatistics and Bioinformatics Center for Biomedical Imaging Statistics Emory University Outline I. Introduction
More informationLearning the hyper-parameters. Luca Martino
Learning the hyper-parameters Luca Martino 2017 2017 1 / 28 Parameters and hyper-parameters 1. All the described methods depend on some choice of hyper-parameters... 2. For instance, do you recall λ (bandwidth
More information