Multifidelity Approaches to Approximate Bayesian Computation

Thomas P. Prescott, Wolfson Centre for Mathematical Biology, University of Oxford. Banff International Research Station, 11th–16th November 2018.

Outline
Approximate Bayesian computation
Multifidelity methods
Multifidelity modelling example: Repressilator
Multifidelity ABC
Early accept/reject multifidelity ABC
Performance analysis
Repressilator revisited
Implementation and future extensions

Section: Approximate Bayesian computation

Bayesian parameter inference. A real-world system is defined by uncertain parameters $\theta \sim \pi(\cdot)$, drawn from a prior distribution. Summary statistics $y_{\mathrm{obs}}$ are gathered from observed data. A model $p(\cdot \mid \theta)$ is a map from $\theta$ to a distribution on an output space containing $y_{\mathrm{obs}}$; the likelihood of the observed data is $p(y_{\mathrm{obs}} \mid \theta)$. A simulation of the model is a draw $y \sim p(\cdot \mid \theta)$ from this distribution, and the cost of a simulation is the time taken to generate $y$. "Model" here refers to the combination of the mathematical model and the numerical implementation used to generate $y$.

Bayesian parameter inference. The posterior distribution for the parameters is
$$p(\theta \mid y_{\mathrm{obs}}) = \frac{p(y_{\mathrm{obs}} \mid \theta)\,\pi(\theta)}{\int p(y_{\mathrm{obs}} \mid \theta)\,\pi(\theta)\,\mathrm{d}\theta}.$$
We assume that the likelihood $p(y_{\mathrm{obs}} \mid \theta)$ is unavailable.

Approximate Bayesian computation. Attempt 1: approximate the likelihood $p(y_{\mathrm{obs}} \mid \theta)$ by Monte Carlo simulation. Simulate $y \sim p(\cdot \mid \theta)$ and set $\hat{p}(y_{\mathrm{obs}} \mid \theta) = \mathbb{I}(y = y_{\mathrm{obs}})$, so that $\mathbb{E}(\hat{p}(y_{\mathrm{obs}} \mid \theta)) = p(y_{\mathrm{obs}} \mid \theta)$. This is impractical in a large output space.

Approximate Bayesian computation. Attempt 2: consider a neighbourhood $\Omega(\epsilon) = \{y : d(y, y_{\mathrm{obs}}) < \epsilon\}$ around $y_{\mathrm{obs}}$, where $d(y, y_{\mathrm{obs}})$ is a distance on the output space and $\epsilon$ is a threshold distance. The ABC approximation to the likelihood is, for $y \sim p(\cdot \mid \theta)$,
$$p(y_{\mathrm{obs}} \mid \theta) \approx p(y \in \Omega(\epsilon) \mid \theta) =: p_{\mathrm{ABC}}(y_{\mathrm{obs}} \mid \theta).$$
This requires $\lim_{\epsilon \to 0} p(y \in \Omega(\epsilon) \mid \theta) = p(y_{\mathrm{obs}} \mid \theta)$.

ABC: Monte Carlo rejection sampling. For $i = 1, \ldots, N$: select $\theta_i \sim \pi(\cdot)$ from the prior and simulate $y_i \sim p(\cdot \mid \theta_i)$. If $y_i \in \Omega(\epsilon)$ lies in the $\epsilon$-neighbourhood of $y_{\mathrm{obs}}$, accept $\theta_i$ into the sample; otherwise, reject. Given $\theta_i$, the accept/reject decision is a random weight $w(\theta_i) = \mathbb{I}(y_i \in \Omega(\epsilon))$ with expectation $\mathbb{E}(w(\theta_i)) = p_{\mathrm{ABC}}(y_{\mathrm{obs}} \mid \theta_i) \approx p(y_{\mathrm{obs}} \mid \theta_i)$.
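As a concrete illustration, the following Python sketch implements the rejection sampler above. The callables prior_sample, simulate, and distance are hypothetical stand-ins for a user's prior sampler, model simulator, and output-space distance; they are not code from the talk.

```python
import numpy as np

def abc_rejection(prior_sample, simulate, distance, y_obs, eps, N, rng=None):
    """ABC rejection sampler: a minimal sketch, assuming user-supplied
    callables for the prior, the simulator, and the distance."""
    rng = rng or np.random.default_rng()
    accepted = []
    for _ in range(N):
        theta = prior_sample(rng)      # theta_i ~ pi(.)
        y = simulate(theta, rng)       # y_i ~ p(. | theta_i)
        if distance(y, y_obs) < eps:   # is y_i in Omega(eps)?
            accepted.append(theta)     # weight w(theta_i) = 1
    return accepted                    # rejected draws carry weight 0
```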

Monte Carlo ABC: computational bottleneck. We must simulate $y_i \sim p(\cdot \mid \theta_i)$ for $i = 1, \ldots, N$, with $N$ large; we aim to reduce this computational burden. Efficiently exploring parameter space can reduce $N$: Markov chain Monte Carlo (MCMC); sequential Monte Carlo (SMC). Our goal here is different: reduce the simulation cost within the rejection sampling framework.

Section: Multifidelity methods

Alternative models. The original model $p(\cdot \mid \theta)$ is a distribution on the output space containing $y_{\mathrm{obs}}$. Alternative summary statistics $\tilde{y}_{\mathrm{obs}}$ induce a new output space. Consider a new model $\tilde{p}(\cdot \mid \theta)$, with a new cost $\tilde{c}$ of simulating $\tilde{y} \sim \tilde{p}(\cdot \mid \theta)$. We assume the new model is cheaper: $\mathbb{E}(\tilde{c}) \ll \mathbb{E}(c)$.

Cheaper simulations: early stopping (simulate to $t < T$); coarser discretisations of space/time; reduced model dimension; Langevin, mean-field, or deterministic approximations; surrogate models and regressions; steady-state analysis. Note: we are assuming that $\theta$ well-defines both $p$ and $\tilde{p}$.

Approximate Bayesian computation using the alternative model $\tilde{p}$: consider a neighbourhood $\tilde{\Omega}(\tilde{\epsilon}) = \{\tilde{y} : \tilde{d}(\tilde{y}, \tilde{y}_{\mathrm{obs}}) < \tilde{\epsilon}\}$ around $\tilde{y}_{\mathrm{obs}}$, with a distance $\tilde{d}(\tilde{y}, \tilde{y}_{\mathrm{obs}})$ on the new output space and a threshold distance $\tilde{\epsilon}$. The ABC approximation to a different likelihood is, for $\tilde{y} \sim \tilde{p}(\cdot \mid \theta)$,
$$\tilde{p}(\tilde{y}_{\mathrm{obs}} \mid \theta) \approx \tilde{p}_{\mathrm{ABC}}(\tilde{y}_{\mathrm{obs}} \mid \theta) = \tilde{p}(\tilde{y} \in \tilde{\Omega}(\tilde{\epsilon}) \mid \theta).$$
This requires $\lim_{\tilde{\epsilon} \to 0} \tilde{p}(\tilde{y} \in \tilde{\Omega}(\tilde{\epsilon}) \mid \theta) = \tilde{p}(\tilde{y}_{\mathrm{obs}} \mid \theta)$.

ABC: Monte Carlo rejection sampling with the alternative model. For $i = 1, \ldots, N$: select $\theta_i \sim \pi(\cdot)$ from the prior and simulate $\tilde{y}_i \sim \tilde{p}(\cdot \mid \theta_i)$. If $\tilde{y}_i \in \tilde{\Omega}(\tilde{\epsilon})$ lies in the $\tilde{\epsilon}$-neighbourhood of $\tilde{y}_{\mathrm{obs}}$, accept $\theta_i$ into the sample; otherwise, reject. Given $\theta_i$, the accept/reject decision is a random weight $\tilde{w}(\theta_i) = \mathbb{I}(\tilde{y}_i \in \tilde{\Omega}(\tilde{\epsilon}))$ with expectation $\mathbb{E}(\tilde{w}(\theta_i)) = \tilde{p}_{\mathrm{ABC}}(\tilde{y}_{\mathrm{obs}} \mid \theta_i) \approx \tilde{p}(\tilde{y}_{\mathrm{obs}} \mid \theta_i)$.

Multifidelity terminology. High-fidelity model $p(\cdot \mid \theta)$: HF simulation $y \sim p(\cdot \mid \theta)$ with cost $c$; HF accept/reject weight $w(\theta) = \mathbb{I}(y \in \Omega(\epsilon))$. Low-fidelity model $\tilde{p}(\cdot \mid \theta)$: LF simulation $\tilde{y} \sim \tilde{p}(\cdot \mid \theta)$ with cost $\tilde{c}$; LF accept/reject weight $\tilde{w}(\theta) = \mathbb{I}(\tilde{y} \in \tilde{\Omega}(\tilde{\epsilon}))$.

Example: the Repressilator (Elowitz & Leibler, Nature, 2000). Three genes $G_1$, $G_2$, and $G_3$ are transcribed and translated into proteins $P_1$, $P_2$, and $P_3$. $P_1$ represses $G_2$ transcription; $P_2$ represses $G_3$ transcription; $P_3$ represses $G_1$ transcription. Goal: identify $n$ and $K_h$ in the repression function $f(p) = K_h^n / (K_h^n + p^n)$.

Multifidelity models. [Figure: simulated mRNA and protein molecule counts over time for the three species.] Simulate the network with: high fidelity, the Stochastic Simulation Algorithm (Gillespie); low fidelity, a tau-leap discretisation.
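To make the two fidelities concrete, here is a minimal Python sketch of an exact Gillespie simulation and its tau-leap approximation for a toy birth-death process; the reaction network and rates are illustrative assumptions, not the repressilator model from the talk.

```python
import numpy as np

# Toy network: production (rate k) and degradation (rate g * x) of one species.
def gillespie(x0, k, g, t_end, rng):
    """High fidelity: exact Stochastic Simulation Algorithm."""
    t, x = 0.0, x0
    while True:
        rates = np.array([k, g * x])
        total = rates.sum()                    # total event rate (k > 0 assumed)
        t += rng.exponential(1.0 / total)      # time to next reaction
        if t >= t_end:
            return x
        x += 1 if rng.random() < rates[0] / total else -1

def tau_leap(x0, k, g, t_end, tau, rng):
    """Low fidelity: freeze rates over steps of length tau and fire
    Poisson numbers of reactions; cheaper, but biased for large tau."""
    x = x0
    for _ in range(int(t_end / tau)):
        x += rng.poisson(k * tau) - rng.poisson(g * max(x, 0) * tau)
    return max(x, 0)                           # crude guard: leaping can go negative
```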

Low-fidelity rejection sampling is biased. [Figure: scatter of high-fidelity against low-fidelity distances, with points classified as matching estimator values, false positives, or false negatives by the thresholds.] The distance $d(y, y_{\mathrm{obs}})$ varies with $\tilde{d}(\tilde{y}, \tilde{y}_{\mathrm{obs}})$ (left) and with $n$ (right). Each point ($N = 10^4$) corresponds to a parameter sampled from a uniform prior. The observed data $y_{\mathrm{obs}} = \tilde{y}_{\mathrm{obs}}$ is synthetic data generated using $n = 2$.

Low-fidelity rejection sampling is biased. High fidelity: $\mathbb{E}(w(\theta)) = p(y \in \Omega(\epsilon) \mid \theta) \approx p(y_{\mathrm{obs}} \mid \theta)$. Low fidelity: $\mathbb{E}(\tilde{w}(\theta)) = \tilde{p}(\tilde{y} \in \tilde{\Omega}(\tilde{\epsilon}) \mid \theta) \approx \tilde{p}(\tilde{y}_{\mathrm{obs}} \mid \theta) \neq p(y_{\mathrm{obs}} \mid \theta)$. Can we combine low- and high-fidelity models to reduce simulation costs, but without introducing bias?

Section: Multifidelity ABC

Early rejection. First idea: use the low-fidelity simulation to decide when it is worth simulating the high-fidelity model. Simulate $\tilde{y}_i \sim \tilde{p}(\cdot \mid \theta_i)$ and define a continuation probability $\alpha(\tilde{y}_i) \in (0, 1]$. With probability $1 - \alpha$: reject without simulating $y_i$. Else, simulate $y_i$: if $y_i \in \Omega(\epsilon)$ then accept (with an appropriate weight); else, reject.

Early rejection. Formulate this as an early rejection weight:
$$w_{\mathrm{er}}(\theta) = \frac{\mathbb{I}(U < \alpha(\tilde{y}))}{\alpha(\tilde{y})}\, w(\theta)$$
for a unit uniform random variable $U \sim \mathrm{U}(0, 1)$. Then $w_{\mathrm{er}}(\theta)$ is an unbiased estimator of $p(y \in \Omega(\epsilon) \mid \theta)$, and we avoid some high-fidelity simulations.
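A minimal Python sketch of this weight follows; the simulator and indicator callables and the continuation-probability function alpha are assumed user-supplied, in the same hypothetical style as the earlier snippet.

```python
def early_rejection_weight(theta, simulate_lf, simulate_hf, alpha,
                           in_omega_hf, rng):
    """Early rejection weight w_er: a sketch under assumed callables."""
    y_lf = simulate_lf(theta, rng)          # cheap simulation, always run
    a = alpha(y_lf)                         # continuation probability in (0, 1]
    if rng.random() >= a:
        return 0.0                          # early rejection: skip the HF model
    y_hf = simulate_hf(theta, rng)          # expensive simulation
    w = 1.0 if in_omega_hf(y_hf) else 0.0   # w(theta) = I(y in Omega(eps))
    return w / a                            # reweighting keeps the estimator unbiased
```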

Early rejection.
$$w_{\mathrm{er}}(\theta) = \frac{\mathbb{I}(U < \alpha(\tilde{y}))}{\alpha(\tilde{y})}\, w(\theta)$$
Choosing $\alpha(\tilde{y})$: make $\alpha(\tilde{y})$ small when $\tilde{y}$ suggests that $y \notin \Omega(\epsilon)$ is likely; the converse requires a larger $\alpha(\tilde{y})$ when $y \in \Omega(\epsilon)$ is more likely. See also: Lazy ABC (Prangle 2016) and Delayed Acceptance ABC (Everitt & Rowinska 2017).

Re-examine the converse.
$$w_{\mathrm{er}}(\theta) = \frac{\mathbb{I}(U < \alpha(\tilde{y}))}{\alpha(\tilde{y})}\, w(\theta).$$
We take $\alpha(\tilde{y})$ larger when $y \in \Omega(\epsilon)$ is more likely, so we are more likely to simulate the high-fidelity model precisely when the low-fidelity simulation is close to the data. Why not accept early, without simulating? Early acceptance requires a positive weight without a high-fidelity simulation, which is not possible using $w_{\mathrm{er}}$.

Early decision. Second idea: sometimes make an early accept/reject decision based on the low-fidelity simulation alone. Simulate $\tilde{y}_i \sim \tilde{p}(\cdot \mid \theta_i)$ and make an early decision $\tilde{w}$. Define a probability $\eta \in (0, 1]$. With probability $1 - \eta$: use the early decision. Else, simulate $y_i$ and correct the early decision, based on $w$, with an appropriately weighted correction.

Early decision. Formulate this as an early decision weight:
$$w_{\mathrm{ed}}(\theta) = \tilde{w}(\theta) + \frac{\mathbb{I}(U < \eta)}{\eta}\,\left[w(\theta) - \tilde{w}(\theta)\right]$$
for a unit uniform random variable $U \sim \mathrm{U}(0, 1)$. Then $w_{\mathrm{ed}}(\theta)$ is an unbiased estimator of $p(y \in \Omega(\epsilon) \mid \theta)$, and we avoid some high-fidelity simulations.
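The corresponding sketch for the early decision weight, under the same assumed user-supplied callables:

```python
def early_decision_weight(theta, simulate_lf, simulate_hf, eta,
                          in_omega_lf, in_omega_hf, rng):
    """Early decision weight w_ed: with probability 1 - eta the low-fidelity
    decision stands; otherwise the high-fidelity model corrects it."""
    y_lf = simulate_lf(theta, rng)
    w_lf = 1.0 if in_omega_lf(y_lf) else 0.0    # early decision ~w(theta)
    if rng.random() >= eta:
        return w_lf                             # keep the early decision
    y_hf = simulate_hf(theta, rng)
    w_hf = 1.0 if in_omega_hf(y_hf) else 0.0    # true decision w(theta)
    return w_lf + (w_hf - w_lf) / eta           # unbiased correction
```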

Combining existing approaches. Early rejection uses $\tilde{y}$ to determine when to generate $y$:
$$w_{\mathrm{er}} = \frac{\mathbb{I}(U < \alpha(\tilde{y}))}{\alpha(\tilde{y})}\, \mathbb{I}(y \in \Omega(\epsilon)).$$
Early decision uses $\tilde{y}$ to determine the early decision:
$$w_{\mathrm{ed}} = \mathbb{I}(\tilde{y} \in \tilde{\Omega}(\tilde{\epsilon})) + \frac{\mathbb{I}(U < \eta)}{\eta}\left( \mathbb{I}(y \in \Omega(\epsilon)) - \mathbb{I}(\tilde{y} \in \tilde{\Omega}(\tilde{\epsilon})) \right).$$
Both are special cases of multifidelity ABC.

Early accept/reject multifidelity ABC. Combine the two approaches into a multifidelity weight:
$$w_{\mathrm{mf}}(\theta) = \mathbb{I}(\tilde{y} \in \tilde{\Omega}(\tilde{\epsilon})) + \frac{\mathbb{I}(U < \eta(\tilde{y}))}{\eta(\tilde{y})}\left( \mathbb{I}(y \in \Omega(\epsilon)) - \mathbb{I}(\tilde{y} \in \tilde{\Omega}(\tilde{\epsilon})) \right).$$
A natural form of $\eta(\tilde{y})$ gives early accept/reject multifidelity ABC:
$$\eta(\tilde{y}) = \eta_1\, \mathbb{I}(\tilde{y} \in \tilde{\Omega}(\tilde{\epsilon})) + \eta_2\, \mathbb{I}(\tilde{y} \notin \tilde{\Omega}(\tilde{\epsilon}))$$
for $\eta_1, \eta_2 \in (0, 1]$. Then $\mathbb{E}(w_{\mathrm{mf}}(\theta)) = p(y \in \Omega(\epsilon) \mid \theta)$, and $1 - \eta_1$ and $1 - \eta_2$ are the early accept and early reject probabilities.

Early accept/reject multifidelity ABC. There are six cases (at most four values) of $w_{\mathrm{mf}}(\theta)$:
$w_{\mathrm{mf}}(\theta) = 1$ if $\tilde{y} \in \tilde{\Omega}(\tilde{\epsilon})$ and $U \geq \eta_1$: predicted positive (unchecked);
$w_{\mathrm{mf}}(\theta) = 0$ if $\tilde{y} \notin \tilde{\Omega}(\tilde{\epsilon})$ and $U \geq \eta_2$: predicted negative (unchecked);
$w_{\mathrm{mf}}(\theta) = 1$ if $\tilde{y} \in \tilde{\Omega}(\tilde{\epsilon})$, $y \in \Omega(\epsilon)$ and $U < \eta_1$: true positive (checked);
$w_{\mathrm{mf}}(\theta) = 0$ if $\tilde{y} \notin \tilde{\Omega}(\tilde{\epsilon})$, $y \notin \Omega(\epsilon)$ and $U < \eta_2$: true negative (checked);
$w_{\mathrm{mf}}(\theta) = 1 - 1/\eta_1$ if $\tilde{y} \in \tilde{\Omega}(\tilde{\epsilon})$, $y \notin \Omega(\epsilon)$ and $U < \eta_1$: false positive (checked);
$w_{\mathrm{mf}}(\theta) = 0 + 1/\eta_2$ if $\tilde{y} \notin \tilde{\Omega}(\tilde{\epsilon})$, $y \in \Omega(\epsilon)$ and $U < \eta_2$: false negative (checked).

Section: Performance analysis

Early accept/reject multifidelity ABC algorithm. For $i = 1, \ldots, N$:
1. Generate $\theta_i \sim \pi(\cdot)$.
2. Simulate (low fidelity) $\tilde{y}_i \sim \tilde{p}(\cdot \mid \theta_i)$.
3. Calculate $\tilde{w} = \mathbb{I}(\tilde{y}_i \in \tilde{\Omega}(\tilde{\epsilon}))$.
4. Set $w_i = \tilde{w}$ and $\eta = \eta_1 \tilde{w} + \eta_2 (1 - \tilde{w})$.
5. If $U < \eta$: simulate (high fidelity) $y_i \sim p(\cdot \mid \theta_i)$; calculate $w = \mathbb{I}(y_i \in \Omega(\epsilon))$; update $w_i = w_i + (w - \tilde{w})/\eta$.
Return $\{w_i, \theta_i\}_{i=1}^N$.
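Putting the pieces together, a sketch of the full sampler under the same assumed callables as the earlier snippets:

```python
import numpy as np

def mf_abc(prior_sample, simulate_lf, simulate_hf,
           in_omega_lf, in_omega_hf, eta1, eta2, N, rng=None):
    """Early accept/reject multifidelity ABC: returns a weighted sample."""
    rng = rng or np.random.default_rng()
    weights, thetas = [], []
    for _ in range(N):
        theta = prior_sample(rng)
        y_lf = simulate_lf(theta, rng)              # low-fidelity simulation
        w_lf = 1.0 if in_omega_lf(y_lf) else 0.0    # ~w = I(~y in ~Omega(~eps))
        w = w_lf
        eta = eta1 * w_lf + eta2 * (1.0 - w_lf)     # continuation probability
        if rng.random() < eta:                      # check with the HF model
            y_hf = simulate_hf(theta, rng)          # high-fidelity simulation
            w_hf = 1.0 if in_omega_hf(y_hf) else 0.0
            w += (w_hf - w_lf) / eta                # unbiased correction
        weights.append(w)
        thetas.append(theta)
    return np.array(weights), thetas
```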

Weighted sample. The input to the algorithm is the continuation probabilities $\eta_1, \eta_2$; the output is a weighted sample $\{w_i, \theta_i\}_{i=1}^N$, where each weight $w_i$ incurs a simulation cost $T_i$. Use the sample in an ABC estimator for a function $F(\theta)$ of the uncertain parameters:
$$\mu_{\mathrm{ABC}}(F) = \frac{\sum_i w_i F(\theta_i)}{\sum_j w_j} \approx \frac{\int F(\theta)\, p_{\mathrm{ABC}}(y_{\mathrm{obs}} \mid \theta)\, \pi(\theta)\, \mathrm{d}\theta}{\int p_{\mathrm{ABC}}(y_{\mathrm{obs}} \mid \theta)\, \pi(\theta)\, \mathrm{d}\theta}.$$

Quality of a sample. For a Monte Carlo sample $\{w_i, \theta_i\}_{i=1}^N$, the effective sample size is
$$\mathrm{ESS} = \frac{\left(\sum_i w_i\right)^2}{\sum_i w_i^2} = N\,\frac{\left(\sum_i w_i / N\right)^2}{\sum_i w_i^2 / N},$$
and the total simulation time is $T_{\mathrm{tot}} = \sum_i T_i$.
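Both quantities are simple to compute from the weighted output; a sketch:

```python
import numpy as np

def abc_estimate(weights, f_values):
    """Self-normalised ABC estimator mu_ABC(F) from a weighted sample."""
    w = np.asarray(weights, dtype=float)
    return np.sum(w * np.asarray(f_values, dtype=float)) / np.sum(w)

def effective_sample_size(weights):
    """ESS = (sum w)^2 / (sum w^2). Multifidelity weights can be negative
    or exceed 1, which inflates sum w^2 and so lowers the ESS."""
    w = np.asarray(weights, dtype=float)
    return np.sum(w) ** 2 / np.sum(w ** 2)

# The efficiency of a run is then effective_sample_size(weights) / T_tot.
```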

Quality of a sample. Efficiency is the ratio of the ESS to the simulation time:
$$\frac{\mathrm{ESS}}{T_{\mathrm{tot}}} = \frac{\left(\sum_i w_i / N\right)^2}{\left(\sum_i w_i^2 / N\right)\left(\sum_i T_i / N\right)} \approx \frac{\mathbb{E}(w_i)^2}{\mathbb{E}(w_i^2)\,\mathbb{E}(T_i)},$$
where the approximation holds in the limit as $N \to \infty$. Goal: tune the inputs $\eta_1, \eta_2$ to maximise efficiency (in the limit).

Maximising efficiency.
$$\frac{\mathrm{ESS}}{T_{\mathrm{tot}}} \approx \frac{\mathbb{E}(w_{\mathrm{mf}})^2}{\mathbb{E}(w_{\mathrm{mf}}^2)\,\mathbb{E}(T)} \quad \text{over } \theta \sim \pi(\cdot).$$
The numerator $\mathbb{E}(w_{\mathrm{mf}}) = \mathbb{P}(y \in \Omega(\epsilon))$ is independent of $\eta_1, \eta_2$, so maximising efficiency over $\eta_1, \eta_2 \in (0, 1]$ is equivalent to minimising the product $\mathbb{E}(w_{\mathrm{mf}}^2)\,\mathbb{E}(T)$.

Tradeoff: reduced computational burden... Simulation times: the low-fidelity simulation $\tilde{y} \sim \tilde{p}(\cdot \mid \theta)$ costs $\tilde{c}$; the high-fidelity simulation $y \sim p(\cdot \mid \theta)$ costs $c$; so $w_{\mathrm{mf}}$ costs $T = \tilde{c} + c\,\mathbb{I}(U < \eta(\tilde{y}))$. The expected time to construct $w_{\mathrm{mf}}(\theta)$ is therefore
$$\mathbb{E}(T) = \mathbb{E}(\tilde{c}) + \eta_1\, \mathbb{E}\!\left[c\, \mathbb{I}(\tilde{y} \in \tilde{\Omega}(\tilde{\epsilon}))\right] + \eta_2\, \mathbb{E}\!\left[c\, \mathbb{I}(\tilde{y} \notin \tilde{\Omega}(\tilde{\epsilon}))\right].$$
This depends on the high-fidelity simulation cost $c$ relative to $\tilde{c}$.

Tradeoff: ...for increased variance. The second moment increases as the $\eta_i$ decrease:
$$\mathbb{E}(w_{\mathrm{mf}}^2) = \mathbb{P}(y \in \Omega(\epsilon)) + \left(\frac{1}{\eta_1} - 1\right)\mathbb{P}(\tilde{y} \in \tilde{\Omega}(\tilde{\epsilon}),\, y \notin \Omega(\epsilon)) + \left(\frac{1}{\eta_2} - 1\right)\mathbb{P}(\tilde{y} \notin \tilde{\Omega}(\tilde{\epsilon}),\, y \in \Omega(\epsilon)).$$
This depends on the false positive and false negative rates.

Optimal early accept/reject continuation probabilities. Analytical result: efficiency is maximised at
$$(\eta_1^*, \eta_2^*) = \frac{1}{\sqrt{\lambda}}\left( \sqrt{\frac{\mathbb{P}(y \notin \Omega(\epsilon) \mid \tilde{y} \in \tilde{\Omega}(\tilde{\epsilon}))}{\mathbb{E}(c \mid \tilde{y} \in \tilde{\Omega}(\tilde{\epsilon})) / \mathbb{E}(\tilde{c})}},\; \sqrt{\frac{\mathbb{P}(y \in \Omega(\epsilon) \mid \tilde{y} \notin \tilde{\Omega}(\tilde{\epsilon}))}{\mathbb{E}(c \mid \tilde{y} \notin \tilde{\Omega}(\tilde{\epsilon})) / \mathbb{E}(\tilde{c})}} \right),$$
where $\lambda = \mathbb{P}(y \in \Omega(\epsilon)) - \mathbb{P}(\tilde{w} \neq w)$. If the probability $\mathbb{P}(y \notin \Omega(\epsilon) \mid \tilde{y} \in \tilde{\Omega}(\tilde{\epsilon}))$ of needing to correct a positive prediction is small, then check less often. If the cost $\mathbb{E}(c \mid \tilde{y} \in \tilde{\Omega}(\tilde{\epsilon}))$ of checking a positive prediction is large relative to the low-fidelity simulation cost $\mathbb{E}(\tilde{c})$, then check less often. Similarly for negative predictions. The approach is less effective when $\lambda$ is small.
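The probabilities and cost ratios in this formula are not known a priori. Assuming a pilot (burn-in) run in which every low-fidelity decision is checked against the high-fidelity model, as discussed under implementation below, they can be estimated empirically; a hypothetical sketch:

```python
import numpy as np

def optimal_etas(w_lf, w_hf, c_lf, c_hf):
    """Estimate (eta1*, eta2*) from pilot data: boolean arrays of LF/HF
    accept decisions and arrays of LF/HF simulation times, one entry per
    pilot sample. Assumes lambda > 0 and that both LF decisions occur."""
    w_lf = np.asarray(w_lf, dtype=bool)
    w_hf = np.asarray(w_hf, dtype=bool)
    c_hf = np.asarray(c_hf, dtype=float)
    lam = np.mean(w_hf) - np.mean(w_lf != w_hf)   # lambda = P(y in Omega) - P(~w != w)
    fp = np.mean(~w_hf[w_lf])                     # P(y not in Omega | ~y in ~Omega)
    fn = np.mean(w_hf[~w_lf])                     # P(y in Omega | ~y not in ~Omega)
    c_bar = np.mean(c_lf)                         # E(~c)
    r_pos = np.mean(c_hf[w_lf]) / c_bar           # relative cost of checking positives
    r_neg = np.mean(c_hf[~w_lf]) / c_bar          # relative cost of checking negatives
    eta1 = np.sqrt(fp / (lam * r_pos))
    eta2 = np.sqrt(fn / (lam * r_neg))
    return min(eta1, 1.0), min(eta2, 1.0)         # clip into (0, 1]
```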

Example: construct realisations of early accept/reject samples. Calculate $w_{\mathrm{mf}}(\theta_i)$ for each sampled $\theta_i$, where $\mathbb{I}(U < \eta(\tilde{y}_i))$ is the random decision, and evaluate $\mathrm{ESS}/T_{\mathrm{tot}}$ over 500 realisations. The optimal $(\eta_1^*, \eta_2^*)$ is taken from a baseline dataset and compared to other continuation probabilities.

Example: efficiency of different continuation probabilities. [Figure: efficiency over the $(\eta_1, \eta_2)$ plane.] Early decision: $\eta_1 = \eta_2$. Early rejection: $\eta_1 = 1$. Rejection sampling: $\eta_1 = \eta_2 = 1$. Level sets are at 99%, 95%, 90%, 85%, 80%, 75%, and 60% of the maximum efficiency.

Example: distribution of observed efficiencies. [Figure: observed $\mathrm{ESS}/T_{\mathrm{tot}}$ across the 500 realisations.]

Section: Implementation and future extensions

Implementation of early accept/reject multifidelity ABC, dealt with in recent work: There is no a priori knowledge of the location of $(\eta_1^*, \eta_2^*)$, i.e. of the simulation times and false positive/negative probabilities: use a burn-in approach first, then estimate. Methods to reduce the false positive/negative probabilities: coupling the low- and high-fidelity simulations. Optimising efficiency for estimating specific functions, $\mathbb{E}_{\mathrm{ABC}}(F(\theta)) \approx \sum_i w_i F(\theta_i) / \sum_i w_i$: minimise $\mathbb{E}\!\left(w_{\mathrm{mf}}^2 \left(F - \mathbb{E}_{\mathrm{ABC}}(F)\right)^2\right)\mathbb{E}(T)$ instead.

Outlook: future work. Applicability to MCMC/SMC methods. Dealing with $w_{\mathrm{mf}} < 0$. Multiple low-fidelity models: local hierarchies of accuracy/speed-up; parameter- or estimator-specific low-fidelity models. Alternative forms for $\eta(\tilde{y})$ and $\tilde{w}$: parameter-dependent $\eta$. Analytic/computational bounds on error: $\tilde{y}$ versus $y$; $\tilde{w}$ versus $w$.

Acknowledgements. Project: Next generation approaches to connect models and quantitative data. PIs: Ruth Baker & Michael Stumpf.
