Solar Spectral Analyses with Uncertainties in Atomic Physical Models
1 Solar Spectral Analyses with Uncertainties in Atomic Physical Models
Xixi Yu (Imperial College London)
Statistical methods in Astrophysics, 3 April 2018
2 Acknowledgments
Joint work with the International Space Science Institute (ISSI) team "Improving the Analysis of Solar and Stellar Observations".
3 The Solar Corona
The solar corona is a complex and dynamic system. Measuring physical properties in any solar region is important for understanding the processes that lead to these events.
Figure: The photospheric magnetic field measured with HMI, million-degree emission observed with the AIA Fe IX 171 Å channel, and high-temperature loops observed with XRT.
4 Aim
We want to infer physical quantities of the solar atmosphere (density, temperature, path length, etc.), but we only observe intensity. Inferences rely on models for the underlying atomic physics. How do we address uncertainty in the atomic physics models?
Figure: Hinode spacecraft.
5 Physical Parameters
n_k: number of free electrons per unit volume in the plasma (the subscript k is the pixel index)
T_k: electron temperature
d_k: path length through the solar atmosphere
θ_k = (log n_k, log d_k)
m: index of the emissivity curve
Expected intensity of the line with wavelength λ: ε_λ^(m)(n_k, T_k) n_k² d_k, where ε_λ^(m)(n_k, T_k) is the plasma emissivity for the line with wavelength λ in pixel k.
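The expected-intensity relation above can be sketched in a few lines of code. This is a minimal illustration only: `toy_emissivity` is a made-up stand-in for the real emissivity curves, which come from atomic-data calculations.

```python
import numpy as np

def expected_intensity(emissivity, n_k, T_k, d_k):
    """Expected line intensity: epsilon_lambda^(m)(n_k, T_k) * n_k**2 * d_k."""
    return emissivity(n_k, T_k) * n_k**2 * d_k

def toy_emissivity(n, T):
    # Made-up smooth function of (n, T); real emissivities are tabulated
    # from atomic physics models, not computed like this.
    return 1e-23 * (T / 1e6) * np.exp(-0.1 * np.log10(n))

I_k = expected_intensity(toy_emissivity, n_k=1e9, T_k=1.8e6, d_k=1e9)
```

The same function would apply per wavelength λ by swapping in the emissivity curve for that line.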
6 Data: Observed Intensity
Data from the Extreme-Ultraviolet Imaging Spectrometer (EIS) on the Hinode spacecraft.
Figure: Example EIS spectrum of seven Fe XIII lines.
Spectral lines with wavelengths Λ = {λ_1, ..., λ_J}.
Observed intensities for K pixels and J wavelengths: D = {D_k = (I_kλ_1, ..., I_kλ_J), k = 1, ..., K}.
The standard deviations σ_kλ_j are also known.
7 Uncertainty: Emissivity
Emissivity describes how strongly energy is radiated at a given wavelength. It is simulated from a model accounting for uncertainty in the atomic data. Suppose a collection of M emissivity curves is known:
M = {ε_λ^(m)(n_k, T_k), λ ∈ Λ, m = 1, ..., M}.
Figure: A simplified level diagram for the transitions relevant to the seven lines considered here.
8 Preceding Work: Fully Bayesian Model I
Use Bayesian methods to incorporate information in the data, narrowing the uncertainty in the atomic physics calculation. A priori, there are only M = 1000 equally likely emissivity curves.
Solution: obtain a sample of M emissivity curves that accounts for the uncertainty in the atomic data; the curve index m is treated as an unknown parameter. Obtain a sample from p(m, θ | D) via two-step Monte Carlo (MC) samplers or Hamiltonian Monte Carlo (HMC).
Conclusion: we are able to incorporate uncertainties in atomic physics calculations into analyses of solar spectroscopic data.
9 Preceding Work: Fully Bayesian Model I
Independent prior distributions:
p(m, θ_k) = p(m) p(log n_k) p(log d_k)   (1)
m ~ DiscreteUniform({1, ..., M})   (2)
log10 n_k ~ Uniform(min = 7, max = 12)   (3)
log10 d_k ~ Cauchy(center = 9, scale = 5)   (4)
Likelihood L(m, θ_k | D_k):
I_kλ | m, n_k, d_k ~ indep. Normal(ε_λ^(m)(n_k, T_k) n_k² d_k, σ_kλ²), for λ ∈ Λ   (5)
Joint posterior distribution:
p(m, θ_k | D_k) ∝ L(m, θ_k | D_k) p(m, θ_k)   (6)
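The unnormalized log posterior defined by Eqs. (1)-(6) can be sketched for a single pixel as follows. This is a hedged toy illustration, not the talk's implementation: `emis` is a stand-in table of shape (M, J) holding emissivities already evaluated at (n_k, T_k), the index m is zero-based here, and the toy data are noise-free.

```python
import numpy as np

def log_posterior(m, log_n, log_d, I_obs, sigma, emis):
    """Unnormalized log p(m, theta_k | D_k) per Eqs. (1)-(6); m is 0-based."""
    n, d = 10.0**log_n, 10.0**log_d
    # Prior on m is uniform (constant, dropped); log10 n ~ Uniform(7, 12).
    if not (7.0 <= log_n <= 12.0):
        return -np.inf
    # log10 d ~ Cauchy(center=9, scale=5).
    log_prior = -np.log(np.pi * 5.0 * (1.0 + ((log_d - 9.0) / 5.0) ** 2))
    # Gaussian likelihood, Eq. (5): I_{k,lambda} ~ N(emis * n^2 * d, sigma^2).
    mu = emis[m] * n**2 * d
    log_lik = np.sum(-0.5 * ((I_obs - mu) / sigma) ** 2 - np.log(sigma))
    return log_prior + log_lik

# Toy numbers for illustration only (M = 3 curves, J = 7 lines).
rng = np.random.default_rng(0)
emis = rng.uniform(1e-27, 2e-27, size=(3, 7))
truth = emis[1] * (1e9) ** 2 * 10.0**9.5
I_obs = truth                                   # noise-free toy data
lp = log_posterior(1, 9.0, 9.5, I_obs, sigma=0.05 * truth, emis=emis)
```

With noise-free data generated from curve m = 1, the log posterior is highest at that curve, as the tests below check.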
10 Preceding Work: Pragmatic vs Fully Bayesian Models
Pragmatic Bayesian posterior distribution:
p(m, θ_k | D_k) = p(θ_k | D_k, m) p(m).   (7)
Fully Bayesian posterior distribution:
p(m, θ_k | D_k) = p(θ_k | D_k, m) p(m | D_k).   (8)
The pragmatic model keeps the prior p(m) unchanged, so the data do not update the choice of emissivity curve; the fully Bayesian model lets the data update m through p(m | D_k).
11 Preceding Work: Multimodal Posterior Distributions
Bimodal posterior distributions occur; the two modes correspond to two emissivity curves. HMC gives an inaccurate relative size for the two modes. Reason: not enough emissivity curves. Challenge: the sparse selection of emissivity curves.
Figure: posterior densities of log n_e and log d_s from the HMC algorithm.
12 Preceding Work: Multimodal Posterior Distributions
A computational issue: the inaccurate relative size of the two modes.
A computational solution: add a few synthetic replicate emissivity curves, giving the augmented set M_aug ⊃ M:
M_aug \ M = {w_1 Emis_471 + w_2 Emis_368}, where (w_1, w_2) = (0.75, 0.25), (0.50, 0.50), and (0.25, 0.75).
Figure: posterior densities of log n_e and log d_s from the HMC algorithm.
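The augmentation trick above can be sketched directly: take convex combinations of the two curves underlying the two modes. The array below is a random toy stand-in for the talk's M = 1000 emissivity curves; the indices 471 and 368 are taken from the slides.

```python
import numpy as np

# Toy stand-in for the M = 1000 emissivity curves over J = 7 lines.
rng = np.random.default_rng(1)
emis = rng.uniform(1e-27, 2e-27, size=(1000, 7))

# Synthetic replicates: weighted averages of curves 471 and 368.
weights = [(0.75, 0.25), (0.50, 0.50), (0.25, 0.75)]
replicates = np.array([w1 * emis[471] + w2 * emis[368] for w1, w2 in weights])
emis_aug = np.vstack([emis, replicates])    # M_aug = 1003 curves
```

The replicates interpolate between the two modes, giving the sampler intermediate states to move through.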
13 Aim
Come up with a way to efficiently represent the high-dimensional joint distribution of the uncertainty in the emissivity curves.
14 Comparison of Current and Preceding Models
Model I (done):
Joint posterior distribution: p(m, θ_k | D_k) ∝ L(m, θ_k | D_k) p(m) p(θ_k), with p(m) = 1/M.
A computational trick: adding a few synthetic replicate emissivity curves.
Model II (ongoing):
Joint posterior distribution: p(C(r_k), θ_k | D_k) ∝ L(C(r_k), θ_k | D_k) p(r_k, θ_k), where r_k is the PCA representation of the emissivity curve C, and p(r) is a high-dimensional distribution.
An algorithm: summarize the distribution with a multivariate standard normal distribution via principal component analysis (PCA).
15 Principal Component Analysis
In log space, J = 16 PCs capture 99% of the total variation. A PCA-generated emissivity-curve replicate based on the first J PCs is
C_rep = C̄ + Σ_{j=1}^{J} r_j β_j v_j,
where
C̄: the average of all 1000 emissivity curves
r_j: a random variate generated from the standard normal distribution
β_j², v_j: the eigenvalue and eigenvector of component j in the PCA representation
16 Plot of Original Emissivity Curves
Figure (two panels, curves plotted against variable index):
Top panel: each part corresponds to one of the seven lines. Light gray area: all 1000 emissivities. Dark gray area: the middle 68% of emissivities. Solid black curve: C̄.
Bottom panel: same as above, but using C − C̄. Colored dashed curves: six randomly selected curves.
17 Plot of PCA-Generated Emissivity Curves
Figure (two panels of C − C̄ against variable index; light and dark gray areas are the same as above):
Top panel: dashed lines: all 1500 PCA-generated emissivities. Dotted lines: the middle 68% of PCA emissivities.
Bottom panel: colored dashed curves: a selection of PCA curves.
18 Plot of Eigenvectors
Figure: eigenvectors of the 7 lines along the log n grid, panels PC1-PC8 of the first 16 PCs.
19 Principal Component Analysis
Figure: eigenvectors of the 7 lines along the log n grid, panels PC9-PC16 of the first 16 PCs.
20 Alg I: Hamiltonian Monte Carlo via Stan
Use Stan (mc-stan.org) to sample
(r^(l), θ^(l)) ~ p(r, θ | D), for l = 1, ..., L,
where
p(r, θ | D) ∝ p(D | r, θ) p(r) p(θ)
r ~ Normal(0, I)
log10 n ~ Uniform(min = 7, max = 12)
log10 d ~ Cauchy(center = 9, scale = 5)
21 Alg I: Compare Stan Results from Model I & Model II
Figure: posterior densities of log n and log d_s under Model I (gray histogram) and Model II (white histogram).
The results from Model II mitigate the multimodality seen in Model I.
Problem: Stan is time-consuming.
22 Alg I: Scatterplot matrices from Stan Results
23 Alg I: Scatterplot matrices from Stan Results
24 Alg I: Scatterplot matrices from Stan Results
25 Alg II: Independence Sampler (IS)
Metropolis-Hastings (MH) within a fully Bayesian Gibbs sampler:
Step 1: For j = 1, ..., J, sample r_j^[prop] ~ N(μ = 0, sd = 1).
Step 2: Set r^[prop] = (r_1^[prop], ..., r_J^[prop]) and C^[prop] = C̄ + Σ_{j=1}^{J} r_j^[prop] β_j v_j.
Step 3: Sample θ^[prop] ~ Q_MVN(θ | r^[prop]).
Step 4: Compute
ρ = [p(C^[prop], θ^[prop] | D) Q_MVN(θ^(t) | r^(t))] / [p(C^(t), θ^(t) | D) Q_MVN(θ^[prop] | r^[prop])]   (9)
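The acceptance-ratio logic of an independence sampler can be sketched on a toy one-dimensional target standing in for p(C(r), θ | D); the θ-step is omitted for brevity, and the proposal is N(0, 1), drawn independently of the current state.

```python
import numpy as np

def independence_sampler(log_target, n_iter, rng):
    """Toy independence MH sampler with proposal q(r) = N(0, 1)."""
    r, chain, accepts = 0.0, [], 0
    log_q = lambda x: -0.5 * x**2           # log N(0, 1) up to a constant
    for _ in range(n_iter):
        r_prop = rng.standard_normal()      # proposal ignores current state
        # MH ratio for an independence proposal:
        # rho = [p(r') / q(r')] / [p(r) / q(r)]
        log_rho = (log_target(r_prop) - log_q(r_prop)) \
                  - (log_target(r) - log_q(r))
        if np.log(rng.uniform()) < log_rho:
            r, accepts = r_prop, accepts + 1
        chain.append(r)
    return np.array(chain), accepts / n_iter

rng = np.random.default_rng(3)
log_target = lambda x: -0.5 * (x - 1.0) ** 2    # toy target: N(1, 1)
chain, acc_rate = independence_sampler(log_target, 5000, rng)
```

Because the proposal does not depend on the current state, mixing is good when the proposal resembles the target and poor when it does not, which is why the talk compares several proposal choices for r.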
26 Alg II: Independence Sampler
Use a multivariate normal distribution Q_MVN(θ, r) to fit a sample obtained from the pragmatic Bayesian posterior. Suppose we have the pragmatic Bayesian samples {(r_pb^(1), θ_pb^(1)), ..., (r_pb^(T), θ_pb^(T))}. Set the multivariate normal distribution Q_MVN(θ, r) to be
r ~ Normal(μ_r = 0, Σ_22 = I)
(θ, r) ~ Normal((μ̂_θ, μ_r), [Σ̂_11 Σ̂_12; Σ̂_21 Σ_22])
where μ̂_θ, Σ̂_11 are the sample mean and sample variance of θ obtained from the pragmatic Bayesian sample, and Σ̂_12 is the sample covariance between θ and r.
The conditional distribution of θ given r is
Q_MVN(θ | r) = Normal(μ̂_θ + Σ̂_12 r, Σ̂_11 − Σ̂_12 Σ̂_21),
achieved via multivariate normal linear regression.
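Fitting Q_MVN(θ | r) from a joint sample can be sketched with the standard conditional-normal formula (the general form μ_θ + Σ_12 Σ_22⁻¹ (r − μ_r), which reduces to the slide's expression when μ_r = 0 and Σ_22 = I). The "pragmatic Bayesian sample" below is simulated toy data, not the talk's actual draws.

```python
import numpy as np

# Toy stand-in for the pragmatic Bayesian sample {(r_pb, theta_pb)}.
rng = np.random.default_rng(4)
T = 20000
r_pb = rng.standard_normal((T, 2))                      # r ~ N(0, I)
A = np.array([[0.8, 0.1], [0.0, 0.5]])
theta_pb = np.array([3.0, 9.0]) + r_pb @ A.T + 0.3 * rng.standard_normal((T, 2))

mu_theta = theta_pb.mean(axis=0)                        # sample mean of theta
S = np.cov(np.hstack([theta_pb, r_pb]), rowvar=False)   # joint 4x4 covariance
S11, S12, S22 = S[:2, :2], S[:2, 2:], S[2:, 2:]

def conditional(r):
    """Mean and covariance of Q_MVN(theta | r) via normal linear regression."""
    mean = mu_theta + S12 @ np.linalg.solve(S22, r)
    cov = S11 - S12 @ np.linalg.solve(S22, S12.T)
    return mean, cov

mean0, cov0 = conditional(np.zeros(2))                  # at r = 0: just mu_theta
```

At r = 0 the conditional mean equals μ̂_θ, and the conditional covariance recovers the residual noise of the toy model, which the tests check.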
27 Alg II: Several Possible Proposals of r
- Normal(0, I)
- Normal(r^(471), I) or Normal(r^(368), I)
- Mixture distribution: p Normal(r^(471), σ²I) + (1 − p) Normal(r^(368), σ²I), where p and σ are tuning parameters
- Normal(μ_Stan, Σ_Stan) or Normal(μ_Stan, 2 Σ_Stan)
- Student t_4
28 Alg II: Proposals of r
r ~ Normal(0, I)
Figure: trace plots of log n_e and log d_s, and the autocorrelation function (ACF) against lag.
29 Alg II: Proposals of r
r ~ Normal(r^(471), I) or Normal(r^(368), I)
Figure: trace plots of log n_e and log d_s, and the ACF against lag.
30 Alg II: Proposals of r
r ~ p Normal(r^(471), σ²I) + (1 − p) Normal(r^(368), σ²I), where p = 0.5 and σ = 2 are the tuning parameters
Figure: trace plots of log n_e and log d_s, and the ACF against lag.
31 Alg II: Proposals of r
r ~ Normal(μ_Stan, Σ_Stan) or Normal(μ_Stan, 2 Σ_Stan)
Figure: trace plots of log n_e and log d_s, and the ACF against lag.
32 Alg II: Proposals of r
r ~ Student t_4; θ | r fitted via multivariate t linear regression with df = 4
Figure: trace plots of log n_e and log d_s, and the ACF against lag.
33 Alg III: Random Walk Sampler (TBD)
Use a random walk:
Step 1: For j = 1, ..., J, sample r_j^[prop] ~ N(μ = r_j^[t], sd = σ_r).
Step 2: Set r^[prop] = (r_1^[prop], ..., r_J^[prop]) and C^[prop] = C̄ + Σ_{j=1}^{J} r_j^[prop] β_j v_j.
Step 3: Sample θ^[prop] ~ Q_MVN(θ | r^[prop]).
Step 4: Compute
ρ = [p(C^[prop], θ^[prop] | D) Q_MVN(θ^(t) | r^(t))] / [p(C^(t), θ^(t) | D) Q_MVN(θ^[prop] | r^[prop])]   (10)
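The random-walk variant can be sketched on the same toy one-dimensional target as before (θ omitted, the target standing in for p(C(r), θ | D)). Because the Gaussian random-walk proposal is symmetric, the MH ratio reduces to the ratio of target densities.

```python
import numpy as np

def random_walk_sampler(log_target, sigma_r, n_iter, rng):
    """Toy random-walk MH sampler with step size sigma_r."""
    r, chain = 0.0, []
    for _ in range(n_iter):
        r_prop = r + sigma_r * rng.standard_normal()   # N(r^(t), sigma_r^2)
        # Symmetric proposal: acceptance ratio is just p(r') / p(r).
        if np.log(rng.uniform()) < log_target(r_prop) - log_target(r):
            r = r_prop
        chain.append(r)
    return np.array(chain)

rng = np.random.default_rng(5)
log_target = lambda x: -0.5 * (x - 1.0) ** 2           # toy target: N(1, 1)
chain = random_walk_sampler(log_target, sigma_r=0.6, n_iter=8000, rng=rng)
```

The step size σ_r trades acceptance rate against step length, which is why the next slide reports a tuned value.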
34 Alg III
Tuning parameter σ = 0.6; r is initialized from Normal(r^(471), I).
Figure: trace plots of log n_e and log d_s, and the ACF against lag.
35 Alg IV: Mixture of Independence Sampler and Random Walk Sampler (TBD)
Step 1: Sample u_1 ~ Uniform(0, 1).
Step 2: If u_1 < p_m, go to Alg III, the Random Walk Sampler; else, go to Alg II, the Independence Sampler.
Two tuning parameters: p_m and σ_r.
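The mixture kernel can be sketched by flipping a p_m coin each iteration between the two moves; since both kernels leave the target invariant, so does the mixture. As before, this is a toy one-dimensional stand-in with θ omitted.

```python
import numpy as np

def mixture_sampler(log_target, p_m, sigma_r, n_iter, rng):
    """Toy mixture of random-walk (prob p_m) and independence MH moves."""
    r, chain = 0.0, []
    log_q = lambda x: -0.5 * x**2                  # independence proposal N(0, 1)
    for _ in range(n_iter):
        if rng.uniform() < p_m:                    # random-walk move (Alg III)
            r_prop = r + sigma_r * rng.standard_normal()
            log_rho = log_target(r_prop) - log_target(r)
        else:                                      # independence move (Alg II)
            r_prop = rng.standard_normal()
            log_rho = (log_target(r_prop) - log_q(r_prop)) \
                      - (log_target(r) - log_q(r))
        if np.log(rng.uniform()) < log_rho:
            r = r_prop
        chain.append(r)
    return np.array(chain)

rng = np.random.default_rng(6)
chain = mixture_sampler(lambda x: -0.5 * (x - 1.0) ** 2,
                        p_m=0.5, sigma_r=0.6, n_iter=8000, rng=rng)
```

The random-walk moves explore locally while the independence moves allow jumps between modes, combining the strengths of Algs II and III.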
Bayesian Methods for Machine Learning CS 584: Big Data Analytics Material adapted from Radford Neal s tutorial (http://ftp.cs.utoronto.ca/pub/radford/bayes-tut.pdf), Zoubin Ghahramni (http://hunch.net/~coms-4771/zoubin_ghahramani_bayesian_learning.pdf),
More informationMonte Carlo Methods. Leon Gu CSD, CMU
Monte Carlo Methods Leon Gu CSD, CMU Approximate Inference EM: y-observed variables; x-hidden variables; θ-parameters; E-step: q(x) = p(x y, θ t 1 ) M-step: θ t = arg max E q(x) [log p(y, x θ)] θ Monte
More informationOn Bayesian Computation
On Bayesian Computation Michael I. Jordan with Elaine Angelino, Maxim Rabinovich, Martin Wainwright and Yun Yang Previous Work: Information Constraints on Inference Minimize the minimax risk under constraints
More information1 Bayesian Linear Regression (BLR)
Statistical Techniques in Robotics (STR, S15) Lecture#10 (Wednesday, February 11) Lecturer: Byron Boots Gaussian Properties, Bayesian Linear Regression 1 Bayesian Linear Regression (BLR) In linear regression,
More informationspbayes: An R Package for Univariate and Multivariate Hierarchical Point-referenced Spatial Models
spbayes: An R Package for Univariate and Multivariate Hierarchical Point-referenced Spatial Models Andrew O. Finley 1, Sudipto Banerjee 2, and Bradley P. Carlin 2 1 Michigan State University, Departments
More informationCS281A/Stat241A Lecture 22
CS281A/Stat241A Lecture 22 p. 1/4 CS281A/Stat241A Lecture 22 Monte Carlo Methods Peter Bartlett CS281A/Stat241A Lecture 22 p. 2/4 Key ideas of this lecture Sampling in Bayesian methods: Predictive distribution
More informationThe Particle Filter. PD Dr. Rudolph Triebel Computer Vision Group. Machine Learning for Computer Vision
The Particle Filter Non-parametric implementation of Bayes filter Represents the belief (posterior) random state samples. by a set of This representation is approximate. Can represent distributions that
More informationFlexible Regression Modeling using Bayesian Nonparametric Mixtures
Flexible Regression Modeling using Bayesian Nonparametric Mixtures Athanasios Kottas Department of Applied Mathematics and Statistics University of California, Santa Cruz Department of Statistics Brigham
More informationTwo Statistical Problems in X-ray Astronomy
Two Statistical Problems in X-ray Astronomy Alexander W Blocker October 21, 2008 Outline 1 Introduction 2 Replacing stacking Problem Current method Model Further development 3 Time symmetry Problem Model
More informationSTA 4273H: Statistical Machine Learning
STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Computer Science! Department of Statistical Sciences! rsalakhu@cs.toronto.edu! h0p://www.cs.utoronto.ca/~rsalakhu/ Lecture 7 Approximate
More informationBayesian Gaussian Process Regression
Bayesian Gaussian Process Regression STAT8810, Fall 2017 M.T. Pratola October 7, 2017 Today Bayesian Gaussian Process Regression Bayesian GP Regression Recall we had observations from our expensive simulator,
More informationSequential Monte Carlo Samplers for Applications in High Dimensions
Sequential Monte Carlo Samplers for Applications in High Dimensions Alexandros Beskos National University of Singapore KAUST, 26th February 2014 Joint work with: Dan Crisan, Ajay Jasra, Nik Kantas, Alex
More informationReview: Probabilistic Matrix Factorization. Probabilistic Matrix Factorization (PMF)
Case Study 4: Collaborative Filtering Review: Probabilistic Matrix Factorization Machine Learning for Big Data CSE547/STAT548, University of Washington Emily Fox February 2 th, 214 Emily Fox 214 1 Probabilistic
More informationChapter 7: From theory to observations
Chapter 7: From theory to observations Given the stellar mass and chemical composition of a ZAMS, the stellar modeling can, in principle, predict the evolution of the stellar bolometric luminosity, effective
More informationPhysics 403. Segev BenZvi. Parameter Estimation, Correlations, and Error Bars. Department of Physics and Astronomy University of Rochester
Physics 403 Parameter Estimation, Correlations, and Error Bars Segev BenZvi Department of Physics and Astronomy University of Rochester Table of Contents 1 Review of Last Class Best Estimates and Reliability
More informationUsing Bayes Factors for Model Selection in High-Energy Astrophysics
Using Bayes Factors for Model Selection in High-Energy Astrophysics Shandong Zhao Department of Statistic, UCI April, 203 Model Comparison in Astrophysics Nested models (line detection in spectral analysis):
More informationApproximate Bayesian Computation
Approximate Bayesian Computation Michael Gutmann https://sites.google.com/site/michaelgutmann University of Helsinki and Aalto University 1st December 2015 Content Two parts: 1. The basics of approximate
More informationIntroduction to Bayesian Statistics and Markov Chain Monte Carlo Estimation. EPSY 905: Multivariate Analysis Spring 2016 Lecture #10: April 6, 2016
Introduction to Bayesian Statistics and Markov Chain Monte Carlo Estimation EPSY 905: Multivariate Analysis Spring 2016 Lecture #10: April 6, 2016 EPSY 905: Intro to Bayesian and MCMC Today s Class An
More informationBagging During Markov Chain Monte Carlo for Smoother Predictions
Bagging During Markov Chain Monte Carlo for Smoother Predictions Herbert K. H. Lee University of California, Santa Cruz Abstract: Making good predictions from noisy data is a challenging problem. Methods
More informationCPSC 540: Machine Learning
CPSC 540: Machine Learning MCMC and Non-Parametric Bayes Mark Schmidt University of British Columbia Winter 2016 Admin I went through project proposals: Some of you got a message on Piazza. No news is
More informationExercises Tutorial at ICASSP 2016 Learning Nonlinear Dynamical Models Using Particle Filters
Exercises Tutorial at ICASSP 216 Learning Nonlinear Dynamical Models Using Particle Filters Andreas Svensson, Johan Dahlin and Thomas B. Schön March 18, 216 Good luck! 1 [Bootstrap particle filter for
More informationKernel Sequential Monte Carlo
Kernel Sequential Monte Carlo Ingmar Schuster (Paris Dauphine) Heiko Strathmann (University College London) Brooks Paige (Oxford) Dino Sejdinovic (Oxford) * equal contribution April 25, 2016 1 / 37 Section
More informationKatsuhiro Sugita Faculty of Law and Letters, University of the Ryukyus. Abstract
Bayesian analysis of a vector autoregressive model with multiple structural breaks Katsuhiro Sugita Faculty of Law and Letters, University of the Ryukyus Abstract This paper develops a Bayesian approach
More information