MontePython: the way back from cosmology to Fundamental Physics. Miguel Zumalacárregui, Nordic Institute for Theoretical Physics and UC Berkeley


MontePython: the way back from cosmology to Fundamental Physics. Nordic Institute for Theoretical Physics and UC Berkeley. IFT School on Cosmology Tools, March 2017

Related Tools: version control and Python interfacing

Version control: Git + GitHub
Git: access all previous versions, restore, compare, branch...
GitHub: website + interface
- Network: users, branches, intermediate versions
- Issues: troubleshooting, forum for improvements
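The operations listed above map onto standard Git commands; as an illustration (generic Git, nothing CLASS-specific):

git log             # browse all previous versions
git checkout <rev>  # restore the code at an older revision
git diff <rev>      # compare the working copy with an older revision
git branch          # list or create branches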

Python interfacing with classy
classy: use CLASS as a Python module
- Required for MCMC (tomorrow!)
- Useful for plotting

from classy import Class
import numpy as np
import matplotlib.pyplot as plt

cosmo = Class()
cosmo.set({'output': 'tCl,pCl,lCl', 'lensing': 'yes'})
cosmo.compute()

l = np.array(range(2, 2501))
factor = l*(l+1)/(2*np.pi)
lensed_cl = cosmo.lensed_cl(2500)
# then just plot lensed_cl... (see the sketch below)

IPython: interactive Python frontend
- TAB auto-completion
Jupyter: notebook interface (Julia+Python+R)
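To complete the example, a minimal sketch of the plotting step hinted at above ("then just plot lensed_cl..."); it assumes the l, factor and lensed_cl variables from the snippet, and uses the fact that classy's lensed_cl() returns a dictionary whose 'tt' entry holds the lensed temperature spectrum:

# plot the lensed TT spectrum computed above
cls_tt = lensed_cl['tt'][2:]            # C_l^TT for l = 2..2500
plt.loglog(l, factor * cls_tt)
plt.xlabel(r'$\ell$')
plt.ylabel(r'$\ell(\ell+1)\,C_\ell^{TT}/2\pi$')
plt.show()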

From cosmology back to fundamental physics

DISCLAIMER: short time! This 3h course gives an overview and basic usage. Learn further: MontePython slides by Sebastien Clesse (~1h): https://lesgourg.github.io/class-tour/16.06.02_lisbon_intro.pdf ; MontePython course (~5h): https://lesgourg.github.io/class-tour-tokyo.html ; links to extra resources in the exercise sheet. IMPORTANT DISCLAIMER: I'm mainly a user with little experience developing! Help from experts: Thejs Brinckmann, Carlos Garcia, Deanna Hooper & Vivian Poulin

Fundamental physics and cosmology
[Figure: matter power spectrum P(k) [1/Mpc³] and relative deviation [%] over k = 10⁻⁴ to 1 Mpc⁻¹, for neutrino masses m_ν = (m_i, 0, 0) with m_i between 0.00 and 1.00 eV]
Initial conditions, Dark Matter, Neutrinos, Dark Energy, Gravity...

Scan space of parameters (recall lecture by Will Handley)

MontePython
A Markov chain Monte Carlo engine for parameter extraction. Features:
Written in Python
- Python is practically magic!
- imports routines from numpy and scipy
- useful outside academia, standard for Big Data
Uses CLASS through the classy wrapper
Modular, easy to add
- likelihoods for new experiments
- features for sampling, plotting...
Easy to use, extensively documented
Parallelization is optional
- simpler to install
- runs on old/separate computers, short queue

Modular structure (from B. Audren's slides)

Defining a run (model.param)
List of experiments:
data.experiments = ['Planck_highl', 'Planck_lowl', 'Planck_lensing']
Collected in montepython/likelihoods.
Cosmological parameters:
# [mean, (bounds), sigma, scale, type]
data.parameters['n_s'] = [0.96, None, None, 0.008, 1, 'cosmo']
Fixed values:
- set sigma = 0
- data.cosmo_arguments['N_ncdm'] = 1
Derived and nuisance parameters:
data.parameters['sigma8'] = [0, None, None, 0, 1, 'derived']
data.parameters['A_cib_217'] = [61, 0, 200, 7, 1, 'nuisance']
MCMC parameters: data.n = 10, data.write_step = 5, ...
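One entry worth spelling out is the scale column: the sampled value is multiplied by it, which keeps proposal steps well conditioned for parameters that are numerically very small or large. As an illustration (numbers modeled on MontePython's default Planck parameter files, an example rather than a prescription):

# A_s is sampled as a number of order 2 and rescaled by 1e-9
data.parameters['A_s'] = [2.22, None, None, 0.055, 1e-9, 'cosmo']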

Running (run)
Single chain:
python montepython/montepython.py run \
    -p model.param \
    -o output_directory (...)
Parallel run (4 chains):
mpirun -np 4 python (...)
Some options:
-N: number of points
-C: covariance matrix
-r: restart from last point of chain
--update: update sampling + covariance
All options explained:
python montepython/montepython.py run --help
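For instance, combining the options above, a follow-up run that reuses the covariance matrix produced by an earlier analysis might look like this (file names are illustrative):

python montepython/montepython.py run \
    -p model.param \
    -o output_directory \
    -C output_directory/previous.covmat \
    -N 50000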

Analyzing results (info)
Single model/experiment:
python montepython/montepython.py info \
    output_directory (...)
Comparing several runs:
python montepython/montepython.py info \
    output_1 output_2 output_3 (...)
Configuring the output/analysis:
--extra: file with plot options
--bins: number of bins for the posterior
--all: plot every subplot separately
--no-mean: only the marginalized posterior in 1D
All options explained:
python montepython/montepython.py info --help

A very minimal run
Write lckb.param:

data.experiments = ['bao_boss', 'bao_boss_aniso']

# Cosmo parameters: [mean, min, max, sigma, scale, type]
data.parameters['Omega_b']   = [0.045, 0.01, None, 0.01, 1, 'cosmo']
data.parameters['Omega_cdm'] = [0.3,   0,    None, 0.1,  1, 'cosmo']
data.parameters['Omega_k']   = [0.0,  -0.5,  0.5,  0.1,  1, 'cosmo']

# Fixed parameters (sigma = 0)
data.parameters['H0'] = [67.8, None, None, 0, 1, 'cosmo']
data.cosmo_arguments['YHe'] = 0.24

# Derived parameters
data.parameters['Omega_Lambda'] = [1, None, None, 0, 1, 'derived']

# MCMC parameters
data.n = 10
data.write_step = 5

Run 7 chains with:
python montepython/montepython.py run -o chains/lckb_bao \
    -p lckb.param --update 300 -N 100000

A very minimal run
The 7 chains explore the parameter space. Chains are named yyyy-mm-dd_N__n.txt (date, number of points, chain id).

A very minimal run
Analyze:
python montepython/montepython.py info chains/lckb_bao

lckb_bao.tex: table with MCMC results

Param    best-fit    mean ± σ                  95% lower   95% upper
Ω_b       0.03595     0.03977 +0.0095 -0.015     0.01662     0.06547
Ω_cdm     0.2931      0.2872  +0.049  -0.048     0.1892      0.3847
Ω_k      -0.1183     -0.08087 +0.11   -0.14     -0.3182      0.1755
Ω_Λ       0.7891      0.7538  +0.12   -0.091     0.542       0.9564

-ln L_min = 5.57269, minimum χ² = 11.15

lckb_bao.covmat: covariance matrix
lckb_bao.bestfit: best-fit values
Both can be used as arguments for another run.

A very minimal run
In lckb_bao/plots: triangle plot with the 1D marginalized posteriors and 2D contours for (Ω_b, Ω_cdm, Ω_k, Ω_Λ). The panel titles give the marginalized constraints: Ω_b = 0.0398 +0.00953 -0.0147, Ω_cdm = 0.287 +0.0486 -0.0478, Ω_k = -0.0809 +0.11 -0.14, Ω_Λ = 0.754 +0.12 -0.091.

Comparing several runs
Run chains lckb_sne with data.experiments = ['sne']
Analyze:
python ... info chains/lckb_sne chains/lckb_bao
In lckb_sne/plots: the same triangle plot for (Ω_b, Ω_cdm, Ω_k, Ω_Λ), now overlaying the lckb_sne and lckb_bao posteriors.

Experiments and Likelihoods
Simplest example ever: a prior on H_0 (1103.2976). The Hubble Space Telescope measured h_obs = 0.738 ± 0.024, so

log(L) = -(h_th - h_obs)² / (2σ_h²)

Likelihood (montepython/likelihoods/hst/__init__.py):

from montepython.likelihood_class import Likelihood_prior

class hst(Likelihood_prior):
    def loglkl(self, cosmo, data):
        h = cosmo.h()
        loglkl = -0.5 * (h - self.h) ** 2 / (self.sigma ** 2)
        return loglkl

Data (montepython/likelihoods/hst/hst.data):

# Values for the Hubble Space Telescope (following 1103.2976)
hst.h = 0.738
hst.sigma = 0.024

Likelihood rules
Likelihoods live in the directory montepython/likelihoods/l_name. Needed files: __init__.py and l_name.data. __init__.py defines a class, inheriting from Likelihood, that contains the function loglkl returning log(L).
Introducing your own likelihoods:
- Follow the rules above
- Take inspiration from the existing examples; if a similar likelihood exists, you can inherit its methods!
- You can use additional Python packages
A minimal sketch is given below. (See also B. Audren's lecture on likelihoods.)
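As an illustration of these rules, here is a sketch of a new likelihood directory montepython/likelihoods/my_prior; the name my_prior, the choice of sigma8 and the attribute names are hypothetical, modeled on the hst example above:

# montepython/likelihoods/my_prior/__init__.py (hypothetical example)
from montepython.likelihood_class import Likelihood_prior

class my_prior(Likelihood_prior):
    # Gaussian prior on sigma8, with mean and sigma read from my_prior.data
    def loglkl(self, cosmo, data):
        s8 = cosmo.sigma8()  # value computed by CLASS via classy
        return -0.5 * (s8 - self.mean) ** 2 / self.sigma ** 2

and the companion montepython/likelihoods/my_prior/my_prior.data:

# hypothetical numbers, replace with your measurement
my_prior.mean = 0.8
my_prior.sigma = 0.03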

Conclusions
MontePython brings all the power of CLASS to Python. It is easy to run chains and analyze likelihoods, and many experiments are already available. Python's object-oriented features make it straightforward to
- add likelihoods
- add samplers or other features
This just scratches the surface; there are many more options! (See also B. Audren's slides)

The hi_class academy. Coming soon! A set of interrelated projects: theory & model building, implementation and phenomenology, comparison with data. Collaboration, publishable results, a review of models, observational constraints. www.hiclass-code.net. Stay tuned for more info!