Astronomical Data Analysis I. Dr Martin Hendry, Dept of Physics and Astronomy, University of Glasgow, UK. 10 lectures, beginning October 2006
Bayes' theorem: p(x | y, I) = p(y | x, I) p(x | I) / p(y | I)
4. Monte Carlo Methods

In many data analysis problems it is useful to create mock datasets, in order to test models and explore possible systematic errors, e.g. mock galaxy catalogues (from Cole et al. 1998).

We need methods for generating random variables, i.e. samples of numbers which behave as if they are drawn from some particular pdf (e.g. uniform, Gaussian, Poisson etc). We call these Monte Carlo methods.
4.1 Uniform random numbers

Generating uniform random numbers, drawn from the pdf U[0,1], is fairly easy: any scientific calculator will have a RAN function. Better examples of U[0,1] random number generators can be found in Numerical Recipes.

In what sense are they better? Algorithms only generate pseudo-random numbers: very long (deterministic) sequences of numbers which are approximately random (i.e. have no discernible pattern). The better the RNG, the better it approximates U[0,1].
We can test pseudo-random numbers for randomness in several ways:

(a) Histogram of sampled values. We can use hypothesis tests to see if the sample is consistent with the pdf we are trying to model, e.g. a chi-squared test applied to the numbers in each histogram bin:

    \chi^2 = \sum_{i=1}^{n_{\rm bin}} \left( \frac{n_i^{\rm obs} - n_i^{\rm pred}}{\sigma_i} \right)^2    (4.1)

We assume the bin number counts are subject to Poisson fluctuations, so that \sigma_i^2 = n_i^{\rm pred}. Note: the number of degrees of freedom is n_bin - 1, since we know the total sample size.
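As a minimal illustration of this histogram test (my sketch, not from the lecture notes; it assumes NumPy and SciPy are available):

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(42)              # pseudo-random number generator
x = rng.uniform(0.0, 1.0, size=10000)        # sample that should behave like U[0,1]

n_bin = 20
n_obs, _ = np.histogram(x, bins=n_bin, range=(0.0, 1.0))
n_pred = len(x) / n_bin                      # expected counts per bin under U[0,1]
sigma2 = n_pred                              # Poisson fluctuations: sigma_i^2 = n_i^pred

chisq = np.sum((n_obs - n_pred) ** 2 / sigma2)   # eq. (4.1)
dof = n_bin - 1                                  # total sample size is fixed
p_value = chi2.sf(chisq, dof)                    # probability of a larger chi^2 by chance
print(f"chi^2 = {chisq:.1f} for {dof} dof, p = {p_value:.3f}")
```

A very small p-value would indicate that the generator is not consistent with U[0,1].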
(b) Correlations between neighbouring pseudo-random numbers. Sequential patterns in the sampled values would show up as structure in the phase portraits: scatterplots of the i-th value against the (i+1)-th value, and so on.

We can also compute the auto-correlation function

    \rho(j) = \frac{\sum_i (x_i - \bar{x})(x_{i+j} - \bar{x})}{\sqrt{\sum_i (x_i - \bar{x})^2 \, \sum_i (x_{i+j} - \bar{x})^2}}    (4.2)

where j is known as the lag. If the sequence is uniformly random, we expect \rho(j) = 1 for j = 0 and \rho(j) = 0 otherwise.
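A possible NumPy implementation of eq. (4.2) for a single lag (an illustrative sketch, not code from the notes):

```python
import numpy as np

def autocorrelation(x, j):
    """Sample auto-correlation of the sequence x at lag j, as in eq. (4.2)."""
    x = np.asarray(x, dtype=float)
    xbar = x.mean()
    a = x[:len(x) - j] - xbar          # x_i     - xbar
    b = x[j:] - xbar                   # x_{i+j} - xbar
    return np.sum(a * b) / np.sqrt(np.sum(a ** 2) * np.sum(b ** 2))

rng = np.random.default_rng(1)
x = rng.uniform(size=100000)
print(autocorrelation(x, 0))   # 1 by construction
print(autocorrelation(x, 1))   # close to 0 for a good generator
```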
4.2 Variable transformations

Generating random numbers from other pdfs can be done by transforming random numbers drawn from simpler pdfs. The procedure is similar to changing variables in integration. Suppose, e.g., x ~ p(x), and let y = y(x) be monotonic. Then

    p(y) dy = p(x) dx    (4.3)

(the probability of a number between y and y+dy equals the probability of a number between x and x+dx), so that

    p(y) = p(x(y)) \left| \frac{dx}{dy} \right|    (4.4)

where the modulus appears because probability must be positive.
We can extend the expression given in eq. (4.4) to the case where y(x) is not monotonic, by summing over all the values x_i that map to y:

    p(y) dy = \sum_i p(x_i) dx_i

so that

    p(y) = \sum_i p(x_i(y)) \left| \frac{dx_i}{dy} \right|    (4.5)
Example 1. Suppose we have x ~ U[0,1], so that

    p(x) = 1 for 0 < x < 1, and 0 otherwise    (4.6)

Define

    y = a + (b - a) x    (4.7)

so that

    \frac{dy}{dx} = b - a    (4.8)

i.e.

    p(y) = \frac{1}{b - a} for a < y < b, and 0 otherwise    (4.9)

or y ~ U[a, b].
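In code this transformation is a one-liner; the following sketch (my illustration, not from the notes) checks that the transformed sample does look uniform on [a, b]:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 2.0, 5.0

x = rng.uniform(size=100000)     # x ~ U[0,1]
y = a + (b - a) * x              # eq. (4.7): y ~ U[a,b]

print(y.min(), y.max())          # all values lie within (a, b)
print(y.mean(), (a + b) / 2)     # sample mean vs. the expected (a+b)/2
```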
Example 2. Numerical Recipes provides a routine to turn x ~ U[0,1] into y ~ N[0,1], a normal pdf with mean zero and standard deviation unity. Suppose we want z ~ N[\mu, \sigma]. We define

    z = \mu + \sigma y    (4.10)

so that

    \frac{dz}{dy} = \sigma    (4.11)

Now

    p(y) = \frac{1}{\sqrt{2\pi}} \exp\left( -\tfrac{1}{2} y^2 \right)    (4.12)

so

    p(z) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left[ -\tfrac{1}{2} \left( \frac{z - \mu}{\sigma} \right)^2 \right]    (4.13)

The variable transformation formula is also the basis for the error propagation formulae we use in the lab. See example sheet 3 for more on this.
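A minimal sketch of the same transformation, using NumPy's standard-normal generator in place of the Numerical Recipes routine (the values of mu and sigma are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 10.0, 2.5

y = rng.standard_normal(100000)   # y ~ N[0,1]
z = mu + sigma * y                # eq. (4.10): z ~ N[mu, sigma]

print(z.mean(), z.std())          # should be close to mu and sigma
```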
4.3 Probability integral transform

One particular variable transformation merits special attention. Suppose we can compute the CDF, P(x), of the desired random variable, i.e. the probability that the variable takes a value less than x.
The recipe is then:

1) Sample a random variable y ~ U[0,1].

2) Compute x such that y = P(x), i.e. x = P^{-1}(y).

3) Then x ~ p(x), i.e. x is drawn from the pdf corresponding to the CDF P(x).
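For instance, an exponential pdf p(x) = \lambda e^{-\lambda x} has the analytically invertible CDF P(x) = 1 - e^{-\lambda x}. A short sketch of the three steps (my illustration, not the worked example from the slides, which appears there only as a figure):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 1.5                          # rate parameter of the exponential pdf

y = rng.uniform(size=100000)       # step 1: y ~ U[0,1]
x = -np.log(1.0 - y) / lam         # step 2: x = P^{-1}(y) for P(x) = 1 - exp(-lam * x)

print(x.mean(), 1.0 / lam)         # step 3: x ~ p(x), so the sample mean approaches 1/lam
```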
4.4 Rejection sampling

Suppose we want to sample from some pdf p_1(x), and we know a function p_2(x) such that p_1(x) < p_2(x) everywhere, which we have an easy way to sample from.

1) Sample x_1 from p_2(x).

2) Sample y ~ U[0, p_2(x_1)].

3) If y < p_1(x_1), ACCEPT; otherwise REJECT.

The set of accepted values {x_i} is a sample from p_1(x).
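A minimal Python sketch of these three steps. The target p_1 (a Beta(2,2) pdf) and the constant envelope p_2(x) = 1.6 are illustrative choices of mine, not part of the notes:

```python
import numpy as np

rng = np.random.default_rng(0)

def p1(x):
    """Target pdf: Beta(2,2), i.e. p1(x) = 6 x (1 - x) on [0,1]."""
    return 6.0 * x * (1.0 - x)

p2_max = 1.6                           # constant envelope, > max p1(x) = 1.5 on [0,1]

samples = []
while len(samples) < 10000:
    x1 = rng.uniform()                 # step 1: sample from the (flat) envelope
    y = rng.uniform(0.0, p2_max)       # step 2: y ~ U[0, p2(x1)]
    if y < p1(x1):                     # step 3: accept if y falls under p1
        samples.append(x1)

samples = np.array(samples)
print(samples.mean())                  # close to 0.5, the mean of Beta(2,2)
```

Here the acceptance rate is 1/1.6, roughly 63%; the rejected draws correspond to the shaded region discussed next.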
The method can be very slow if the shaded region between p_2(x) and p_1(x) is too large. Ideally we want to find a pdf p_2(x) that is: (a) easy to sample from, and (b) close to p_1(x).
4.5 Markov Chain Monte Carlo

This is a very powerful method, new (at least in astronomy!), for sampling from pdfs, which can be complicated and/or of high dimension. MCMC is widely used, e.g. in cosmology, to determine the maximum likelihood model fitted to CMBR data. [Figure: angular power spectrum of the CMBR temperature fluctuations, with the ML cosmological model depending on 7 different parameters.]
Consider a 2-D example (e.g. a bivariate normal distribution): the likelihood function depends on parameters a and b, and suppose we are trying to find the maximum of L(a,b).

1) Start off at some randomly chosen value (a_1, b_1).

2) Compute L(a_1, b_1) and the gradient (\partial L/\partial a, \partial L/\partial b) at (a_1, b_1).

3) Move in the direction of steepest positive gradient, i.e. the direction in which L(a_1, b_1) is increasing fastest.

4) Repeat from step 2 until (a_n, b_n) converges on the maximum of the likelihood.

This is OK for finding the maximum, but not for generating a sample from L(a,b), or for determining errors on the ML parameter estimates.
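A rough sketch of this hill-climbing idea (my own illustration: the toy likelihood, the finite-difference gradient, and the fixed step size are assumptions, and I work with log L for numerical convenience):

```python
import numpy as np

def logL(a, b):
    """Log-likelihood of a toy bivariate normal peaked at (1.0, 2.0)."""
    return -0.5 * ((a - 1.0) ** 2 / 0.5 ** 2 + (b - 2.0) ** 2 / 1.5 ** 2)

def gradient(a, b, h=1e-5):
    """Finite-difference estimate of (dlogL/da, dlogL/db)."""
    ga = (logL(a + h, b) - logL(a - h, b)) / (2 * h)
    gb = (logL(a, b + h) - logL(a, b - h)) / (2 * h)
    return np.array([ga, gb])

point = np.array([5.0, -3.0])          # step 1: starting point
step = 0.1
for _ in range(2000):                  # steps 2-4: repeat until converged
    g = gradient(*point)
    if np.linalg.norm(g) < 1e-6:       # gradient ~ 0 at the maximum
        break
    point = point + step * g           # step 3: move uphill
print(point)                           # close to the true peak (1.0, 2.0)
```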
MCMC provides a simple recipe, the Metropolis algorithm, for generating random samples of points from L(a,b):

1. Sample a random initial point P_1 = (a_1, b_1).

2. Centre a new pdf, Q, called the proposal density, on P_1.

3. Sample a tentative new point P' = (a', b') from Q.

4. Compute the ratio

    R = \frac{L(a', b')}{L(a_1, b_1)}

[Figure: slice through L(a,b) in the (a, b) plane, showing P_1, the proposal density Q centred on it, and the tentative point P'.]
5. If R > 1, this means P' is uphill from P_1. We accept P' as the next point in our chain, i.e. P_2 = P'.

6. If R < 1, this means P' is downhill from P_1. In this case we may reject P' as our next point; in fact, we accept P' with probability R. How do we do this?

(a) Generate a random number x ~ U[0,1].
(b) If x < R then accept P' and set P_2 = P'.
(c) If x > R then reject P' and set P_2 = P_1.

The acceptance probability depends only on the previous point: this is what makes the sequence a Markov Chain.
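A compact sketch of the full Metropolis loop for the 2-D example (my illustration; the Gaussian proposal density, its width, and the toy likelihood are assumptions, not the notes' own code):

```python
import numpy as np

rng = np.random.default_rng(0)

def L(a, b):
    """Toy likelihood: bivariate normal centred on (1.0, 2.0)."""
    return np.exp(-0.5 * ((a - 1.0) ** 2 / 0.5 ** 2 + (b - 2.0) ** 2 / 1.5 ** 2))

def metropolis(n_steps, proposal_width=0.5):
    chain = np.empty((n_steps, 2))
    p = np.array([0.0, 0.0])                                  # step 1: initial point P_1
    for i in range(n_steps):
        p_new = p + proposal_width * rng.standard_normal(2)   # steps 2-3: propose P' from Q
        R = L(*p_new) / L(*p)                                 # step 4: likelihood ratio
        if R > 1 or rng.uniform() < R:                        # step 5: accept uphill moves
            p = p_new                                         # step 6: downhill with probability R
        chain[i] = p                                          # a rejection repeats the old point
    return chain

chain = metropolis(50000)
```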
So the Metropolis algorithm generally (but not always) moves uphill, towards the peak of the likelihood function.

Remarkable facts:

The sequence of points {P_1, P_2, P_3, P_4, P_5, ...} represents a sample from the likelihood function L(a,b).

The sequence for each coordinate, e.g. {a_1, a_2, a_3, a_4, a_5, ...}, samples the marginalised likelihood of a.

We can make a histogram of {a_1, a_2, ..., a_n} and use it to compute the mean and variance of a (i.e. to attach an error bar to a).
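Continuing the sketch above (again an illustration rather than the notes' own code), the marginal mean and error bar for a come straight from the first column of the chain:

```python
# continuing from the metropolis() sketch above
a_samples = chain[:, 0]                            # marginalised samples of a
print(a_samples.mean(), a_samples.std())           # estimate of a and its 1-sigma error bar
counts, edges = np.histogram(a_samples, bins=50)   # histogram of the marginalised likelihood of a
```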
Why is this so useful? Suppose our likelihood function were a 1-D Gaussian. We could estimate the mean and variance quite well from a histogram of, e.g., 1000 samples. But what if our problem is, e.g., 7-dimensional? Normal sampling would need (1000)^7 samples!

MCMC provides a short-cut. To compute a new point in our Markov chain we need to compute the likelihood function, but the computational cost does not grow so dramatically as we increase the number of dimensions of our problem. This lets us tackle problems that would be impossible by normal sampling.
Example: CMBR constraints from the WMAP 3-year data. [Figure: angular power spectrum of the CMBR temperature fluctuations, with the ML cosmological model depending on 7 different parameters.] A possible Honours Lab Project!