Markov chain Monte Carlo and stochastic origin ensembles methods: Comparison of a simple application for a Compton imager detector


Pierre-Luc Drouin
DRDC Ottawa Research Centre
Defence Research and Development Canada

Scientific Report
DRDC-RDDC-2016-R124
September 2016

© Her Majesty the Queen in Right of Canada, as represented by the Minister of National Defence, 2016
© Sa Majesté la Reine (en droit du Canada), telle que représentée par le ministre de la Défense nationale, 2016

Abstract

This document aims to summarise Markov Chain Monte Carlo (MCMC) methods, in particular the Metropolis-Hastings algorithm and the Stochastic Origin Ensembles (SOE) method, in a concise and notation-consistent manner. These methods are commonly used to perform model parameter estimation for a population, based on a measured sample, through the sampling of the probability distribution for these parameters. A simple application of SOE is then demonstrated using simulation data from a Compton imager detector.

Significance for defence and security

In Radiation and Nuclear (RN) defence, many detection technologies rely on model parameter estimation to perform threat detection, identification and/or localisation. Such algorithms are often required when direct observation of the parameters of concern is not possible. Some detection systems, such as Compton imagers and muon tomography systems, need to make extensive use of such algorithms compared to traditional technologies. This document summarises a set of algorithms which represent good candidates for the estimation of the parameters for these new technologies. Hardware and software development of such systems is currently ongoing at DRDC Ottawa Research Centre.

Résumé

This document aims to summarise Markov Chain Monte Carlo (MCMC) methods, in particular the Metropolis-Hastings algorithm and the Stochastic Origin Ensembles (SOE) method, in a concise manner and with a consistent notation. These methods are frequently used to estimate the parameters of a model for a population, based on the measurement of a sample, through the sampling of the probability distribution for these parameters. A simple application of the SOE method is then demonstrated using simulated data from a Compton imager.

Importance pour la défense et la sécurité

In Radiological and Nuclear (RN) defence, several detection technologies rely on the estimation of model parameters in order to perform threat detection, identification and/or localisation. Such algorithms are often required when direct observation of the parameters of concern is not possible. Some detection systems, such as Compton imagers and muon tomography systems, require extensive use of such algorithms compared to traditional technologies. This document summarises a set of algorithms that represent good candidates for estimating the parameters for these new technologies. Hardware and software development of such systems is ongoing at DRDC Ottawa Research Centre.

Table of contents

Abstract
Significance for defence and security
Résumé
Importance pour la défense et la sécurité
Table of contents
List of figures
List of tables
1 Introduction
2 Metropolis-Hastings algorithm
3 Sampling from the probability density function of an evaluated model
4 Stochastic origin ensembles method
5 Application of SOE for a Compton imager
6 Comparison of the SOE algorithm
7 Limitations of the simple SOE algorithm
8 Conclusion
References

List of figures

Figure 1: Images of a simulated 137Cs point source located 10° off-axis as produced using simple back projection (top), smoothed MLEM (left) and SOE (right).
Figure 2: Images of a simulated 137Cs C-shaped source as produced using simple back projection, MLEM and SOE with 2° ("Coarse") and 1° ("Fine") prior PDF binning.

List of tables

Table 1: Measured mean width for a 2D Gaussian fitted on the distributions resulting from the three different algorithms.

1 Introduction

In statistical data analysis, parameter estimation techniques are commonly used to determine model parameters for a population, based on a measured sample. Maximum-likelihood estimation is a well known and widely used approach, which aims at determining the model parameters that maximise the probability of the measured sample. Although this technique is well suited for a large number of applications, it can be difficult to apply in some situations in practice, for example when the number of unknown model parameters is very large and/or when the model can be defined more easily by introducing latent (hidden) variables. This document first presents a summary of Markov Chain Monte Carlo (MCMC) methods, in particular the Metropolis-Hastings algorithm, which can be used to perform parameter estimation in such situations. The Stochastic Origin Ensembles (SOE) method, which represents a particular application of the Metropolis-Hastings algorithm, is then described, and results from a simple application of this method are compared to alternative algorithms. As shown in this document, advanced radiation imaging systems, such as Compton imagers, greatly benefit from models which use a large number of latent variables. These imagers allow for a quick localisation of point or extended radioactive sources that can be partially shielded by the material of a cluttered environment. Such a task can represent a real challenge when attempted using traditional detectors.

2 Metropolis-Hastings algorithm

The Metropolis-Hastings algorithm [1, 2] is a random walk Monte Carlo method, a subclass of the MCMC methods, which samples from a Probability Density Function (PDF) f(θ) through the use of a state vector θ that moves randomly in the parameter space of the PDF. A Markov process is uniquely defined by the expression of a transition PDF f(θ → θ′), which provides the probability density of transiting from state θ to state θ′. A Markov chain, defined as a sequence of such state vectors, is thus memoryless, since the probability of transition to the following state depends only on the current state. A Markov process reaches a unique stationary distribution π(θ) = f(θ) asymptotically when the two following conditions are met: the condition of detailed balance,

\pi(\theta)\, f(\theta \to \theta') = \pi(\theta')\, f(\theta' \to \theta),    (1)

and the Markov process must be ergodic, meaning that the process must be aperiodic and must be able to return to any given state θ in a finite number of steps (irreducible process).
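To make the detailed balance condition concrete, here is a minimal numerical check (not from the report; the two-state chain and its numbers are purely illustrative, written in Python) that a transition matrix built to satisfy Equation (1) leaves the target distribution π stationary.

```python
import numpy as np

# Hypothetical two-state example: target distribution pi over states {0, 1}.
pi = np.array([0.25, 0.75])

# Choose a transition probability 0 -> 1, then set 1 -> 0 so that
# pi[0] * P[0, 1] = pi[1] * P[1, 0]   (detailed balance, Equation (1)).
p01 = 0.6
p10 = pi[0] * p01 / pi[1]

P = np.array([[1.0 - p01, p01],
              [p10, 1.0 - p10]])

# A distribution satisfying detailed balance is stationary: pi P = pi.
print(pi @ P)                    # -> [0.25 0.75]
print(np.allclose(pi @ P, pi))   # -> True
```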

The Metropolis-Hastings algorithm ensures that Condition (1) is met by enforcing it directly through the relationship between f(θ) and f(θ → θ′):

f(\theta)\, f(\theta \to \theta') = f(\theta')\, f(\theta' \to \theta)    (2)

\frac{f(\theta \to \theta')}{f(\theta' \to \theta)} = \frac{f(\theta')}{f(\theta)}.    (3)

The transition PDF f(θ → θ′) is then expressed as the product of a proposal PDF g(θ → θ′) and an acceptance PDF h(θ → θ′):

f(\theta \xrightarrow{\text{trans.}} \theta') = f(\theta \xrightarrow{\text{prop.}} \theta',\ \theta \xrightarrow{\text{accept.}} \theta')
                                             = f(\theta \xrightarrow{\text{prop.}} \theta')\, f(\theta \xrightarrow{\text{accept.}} \theta' \mid \theta \xrightarrow{\text{prop.}} \theta')
                                             = g(\theta \to \theta')\, h(\theta \to \theta').    (4)

In order to fulfill the ergodicity condition, g(θ → θ′) must be aperiodic, and both g(θ → θ′) and h(θ → θ′) must allow f(θ → θ′) to be irreducible. From Equations (3) and (4), the condition on h(θ → θ′) is

\frac{h(\theta \to \theta')}{h(\theta' \to \theta)} = \frac{g(\theta' \to \theta)\, f(\theta')}{g(\theta \to \theta')\, f(\theta)}.    (5)

The Metropolis-Hastings algorithm uses the following expression for h(θ → θ′) to satisfy the above constraint:

h(\theta \to \theta') = \min\!\left\{ 1,\ \frac{g(\theta' \to \theta)\, f(\theta')}{g(\theta \to \theta')\, f(\theta)} \right\}.    (6)

Equation (6) satisfies the irreducibility condition when g(θ → θ′) is irreducible. The Metropolis-Hastings algorithm thus relies uniquely on g(θ → θ′) to satisfy the ergodicity condition, while the expression for h(θ → θ′) ensures that the condition of detailed balance can be reached. Note also that, due to the form of Equation (6), the Metropolis-Hastings expression for the transition PDF f(θ → θ′) is insensitive to the normalisation or scaling of f(θ), meaning that the expression for f(θ) needs only to be known up to a scaling factor that does not depend on θ, which can greatly simplify computation in practice.

Using the above results and an initial state θ_s with s = 0, the Metropolis-Hastings algorithm can be executed as follows:

1. Draw a random proposed state θ′ according to the chosen ergodic proposal PDF g(θ_s → θ′).

2. Compute

\frac{g(\theta' \to \theta_s)\, f(\theta')}{g(\theta_s \to \theta')\, f(\theta_s)}.

If the resulting value is greater than or equal to 1, or otherwise greater than or equal to a random number drawn uniformly in the interval ]0,1], the proposed state is accepted: θ_{s+1} = θ′, s is incremented by 1 and the algorithm continues to Step 1.

3. Otherwise, θ_{s+1} = θ_s, s is incremented by 1 and the algorithm continues to Step 1.

This procedure allows the distribution of states to converge to f(θ) once equilibrium is reached. However, the choice of the initial state θ_0, as well as of the proposal PDF g(θ → θ′), can greatly affect the number of steps required to reach this regime. Also, unless g(θ → θ′) has a negligible dependency on θ and the acceptance of θ′ is very likely, there can be a significant autocorrelation within the MCMC chain, such that consecutive states cannot be considered statistically independent. The usage of correlated states for estimator computation can lead to biased results. However, choosing a function g(θ → θ′) having a weak dependency on θ often results in a very small probability of acceptance for θ′. It can thus be advantageous to accept a higher dependency within g(θ → θ′) to increase the probability of acceptance. A reduced chain of states with low autocorrelation can then be obtained by sampling the original chain at a fixed interval which is sufficiently large for the desired intent. A Metropolis-Hastings algorithm thus often involves the rejection of a number of burn-in steps required to reach equilibrium, followed by a sampling of the remaining steps to ensure a sufficiently low autocorrelation. The determination of the number of burn-in steps and of the sampling rate is discussed in detail in [3]. In the case of multi-dimensional state vectors, a common method to easily increase the level of acceptance consists in a proposal function that randomly updates a single component of θ at each step. This implies a sampling rate of the chain which is greater than the dimension of the state vector in order to achieve statistical independence between the states of the reduced chain.
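As an illustration of the procedure above, the following minimal Python sketch (not part of the report; the one-dimensional target, proposal width and step counts are arbitrary choices) samples from a density known only up to a scaling factor using a symmetric Gaussian proposal, then applies the burn-in rejection and thinning discussed above.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_unnorm(theta):
    """Target density known only up to a scaling factor (here a Gaussian shape)."""
    return np.exp(-0.5 * (theta - 2.0) ** 2)

def metropolis_hastings(n_steps, theta0=0.0, prop_sigma=1.0):
    chain = np.empty(n_steps)
    theta = theta0
    for s in range(n_steps):
        # Symmetric proposal, so g(theta' -> theta) / g(theta -> theta') = 1 in Eq. (6).
        theta_prop = theta + rng.normal(0.0, prop_sigma)
        ratio = f_unnorm(theta_prop) / f_unnorm(theta)
        if ratio >= 1.0 or ratio >= rng.uniform(0.0, 1.0):
            theta = theta_prop          # accept the proposed state
        chain[s] = theta                # a rejection keeps the current state
    return chain

chain = metropolis_hastings(50_000)
reduced = chain[5_000::10]              # burn-in rejection, then thinning
print(reduced.mean(), reduced.std())    # should approach 2.0 and 1.0
```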

3 Sampling from the probability density function of an evaluated model

MCMC methods can be used to sample from the PDF of a model which is evaluated using a data sample. Let the conditional probability density of a single measured event e, as computed by a model characterised by a set of parameters θ, be defined as f(x_e | θ), where x_e represents the coordinates of the event in the measurement parameter space. Assuming the statistical independence of the events within a measured sample, the PDF of the sample, its likelihood function, can be expressed as

L(\mathrm{sample} \mid \theta) = f(\mathrm{sample} \mid \theta) = \prod_{e=1}^{n_{\mathrm{events}}} f(x_e \mid \theta),    (7)

where n_events is the number of measured events. In the common case of a measurement where the number of measured events is Poisson distributed, an extended likelihood function is defined as

L_e(\mathrm{sample} \mid \theta) = f_e(\mathrm{sample} \mid \theta) = P(n_{\mathrm{events}} \mid \nu)\, f(\mathrm{sample} \mid \theta)
                                 = \frac{e^{-\nu}\, \nu^{n_{\mathrm{events}}}}{n_{\mathrm{events}}!} \prod_{e=1}^{n_{\mathrm{events}}} f(x_e \mid \theta),    (8)

where ν is a parameter, a subcomponent of θ, of the Poisson Probability Mass Function (PMF) P(n | ν) = e^{−ν} ν^n / n!, and corresponds to the average number of measured events.

Estimation methods, such as Maximum Likelihood (ML) and Maximum Likelihood Expectation Maximisation (MLEM), provide estimators θ̂ that maximise the likelihood L_(e) of the measured sample. ML methods can estimate the variance of the estimators through different techniques, such as contour lines in the likelihood parameter space. In contrast, MCMC methods can be used to sample from the f_(e)(θ | sample) PDF directly, such that the resulting posterior distribution can be used to evaluate any metric. These posterior distributions contain more information than is provided by ML methods and are thus well suited to characterise estimator uncertainties in the case of non-Gaussian statistics. f(θ) is often expressed as a function of the likelihood function, through the usage of Bayes' theorem:

f_{(e)}(\theta \mid \mathrm{sample}) = L_{(e)}(\mathrm{sample} \mid \theta)\, \frac{f_p(\theta)}{f_p(\mathrm{sample})},    (9)

where f_(e)(θ | sample) represents the MCMC PDF f(θ). In the above expression, f_p(θ) is the prior PDF for the model parameters, while the PDF f_p(sample) is the prior for the measured sample.

Sampling simplifications with the Metropolis-Hastings algorithm

As previously mentioned, for the Metropolis-Hastings algorithm, sampling of a PDF can be performed as long as the PDF is known up to a scaling factor which does not depend on the sampled parameters. This can simplify the task, notably when expressing the PDF as a function of the likelihood function. From Equation (9), it is no longer necessary to evaluate the f_p(sample) expression, and the PDF is now more simply interpreted as

f_{(e)}(\theta \mid \mathrm{sample}) \propto L_{(e)}(\mathrm{sample} \mid \theta)\, f_p(\theta).    (10)
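In practice, an MCMC implementation of Equation (10) only needs the unnormalised log-posterior. A minimal sketch, assuming a hypothetical Gaussian per-event model and a simple prior (neither is from the report), could look like the following.

```python
import numpy as np

def log_event_pdf(x, theta):
    """Hypothetical per-event model f(x_e | theta): Gaussian with mean theta[0], width theta[1]."""
    mu, sigma = theta
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2.0 * np.pi))

def log_prior(theta):
    """Hypothetical prior f_p(theta): flat in the mean, flat in log(width), width > 0."""
    return -np.inf if theta[1] <= 0 else -np.log(theta[1])

def log_posterior_unnorm(theta, sample):
    """log of Equation (10): sum of per-event log-likelihoods plus the log-prior.
    The normalisation f_p(sample) is omitted; it cancels in the Metropolis-Hastings ratio."""
    lp = log_prior(theta)
    if not np.isfinite(lp):
        return -np.inf
    return lp + np.sum(log_event_pdf(sample, theta))

sample = np.random.default_rng(1).normal(2.0, 0.5, size=200)
print(log_posterior_unnorm((2.0, 0.5), sample))
```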

4 Stochastic origin ensembles method

The Stochastic Origin Ensembles method [4] represents a particular application of the Metropolis-Hastings algorithm for a scenario where the parameter vector θ can be subdivided into a set of φ subcomponents,

\theta \equiv (\varphi_1, \varphi_2, \ldots, \varphi_{n_{\mathrm{events}}}),    (11)

each one respectively associated with a corresponding event in the measured sample. The φ components can represent hidden variables of the process and are considered to be independent from each other, since they are associated with independently measured events. A priori, these components nonetheless follow an unknown distribution, and this distribution is approximated using the distribution of the φ components within θ. The f_p(θ) prior PDF is then computed by evaluating the probability density of each subcomponent in this distribution. Usually, the posterior sample of θ parameter values, as provided by the resulting Markov chain, is then used to generate a distribution in the φ parameter space. Due to the statistical independence of the φ subcomponents, f_p(θ) can be expressed as

f_p(\theta) = \prod_{e=1}^{n_{\mathrm{events}}} f(\varphi_e),    (12)

where f(φ) is the unknown distribution which is approximated using the distribution of the φ components within θ:

f(\varphi) \approx f_b(\varphi \mid \theta).    (13)

In the above expression, f_b(φ | θ) is a binned PDF in the φ parameter space, which is populated using the θ sample. The approximation of f(φ) using f_b(φ | θ) does not affect the convergence of the Markov chain to the stationary distribution; this convergence is guaranteed by the Metropolis-Hastings algorithm. We define a linearised bin index k for the binned φ parameter space and let n(k | θ) represent the number of components within the θ sample that fall in bin k. If the function b(φ) provides the bin index k associated with the φ coordinate, then f_b(φ | θ) is proportional to

f_b(\varphi \mid \theta) \propto n(b(\varphi) \mid \theta).    (14)

Using Equations (12) to (14), we then have

f_p(\theta) \propto \prod_{e=1}^{n_{\mathrm{events}}} n(b(\varphi_e) \mid \theta) = \prod_{k=1}^{n_{\mathrm{bins}}} n(k \mid \theta)^{\,n(k \mid \theta)},    (15)

where n_bins is the total number of bins within f_b(φ | θ). With the SOE algorithm, the ratio f(θ′)/f(θ) from the Metropolis-Hastings algorithm is thus given by

\frac{f(\theta')}{f(\theta)} \propto \frac{L(\mathrm{sample} \mid \theta')}{L(\mathrm{sample} \mid \theta)} \prod_{k=1}^{n_{\mathrm{bins}}} \frac{n(k \mid \theta')^{\,n(k \mid \theta')}}{n(k \mid \theta)^{\,n(k \mid \theta)}},    (16)

and is similar to the case of an extended likelihood.

If the implementation of the algorithm is such that a single component of θ is updated by the chosen proposal function g(θ → θ′), then the above expression can be simplified even further. We define o and d as the origin and destination bin indices for the proposed transition, respectively. Thus, when o ≠ d, we have

n(o \mid \theta') = n(o \mid \theta) - 1    (17)

n(d \mid \theta') = n(d \mid \theta) + 1,    (18)

such that

\prod_{k=1}^{n_{\mathrm{bins}}} \frac{n(k \mid \theta')^{\,n(k \mid \theta')}}{n(k \mid \theta)^{\,n(k \mid \theta)}} = \frac{[n(o \mid \theta) - 1]^{\,n(o \mid \theta) - 1}\, [n(d \mid \theta) + 1]^{\,n(d \mid \theta) + 1}}{n(o \mid \theta)^{\,n(o \mid \theta)}\, n(d \mid \theta)^{\,n(d \mid \theta)}}.    (19)

This finally gives

\frac{f(\theta')}{f(\theta)} = \frac{L(\mathrm{sample} \mid \theta')}{L(\mathrm{sample} \mid \theta)} \times
\begin{cases}
1, & \text{if } o = d \\[4pt]
\dfrac{[n(o \mid \theta) - 1]^{\,n(o \mid \theta) - 1}\, [n(d \mid \theta) + 1]^{\,n(d \mid \theta) + 1}}{n(o \mid \theta)^{\,n(o \mid \theta)}\, n(d \mid \theta)^{\,n(d \mid \theta)}}, & \text{otherwise.}
\end{cases}    (20)

The above ratio can thus be used within the Metropolis-Hastings algorithm, as described in the previous section, to sample from f(θ) when the model parameters are associated with the events in the measured sample.
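As a concrete reading of Equation (20), the short sketch below (not from the report; variable names are hypothetical, and the computation is done in log space to avoid overflow of the n^n factors) evaluates the prior part of the ratio for a proposed move of one event from origin bin o to destination bin d.

```python
import numpy as np

def log_soe_prior_ratio(n, o, d):
    """log of the prior factor in Equation (20) for a move of one event from
    origin bin o to destination bin d, given bin contents n(k | theta).
    The likelihood ratio of Eq. (20) is not included here (it is 1 in the
    simple Compton imager model of Section 5)."""
    if o == d:
        return 0.0
    xlogx = lambda x: 0.0 if x == 0 else x * np.log(x)   # convention 0 * log(0) = 0
    no, nd = n[o], n[d]
    return (xlogx(no - 1) + xlogx(nd + 1)) - (xlogx(no) + xlogx(nd))

# Hypothetical bin contents and a proposed move
n = np.array([5, 0, 12, 3])
log_r = log_soe_prior_ratio(n, o=0, d=1)
accept = (log_r >= 0.0) or (np.log(np.random.default_rng(2).uniform()) <= log_r)
print(np.exp(log_r), accept)
```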

5 Application of SOE for a Compton imager

A simple application of the SOE method can be performed for a Compton imager [5], where one wants to determine the distribution of φ, defined as the angular origin of the detected radiation. For a simplified physical model where the PDF for the angular origin of a given detected event consists of a ring resulting from back projection of the Compton cone [6], the likelihood of the angular origin for the event is constant along that ring and otherwise null. This model assumes that the energy of the analysed photons is perfectly known. If one chooses a function g(θ → θ′) which only proposes origins along these rings, the expression for h(θ → θ′) can be easily evaluated, since the ratio of likelihood values in Equation (20) is equal to 1 and the proposal function is symmetrical. The resulting algorithm to generate the Markov chain for this simple Compton imager SOE is thus given by:

1. Pick an initial origin for θ, using random positions along the back projected rings of the measured events.
2. Compute the values of n(k | θ) by filling a histogram, and record the bin index for each event.
3. Draw a random event index e uniformly out of the n_events events in the measured sample.
4. Pick a random position along the back projected ring for the selected event and determine the proposed destination bin d.
5. Compute f(θ′)/f(θ_s) as given by Equation (20) (where the ratio of likelihood values is equal to 1), using the destination bin d and the recorded current bin index for the event e as the o bin. If the resulting value is greater than or equal to 1, or otherwise greater than or equal to a random number drawn uniformly in the interval ]0,1], the proposed state is accepted: n(o | θ), n(d | θ) and the current bin index for event e are updated, θ_{s+1} = θ′, s is incremented by 1 and the algorithm continues to Step 3.
6. Otherwise, θ_{s+1} = θ_s, s is incremented by 1 and the algorithm continues to Step 3.

The resulting chain must be sampled at least every n_events steps.
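A toy end-to-end version of the steps above is sketched below (not from the report; the one-dimensional angular space, the representation of each back projected ring as a precomputed list of candidate bins, and all numerical settings are stand-ins chosen for illustration only).

```python
import numpy as np

rng = np.random.default_rng(3)

n_bins = 36                      # bins of the prior PDF (toy 1D angular space)
n_events = 500
# Hypothetical stand-in for the back projected Compton rings: for each event,
# the list of prior-PDF bins that its ring crosses (here 8 random bins per event).
rings = [rng.choice(n_bins, size=8, replace=False) for _ in range(n_events)]

# Steps 1-2: random initial origins along each ring, and the bin contents n(k | theta).
current = np.array([rng.choice(r) for r in rings])
n = np.bincount(current, minlength=n_bins).astype(float)

def prior_log_ratio(n, o, d):
    """log of the prior factor of Equation (20); the likelihood ratio is 1 here."""
    if o == d:
        return 0.0
    xlogx = lambda x: 0.0 if x == 0 else x * np.log(x)
    return xlogx(n[o] - 1) + xlogx(n[d] + 1) - xlogx(n[o]) - xlogx(n[d])

n_steps, burn_in, posterior = 200_000, 50_000, np.zeros(n_bins)
for s in range(n_steps):
    e = rng.integers(n_events)               # Step 3: pick an event
    d = rng.choice(rings[e])                 # Step 4: propose a new origin on its ring
    o = current[e]
    log_r = prior_log_ratio(n, o, d)
    if log_r >= 0.0 or rng.uniform() <= np.exp(log_r):   # Step 5: accept
        n[o] -= 1.0
        n[d] += 1.0
        current[e] = d
    # Step 6: on rejection, nothing changes
    if s >= burn_in and s % n_events == 0:   # thin the chain: one state every n_events steps
        posterior += np.bincount(current, minlength=n_bins)

print(posterior / posterior.sum())
```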

6 Comparison of the SOE algorithm

In this section, results from the application of the SOE algorithm to Compton imager simulation data are presented and compared to alternative algorithms. The algorithm was tested using 1453 detected gamma rays from a 137Cs point source located 10° off-axis. The prior SOE histogram was binned from −90° to 90° in each direction. A total of steps were generated, including burn-in steps, and one step every 100 steps was then used to generate the posterior distribution. Figure 1 shows the resulting distribution, along with corresponding results which were obtained in [6] using simple back projection and smoothed MLEM. Corresponding angular resolution results for this simulation are shown in Table 1, where a 2D Gaussian distribution was fitted using a range of ±10° around the peak.

Figure 1: Images of a simulated 137Cs point source located 10° off-axis as produced using simple back projection (top), smoothed MLEM (left) and SOE (right).

Table 1: Measured mean width for a 2D Gaussian fitted on the distributions resulting from the three different algorithms.

Algorithm                Resolution [°]
Simple back projection   7.1 ± 0.1
MLEM                     3.2 ± 0.1
SOE                      3.1 ± 0.1

When comparing MLEM and SOE results to back projection results, there is an obvious improvement in both the source image and the angular resolution values, as these two algorithms eliminate the circular patterns produced by the back projection of the Compton cones. The mean Gaussian width is reduced by more than 50%, which is a very significant improvement considering that the three algorithms make no assumptions regarding the source's spatial distribution. When comparing MLEM and SOE results, Figure 1 shows slight clustering of events outside the source location with the MLEM technique, which is not present with SOE. The computed angular resolution is slightly better with SOE, and the fit range for the 2D Gaussian excluded most of the visible clusters in MLEM's results. An infinitesimally thin C-shaped 137Cs source was also imaged, using the same simulated dataset that was processed in [6]. The SOE algorithm was run twice to observe the effect of different prior PDF binning. The results are presented in Figure 2.

Figure 2: Images of a simulated 137Cs C-shaped source as produced using simple back projection, MLEM and SOE with 2° ("Coarse") and 1° ("Fine") prior PDF binning.

Similarly to the results obtained with the point source, the figure shows a drastic improvement of the reconstructed image when comparing SOE with the back projection method, which translates into a higher contrast. When qualitatively comparing SOE results with MLEM, we observe a dependence of the outcome on the binning that is chosen for the prior PDF, as one would expect, since this binning effectively approximates the probability density as uniform within a given bin. Correlations between image bins are thus reduced as the prior PDF binning becomes finer, but this comes at the expense of increased statistical noise, since the prior PDF is approximated using the measured sample itself. When comparing image blur, the MLEM result appears to fall between the two SOE results. The SOE result with finer binning could benefit from a post-smoothing filter in order to reduce the statistical noise effects. Similar structures have been observed with the MLEM method when the number of iterations becomes too large.

7 Limitations of the simple SOE algorithm

The simple SOE method based on a back projection model, which was described in Sections 5 and 6, has the advantage of being analytical and easy to compute. However, it has many of the flaws of the other methods based on this model:

- it assumes that the detector measures energies exactly, such that back projected events lie along an infinitesimal ring rather than a broad uncertainty band;
- the detector angular response is assumed to be uniform;
- energy deposition is assumed to occur at the centroid of the detector pixels;
- the analysed measured sample is assumed to be pure, without any contamination such as multiple scattering within the scatter plane, backscattering off the absorber plane or undetected energy escaping either plane;
- the source is assumed to be at an infinite distance from the detector; and
- the lateral position of the scatter pixel with respect to the origin of the back projection angular space is neglected.

This list of approximations can contribute to various unwanted effects, such as image blur, distortion and artificial image structures. A more realistic model which avoids so many approximations is thus desirable. Except for the last two items, the above approximations will be addressed in the future.

8 Conclusion

In this document, the Metropolis-Hastings algorithm was presented, following a summary of MCMC methods. The SOE method was then explained as a special case of the Metropolis-Hastings algorithm, and a simple implementation for a Compton imager was demonstrated and compared to alternative algorithms. When compared to the back projection method, the SOE method showed a drastic improvement of the reconstructed image. The SOE method produced results which were comparable to those obtained with MLEM. Compared to MLEM, SOE showed less clustering of events outside the simulated point source location.


References

[1] Metropolis, N., Rosenbluth, A. W., Rosenbluth, M. N., Teller, A. H., and Teller, E. (1953), Equation of State Calculations by Fast Computing Machines, The Journal of Chemical Physics, 21(6).

[2] Hastings, W. K. (1970), Monte Carlo Sampling Methods Using Markov Chains and Their Applications, Biometrika, 57(1).

[3] Raftery, A. E. and Lewis, S. M. (1995), The Number of Iterations, Convergence Diagnostics and Generic Metropolis Algorithms, in Practical Markov Chain Monte Carlo (W. R. Gilks, D. J. Spiegelhalter and S. Richardson, eds.), Chapman and Hall.

[4] Sitek, A. (2008), Representation of photon limited data in emission tomography using origin ensembles, Physics in Medicine and Biology, 53(12).

[5] Andreyev, A. (2009), Stochastic image reconstruction method for Compton camera, in Nuclear Science Symposium Conference Record (NSS/MIC), 2009 IEEE, IEEE.

[6] Ueno, R. (2016), Development of the GEANT4 Simulation for the Compton Gamma-Ray Camera, DRDC-RDDC-2016-C138, Defence Research and Development Canada, Ottawa Research Centre.


DOCUMENT CONTROL DATA

1. ORIGINATOR: DRDC Ottawa Research Centre, 3701 Carling Avenue, Ottawa ON K1A 0Z4, Canada
2a. SECURITY MARKING: UNCLASSIFIED
2b. CONTROLLED GOODS: (NON-CONTROLLED GOODS) DMC A; REVIEW: GCEC DECEMBER
3. TITLE: Markov chain Monte Carlo and stochastic origin ensembles methods
4. AUTHORS: Drouin, P.-L.
5. DATE OF PUBLICATION: September 2016
6a. NO. OF PAGES: 22
6b. NO. OF REFS: 6
7. DESCRIPTIVE NOTES: Scientific Report
8. SPONSORING ACTIVITY: DRDC Ottawa Research Centre, 3701 Carling Avenue, Ottawa ON K1A 0Z4, Canada
10a. ORIGINATOR'S DOCUMENT NUMBER: DRDC-RDDC-2016-R124
11. DOCUMENT AVAILABILITY: Unlimited
12. DOCUMENT ANNOUNCEMENT: Unlimited

13. ABSTRACT: This document aims to summarise Markov Chain Monte Carlo (MCMC) methods, in particular the Metropolis-Hastings algorithm and the Stochastic Origin Ensembles (SOE) method, in a concise and notation-consistent manner. These methods are commonly used to perform model parameter estimation for a population, based on a measured sample, through the sampling of the probability distribution for these parameters. A simple application of SOE is then demonstrated using simulation data from a Compton imager detector.

14. KEYWORDS, DESCRIPTORS or IDENTIFIERS: Markov chain Monte Carlo; Metropolis-Hastings; Stochastic Origin Ensembles; Parameter estimation; Maximum likelihood; Maximum Likelihood Expectation Maximisation

Kinetic Energy Non-Lethal Weapons Testing Methodology

Kinetic Energy Non-Lethal Weapons Testing Methodology Kinetic Energy Non-Lethal Weapons Testing Methodology BTTR Impact Force Model Development B. Anctil Biokinetics and Associates Ltd. Prepared By: Biokinetics and Associates Ltd. 247 Don Reid Drive Ottawa,

More information

Near Earth Object Surveillance Satellite (NEOSSAT) Artificial Star

Near Earth Object Surveillance Satellite (NEOSSAT) Artificial Star Near Earth Object Surveillance Satellite (NEOSSAT) Artificial Star Capt. Kevin Bernard Dr. Lauchie Scott DRDC Ottawa Research Centre Defence Research and Development Canada Reference Document DRDC-RDDC-2016-D020

More information

Bayesian Methods for Machine Learning

Bayesian Methods for Machine Learning Bayesian Methods for Machine Learning CS 584: Big Data Analytics Material adapted from Radford Neal s tutorial (http://ftp.cs.utoronto.ca/pub/radford/bayes-tut.pdf), Zoubin Ghahramni (http://hunch.net/~coms-4771/zoubin_ghahramani_bayesian_learning.pdf),

More information

UNCERTAINTY ANALYSIS IN BURIED LANDMINE BLAST CHARACTERIZATION DRDC-RDDC-2016-N029

UNCERTAINTY ANALYSIS IN BURIED LANDMINE BLAST CHARACTERIZATION DRDC-RDDC-2016-N029 UNCERTAINTY ANALYSIS IN BURIED LANDMINE BLAST CHARACTERIZATION DRDC-RDDC-2016-N029 M. Ceh, T. Josey, W. Roberts Defence Research and Development Canada, Suffield Research Centre, PO Box 4000, Stn Main,

More information

A quick introduction to Markov chains and Markov chain Monte Carlo (revised version)

A quick introduction to Markov chains and Markov chain Monte Carlo (revised version) A quick introduction to Markov chains and Markov chain Monte Carlo (revised version) Rasmus Waagepetersen Institute of Mathematical Sciences Aalborg University 1 Introduction These notes are intended to

More information

STA 4273H: Statistical Machine Learning

STA 4273H: Statistical Machine Learning STA 4273H: Statistical Machine Learning Russ Salakhutdinov Department of Computer Science! Department of Statistical Sciences! rsalakhu@cs.toronto.edu! h0p://www.cs.utoronto.ca/~rsalakhu/ Lecture 7 Approximate

More information

16 : Approximate Inference: Markov Chain Monte Carlo

16 : Approximate Inference: Markov Chain Monte Carlo 10-708: Probabilistic Graphical Models 10-708, Spring 2017 16 : Approximate Inference: Markov Chain Monte Carlo Lecturer: Eric P. Xing Scribes: Yuan Yang, Chao-Ming Yen 1 Introduction As the target distribution

More information

On a multivariate implementation of the Gibbs sampler

On a multivariate implementation of the Gibbs sampler Note On a multivariate implementation of the Gibbs sampler LA García-Cortés, D Sorensen* National Institute of Animal Science, Research Center Foulum, PB 39, DK-8830 Tjele, Denmark (Received 2 August 1995;

More information

STA 4273H: Sta-s-cal Machine Learning

STA 4273H: Sta-s-cal Machine Learning STA 4273H: Sta-s-cal Machine Learning Russ Salakhutdinov Department of Computer Science! Department of Statistical Sciences! rsalakhu@cs.toronto.edu! h0p://www.cs.utoronto.ca/~rsalakhu/ Lecture 2 In our

More information

Lecture 6: Markov Chain Monte Carlo

Lecture 6: Markov Chain Monte Carlo Lecture 6: Markov Chain Monte Carlo D. Jason Koskinen koskinen@nbi.ku.dk Photo by Howard Jackman University of Copenhagen Advanced Methods in Applied Statistics Feb - Apr 2016 Niels Bohr Institute 2 Outline

More information

Passive standoff detection of SF 6 plumes at 500 meters Measurement campaign to support the evaluation of Telops imaging spectrometer (FIRST)

Passive standoff detection of SF 6 plumes at 500 meters Measurement campaign to support the evaluation of Telops imaging spectrometer (FIRST) Passive standoff detection of SF 6 plumes at 5 meters Measurement campaign to support the evaluation of Telops imaging spectrometer (FIRST) H. Lavoie E. Puckrin J.-M. Thériault DRDC Valcartier Defence

More information

Bagging During Markov Chain Monte Carlo for Smoother Predictions

Bagging During Markov Chain Monte Carlo for Smoother Predictions Bagging During Markov Chain Monte Carlo for Smoother Predictions Herbert K. H. Lee University of California, Santa Cruz Abstract: Making good predictions from noisy data is a challenging problem. Methods

More information

Introduction to Machine Learning

Introduction to Machine Learning Introduction to Machine Learning Brown University CSCI 1950-F, Spring 2012 Prof. Erik Sudderth Lecture 25: Markov Chain Monte Carlo (MCMC) Course Review and Advanced Topics Many figures courtesy Kevin

More information

CPSC 540: Machine Learning

CPSC 540: Machine Learning CPSC 540: Machine Learning MCMC and Non-Parametric Bayes Mark Schmidt University of British Columbia Winter 2016 Admin I went through project proposals: Some of you got a message on Piazza. No news is

More information

MCMC Sampling for Bayesian Inference using L1-type Priors

MCMC Sampling for Bayesian Inference using L1-type Priors MÜNSTER MCMC Sampling for Bayesian Inference using L1-type Priors (what I do whenever the ill-posedness of EEG/MEG is just not frustrating enough!) AG Imaging Seminar Felix Lucka 26.06.2012 , MÜNSTER Sampling

More information

Markov Chain Monte Carlo methods

Markov Chain Monte Carlo methods Markov Chain Monte Carlo methods By Oleg Makhnin 1 Introduction a b c M = d e f g h i 0 f(x)dx 1.1 Motivation 1.1.1 Just here Supresses numbering 1.1.2 After this 1.2 Literature 2 Method 2.1 New math As

More information

Introduction to Bayesian methods in inverse problems

Introduction to Bayesian methods in inverse problems Introduction to Bayesian methods in inverse problems Ville Kolehmainen 1 1 Department of Applied Physics, University of Eastern Finland, Kuopio, Finland March 4 2013 Manchester, UK. Contents Introduction

More information

A Statistical Input Pruning Method for Artificial Neural Networks Used in Environmental Modelling

A Statistical Input Pruning Method for Artificial Neural Networks Used in Environmental Modelling A Statistical Input Pruning Method for Artificial Neural Networks Used in Environmental Modelling G. B. Kingston, H. R. Maier and M. F. Lambert Centre for Applied Modelling in Water Engineering, School

More information

TAKEHOME FINAL EXAM e iω e 2iω e iω e 2iω

TAKEHOME FINAL EXAM e iω e 2iω e iω e 2iω ECO 513 Spring 2015 TAKEHOME FINAL EXAM (1) Suppose the univariate stochastic process y is ARMA(2,2) of the following form: y t = 1.6974y t 1.9604y t 2 + ε t 1.6628ε t 1 +.9216ε t 2, (1) where ε is i.i.d.

More information

Lecture 5. G. Cowan Lectures on Statistical Data Analysis Lecture 5 page 1

Lecture 5. G. Cowan Lectures on Statistical Data Analysis Lecture 5 page 1 Lecture 5 1 Probability (90 min.) Definition, Bayes theorem, probability densities and their properties, catalogue of pdfs, Monte Carlo 2 Statistical tests (90 min.) general concepts, test statistics,

More information

The Bias-Variance dilemma of the Monte Carlo. method. Technion - Israel Institute of Technology, Technion City, Haifa 32000, Israel

The Bias-Variance dilemma of the Monte Carlo. method. Technion - Israel Institute of Technology, Technion City, Haifa 32000, Israel The Bias-Variance dilemma of the Monte Carlo method Zlochin Mark 1 and Yoram Baram 1 Technion - Israel Institute of Technology, Technion City, Haifa 32000, Israel fzmark,baramg@cs.technion.ac.il Abstract.

More information

The Particle Filter. PD Dr. Rudolph Triebel Computer Vision Group. Machine Learning for Computer Vision

The Particle Filter. PD Dr. Rudolph Triebel Computer Vision Group. Machine Learning for Computer Vision The Particle Filter Non-parametric implementation of Bayes filter Represents the belief (posterior) random state samples. by a set of This representation is approximate. Can represent distributions that

More information

Markov Chain Monte Carlo methods

Markov Chain Monte Carlo methods Markov Chain Monte Carlo methods Tomas McKelvey and Lennart Svensson Signal Processing Group Department of Signals and Systems Chalmers University of Technology, Sweden November 26, 2012 Today s learning

More information

Sample Spectroscopy System Hardware

Sample Spectroscopy System Hardware Semiconductor Detectors vs. Scintillator+PMT Detectors Semiconductors are emerging technology - Scint.PMT systems relatively unchanged in 50 years. NaI(Tl) excellent for single-photon, new scintillation

More information

Computational statistics

Computational statistics Computational statistics Markov Chain Monte Carlo methods Thierry Denœux March 2017 Thierry Denœux Computational statistics March 2017 1 / 71 Contents of this chapter When a target density f can be evaluated

More information

MCMC algorithms for fitting Bayesian models

MCMC algorithms for fitting Bayesian models MCMC algorithms for fitting Bayesian models p. 1/1 MCMC algorithms for fitting Bayesian models Sudipto Banerjee sudiptob@biostat.umn.edu University of Minnesota MCMC algorithms for fitting Bayesian models

More information

Maximum-Likelihood Deconvolution in the Spatial and Spatial-Energy Domain for Events With Any Number of Interactions

Maximum-Likelihood Deconvolution in the Spatial and Spatial-Energy Domain for Events With Any Number of Interactions IEEE TRANSACTIONS ON NUCLEAR SCIENCE, VOL. 59, NO. 2, APRIL 2012 469 Maximum-Likelihood Deconvolution in the Spatial and Spatial-Energy Domain for Events With Any Number of Interactions Weiyi Wang, Member,

More information

Bayesian Nonparametric Regression for Diabetes Deaths

Bayesian Nonparametric Regression for Diabetes Deaths Bayesian Nonparametric Regression for Diabetes Deaths Brian M. Hartman PhD Student, 2010 Texas A&M University College Station, TX, USA David B. Dahl Assistant Professor Texas A&M University College Station,

More information

A MONTE CARLO SIMULATION OF COMPTON SUPPRESSION FOR NEUTRON ACTIVATION ANALYSIS. Joshua Frye Adviser Chris Grant 8/24/2012 ABSTRACT

A MONTE CARLO SIMULATION OF COMPTON SUPPRESSION FOR NEUTRON ACTIVATION ANALYSIS. Joshua Frye Adviser Chris Grant 8/24/2012 ABSTRACT A MONTE CARLO SIMULATION OF COMPTON SUPPRESSION FOR NEUTRON ACTIVATION ANALYSIS Joshua Frye Adviser Chris Grant 8/24/2012 ABSTRACT A Monte Carlo simulation has been developed using the Geant4 software

More information

A note on Reversible Jump Markov Chain Monte Carlo

A note on Reversible Jump Markov Chain Monte Carlo A note on Reversible Jump Markov Chain Monte Carlo Hedibert Freitas Lopes Graduate School of Business The University of Chicago 5807 South Woodlawn Avenue Chicago, Illinois 60637 February, 1st 2006 1 Introduction

More information

The Metropolis-Hastings Algorithm. June 8, 2012

The Metropolis-Hastings Algorithm. June 8, 2012 The Metropolis-Hastings Algorithm June 8, 22 The Plan. Understand what a simulated distribution is 2. Understand why the Metropolis-Hastings algorithm works 3. Learn how to apply the Metropolis-Hastings

More information

Session 3A: Markov chain Monte Carlo (MCMC)

Session 3A: Markov chain Monte Carlo (MCMC) Session 3A: Markov chain Monte Carlo (MCMC) John Geweke Bayesian Econometrics and its Applications August 15, 2012 ohn Geweke Bayesian Econometrics and its Session Applications 3A: Markov () chain Monte

More information

Computer Vision Group Prof. Daniel Cremers. 11. Sampling Methods

Computer Vision Group Prof. Daniel Cremers. 11. Sampling Methods Prof. Daniel Cremers 11. Sampling Methods Sampling Methods Sampling Methods are widely used in Computer Science as an approximation of a deterministic algorithm to represent uncertainty without a parametric

More information

Markov Chain Monte Carlo

Markov Chain Monte Carlo Chapter 5 Markov Chain Monte Carlo MCMC is a kind of improvement of the Monte Carlo method By sampling from a Markov chain whose stationary distribution is the desired sampling distributuion, it is possible

More information

Introduction to Computational Biology Lecture # 14: MCMC - Markov Chain Monte Carlo

Introduction to Computational Biology Lecture # 14: MCMC - Markov Chain Monte Carlo Introduction to Computational Biology Lecture # 14: MCMC - Markov Chain Monte Carlo Assaf Weiner Tuesday, March 13, 2007 1 Introduction Today we will return to the motif finding problem, in lecture 10

More information

Likelihood-free MCMC

Likelihood-free MCMC Bayesian inference for stable distributions with applications in finance Department of Mathematics University of Leicester September 2, 2011 MSc project final presentation Outline 1 2 3 4 Classical Monte

More information

Bayesian Inverse Problems

Bayesian Inverse Problems Bayesian Inverse Problems Jonas Latz Input/Output: www.latz.io Technical University of Munich Department of Mathematics, Chair for Numerical Analysis Email: jonas.latz@tum.de Garching, July 10 2018 Guest

More information

Sampling Methods (11/30/04)

Sampling Methods (11/30/04) CS281A/Stat241A: Statistical Learning Theory Sampling Methods (11/30/04) Lecturer: Michael I. Jordan Scribe: Jaspal S. Sandhu 1 Gibbs Sampling Figure 1: Undirected and directed graphs, respectively, with

More information

Langevin and hessian with fisher approximation stochastic sampling for parameter estimation of structured covariance

Langevin and hessian with fisher approximation stochastic sampling for parameter estimation of structured covariance Langevin and hessian with fisher approximation stochastic sampling for parameter estimation of structured covariance Cornelia Vacar, Jean-François Giovannelli, Yannick Berthoumieu To cite this version:

More information

S 3 j ESD-TR W OS VL, t-i 1 TRADE-OFFS BETWEEN PARTS OF THE OBJECTIVE FUNCTION OF A LINEAR PROGRAM

S 3 j ESD-TR W OS VL, t-i 1 TRADE-OFFS BETWEEN PARTS OF THE OBJECTIVE FUNCTION OF A LINEAR PROGRAM I >> I 00 OH I vo Q CO O I I I S 3 j ESD-TR-65-363 W-07454 OS VL, t-i 1 P H I CO CO I LU U4 I TRADE-OFFS BETWEEN PARTS OF THE OBECTIVE FUNCTION OF A LINEAR PROGRAM ESD RECORD COPY ESD ACCESSION LIST ESTI

More information

Further Development of the Geant4 Simulation and the Analysis Package for the Compton Gamma-Ray Camera

Further Development of the Geant4 Simulation and the Analysis Package for the Compton Gamma-Ray Camera CAN UNCLASSIFIED Further Development of the Geant4 Simulation and the Analysis Package for the Compton Gamma-Ray Camera Christian Van Ouellet Nicholi Shiell Ryuichi Ueno Calian Group Ltd Prepared by: Calian

More information

CSC 2541: Bayesian Methods for Machine Learning

CSC 2541: Bayesian Methods for Machine Learning CSC 2541: Bayesian Methods for Machine Learning Radford M. Neal, University of Toronto, 2011 Lecture 3 More Markov Chain Monte Carlo Methods The Metropolis algorithm isn t the only way to do MCMC. We ll

More information

CSC 446 Notes: Lecture 13

CSC 446 Notes: Lecture 13 CSC 446 Notes: Lecture 3 The Problem We have already studied how to calculate the probability of a variable or variables using the message passing method. However, there are some times when the structure

More information

Computer Vision Group Prof. Daniel Cremers. 14. Sampling Methods

Computer Vision Group Prof. Daniel Cremers. 14. Sampling Methods Prof. Daniel Cremers 14. Sampling Methods Sampling Methods Sampling Methods are widely used in Computer Science as an approximation of a deterministic algorithm to represent uncertainty without a parametric

More information

Who was Bayes? Bayesian Phylogenetics. What is Bayes Theorem?

Who was Bayes? Bayesian Phylogenetics. What is Bayes Theorem? Who was Bayes? Bayesian Phylogenetics Bret Larget Departments of Botany and of Statistics University of Wisconsin Madison October 6, 2011 The Reverand Thomas Bayes was born in London in 1702. He was the

More information

ST 740: Markov Chain Monte Carlo

ST 740: Markov Chain Monte Carlo ST 740: Markov Chain Monte Carlo Alyson Wilson Department of Statistics North Carolina State University October 14, 2012 A. Wilson (NCSU Stsatistics) MCMC October 14, 2012 1 / 20 Convergence Diagnostics:

More information

Bayesian Phylogenetics

Bayesian Phylogenetics Bayesian Phylogenetics Bret Larget Departments of Botany and of Statistics University of Wisconsin Madison October 6, 2011 Bayesian Phylogenetics 1 / 27 Who was Bayes? The Reverand Thomas Bayes was born

More information

Markov Chain Monte Carlo in Practice

Markov Chain Monte Carlo in Practice Markov Chain Monte Carlo in Practice Edited by W.R. Gilks Medical Research Council Biostatistics Unit Cambridge UK S. Richardson French National Institute for Health and Medical Research Vilejuif France

More information

Hastings-within-Gibbs Algorithm: Introduction and Application on Hierarchical Model

Hastings-within-Gibbs Algorithm: Introduction and Application on Hierarchical Model UNIVERSITY OF TEXAS AT SAN ANTONIO Hastings-within-Gibbs Algorithm: Introduction and Application on Hierarchical Model Liang Jing April 2010 1 1 ABSTRACT In this paper, common MCMC algorithms are introduced

More information

Downloaded from:

Downloaded from: Camacho, A; Kucharski, AJ; Funk, S; Breman, J; Piot, P; Edmunds, WJ (2014) Potential for large outbreaks of Ebola virus disease. Epidemics, 9. pp. 70-8. ISSN 1755-4365 DOI: https://doi.org/10.1016/j.epidem.2014.09.003

More information

Statistical Methods in Particle Physics Lecture 1: Bayesian methods

Statistical Methods in Particle Physics Lecture 1: Bayesian methods Statistical Methods in Particle Physics Lecture 1: Bayesian methods SUSSP65 St Andrews 16 29 August 2009 Glen Cowan Physics Department Royal Holloway, University of London g.cowan@rhul.ac.uk www.pp.rhul.ac.uk/~cowan

More information

Comparison of magnetic parameters of CFAV QUEST from FLUX3D modeling and airborne measurements

Comparison of magnetic parameters of CFAV QUEST from FLUX3D modeling and airborne measurements Copy No. Defence Research and Development Canada Recherche et développement pour la défense Canada DEFENCE & DÉFENSE Comparison of magnetic parameters of CFAV QUEST from FLUX3D modeling and airborne measurements

More information

Gate simulation of Compton Ar-Xe gamma-camera for radionuclide imaging in nuclear medicine

Gate simulation of Compton Ar-Xe gamma-camera for radionuclide imaging in nuclear medicine Journal of Physics: Conference Series PAPER OPEN ACCESS Gate simulation of Compton Ar-Xe gamma-camera for radionuclide imaging in nuclear medicine To cite this article: L Yu Dubov et al 2017 J. Phys.:

More information

Stat 516, Homework 1

Stat 516, Homework 1 Stat 516, Homework 1 Due date: October 7 1. Consider an urn with n distinct balls numbered 1,..., n. We sample balls from the urn with replacement. Let N be the number of draws until we encounter a ball

More information

LECTURE 15 Markov chain Monte Carlo

LECTURE 15 Markov chain Monte Carlo LECTURE 15 Markov chain Monte Carlo There are many settings when posterior computation is a challenge in that one does not have a closed form expression for the posterior distribution. Markov chain Monte

More information

AIR FORCE RESEARCH LABORATORY Directed Energy Directorate 3550 Aberdeen Ave SE AIR FORCE MATERIEL COMMAND KIRTLAND AIR FORCE BASE, NM

AIR FORCE RESEARCH LABORATORY Directed Energy Directorate 3550 Aberdeen Ave SE AIR FORCE MATERIEL COMMAND KIRTLAND AIR FORCE BASE, NM AFRL-DE-PS-JA-2007-1004 AFRL-DE-PS-JA-2007-1004 Noise Reduction in support-constrained multi-frame blind-deconvolution restorations as a function of the number of data frames and the support constraint

More information

Assessing system reliability through binary decision diagrams using bayesian techniques.

Assessing system reliability through binary decision diagrams using bayesian techniques. Loughborough University Institutional Repository Assessing system reliability through binary decision diagrams using bayesian techniques. This item was submitted to Loughborough University's Institutional

More information

Markov chain Monte Carlo Lecture 9

Markov chain Monte Carlo Lecture 9 Markov chain Monte Carlo Lecture 9 David Sontag New York University Slides adapted from Eric Xing and Qirong Ho (CMU) Limitations of Monte Carlo Direct (unconditional) sampling Hard to get rare events

More information

Selection on selected records

Selection on selected records Selection on selected records B. GOFFINET I.N.R.A., Laboratoire de Biometrie, Centre de Recherches de Toulouse, chemin de Borde-Rouge, F 31320 Castanet- Tolosan Summary. The problem of selecting individuals

More information

Bayesian inference & Markov chain Monte Carlo. Note 1: Many slides for this lecture were kindly provided by Paul Lewis and Mark Holder

Bayesian inference & Markov chain Monte Carlo. Note 1: Many slides for this lecture were kindly provided by Paul Lewis and Mark Holder Bayesian inference & Markov chain Monte Carlo Note 1: Many slides for this lecture were kindly provided by Paul Lewis and Mark Holder Note 2: Paul Lewis has written nice software for demonstrating Markov

More information

Brief introduction to Markov Chain Monte Carlo

Brief introduction to Markov Chain Monte Carlo Brief introduction to Department of Probability and Mathematical Statistics seminar Stochastic modeling in economics and finance November 7, 2011 Brief introduction to Content 1 and motivation Classical

More information

Winter 2019 Math 106 Topics in Applied Mathematics. Lecture 9: Markov Chain Monte Carlo

Winter 2019 Math 106 Topics in Applied Mathematics. Lecture 9: Markov Chain Monte Carlo Winter 2019 Math 106 Topics in Applied Mathematics Data-driven Uncertainty Quantification Yoonsang Lee (yoonsang.lee@dartmouth.edu) Lecture 9: Markov Chain Monte Carlo 9.1 Markov Chain A Markov Chain Monte

More information

Reducing the Run-time of MCMC Programs by Multithreading on SMP Architectures

Reducing the Run-time of MCMC Programs by Multithreading on SMP Architectures Reducing the Run-time of MCMC Programs by Multithreading on SMP Architectures Jonathan M. R. Byrd Stephen A. Jarvis Abhir H. Bhalerao Department of Computer Science University of Warwick MTAAP IPDPS 2008

More information

Probability and Information Theory. Sargur N. Srihari

Probability and Information Theory. Sargur N. Srihari Probability and Information Theory Sargur N. srihari@cedar.buffalo.edu 1 Topics in Probability and Information Theory Overview 1. Why Probability? 2. Random Variables 3. Probability Distributions 4. Marginal

More information

Detection ASTR ASTR509 Jasper Wall Fall term. William Sealey Gosset

Detection ASTR ASTR509 Jasper Wall Fall term. William Sealey Gosset ASTR509-14 Detection William Sealey Gosset 1876-1937 Best known for his Student s t-test, devised for handling small samples for quality control in brewing. To many in the statistical world "Student" was

More information

Math 350: An exploration of HMMs through doodles.

Math 350: An exploration of HMMs through doodles. Math 350: An exploration of HMMs through doodles. Joshua Little (407673) 19 December 2012 1 Background 1.1 Hidden Markov models. Markov chains (MCs) work well for modelling discrete-time processes, or

More information

Markov Chain Monte Carlo The Metropolis-Hastings Algorithm

Markov Chain Monte Carlo The Metropolis-Hastings Algorithm Markov Chain Monte Carlo The Metropolis-Hastings Algorithm Anthony Trubiano April 11th, 2018 1 Introduction Markov Chain Monte Carlo (MCMC) methods are a class of algorithms for sampling from a probability

More information

HMM part 1. Dr Philip Jackson

HMM part 1. Dr Philip Jackson Centre for Vision Speech & Signal Processing University of Surrey, Guildford GU2 7XH. HMM part 1 Dr Philip Jackson Probability fundamentals Markov models State topology diagrams Hidden Markov models -

More information

MCMC notes by Mark Holder

MCMC notes by Mark Holder MCMC notes by Mark Holder Bayesian inference Ultimately, we want to make probability statements about true values of parameters, given our data. For example P(α 0 < α 1 X). According to Bayes theorem:

More information

Markov Chains and MCMC

Markov Chains and MCMC Markov Chains and MCMC CompSci 590.02 Instructor: AshwinMachanavajjhala Lecture 4 : 590.02 Spring 13 1 Recap: Monte Carlo Method If U is a universe of items, and G is a subset satisfying some property,

More information

Parameter Estimation. William H. Jefferys University of Texas at Austin Parameter Estimation 7/26/05 1

Parameter Estimation. William H. Jefferys University of Texas at Austin Parameter Estimation 7/26/05 1 Parameter Estimation William H. Jefferys University of Texas at Austin bill@bayesrules.net Parameter Estimation 7/26/05 1 Elements of Inference Inference problems contain two indispensable elements: Data

More information

Review. DS GA 1002 Statistical and Mathematical Models. Carlos Fernandez-Granda

Review. DS GA 1002 Statistical and Mathematical Models.   Carlos Fernandez-Granda Review DS GA 1002 Statistical and Mathematical Models http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall16 Carlos Fernandez-Granda Probability and statistics Probability: Framework for dealing with

More information

MCMC Methods: Gibbs and Metropolis

MCMC Methods: Gibbs and Metropolis MCMC Methods: Gibbs and Metropolis Patrick Breheny February 28 Patrick Breheny BST 701: Bayesian Modeling in Biostatistics 1/30 Introduction As we have seen, the ability to sample from the posterior distribution

More information

Advanced Statistical Methods, Lecture 6: Convergence distribution of Metropolis-Hastings MCMC.
Development of Stochastic Artificial Neural Networks for Hydrological Prediction. G. B. Kingston, M. F. Lambert and H. R. Maier, Centre for Applied Modelling in Water Engineering.
STAT 425: Introduction to Bayesian Analysis. Marina Vannucci, Rice University, Fall 2017.
17: Markov Chain Monte Carlo. 10-708: Probabilistic Graphical Models, Spring 2015. Lecturer: Eric P. Xing; scribes: Heran Lin, Bin Deng, Yun Huang.
Markov Chain Monte Carlo for Machine Learning. Advanced Topics in Machine Learning, California Institute of Technology, April 20th, 2017.
Quantifying Uncertainty. Sai Ravela, M.I.T., Spring 2013. Markov chain Monte Carlo sampling, rejection sampling, importance sampling.
SAMSI Astrostatistics Tutorial: More Markov chain Monte Carlo & Demo of Mathematica software. Phil Gregory, University of British Columbia.
Simulation Lectures, Part III: Markov chain Monte Carlo. Julien Berestycki, Part A Simulation and Statistical Programming, Hilary Term 2018.
Blind Separation of Temporally Correlated Sources Using a Quasi Maximum Likelihood Approach. Shahram Hosseini, Christian Jutten, Laboratoire des Images et des Signaux (LIS), Grenoble, France.
Systematic uncertainties in statistical data analysis for particle physics. Glen Cowan, Physics Department, Royal Holloway, University of London. DESY Seminar, Hamburg, 31 March 2009.
Probabilistic Graphical Models (10-708), Homework 3.
Compton Camera. Ting-Tung Chang, Diagnostic Imaging II student project. The Compton camera operates by exploiting the kinematics of Compton scattering.
27: Distributed Monte Carlo Markov Chain. 10-708: Probabilistic Graphical Models, Spring 2014. Lecturer: Eric P. Xing; scribes: Pengtao Xie, Khoa Luu.
Bayesian Inference and MCMC. Aryan Arbabi, partly based on MCMC slides from CSC412, Fall 2018.
Bayesian Analysis: Forewords. Notation: "system" means the real thing; a model is an assumed mathematical form for the system.
Statistical Estimation of the Parameters of a PDE. Colin Fox, Geoff Nicholls (University of Auckland). PIMS-MITACS 2001.
A Search and Jump Algorithm for Markov Chain Monte Carlo Sampling. Christopher Jennison, Department of Mathematical Sciences, University of Bath (http://people.bath.ac.uk/mascj), and Adriana Ibrahim. Seminar at University of Kuwait.
Statistical Methods for Particle Physics, Lecture 1: parameter estimation, statistical tests. Glen Cowan. TAE 2018, Benasque, Spain, 3-15 Sept 2018. http://benasque.org/2018tae/cgi-bin/talks/allprint.pl
Bayesian Methods in Multilevel Regression. Joop Hox, MuLOG, 15 September 2000.
Monte Carlo in Bayesian Statistics. Matthew Thomas, SAMBa, University of Bath, December 4, 2014.
Detection of Artificial Satellites in Images Acquired in Track Rate Mode. Martin P. Lévesque, Defence R&D Canada - Valcartier.
An introduction to Markov Chain Monte Carlo techniques. G. J. A. Harker, University of Colorado, ASTR5550, 19 March 2012.
Optical Gain Measurements for a Portable Plastic-Scintillator-Based Muon Tomography System. Prepared by Kenneth Moats, Zernam Enterprises Inc., Ottawa, ON.
Using standard errors when comparing estimated values (MLPR assignment: general comments).
General Construction of Irreversible Kernel in Markov Chain Monte Carlo. Suwa and Todo, Department of Applied Physics, The University of Tokyo, and Department of Physics, Boston University.
Supplement to "A Hierarchical Approach for Fitting Curves to Response Time Measurements". Jeffrey N. Rouder, Francis Tuerlinckx, Paul L. Speckman, Jun Lu and Pablo Gomez, May 2008.
Markov chain Monte Carlo methods in atmospheric remote sensing. Johanna Tamminen (johanna.tamminen@fmi.fi). ESA Summer School on Earth System Monitoring and Modeling, Frascati, July 3 to August 11, 2012.