Unfolding techniques for neutron spectrometry


Uncertainty Assessment in Computational Dosimetry: A Comparison of Approaches
Unfolding techniques for neutron spectrometry
Physikalisch-Technische Bundesanstalt, Braunschweig, Germany

Contents of the talk:
- Bonner sphere spectrometry
- How strong is the data?
- How should we formulate the problem?
- Maximum entropy unfolding
- Bayesian parameter estimation
- Some concluding remarks

Bonner sphere spectrometer
[Figure: response functions R_d(E_n) / cm^2 versus neutron energy E_n / MeV for the bare detector, the bare-Cd detector and the 3" to 18" spheres, from thermal energies up to ~10^5 MeV.]

Extended range Bonner sphere spectrometer
[Figure: response functions of the standard spheres together with the extended-range spheres 3P5_7, 4C5_7, 4P5_7 and 4P6_8, overlaid on a spectrum Φ_E(E)·E. Measurement performed by B. Wiegel at GSI, 2003.]

Bonner sphere measurements
[Figure, left: measured spectrum Φ(E)/ΔlnE (n·cm^-2·p^-1) versus neutron energy (MeV), PTB NEMUS, with an eye guide. Right: sphere readings N_NEMUS / N_Si-monitor versus sphere diameter d / (2.54 cm).]

Different mathematical spaces! The unfolding problem connects two different spaces: the space of all possible spectra, { Φ(E) }, and the space of measurements, { N_k, σ_k }.
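As a minimal sketch of why these two spaces are so different, here is the forward (folding) model N_k = Σ_i R_ki Φ_i after discretization; the dimensions and the random response matrix are illustrative stand-ins, not real response data:

```python
import numpy as np

# Minimal sketch of the forward (folding) model N_k = sum_i R_ki * Phi_i:
# a spectrum with many energy bins maps to only a few sphere readings.
# The dimensions and the random response matrix are illustrative stand-ins.
rng = np.random.default_rng(0)
n_bins, n_spheres = 60, 10
R = rng.uniform(0.0, 5.0, size=(n_spheres, n_bins))  # response matrix / cm^2
phi = rng.uniform(0.0, 1.0, size=n_bins)             # fluence per energy bin

N = R @ phi                 # predicted readings: 10 numbers from 60 unknowns
print(N.shape)              # (10,)
```

Inverting this map is underdetermined: many different spectra Φ reproduce the same ten readings, which is why prior information is needed.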

Number of good data points (I)
Look at a plot of measured data versus sphere diameter. The number of good data points is roughly the number of points needed to define the smooth fit.

Number of good data points (II)
Do a singular value decomposition (SVD) of the response matrix and look at the number of large singular values (say, within 1% of the largest one).
[Figure: relative singular values, 10^0 down to 10^-8, for Bonner spheres (all energies), Bonner spheres with E_n > 0.1 MeV, and bubble detectors.]
M. Matzke, Radiat. Prot. Dosim. 107, 155-174, 2003
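Criterion (II) can be sketched in a few lines. The response matrix below is a random stand-in for a real one; NumPy's SVD returns the singular values in descending order, so the largest is the first entry:

```python
import numpy as np

# Sketch of criterion (II): count the singular values of the response matrix
# that lie within 1% of the largest one.  R is a made-up 10x60 matrix standing
# in for a real Bonner sphere response matrix.
rng = np.random.default_rng(1)
R = rng.uniform(0.0, 5.0, size=(10, 60))

s = np.linalg.svd(R, compute_uv=False)    # singular values, descending order
n_good = int(np.sum(s >= 0.01 * s[0]))    # "large" relative to the largest
print(n_good)                             # at most 10, the number of spheres
```

For a real Bonner sphere matrix the singular values fall off quickly, so n_good is typically well below the number of spheres.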

The golden rule
"Lost information can never be retrieved; it can only be replaced by hypotheses."
D. Taupin, Probabilities, Data Reduction and Error Analysis in the Physical Sciences

Where can we get our hypotheses from?
- Specific features of the experiment
- Physical principles that apply to the particles being measured
- General principles from probability theory and information theory

Another view of a measurement
[Figure: PTB NEMUS measured data with eye guide, and a MAXED unfolding decomposed into a thermal peak, an intermediate region and a high-energy peak.]
A measurement can be viewed in analogy to a communication system (Shannon, 1948):
measurement: spectrum -> detector -> measurement data -> data analysis -> spectrum
communication system: message -> transmitter -> (noisy) signal -> receiver -> message

A recipe for unfolding
measurements + detector response + prior information -> SOLUTION

Why is this problem difficult?
There are some mathematical difficulties: non-uniqueness, stability of solutions, incorporation of prior information, computational problems, etc. But the problems are mostly conceptual in nature, not mathematical: unfolding is a problem of inference, and it has to be formulated correctly! Many different approaches are possible; the choice of unfolding method must be determined by the particular question that one wants to answer.

Many (too many!) unfolding methods
- Direct inversion: matrix inversion, Fourier transform methods, derivative methods
- Regularization: linear regularization (Phillips, Twomey, Tikhonov, Miller, ...)
- Iterative methods: SAND-II (GRAVEL), Gold (BON), Doroshenko (SPUNIT)
- Least-squares: linear least-squares
- Maximum entropy
- Parameter estimation: expansion in basis functions, Bayesian methods
- Stochastic methods: Monte Carlo methods, genetic algorithms
- Neural networks

Maximum entropy principle (MaxEnt)
Start with your best estimate of the spectrum, Φ_DEF(E). From all the spectra that fit the data, choose as solution spectrum Φ(E) the one that is closest to Φ_DEF(E). "Closest" means the spectrum Φ(E) that maximizes the relative entropy,

S = -∫ { Φ(E) ln[Φ(E)/Φ_DEF(E)] + Φ_DEF(E) - Φ(E) } dE

Relative entropy
The relative entropy is a fundamental concept in information theory and probability theory. It is also known (to within a plus or minus sign) as cross-entropy, entropy, entropy distance, directed divergence, expected weight of evidence, Kullback-Leibler distance, Kullback number, discrimination information, Rényi's principle, ...

Why use maximum entropy?
According to E. T. Jaynes: because it makes sense: of all the spectra that fit the data, you choose the one that is closest to your initial estimate.
According to Shore and Johnson: because it is the only general method of inference that does not lead to inconsistencies.

Example of MaxEnt unfolding
[Figure: initial estimates and MAXED unfoldings of neutron spectra.]
Goldhagen et al., Nuclear Instruments and Methods A 476, 42-51 (2002)

Some properties of MaxEnt solutions
The solution spectrum can be written in closed form,

Φ_i = Φ_i^DEF exp( -Σ_k λ_k R_ki ),

which permits standard sensitivity analysis and uncertainty propagation. Prior information is taken seriously! For small adjustments (Σ_k λ_k R_ki << 1), Φ(E) is linear in the responses:

Φ_i = Φ_i^DEF exp( -Σ_k λ_k R_ki ) ≈ Φ_i^DEF ( 1 - Σ_k λ_k R_ki )
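The closed form can be demonstrated with a toy unfolding in which the multipliers λ_k are adjusted until the folded spectrum reproduces simulated, noise-free readings. This is only a sketch: real MAXED maximizes a dual function that also accounts for the measurement uncertainties σ_k, and all numbers below are made up.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy MaxEnt unfolding via the closed form Phi_i = Phi_DEF_i * exp(-sum_k lam_k R_ki).
# The multipliers lam_k are adjusted until the folded spectrum reproduces the
# (simulated, noise-free) readings N_k.  All numbers are illustrative.
rng = np.random.default_rng(2)
n_spheres, n_bins = 4, 20
R = rng.uniform(0.1, 2.0, size=(n_spheres, n_bins))   # response matrix
phi_true = rng.uniform(0.5, 1.5, size=n_bins)         # "true" spectrum
N = R @ phi_true                                      # simulated readings
phi_def = np.ones(n_bins)                             # default spectrum Phi_DEF

def residual(lam):
    phi = phi_def * np.exp(-R.T @ lam)                # closed-form spectrum
    return R @ phi - N                                # mismatch to the data

sol = least_squares(residual, x0=np.zeros(n_spheres))
phi_maxent = phi_def * np.exp(-R.T @ sol.x)
print(f"max data mismatch: {np.max(np.abs(R @ phi_maxent - N)):.2e}")
```

Note that the solution is positive by construction and reduces to the default spectrum when all λ_k are zero.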

Bayesian parameter estimation
Parametrize the spectrum:
- Thermal region: magnitude
- Intermediate region: slope, magnitude
- Peak 1: energy of the peak, magnitude, form / width
- Peak 2: energy of the peak, magnitude, form / width
[Figure: Φ_E·E versus E_n / MeV, showing the thermal, intermediate-region, peak 1 and peak 2 components and their total.]

What is a Bayesian approach?
- An application of the rules of probability theory
- The approach is conceptually clear; it is usually easier to set up a problem and solve it
- A flexible approach
- All uncertainties are described by probability distributions
- All calculations are done in the space of parameters, not in the space of data
- Bayesian and orthodox methods calculate different things
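A parametrized spectrum of this kind can be sketched as follows. The functional forms, parameter names and numbers are illustrative assumptions (a Gaussian thermal bump, a sloped intermediate region and one Gaussian peak in lethargy), not the actual parametrization behind the figure:

```python
import numpy as np

# Hedged sketch of a parametrized spectrum of the kind listed above: a thermal
# bump, a sloped intermediate region and one Gaussian peak in lethargy.
# Functional forms, parameter names and numbers are illustrative assumptions.
def phi_lethargy(lnE, a_th, a_int, slope, a_pk, lnE_pk, w_pk):
    thermal = a_th * np.exp(-0.5 * ((lnE + 17.5) / 1.0) ** 2)  # thermal bump
    intermediate = a_int * np.exp(slope * lnE) * (lnE < 0.0)   # ~1/E region
    peak = a_pk * np.exp(-0.5 * ((lnE - lnE_pk) / w_pk) ** 2)  # fast peak
    return thermal + intermediate + peak

lnE = np.log(np.logspace(-9, 2, 200))   # energies from 1e-9 to 1e2 MeV
phi = phi_lethargy(lnE, 0.3, 0.1, 0.02, 0.6, np.log(2.0), 0.5)
print(phi.shape)                        # (200,)
```

Bayesian parameter estimation then treats (a_th, a_int, slope, a_pk, lnE_pk, w_pk) as the unknowns, so the posterior lives in a low-dimensional parameter space rather than in the space of all spectra.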

We need only a few basic rules
1. The sum rule: p(A|I) + p(not A|I) = 1
2. The product rule: p(A,B|I) = p(A|B,I) p(B|I)
3. Marginalization: p(A|I) = Σ_B p(A,B|I)
4. Bayes' theorem: p(A|B,I) = p(B|A,I) p(A|I) / p(B|I)

Software for Bayesian parameter estimation
WinBUGS: from the MRC Biostatistics Unit in Cambridge and the Department of Epidemiology and Public Health of Imperial College at St Mary's, London. Software for the Bayesian analysis of complex statistical models using Markov chain Monte Carlo (MCMC) methods (BUGS = Bayesian inference Using Gibbs Sampling).
Nice features:
- Just download it from the internet
- Excellent documentation with many examples
- The programming language is fairly easy to learn
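As a tiny numerical illustration of these rules, here is a grid-based posterior for the mean of a single Poisson-distributed reading; the observed count (7) and the flat prior are made-up illustrative choices:

```python
import numpy as np
from scipy.stats import poisson

# Grid posterior for a Poisson-distributed sphere reading with unknown mean
# rate: prior * likelihood, normalized (Bayes' theorem on a grid).
rate = np.linspace(0.1, 20.0, 500)     # parameter grid
d = rate[1] - rate[0]                  # grid spacing
prior = np.ones_like(rate)             # flat prior over the grid
likelihood = poisson.pmf(7, mu=rate)   # p(data | rate), observed count = 7
post = prior * likelihood              # unnormalized posterior
post /= post.sum() * d                 # normalize so it integrates to 1

mean = (rate * post).sum() * d         # posterior mean of the rate
print(f"posterior mean rate: {mean:.2f}")
```

Summaries of derived quantities (a dose, say) come the same way: evaluate the quantity on the grid and average it over the posterior.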

Example of Bayesian parameter estimation
[Figure: posterior probability of the ambient dose equivalent.]
M. Reginatto, Radiat. Prot. Dosim. 121, 64-69, 2006
[Figure: posterior probabilities of some of the spectrum parameters, e.g. component magnitudes and log(E_peak/MeV).]

Some advantages of a Bayesian approach
- The rules are clear and consistent (i.e., probability theory), and setting up problems is conceptually simple
- The analysis leads to posterior probabilities for the parameters
- Uncertainties (also for derived quantities) are estimated directly from the posterior probabilities
- Nuisance parameters are easy to handle
- Calculations are done in parameter space, not in data space

Summary: analysis of Bonner sphere data
We are dealing with a highly underdetermined system: typically only ~10 spheres, but a spectrum extending from thermal energies up to a few tens or hundreds of MeV. We need to introduce prior information to restrict the analysis to physically meaningful solutions:
- Choose an initial estimate of the spectrum (from Monte Carlo simulations, using physical principles, etc.) and analyze the data using maximum entropy unfolding, or
- Introduce a parametrized spectrum that is reasonably good and analyze the data using Bayesian parameter estimation

Thank you for your attention!