Intermediate Lab PHYS 3870


Intermediate Lab PHYS 3870
Lecture 3: Distribution Functions
References: Taylor Ch. 5 (and Chs. 10 and 11 for reference); Taylor Chs. 6 and 7. Also refer to the Glossary of Important Terms in Error Analysis and the Probability Cheat Sheet.

Intermediate Lab PHYS 3870
Distribution Functions

Practical Methods to Calculate Mean and Std. Deviation
We need to develop a good way to tally, display, and think about a collection of repeated measurements of the same quantity. Here is where we are headed:
- Develop the notion of a probability distribution function, a distribution to describe the probable outcomes of a measurement
- Define what a distribution function is, and its properties
- Look at the properties of the most common distribution function, the Gaussian distribution for purely random events
- Introduce other probability distribution functions
We will develop the mathematical basis for:
- Mean
- Standard deviation
- Standard deviation of the mean (SDOM)
- Moments and expectation values
- Error propagation formulas
- Addition of errors in quadrature (for independent and random measurements)
- Schwarz inequality (i.e., the uncertainty principle) (next lecture)
- Numerical values for confidence limits (t-test)
- Principle of maximum likelihood
- Central limit theorem

Two Practical Exercises in Probabilities
Flip a penny 50 times and record the results.
Roll a pair of dice 50 times and record the results.
Grab a partner and a set of instructions and complete the exercise.

Two Practical Exercises in Probabilities
Flip a penny 50 times and record the results. What is the asymmetry of the results?

Two Practical Exercises in Probabilities
Flip a penny 50 times and record the results. What is the asymmetry of the results? [Histograms of two example runs, labeled ??% asymmetry and 4% asymmetry.]

Two Practical Exercises in Probabilities
Roll a pair of dice 50 times and record the results. What is the mean value? The standard deviation?

Two Practical Exercises in Probabilities
Roll a pair of dice 50 times and record the results. What is the mean value? The standard deviation? What is the asymmetry (skewness)? What is the probability of rolling a 4?
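As a quick aside (not part of the original slides), both exercises are easy to simulate. The sketch below is a minimal Python version; the random seed, the sample size of 50, and the use of numpy/scipy are my own choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Exercise 1: flip a penny 50 times; asymmetry = |heads - tails| / flips
flips = rng.integers(0, 2, size=50)            # 0 = tails, 1 = heads
heads = flips.sum()
print(f"asymmetry = {abs(2 * heads - 50) / 50:.0%}")

# Exercise 2: roll a pair of dice 50 times; study the sum of the two faces
rolls = rng.integers(1, 7, size=(50, 2)).sum(axis=1)
print(f"mean = {rolls.mean():.2f}")            # expect ~7
print(f"std  = {rolls.std(ddof=1):.2f}")       # expect ~2.4
print(f"skew = {stats.skew(rolls):.2f}")       # expect ~0 (symmetric)
print(f"P(4) = {(rolls == 4).mean():.3f}")     # expect 3/36 ~ 0.083
```

With only 50 trials the estimates fluctuate noticeably; increasing the sample size shows them converging to the expected values.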

Discrete Distribution Functions
A data set to play with: record the number of occurrences F_k of each distinct value x_k among the N measurements.
Written in terms of occurrences F_k, the mean value is \bar{x} = \frac{1}{N} \sum_k F_k x_k.
In terms of fractional expectations (fractional occurrences) f_k = F_k / N:
Normalization condition: \sum_k f_k = 1
Mean value: \bar{x} = \sum_k f_k x_k (this is just a weighted sum)

Limit of Discrete Distribution Functions
Binned data sets: normalize the data so that f_k is the fractional occurrence per unit width and \Delta_k is the bin width; the fraction of measurements in bin k is then f_k \Delta_k.
Mean value: \bar{X} = \sum_k F_k x_k / N = \sum_k (f_k \Delta_k) x_k
Normalization: 1 = \sum_k f_k \Delta_k
Expected value: Prob(4) = F_4 / N = f_4
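To make the binned bookkeeping concrete, here is a short sketch (my own illustration, with numpy as an assumed dependency) that bins simulated data and recovers both the normalization and the mean from the f_k \Delta_k weights.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=10.0, scale=2.0, size=10_000)

F, edges = np.histogram(data, bins=40)       # F_k: counts per bin
Delta = np.diff(edges)                       # bin widths
x = 0.5 * (edges[:-1] + edges[1:])           # bin centers x_k

f = F / (F.sum() * Delta)                    # fractional occurrence per unit width
print(np.sum(f * Delta))                     # normalization: ~1
print(np.sum((f * Delta) * x), data.mean())  # binned mean vs. direct mean
```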

Limits of Distribution Functions
Consider the limiting distribution function as N → ∞ and \Delta_k → 0: as the data set grows larger, the histogram approaches a smooth limiting distribution.
Mathcad Games: [see the Mathcad sheet on the web site]

Continuous Distribution Functions
Meaning of the distribution: f(x)\,dx = fraction of measurements that fall between x and x + dx. Thus \int_a^b f(x)\,dx = fraction of measurements that fall within a < x < b, and by extension:
Normalization of the distribution: \int_{-\infty}^{+\infty} f(x)\,dx = 1
Central (mean) value: \bar{x} = \int_{-\infty}^{+\infty} x\,f(x)\,dx
Width of the distribution: \sigma_x^2 = \int_{-\infty}^{+\infty} (x - \bar{x})^2 f(x)\,dx

Moments of Distribution Functions
The first moment for a normalized probability distribution function f(x) is
\bar{x} = \langle x \rangle = \int_{-\infty}^{+\infty} x\,f(x)\,dx.
For a general (unnormalized) distribution function g(x),
\bar{x} = \langle x \rangle = \frac{\int_{-\infty}^{+\infty} x\,g(x)\,dx}{\int_{-\infty}^{+\infty} g(x)\,dx}.
Generalizing, the nth moment is
\langle x^n \rangle = \frac{\int_{-\infty}^{+\infty} x^n\,g(x)\,dx}{\int_{-\infty}^{+\infty} g(x)\,dx} = \int_{-\infty}^{+\infty} x^n\,f(x)\,dx.
0th moment: 1 (or N for an unnormalized distribution); 1st moment: \bar{x} (= 0 for a centered distribution); 2nd moment: \langle x^2 \rangle; 3rd moment (about the mean): skewness.

Moments of Distribution Functions
Generalizing, the nth moment is
\langle x^n \rangle = \frac{\int x^n\,g(x)\,dx}{\int g(x)\,dx} = \int x^n\,f(x)\,dx
(0th moment: normalization; 1st moment: \bar{x}; 2nd moment: \langle x^2 \rangle; 3rd moment about the mean: skewness).
The nth moment about the mean is
\mu_n = \langle (x - \bar{x})^n \rangle = \frac{\int (x - \bar{x})^n\,g(x)\,dx}{\int g(x)\,dx} = \int (x - \bar{x})^n\,f(x)\,dx.
The variance (the second moment about the mean, whose square root is the standard deviation) is
\sigma_x^2 = \mu_2 = \langle (x - \bar{x})^2 \rangle = \int (x - \bar{x})^2\,f(x)\,dx.
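These moment integrals are easy to check numerically. A minimal sketch (my own example, using scipy's quad on an arbitrary unnormalized Gaussian g(x)):

```python
import numpy as np
from scipy import integrate

g = lambda x: np.exp(-0.5 * ((x - 3.0) / 1.5) ** 2)   # unnormalized g(x)

norm = integrate.quad(g, -np.inf, np.inf)[0]
moment = lambda n: integrate.quad(lambda x: x**n * g(x), -np.inf, np.inf)[0] / norm

mean = moment(1)
var = moment(2) - mean**2     # <x^2> - <x>^2 = second moment about the mean
print(mean, np.sqrt(var))     # expect ~3.0 and ~1.5
```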

Example of Continuous Distribution Functions and Expectation Values
Harmonic Oscillator: Example from Mechanics
Expected values: the expectation value of a function R(x) is
\langle R(x) \rangle = \frac{\int R(x)\,g(x)\,dx}{\int g(x)\,dx} = \int R(x)\,f(x)\,dx.

Example of Continuous Distribution Functions and Expectation Values
Boltzmann Distribution: Example from Kinetic Theory
Expected values: the expectation value of a function R(v) is
\langle R(v) \rangle = \frac{\int R(v)\,g(v)\,dv}{\int g(v)\,dv} = \int R(v)\,f(v)\,dv.
The Boltzmann distribution function for the speeds of particles as a function of temperature T is
P(v; T) = 4\pi \left( \frac{M}{2\pi k_B T} \right)^{3/2} v^2 \exp\!\left( -\frac{\tfrac{1}{2} M v^2}{k_B T} \right).
Then
\langle v \rangle = \int_0^{\infty} v\,P(v)\,dv = \left( \frac{8 k_B T}{\pi M} \right)^{1/2},
\langle v^2 \rangle = \int_0^{\infty} v^2\,P(v)\,dv = \frac{3 k_B T}{M},
which implies \langle KE \rangle = \tfrac{1}{2} M \langle v^2 \rangle = \tfrac{3}{2} k_B T, and
v_{peak} = \left( \frac{2 k_B T}{M} \right)^{1/2} = \left( \tfrac{2}{3} \right)^{1/2} \langle v^2 \rangle^{1/2}.
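These expectation values can be checked by Monte Carlo, since Maxwell-Boltzmann speeds are just the magnitudes of velocity vectors with independent Gaussian components. A minimal sketch (my own check, working in units where k_B T / M = 1):

```python
import numpy as np

rng = np.random.default_rng(2)
kT_over_M = 1.0                     # units chosen so that k_B T / M = 1

# Speed = |v| with three independent Gaussian velocity components
v = np.linalg.norm(rng.normal(0.0, np.sqrt(kT_over_M), size=(1_000_000, 3)), axis=1)

print(v.mean(), np.sqrt(8 / np.pi))   # <v>   = (8 k_B T / pi M)^(1/2)
print((v**2).mean(), 3.0)             # <v^2> = 3 k_B T / M
print(0.5 * (v**2).mean(), 1.5)       # <KE>/M = (3/2) k_B T / M
```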

Example of Continuous Distribution Functions and Expectation Values
Fermi-Dirac Distribution: Example from Kinetic Theory
For a system of identical fermions, the average number of fermions in a single-particle state of energy \epsilon is given by the Fermi-Dirac (F-D) distribution
\bar{n}(\epsilon) = \frac{1}{e^{(\epsilon - \mu)/k_B T} + 1},
where k_B is Boltzmann's constant, T is the absolute temperature, \epsilon is the energy of the single-particle state, and \mu is the total chemical potential.
Since the F-D distribution was derived using the Pauli exclusion principle, which allows at most one fermion to occupy each possible state, a result is that 0 \le \bar{n}(\epsilon) \le 1.
When a quasi-continuum of energies \epsilon has an associated density of states g(\epsilon) (i.e., the number of states per unit energy range per unit volume), the average number of fermions per unit energy range per unit volume is
\bar{N}(\epsilon) = F(\epsilon)\,g(\epsilon),
where F(\epsilon) = \frac{1}{e^{(\epsilon - \mu)/k_B T} + 1} is called the Fermi function.
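For reference, the Fermi function is one line of code. The sketch below (my own, with arbitrary illustrative values of \mu and k_B T) shows the characteristic step: occupancy near 1 well below \mu, exactly 1/2 at \epsilon = \mu, and near 0 well above.

```python
import numpy as np

def fermi(eps, mu=1.0, kT=0.05):
    """Average occupancy of a single-particle state of energy eps (Fermi-Dirac)."""
    return 1.0 / (np.exp((eps - mu) / kT) + 1.0)

eps = np.array([0.8, 0.95, 1.0, 1.05, 1.2])
print(fermi(eps))   # ~[0.98, 0.73, 0.50, 0.27, 0.02]
```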

Example of Continuous Distribution Functions and Expectation Values
Finite Square Well: Example from Quantum Mechanics
Expectation values: the expectation value of a QM operator O(x) is
\langle O(x) \rangle = \frac{\int \Psi^*(x)\,O(x)\,\Psi(x)\,dx}{\int \Psi^*(x)\,\Psi(x)\,dx}.
For a square well of width L, \Psi_n(x) = \sqrt{2/L}\,\sin(n\pi x / L), normalized so that \int \Psi_n^*(x)\,\Psi_n(x)\,dx = 1. Then
\langle x \rangle = \int \Psi_n^*(x)\,x\,\Psi_n(x)\,dx = L/2,
\langle p \rangle = \int \Psi_n^*(x) \left( \frac{\hbar}{i} \frac{\partial}{\partial x} \right) \Psi_n(x)\,dx = 0,
\langle E_n \rangle = \int \Psi_n^*(x) \left( i\hbar \frac{\partial}{\partial t} \right) \Psi_n(x)\,dx = \frac{n^2 \pi^2 \hbar^2}{2 m L^2}.

Summary of Distribution Functions
Available on the web site.

Intermediate Lab PHYS 3870
The Gaussian Distribution Function
References: Taylor Ch. 5

Gaussian Integrals
Factorial approximations (Stirling):
n! \approx \sqrt{2\pi n}\; n^n e^{-n} \left( 1 + \frac{1}{12n} + O(1/n^2) \right)
\log n! \approx \tfrac{1}{2}\log 2\pi + \left( n + \tfrac{1}{2} \right) \log n - n + \frac{1}{12n} + O(1/n^2)
\log n! \approx n \log n - n (keeping only terms that grow at least linearly with n)
Gaussian integrals:
I_m = 2 \int_0^{\infty} x^m e^{-x^2}\,dx, \quad m > -1.
Substituting y = x^2 (so dx = \tfrac{1}{2} y^{-1/2}\,dy) gives
I_m = \int_0^{\infty} y^n e^{-y}\,dy = \Gamma(n + 1), \quad n = (m - 1)/2.
I_0 = \Gamma(1/2) = \sqrt{\pi} (m = 0, n = -1/2)
I_{2k} = \Gamma(k + \tfrac{1}{2}) = \left( k - \tfrac{1}{2} \right)\left( k - \tfrac{3}{2} \right) \cdots \tfrac{3}{2} \cdot \tfrac{1}{2}\,\sqrt{\pi} (even m = 2k > 0, n = k - 1/2)
I_{2k+1} = \Gamma(k + 1) = k! (odd m = 2k + 1 > 0, n = k \ge 0)
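A numeric sanity check of these identities (my own sketch; scipy's gamma, factorial, and quad are assumed dependencies):

```python
import numpy as np
from scipy import integrate
from scipy.special import gamma, factorial

I = lambda m: 2 * integrate.quad(lambda x: x**m * np.exp(-x**2), 0, np.inf)[0]

print(I(0), np.sqrt(np.pi))     # I_0      = Gamma(1/2) = sqrt(pi)
print(I(4), gamma(2 + 0.5))     # I_{2k}   = Gamma(k + 1/2), here k = 2
print(I(5), factorial(2))       # I_{2k+1} = k!, here k = 2

n = 10                          # Stirling with the 1/(12n) correction
print(np.sqrt(2 * np.pi * n) * n**n * np.exp(-n) * (1 + 1 / (12 * n)), factorial(n))
```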

Gaussian Distribution Function
The Gaussian distribution function is
G_{X,\sigma}(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x - X)^2 / 2\sigma^2}
- Independent variable: x
- Center of distribution (mean): X
- Width of distribution (standard deviation): \sigma
- Normalization constant: \frac{1}{\sigma\sqrt{2\pi}}
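In code, the same definition (a minimal sketch; numpy assumed, and the discretized normalization check is my own addition):

```python
import numpy as np

def gaussian(x, X=0.0, sigma=1.0):
    """Normalized Gaussian G_{X,sigma}(x), centered at X with width sigma."""
    return np.exp(-((x - X) ** 2) / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-6, 6, 4001)
dx = x[1] - x[0]
print((gaussian(x) * dx).sum())   # normalization: ~1
```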

Effects of Increasing N on Gaussian Distribution Functions
Consider the limiting distribution function as N → ∞ and dx → 0.

Defining the Gaussian distribution function in Mathcad
I suggest you investigate these with the Mathcad sheet on the web site.

Using Mathcad to define other common distribution functions
Consider the limiting distribution function as N → ∞ and \Delta_k → 0.

Gaussian Distribution Moments
Consider the Gaussian distribution function G_{X,\sigma}(x).
Use the normalization condition to evaluate the normalization constant (see Taylor, p. 132).
The mean, \bar{x} = X, is the first moment of the Gaussian distribution function (see Taylor, p. 134).
The standard deviation, \sigma_x = \sigma, is the square root of the second moment about the mean of the Gaussian distribution function (see Taylor, p. 143).

When is the mean \bar{x} not the best estimate of X?
Answer: When the distribution is not symmetric about X.
Example: the Cauchy distribution.

When is the mean \bar{x} not the best estimate of X?

When is the mean \bar{x} not the best estimate of X?
Answer: When the distribution has more than one peak.

Intermediate Lab PHYS 3870
The Gaussian Distribution Function and Its Relation to Errors

A Review of Probabilities in Combination
1 Head AND 1 Four: P(H and 4) = P(H) × P(4)
1 Six OR 1 Four: P(6 or 4) = P(6) + P(4) (true for mutually exclusive events on a single roll)
1 Head OR 1 Four: P(H or 4) = P(H) + P(4) − P(H and 4) (true for non-mutually exclusive events)
NOT 1 Six: P(NOT 6) = 1 − P(6)
Probability of a data set of N like measurements (x_1, x_2, ..., x_N):
P(x_1, x_2, ..., x_N) = P(x_1) × P(x_2) × ... × P(x_N)
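Each of these rules can be verified by brute-force simulation; a minimal sketch (my own, numpy assumed):

```python
import numpy as np

rng = np.random.default_rng(3)
N = 1_000_000
coin = rng.integers(0, 2, N)    # 1 = heads
die = rng.integers(1, 7, N)

print(((coin == 1) & (die == 4)).mean(), 0.5 / 6)                # AND: product
print(((die == 6) | (die == 4)).mean(), 2 / 6)                   # OR (exclusive): sum
print(((coin == 1) | (die == 4)).mean(), 0.5 + 1 / 6 - 0.5 / 6)  # OR (non-exclusive)
print((die != 6).mean(), 5 / 6)                                  # NOT
```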

The Gaussian Distribution Function and Its Relation to Errors
We will use the Gaussian distribution as applied to random variables to develop the mathematical basis for:
- Mean
- Standard deviation
- Standard deviation of the mean (SDOM)
- Moments and expectation values
- Error propagation formulas
- Addition of errors in quadrature (for independent and random measurements)
- Numerical values for confidence limits (t-test)
- Principle of maximum likelihood
- Central limit theorem
- Weighted distributions and chi-squared
- Schwarz inequality (i.e., the uncertainty principle) (next lecture)

Gaussian Distribution Moments
Consider the Gaussian distribution function G_{X,\sigma}(x).
Use the normalization condition to evaluate the normalization constant (see Taylor, p. 132).
The mean, \bar{x} = X, is the first moment of the Gaussian distribution function (see Taylor, p. 134).
The standard deviation, \sigma_x = \sigma, is the square root of the second moment about the mean of the Gaussian distribution function (see Taylor, p. 143).

Standard Deviation of Gaussian Distribution
The area under the curve (the probability that X − σ < x < X + σ) is 68%.
- 1σ: within errors
- 5% or ~2σ: significant
- 1% or ~3σ: highly significant ("Ah, that's highly significant!")
- 5 ppm or ~5σ: the standard in high-energy physics (HEP)
See Sec. 10.6: Testing of Hypotheses. A more complete table is in App. A and B.

Error Function of Gaussian Distribution
Error function: the probability that X − tσ < x < X + tσ. Tabulated values: see App. A.
Complementary error function: the probability that x < X − tσ OR x > X + tσ, i.e.,
Prob(x outside tσ) = 1 − Prob(x within tσ).
Probable error: the probability that X − 0.67σ < x < X + 0.67σ is 50%.
The area under the curve (the probability that x lies within tσ) is given in the table at right; a more complete table is in App. A and B.
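In code, these tabulated confidence levels come straight from the error function; a minimal sketch (my own, using scipy.special.erf):

```python
from math import sqrt
from scipy.special import erf

# For a Gaussian, P(within t*sigma) = erf(t / sqrt(2))
for t in (0.67, 1.0, 2.0, 3.0, 5.0):
    p = erf(t / sqrt(2))
    print(f"t = {t:4.2f}: within = {p:.6f}, outside = {1 - p:.2e}")
```

This reproduces the familiar values: 50% at 0.67σ, 68% at 1σ, 95.4% at 2σ, and 99.7% at 3σ.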

Useful Points on Gaussian Distribution
Full width at half maximum: FWHM = 2σ√(2 ln 2) ≈ 2.35σ (see Prob. 5.12).
Points of inflection occur at X ± σ (see Prob. 5.13).

Error Analysis and Gaussian Distribution
Adding a constant: X → X + A shifts the center of the distribution; the width σ is unchanged.
Multiplying by a constant: X → BX and σ → Bσ.

Error Propagation: Addition
Sum of two variables: consider the derived quantity Z = X + Y (with \bar{X} = 0 and \bar{Y} = 0).
Error in Z: multiply the two probabilities,
Prob(x, y) = Prob(x) · Prob(y) ∝ exp[−½ (x²/σ_x² + y²/σ_y²)].
ICBST (it can be shown that; Eq. 5.53) the exponent separates as
exp[−½ (x²/σ_x² + y²/σ_y²)] = exp[−½ (x + y)²/(σ_x² + σ_y²)] · exp[−½ z²],
where the last factor integrates to √(2π). Thus X + Y → Z with
σ_z² = σ_x² + σ_y² (addition in quadrature for random, independent variables!)
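Quadrature addition is easy to see numerically; a minimal Monte Carlo sketch (my own, with arbitrary widths):

```python
import numpy as np

rng = np.random.default_rng(4)
sx, sy = 2.0, 3.0
x = rng.normal(0.0, sx, 1_000_000)
y = rng.normal(0.0, sy, 1_000_000)

z = x + y
print(z.std(), np.hypot(sx, sy))   # both ~3.61: sigma_z = sqrt(sx^2 + sy^2)
```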

General Formula for Error Propagation
How do we determine the error of a derived quantity Z(X, Y, ...) from the errors in X, Y, ...?
General formula for error propagation: see [Taylor, Secs. 3.5 and 3.9].
Uncertainty as a function of one variable [Taylor, Sec. 3.5]:
1. Consider a graphical method of estimating error:
   a) Consider an arbitrary function q(x).
   b) Plot q(x) vs. x.
   c) On the graph, label:
      (1) q_best = q(x_best)
      (2) q_hi = q(x_best + δx)
      (3) q_low = q(x_best − δx)
   d) Making a linear approximation:
      q_hi ≈ q_best + (slope) δx and q_low ≈ q_best − (slope) δx.
   e) Therefore: δq ≈ |dq/dx| δx. Note the absolute value.

General Formula for Error Propagation
General formula for the uncertainty of a function of one variable:
δq = |dq/dx| δx  [Taylor, Eq. 3.23]
Can you now derive the specific rules of error propagation?
1. Addition and subtraction [Taylor, p. 49]
2. Multiplication and division [Taylor, p. 51]
3. Multiplication by a constant (exact number) [Taylor, p. 54]
4. Exponentiation (powers) [Taylor, p. 56]

General Formula for Multiple Variables
Uncertainty of a function of multiple variables [Taylor, Sec. 3.11]:
1. It can easily (no, really) be shown that, for a function q(x, y, z, ...) of several variables,
δq ≈ |∂q/∂x| δx + |∂q/∂y| δy + |∂q/∂z| δz + ...  [Taylor, Eq. 3.47]
2. More correctly, it can be shown that
δq ≤ |∂q/∂x| δx + |∂q/∂y| δy + |∂q/∂z| δz + ...  [Taylor, Eq. 3.47]
where the equals sign represents an upper bound, as discussed above.
3. For a function of several independent and random variables,
δq = √[(∂q/∂x · δx)² + (∂q/∂y · δy)² + (∂q/∂z · δz)² + ...]  [Taylor, Eq. 3.48]
Again, the proof is left for Ch. 5.
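As an illustration, here is a small sketch (my own; the choice q = x·y is arbitrary) comparing the quadrature formula, the upper-bound formula, and a Monte Carlo estimate:

```python
import numpy as np

x, dx = 4.0, 0.10
y, dy = 2.0, 0.05

# For q = x*y: dq/dx = y, dq/dy = x
print(np.hypot(y * dx, x * dy))      # Eq. 3.48: quadrature sum
print(abs(y) * dx + abs(x) * dy)     # Eq. 3.47: upper bound

rng = np.random.default_rng(5)       # Monte Carlo cross-check
q = rng.normal(x, dx, 1_000_000) * rng.normal(y, dy, 1_000_000)
print(q.std())                       # ~ matches the quadrature result
```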

Error Propagation: General Case
How do we determine the error of a derived quantity Z(X, Y, ...) from the errors in X, Y, ...?
Consider the arbitrary derived quantity q(x, y) of two independent random variables x and y.
Expand q(x, y) in a Taylor series about the expected values of x and y (i.e., at points near X and Y):
q(x, y) = q(X, Y) + (∂q/∂x)|_X (x − X) + (∂q/∂y)|_Y (y − Y).
The first term is fixed and shifts the peak of the distribution; the remaining terms are fixed derivatives multiplying distributions centered at X and Y with widths σ_x and σ_y. Multiplying the two probabilities, as before, gives
δq(x, y) = σ_q = √[((∂q/∂x)|_X σ_x)² + ((∂q/∂y)|_Y σ_y)²].

SDOM of Gaussian Distribution
Standard Deviation of the Mean: each measurement has a similar width, σ_{x_i} = σ_x, and the mean \bar{x} = (1/N) Σ x_i has similar partial derivatives, ∂\bar{x}/∂x_i = 1/N. Thus, by the error propagation formula,
σ_{\bar{x}} = σ_x / √N.
The SDOM decreases as the square root of the number of measurements. That is, the relative width, σ/\bar{x}, of the distribution of the mean gets narrower as more measurements are made.
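The 1/√N scaling is easy to demonstrate; a minimal sketch (my own, with arbitrary parameters):

```python
import numpy as np

rng = np.random.default_rng(6)
sigma, N, trials = 2.0, 100, 10_000

# Repeat the experiment 'trials' times; each experiment averages N measurements
means = rng.normal(10.0, sigma, size=(trials, N)).mean(axis=1)
print(means.std(), sigma / np.sqrt(N))   # SDOM = sigma / sqrt(N) = 0.2
```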

Two Key Theorems from Probability
Central Limit Theorem: For random, independent measurements (each with a well-defined expectation value and well-defined variance), the arithmetic mean (average) will be approximately normally distributed.
Principle of Maximum Likelihood: Given the N observed measurements x_1, x_2, ..., x_N, the best estimates for \bar{X} and σ are those values for which the observed x_1, x_2, ..., x_N are most likely.

Mean of Gaussian Distribution as Best Estimate
Principle of Maximum Likelihood: to find the most likely value of the mean (the best estimate of \bar{x}), find the X that yields the highest probability for the data set.
Consider a data set {x_1, x_2, x_3, ..., x_N}, each point randomly distributed about X with width σ. The combined probability for the full data set is the product
(1/σ) e^{−(x_1 − X)²/2σ²} · (1/σ) e^{−(x_2 − X)²/2σ²} ··· (1/σ) e^{−(x_N − X)²/2σ²} = (1/σ^N) e^{−Σ(x_i − X)²/2σ²}.
The best estimate of X follows from the maximum probability, or equivalently the minimum summation in the exponent: minimize the sum, set the derivative with respect to X to 0, and solve. The best estimate of X is the mean, X = (1/N) Σ x_i.

Uncertainty of Best Estimates of Gaussian Distribution
Principle of Maximum Likelihood: to find the most likely value of the standard deviation (the best estimate of the width of the x distribution), find the σ_x that yields the highest probability for the data set.
Consider a data set {x_1, x_2, x_3, ..., x_N}. The combined probability for the full data set is the product
(1/σ) e^{−(x_1 − X)²/2σ²} · (1/σ) e^{−(x_2 − X)²/2σ²} ··· (1/σ) e^{−(x_N − X)²/2σ²} = (1/σ^N) e^{−Σ(x_i − X)²/2σ²}.
Best estimate of X: maximize the probability (minimize the summation); setting the derivative with respect to X to 0 gives X = (1/N) Σ x_i.
Best estimate of σ: maximize the probability; setting the derivative with respect to σ to 0 gives σ² = (1/N) Σ (x_i − X)² (see Prob. 5.26).
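The maximum-likelihood result can be checked by numerically maximizing the likelihood; a minimal sketch (my own, using scipy.optimize, with σ parametrized by its logarithm to keep it positive):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
data = rng.normal(5.0, 1.5, size=500)

# Negative log of the likelihood (1/sigma^N) exp(-sum((x - X)^2) / 2 sigma^2)
def nll(params):
    X, log_sigma = params
    sigma = np.exp(log_sigma)
    return len(data) * np.log(sigma) + np.sum((data - X) ** 2) / (2 * sigma**2)

fit = minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
print(fit.x[0], np.exp(fit.x[1]))       # numerical ML estimates of (X, sigma)
print(data.mean(), data.std(ddof=0))    # closed-form ML answers: they agree
```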

Intermediate Lab PHYS 3870
Combining Data Sets: Weighted Averages
References: Taylor Ch. 7

Weighted Averages
Question: How can we properly combine two or more separate, independent measurements of the same randomly distributed quantity to determine a best combined value with its uncertainty?

Weighted Averages
Consider two measurements of the same quantity, each described by a random Gaussian distribution: x_1 ± σ_1 and x_2 ± σ_2. Assume negligible systematic errors.
The probability of measuring two such measurements is
Prob_X(x_1, x_2) = Prob_X(x_1) · Prob_X(x_2) = (1/σ_1 σ_2) e^{−χ²/2},
where χ² ≡ ((x_1 − X)/σ_1)² + ((x_2 − X)/σ_2)².
To find the best value for X, find the maximum Prob, or equivalently the minimum χ².
Note: χ², or chi-squared, is the sum of the squares of the deviations from the mean, each divided by the corresponding uncertainty.
Such methods are called Methods of Least Squares. They follow directly from the Principle of Maximum Likelihood.

Weighted Averages
The probability of measuring two such measurements is
Prob_X(x_1, x_2) = Prob_X(x_1) · Prob_X(x_2) = (1/σ_1 σ_2) e^{−χ²/2},
where χ² ≡ ((x_1 − X)/σ_1)² + ((x_2 − X)/σ_2)².
The best estimate of X follows from the maximum probability, or the minimum χ²: minimize the sum, set the derivative with respect to X to 0, and solve for the best estimate of X. This leads to
x_wavg = (w_1 x_1 + w_2 x_2)/(w_1 + w_2) = Σ w_i x_i / Σ w_i, where w_i = 1/σ_i².
Note: if w_1 = w_2, we recover the standard result x_wavg = (x_1 + x_2)/2.
Finally, the width of a weighted average distribution is
σ_wavg = 1/√(Σ_i w_i).
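In code, the weighted average and its width are two lines; a minimal sketch (my own helper function, numpy assumed):

```python
import numpy as np

def weighted_average(x, sigma):
    """Best combined value and width for independent Gaussian measurements."""
    w = 1.0 / np.asarray(sigma, dtype=float) ** 2
    return np.sum(w * x) / np.sum(w), 1.0 / np.sqrt(np.sum(w))

# Hypothetical example: two measurements of the same quantity
print(weighted_average([10.2, 9.9], [0.1, 0.3]))  # pulled toward the more precise point
```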

Weighted Averages on Steroids
A very powerful method for combining data from different sources with different methods and uncertainties (or, indeed, data of related measured and calculated quantities) is Kalman filtering.
"The Kalman filter, also known as linear quadratic estimation (LQE), is an algorithm that uses a series of measurements observed over time, containing noise (random variations) and other inaccuracies, and produces estimates of unknown variables that tend to be more precise than those based on a single measurement alone. ... The Kalman filter keeps track of the estimated state of the system and the variance or uncertainty of the estimate. The estimate is updated using a state transition model and measurements. x̂_{k|k−1} denotes the estimate of the system's state at time step k before the k-th measurement y_k has been taken into account; P_{k|k−1} is the corresponding uncertainty." --Wikipedia, 2013
Ludger Scherliess, of USU Physics, is a world expert at using Kalman filtering for the assimilation of satellite and ground-based data into the USU GAMES model to predict space weather.
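To connect Kalman filtering back to weighted averages: for a static (constant) state, the filter's measurement update reduces to exactly the recursive form of the weighted average above. A minimal scalar sketch (my own illustration, not the general multivariate filter):

```python
import numpy as np

rng = np.random.default_rng(8)
truth, sigma_meas = 10.0, 1.0
measurements = rng.normal(truth, sigma_meas, size=20)

x_est, P = 0.0, 1e6                      # initial estimate and (huge) variance
for y in measurements:
    K = P / (P + sigma_meas**2)          # Kalman gain: weight given to new data
    x_est = x_est + K * (y - x_est)      # measurement update of the estimate
    P = (1 - K) * P                      # updated variance of the estimate
print(x_est, np.sqrt(P))                 # width approaches sigma_meas / sqrt(N)
```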

Intermediate Lab PHYS 3870
Rejecting Data: Chauvenet's Criterion
References: Taylor Ch. 6

Rejecting Data
What is a good criterion for rejecting data?
Question: When is it reasonable to discard a seemingly unreasonable data point from a set of randomly distributed measurements?
- Never
- Whenever it makes things look better
- Chauvenet's criterion provides a (quantitative) compromise

Rejecting Data: Zallen's Criterion
Question: When is it reasonable to discard a seemingly unreasonable data point from a set of randomly distributed measurements? Never.

Rejecting Data: Disney's Criterion
Question: When is it reasonable to discard a seemingly unreasonable data point from a set of randomly distributed measurements? Whenever it makes things look better.
Disney's First Law: Wishing will make it so.
Disney's Second Law: Dreams are more colorful than reality.

Rejecting Data: Chauvenet's Criterion
Data may be rejected if the expected number of measurements at least as deviant as the suspect measurement is less than one half.
Consider a set of N measurements of a single quantity {x_1, x_2, ..., x_N}:
1. Calculate <x> and σ_x, and then determine the fractional deviations from the mean of all the points, t = |x − <x>| / σ_x.
2. For the suspect point(s) x_suspect, find the probability of such a point occurring in N measurements:
n = (expected number as deviant as x_suspect) = N · Prob(outside t_suspect σ_x).
Reject the point if n < 1/2.
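A compact implementation of the criterion (my own sketch; the six-point data set is hypothetical, chosen so that the last point is suspect):

```python
import numpy as np
from scipy.special import erf

def chauvenet_reject(data):
    """Return a boolean mask: True where Chauvenet's criterion rejects the point."""
    data = np.asarray(data, dtype=float)
    N = len(data)
    t = np.abs(data - data.mean()) / data.std(ddof=1)   # deviations in units of sigma
    prob_outside = 1.0 - erf(t / np.sqrt(2.0))          # two-sided tail probability
    return N * prob_outside < 0.5                       # expected count < 1/2

data = [3.8, 3.5, 3.9, 3.9, 3.4, 1.8]
print(chauvenet_reject(data))   # only the 1.8 point is flagged
```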

Error Function of Gaussian Distribution
Error function: the probability that X − tσ < x < X + tσ. Tabulated values: see App. A.
The area under the curve (the probability that x lies within tσ) is given in the table at right.
Probable error: the probability that X − 0.67σ < x < X + 0.67σ is 50%.

Chauvenet's Criterion

Chauvenet's Criterion: Example 1

Chauvenet's Criterion: Details (1)

Chauvenet's Criterion: Details (2)

Chauvenet's Criterion: Details (3)

Chauvenet's Criterion: Example 2

Intermediate Lab PHYS 3870
Summary of Probability Theory

Summary of Probability Theory I

Summary of Probability Theory II

Summary of Probability Theory III

Summary of Probability Theory IV