IE 361 Module 18. Reading: Section 2.5 Statistical Methods for Quality Assurance. ISU and Analytics Iowa LLC


IE 361 Module 18: Calibration Studies and Inference Based on Simple Linear Regression. Reading: Section 2.5, Statistical Methods for Quality Assurance. (ISU and Analytics Iowa LLC) IE 361 Module 18 1 / 13

Calibration Studies

Calibration is an essential activity in the qualification and maintenance of measurement devices or systems. The basic idea is that one uses a measurement device to produce measurements on "standard" specimens with (relatively well) "known" values of a measurand, and sees how the measurements compare to those known values. If there are systematic discrepancies between what is known to be true and what the device reads, the plan is to devise a conversion scheme that (in future use of the device) adjusts what is read to something hopefully closer to the (future) truth. A slight extension of "regression" analysis (curve fitting) from an elementary statistics course is the relevant statistical methodology for making this conversion.

Calibration studies produce "true"/gold-standard measurement values x and "local" measurements y, and seek a "conversion" method from y to x. (Strictly speaking, y need not even be in the same units as x.) Regression analysis can provide both "point conversions" and measures of uncertainty (the latter through inversion of "prediction limits"). The simplest version of this is the case where

$y \approx \beta_0 + \beta_1 x$

This is "linear calibration." The standard statistical model for such a circumstance is

$y = \beta_0 + \beta_1 x + \epsilon$

for a normal error $\epsilon$ with mean 0 and standard deviation $\sigma$. ($\sigma$ describes how much y's vary for a fixed x, and in the present context amounts to a "repeatability" standard deviation.) This can be pictured as follows.

Figure: Normal Simple Linear Regression Model

For n data pairs $(x_i, y_i)$, simple linear regression methodology allows one to make confidence intervals and tests associated with the model and, more important for our present purposes, prediction limits for a new y associated with a new x. These are of the form

$(b_0 + b_1 x) \pm t\, s_{LF} \sqrt{1 + \frac{1}{n} + \frac{(x - \bar{x})^2}{\sum (x_i - \bar{x})^2}}$

where the least squares line is $\hat{y} = b_0 + b_1 x$ and $s_{LF}$ is an estimate of $\sigma$ derived from the fit of the line to the data. Any good statistical package will compute and plot these limits along with a least squares line through the data set.
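For concreteness, the displayed prediction limits can be computed directly. The sketch below is a hypothetical helper (the function name and arguments are illustrative, not from the module) that fits the least squares line, computes $s_{LF}$, and applies the formula using SciPy's t quantile:

```python
import numpy as np
from scipy import stats

def prediction_limits(x, y, x_new, conf=0.95):
    """Prediction limits for a new y at x_new under the normal simple
    linear regression model (hypothetical helper, not from the module)."""
    n, xbar = len(x), np.mean(x)
    Sxx = np.sum((x - xbar) ** 2)
    b1 = np.sum((x - xbar) * (y - np.mean(y))) / Sxx    # least squares slope
    b0 = np.mean(y) - b1 * xbar                         # least squares intercept
    s_LF = np.sqrt(np.sum((y - b0 - b1 * x) ** 2) / (n - 2))  # root mean square error
    t = stats.t.ppf(1 - (1 - conf) / 2, df=n - 2)
    half = t * s_LF * np.sqrt(1 + 1 / n + (x_new - xbar) ** 2 / Sxx)
    return b0 + b1 * x_new - half, b0 + b1 * x_new + half
```

In practice one would simply read these limits off a statistical package's regression report, as the module does with JMP below.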

Example 18-1 (Mandel NBS/NIST) "Gold-standard" and "local" measurements on n = 14 specimens (units not given) are as below. Figure: JMP Data Table for Mandel's Calibration Example

Example 18-1 (Mandel NBS/NIST) A JMP report for simple linear regression, with prediction limits for an additional value of y (which, of course, change with x) plotted, is below. Figure: JMP Report for Simple Linear Regression With Mandel's Data

What is of most interest here is (of course) what regression technology indicates about measurement and calibration. In particular: from a simple linear regression output, $s_{LF} = \sqrt{MSE}$ = "root mean square error" is a kind of estimated repeatability standard deviation. One may make confidence intervals for $\sigma_{repeatability}$ based on this "sample standard deviation" using $\nu = n - 2$ degrees of freedom and limits

$s_{LF}\sqrt{\frac{n-2}{\chi^2_{upper}}}$ and $s_{LF}\sqrt{\frac{n-2}{\chi^2_{lower}}}$
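These chi-square limits are easy to compute; the sketch below is a minimal hypothetical helper (the name is not from the module) using SciPy's $\chi^2$ quantile function:

```python
import numpy as np
from scipy import stats

def sigma_limits(s_LF, n, conf=0.95):
    """Chi-square based confidence limits for sigma from s_LF with
    nu = n - 2 degrees of freedom (hypothetical helper name)."""
    nu, alpha = n - 2, 1 - conf
    chi2_upper = stats.chi2.ppf(1 - alpha / 2, nu)   # upper alpha/2 point
    chi2_lower = stats.chi2.ppf(alpha / 2, nu)       # lower alpha/2 point
    return s_LF * np.sqrt(nu / chi2_upper), s_LF * np.sqrt(nu / chi2_lower)
```

With the module's numbers ($s_{LF} \approx 25.3$, n = 14) this reproduces limits near 18.1 and 41.8, matching the worked example below.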

The least squares equation $\hat{y} = b_0 + b_1 x$ can be solved for x, giving

$\hat{x} = \frac{y - b_0}{b_1}$

as a way of estimating a "gold-standard" value of a measurand x from a measured local value y. It turns out that one can take the prediction limits for y and "turn them around" to get confidence limits for the x corresponding to a measured local y. This provides a defensible way to set "error bounds" on what y indicates about x.
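The "turning around" step can be sketched numerically: the confidence set for x given a measured y0 is the set of x values whose prediction interval for y contains y0. The illustration below uses simulated data (not Mandel's, which are only shown in a figure) and a simple grid scan; all names and numbers are hypothetical:

```python
import numpy as np
from scipy import stats

# Simulated calibration data: 14 specimens with a roughly linear relation.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 14)
y = 40 + 0.9 * x + rng.normal(0, 0.5, size=x.size)

# Least squares fit and the ingredients of the prediction limits.
n, xbar = x.size, x.mean()
Sxx = np.sum((x - xbar) ** 2)
b1 = np.sum((x - xbar) * (y - y.mean())) / Sxx
b0 = y.mean() - b1 * xbar
s_LF = np.sqrt(np.sum((y - b0 - b1 * x) ** 2) / (n - 2))
t = stats.t.ppf(0.975, n - 2)

# Confidence limits for x given a measured y0: keep the grid x values
# whose 95% prediction interval for y contains y0.
y0 = 45.0
grid = np.linspace(0, 10, 2001)
half = t * s_LF * np.sqrt(1 + 1 / n + (grid - xbar) ** 2 / Sxx)
inside = np.abs(y0 - (b0 + b1 * grid)) <= half
x_lo, x_hi = grid[inside].min(), grid[inside].max()
```

The resulting interval (x_lo, x_hi) brackets the point conversion $\hat{x} = (y_0 - b_0)/b_1$, which is exactly what the module's graphical construction with y = 1500 does on the JMP plot.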

Example 18-1 (Mandel NBS/NIST) Since from the JMP report $y = 42.216299 + 0.881819x$ with $s_{LF} = 25.32578$, we might expect a local (y) repeatability standard deviation of around 25 (in the y units). In fact, 95% confidence limits for $\sigma$ can be made (using $n - 2 = 12$ degrees of freedom) as the limits

$25.3\sqrt{\frac{12}{23.337}}$ and $25.3\sqrt{\frac{12}{4.404}}$

i.e., 18.1 and 41.8. Making use of the slope and intercept of the least squares line, a "conversion formula" for going from y to x is

$\hat{x} = \frac{y - 42.216299}{0.881819}$

Example 18-1 (Mandel NBS/NIST) The following figure shows how one can set 95% confidence limits on x if y = 1500 is observed, using a plot of 95% prediction limits for y given x. Figure: 95% Confidence Limits for x When y = 1500 is Measured, Derived From Traces of 95% Prediction Limits for y at Given x Values

As one final consideration for Example 18-1, it is worthwhile to note what a standard simple linear regression analysis has to say about the "linearity" of the local measurement device, assuming that the (unavailable) units of x and y are meant to be the same. While a scatterplot of the n = 14 data pairs $(x_i, y_i)$ in the example is reasonably straight-line, that is not really the issue; something stronger is at stake. As we mentioned in Module 4, the term "linearity" as typically employed in metrology contexts concerns constant bias. In regression terms, this requires a straight-line relationship between measurand and average measurement with slope 1. The methods of elementary regression analysis say that confidence limits for the slope $\beta_1$ of the simple linear regression model are

$b_1 \pm t\, \frac{s_{LF}}{\sqrt{\sum (x_i - \bar{x})^2}}$

Example 18-1 (Mandel NBS/NIST) The report on panel 7 shows that $b_1 = .882$ and

$\frac{s_{LF}}{\sqrt{\sum (x_i - \bar{x})^2}} = SE_{b_1} = .012$

so that (using the upper 2.5% point of the $t_{12}$ distribution, 2.179) 95% confidence limits for $\beta_1$ are

$.882 \pm 2.179(.012)$ or $.882 \pm .026$

A 95% confidence interval for $\beta_1$ clearly does not include 1.0. So if the intent was that the local measurement device be "linear," at least before calibration it was not: bias was not constant.
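The arithmetic on this final panel can be checked directly from the reported summary numbers; the snippet below recomputes the $t_{12}$ quantile and the interval:

```python
from scipy import stats

# Recomputing the module's 95% confidence limits for the slope beta_1
# from the reported summary values b1 = .882 and SE_b1 = .012.
b1, se_b1, nu = 0.882, 0.012, 12
t = stats.t.ppf(0.975, nu)          # upper 2.5% point of t_12 (about 2.179)
lo, hi = b1 - t * se_b1, b1 + t * se_b1
# The interval runs from about .856 to .908 and excludes 1.0, so the
# device's bias is not constant: it is not "linear" in the metrology sense.
```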