Analyzing Two-Dimensional Data


/7/06

Analyzing Two-Dimensional Data

The most common analytical measurements involve the determination of an unknown concentration based on the response of an analytical procedure (usually instrumental). Such a measurement requires calibration, or the preparation of a calibration curve: determination of the response of the method to solutions of known concentration (standards). Once the response for the standards is known, the concentration of an unknown can be determined IF the concentration/response relationship is well defined.

Ideally we prefer a linear relationship. The relationship doesn't have to be linear as long as you know what it is, and you can often force nonlinear relationships to appear linear by appropriate experiment design.

[Figure: sketch of a calibration curve, response vs. concentration]

Important questions to ask:
1. How do we define the best line?
2. How do errors in our data affect this line?
3. How confident can we be of the unknown concentration that we calculate from our calibration curve?

Example: Protein determination using spectrophotometry. IMPORTANT: Absorbance ∝ protein mass.

    Protein (μg)   Absorbance
        0.00         0.466
        9.36         0.756
       18.72         0.843
       28.08         1.226
       37.44         1.280

Our objective is to draw the best-fit line through the data, but how? Minimize the deviation (spread) of the data around the line.

[Figure: scatter plot of absorbance (0.40 to 1.40) vs. protein mass (0 to 40 μg)]

Mathematically, this is a least-squares analysis. We work to minimize the square of the deviation (to remove effects of sign) from our calculated line. Qualitatively this is easy; quantitatively, things are a little more challenging.

Typically we are working to define a straight line, y = mx + b. Assume that values for x have little error, but more error is associated with values for y. Since our data have some scatter, each datum may deviate from the line in the y-direction. This deviation is also called a residual (d_i):

    d_i = y_i - y_line = y_i - (m x_i + b)

We really want to minimize the square of the deviations (actually the SUM of the squares):

    Σ d_i² = Σ (y_i - m x_i - b)²
           = Σ (y_i² - 2 m x_i y_i - 2 b y_i + 2 m b x_i + m² x_i² + b²)

How do we do this?
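Before solving for m and b, it may help to see the objective as code. This is a minimal sketch of the sum of squared deviations for a candidate line (the name sum_sq_dev is my own, not from the lecture):

```python
def sum_sq_dev(x, y, m, b):
    """Sum of squared y-deviations (residuals) of the data
    from the candidate line y = m*x + b."""
    return sum((yi - (m * xi + b)) ** 2 for xi, yi in zip(x, y))
```

The least-squares line is, by definition, the (m, b) pair that makes this sum as small as possible.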

Two parameters, so two partial derivatives to set equal to zero:

    ∂(Σd_i²)/∂m = Σ(-2 x_i y_i + 2 b x_i + 2 m x_i²) = -2 Σx_i y_i + 2 b Σx_i + 2 m Σx_i² = 0

    ∂(Σd_i²)/∂b = Σ(-2 y_i + 2 m x_i + 2 b) = -2 Σy_i + 2 m Σx_i + 2 n b = 0

This produces two equations in two unknowns (m and b); we should be able to solve this system!

With a little hand-waving (and the magic of calculus and linear algebra), we can minimize the expression above and solve for m and b. When we do, we get:

    m = (1/D) | Σ(x_i y_i)   Σx_i |        b = (1/D) | Σx_i²   Σ(x_i y_i) |
              | Σy_i         n    |                  | Σx_i    Σy_i       |

Each operation involves taking the determinant of a matrix:

    D = | Σx_i²   Σx_i |          | a   b |
        | Σx_i    n    |          | c   d | = a·d - b·c

There is only one solution to the system of equations, so there is only one least-squares line!
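As a check on the algebra, the determinant formulas above can be sketched in a few lines of Python (a minimal sketch; the function name fit_line and the variable names are my own, not from the lecture):

```python
def fit_line(x, y):
    """Least-squares slope m and intercept b for y = m*x + b,
    computed from the determinant formulas."""
    n = len(x)
    sum_x = sum(x)
    sum_y = sum(y)
    sum_xx = sum(xi * xi for xi in x)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    # D = | sum_xx  sum_x ; sum_x  n | = n*sum_xx - (sum_x)^2
    D = n * sum_xx - sum_x * sum_x
    m = (n * sum_xy - sum_x * sum_y) / D
    b = (sum_xx * sum_y - sum_xy * sum_x) / D
    return m, b
```

For the protein data this gives m ≈ 0.0224 and b ≈ 0.4946, matching the worked example.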

Let's apply this to our example data:

    Protein (μg)   Absorbance    x_i·y_i        x_i²
        0.00         0.466        0.000        0.000
        9.36         0.756        7.076       87.609
       18.72         0.843       15.781      350.438
       28.08         1.226       34.426      788.486
       37.44         1.280       47.923     1401.754
    Σ:  93.60        4.571      105.206     2628.288

    D = n Σx_i² - (Σx_i)² = 5(2628.288) - (93.60)² = 4380.48

    m = [n Σ(x_i y_i) - Σx_i Σy_i] / D = 0.022415

    b = [Σx_i² Σy_i - Σ(x_i y_i) Σx_i] / D = 0.4946

Now let's calculate some points based on our line:

    Protein (μg)   Absorbance    y_calc       d_i        d_i²
        0.00         0.466       0.495      -0.029     0.000818
        9.36         0.756       0.704       0.052     0.002663
       18.72         0.843       0.914      -0.071     0.005069
       28.08         1.226       1.124       0.102     0.010404
       37.44         1.280       1.334      -0.054     0.002894
    Σ:  93.60        4.571       4.571       0.000     0.021848

[Figure: calibration data with the least-squares line, absorbance vs. protein (μg)]

How reliable are the m, b, and y values we determine based on our calibration curve? The majority of our confidence depends on the scatter of y values about the line, i.e. the standard deviation s_y (also called s_r, the standard deviation about the regression):

    s_y = sqrt[ Σ(d_i - d̄)² / (n - 2) ] = sqrt[ Σd_i² / (n - 2) ]

Like usual, the number of degrees of freedom is in the denominator. Why n - 2 degrees of freedom? Two parameters (m and b) have already been estimated from the same data.

Other standard deviations depend on s_y:

    s_m² = s_y² n / D

    s_b² = s_y² Σx_i² / D

    s_x = (s_y / m) sqrt[ 1/k + 1/n + n(ȳ_unknown - ȳ)² / (m² D) ]

where k is the number of replicate measurements of the unknown and n is the number of calibration points.
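The uncertainty formulas above can be sketched in Python, applied to the slope and intercept from the worked example (a minimal sketch; the function name regression_stats is my own, and s_x is omitted because it also depends on the replicate measurements of the unknown):

```python
import math

def regression_stats(x, y, m, b):
    """Standard deviation about the regression (s_y) and the
    resulting uncertainties in slope (s_m) and intercept (s_b)."""
    n = len(x)
    residuals = [yi - (m * xi + b) for xi, yi in zip(x, y)]
    s_y = math.sqrt(sum(d * d for d in residuals) / (n - 2))
    sum_xx = sum(xi * xi for xi in x)
    D = n * sum_xx - sum(x) ** 2
    s_m = math.sqrt(s_y ** 2 * n / D)
    s_b = math.sqrt(s_y ** 2 * sum_xx / D)
    return s_y, s_m, s_b
```

For the protein data this gives s_y ≈ 0.085, s_m ≈ 0.0029, and s_b ≈ 0.066.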

Confidence Limits for m, b, x_calc

    m ± t·s_m
    b ± t·s_b
    x_calc ± t·s_x

where t is Student's t for n - 2 degrees of freedom.

R² and Such

Plotting in Excel (or on my calculator) gives me R² (or R) values. What the #$%@ do these mean?

R² (or r²), the Coefficient of Determination, is the fraction of the scatter in the data that can be described by the linear relationship. R² compares the variation of the data about the least-squares line to the total variation due to random scatter:

    R² = 1 - Σ(y_i - ŷ_i)² / Σ(y_i - ȳ)²

An R² close to 1 doesn't guarantee good precision in m and b.

[Figure: two data sets whose fits have nearly identical R² values (0.954 and 0.957) but different uncertainties in the fitted parameters; slopes (0.56 ± 0.07) and (0.39 ± 0.05)]
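A minimal sketch of this definition of R², applied to the protein calibration data (the function name r_squared is my own, not from the lecture):

```python
def r_squared(x, y, m, b):
    """Coefficient of determination for the line y = m*x + b:
    the fraction of the total scatter in y described by the line."""
    y_bar = sum(y) / len(y)
    ss_resid = sum((yi - (m * xi + b)) ** 2 for xi, yi in zip(x, y))  # scatter about the line
    ss_total = sum((yi - y_bar) ** 2 for yi in y)                     # total scatter
    return 1 - ss_resid / ss_total
```

For the protein data and the fitted line (m = 0.022415, b = 0.4946) this gives R² ≈ 0.95; the number alone says nothing about the absolute uncertainties in m and b. Those come from the confidence limits: with n - 2 = 3 degrees of freedom, the 95% value of t is 3.182, so the slope, for example, is m ± t·s_m ≈ 0.0224 ± 0.0092.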