Business Statistics. Chapter 14 Introduction to Linear Regression and Correlation Analysis QMIS 220. Dr. Mohammad Zainal

Department of Quantitative Methods & Information Systems Business Statistics Chapter 14 Introduction to Linear Regression and Correlation Analysis QMIS 220 Dr. Mohammad Zainal

Chapter Goals After completing this chapter, you should be able to: Calculate and interpret the simple correlation between two variables Determine whether the correlation is significant Calculate and interpret the simple linear regression equation for a set of data Understand the assumptions behind regression analysis Determine whether a regression model is significant QMIS 220, by Dr. M. Zainal Chap 13-2

Chapter Goals After completing this chapter, you should be able to: Calculate and interpret confidence intervals for the regression coefficients Recognize regression analysis applications for purposes of prediction and description Recognize some potential problems if regression analysis is used incorrectly Recognize nonlinear relationships between two variables (continued) QMIS 220, by Dr. M. Zainal Chap 13-3

Scatter Plots and Correlation A scatter plot (or scatter diagram) is used to show the relationship between two variables Correlation analysis is used to measure strength of the association (linear relationship) between two variables Only concerned with strength of the relationship No causal effect is implied QMIS 220, by Dr. M. Zainal Chap 13-4

Scatter Plot Examples [Four scatter plots of y vs. x: two showing linear relationships, two showing curvilinear relationships] QMIS 220, by Dr. M. Zainal Chap 13-5

Scatter Plot Examples (continued) [Four scatter plots of y vs. x: two showing strong relationships, two showing weak relationships] QMIS 220, by Dr. M. Zainal Chap 13-6

Scatter Plot Examples (continued) [Two scatter plots of y vs. x showing no relationship] QMIS 220, by Dr. M. Zainal Chap 13-7

Correlation Coefficient Correlation measures the strength of the linear association between two variables The sample correlation coefficient r is a measure of the strength of the linear relationship between two variables, based on sample observations (continued) QMIS 220, by Dr. M. Zainal Chap 13-8

Features of r Unit free Range between -1 and 1 The closer to -1, the stronger the negative linear relationship The closer to 1, the stronger the positive linear relationship The closer to 0, the weaker the linear relationship QMIS 220, by Dr. M. Zainal Chap 13-9

Examples of Approximate r Values [Five scatter plots of y vs. x illustrating r = -1, r = -.6, r = 0, r = +.3, and r = +1] QMIS 220, by Dr. M. Zainal Chap 13-10

Calculating the Correlation Coefficient Sample correlation coefficient:
$r = \dfrac{\sum (x - \bar{x})(y - \bar{y})}{\sqrt{[\sum (x - \bar{x})^2][\sum (y - \bar{y})^2]}}$
or the algebraic equivalent:
$r = \dfrac{n\sum xy - \sum x \sum y}{\sqrt{[n\sum x^2 - (\sum x)^2][n\sum y^2 - (\sum y)^2]}}$
where: r = Sample correlation coefficient, n = Sample size, x = Value of the independent variable, y = Value of the dependent variable QMIS 220, by Dr. M. Zainal Chap 13-11

Calculation Example Tree Height (y) and Trunk Diameter (x):
y     x     xy     y^2     x^2
35    8     280    1225    64
49    9     441    2401    81
27    7     189    729     49
33    6     198    1089    36
60    13    780    3600    169
21    7     147    441     49
45    11    495    2025    121
51    12    612    2601    144
Σy = 321   Σx = 73   Σxy = 3142   Σy^2 = 14111   Σx^2 = 713
QMIS 220, by Dr. M. Zainal Chap 13-12

Calculation Example (continued)
$r = \dfrac{n\sum xy - \sum x \sum y}{\sqrt{[n\sum x^2 - (\sum x)^2][n\sum y^2 - (\sum y)^2]}} = \dfrac{8(3142) - (73)(321)}{\sqrt{[8(713) - (73)^2][8(14111) - (321)^2]}} = 0.886$
[Scatter plot of Tree Height (y) vs. Trunk Diameter (x) for the eight sample trees] QMIS 220, by Dr. M. Zainal Chap 13-13
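
A minimal Python sketch (not part of the original slides) verifying this calculation: it recomputes r for the eight tree observations using the computational formula above. The only inputs are the (x, y) pairs from the worksheet table.

```python
# Verify r for the tree example (trunk diameter x, tree height y).
import math

x = [8, 9, 7, 6, 13, 7, 11, 12]       # trunk diameter
y = [35, 49, 27, 33, 60, 21, 45, 51]  # tree height
n = len(x)

sum_x, sum_y = sum(x), sum(y)                      # 73, 321
sum_xy = sum(xi * yi for xi, yi in zip(x, y))      # 3142
sum_x2 = sum(xi ** 2 for xi in x)                  # 713
sum_y2 = sum(yi ** 2 for yi in y)                  # 14111

r = (n * sum_xy - sum_x * sum_y) / math.sqrt(
    (n * sum_x2 - sum_x ** 2) * (n * sum_y2 - sum_y ** 2)
)
print(round(r, 3))  # 0.886
```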

Excel Output Excel Correlation Output (Tools / Data Analysis / Correlation):
                 Tree Height    Trunk Diameter
Tree Height      1
Trunk Diameter   0.886231       1
QMIS 220, by Dr. M. Zainal Chap 13-14

Significance Test for Correlation Hypotheses: H_0: ρ = 0 (no correlation) vs. H_A: ρ ≠ 0 (correlation exists). Test statistic (with n − 2 degrees of freedom):
$t = \dfrac{r}{\sqrt{\dfrac{1 - r^2}{n - 2}}}$
The Greek letter ρ (rho) represents the population correlation coefficient. QMIS 220, by Dr. M. Zainal Chap 13-15

Example: Tree Height and Trunk Diameter Is there evidence of a linear relationship between tree height and trunk diameter at the .05 level of significance? H_0: ρ = 0 (no correlation), H_1: ρ ≠ 0 (correlation exists). α = .05, df = 8 − 2 = 6
$t = \dfrac{r}{\sqrt{\dfrac{1 - r^2}{n - 2}}} = \dfrac{.886}{\sqrt{\dfrac{1 - .886^2}{8 - 2}}} = 4.68$
QMIS 220, by Dr. M. Zainal Chap 13-16

Example: Test Solution
$t = \dfrac{r}{\sqrt{\dfrac{1 - r^2}{n - 2}}} = \dfrac{.886}{\sqrt{\dfrac{1 - .886^2}{8 - 2}}} = 4.68$
d.f. = 8 − 2 = 6, α/2 = .025, critical values ±t_{α/2} = ±2.4469. Since 4.68 > 2.4469, the test statistic falls in the rejection region. Decision: Reject H_0. Conclusion: There is sufficient evidence of a linear relationship at the 5% level of significance. QMIS 220, by Dr. M. Zainal Chap 13-17
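
A short Python sketch (an addition, not from the slides) of the same t test; the critical value 2.4469 is the one quoted on the slide (t with 6 d.f., α = .05 two-tailed), which scipy.stats.t.ppf(0.975, 6) would also return.

```python
# t test for H0: rho = 0 using the tree-example correlation.
import math

r, n = 0.886, 8
t_stat = r / math.sqrt((1 - r ** 2) / (n - 2))
t_crit = 2.4469                   # t(.025, 6) from the slide

print(round(t_stat, 2))           # 4.68
print(abs(t_stat) > t_crit)       # True -> reject H0
```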

Introduction to Regression Analysis Regression analysis is used to: Predict the value of a dependent variable based on the value of at least one independent variable Explain the impact of changes in an independent variable on the dependent variable Dependent variable: the variable we wish to explain Independent variable: the variable used to explain the dependent variable QMIS 220, by Dr. M. Zainal Chap 13-18

Simple Linear Regression Model Only one independent variable, x Relationship between x and y is described by a linear function Changes in y are assumed to be caused by changes in x QMIS 220, by Dr. M. Zainal Chap 13-19

Types of Regression Models [Four scatter plots illustrating: positive linear relationship, negative linear relationship, relationship NOT linear, no relationship] QMIS 220, by Dr. M. Zainal Chap 13-20

Population Linear Regression The population regression model:
$y = \beta_0 + \beta_1 x + \varepsilon$
where y = dependent variable, x = independent variable, β_0 = population y intercept, β_1 = population slope coefficient, and ε = random error term (residual). β_0 + β_1x is the linear component; ε is the random error component. QMIS 220, by Dr. M. Zainal Chap 13-21

Linear Regression Assumptions Error values (ε) are statistically independent Error values are normally distributed for any given value of x The probability distribution of the errors is normal The distributions of possible ε values have equal variances for all values of x The underlying relationship between the x variable and the y variable is linear QMIS 220, by Dr. M. Zainal Chap 13-22

Population Linear Regression (continued) [Graph of $y = \beta_0 + \beta_1 x + \varepsilon$ showing, for a value x_i: the observed value of y, the predicted value of y on the line, the random error ε_i for that x value, slope = β_1, and intercept = β_0] QMIS 220, by Dr. M. Zainal Chap 13-23

Estimated Regression Model The sample regression line provides an estimate of the population regression line:
$\hat{y}_i = b_0 + b_1 x_i$
where ŷ_i = estimated (or predicted) y value, b_0 = estimate of the regression intercept, b_1 = estimate of the regression slope, and x_i = value of the independent variable. The individual random error terms e_i have a mean of zero. QMIS 220, by Dr. M. Zainal Chap 13-24

Least Squares Criterion b_0 and b_1 are obtained by finding the values of b_0 and b_1 that minimize the sum of the squared residuals:
$\sum e^2 = \sum (y - \hat{y})^2 = \sum \left(y - (b_0 + b_1 x)\right)^2$
QMIS 220, by Dr. M. Zainal Chap 13-25

The Least Squares Equation The formulas for b_1 and b_0 are:
$b_1 = \dfrac{\sum (x - \bar{x})(y - \bar{y})}{\sum (x - \bar{x})^2}$  and  $b_0 = \bar{y} - b_1 \bar{x}$
algebraic equivalent for b_1:
$b_1 = \dfrac{\sum xy - \dfrac{\sum x \sum y}{n}}{\sum x^2 - \dfrac{(\sum x)^2}{n}}$
QMIS 220, by Dr. M. Zainal Chap 13-26

Interpretation of the Slope and the Intercept b 0 is the estimated average value of y when the value of x is zero b 1 is the estimated change in the average value of y as a result of a one-unit change in x QMIS 220, by Dr. M. Zainal Chap 13-27

Finding the Least Squares Equation The coefficients b 0 and b 1 will usually be found using computer software, such as Excel or Minitab Other regression measures will also be computed as part of computer-based regression analysis QMIS 220, by Dr. M. Zainal Chap 13-28

Simple Linear Regression Example A real estate agent wishes to examine the relationship between the selling price of a home and its size (measured in square feet) A random sample of 10 houses is selected Dependent variable (y) = house price in $1000s Independent variable (x) = square feet QMIS 220, by Dr. M. Zainal Chap 13-29

Sample Data for House Price Model
House Price in $1000s (y)    Square Feet (x)
245     1400
312     1600
279     1700
308     1875
199     1100
219     1550
405     2350
324     2450
319     1425
255     1700
QMIS 220, by Dr. M. Zainal Chap 13-30

Regression Using Excel Data / Data Analysis / Regression QMIS 220, by Dr. M. Zainal Chap 13-31

Excel Output
Regression Statistics: Multiple R 0.76211, R Square 0.58082, Adjusted R Square 0.52842, Standard Error 41.33032, Observations 10
The regression equation is: house price = 98.24833 + 0.10977 (square feet)
ANOVA
             df    SS           MS           F         Significance F
Regression   1     18934.9348   18934.9348   11.0848   0.01039
Residual     8     13665.5652   1708.1957
Total        9     32600.5000

              Coefficients   Standard Error   t Stat    P-value   Lower 95%   Upper 95%
Intercept     98.24833       58.03348         1.69296   0.12892   -35.57720   232.07386
Square Feet   0.10977        0.03297          3.32938   0.01039   0.03374     0.18580
QMIS 220, by Dr. M. Zainal Chap 13-32
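
As a cross-check, here is a minimal Python sketch (not part of the original slides) that reproduces the intercept, slope, and R Square in this output from the least-squares formulas, using only the ten (square feet, price) pairs above.

```python
# Reproduce the Excel regression output for the house price model.
sqft  = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
price = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]
n = len(sqft)

x_bar, y_bar = sum(sqft) / n, sum(price) / n
ss_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(sqft, price))
ss_xx = sum((xi - x_bar) ** 2 for xi in sqft)

b1 = ss_xy / ss_xx            # 0.10977  (Square Feet coefficient)
b0 = y_bar - b1 * x_bar       # 98.24833 (Intercept)

sst = sum((yi - y_bar) ** 2 for yi in price)                          # 32600.5
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(sqft, price))  # ~13665.57
print(round(b0, 5), round(b1, 5), round(1 - sse / sst, 5))            # R Square ~0.58082
```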

Graphical Presentation House price model: scatter plot and regression line [Scatter plot of House Price ($1000s) vs. Square Feet with the fitted regression line]
house price = 98.24833 + 0.10977 (square feet)
QMIS 220, by Dr. M. Zainal Chap 13-33

Interpretation of the Intercept, b_0 house price = 98.24833 + 0.10977 (square feet). b_0 is the estimated average value of y when the value of x is zero (if x = 0 is in the range of observed x values). Here, no houses had 0 square feet, so b_0 = 98.24833 just indicates that, for houses within the range of sizes observed, $98,248.33 is the portion of the house price not explained by square feet. QMIS 220, by Dr. M. Zainal Chap 13-34

Interpretation of the Slope Coefficient, b_1 house price = 98.24833 + 0.10977 (square feet). b_1 measures the estimated change in the average value of y as a result of a one-unit change in x. Here, b_1 = .10977 tells us that the average value of a house increases by .10977($1000) = $109.77, on average, for each additional one square foot of size. QMIS 220, by Dr. M. Zainal Chap 13-35

Least Squares Regression Properties The sum of the residuals from the least squares regression line is 0: $\sum (y - \hat{y}) = 0$. The sum of the squared residuals is a minimum: $\sum (y - \hat{y})^2$ is minimized. The simple regression line always passes through the mean of the y variable and the mean of the x variable. The least squares coefficients are unbiased estimates of β_0 and β_1. QMIS 220, by Dr. M. Zainal Chap 13-36

Explained and Unexplained Variation Total variation is made up of two parts: SST = SSR + SSE (Total Sum of Squares = Sum of Squares Regression + Sum of Squares Error), where
$SST = \sum (y - \bar{y})^2$,  $SSE = \sum (y - \hat{y})^2$,  $SSR = \sum (\hat{y} - \bar{y})^2$
and: ȳ = average value of the dependent variable, y = observed values of the dependent variable, ŷ = estimated value of y for the given x value. Chap 13-37

Explained and Unexplained Variation SST = total sum of squares Measures the variation of the y i values around their mean y SSE = error sum of squares Variation attributable to factors other than the relationship between x and y SSR = regression sum of squares Explained variation attributable to the relationship between x and y (continued) QMIS 220, by Dr. M. Zainal Chap 13-38

Explained and Unexplained Variation (continued) [Graph showing, for an observation (x_i, y_i) and the fitted line: SST = Σ(y_i − ȳ)², SSE = Σ(y_i − ŷ_i)², SSR = Σ(ŷ_i − ȳ)²] QMIS 220, by Dr. M. Zainal Chap 13-39

Coefficient of Determination, R² The coefficient of determination is the portion of the total variation in the dependent variable that is explained by variation in the independent variable. The coefficient of determination is also called R-squared and is denoted as R²:
$R^2 = \dfrac{SSR}{SST}$,  where $0 \le R^2 \le 1$
QMIS 220, by Dr. M. Zainal Chap 13-40

Coefficient of Determination, R² (continued)
$R^2 = \dfrac{SSR}{SST} = \dfrac{\text{sum of squares explained by regression}}{\text{total sum of squares}}$
Note: In the single independent variable case, the coefficient of determination is $R^2 = r^2$, where R² = coefficient of determination and r = simple correlation coefficient. QMIS 220, by Dr. M. Zainal Chap 13-41
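
A brief Python sketch (an addition) of the SST = SSR + SSE decomposition and of the R² = r² identity for the house data; numpy.polyfit is used here simply as a stand-in for the least-squares formulas given earlier.

```python
# Decompose total variation for the house price model and check R^2 = r^2.
import numpy as np

sqft  = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700])
price = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255])

b1, b0 = np.polyfit(sqft, price, 1)           # slope, intercept
y_hat = b0 + b1 * sqft

sst = np.sum((price - price.mean()) ** 2)     # ~32600.5
ssr = np.sum((y_hat - price.mean()) ** 2)     # ~18934.93
sse = np.sum((price - y_hat) ** 2)            # ~13665.57

r = np.corrcoef(sqft, price)[0, 1]            # ~0.76211
print(round(ssr / sst, 5), round(r ** 2, 5))  # both ~0.58082
```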

Examples of Approximate R² Values [Two scatter plots with every point exactly on the regression line] R² = 1: perfect linear relationship between x and y; 100% of the variation in y is explained by variation in x. QMIS 220, by Dr. M. Zainal Chap 13-42

Examples of Approximate R² Values (continued) [Two scatter plots with points scattered around the regression line] 0 < R² < 1: weaker linear relationship between x and y; some but not all of the variation in y is explained by variation in x. QMIS 220, by Dr. M. Zainal Chap 13-43

Examples of Approximate R² Values (continued) [Scatter plot with a horizontal fitted line] R² = 0: no linear relationship between x and y; the value of y does not depend on x (none of the variation in y is explained by variation in x). QMIS 220, by Dr. M. Zainal Chap 13-44

Excel Output (continued) From the regression output shown earlier: R Square = 0.58082, so about 58.08% of the variation in house prices is explained by variation in square feet. QMIS 220, by Dr. M. Zainal Chap 13-45

Test for Significance of Coefficient of Determination Hypotheses: H_0: ρ² = 0 (the independent variable does not explain a significant portion of the variation in the dependent variable) vs. H_A: ρ² ≠ 0 (the independent variable does explain a significant portion of the variation in the dependent variable). Test statistic:
$F = \dfrac{SSR/1}{SSE/(n-2)}$ (with D_1 = 1 and D_2 = n − 2 degrees of freedom)
QMIS 220, by Dr. M. Zainal Chap 13-46

Excel Output (continued) From the ANOVA portion of the regression output shown earlier: F = 18934.9348 / 1708.1957 = 11.0848 with Significance F (p-value) = 0.01039, so H_0: ρ² = 0 is rejected at the .05 level. QMIS 220, by Dr. M. Zainal Chap 13-47

Standard Error of Estimate The standard deviation of the variation of observations around the simple regression line is estimated by
$s_\varepsilon = \sqrt{\dfrac{SSE}{n-2}}$
where SSE = sum of squares error and n = sample size. QMIS 220, by Dr. M. Zainal Chap 13-48

The Standard Deviation of the Regression Slope The standard error of the regression slope coefficient (b_1) is estimated by
$s_{b_1} = \dfrac{s_\varepsilon}{\sqrt{\sum (x - \bar{x})^2}} = \dfrac{s_\varepsilon}{\sqrt{\sum x^2 - \dfrac{(\sum x)^2}{n}}}$
where s_{b_1} = estimate of the standard error of the least squares slope and $s_\varepsilon = \sqrt{SSE/(n-2)}$ = sample standard error of the estimate. QMIS 220, by Dr. M. Zainal Chap 13-49
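
A small Python sketch (not from the slides) computing both standard errors for the house example. SSE and n come from the Excel output above; Σ(x − x̄)² is computed here from the ten square-footage observations.

```python
# Standard error of the estimate and of the slope for the house model.
import math

sqft = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
sse, n = 13665.5652, len(sqft)

x_bar = sum(sqft) / n
ss_xx = sum((xi - x_bar) ** 2 for xi in sqft)   # sum of (x - x_bar)^2

s_e  = math.sqrt(sse / (n - 2))      # ~41.33032 ("Standard Error" in Excel)
s_b1 = s_e / math.sqrt(ss_xx)        # ~0.03297  (std. error of Square Feet)
print(round(s_e, 5), round(s_b1, 5))
```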

Excel Output (continued) From the regression output shown earlier: the standard error of the estimate is s_ε = 41.33032 ("Standard Error" under Regression Statistics), and the standard error of the slope is s_b1 = 0.03297 (Standard Error for Square Feet). QMIS 220, by Dr. M. Zainal Chap 13-50

Comparing Standard Errors [Four panels: small s_ε vs. large s_ε show less vs. more variation of observed y values from the regression line; small s_b1 vs. large s_b1 show less vs. more variation in the slope of regression lines from different possible samples] QMIS 220, by Dr. M. Zainal Chap 13-51

Inference about the Slope: t Test t test for a population slope: Is there a linear relationship between x and y? Null and alternative hypotheses: H_0: β_1 = 0 (no linear relationship) vs. H_A: β_1 ≠ 0 (linear relationship does exist). Test statistic:
$t = \dfrac{b_1 - \beta_1}{s_{b_1}}$,  d.f. = n − 2
where b_1 = sample regression slope coefficient, β_1 = hypothesized slope, s_{b_1} = estimator of the standard error of the slope. QMIS 220, by Dr. M. Zainal Chap 13-52

Inference about the Slope: t Test (continued) Using the house price data shown earlier, the estimated regression equation is: house price = 98.25 + 0.1098 (sq. ft.). The slope of this model is 0.1098. Does square footage of the house affect its sales price? QMIS 220, by Dr. M. Zainal Chap 13-53

Inferences about the Slope: t Test Example H_0: β_1 = 0, H_A: β_1 ≠ 0, d.f. = 10 − 2 = 8. From the Excel output, for Square Feet: Coefficient = 0.10977, Standard Error = 0.03297, t Stat = 3.32938, P-value = 0.01039. With α/2 = .025, the critical values are ±2.3060. Since t = 3.32938 > 2.3060, Decision: Reject H_0. Conclusion: There is sufficient evidence that square footage affects house price. QMIS 220, by Dr. M. Zainal Chap 13-54
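
A tiny Python sketch (an addition) of this slope t test, taking b_1, its standard error, and the critical value 2.3060 directly from the output and slide above.

```python
# t test for H0: beta1 = 0 in the house price model.
b1, s_b1, t_crit = 0.10977, 0.03297, 2.3060
t_stat = (b1 - 0) / s_b1
print(round(t_stat, 3))          # ~3.329
print(abs(t_stat) > t_crit)      # True -> reject H0
```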

Regression Analysis for Description Confidence interval estimate of the slope:
$b_1 \pm t_{\alpha/2} s_{b_1}$,  d.f. = n − 2
Excel printout for house prices:
              Coefficients   Standard Error   t Stat    P-value   Lower 95%   Upper 95%
Intercept     98.24833       58.03348         1.69296   0.12892   -35.57720   232.07386
Square Feet   0.10977        0.03297          3.32938   0.01039   0.03374     0.18580
QMIS 220, by Dr. M. Zainal Chap 13-55

Regression Analysis for Description For Square Feet, the 95% confidence interval for the slope is (0.03374, 0.18580). Since the units of the house price variable are $1000s, we are 95% confident that the average impact on sales price is between $33.74 and $185.80 per additional square foot of house size. This 95% confidence interval does not include 0. Conclusion: There is a significant relationship between house price and square feet at the .05 level of significance. QMIS 220, by Dr. M. Zainal Chap 13-56
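
The same interval can be reproduced with a couple of lines of Python (an addition, using the coefficient, standard error, and t value quoted above).

```python
# 95% confidence interval for the slope (Square Feet).
b1, s_b1, t_crit = 0.10977, 0.03297, 2.3060
lower, upper = b1 - t_crit * s_b1, b1 + t_crit * s_b1
print(round(lower, 5), round(upper, 5))   # ~0.03374, ~0.18580
```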

Confidence Interval for the Average y, Given x Confidence interval estimate for the mean of y given a particular x_p:
$\hat{y} \pm t_{\alpha/2}\, s_\varepsilon \sqrt{\dfrac{1}{n} + \dfrac{(x_p - \bar{x})^2}{\sum (x - \bar{x})^2}}$
The size of the interval varies according to the distance of x_p from the mean, x̄. QMIS 220, by Dr. M. Zainal Chap 13-57

Confidence Interval for an Individual y, Given x Confidence interval estimate for an individual value of y given a particular x_p:
$\hat{y} \pm t_{\alpha/2}\, s_\varepsilon \sqrt{1 + \dfrac{1}{n} + \dfrac{(x_p - \bar{x})^2}{\sum (x - \bar{x})^2}}$
QMIS 220, by Dr. M. Zainal Chap 13-58

Example: House Prices Using the house price data shown earlier, the estimated regression equation is: house price = 98.25 + 0.1098 (sq. ft.). Predict the price for a house with 2000 square feet. QMIS 220, by Dr. M. Zainal Chap 13-59

Example: House Prices (continued) Predict the price for a house with 2000 square feet:
house price = 98.25 + 0.1098 (sq. ft.) = 98.25 + 0.1098(2000) = 317.85
The predicted price for a house with 2000 square feet is 317.85 ($1000s) = $317,850. QMIS 220, by Dr. M. Zainal Chap 13-60
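
A minimal Python sketch (not part of the slides) of the point prediction together with the two interval formulas from the preceding slides, evaluated at x_p = 2000 square feet. It reuses the rounded coefficients 98.25 and 0.1098, s_ε = 41.33032, and the critical value 2.3060 with 8 d.f.; x̄ and Σ(x − x̄)² are computed from the sample.

```python
# Point prediction and intervals at x_p = 2000 square feet.
import math

sqft = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
b0, b1, s_e, t_crit, n = 98.25, 0.1098, 41.33032, 2.3060, len(sqft)

x_bar = sum(sqft) / n
ss_xx = sum((xi - x_bar) ** 2 for xi in sqft)
x_p = 2000
y_hat = b0 + b1 * x_p                                   # 317.85 ($1000s)

h = 1 / n + (x_p - x_bar) ** 2 / ss_xx
ci_mean  = (y_hat - t_crit * s_e * math.sqrt(h),
            y_hat + t_crit * s_e * math.sqrt(h))        # average y at x_p
ci_indiv = (y_hat - t_crit * s_e * math.sqrt(1 + h),
            y_hat + t_crit * s_e * math.sqrt(1 + h))    # individual y at x_p
print(y_hat, ci_mean, ci_indiv)
```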

Residual Analysis Purposes Examine for linearity assumption Examine for constant variance for all levels of x Evaluate normal distribution assumption Graphical Analysis of Residuals Can plot residuals vs. x Can create histogram of residuals to check for normality QMIS 220, by Dr. M. Zainal Chap 13-61

Residual Analysis for Linearity [Two pairs of plots: y vs. x with the fitted line and the corresponding residuals vs. x; one pair labeled Not Linear, the other labeled Linear] QMIS 220, by Dr. M. Zainal Chap 13-62

Residual Analysis for Constant Variance [Two pairs of plots: y vs. x with the fitted line and the corresponding residuals vs. x; one pair labeled Non-constant variance, the other labeled Constant variance] QMIS 220, by Dr. M. Zainal Chap 13-63

Excel Output RESIDUAL OUTPUT
      Predicted House Price    Residuals
1     251.92316                -6.923162
2     273.87671                38.12329
3     284.85348                -5.853484
4     304.06284                3.937162
5     218.99284                -19.99284
6     268.38832                -49.38832
7     356.20251                48.79749
8     367.17929                -43.17929
9     254.6674                 64.33264
10    284.85348                -29.85348
[House Price Model Residual Plot: residuals vs. Square Feet]
QMIS 220, by Dr. M. Zainal Chap 13-64
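
A short Python sketch (an addition) that regenerates these residuals and checks the earlier property that they sum to zero; numpy.polyfit stands in for the least-squares formulas.

```python
# Residuals for the house price model.
import numpy as np

sqft  = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700])
price = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255])

b1, b0 = np.polyfit(sqft, price, 1)
residuals = price - (b0 + b1 * sqft)    # matches the Residuals column above
print(np.round(residuals, 4))
print(round(residuals.sum(), 6))        # ~0
# A residual plot is simply residuals vs. sqft,
# e.g. matplotlib.pyplot.scatter(sqft, residuals).
```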

Chapter Summary Introduced correlation analysis Discussed correlation to measure the strength of a linear association Introduced simple linear regression analysis Calculated the coefficients for the simple linear regression equation Described measures of variation (R 2 and s ε ) Addressed assumptions of regression and correlation QMIS 220, by Dr. M. Zainal Chap 13-65

Chapter Summary (continued) Described inference about the slope Addressed estimation of mean values and prediction of individual values Discussed residual analysis QMIS 220, by Dr. M. Zainal Chap 13-66

Problems A random sample of eight drivers insured with a company and having similar auto insurance policies was selected. The following table lists their driving experiences (in years) and monthly auto insurance premiums.
Driving Experience (years)    Monthly Auto Insurance Premium
5      64
2      87
12     50
9      71
15     44
6      56
25     42
16     60
QMIS 220, by Dr. M. Zainal Chap 13-67

Problems a. Does the insurance premium depend on the driving experience or does the driving experience depend on the insurance premium? Do you expect a positive or a negative relationship between these two variables? QMIS 220, by Dr. M. Zainal Chap 13-68

Problems b. Compute SS xx, SS yy, and SS xy. QMIS 220, by Dr. M. Zainal Chap 13-69

Problems c. Find the least squares regression line by choosing appropriate dependent and independent variables based on your answer in part a. QMIS 220, by Dr. M. Zainal Chap 13-70

Problems d. Interpret the meaning of the values of a and b calculated in part c QMIS 220, by Dr. M. Zainal Chap 13-71

Problems e. Plot the scatter diagram and the regression line QMIS 220, by Dr. M. Zainal Chap 13-72

Problems f. Calculate r and r 2 and explain what they mean QMIS 220, by Dr. M. Zainal Chap 13-73

Problems g. Predict the monthly auto insurance premium for a driver with 10 years of driving experience QMIS 220, by Dr. M. Zainal Chap 13-74

Problems h. Compute the standard deviation of errors QMIS 220, by Dr. M. Zainal Chap 13-75

Problems i. Construct a 90% confidence interval for B. QMIS 220, by Dr. M. Zainal Chap 13-76

Problems j. Test at the 5% significance level whether B is negative. QMIS 220, by Dr. M. Zainal Chap 13-77

Problems k. Using α =.05, test whether ρ is different from zero. QMIS 220, by Dr. M. Zainal Chap 13-78

Copyright The materials of this presentation were mostly taken from the PowerPoint files that accompany Business Statistics: A Decision-Making Approach, 7e, 2008 Prentice-Hall, Inc. QMIS 220, by Dr. M. Zainal Chap 10-79