Chapter 14 Student Lecture Notes 14-1


Business Statistics: A Decision-Making Approach, 6th Edition
Chapter 14: Multiple Regression Analysis and Model Building

Chapter Goals
After completing this chapter, you should be able to:
- understand model building using multiple regression analysis
- apply multiple regression analysis to business decision-making situations
- analyze and interpret the computer output for a multiple regression model
- test the significance of the independent variables in a multiple regression model
- use variable transformations to model nonlinear relationships
- recognize potential problems in multiple regression analysis and take the steps to correct the problems
- incorporate qualitative variables into the regression model by using dummy variables

The Multiple Regression Model

Idea: examine the linear relationship between one dependent variable (y) and two or more independent variables (xi).

Population model:
y = β0 + β1x1 + β2x2 + ... + βk xk + ε
where β0 is the y-intercept, β1, ..., βk are the population slopes, and ε is the random error.

Estimated multiple regression model:
ŷ = b0 + b1x1 + b2x2 + ... + bk xk
where ŷ is the estimated (or predicted) value of y, b0 is the estimated intercept, and b1, ..., bk are the estimated slope coefficients.

Two-variable model:
ŷ = b0 + b1x1 + b2x2
The fitted surface is a plane with slope b1 in the x1 direction and slope b2 in the x2 direction. For a sample observation yi, the residual is e = (y - ŷ). The best-fit equation is found by minimizing the sum of squared errors, Σe².
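The least-squares estimation described above can be sketched in a few lines of Python. This is a minimal illustration on hypothetical two-predictor data (the variable names and values are made up for the example); numpy's `lstsq` performs the minimization of Σe²:

```python
import numpy as np

# Hypothetical sample data: y with two predictors x1, x2
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
y = np.array([5.1, 6.9, 11.2, 11.8, 15.9])

# Design matrix with a column of ones for the intercept b0
X = np.column_stack([np.ones_like(x1), x1, x2])

# b = (b0, b1, b2) minimizes the sum of squared errors sum(e^2)
b, *_ = np.linalg.lstsq(X, y, rcond=None)

y_hat = X @ b          # fitted (predicted) values
e = y - y_hat          # residuals
sse = np.sum(e**2)     # the minimized sum of squared errors
```

A useful property to notice: because the model includes an intercept, the residuals from a least-squares fit sum to zero.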

Multiple Regression Assumptions
Errors (residuals) from the regression model: e = (y - ŷ)
- The errors are normally distributed
- The mean of the errors is zero
- Errors have a constant variance
- The model errors are independent

Model Specification
- Decide what you want to do and select the dependent variable
- Determine the potential independent variables for your model
- Gather sample data (observations) for all variables

The Correlation Matrix
- Correlation between the dependent variable and selected independent variables can be found using Excel: Tools / Data Analysis / Correlation
- Can check for statistical significance of correlation with a t test

Example
A distributor of frozen dessert pies wants to evaluate factors thought to influence demand.
Dependent variable: pie sales (units per week)
Independent variables: price (in $), advertising ($100s)
Data are collected for 15 weeks.

Pie Sales Model

Week   Pie Sales   Price ($)   Advertising ($100s)
 1       350         5.50          3.3
 2       460         7.50          3.3
 3       350         8.00          3.0
 4       430         8.00          4.5
 5       350         6.80          3.0
 6       380         7.50          4.0
 7       430         4.50          3.0
 8       470         6.40          3.7
 9       450         7.00          3.5
10       490         5.00          4.0
11       340         7.20          3.5
12       300         7.90          3.2
13       440         5.90          4.0
14       450         5.00          3.5
15       300         7.00          2.7

Multiple regression model: Sales = b0 + b1(Price) + b2(Advertising)

Correlation matrix:
              Pie Sales    Price      Advertising
Pie Sales      1
Price         -0.44327     1
Advertising    0.55632     0.03044    1

Interpretation of Estimated Coefficients
Slope (bi): estimates that the average value of y changes by bi units for each 1-unit increase in xi, holding all other variables constant. Example: if b1 = -20, then sales (y) is expected to decrease by an estimated 20 pies per week for each $1 increase in selling price (x1), net of the effects of changes due to advertising (x2).
y-intercept (b0): the estimated average value of y when all xi = 0 (assuming all xi = 0 is within the range of observed values).
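The correlation matrix above can be reproduced without Excel. A sketch with numpy on the same 15 weeks of pie-sales data:

```python
import numpy as np

sales = np.array([350, 460, 350, 430, 350, 380, 430, 470,
                  450, 490, 340, 300, 440, 450, 300], dtype=float)
price = np.array([5.50, 7.50, 8.00, 8.00, 6.80, 7.50, 4.50, 6.40,
                  7.00, 5.00, 7.20, 7.90, 5.90, 5.00, 7.00])
advert = np.array([3.3, 3.3, 3.0, 4.5, 3.0, 4.0, 3.0, 3.7,
                   3.5, 4.0, 3.5, 3.2, 4.0, 3.5, 2.7])

# Rows/columns in order: sales, price, advertising
corr = np.corrcoef([sales, price, advert])

r_sales_price = corr[0, 1]   # about -0.443 (negative association)
r_sales_adv = corr[0, 2]     # about  0.556 (positive association)
r_price_adv = corr[1, 2]     # about  0.030 (predictors nearly uncorrelated)
```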

Pie Sales Correlation Matrix

              Pie Sales    Price      Advertising
Pie Sales      1
Price         -0.44327     1
Advertising    0.55632     0.03044    1

- Price vs. Sales: r = -0.44327. There is a negative association between price and sales.
- Advertising vs. Sales: r = 0.55632. There is a positive association between advertising and sales.

Scatter Diagrams
[Scatter plots of Sales vs. Price and Sales vs. Advertising]

Estimating a Multiple Linear Regression Equation
Computer software is generally used to generate the coefficients and measures of goodness of fit for multiple regression:
- Excel: Tools / Data Analysis... / Regression
- PHStat: PHStat / Regression / Multiple Regression

Multiple Regression Output

Regression Statistics
Multiple R             0.72213
R Square               0.52148
Adjusted R Square      0.44172
Standard Error        47.46341
Observations          15

ANOVA
             df      SS          MS          F         Significance F
Regression    2    29460.027   14730.013    6.53861    0.01201
Residual     12    27033.306    2252.776
Total        14    56493.333

              Coefficients   Standard Error   t Stat     P-value   Lower 95%    Upper 95%
Intercept      306.52619      114.25389       2.68285    0.01993    57.58835    555.46404
Price          -24.97509       10.83213      -2.30565    0.03979   -48.57626     -1.37392
Advertising     74.13096       25.96732       2.85478    0.01449    17.55303    130.70888

The Multiple Regression Equation
Sales = 306.526 - 24.975(Price) + 74.131(Advertising)
where Sales is in number of pies per week, Price is in $, and Advertising is in $100s.
- b1 = -24.975: sales will decrease, on average, by 24.975 pies per week for each $1 increase in selling price, net of the effects of changes due to advertising
- b2 = 74.131: sales will increase, on average, by 74.131 pies per week for each $100 increase in advertising, net of the effects of changes due to price

Using the Model to Make Predictions
Predict sales for a week in which the selling price is $5.50 and advertising is $350:
Sales = 306.526 - 24.975(5.50) + 74.131(3.5) = 428.62
Predicted sales is 428.62 pies. Note that advertising is in $100s, so $350 means that x2 = 3.5.
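The coefficients and the prediction on this slide can be checked with a short least-squares fit in Python (same 15 weeks of data as above):

```python
import numpy as np

# Pie sales data from the example (15 weeks)
sales = np.array([350, 460, 350, 430, 350, 380, 430, 470,
                  450, 490, 340, 300, 440, 450, 300], dtype=float)
price = np.array([5.50, 7.50, 8.00, 8.00, 6.80, 7.50, 4.50, 6.40,
                  7.00, 5.00, 7.20, 7.90, 5.90, 5.00, 7.00])
advert = np.array([3.3, 3.3, 3.0, 4.5, 3.0, 4.0, 3.0, 3.7,
                   3.5, 4.0, 3.5, 3.2, 4.0, 3.5, 2.7])

X = np.column_stack([np.ones(15), price, advert])
b, *_ = np.linalg.lstsq(X, sales, rcond=None)
b0, b1, b2 = b   # about 306.526, -24.975, 74.131

# Predict sales at price $5.50 and advertising $350 (x2 = 3.5)
pred = b0 + b1 * 5.50 + b2 * 3.5   # about 428.62 pies
```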

Predictions in PHStat
- PHStat / Regression / Multiple Regression
- Check the "confidence and prediction interval estimates" box
- After entering the input values, the output shows the predicted y value, a confidence interval for the mean y value given these x's, and a prediction interval for an individual y value given these x's

Multiple Coefficient of Determination
Reports the proportion of total variation in y explained by all x variables taken together:
R² = SSR / SST = (sum of squares regression) / (total sum of squares)

Multiple Coefficient of Determination (continued)
From the regression output:
R² = SSR / SST = 29460.0 / 56493.3 = 0.52148
52.1% of the variation in pie sales is explained by the variation in price and advertising.

Adjusted R²
- R² never decreases when a new x variable is added to the model; this can be a disadvantage when comparing models
- What is the net effect of adding a new variable? We lose a degree of freedom when a new x variable is added. Did the new x variable add enough explanatory power to offset the loss of one degree of freedom?
- Adjusted R² shows the proportion of variation in y explained by all x variables, adjusted for the number of x variables used:
R²_A = 1 - (1 - R²)((n - 1) / (n - k - 1))
(where n = sample size, k = number of independent variables)
- Penalizes excessive use of unimportant independent variables
- Smaller than R²
- Useful in comparing models
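The two formulas above can be verified directly from the ANOVA table values:

```python
# From the pie sales ANOVA table
ssr = 29460.027   # sum of squares regression
sst = 56493.333   # total sum of squares
n, k = 15, 2      # sample size, number of independent variables

r2 = ssr / sst                                  # about 0.52148
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)   # about 0.44172
```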

Adjusted R² (continued)
From the regression output, R²_A = 0.44172:
44.2% of the variation in pie sales is explained by the variation in price and advertising, taking into account the sample size and number of independent variables.

Is the Model Significant?
F-Test for Overall Significance of the Model
- Shows if there is a linear relationship between all of the x variables considered together and y
- Use the F test statistic
- Hypotheses:
  H0: β1 = β2 = ... = βk = 0 (no linear relationship)
  HA: at least one βi ≠ 0 (at least one independent variable affects y)

Test statistic:
F = (SSR / k) / (SSE / (n - k - 1)) = MSR / MSE
where F has D1 = k (numerator) and D2 = (n - k - 1) (denominator) degrees of freedom.

F-Test for Overall Significance (continued)
From the regression output:
F = MSR / MSE = 14730.0 / 2252.8 = 6.5386, with 2 and 12 degrees of freedom
P-value for the F-test: 0.01201

H0: β1 = β2 = 0; HA: β1 and β2 not both zero; α = 0.05; df1 = 2, df2 = 12
Critical value: F.05 = 3.885
Test statistic: F = MSR / MSE = 6.5386, which falls in the rejection region
Decision: reject H0 at α = 0.05
Conclusion: the regression model does explain a significant portion of the variation in pie sales (there is evidence that at least one independent variable affects y)

Are Individual Variables Significant?
- Use t-tests of individual variable slopes
- Shows if there is a linear relationship between the variable xi and y
- Hypotheses: H0: βi = 0 (no linear relationship); HA: βi ≠ 0 (a linear relationship does exist between xi and y)
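The F-test on this slide can be reproduced numerically. A sketch assuming scipy is available (`stats.f.sf` gives the upper-tail p-value, `stats.f.ppf` the critical value):

```python
from scipy import stats

# From the pie sales ANOVA table
msr = 29460.027 / 2    # mean square regression, k = 2
mse = 27033.306 / 12   # mean square error, n - k - 1 = 12

f_stat = msr / mse                    # about 6.5386
p_value = stats.f.sf(f_stat, 2, 12)   # about 0.012
f_crit = stats.f.ppf(0.95, 2, 12)     # about 3.885

reject_h0 = f_stat > f_crit           # True: the model is significant
```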

Are Individual Variables Significant? (continued)
Test statistic:
t = bi / s_bi   (df = n - k - 1)

From the Excel output:
- t-value for Price is t = -2.306, with p-value 0.0398
- t-value for Advertising is t = 2.855, with p-value 0.0145

Inferences about the Slope: t-Test Example
H0: βi = 0; HA: βi ≠ 0
d.f. = 15 - 2 - 1 = 12, α = 0.05, tα/2 = 2.1788

              Coefficients   Standard Error   t Stat     P-value
Price          -24.97509      10.83213       -2.30565    0.03979
Advertising     74.13096      25.96732        2.85478    0.01449

The test statistic for each variable falls in the rejection region (p-values < 0.05).
Decision: reject H0 for each variable.
Conclusion: there is evidence that both price and advertising affect pie sales at α = 0.05.

Confidence Interval Estimate for the Slope
Confidence interval for the population slope β1 (the effect of changes in price on pie sales):
bi ± tα/2 · s_bi,  where t has (n - k - 1) d.f.

              Coefficients   Standard Error   Lower 95%    Upper 95%
Intercept      306.52619      114.25389        57.58835    555.46404
Price          -24.97509       10.83213       -48.57626     -1.37392
Advertising     74.13096       25.96732        17.55303    130.70888

Example: weekly sales are estimated to be reduced by between 1.37 and 48.58 pies for each $1 increase in the selling price.

Standard Deviation of the Regression Model
The estimate of the standard deviation of the regression model is:
s_ε = sqrt(SSE / (n - k - 1)) = sqrt(MSE)
Is this value large or small? It must be compared to the mean size of y.
From the regression output, the standard deviation of the regression model is 47.46.
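The confidence interval for the price slope works out as follows (the critical value 2.1788 is t.025 with 12 d.f., read from a t table):

```python
# Price coefficient and its standard error from the pie sales output
b1 = -24.97509
s_b1 = 10.83213

# Two-tailed 95% critical value for n - k - 1 = 12 degrees of freedom
t_crit = 2.1788

lower = b1 - t_crit * s_b1   # about -48.58
upper = b1 + t_crit * s_b1   # about -1.37
```

Because zero lies outside the interval, the test H0: β1 = 0 is rejected at α = 0.05, consistent with the t-test on the previous slide.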

Standard Deviation of the Regression Model (continued)
- The standard deviation of the regression model is 47.46
- A rough prediction range for pie sales in a given week is ±2(47.46) = ±94.2
- Pie sales in the sample were in the 300 to 500 per week range, so this range is probably too large to be acceptable. The analyst may want to look for additional variables that can explain more of the variation in weekly sales.

Multicollinearity
- Multicollinearity: high correlation exists between two independent variables
- This means the two variables contribute redundant information to the multiple regression model
- Including two highly correlated independent variables can adversely affect the regression results:
  - No new information is provided
  - Can lead to unstable coefficients (large standard errors and low t-values)
  - Coefficient signs may not match prior expectations

Some Indications of Severe Multicollinearity
- Incorrect signs on the coefficients
- Large change in the value of a previous coefficient when a new variable is added to the model
- A previously significant variable becomes insignificant when a new independent variable is added
- The estimate of the standard deviation of the model increases when a variable is added to the model

Detecting Collinearity (Variance Inflationary Factor)
VIFj is used to measure collinearity:
VIFj = 1 / (1 - R²j)
where R²j is the coefficient of determination when the jth independent variable is regressed against the remaining k - 1 independent variables.
If VIFj > 5, xj is highly correlated with the other explanatory variables.

Detect Collinearity in PHStat
PHStat / Regression / Multiple Regression; check the "variance inflationary factor (VIF)" box.

Output for the pie sales example (Price and all other X):
Regression Statistics
Multiple R             0.030438
R Square               0.000926
Adjusted R Square     -0.075954
Standard Error         1.215735
Observations          15
VIF                    1.000927

Since there are only two explanatory variables, only one VIF is reported. The VIF is < 5, so there is no evidence of collinearity between Price and Advertising.
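With only two predictors, R²j for the auxiliary regression is just the squared correlation between them, so the VIF is easy to check by hand:

```python
import numpy as np

price = np.array([5.50, 7.50, 8.00, 8.00, 6.80, 7.50, 4.50, 6.40,
                  7.00, 5.00, 7.20, 7.90, 5.90, 5.00, 7.00])
advert = np.array([3.3, 3.3, 3.0, 4.5, 3.0, 4.0, 3.0, 3.7,
                   3.5, 4.0, 3.5, 3.2, 4.0, 3.5, 2.7])

# R^2 from regressing Price on the only other predictor, Advertising;
# with a single regressor this equals the squared correlation
r2_j = np.corrcoef(price, advert)[0, 1] ** 2

vif = 1 / (1 - r2_j)   # about 1.0009, far below 5: no collinearity concern
```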

Qualitative (Dummy) Variables
- A categorical explanatory variable (dummy variable) has two or more levels: yes or no, on or off, male or female; coded as 0 or 1
- Regression intercepts are different if the variable is significant
- Assumes equal slopes for the other variables
- The number of dummy variables needed is (number of levels - 1)

Dummy-Variable Model Example (with 2 Levels)
Let:
y = pie sales
x1 = price
x2 = holiday (x2 = 1 if a holiday occurred during the week; x2 = 0 if there was no holiday that week)
ŷ = b0 + b1x1 + b2x2

Holiday:    ŷ = b0 + b1x1 + b2(1) = (b0 + b2) + b1x1
No holiday: ŷ = b0 + b1x1 + b2(0) = b0 + b1x1

The two lines have different intercepts (b0 + b2 vs. b0) but the same slope b1. If H0: β2 = 0 is rejected, then "holiday" has a significant effect on pie sales.

Interpretation of the Dummy-Variable Coefficient (with 2 Levels)
Example: Sales = 300 - 30(Price) + 15(Holiday)
Sales: number of pies sold per week
Price: pie price in $
Holiday: 1 if a holiday occurred during the week, 0 if no holiday occurred
b2 = 15: on average, sales were 15 pies greater in weeks with a holiday than in weeks without a holiday, given the same price.

Dummy-Variable Models (more than 2 Levels)
- The number of dummy variables is one less than the number of levels
- Example: y = house price; x1 = square feet
- The style of the house is also thought to matter: style = ranch, split level, condo
- Three levels, so two dummy variables are needed

Let the default category be "condo":
x2 = 1 if ranch, 0 if not
x3 = 1 if split level, 0 if not
ŷ = b0 + b1x1 + b2x2 + b3x3
b2 shows the impact on price if the house is a ranch style, compared to a condo; b3 shows the impact on price if the house is a split-level style, compared to a condo.
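The 2-level dummy interpretation can be made concrete by evaluating the slide's estimated equation at the same price with and without a holiday:

```python
# Estimated model from the slide: Sales = 300 - 30*Price + 15*Holiday
def predicted_sales(price, holiday):
    """holiday = 1 if a holiday occurred during the week, else 0."""
    return 300 - 30 * price + 15 * holiday

# Same price, holiday vs. no holiday: intercepts differ, slope is shared,
# so the difference is exactly the dummy coefficient b2 = 15
diff = predicted_sales(6.0, 1) - predicted_sales(6.0, 0)   # 15 pies
```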

Interpreting the Dummy-Variable Coefficients (with 3 Levels)
Suppose the estimated equation is
ŷ = 20.43 + 0.045x1 + 23.53x2 + 18.84x3
For a condo: x2 = x3 = 0, so ŷ = 20.43 + 0.045x1
For a ranch: x3 = 0, so ŷ = 20.43 + 0.045x1 + 23.53
For a split level: x2 = 0, so ŷ = 20.43 + 0.045x1 + 18.84
With the same square feet, a split level will have an estimated average price of 18.84 thousand dollars more than a condo, and a ranch will have an estimated average price of 23.53 thousand dollars more than a condo.

Nonlinear Relationships
- The relationship between the dependent variable and an independent variable may not be linear
- Useful when a scatter diagram indicates a nonlinear relationship
- Example: quadratic model
y = β0 + β1x + β2x² + ε
The second independent variable is the square of the first variable.

Polynomial Regression Model
General form:
y = β0 + β1x + β2x² + ... + βp x^p + ε
where:
β0 = population regression constant
βj = population regression coefficient for term j: j = 1, 2, ..., p
p = order of the polynomial
ε = model error
If p = 2 the model is a quadratic model: y = β0 + β1x + β2x² + ε

Linear vs. Nonlinear Fit
[Figures: fitting a line to curved data leaves patterned residuals; a nonlinear fit gives random residuals.] A linear fit to a nonlinear relationship does not give random residuals; a nonlinear fit gives random residuals.

Quadratic Regression Model
y = β0 + β1x + β2x² + ε
Quadratic models may be considered when the scatter diagram takes on a curved shape. β1 is the coefficient of the linear term and β2 is the coefficient of the squared term; the four typical shapes correspond to the sign combinations of β1 (< 0 or > 0) and β2 (> 0 or < 0).

Testing for Significance: Quadratic Model
Test for overall relationship: F test statistic = MSR / MSE
Testing the quadratic effect: compare the quadratic model
y = β0 + β1x + β2x² + ε
with the linear model
y = β0 + β1x + ε
Hypotheses:
H0: β2 = 0 (no 2nd-order polynomial term)
HA: β2 ≠ 0 (2nd-order polynomial term is needed)
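Fitting a quadratic model is ordinary least squares with x² treated as a second "independent variable". A minimal sketch on hypothetical noise-free data generated from a known quadratic, so the coefficients are recovered exactly:

```python
import numpy as np

# Hypothetical data generated from y = 2 + 3x - 0.5x^2
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = 2 + 3 * x - 0.5 * x**2

# Fit y = b0 + b1*x + b2*x^2 by least squares:
# the squared term is just another column in the design matrix
X = np.column_stack([np.ones_like(x), x, x**2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2 = b   # recovers 2, 3, -0.5 (the data are noise-free)
```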

Higher-Order Models
If p = 3 the model is a cubic form:
y = β0 + β1x + β2x² + β3x³ + ε

Interaction Effects
- Hypothesizes interaction between pairs of x variables: the response to one x variable varies at different levels of another x variable
- Contains two-way cross-product terms, e.g.:
y = β0 + β1x1 + β2x2 + β3x3 + β4x1x2 + β5x1x3 + ε
(the first three are the basic terms; the cross-products are the interactive terms)

Effect of Interaction
Given y = β0 + β1x1 + β2x2 + β3x1x2 + ε:
- Without the interaction term, the effect of x1 on y is measured by β1
- With the interaction term, the effect of x1 on y is measured by β1 + β3x2: the effect changes as x2 increases

Interaction Example
Suppose y = 1 + 2x1 + 3x2 + 4x1x2, where x2 = 0 or 1 (a dummy variable).
If x2 = 1: y = 1 + 2x1 + 3(1) + 4x1(1) = 4 + 6x1
If x2 = 0: y = 1 + 2x1 + 3(0) + 4x1(0) = 1 + 2x1
The effect (slope) of x1 on y does depend on the x2 value.

Interaction Regression Model Worksheet
Case i   yi   x1i   x2i   x1i·x2i
1         1    1     3       3
2         4    8     5      40
3         1    3     2       6
4         3    5     6      30
Multiply x1 by x2 to get x1x2, then run the regression with y, x1, x2, x1x2.

Evaluating Presence of Interaction
Hypothesize interaction between pairs of independent variables:
y = β0 + β1x1 + β2x2 + β3x1x2 + ε
Hypotheses:
H0: β3 = 0 (no interaction between x1 and x2)
HA: β3 ≠ 0 (x1 interacts with x2)
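The slope change in the interaction example can be verified by evaluating the model at the two dummy levels:

```python
# Model from the slide: y = 1 + 2*x1 + 3*x2 + 4*x1*x2, x2 a 0/1 dummy
def y(x1, x2):
    return 1 + 2 * x1 + 3 * x2 + 4 * x1 * x2

# Slope of x1 when x2 = 0: beta1 = 2
slope_x2_0 = y(1, 0) - y(0, 0)
# Slope of x1 when x2 = 1: beta1 + beta3 = 2 + 4 = 6
slope_x2_1 = y(1, 1) - y(0, 1)
```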

Model Building
- The goal is to develop a model with the best set of independent variables: easier to interpret if unimportant variables are removed, and lower probability of collinearity
- Stepwise regression procedure: provides evaluation of alternative models as variables are added
- Best-subset approach: try all combinations and select the best using the highest adjusted R² and lowest s_ε

Stepwise Regression
- Idea: develop the least squares regression equation in steps, either through forward selection, backward elimination, or standard stepwise regression
- The coefficient of partial determination is the measure of the marginal contribution of each independent variable, given that other independent variables are in the model

Best Subsets Regression
- Idea: estimate all possible regression equations using all possible combinations of independent variables
- Choose the best fit by looking for the highest adjusted R² and lowest standard error s_ε
- Stepwise regression and best subsets regression can be performed using PHStat, Minitab, or other statistical software packages

Aptness of the Model
Diagnostic checks on the model include verifying the assumptions of multiple regression:
- Each xi is linearly related to y
- Errors have constant variance
- Errors are independent
- Errors are normally distributed
Errors (or residuals) are given by ei = (y - ŷ)

Residual Analysis
[Residual plots: non-constant vs. constant variance; dependent vs. independent errors]
Plot the residuals against each x: a fanning pattern indicates non-constant variance, while a patternless band indicates constant variance; a trend or cycle in the residuals indicates dependence, while a random scatter indicates independence.

The Normality Assumption
- Errors are assumed to be normally distributed
- Standardized residuals can be calculated by computer
- Examine a histogram or a normal probability plot of the standardized residuals to check for normality
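A minimal residual-diagnostics sketch for the pie sales fit (the scaling below is a rough standardization by the residual standard deviation, not the full hat-matrix version statistical packages use):

```python
import numpy as np

# Pie sales data and fit, as in the earlier example
sales = np.array([350, 460, 350, 430, 350, 380, 430, 470,
                  450, 490, 340, 300, 440, 450, 300], dtype=float)
price = np.array([5.50, 7.50, 8.00, 8.00, 6.80, 7.50, 4.50, 6.40,
                  7.00, 5.00, 7.20, 7.90, 5.90, 5.00, 7.00])
advert = np.array([3.3, 3.3, 3.0, 4.5, 3.0, 4.0, 3.0, 3.7,
                   3.5, 4.0, 3.5, 3.2, 4.0, 3.5, 2.7])

X = np.column_stack([np.ones(15), price, advert])
b, *_ = np.linalg.lstsq(X, sales, rcond=None)
e = sales - X @ b            # residuals e_i = y_i - y_hat_i

mean_e = e.mean()            # essentially zero by construction
s_e = np.sqrt(e @ e / 12)    # s_epsilon with n - k - 1 = 12 d.f.
std_resid = e / s_e          # roughly standardized residuals
```

In practice one would plot `std_resid` against each x and against the fitted values, plus a histogram or normal probability plot, to check the four assumptions listed above.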

Chapter Summary
- Developed the multiple regression model
- Tested the significance of the multiple regression model
- Developed adjusted R²
- Tested individual regression coefficients
- Used dummy variables
- Examined interaction in a multiple regression model
- Described nonlinear regression models
- Described multicollinearity
- Discussed model building: stepwise regression and best subsets regression
- Examined residual plots to check model assumptions