Simple Regression
Correlation Analysis
Correlation analysis is used to measure the strength of the association (linear relationship) between two variables. Correlation is concerned only with the strength of the relationship; no causal effect is implied by correlation. Correlation was first presented in Chapter 3.
Correlation Analysis
The population correlation coefficient is denoted ρ (the Greek letter rho). The sample correlation coefficient is
r = s_xy / (s_x s_y)
where
s_xy = Σ(x_i − x̄)(y_i − ȳ) / (n − 1)
Hypothesis Test for Correlation
To test the null hypothesis of no linear association, H0: ρ = 0, the test statistic follows the Student's t distribution with (n − 2) degrees of freedom:
t = r √(n − 2) / √(1 − r²)
Decision Rules: Hypothesis Test for Correlation
Lower-tail test: H0: ρ ≥ 0, H1: ρ < 0. Reject H0 if t < −t(n−2, α)
Upper-tail test: H0: ρ ≤ 0, H1: ρ > 0. Reject H0 if t > t(n−2, α)
Two-tail test: H0: ρ = 0, H1: ρ ≠ 0. Reject H0 if t < −t(n−2, α/2) or t > t(n−2, α/2)
where t = r √(n − 2) / √(1 − r²) has n − 2 d.f.
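The slides carry out these tests by hand or in Excel; as an illustration only, the sample correlation and its t statistic can be sketched in Python (the data set here is hypothetical, not from the chapter):

```python
import math

def corr_t_test(x, y):
    """Sample correlation r and the test statistic t = r*sqrt(n-2)/sqrt(1-r^2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    r = sxy / math.sqrt(sxx * syy)   # the (n-1) divisors cancel in the ratio
    t = r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)
    return r, t

# hypothetical illustration data
r, t = corr_t_test([1, 2, 3, 4, 5], [2, 4, 5, 4, 5])
# r ≈ 0.7746, t ≈ 2.1213 with n - 2 = 3 d.f.
```

The resulting t is then compared to the t(n−2) critical value from the decision rule above.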
Introduction to Regression Analysis
Regression analysis is used to:
predict the value of a dependent variable based on the value of at least one independent variable, and
explain the impact of changes in an independent variable on the dependent variable.
Dependent variable: the variable we wish to explain (also called the endogenous variable)
Independent variable: the variable used to explain the dependent variable (also called the exogenous variable)
Linear Regression Model
The relationship between X and Y is described by a linear function. Changes in Y are assumed to be caused by changes in X. The linear regression population equation model is
Y_i = β0 + β1 x_i + ε_i
where β0 and β1 are the population model coefficients and ε_i is a random error term.
Simple Linear Regression Model
The population regression model:
Y_i = β0 + β1 X_i + ε_i
where Y_i is the dependent variable, β0 the population Y intercept, β1 the population slope coefficient, X_i the independent variable, and ε_i the random error term. β0 + β1 X_i is the linear component and ε_i the random error component.
Simple Linear Regression Model (continued)
[Figure: graph of Y_i = β0 + β1 X_i + ε_i showing, for a given X_i, the observed value of Y, the predicted value of Y, and the random error ε_i for that X_i; the line has slope β1 and intercept β0.]
Simple Linear Regression Equation
The simple linear regression equation provides an estimate of the population regression line:
ŷ_i = b0 + b1 x_i
where ŷ_i is the estimated (or predicted) y value for observation i, b0 the estimate of the regression intercept, b1 the estimate of the regression slope, and x_i the value of x for observation i. The individual random error terms e_i have a mean of zero:
e_i = (y_i − ŷ_i) = y_i − (b0 + b1 x_i)
Least Squares Estimators
b0 and b1 are obtained by finding the values of b0 and b1 that minimize the sum of the squared differences between y and ŷ:
min SSE = min Σ e_i² = min Σ (y_i − ŷ_i)² = min Σ [y_i − (b0 + b1 x_i)]²
Differential calculus is used to obtain the coefficient estimators b0 and b1 that minimize SSE
Least Squares Estimators (continued)
The slope coefficient estimator is
b1 = Σ(x_i − x̄)(y_i − ȳ) / Σ(x_i − x̄)² = r_xy (s_Y / s_X)
and the constant or y-intercept is
b0 = ȳ − b1 x̄
The regression line always goes through the mean (x̄, ȳ)
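The two estimator formulas above translate directly into a few lines of code. This is a minimal sketch on a tiny hypothetical data set (Python and these variable names are not part of the slides):

```python
def least_squares(x, y):
    """Slope b1 = Σ(xi - x̄)(yi - ȳ) / Σ(xi - x̄)² and intercept b0 = ȳ - b1*x̄."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b1 = sxy / sxx
    b0 = my - b1 * mx
    return b0, b1

# tiny hypothetical data set
b0, b1 = least_squares([1, 2, 3, 4], [2, 3, 5, 6])
# b1 = 1.4, b0 = 0.5; the fitted line passes through (x̄, ȳ) = (2.5, 4)
```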
Finding the Least Squares Equation The coefficients b 0 and b 1, and other regression results in this chapter, will be found using a computer Hand calculations are tedious Statistical routines are built into Excel Other statistical analysis software can be used
Linear Regression Model Assumptions
The true relationship form is linear (Y is a linear function of X, plus random error)
The error terms, ε_i, are independent of the x values
The error terms are random variables with mean 0 and constant variance, σ² (the constant variance property is called homoscedasticity):
E[ε_i] = 0 and E[ε_i²] = σ² for (i = 1, …, n)
The random error terms, ε_i, are not correlated with one another, so that
E[ε_i ε_j] = 0 for all i ≠ j
Interpretation of the Slope and the Intercept
b0 is the estimated average value of y when the value of x is zero (if x = 0 is in the range of observed x values)
b1 is the estimated change in the average value of y as a result of a one-unit change in x
Simple Linear Regression Example A real estate agent wishes to examine the relationship between the selling price of a home and its size (measured in square feet) A random sample of 10 houses is selected Dependent variable (Y) = house price in $1000s Independent variable (X) = square feet
Sample Data for House Price Model
House Price in $1000s (Y) | Square Feet (X)
245  | 1400
312  | 1600
279  | 1700
308  | 1875
199  | 1100
219  | 1550
405  | 2350
324  | 2450
319  | 1425
255  | 1700
Graphical Presentation
[Figure: house price model scatter plot, house price ($1000s, 0 to 450) versus square feet (0 to 3000).]
Regression Using Excel Tools / Data Analysis / Regression
Excel Output
Regression Statistics
Multiple R          0.76211
R Square            0.58082
Adjusted R Square   0.52842
Standard Error      41.33032
Observations        10

The regression equation is: house price = 98.24833 + 0.10977 (square feet)

ANOVA        df   SS          MS          F        Significance F
Regression   1    18934.9348  18934.9348  11.0848  0.01039
Residual     8    13665.5652  1708.1957
Total        9    32600.5000

             Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
Intercept    98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
Square Feet  0.10977       0.03297         3.32938  0.01039  0.03374    0.18580
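The slides obtain this output from Excel's regression tool. As a cross-check, the key numbers can be reproduced from the chapter's own formulas in a short Python sketch (Python and the variable names are mine, not part of the slides):

```python
import math

# house-price data from the slides: price in $1000s (y) vs. square feet (x)
sqft  = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
price = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]

n = len(sqft)
mx, my = sum(sqft) / n, sum(price) / n
sxx = sum((x - mx) ** 2 for x in sqft)
sxy = sum((x - mx) * (y - my) for x, y in zip(sqft, price))

b1 = sxy / sxx                    # slope ≈ 0.10977
b0 = my - b1 * mx                 # intercept ≈ 98.24833
sst = sum((y - my) ** 2 for y in price)                           # 32600.5
sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(sqft, price))  # ≈ 13665.5652
ssr = sst - sse                   # ≈ 18934.9348
r2 = ssr / sst                    # R² ≈ 0.58082
se = math.sqrt(sse / (n - 2))     # standard error of the estimate ≈ 41.33032
f = (ssr / 1) / (sse / (n - 2))   # F ≈ 11.0848
```

These values match the Excel printout line for line.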
Graphical Presentation
House price model: scatter plot and regression line, with intercept = 98.248 and slope = 0.10977:
house price = 98.24833 + 0.10977 (square feet)
[Figure: scatter plot of house price ($1000s) versus square feet with the fitted line.]
Interpretation of the Intercept, b0
house price = 98.24833 + 0.10977 (square feet)
b0 is the estimated average value of Y when the value of X is zero (if X = 0 is in the range of observed X values). Here, no houses had 0 square feet, so b0 = 98.24833 just indicates that, for houses within the range of sizes observed, $98,248.33 is the portion of the house price not explained by square feet
Interpretation of the Slope Coefficient, b1
house price = 98.24833 + 0.10977 (square feet)
b1 measures the estimated change in the average value of Y as a result of a one-unit change in X. Here, b1 = .10977 tells us that the average value of a house increases by .10977($1000) = $109.77, on average, for each additional one square foot of size
Measures of Variation
Total variation is made up of two parts:
SST = SSR + SSE
where SST is the total sum of squares, SSR the regression sum of squares, and SSE the error sum of squares:
SST = Σ(y_i − ȳ)²    SSR = Σ(ŷ_i − ȳ)²    SSE = Σ(y_i − ŷ_i)²
where:
ȳ = average value of the dependent variable
y_i = observed values of the dependent variable
ŷ_i = predicted value of y for the given x_i value
Measures of Variation (continued) SST = total sum of squares Measures the variation of the y i values around their mean, y SSR = regression sum of squares Explained variation attributable to the linear relationship between x and y SSE = error sum of squares Variation attributable to factors other than the linear relationship between x and y
Measures of Variation (continued)
[Figure: for an observation (x_i, y_i), the total deviation y_i − ȳ (whose squares sum to SST) splits into y_i − ŷ_i (whose squares sum to SSE) and ŷ_i − ȳ (whose squares sum to SSR).]
Coefficient of Determination, R²
The coefficient of determination is the portion of the total variation in the dependent variable that is explained by variation in the independent variable. The coefficient of determination is also called R-squared and is denoted as R²:
R² = SSR / SST = regression sum of squares / total sum of squares
note: 0 ≤ R² ≤ 1
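The decomposition SST = SSR + SSE and the ratio R² = SSR/SST can be verified numerically. A minimal sketch on a small hypothetical data set (not from the chapter):

```python
# hypothetical data; fit ŷ = b0 + b1*x, then decompose the total variation
x = [1, 2, 3, 4]
y = [2, 3, 5, 6]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
b1 = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
b0 = my - b1 * mx
yhat = [b0 + b1 * xi for xi in x]

sst = sum((yi - my) ** 2 for yi in y)                  # total variation
ssr = sum((yh - my) ** 2 for yh in yhat)               # explained by the line
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))   # unexplained
r2 = ssr / sst
# here SST = 10, SSR = 9.8, SSE = 0.2, so R² = 0.98 and SST = SSR + SSE
```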
Examples of Approximate r² Values
r² = 1: Perfect linear relationship between X and Y; 100% of the variation in Y is explained by variation in X
Examples of Approximate r² Values (continued)
0 < r² < 1: Weaker linear relationships between X and Y; some but not all of the variation in Y is explained by variation in X
Examples of Approximate r² Values (continued)
r² = 0: No linear relationship between X and Y; the value of Y does not depend on X (none of the variation in Y is explained by variation in X)
Excel Output (continued)
From the ANOVA table above:
R² = SSR / SST = 18934.9348 / 32600.5000 = 0.58082
58.08% of the variation in house prices is explained by variation in square feet
Correlation and R²
The coefficient of determination, R², for a simple regression is equal to the simple correlation squared:
R² = r²_xy
Estimation of Model Error Variance
An estimator for the variance of the population model error is
σ̂² = s_e² = Σ e_i² / (n − 2) = SSE / (n − 2)
Division by n − 2 instead of n − 1 is because the simple regression model uses two estimated parameters, b0 and b1, instead of one.
s_e = √(s_e²) is called the standard error of the estimate
Excel Output (continued)
s_e = 41.33032 (the Standard Error in the Regression Statistics block)
Comparing Standard Errors
s_e is a measure of the variation of observed y values from the regression line.
[Figure: two scatter plots, one with small s_e (points tight around the line) and one with large s_e (points widely scattered around the line).]
The magnitude of s_e should always be judged relative to the size of the y values in the sample data; i.e., s_e = $41.33K is moderately small relative to house prices in the $200K - $300K range
Inferences About the Regression Model
The variance of the regression slope coefficient (b1) is estimated by
s²_b1 = s_e² / Σ(x_i − x̄)² = s_e² / ((n − 1) s_x²)
where:
s_b1 = estimate of the standard error of the least squares slope
s_e = √(SSE / (n − 2)) = standard error of the estimate
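Applying these two formulas to the chapter's house-price data reproduces the standard errors shown in the Excel output. A sketch in Python (not part of the slides):

```python
import math

# house-price data from the slides
sqft  = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
price = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]
n = len(sqft)
mx, my = sum(sqft) / n, sum(price) / n
sxx = sum((x - mx) ** 2 for x in sqft)
b1 = sum((x - mx) * (y - my) for x, y in zip(sqft, price)) / sxx
b0 = my - b1 * mx

sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(sqft, price))
se = math.sqrt(sse / (n - 2))   # standard error of the estimate
sb1 = se / math.sqrt(sxx)       # standard error of the slope
# se ≈ 41.33032 and sb1 ≈ 0.03297, matching the Excel output
```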
Excel Output (continued)
s_b1 = 0.03297 (the standard error of the Square Feet coefficient)
Comparing Standard Errors of the Slope
s_b1 is a measure of the variation in the slope of regression lines from different possible samples.
[Figure: two scatter plots, one where regression lines from different samples vary little (small s_b1) and one where they vary widely (large s_b1).]
Inference about the Slope: t Test
t test for a population slope: is there a linear relationship between X and Y?
Null and alternative hypotheses:
H0: β1 = 0 (no linear relationship)
H1: β1 ≠ 0 (linear relationship does exist)
Test statistic:
t = (b1 − β1) / s_b1, with d.f. = n − 2
where:
b1 = regression slope coefficient
β1 = hypothesized slope
s_b1 = standard error of the slope
Inference about the Slope: t Test (continued)
House Price in $1000s (y): 245, 312, 279, 308, 199, 219, 405, 324, 319, 255
Square Feet (x): 1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700
Estimated regression equation:
house price = 98.25 + 0.1098 (sq.ft.)
The slope of this model is 0.1098. Does square footage of the house affect its sales price?
Inferences about the Slope: t Test Example
H0: β1 = 0
H1: β1 ≠ 0
From Excel output:
             Coefficients  Standard Error  t Stat   P-value
Intercept    98.24833      58.03348        1.69296  0.12892
Square Feet  0.10977       0.03297         3.32938  0.01039
t = (b1 − β1) / s_b1 = (0.10977 − 0) / 0.03297 = 3.32938
Inferences about the Slope: t Test Example (continued)
H0: β1 = 0
H1: β1 ≠ 0
d.f. = 10 − 2 = 8, α/2 = .025, t(8, .025) = 2.3060
Decision rule: reject H0 if t < −t(n−2, α/2) = −2.3060 or t > t(n−2, α/2) = 2.3060
Test statistic: t = 3.329
Decision: Reject H0
Conclusion: There is sufficient evidence that square footage affects house price
Inferences about the Slope: t Test Example (continued)
H0: β1 = 0
H1: β1 ≠ 0
From Excel output, P-value = 0.01039:
             Coefficients  Standard Error  t Stat   P-value
Intercept    98.24833      58.03348        1.69296  0.12892
Square Feet  0.10977       0.03297         3.32938  0.01039
This is a two-tail test, so the p-value is P(t > 3.329) + P(t < −3.329) = 0.01039 (for 8 d.f.)
Decision: P-value < α, so reject H0
Conclusion: There is sufficient evidence that square footage affects house price
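The critical-value version of this test is simple to sketch in code. The slope, its standard error, and the critical value t(8, .025) = 2.3060 are taken from the slides (Python itself is not used in the chapter):

```python
# t statistic for H0: β1 = 0, using the full-precision slope and standard
# error from the chapter's Excel output
b1, sb1 = 0.10977, 0.03297
t_crit = 2.3060                 # t(8, .025), from the slides' t table
t_stat = (b1 - 0) / sb1
reject_h0 = abs(t_stat) > t_crit
# t_stat ≈ 3.329 > 2.3060, so H0 is rejected: square footage matters
```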
Confidence Interval Estimate for the Slope
Confidence interval estimate of the slope:
b1 − t(n−2, α/2) s_b1 < β1 < b1 + t(n−2, α/2) s_b1    (d.f. = n − 2)
Excel printout for house prices:
             Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
Intercept    98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
Square Feet  0.10977       0.03297         3.32938  0.01039  0.03374    0.18580
At the 95% level of confidence, the confidence interval for the slope is (0.0337, 0.1858)
Confidence Interval Estimate for the Slope (continued)
             Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
Intercept    98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
Square Feet  0.10977       0.03297         3.32938  0.01039  0.03374    0.18580
Since the units of the house price variable are $1000s, we are 95% confident that the average impact on sales price is between $33.74 and $185.80 per square foot of house size.
This 95% confidence interval does not include 0.
Conclusion: There is a significant relationship between house price and square feet at the .05 level of significance
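The interval endpoints in the printout follow directly from the formula b1 ± t(n−2, α/2) s_b1. A quick check in Python, using the slides' values (not part of the chapter's Excel workflow):

```python
# 95% confidence interval for the slope: b1 ± t(n-2, α/2) * s_b1,
# with b1 and s_b1 from the Excel output and t(8, .025) = 2.3060
b1, sb1, t_crit = 0.10977, 0.03297, 2.3060
lower = b1 - t_crit * sb1
upper = b1 + t_crit * sb1
# (lower, upper) ≈ (0.0337, 0.1858), matching Lower 95% / Upper 95%
```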
F-Test for Significance
F test statistic:
F = MSR / MSE
where MSR = SSR / k and MSE = SSE / (n − k − 1)
F follows an F distribution with k numerator and (n − k − 1) denominator degrees of freedom (k = the number of independent variables in the regression model)
Excel Output (continued)
F = MSR / MSE = 18934.9348 / 1708.1957 = 11.0848, with 1 and 8 degrees of freedom
ANOVA        df   SS          MS          F        Significance F
Regression   1    18934.9348  18934.9348  11.0848  0.01039
Residual     8    13665.5652  1708.1957
Total        9    32600.5000
Significance F is the p-value for the F-test
F-Test for Significance (continued)
H0: β1 = 0
H1: β1 ≠ 0
α = .05, df1 = 1, df2 = 8
Critical value: F(1, 8, .05) = 5.32
Test statistic: F = MSR / MSE = 11.08
Decision: Reject H0 at α = 0.05 (since 11.08 > 5.32)
Conclusion: There is sufficient evidence that house size affects selling price
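The F statistic can be recomputed from the ANOVA sums of squares. A sketch using the slides' values, with the critical value F(1, 8, .05) = 5.32 taken from the slides (Python is my addition):

```python
# F = MSR/MSE with the sums of squares from the chapter's ANOVA table
ssr, sse, n, k = 18934.9348, 13665.5652, 10, 1
msr = ssr / k
mse = sse / (n - k - 1)
f_stat = msr / mse
reject_h0 = f_stat > 5.32       # F(1, 8, .05) from the slides
# f_stat ≈ 11.0848; in simple regression, F equals the slope t statistic squared
```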
Prediction
The regression equation can be used to predict a value for y, given a particular x. For a specified value, x_(n+1), the predicted value is
ŷ_(n+1) = b0 + b1 x_(n+1)
Predictions Using Regression Analysis
Predict the price for a house with 2,000 square feet:
house price = 98.25 + 0.1098 (sq.ft.) = 98.25 + 0.1098(2000) = 317.85
The predicted price for a house with 2,000 square feet is 317.85($1,000s) = $317,850
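The same prediction in code, a one-line application of ŷ = b0 + b1·x. Note that the slides round the coefficients to 98.25 and 0.1098 before multiplying, which gives 317.85; with the full-precision coefficients from the Excel output the prediction is about 317.79:

```python
# predict the price of a 2,000 sq. ft. house with the estimated equation,
# using the full-precision coefficients from the Excel output
b0, b1 = 98.24833, 0.10977
pred = b0 + b1 * 2000   # predicted price in $1000s, ≈ 317.79
```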
Relevant Data Range
When using a regression model for prediction, only predict within the relevant range of data. It is risky to try to extrapolate far beyond the range of observed X's.
[Figure: house price ($1000s) vs. square feet scatter plot with the relevant data range marked.]
Estimating Mean Values and Predicting Individual Values
Goal: form intervals around ŷ to express uncertainty about the value of y for a given x_i:
a confidence interval for the expected value of y, given x_i, and
a prediction interval for a single observed y, given x_i.
[Figure: regression line ŷ = b0 + b1 x, showing at x_i the confidence interval for the mean of y and the wider prediction interval for an individual y.]
Confidence Interval for the Average Y, Given X
Confidence interval estimate for the expected value of y given a particular x_i.
Confidence interval for E(Y_(n+1) | X_(n+1)):
ŷ_(n+1) ± t(n−2, α/2) s_e √( 1/n + (x_(n+1) − x̄)² / Σ(x_i − x̄)² )
Notice that the formula involves the term (x_(n+1) − x̄)², so the size of the interval varies according to the distance x_(n+1) is from the mean, x̄
Prediction Interval for an Individual Y, Given X
Confidence interval estimate for an actual observed value of y given a particular x_i.
Confidence interval for ŷ_(n+1):
ŷ_(n+1) ± t(n−2, α/2) s_e √( 1 + 1/n + (x_(n+1) − x̄)² / Σ(x_i − x̄)² )
The extra term (the added 1 under the square root) widens the interval to reflect the added uncertainty for an individual case
Estimation of Mean Values: Example
Confidence interval estimate for E(Y_(n+1) | X_(n+1)): find the 95% confidence interval for the mean price of 2,000 square-foot houses.
Predicted price ŷ_i = 317.85 ($1,000s)
ŷ_(n+1) ± t(n−2, α/2) s_e √( 1/n + (x_i − x̄)² / Σ(x_i − x̄)² ) = 317.85 ± 37.12
The confidence interval endpoints are 280.66 and 354.90, or from $280,660 to $354,900
Estimation of Individual Values: Example
Confidence interval estimate for y_(n+1): find the 95% confidence interval for an individual house with 2,000 square feet.
Predicted price ŷ_i = 317.85 ($1,000s)
ŷ_(n+1) ± t(n−2, α/2) s_e √( 1 + 1/n + (x_i − x̄)² / Σ(x_i − x̄)² ) = 317.85 ± 102.28
The confidence interval endpoints are 215.50 and 420.07, or from $215,500 to $420,070
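Both interval widths above can be reproduced from the data with the two formulas, which differ only in the extra 1 under the square root. A Python sketch using the chapter's data and the tabled critical value t(8, .025) = 2.3060 (the code and names are mine):

```python
import math

# house-price data from the slides
sqft  = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
price = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]
n = len(sqft)
mx, my = sum(sqft) / n, sum(price) / n
sxx = sum((x - mx) ** 2 for x in sqft)
b1 = sum((x - mx) * (y - my) for x, y in zip(sqft, price)) / sxx
b0 = my - b1 * mx
se = math.sqrt(sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(sqft, price)) / (n - 2))

x_new = 2000
pred = b0 + b1 * x_new                     # ≈ 317.78 ($1000s)
t_crit = 2.3060                            # t(8, .025) from the slides
h = 1 / n + (x_new - mx) ** 2 / sxx        # distance-from-mean term
ci_margin = t_crit * se * math.sqrt(h)     # interval for the mean of Y: ≈ 37.12
pi_margin = t_crit * se * math.sqrt(1 + h) # interval for an individual Y: ≈ 102.28
```

The prediction interval is much wider than the confidence interval for the mean, exactly as the two formulas imply.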
Finding Confidence and Prediction Intervals in Excel
In Excel, use PHStat | regression | simple linear regression. Check the "confidence and prediction interval for x =" box and enter the x-value and confidence level desired
Finding Confidence and Prediction Intervals in Excel (continued)
[Figure: PHStat output showing the input values, the confidence interval estimate for E(Y_(n+1) | X_(n+1)), and the confidence interval estimate for the individual y_(n+1).]