Chapter 15 Multiple Regression
- Lester Gilbert
Learning Objectives

1. Understand how multiple regression analysis can be used to develop relationships involving one dependent variable and several independent variables.
2. Be able to interpret the coefficients in a multiple regression analysis.
3. Know the assumptions necessary to conduct statistical tests involving the hypothesized regression model.
4. Understand the role of computer packages in performing multiple regression analysis.
5. Be able to interpret and use computer output to develop the estimated regression equation.
6. Be able to determine how good a fit is provided by the estimated regression equation.
7. Be able to test for the significance of the regression equation.
8. Understand how multicollinearity affects multiple regression analysis.
9. Know how residual analysis can be used to make a judgement as to the appropriateness of the model, identify outliers, and determine which observations are influential.
Solutions:

1. a. b1 = .5906 is an estimate of the change in y corresponding to a 1 unit change in x1 when x2 is held constant. b2 = .4980 is an estimate of the change in y corresponding to a 1 unit change in x2 when x1 is held constant.

2. a. The estimated regression equation is ŷ = x1. An estimate of y when x1 = 45 is ŷ = (45) =
b. The estimated regression equation is ŷ = x2. An estimate of y when x2 = 15 is ŷ = (15) =
c. The estimated regression equation is ŷ = x1 x2. An estimate of y when x1 = 45 and x2 = 15 is ŷ = (45) (15) =

3. a. b1 = 3.8 is an estimate of the change in y corresponding to a 1 unit change in x1 when x2, x3, and x4 are held constant. b2 = 2.3 is an estimate of the change in y corresponding to a 1 unit change in x2 when x1, x3, and x4 are held constant. b3 = 7.6 is an estimate of the change in y corresponding to a 1 unit change in x3 when x1, x2, and x4 are held constant. b4 = 2.7 is an estimate of the change in y corresponding to a 1 unit change in x4 when x1, x2, and x3 are held constant.

4. a. ŷ = (15) + 8(10) = 255; sales estimate: $255,000
b. Sales can be expected to increase by $10 for every dollar increase in inventory investment when advertising expenditure is held constant. Sales can be expected to increase by $8 for every dollar increase in advertising expenditure when inventory investment is held constant.
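The coefficient interpretations in Exercises 1–4 presuppose least-squares estimates. As an illustrative sketch (with made-up data, not the textbook's data sets), the estimates b0, b1, b2 can be computed directly from the normal equations:

```python
import numpy as np

# Hypothetical data: response y with two predictors x1, x2.
# First column of 1s corresponds to the intercept b0.
X = np.array([
    [1, 30, 12], [1, 47, 10], [1, 25, 17], [1, 51, 16], [1, 40, 5],
    [1, 51, 19], [1, 74, 7], [1, 36, 12], [1, 59, 13], [1, 76, 16],
], dtype=float)
y = np.array([94, 108, 112, 178, 94, 175, 170, 117, 142, 211], dtype=float)

# Least-squares estimates solve min ||y - Xb||^2
b, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2 = b
print(b0, b1, b2)
```

Each slope then carries the interpretation used above: b1 estimates the change in y per one-unit change in x1 with x2 held constant, and symmetrically for b2.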
5. a. The Minitab output is shown below:
Revenue = TVAdv
Constant TVAdv
S = RSq = 65.3% RSq(adj) = 59.5%
Regression Residual Error Total
b. The Minitab output is shown below:
Revenue = TVAdv NewsAdv
Constant TVAdv NewsAdv
S = RSq = 91.9% RSq(adj) = 88.7%
Regression Residual Error Total
Source DF Seq SS TVAdv NewsAdv
c. No, it is 1.60 in part (a) and 2.99 above. In this exercise it represents the marginal change in revenue due to an increase in television advertising with newspaper advertising held constant.
d. Revenue = (3.5) (1.8) = $93.56 or $93,

6. a. The Minitab output is shown below:
Speed = Weight
Constant Weight
S = RSq = 31.1% RSq(adj) = 26.2%
Regression Error Total
b. The Minitab output is shown below:
Speed = Weight Horsepwr
Constant Weight Horsepwr
S = RSq = 88.0% RSq(adj) = 86.2%
Regression Residual Error Total

7. a. The Minitab output is shown below:
Sales = Compet$ Heller$
Constant Compet$ Heller$
S = RSq = 65.3% RSq(adj) = 55.4%
Regression Residual Error Total
b. b1 = .414 is an estimate of the change in the quantity sold (1000s) of the Heller mower with respect to a $1 change in the price of the competitor's mower with the price of the Heller mower held constant. b2 = .270 is an estimate of the change in the quantity sold (1000s) of the Heller mower with respect to a $1 change in its price with the price of the competitor's mower held constant.
c. ŷ = (170) 0.270(160) = or 93,680 units

8. a. The Minitab output is shown below:
Return = Safety ExpRatio
Constant Safety ExpRatio
S = RSq = 58.2% RSq(adj) = 53.3%
Regression Residual Error Total
b. ŷ = (7.5) (2) =

9. a. The Minitab output is shown below:
%College = Size SatScore
Constant Size SatScore
S = RSq = 38.2% RSq(adj) = 30.0%
Regression Residual Error Total
b. ŷ = (20) (1000) = 73.8. Estimate is 73.8%.

10. a. The Minitab output is shown below:
Revenue = Cars
Constant Cars
S = RSq = 92.5% RSq(adj) = 91.9%
Regression Error Total
b. An increase of 1000 cars in service will result in an increase in revenue of $7.98 million.
c. The Minitab output is shown below:
Revenue = Cars Location
Constant Cars Location
S = RSq = 94.2% RSq(adj) = 93.2%
Regression Error Total

11. a. SSE = SST − SSR =
b. R² = SSR/SST = .924
c. Ra² = 1 − (1 − R²)(n − 1)/(n − p − 1) = 1 − (1 − .924)(n − 1)/(n − p − 1) = .902
d. The estimated regression equation provided an excellent fit.

12. a. R² = SSR/SST = 14,052.2/ = .926
b. Ra² = 1 − (1 − R²)(n − 1)/(n − p − 1) = 1 − (1 − .926)(n − 1)/(n − p − 1) = .905
c. Yes; after adjusting for the number of independent variables in the model, we see that 90.5% of the variability in y has been accounted for.

13. a. R² = SSR/SST = /1805 = .975
b. Ra² = 1 − (1 − R²)(n − 1)/(n − p − 1) = 1 − (1 − .975)(n − 1)/(n − p − 1) = .971
c. The estimated regression equation provided an excellent fit.

14. a. R² = SSR/SST = 12,000/16,000 = .75
b. Ra² = 1 − (1 − R²)(n − 1)/(n − p − 1) = 1 − (.25)(9/7) = .68
c. The adjusted coefficient of determination shows that 68% of the variability has been explained by the two independent variables; thus, we conclude that the model does not explain a large amount of variability.

15. a. R² = SSR/SST = .919
Ra² = 1 − (1 − .919)(n − 1)/(n − p − 1) = .887
b. Multiple regression analysis is preferred since both R² and Ra² show an increased percentage of the variability of y explained when both independent variables are used.

16. Note: the Minitab output is shown with the solution to Exercise 6.
a. No; RSq = 31.1%.
b. Multiple regression analysis is preferred since both RSq and RSq(adj) show an increased percentage of the variability of y explained when both independent variables are used.

17. a. R² = SSR/SST = .382
Ra² = 1 − (1 − .382)(n − 1)/(n − p − 1) = .30
b. The fit is not very good.

18. Note: The Minitab output is shown with the solution to Exercise 10.
a. RSq = 94.2%; RSq(adj) = 93.2%
b. The fit is very good.
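The fit and significance computations above can be sketched in a few lines. Exercises 14 and 22 share the same summary values (SST = 16,000, SSR = 12,000, n = 10, p = 2), so both the coefficient of determination and the F test can be illustrated together; scipy is assumed only for the F critical value:

```python
from scipy import stats

# Summary values from Exercises 14 and 22
sst, ssr = 16_000.0, 12_000.0
n, p = 10, 2
sse = sst - ssr

r2 = ssr / sst                                  # R^2 = .75
r2_adj = 1 - (1 - r2) * (n - 1) / (n - p - 1)   # adjusted R^2, about .68

msr = ssr / p                                   # mean square due to regression
mse = sse / (n - p - 1)                         # mean square error
f = msr / mse                                   # F = 10.5
f_crit = stats.f.ppf(0.95, p, n - p - 1)        # F.05 with (2, 7) df, about 4.74
reject = f > f_crit                             # True: relationship is significant
print(round(r2, 2), round(r2_adj, 2), round(f, 2), round(f_crit, 2), reject)
```

Since F = 10.5 exceeds the critical value, H0: β1 = β2 = 0 is rejected, matching the conclusion in Exercise 22.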
19. a. MSR = SSR/p = 6, /2 = 3,
MSE = SSE/(n − p − 1) =
b. F = MSR/MSE =
F.05 = 4.74 (2 degrees of freedom numerator and 7 denominator)
Since F > F.05 = 4.74, the overall model is significant.
c. t = .5906/.0813 = 7.26
t.025 = 2.365 (7 degrees of freedom)
Since t = 7.26 > t.025 = 2.365, β1 is significant.
d. t = .4980/.0567 = 8.78
Since t = 8.78 > t.025 = 2.365, β2 is significant.

20. A portion of the Minitab output is shown below.
Y = X X2
Constant X X
S = RSq = 92.6% RSq(adj) = 90.4%
Regression Residual Error Total
a. Since the p-value corresponding to F is .000 < α = .05, we reject H0: β1 = β2 = 0; there is a significant relationship.
b. Since the p-value corresponding to t = 8.13 is .000 < α = .05, we reject H0: β1 = 0; β1 is significant.
c. Since the p-value corresponding to t = 5.00 is .002 < α = .05, we reject H0: β2 = 0; β2 is significant.

21. a. In the two independent variable case the coefficient of x1 represents the expected change in y corresponding to a one unit increase in x1 when x2 is held constant. In the single independent variable case the coefficient of x1 represents the expected change in y corresponding to a one unit increase in x1.
b. Yes. If x1 and x2 are correlated one would expect a change in x1 to be accompanied by a change in x2.
22. a. SSE = SST − SSR =
s² = SSE/(n − p − 1) = 4000/7
MSR = SSR/p = /2 = 6000
b. F = MSR/MSE = 6000/ =
F.05 = 4.74 (2 degrees of freedom numerator and 7 denominator)
Since F > F.05 = 4.74, we reject H0. There is a significant relationship among the variables.

23. a. F =
F.01 = (2 degrees of freedom numerator and 1 denominator)
Since F > F.01 = 13.27, reject H0. Alternatively, the p-value of .002 leads to the same conclusion.
b. t = 7.53; t.025 = 2.571
Since t = 7.53 > t.025 = 2.571, β1 is significant and x1 should not be dropped from the model.
c. t = 4.06; t.025 = 2.571
Since t = 4.06 > t.025 = 2.571, β2 is significant and x2 should not be dropped from the model.

24. Note: The Minitab output is shown in part (b) of Exercise 6.
a. F =
F.05 = 3.81 (2 degrees of freedom numerator and 13 denominator)
Since F > F.05 = 3.81, we reject H0: β1 = β2 = 0. Alternatively, since the p-value = .000 < α = .05 we can reject H0.
b. For Weight: H0: β1 = 0; Ha: β1 ≠ 0
Since the p-value > α = 0.05, we cannot reject H0.
For Horsepower: H0: β2 = 0; Ha: β2 ≠ 0
Since the p-value < α = 0.05, we can reject H0.

25. a. The Minitab output is shown below:
P/E = Profit% Sales%
Constant Profit% Sales%
S = RSq = 47.2% RSq(adj) = 39.0%
Regression Residual Error Total
b. Since the p-value < α = 0.05, there is a significant relationship among the variables.
c. For Profit%: Since the p-value < α = 0.05, Profit% is significant. For Sales%: Since the p-value > α = 0.05, Sales% is not significant.

26. Note: The Minitab output is shown with the solution to Exercise 10.
a. Since the p-value corresponding to F is < α = .05, there is a significant relationship among the variables.
b. For Cars: Since the p-value < α = 0.05, Cars is significant.
c. For Location: Since the p-value > α = 0.05, Location is not significant.

27. a. ŷ = (180) (310) =
b. The point estimate for an individual value is ŷ = , the same as the point estimate of the mean value.

28. a. Using Minitab, the 95% confidence interval is to
b. Using Minitab, the 95% prediction interval is to
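The Minitab Fit / 95% C.I. / 95% P.I. quantities used in Exercises 27–28 can be computed by hand. A minimal sketch with illustrative data (not the textbook's): the confidence interval bounds the mean response at x0, while the wider prediction interval bounds an individual y value.

```python
import numpy as np
from scipy import stats

# Illustrative simple regression data; first design column is the intercept.
X = np.array([[1, 2], [1, 4], [1, 5], [1, 7], [1, 8], [1, 10]], dtype=float)
y = np.array([5.0, 9.0, 10.0, 14.0, 17.0, 20.0])
n, k = X.shape                                 # k = p + 1 parameters

b = np.linalg.solve(X.T @ X, X.T @ y)          # least-squares estimates
resid = y - X @ b
mse = resid @ resid / (n - k)                  # s^2
XtX_inv = np.linalg.inv(X.T @ X)

x0 = np.array([1, 6.0])                        # new point
yhat = x0 @ b                                  # point estimate (the "Fit")
t = stats.t.ppf(0.975, n - k)

se_mean = np.sqrt(mse * x0 @ XtX_inv @ x0)         # for the 95% C.I. (mean response)
se_pred = np.sqrt(mse * (1 + x0 @ XtX_inv @ x0))   # for the 95% P.I. (individual y)
print((yhat - t * se_mean, yhat + t * se_mean))
print((yhat - t * se_pred, yhat + t * se_pred))
```

Note the extra "1 +" in the prediction standard error: it accounts for the variability of an individual observation around its mean, which is why the prediction interval is always wider, as seen in Exercise 28.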
29. a. ŷ = (3.5) (1.8) = or $93,555
Note: In Exercise 5b, the Minitab output also shows that b0 = , b1 = , and b2 = ; hence, ŷ = x1 x2. Using this estimated regression equation, we obtain ŷ = (3.5) (1.8) = or $93,588. The difference ($93,588 − $93,555 = $33) is simply due to the fact that additional significant digits are used in the computations. From a practical point of view, however, the difference is not enough to be concerned about. In practice, a computer software package is always used to perform the computations and this will not be an issue.
The Minitab output is shown below:
Fit Stdev.Fit 95% C.I. 95% P.I. ( , ) ( , )
Note that the value of FIT (ŷ) is
b. Confidence interval estimate: to , or $92,840 to $94,335
c. Prediction interval estimate: to , or $91,774 to $95,

30. a. Since weight is not statistically significant (see Exercise 24), we will use an estimated regression equation which uses only Horsepower to predict the speed at 1/4 mile. The Minitab output is shown below:
Speed = Horsepwr
Constant Horsepwr
S = RSq = 87.3% RSq(adj) = 86.4%
Regression Residual Error Total
Unusual Observations
Obs Horsepwr Speed Fit SE Fit Residual St Resid
R X
R denotes an observation with a large standardized residual. X denotes an observation whose X value gives it large influence.
The output shows that the point estimate is a speed of miles per hour.
b. The 95% confidence interval is to miles per hour.
c. The 95% prediction interval is to miles per hour.

31. a. Using Minitab the 95% confidence interval is 58.37% to 75.03%.
b. Using Minitab the 95% prediction interval is 35.24% to 90.59%.

32. a. E(y) = β0 + β1x1 + β2x2, where x2 = 0 if level 1 and 1 if level 2
b. E(y) = β0 + β1x1 + β2(0) = β0 + β1x1
c. E(y) = β0 + β1x1 + β2(1) = β0 + β1x1 + β2
d. β2 = E(y | level 2) − E(y | level 1)

33. a. Two. β1 is the change in E(y) for a 1 unit change in x1 holding x2 and x3 constant.
b. E(y) = β0 + β1x1 + β2x2 + β3x3, where x2 = 1 if level 2 and 0 otherwise, and x3 = 1 if level 3 and 0 otherwise.
c. E(y | level 1) = β0 + β1x1 + β2(0) + β3(0) = β0 + β1x1
E(y | level 2) = β0 + β1x1 + β2(1) + β3(0) = β0 + β1x1 + β2
E(y | level 3) = β0 + β1x1 + β2(0) + β3(1) = β0 + β1x1 + β3
β2 = E(y | level 2) − E(y | level 1)
β3 = E(y | level 3) − E(y | level 1)
β1 is the change in E(y) for a 1 unit change in x1 holding x2 and x3 constant.

34. a. $15,300
b. Estimate of sales = (2) + 6.8(8) (0) = 56.1 or $56,100
c. Estimate of sales = (1) + 6.8(3) (1) = 41.6 or $41,
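The 0/1 coding in Exercises 32–33 can be sketched as follows. A k-level qualitative variable needs k − 1 indicator variables, with one level serving as the baseline (here the level labels are illustrative):

```python
import numpy as np

# Observed levels of a 3-level qualitative factor; level 1 is the baseline.
levels = np.array(["level1", "level2", "level3", "level2", "level1", "level3"])

x2 = (levels == "level2").astype(int)   # x2 = 1 for level 2, else 0
x3 = (levels == "level3").astype(int)   # x3 = 1 for level 3, else 0

# With these columns in the model E(y) = b0 + b1*x1 + b2*x2 + b3*x3:
#   level 1: E(y) = b0 + b1*x1          (x2 = x3 = 0)
#   level 2: E(y) = b0 + b1*x1 + b2     (x2 = 1)
#   level 3: E(y) = b0 + b1*x1 + b3     (x3 = 1)
print(np.column_stack([x2, x3]))
```

As in Exercise 33d's algebra, β2 and β3 then measure the shift in E(y) for levels 2 and 3 relative to the baseline level.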
35. a. Let Type = 0 if a mechanical repair and Type = 1 if an electrical repair. The Minitab output is shown below:
Time = Type
Constant Type
S = RSq = 8.7% RSq(adj) = 0.0%
Regression Residual Error Total
b. The estimated regression equation did not provide a good fit. In fact, the p-value of .408 shows that the relationship is not significant for any reasonable value of α.
c. Let Person = 0 if Bob Jones performed the service and Person = 1 if Dave Newton performed the service. The Minitab output is shown below:
Time = Person
Constant Person
S = RSq = 61.1% RSq(adj) = 56.2%
Regression Residual Error Total
d. We see that 61.1% of the variability in repair time has been explained by the repair person that performed the service; an acceptable, but not good, fit.

36. a. The Minitab output is shown below:
Time = Months Type Person
Constant Months Type Person
S = RSq = 90.0% RSq(adj) = 85.0%
Regression Residual Error Total
b. Since the p-value corresponding to F is .002 < α = .05, the overall model is statistically significant.
c. The p-value corresponding to t = 1.57 is .167 > α = .05; thus, the addition of Person is not statistically significant. Person is highly correlated with Months (the sample correlation coefficient is .691); thus, once the effect of Months has been accounted for, Person will not add much to the model.

37. a. Let Position = 0 if a guard and Position = 1 if an offensive tackle.
b. The Minitab output is shown below:
Rating = Position Weight 2.28 Speed
Constant Position Weight Speed
S = RSq = 47.5% RSq(adj) = 40.1%
Regression Residual Error Total
c. Since the p-value corresponding to F = 6.35 is .003 < α = .05, there is a significant relationship between rating and the independent variables.
d. The value of RSq(adj) is 40.1%; the estimated regression equation did not provide a very good fit.
e. Since the p-value corresponding to t = 2.53 is less than α = .05, Position is a significant factor in the player's rating.
f. ŷ = (1) (300) 2.28(5.1) =

38. a. The Minitab output is shown below:
Risk = Age Pressure Smoker
Constant Age Pressure Smoker
S = RSq = 87.3% RSq(adj) = 85.0%
Regression Residual Error Total
b. Since the p-value corresponding to t = 2.91 is .010 < α = .05, smoking is a significant factor.
c. Using Minitab, the point estimate is 34.27; the 95% prediction interval is to . Thus, the probability of a stroke (.2135 to .4718 at the 95% confidence level) appears to be quite high. The physician would probably recommend that Art quit smoking and begin some type of treatment designed to reduce his blood pressure.

39. a. The Minitab output is shown below:
Y = X
Constant X
S = RSq = 84.5% RSq(adj) = 79.3%
Regression
Residual Error Total
b. Using Minitab we obtained the following values:
xi yi ŷi Standardized Residual
The point (3, 5) does not appear to follow the trend of the remaining data; however, the value of the standardized residual for this point, 1.65, is not large enough for us to conclude that (3, 5) is an outlier.
c. Using Minitab, we obtained the following values:
xi yi Studentized Deleted Residual
t.025 = 4.303 (n − p − 2 = 2 degrees of freedom)
Since the studentized deleted residual for (3, 5) is −4.42 < −4.303, we conclude that the 3rd observation is an outlier.

40. a. The Minitab output is shown below:
Y = X
Predictor Coef Stdev t-ratio p
Constant X
S = RSq = 98.8% RSq(adj) = 98.3%
SOURCE DF SS MS F p
Regression Error Total
b. Using Minitab we obtained the following values:
xi yi Studentized Deleted Residual
t.025 = 4.303 (n − p − 2 = 2 degrees of freedom)
Since none of the studentized deleted residuals is less than −4.303 or greater than 4.303, none of the observations can be classified as an outlier.
c. Using Minitab we obtained the following values:
xi yi hi
The critical leverage value is 3(p + 1)/n = 3(1 + 1)/5 = 1.2
Since none of the values exceed 1.2, we conclude that there are no influential observations in the data.
d. Using Minitab we obtained the following values:
xi yi Di
Since D5 > 1 (rule of thumb critical value), we conclude that the fifth observation is influential.

41. a. The Minitab output appears in the solution to part (b) of Exercise 5; the estimated regression equation is: Revenue = TVAdv NewsAdv
b. Using Minitab we obtained the following values:
ŷi Standardized Residual
With relatively few observations, it is difficult to determine if any of the assumptions regarding the error term have been violated. For instance, an argument could be made that there does not appear to be any pattern in the plot; alternatively, an argument could be made that there is a curvilinear pattern in the plot.
c. The values of the standardized residuals are all greater than −2 and less than +2; thus, using this test, there are no outliers. As a further check for outliers, we used Minitab to compute the following studentized deleted residuals:
Observation Studentized Deleted Residual
t.025 = 2.776 (n − p − 2 = 4 degrees of freedom)
Since none of the studentized deleted residuals is less than −2.776 or greater than 2.776, we conclude that there are no outliers in the data.
d. Using Minitab we obtained the following values:
Observation hi Di
The critical leverage value is 3(p + 1)/n = 3(2 + 1)/8 = 1.125
Since none of the values exceed 1.125, we conclude that there are no influential observations.
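The leverage rule of thumb 3(p + 1)/n and Cook's distance used in Exercises 40–41 can be sketched as follows (illustrative single-predictor data; the textbook reads these values from Minitab). The leverage values are the diagonal of the hat matrix H = X(X′X)⁻¹X′:

```python
import numpy as np

# Illustrative data with one deliberately extreme x value (x = 20).
X = np.array([[1, 1], [1, 2], [1, 3], [1, 4], [1, 20]], dtype=float)
y = np.array([2.1, 3.9, 6.2, 8.0, 39.5])
n, k = X.shape                               # k = p + 1 parameters

H = X @ np.linalg.inv(X.T @ X) @ X.T         # hat matrix
h = np.diag(H)                               # leverage values h_i
cutoff = 3 * k / n                           # rule of thumb 3(p + 1)/n

resid = y - H @ y                            # residuals (yhat = H y)
mse = resid @ resid / (n - k)
r = resid / np.sqrt(mse * (1 - h))           # standardized residuals
D = r**2 * h / (k * (1 - h))                 # Cook's distance; D_i > 1 flags influence
print(h.round(3), round(cutoff, 3), D.round(3))
```

The extreme x value gets by far the largest leverage, which is exactly what Minitab's "X" flag for "large influence" is detecting in the outputs above.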
However, using Cook's distance measure, we see that D1 > 1 (rule of thumb critical value); thus, we conclude the first observation is influential.
Final conclusion: observation 1 is an influential observation.

42. a. The Minitab output is shown below:
Speed = Price Horsepwr
Constant Price Horsepwr
S = RSq = 91.9% RSq(adj) = 90.7%
Regression Residual Error Total
Source DF Seq SS Price Horsepwr
Unusual Observations
Obs Price Speed Fit SE Fit Residual St Resid
X
X denotes an observation whose X value gives it large influence.
b. The standardized residual plot is shown below. There appears to be a very unusual trend in the standardized residuals.
[Plot of the standardized residuals (SRES1) versus the fitted values (FITS1) omitted.]
c. The Minitab output shown in part (a) did not identify any observations with a large standardized residual; thus, there does not appear to be any outliers in the data.
d. The Minitab output shown in part (a) identifies observation 2 as an influential observation.

43. a. The Minitab output is shown below:
%College = SatScore
Constant SatScore
S = RSq = 29.7% RSq(adj) = 25.3%
Regression Residual Error Total
Unusual Observations
Obs SatScore %College Fit SE Fit Residual St Resid
X
X denotes an observation whose X value gives it large influence.
b. The Minitab output shown in part (a) identifies observation 3 as an influential observation.
c. The Minitab output appears in the solution to Exercise 9; the estimated regression equation is %College = Size SATScore
d. The following Minitab output was also provided as part of the regression output for part (c):
Unusual Observations
Obs. Size %College Fit Stdev.Fit Residual St.Resid
X
X denotes an obs. whose X value gives it large influence.
Observation 3 is still identified as an influential observation.

44. a. The expected increase in final college grade point average corresponding to a one point increase in high school grade point average is .0235 when SAT mathematics score does not change. Similarly, the expected increase in final college grade point average corresponding to a one point increase in the SAT mathematics score is when the high school grade point average does not change.
b. ŷ = (84) (540) =
45. a. Job satisfaction can be expected to decrease by 8.69 units with a one unit increase in length of service if the wage rate does not change. A dollar increase in the wage rate is associated with a 13.5 point increase in the job satisfaction score when the length of service does not change.
b. ŷ = (4) (6.5) =

46. a. The computer output with the missing values filled in is as follows:
Y = X X2
Predictor Coef Stdev t-ratio
Constant X X
S = 3.35 RSq = 92.3% RSq(adj) = 91.0%
SOURCE DF SS MS F
Regression Error Total
b. t.025 = 2.179 (12 DF)
for β1: 3.61 > 2.179; reject H0: β1 = 0
for β2: 5.08 > 2.179; reject H0: β2 = 0
c. See computer output.
d. Ra² = 1 − (1 − .923)(n − 1)/(n − p − 1) =

47. a. Y = X X2
Predictor Coef Stdev t-ratio
Constant X X
S = RSq = 93.7% RSq(adj) = 91.9%
SOURCE DF SS MS F
Regression Error
Total
b. F.05 = 4.74 (2 DF numerator, 7 DF denominator)
F > F.05; significant relationship.
c. R² = SSR/SST = .937
Ra² = 1 − (1 − .937)(n − 1)/(n − p − 1) = ; a good fit
d. t.025 = 2.365 (7 DF)
for β1: t = 2.71 > 2.365; reject H0: β1 = 0
for β2: t = 4.51 > 2.365; reject H0: β2 = 0

48. a. Y = X X2
Predictor Coef Stdev t-ratio
Constant X X
S = RSq = 90.1% RSq(adj) = 86.1%
SOURCE DF SS MS F
Regression Error Total
b. F.05 = 5.79 (5 DF)
F > F.05; significant relationship.
c. R² = SSR/SST = .901
Ra² = 1 − (1 − .901)(n − 1)/(n − p − 1) = ; a good fit
d. t.025 = 2.571 (5 DF)
for β1: t = −5.59 < −2.571; reject H0: β1 = 0
for β2: t = 6.48 > 2.571; reject H0: β2 = 0

49. a. The Minitab output is shown below:
Price = BookVal
Constant BookVal
S = RSq = 29.4% RSq(adj) = 26.9%
Regression Error Total
b. The value of RSq is 29.4%; the estimated regression equation does not provide a good fit.
c. The Minitab output is shown below:
Price = BookVal ReturnEq
Constant BookVal ReturnEq
S = RSq = 56.7% RSq(adj) = 53.5%
Regression Error Total
Since the p-value corresponding to the F test is 0.000, the relationship is significant.

50. a. The Minitab output is shown below:
Speed = Price Weight Horsepwr 2.48 Zero60
Constant Price Weight
Horsepwr Zero60
S = RSq = 95.0% RSq(adj) = 93.2%
Regression Residual Error Total
b. Since the p-value corresponding to the F test is 0.000, the relationship is significant.
c. Since the p-values corresponding to the t test for both Horsepwr (p-value = .003) and Zero60 (p-value = .025) are less than .05, both of these independent variables are significant.
d. The Minitab output is shown below:
Speed = Horsepwr 3.19 Zero60
Constant Horsepwr Zero60
S = RSq = 93.1% RSq(adj) = 92.0%
Regression Residual Error Total
Source DF Seq SS Horsepwr Zero60
Unusual Observations
Obs Horsepwr Speed Fit SE Fit Residual St Resid
R X
R denotes an observation with a large standardized residual. X denotes an observation whose X value gives it large influence.
e. The standardized residual plot is shown below:
[Plot of the standardized residuals (SRES) versus the fitted values (FIT) omitted.]
There is an unusual trend in the plot and one observation appears to be an outlier.
f. The Minitab output indicates that observation 2 is an outlier.
g. The Minitab output indicates that observation 12 is an influential observation.

51. a. The Minitab output is shown below:
[Scatter plot of Exposure versus TimesAir omitted.]
b. The Minitab output is shown below:
Exposure = TimesAir
Constant TimesAir
S = RSq = 96.6% RSq(adj) = 96.2%
Regression Error Total
Since the p-value is 0.000, the relationship is significant.
c. The Minitab output is shown below:
Exposure = TimesAir BigAds
Constant TimesAir BigAds
S = RSq = 99.5% RSq(adj) = 99.3%
Regression Error Total
d. The p-value corresponding to the t test for BigAds is 0.000; thus, the dummy variable is significant.
e. The dummy variable enables us to fit two different lines to the data; this approach is referred to as piecewise linear approximation.

52. a. The Minitab output is shown below:
Resale% = Price
Constant Price
S = RSq = 36.7% RSq(adj) = 34.4%
Regression Residual Error Total
Since the p-value corresponding to F is .000 < α = .05, there is a significant relationship between Resale% and Price.
b. RSq = 36.7%; not a very good fit.
c. Let Type1 = 0 and Type2 = 0 if a small pickup; Type1 = 1 and Type2 = 0 if a full-size pickup; and Type1 = 0 and Type2 = 1 if a sport utility. The Minitab output using Type1, Type2, and Price is shown below:
Resale% = Type1 Type2 Price
Constant Type1 Type2 Price
S = RSq = 63.1% RSq(adj) = 58.8%
Regression Residual Error Total
d. Since the p-value corresponding to F is .000 < α = .05, there is a significant relationship between Resale% and the independent variables. Note that individually, Price is not significant at the .05 level of significance. If we rerun the regression using just Type1 and Type2, the value of RSq(adj) decreases to 54.4%, a drop of only 4%. Thus, it appears that for these data, the type of vehicle is the strongest predictor of the resale value.
More informationMathematics for Economics MA course
Mathematics for Economics MA course Simple Linear Regression Dr. Seetha Bandara Simple Regression Simple linear regression is a statistical method that allows us to summarize and study relationships between
More informationMarketing Research Session 10 Hypothesis Testing with Simple Random samples (Chapter 12)
Marketing Research Session 10 Hypothesis Testing with Simple Random samples (Chapter 12) Remember: Z.05 = 1.645, Z.01 = 2.33 We will only cover one-sided hypothesis testing (cases 12.3, 12.4.2, 12.5.2,
More informationINFERENCE FOR REGRESSION
CHAPTER 3 INFERENCE FOR REGRESSION OVERVIEW In Chapter 5 of the textbook, we first encountered regression. The assumptions that describe the regression model we use in this chapter are the following. We
More informationOrdinary Least Squares Regression Explained: Vartanian
Ordinary Least Squares Regression Explained: Vartanian When to Use Ordinary Least Squares Regression Analysis A. Variable types. When you have an interval/ratio scale dependent variable.. When your independent
More informationSMAM 314 Practice Final Examination Winter 2003
SMAM 314 Practice Final Examination Winter 2003 You may use your textbook, one page of notes and a calculator. Please hand in the notes with your exam. 1. Mark the following statements True T or False
More informationRegression Analysis II
Regression Analysis II Measures of Goodness of fit Two measures of Goodness of fit Measure of the absolute fit of the sample points to the sample regression line Standard error of the estimate An index
More informationSMAM 319 Exam1 Name. a B.The equation of a line is 3x + y =6. The slope is a. -3 b.3 c.6 d.1/3 e.-1/3
SMAM 319 Exam1 Name 1. Pick the best choice. (10 points-2 each) _c A. A data set consisting of fifteen observations has the five number summary 4 11 12 13 15.5. For this data set it is definitely true
More informationBusiness Statistics. Chapter 14 Introduction to Linear Regression and Correlation Analysis QMIS 220. Dr. Mohammad Zainal
Department of Quantitative Methods & Information Systems Business Statistics Chapter 14 Introduction to Linear Regression and Correlation Analysis QMIS 220 Dr. Mohammad Zainal Chapter Goals After completing
More informationLecture 18: Simple Linear Regression
Lecture 18: Simple Linear Regression BIOS 553 Department of Biostatistics University of Michigan Fall 2004 The Correlation Coefficient: r The correlation coefficient (r) is a number that measures the strength
More informationMultiple Regression. Inference for Multiple Regression and A Case Study. IPS Chapters 11.1 and W.H. Freeman and Company
Multiple Regression Inference for Multiple Regression and A Case Study IPS Chapters 11.1 and 11.2 2009 W.H. Freeman and Company Objectives (IPS Chapters 11.1 and 11.2) Multiple regression Data for multiple
More information(ii) Scan your answer sheets INTO ONE FILE only, and submit it in the drop-box.
FINAL EXAM ** Two different ways to submit your answer sheet (i) Use MS-Word and place it in a drop-box. (ii) Scan your answer sheets INTO ONE FILE only, and submit it in the drop-box. Deadline: December
More informationLecture 10 Multiple Linear Regression
Lecture 10 Multiple Linear Regression STAT 512 Spring 2011 Background Reading KNNL: 6.1-6.5 10-1 Topic Overview Multiple Linear Regression Model 10-2 Data for Multiple Regression Y i is the response variable
More information9 Correlation and Regression
9 Correlation and Regression SW, Chapter 12. Suppose we select n = 10 persons from the population of college seniors who plan to take the MCAT exam. Each takes the test, is coached, and then retakes the
More informationSMAM 314 Exam 42 Name
SMAM 314 Exam 42 Name Mark the following statements True (T) or False (F) (10 points) 1. F A. The line that best fits points whose X and Y values are negatively correlated should have a positive slope.
More informationSimple Linear Regression
9-1 l Chapter 9 l Simple Linear Regression 9.1 Simple Linear Regression 9.2 Scatter Diagram 9.3 Graphical Method for Determining Regression 9.4 Least Square Method 9.5 Correlation Coefficient and Coefficient
More information1-Way ANOVA MATH 143. Spring Department of Mathematics and Statistics Calvin College
1-Way ANOVA MATH 143 Department of Mathematics and Statistics Calvin College Spring 2010 The basic ANOVA situation Two variables: 1 Categorical, 1 Quantitative Main Question: Do the (means of) the quantitative
More informationSolution: X = , Y = = = = =
Q1.19) Grade point average. The director of admissions of a small college selected 120 students at random from the new freshman class in a study to determine whether a student's grade point average (OPA)
More informationRegression Analysis. BUS 735: Business Decision Making and Research. Learn how to detect relationships between ordinal and categorical variables.
Regression Analysis BUS 735: Business Decision Making and Research 1 Goals of this section Specific goals Learn how to detect relationships between ordinal and categorical variables. Learn how to estimate
More informationEstimating σ 2. We can do simple prediction of Y and estimation of the mean of Y at any value of X.
Estimating σ 2 We can do simple prediction of Y and estimation of the mean of Y at any value of X. To perform inferences about our regression line, we must estimate σ 2, the variance of the error term.
More informationPre-Calculus Multiple Choice Questions - Chapter S8
1 If every man married a women who was exactly 3 years younger than he, what would be the correlation between the ages of married men and women? a Somewhat negative b 0 c Somewhat positive d Nearly 1 e
More informationBayesian Analysis LEARNING OBJECTIVES. Calculating Revised Probabilities. Calculating Revised Probabilities. Calculating Revised Probabilities
Valua%on and pricing (November 5, 2013) LEARNING OBJECTIVES Lecture 7 Decision making (part 3) Regression theory Olivier J. de Jong, LL.M., MM., MBA, CFD, CFFA, AA www.olivierdejong.com 1. List the steps
More informationLinear Regression. Simple linear regression model determines the relationship between one dependent variable (y) and one independent variable (x).
Linear Regression Simple linear regression model determines the relationship between one dependent variable (y) and one independent variable (x). A dependent variable is a random variable whose variation
More informationCorrelation & Simple Regression
Chapter 11 Correlation & Simple Regression The previous chapter dealt with inference for two categorical variables. In this chapter, we would like to examine the relationship between two quantitative variables.
More informationLINEAR REGRESSION ANALYSIS. MODULE XVI Lecture Exercises
LINEAR REGRESSION ANALYSIS MODULE XVI Lecture - 44 Exercises Dr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur Exercise 1 The following data has been obtained on
More informationRegression and correlation were introduced in Chapter 5 as techniques for describing
Chapter 13 Simple Linear Regression and Correlation: Inferential Methods Arne Hodalic/Corbis... Introduction Regression and correlation were introduced in Chapter 5 as techniques for describing and summarizing
More informationInferences for Regression
Inferences for Regression An Example: Body Fat and Waist Size Looking at the relationship between % body fat and waist size (in inches). Here is a scatterplot of our data set: Remembering Regression In
More informationSMAM 319 Exam 1 Name. 1.Pick the best choice for the multiple choice questions below (10 points 2 each)
SMAM 319 Exam 1 Name 1.Pick the best choice for the multiple choice questions below (10 points 2 each) A b In Metropolis there are some houses for sale. Superman and Lois Lane are interested in the average
More informationInference with Simple Regression
1 Introduction Inference with Simple Regression Alan B. Gelder 06E:071, The University of Iowa 1 Moving to infinite means: In this course we have seen one-mean problems, twomean problems, and problems
More informationChapter 16. Simple Linear Regression and dcorrelation
Chapter 16 Simple Linear Regression and dcorrelation 16.1 Regression Analysis Our problem objective is to analyze the relationship between interval variables; regression analysis is the first tool we will
More informationIntroduction to Regression
Introduction to Regression ιατµηµατικό Πρόγραµµα Μεταπτυχιακών Σπουδών Τεχνο-Οικονοµικά Συστήµατα ηµήτρης Φουσκάκης Introduction Basic idea: Use data to identify relationships among variables and use these
More informationECO220Y Simple Regression: Testing the Slope
ECO220Y Simple Regression: Testing the Slope Readings: Chapter 18 (Sections 18.3-18.5) Winter 2012 Lecture 19 (Winter 2012) Simple Regression Lecture 19 1 / 32 Simple Regression Model y i = β 0 + β 1 x
More informationInference for Regression Inference about the Regression Model and Using the Regression Line
Inference for Regression Inference about the Regression Model and Using the Regression Line PBS Chapter 10.1 and 10.2 2009 W.H. Freeman and Company Objectives (PBS Chapter 10.1 and 10.2) Inference about
More informationStat 231 Final Exam. Consider first only the measurements made on housing number 1.
December 16, 1997 Stat 231 Final Exam Professor Vardeman 1. The first page of printout attached to this exam summarizes some data (collected by a student group) on the diameters of holes bored in certain
More informationSchool of Mathematical Sciences. Question 1. Best Subsets Regression
School of Mathematical Sciences MTH5120 Statistical Modelling I Practical 9 and Assignment 8 Solutions Question 1 Best Subsets Regression Response is Crime I n W c e I P a n A E P U U l e Mallows g E P
More informationBNAD 276 Lecture 10 Simple Linear Regression Model
1 / 27 BNAD 276 Lecture 10 Simple Linear Regression Model Phuong Ho May 30, 2017 2 / 27 Outline 1 Introduction 2 3 / 27 Outline 1 Introduction 2 4 / 27 Simple Linear Regression Model Managerial decisions
More informationQuestion Possible Points Score Total 100
Midterm I NAME: Instructions: 1. For hypothesis testing, the significant level is set at α = 0.05. 2. This exam is open book. You may use textbooks, notebooks, and a calculator. 3. Do all your work in
More informationInference. ME104: Linear Regression Analysis Kenneth Benoit. August 15, August 15, 2012 Lecture 3 Multiple linear regression 1 1 / 58
Inference ME104: Linear Regression Analysis Kenneth Benoit August 15, 2012 August 15, 2012 Lecture 3 Multiple linear regression 1 1 / 58 Stata output resvisited. reg votes1st spend_total incumb minister
More informationPART I. (a) Describe all the assumptions for a normal error regression model with one predictor variable,
Concordia University Department of Mathematics and Statistics Course Number Section Statistics 360/2 01 Examination Date Time Pages Final December 2002 3 hours 6 Instructors Course Examiner Marks Y.P.
More informationECON3150/4150 Spring 2016
ECON3150/4150 Spring 2016 Lecture 4 - The linear regression model Siv-Elisabeth Skjelbred University of Oslo Last updated: January 26, 2016 1 / 49 Overview These lecture slides covers: The linear regression
More informationInference for Regression
Inference for Regression Section 9.4 Cathy Poliak, Ph.D. cathy@math.uh.edu Office in Fleming 11c Department of Mathematics University of Houston Lecture 13b - 3339 Cathy Poliak, Ph.D. cathy@math.uh.edu
More informationSTA 302 H1F / 1001 HF Fall 2007 Test 1 October 24, 2007
STA 302 H1F / 1001 HF Fall 2007 Test 1 October 24, 2007 LAST NAME: SOLUTIONS FIRST NAME: STUDENT NUMBER: ENROLLED IN: (circle one) STA 302 STA 1001 INSTRUCTIONS: Time: 90 minutes Aids allowed: calculator.
More informationMultiple Regression: Chapter 13. July 24, 2015
Multiple Regression: Chapter 13 July 24, 2015 Multiple Regression (MR) Response Variable: Y - only one response variable (quantitative) Several Predictor Variables: X 1, X 2, X 3,..., X p (p = # predictors)
More informationMAT 212 Introduction to Business Statistics II Lecture Notes
MAT 212 Introduction to Business Statistics II Lecture Notes Muhammad El-Taha Department of Mathematics and Statistics University of Southern Maine 96 Falmouth Street Portland, ME 04104-9300 MAT 212, Spring
More informationMulticollinearity occurs when two or more predictors in the model are correlated and provide redundant information about the response.
Multicollinearity Read Section 7.5 in textbook. Multicollinearity occurs when two or more predictors in the model are correlated and provide redundant information about the response. Example of multicollinear
More informationChapter 6 Multiple Regression
STAT 525 FALL 2018 Chapter 6 Multiple Regression Professor Min Zhang The Data and Model Still have single response variable Y Now have multiple explanatory variables Examples: Blood Pressure vs Age, Weight,
More informationSimple Linear Regression
Simple Linear Regression ST 430/514 Recall: A regression model describes how a dependent variable (or response) Y is affected, on average, by one or more independent variables (or factors, or covariates)
More informationChapter 16. Simple Linear Regression and Correlation
Chapter 16 Simple Linear Regression and Correlation 16.1 Regression Analysis Our problem objective is to analyze the relationship between interval variables; regression analysis is the first tool we will
More informationSTA 4210 Practise set 2a
STA 410 Practise set a For all significance tests, use = 0.05 significance level. S.1. A multiple linear regression model is fit, relating household weekly food expenditures (Y, in $100s) to weekly income
More informationDepartment of Mathematics & Statistics STAT 2593 Final Examination 17 April, 2000
Department of Mathematics & Statistics STAT 2593 Final Examination 17 April, 2000 TIME: 3 hours. Total marks: 80. (Marks are indicated in margin.) Remember that estimate means to give an interval estimate.
More informationSTAT 360-Linear Models
STAT 360-Linear Models Instructor: Yogendra P. Chaubey Sample Test Questions Fall 004 Note: The following questions are from previous tests and exams. The final exam will be for three hours and will contain
More informationFinding Relationships Among Variables
Finding Relationships Among Variables BUS 230: Business and Economic Research and Communication 1 Goals Specific goals: Re-familiarize ourselves with basic statistics ideas: sampling distributions, hypothesis
More information1: a b c d e 2: a b c d e 3: a b c d e 4: a b c d e 5: a b c d e. 6: a b c d e 7: a b c d e 8: a b c d e 9: a b c d e 10: a b c d e
Economics 102: Analysis of Economic Data Cameron Spring 2016 Department of Economics, U.C.-Davis Final Exam (A) Tuesday June 7 Compulsory. Closed book. Total of 58 points and worth 45% of course grade.
More informationLectures on Simple Linear Regression Stat 431, Summer 2012
Lectures on Simple Linear Regression Stat 43, Summer 0 Hyunseung Kang July 6-8, 0 Last Updated: July 8, 0 :59PM Introduction Previously, we have been investigating various properties of the population
More information28. SIMPLE LINEAR REGRESSION III
28. SIMPLE LINEAR REGRESSION III Fitted Values and Residuals To each observed x i, there corresponds a y-value on the fitted line, y = βˆ + βˆ x. The are called fitted values. ŷ i They are the values of
More informationEconomics 326 Methods of Empirical Research in Economics. Lecture 14: Hypothesis testing in the multiple regression model, Part 2
Economics 326 Methods of Empirical Research in Economics Lecture 14: Hypothesis testing in the multiple regression model, Part 2 Vadim Marmer University of British Columbia May 5, 2010 Multiple restrictions
More informationProblem Set #3-Key. wage Coef. Std. Err. t P> t [95% Conf. Interval]
Problem Set #3-Key Sonoma State University Economics 317- Introduction to Econometrics Dr. Cuellar 1. Use the data set Wage1.dta to answer the following questions. a. For the regression model Wage i =
More informationEXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY (formerly the Examinations of the Institute of Statisticians) HIGHER CERTIFICATE IN STATISTICS, 2000
EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY (formerly the Examinations of the Institute of Statisticians) HIGHER CERTIFICATE IN STATISTICS, 2000 CERTIFICATE IN OFFICIAL STATISTICS, 2000 Paper I : Statistical
More informationSimple Linear Regression Using Ordinary Least Squares
Simple Linear Regression Using Ordinary Least Squares Purpose: To approximate a linear relationship with a line. Reason: We want to be able to predict Y using X. Definition: The Least Squares Regression
More informationST430 Exam 1 with Answers
ST430 Exam 1 with Answers Date: October 5, 2015 Name: Guideline: You may use one-page (front and back of a standard A4 paper) of notes. No laptop or textook are permitted but you may use a calculator.
More information(1) The explanatory or predictor variables may be qualitative. (We ll focus on examples where this is the case.)
Introduction to Analysis of Variance Analysis of variance models are similar to regression models, in that we re interested in learning about the relationship between a dependent variable (a response)
More informationANOVA: Analysis of Variation
ANOVA: Analysis of Variation The basic ANOVA situation Two variables: 1 Categorical, 1 Quantitative Main Question: Do the (means of) the quantitative variables depend on which group (given by categorical
More informationData Set 8: Laysan Finch Beak Widths
Data Set 8: Finch Beak Widths Statistical Setting This handout describes an analysis of covariance (ANCOVA) involving one categorical independent variable (with only two levels) and one quantitative covariate.
More informationTable of z values and probabilities for the standard normal distribution. z is the first column plus the top row. Each cell shows P(X z).
Table of z values and probabilities for the standard normal distribution. z is the first column plus the top row. Each cell shows P(X z). For example P(X 1.04) =.8508. For z < 0 subtract the value from
More informationMultiple Linear Regression
Multiple Linear Regression Simple linear regression tries to fit a simple line between two variables Y and X. If X is linearly related to Y this explains some of the variability in Y. In most cases, there
More information