AGEC 621 Lecture 16
David Bessler

This is RATS output for the dummy variable problem given in GHJ page 422; the beer expenditure lecture (last time). I do not expect you to know RATS, but this will give you a sense of what standard outputs look like and allow us to analyze the output and the relationship between t-statistics and F-tests.

RATS386 4.20. Run on Oct 28 1999
(c) 1992-5 Thomas A. Doan. All rights reserved

* Beer Data
* 621 lecture 24
*
* Ignore the next several lines, it's just RATS input stuff.
cal 1901 1 1
alloc 30 1950:1
eqv 1 2 3 4 5 6 7 8 9 10 11
BE S E1 E2 E3 Y AGE E1S E2S E3S YS
declare symmetric v
open data a:beer.txt
data(format=free,org=obs) 1901:1 1940:1 1 to 7
set E1S 1901:1 1940:1 = %x(t,2)*%x(t,3)
set E2S 1901:1 1940:1 = %x(t,2)*%x(t,4)
set E3S 1901:1 1940:1 = %x(t,2)*%x(t,5)
set YS 1901:1 1940:1 = %x(t,2)*%x(t,6)
Here we ask the computer to do an OLS estimation in RATS commands (the (vcv) option asks for the variance-covariance matrix of the coefficients; we will use this later).

linreg(vcv) 1 1901:1 1940:1
# constant 2 3 4 5 6 8 9 10 11

Dependent Variable BE - Estimation by Least Squares
Annual Data From 1901:01 To 1940:01
Usable Observations      40     Degrees of Freedom   30
Centered R**2      0.565582     R Bar **2      0.435256
Uncentered R**2    0.829555     T x R**2         33.182
Mean of Dependent Variable       191.55000000
Std Error of Dependent Variable  155.88061671
Standard Error of Estimate       117.14338783
Sum of Squared Residuals         411677.19936
Regression F(9,30)                     4.3398
Significance Level of F            0.00109138
Durbin-Watson Statistic              2.603148
Q(10-0)                             29.183473
Significance Level of Q            0.00116360

    Variable      Coeff          Std Error      T-Stat    Signif
*****************************************************************
 1. Constant     134.888420      72.4876871     1.86085  0.07259250
 2. S           -106.6133853     94.0210140    -1.13393  0.26580352
 3. E1            65.3920683     79.8104741     0.81934  0.41905319
 4. E2            71.6217443     85.8196550     0.83456  0.41056162
 5. E3           -39.5438529    145.334692     -0.27209  0.78741811
 6. Y              0.0023207      0.000990      2.34305  0.02594961
 7. E1S            7.4837739    108.478239      0.06899  0.94545640
 8. E2S          -67.4434193    115.332876     -0.58477  0.56307370
 9. E3S          -33.5589733    180.551845     -0.18587  0.85379864
10. YS            -0.0009713      0.001182     -0.82127  0.41797243
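A quick check of how the columns in the regression table fit together: each T-Stat is just the coefficient estimate divided by its standard error. A minimal sketch in Python (not RATS), using the numbers printed in the Y row:

```python
# Each t-statistic in the table is Coeff / Std Error.
# Values copied from the Y row of the regression output above.
coeff_y = 0.0023207
stderr_y = 0.000990
t_y = coeff_y / stderr_y
print(round(t_y, 3))  # close to the printed 2.34305 (RATS carries more digits)
```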
Covariance\Correlation Matrix of Coefficients
(variances on the diagonal, covariances below, correlations above)

              Constant           S               E1              E2
Constant    5254.46477     -0.7709732531   -0.7502866974   -0.6144750987
S          -5254.46477      8839.95108      0.5784509759    0.4737438658
E1         -4340.61613      4340.61613      6369.71177      0.7055113395
E2         -3822.56866      3822.56866      4832.26400      7365.01318
E3         -3187.81058      3187.81058      5050.21100      6105.97862
Y             -0.02583         0.02583        -0.00887        -0.02854
E1S         4340.61613     -7738.79701     -6369.71177     -4832.26400
E2S         3822.56866     -6903.30707     -4832.26400     -7365.01318
E3S         3187.81058     -6190.12204     -5050.21100     -6105.97862
YS             0.02583        -0.03388         0.00887         0.02854

                E3              Y               E1S             E2S
Constant      -0.3025930648   -0.3598136118    0.5520069042    0.4572333802
S              0.2332911595    0.2774066708   -0.7587625669   -0.6366183279
E1             0.4353918918   -0.1122082002   -0.7357279639   -0.5249738114
E2             0.4895524400   -0.3357926193   -0.5190644213   -0.7441040023
E3         21122.17282        -0.3657390397   -0.3203299900   -0.3642779300
Y             -0.05265         9.81007e-007    0.0825547106    0.2498646319
E1S        -5050.21100         0.00887     11767.52839         0.6663090466
E2S        -6105.97862         0.02854      8336.26402     13301.67239
E3S       -21122.17282         0.05265      8570.65303     10504.54074
YS             0.05265        -9.81007e-007   -0.01056        -0.04672

                E3S             YS
Constant       0.2435714234    0.3013183107
S             -0.3646467511   -0.3046432429
E1            -0.3504674599    0.0939663876
E2            -0.3940638386    0.2812024378
E3            -0.8049471443    0.3062804351
Y              0.2944005955   -0.8374288821
E1S            0.4375919808   -0.0822774637
E2S            0.5044545158   -0.3424972056
E3S        32598.96890        -0.3507376306
YS            -0.07490         1.39887e-006
F-Tests on Restrictions

First we want to compare the F-test (from restricted versus unrestricted OLS regressions) with the t-test as given on the previous page.

restrict 1
# 3
# 1.0 0.0
F(1,30)=   0.67132 with Significance Level 0.41905319

Notice that my F(1,30) significance level of .41905 is the same as the significance level on the t-test associated with the E1 variable from the OLS regression. Next I do the same thing for the coefficient associated with E2:

restrict 1
# 4
# 1.0 0.0
F(1,30)=   0.69649 with Significance Level 0.41056162

Again, check the significance level on E2 from the OLS output. Finally I look at a linear combination of estimated coefficients:

restrict 1
# 3 4
# 1.0 -1.0 0.0
F(1,30)=   0.00953 with Significance Level 0.92286229

How do we do this last test (without just invoking the magic of the computer!)? This lecture focuses attention on hypothesis testing on a linear combination of coefficients.
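To see what the restrict command is doing, here is a sketch of the restricted-versus-unrestricted F-test in Python with simulated data (not the beer data; the variable names and data-generating process are my own illustration). For a single restriction, the F-statistic equals the squared t-statistic, as we verify below.

```python
# Restricted vs. unrestricted F-test for a single exclusion restriction,
# and its relation to the t-test (t**2 = F for one restriction).
import numpy as np

rng = np.random.default_rng(0)
n = 40
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 5 + 0.3 * x1 + 2.0 * x2 + rng.normal(size=n)   # simulated data

X = np.column_stack([np.ones(n), x1, x2])          # unrestricted design
b = np.linalg.solve(X.T @ X, X.T @ y)              # OLS coefficients
resid = y - X @ b
sse_u = resid @ resid                              # unrestricted SSE
dof = n - X.shape[1]                               # T - K degrees of freedom
s2 = sse_u / dof
vcv = s2 * np.linalg.inv(X.T @ X)                  # coefficient variance-covariance
t = b[1] / np.sqrt(vcv[1, 1])                      # t-test of beta1 = 0

Xr = np.column_stack([np.ones(n), x2])             # restricted model: drop x1
br = np.linalg.solve(Xr.T @ Xr, Xr.T @ y)
sse_r = (y - Xr @ br) @ (y - Xr @ br)              # restricted SSE
F = ((sse_r - sse_u) / 1) / (sse_u / dof)          # F(1, T-K)

print(np.isclose(t**2, F))                         # the two tests agree
```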
Recall the formula for the variance of a linear combination of variables. Say x1, x2, x3, ..., xk are k random variables. Then the variance of a linear combination of these variables is given as:

Var(a1*x1 + a2*x2 + a3*x3 + ... + ak*xk)
    = Σ_i ai² Var(xi) + Σ_i Σ_{j≠i} ai*aj Cov(xi, xj)

What does this formula for the variance of a linear combination of variables mean? For constants a1, a2, ..., ak, when we combine the random variables x1, x2, ..., xk by multiplying each by its associated constant and adding these, the sum (the result) has a variance given by the right-hand side of the above equation. We square each of the constant weights (ai), multiply each by the variance of its associated random variable (xi), and add these. To this sum we add the covariances between i and j (i not equal to j), each weighted by the product of the constants (ai times aj).
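As a numerical sanity check (my own illustration, not part of the lecture), the formula above is the same as the quadratic form a'Ca, where C is the covariance matrix of the x's; both equal the sample variance of the combined series:

```python
# Check: Var(sum a_i x_i) = sum_i a_i^2 Var(x_i) + sum_{i != j} a_i a_j Cov(x_i, x_j)
#                         = a' C a  (matrix form)
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(3, 100000))        # three random variables (rows)
x[1] += 0.5 * x[0]                      # induce some covariance
a = np.array([1.0, -2.0, 0.5])          # the constant weights

C = np.cov(x)                           # sample covariance matrix
quad = a @ C @ a                        # matrix form a' C a

# term-by-term form from the lecture formula (i = j gives the variance terms)
total = sum(a[i] * a[j] * C[i, j] for i in range(3) for j in range(3))

direct = np.var(a @ x, ddof=1)          # variance of the combined series
print(np.isclose(quad, total), np.isclose(quad, direct))
```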
Let's write this out for k = 2:

Var(a1*x1 + a2*x2)
    = a1² Var(x1) + a2² Var(x2) + a1*a2 Cov(x1,x2) + a2*a1 Cov(x2,x1)
    = a1² Var(x1) + a2² Var(x2) + 2*a1*a2 Cov(x1,x2)

Again, but now for k = 3:

Var(a1*x1 + a2*x2 + a3*x3)
    = a1² Var(x1) + a2² Var(x2) + a3² Var(x3)
      + a1*a2 Cov(x1,x2) + a1*a3 Cov(x1,x3)
      + a2*a1 Cov(x2,x1) + a2*a3 Cov(x2,x3)
      + a3*a1 Cov(x3,x1) + a3*a2 Cov(x3,x2)
    = a1² Var(x1) + a2² Var(x2) + a3² Var(x3)
      + 2*a1*a2 Cov(x1,x2) + 2*a1*a3 Cov(x1,x3) + 2*a2*a3 Cov(x2,x3)

Now what does all this have to do with hypothesis testing? Well, say we want to test the hypothesis that β1 = β2. We could write this as:

Ho: β1 - β2 = 0

We test this using the formulas we derived earlier:
t = (b1 - b2 - 0) / (Var(b1 - b2))^½

In the formula given above, x1 is b1 with a1 = 1, and x2 is b2 with a2 = -1 (k = 2), so Var(b1 - b2) = Var(b1) + Var(b2) - 2 Cov(b1, b2). Let's test the hypothesis that the coefficient associated with E1 equals the coefficient associated with E2 in the above problem:

t = (65.392 - 71.622) / (6369.71177 + 7365.01318 - 2(4832.264))^.5 = -.09765

Comparing this with the t-table at T - K = 30 degrees of freedom and a 5% significance level, we fail to reject the null hypothesis that the two betas are equal.
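The arithmetic above can be checked directly; a short sketch using the variances and the covariance copied from the printed matrix:

```python
# t-test of Ho: beta_E1 = beta_E2, using Var(b1 - b2) = Var(b1) + Var(b2) - 2 Cov(b1, b2).
# All numbers are copied from the RATS output above.
var_e1 = 6369.71177           # Var(b_E1), diagonal of the matrix
var_e2 = 7365.01318           # Var(b_E2)
cov_e1e2 = 4832.26400         # Cov(b_E1, b_E2), below the diagonal
b_e1, b_e2 = 65.3920683, 71.6217443

t = (b_e1 - b_e2) / (var_e1 + var_e2 - 2 * cov_e1e2) ** 0.5
print(round(t, 5))   # about -0.09765, matching the hand calculation
```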
Make sure you can pick out the correct variances and covariances from the above printout. (On a test I'll give you the matrix just like the one above, so be sure you know what is going on: correlations above the diagonal, covariances below the diagonal, and variances on the diagonal.)

Just a point of note: if we performed this test as an F-test, with a restricted versus unrestricted regression, we would get a calculated u-value (see last lecture on u) of .00953, which at F(1,30) again suggests that we do not reject the null hypothesis. It turns out that if we take our calculated t-value from above (-.09765) and square it, we get .00953. This result is general: t² = F for a single linear restriction (with multiple restrictions it is not even clear how to do the t-test).

Let's try out this last bit of insight on a single hypothesis test, say that the beta associated with E1 is equal to zero (in the regression output given above). From the regression output we see the t-value is .81934 (65.3920683/79.8104741). If we square .81934 we get .67132, which is the F-value computed above for the test that the beta on E1 = 0 (see the restricted test given above).

Another Example

Test the hypothesis that the coefficients associated with E1, E2 and E3 in the above output sum to 100. Here all of the ai weights are 1.0:

Ho: β1 + β2 + β3 - 100 = 0

Don't worry that the estimated coefficient of the third beta is negative; the hypothesis asks only that we test the sum of the coefficients:

t = (65.39 + 71.62 - 39.54 - 100.00) / [6369.71 + 7365.01 + 21122.17
    + 2(4832.26) + 2(5050.21) + 2(6105.98)]^.5 = -.00979
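The sum-to-100 test works the same way; a sketch of the calculation using the E1, E2, E3 variances and covariances copied from the printed matrix:

```python
# t-test of Ho: beta_E1 + beta_E2 + beta_E3 - 100 = 0 (all weights a_i = 1).
# Variances and covariances copied from the RATS covariance matrix above.
b = [65.3920683, 71.6217443, -39.5438529]          # E1, E2, E3 coefficients
var = [6369.71177, 7365.01318, 21122.17282]        # their variances
cov12, cov13, cov23 = 4832.26400, 5050.21100, 6105.97862

num = sum(b) - 100.0
den = (sum(var) + 2 * (cov12 + cov13 + cov23)) ** 0.5
t = num / den
print(round(t, 5))   # about -0.00979, matching the hand calculation
```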
This last figure matches what I get when I run a restricted versus unrestricted test using the F-statistic. For our purposes, please be sure you can calculate the t-test on a linear combination of coefficients.