Regression Significance

To determine statistically whether the regression actually models the average behavior of your dependent variable, you can use the F-statistic. In this case, $H_0$ is the proposition that $y$ does not depend on any of the independent variables, and $H_1$ is that it does.

$$H_0\colon \; y_i = \beta_1 + e_i$$

$$H_1\colon \; y_i = \beta_1 + \beta_2 x_{i2} + \cdots + \beta_K x_{iK} + e_i$$

The null hypothesis can alternatively be expressed as $\beta_2, \beta_3, \ldots, \beta_K = 0$, a set of $K-1$ linear restrictions. In Big Andy's Burger Barn the script is

1 open "@gretldirdatapoeandy. gdt"

2 square advert

3 ols sales const price advert sq_advert

4 restrict

5 b[2] = 0

6 b[3] = 0

7 b[4] = 0

8 end restrict

In lines 3-8 the model is estimated and the three slopes are restricted to be zero. The test result is shown in Figure 6.7 below. You can see that the F-statistic for this test is equal to 24.4593.
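Note that b[2], b[3], and b[4] pick out coefficients by their position in the coefficient vector. If you would rather not count positions, gretl's restrict block can also reference coefficients by regressor name. A minimal equivalent sketch, assuming the model from line 3 is still in memory:

restrict
    b[price] = 0
    b[advert] = 0
    b[sq_advert] = 0
end restrict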


Figure 6.7: The results obtained from using the restrict statements via the dialog box to conduct the overall F-test of regression significance.

You should also notice that the same number appears in the regression results as F(3, 71). This is not coincidental. The test of regression significance is important enough that it appears on the default output of every linear regression estimated using gretl. The statistic and its p-value are highlighted in Figure 6.7. Since the p-value is less than α = 0.05, we reject the null hypothesis that the model is insignificant at the five percent level.
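If you prefer to compute the five percent critical value and the p-value explicitly rather than reading them off the output, here is a small sketch using gretl's critical and pvalue functions, plugging in the statistic reported above:

scalar alpha = 0.05
scalar fstat = 24.4593                     # overall F reported by gretl
scalar fcrit = critical(F, 3, 71, alpha)   # 5% critical value of F(3,71)
scalar p = pvalue(F, 3, 71, fstat)         # right-tail p-value of the statistic
printf "F = %.4f, critical value = %.4f, p-value = %.3g\n", fstat, fcrit, p

Since 24.4593 far exceeds the critical value, the null hypothesis is rejected.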

This is also a good opportunity to use the omit statement and to show the effect of the --wald option. Consider the script

1 open "@gretldirdatapoeandy. gdt"

2 square advert

3 list xvars = price advert sq_advert

4 ols sales const xvars —quiet

5 omit xvars —wald

6 omit xvars

The regressors that carry slopes are collected into the list called xvars. Then the overall F-test can be performed by simply omitting the xvars from the model. This tests the hypothesis that each coefficient is zero against the alternative that at least one is not. The --wald option performs the test without imposing the restrictions. The chi-square form is actually very similar to the F-form; divide the chi-square statistic by its degrees of freedom and you get the F. There are slight differences between the $\chi^2_J/J$ and $F_{J,N-K}$ distributions, which accounts for the small difference in the reported p-values.
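To see the relationship numerically, divide the chi-square statistic reported in the output by the number of restrictions:

scalar chi = 73.3779   # Wald chi-square(3) from the --wald output
scalar J = 3           # number of restrictions
printf "chi-square/J = %.4f\n", chi/J   # prints 24.4593, the F-form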

The second omit xvars statement will then repeat the test, this time imposing the restrictions on the model. The output is shown in Figure 6.8.

[Figure: gretl output. The omit xvars --wald command prints "Null hypothesis: the regression parameters are zero for the variables price, advert, sq_advert", "Asymptotic test statistic: Wald chi-square(3) = 73.3779, with p-value = 8.06688e-016", and "F-form: F(3, 71) = 24.4593, with p-value = 5.59996e-011". The second omit xvars prints the restricted model (Model 2: OLS, using observations 1-75, dependent variable: sales) followed by "Test statistic: F(3, 71) = 24.4593, with p-value = 5.59996e-011" and "Of the 3 model selection statistics, 0 have improved".]

Figure 6.8: The results obtained from using the omit statements to conduct the overall F-test of regression significance.

You can see that the F-form in the top portion of the output and the test statistic at the bottom match each other as well as the one obtained using restrict. No regression output follows the first version because of the --wald option. In the second instance, the model is restricted and the estimate of the constant (the series mean in this case) is given before printing the test result.
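As a quick check that the constant-only model really returns the sample mean of sales, here is a sketch using the $coeff accessor:

ols sales const --quiet
scalar b1 = $coeff(const)   # intercept of the constant-only model
printf "intercept = %.5f, mean(sales) = %.5f\n", b1, mean(sales)

The two numbers should be identical.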

One can also perform the test manually using saved results from the estimated model. The script to do so is:

ols sales const price advert sq_advert   # unrestricted model
scalar sseu = $ess                       # unrestricted sum of squared errors
scalar unrest_df = $df                   # unrestricted residual degrees of freedom

ols sales const                          # restricted model (constant only)
scalar sser = $ess                       # restricted sum of squared errors
scalar rest_df = $df                     # restricted residual degrees of freedom

scalar J = rest_df - unrest_df           # number of restrictions
scalar Fstat = ((sser-sseu)/J)/(sseu/unrest_df)
pvalue F J unrest_df Fstat               # p-value from F(J, unrest_df)

Since there are three hypotheses to test jointly, the numerator degrees of freedom for the F-statistic is $J = K - 1 = 3$. The saved residual degrees of freedom from the restricted and unrestricted models can be used to obtain the number of restrictions imposed. Each unique restriction in a linear model reduces the number of parameters in the model by one. So, for example, imposing one restriction on a three-parameter unrestricted model reduces the number of parameters in the restricted model to two. Let $K_r$ be the number of regressors in the restricted model and $K_u$ the number in the unrestricted model. Subtracting the degrees of freedom in the unrestricted model ($N - K_u$) from those of the restricted model ($N - K_r$) yields the number of restrictions you've imposed, i.e., $(N - K_r) - (N - K_u) = K_u - K_r = J$.
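Plugging in Big Andy's numbers: $N = 75$, the unrestricted model has $K_u = 4$ parameters, and the restricted model has $K_r = 1$, so

$$J = (N - K_r) - (N - K_u) = 74 - 71 = 3.$$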
