Vector Autoregression
The vector autoregression (VAR) model is actually a little simpler to estimate than the VEC model. It is used when there is no cointegration among the variables, and it is estimated using time series that have been transformed to their stationary values.
In the example from POE4, we have macroeconomic data on RPDI and RPCE for the United States. The data are found in the fred.gdt dataset and have already been transformed into their natural logarithms. In the dataset, y is the log of real disposable income and c is the log of real consumption expenditures. As in the previous example, the first step is to determine whether the variables are stationary. If they are not, then you transform them into stationary time series and test for cointegration.
Figure 13.7: Plot of the error correction terms from the vecm 3 1 aus usa command.

The data need to be analyzed in the same way as the GDP series in the VECM example. Examine the plots to determine possible trends and use the ADF tests to determine which form of the data is stationary. These data are nonstationary in levels, but stationary in differences. Then, estimate the cointegrating vector and test the stationarity of its residuals. If stationary, the series are cointegrated and you estimate a VECM. If not, then a VAR treatment is sufficient.
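These steps can be collected into a short gretl script. This is only a sketch: gretl's coint command automates the Engle-Granger procedure (ADF tests on each series, the cointegrating regression, and an ADF test on its residuals) in a single call.

    coint 12 c y --test-down

Alternatively, the residual test can be done by hand:

    ols c 0 y
    series ehat = $uhat
    adf 12 ehat --nc --test-down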
Open the data and take a look at the timeseries plots.
1 open "@gretldir\data\poe\fred.gdt"
2 scatters c diff(c) y diff(y)
The plots appear in Figure 13.10. The levels series appear to be trending together. The differences may be trending downward ever so slightly. The mean of each difference series appears to be greater than zero, suggesting that at least a constant be included in the ADF regressions. Inclusion of a trend could be tested using a t-test based on the regression output.
The other decision that needs to be made is the number of lagged differences to include in the augmented Dickey-Fuller regressions. The principle to follow is to include just enough so that the residuals of the ADF regression are not autocorrelated. The recommendation is to test down using the --test-down option of the adf command.
1 adf 12 c --ct --test-down --verbose
2 adf 12 y --ct --test-down --verbose
Figure 13.8: Plot of the error correction terms from the vecm 3 1 aus usa where the cointegrating vector is aus = usa. 
After some experimentation, the decision was made to keep the trend in the ADF regressions. The term was significant for both series. The test-down procedure chose 3 lagged differences of c in the first model and 10 lagged differences of y in the second. In both cases, the unit-root hypothesis could not be rejected at 10%. See Figures 13.11 and 13.12.
It is probably a good idea to confirm that the differences are stationary, since a VAR in differences requires this.
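This check can be scripted with the adf command's --difference flag, which (in gretl) runs the test on the first difference of the listed series; a sketch:

    adf 12 c --c --test-down --difference
    adf 12 y --c --test-down --difference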
If c and y are cointegrated, then you would estimate a VECM. The Engle-Granger test reveals that they are not.
Augmented Dickey-Fuller test for uhat
including 5 lags of (1-L)uhat (max was 12)
sample size 194
unit-root null hypothesis: a = 1

  model: (1-L)y = b0 + (a-1)*y(-1) + ... + e
  1st-order autocorrelation coeff. for e: 0.008
  lagged differences: F(5, 188) = 5.028 [0.0002]
  estimated value of (a - 1): -0.0798819
Case 3: Unrestricted constant

Restrictions on beta:
  b1 + b2 = 0

Unrestricted loglikelihood (lu) = -179.93953
Restricted loglikelihood (lr) = -180.13562
2 * (lu - lr) = 0.392178
P(Chi-square(1) > 0.392178) = 0.531157

beta (cointegrating vectors, standard errors in parentheses)

   1.0000   (0.33959)
  -1.0000   (0.33959)

alpha (adjustment vectors)

Figure 13.9: Output from the restricted VECM model. The cointegrating relationship is aus = usa.
  test statistic: tau_c(2) = -2.39489
  asymptotic p-value 0.327
There is evidence for a cointegrating relationship if:
(a) The unit-root hypothesis is not rejected for the individual variables, and
(b) The unit-root hypothesis is rejected for the residuals (uhat) from the cointegrating regression.
The p-value on the test statistic is 0.327. We cannot reject the unit-root hypothesis for the residuals, and therefore the series are not cointegrated. We are safe to estimate the VAR in differences.
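With no cointegration, the VAR in differences can be estimated directly; a minimal sketch (the lag length is settled using the information criteria discussed below):

    var 1 diff(c) diff(y)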
The basic syntax for the var command appears below:

var

Arguments:  order ylist [ ; xlist ]
Options:    --nc (do not include a constant)
            --trend (include a linear trend)
            --seasonals (include seasonal dummy variables)
            --robust (robust standard errors)
            --robust-hac (HAC standard errors)
            --impulse-responses (print impulse responses)
            --variance-decomp (print variance decompositions)
            --lagselect (show information criteria for lag selection)
You specify the lag order, the series to place in the VAR, and any options you want. You can choose HAC standard errors and ways to model deterministic trends in the model. The --lagselect option shows the information criteria for each lag order up to the maximum you specify:

var 12 diff(c) diff(y) --lagselect

We've chosen that option here; the first few lines of the result follow:
VAR system, maximum lag order 12
The asterisks below indicate the best (that is, minimized) values of the respective information criteria, AIC = Akaike criterion,
BIC = Schwarz Bayesian criterion and HQC = Hannan-Quinn criterion.
 lags     loglik      p(LR)      AIC           BIC           HQC

   1    1319.59415             -14.049135   -13.945463*   -14.007127*
   2    1323.61045   0.09039   -14.049310   -13.876523    -13.979296
   3    1329.48171   0.01937   -14.069323*  -13.827422    -13.971305
   4    1333.38145   0.09921   -14.068251   -13.757235    -13.942227

Augmented Dickey-Fuller test for c
including 3 lags of (1-L)c
sample size 196
unit-root null hypothesis: a = 1

  with constant and trend
  model: (1-L)y = b0 + b1*t + (a-1)*y(-1) + ... + e
  1st-order autocorrelation coeff. for e: 0.009
  lagged differences: F(3, 190) = 12.601 [0.0000]
  estimated value of (a - 1): -0.0412939

Augmented Dickey-Fuller regression
OLS, using observations 1961:1-2009:4 (T = 196)
Dependent variable: d_c

AIC: -1428.12   BIC: -1408.46   HQC: -1420.16

Figure 13.11: ADF tests of ln(RPCE)
The BIC (SC) and HQC pick the same number of lags, 1. That is what we've estimated, so we are satisfied. You can also issue a model test command after the VAR to determine whether there is any remaining autocorrelation in the residuals. If there is, you probably need to add additional lags to the VAR. When used here, the Ljung-Box Q statistics for both equations have p-values above 0.10, and the null hypothesis of no autocorrelation is not rejected.
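The residual autocorrelation check can be scripted as follows. This is a sketch assuming gretl's modtest command with the --autocorr option, issued immediately after estimating the VAR (the order, 4 here, sets the maximum lag tested):

    var 1 diff(c) diff(y)
    modtest 4 --autocorr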
The model output is found in Table 13.1.
You can also get gretl to generate the VAR lag selection command through the dialogs. Select Model>Time series>VAR lag selection from the pull-down menu. This reveals the VAR lag selection dialog box. You can choose the maximum lag to consider, the variables to include in the model, and whether the model should contain a constant, trend, or seasonal dummies.
  model: (1-L)y = b0 + b1*t + (a-1)*y(-1) + ... + e
  1st-order autocorrelation coeff. for e: 0.003
  lagged differences: F(10, 176) = 1.354 [0.2056]
  estimated value of (a - 1): -0.0496395
  test statistic: tau_ct(1) = -2.67729
  asymptotic p-value 0.2461

Augmented Dickey-Fuller regression
OLS, using observations 1962:4-2009:4 (T = 189)
Dependent variable: d_y

             coefficient    std. error   t-ratio   p-value
  --------------------------------------------------------
  const       0.391249      0.142240      2.751    0.0066  ***
  y_1        -0.0496395     0.0185409    -2.677    0.2461
  d_y_1       0.0888139     0.0736815     1.205    0.2297
  d_y_2       0.106948      0.0739483     1.446    0.1499
  d_y_3       0.0640443     0.0745454     0.8591   0.3914
  d_y_4       0.0184872     0.0744306     0.2484   0.8041
  d_y_5       0.115714      0.0743093     1.557    0.1212
  d_y_6       0.0403709     0.0760118     0.5311   0.5960
  d_y_7       0.0338480     0.0765387     0.4422   0.6589
  d_y_8       0.0507988     0.0762423     0.6663   0.5061
  d_y_9       0.0997858     0.0759764     1.313    0.1903
  d_y_10      0.134116      0.0762475     1.759    0.0803  *
  time        0.000366905   0.000149227   2.459    0.0149  **

AIC: -1241.27   BIC: -1199.12   HQC: -1224.19

Figure 13.12: ADF tests of ln(RPDI)