Why so many lags in the VAR?

#1
Hello everyone,

I am currently working on a replication of the article "The Impact of Oil Price Shocks on the US Stock Market" (2009) by Kilian and Park. In this paper, the authors use a VAR with 24 lags on monthly data.

I have used the MATLAB function varorder by Ruey S. Tsay to find the lag length that minimizes the AIC (i.e., the optimal lag length for the VAR model), but it seems that the optimal lag would be around 2 or 3, not 24! (see picture below). The results are even more striking using BIC or HQ.

AIC Screenshot

Do you know what could justify such a long lag length?

NB: the paper investigates impulse response functions and forecast error variance decompositions.
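For anyone who wants to reproduce this kind of lag-order search without Tsay's varorder, here is a minimal sketch in plain numpy (not the original MATLAB code): it fits a VAR(p) by OLS for each candidate p and computes the AIC as ln|Σ̂| + 2k/T, where k is the number of estimated coefficients. Note that each p uses a slightly different effective sample; a careful comparison would condition all fits on the same pmax initial observations.

```python
import numpy as np

def var_aic(data, p):
    """AIC of a VAR(p) with intercept, fit by OLS. data: T x n array."""
    T, n = data.shape
    Y = data[p:]                                  # (T - p) x n dependent block
    X = np.ones((T - p, 1 + n * p))               # constant + p lags of all series
    for i in range(1, p + 1):
        X[:, 1 + n * (i - 1): 1 + n * i] = data[p - i: T - i]
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ B
    Teff = T - p
    Sigma = resid.T @ resid / Teff                # ML residual covariance
    k = n * (1 + n * p)                           # total estimated coefficients
    return np.log(np.linalg.det(Sigma)) + 2 * k / Teff

def select_lag(data, pmax):
    """Return the lag minimizing AIC over 1..pmax, plus the full AIC table."""
    aics = {p: var_aic(data, p) for p in range(1, pmax + 1)}
    return min(aics, key=aics.get), aics
```

On data generated by a low-order VAR, select_lag typically picks a small p, which is the behavior the screenshot above shows for the oil/stock data as well.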
 

noetsi

Fortran must die
#2
They may have had a theory about how many lags should influence the results: that is, something that substantively suggested X or Y from two years in the past would still influence Y now.

What does their article say about that? You can always ask them why they chose that many lags.
 

hlsmith

Not a robit
#3
I'm not familiar with VAR models yet. But I know that in other time series settings you should look at lags covering at least three years if you suspect any seasonality; otherwise you might miss it if one year had an anomaly or if by chance the seasonality was dampened enough not to be noticeable.
 

noetsi

Fortran must die
#5
Hlsmith if you get familiar with them let me know. I have studied them for a while and remain confused :)

You might want to look at "Multiple Time Series Modeling Using the SAS VARMAX Procedure" by Anders Milhoj. Not his best book, but a start.
 
#6
Lutz Kilian answered me; the response can actually be found in his book "Structural Vector Autoregressive Analysis" (see attachment).
 

Attachments