I use [TS] varsoc to obtain the optimal lag length for a Granger causality test. The command reports the optimal number of lags according to several criteria, such as Akaike's information criterion (AIC).
Is there any way to store the AIC-selected lag in a macro and pass it to the next command when estimating causality? Something like this:
Lag= varsoc X Y
tvgc X Y, p(Lag) d(Lag) trend window(30) prefix(_) graph
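A minimal sketch of one possible approach: varsoc leaves its lag-order statistics in the returned matrix r(stats), with one row per lag (starting at lag 0) and one column per criterion, so the AIC-minimizing lag can be found by looping over that matrix and stored in a local macro. The column name "AIC" and the row layout should be verified on your own installation (inspect them with "return list" and "matrix list r(stats)" after running varsoc); tvgc is a community-contributed command, so check its help file for the exact syntax of p() and d().

* Sketch: extract the AIC-minimizing lag from varsoc's results.
* Assumes r(stats) has rows for lags 0..maxlag and an "AIC" column.
varsoc X Y
matrix S = r(stats)
local best = 0
local bestaic = S[1, colnumb(S, "AIC")]
forvalues i = 2/`= rowsof(S)' {
    local aic = S[`i', colnumb(S, "AIC")]
    if `aic' < `bestaic' {
        local bestaic = `aic'
        local best = `i' - 1    // row 1 corresponds to lag 0
    }
}
display "AIC-selected lag: `best'"
tvgc X Y, p(`best') d(`best') trend window(30) prefix(_) graph

This avoids hard-coding the lag: the local macro `best' is expanded inside p() and d() exactly as a literal number would be.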