I need to run stationarity tests for multiple time series. I use the ur.df function from the urca package to run them, and I store the output of each ADF test in a list of lists, since each output is itself a list.
I need to be able to store all of the parameters from each output in a data frame. Is there a way to do it?
I know I can extract some of the parameters, such as @teststat and @cval, but how do I get all of the parameters out, the way broom does with lm regression output?
For example, here is a call to the ur.df function:
test1 <- ur.df(usage_1601_1612, type = "none", lags = 1, selectlags = "AIC")
The contents of test1 are shown below:
Test regression none

Call:
lm(formula = z.diff ~ z.lag.1 - 1 + z.diff.lag)

Residuals:
    Min      1Q  Median      3Q     Max
-6093.2 -1385.8  -100.9  1414.3  6962.8

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
z.lag.1    -0.004212   0.005191  -0.811   0.4177
z.diff.lag -0.126685   0.052161  -2.429   0.0156 *
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 2351 on 362 degrees of freedom
Multiple R-squared:  0.01838, Adjusted R-squared:  0.01296
F-statistic:  3.39 on 2 and 362 DF,  p-value: 0.03479

Value of test-statistic is: -0.8114

Critical values for test statistics:
      1pct  5pct 10pct
tau1 -2.58 -1.95 -1.62
This is the general code that I run:
urresultorigobjects <- lapply(usagextsobjects, function(x) {
  summary(ur.df(x, type = "none", lags = 1, selectlags = "AIC"))
})
This generates the list urresultorigobjects, containing the output of the stationarity test for each xts object in the list usagextsobjects.
I would like to capture all of this output and store the relevant information in a data frame so it can be referenced in code for downstream processing.
Any suggestions would be greatly appreciated.
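To show the kind of piecemeal slot extraction I already know how to do, here is a minimal runnable example; the simulated random-walk series is just a stand-in for my real xts data:

```r
library(urca)

# Simulated stand-in for one of my real series
set.seed(1)
x <- cumsum(rnorm(200))

test1 <- ur.df(x, type = "none", lags = 1, selectlags = "AIC")

# Individual pieces can be pulled out of the S4 object with @
test1@teststat   # 1 x 1 matrix holding the tau1 test statistic
test1@cval       # matrix of 1pct / 5pct / 10pct critical values
test1@testreg    # the underlying summary.lm of the test regression
```

What I am missing is a single tidy()-style call that returns all of these at once.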
I don't know if there is an easier way, but I managed it like this:
(ur.df(valdata, type = "none", lags = 0))@testreg[["coefficients"]][, "Pr(>|t|)"]
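Extending that idea to a whole list of series, here is a sketch that collects the test statistic, the critical values, and the z.lag.1 p-value into one data frame with a row per series. The list contents here are simulated placeholders for the real xts objects:

```r
library(urca)

# Simulated stand-ins for the real xts objects
set.seed(42)
usagextsobjects <- list(series_a = cumsum(rnorm(200)),  # random walk
                        series_b = rnorm(200))          # white noise

# Reduce one ur.df fit to a single-row data frame
adf_to_row <- function(x) {
  fit <- ur.df(x, type = "none", lags = 1, selectlags = "AIC")
  data.frame(
    teststat   = fit@teststat[1, 1],                # tau1 statistic
    cval_1pct  = fit@cval[1, "1pct"],
    cval_5pct  = fit@cval[1, "5pct"],
    cval_10pct = fit@cval[1, "10pct"],
    p_zlag1    = fit@testreg$coefficients["z.lag.1", "Pr(>|t|)"]
  )
}

adf_results <- do.call(rbind, lapply(usagextsobjects, adf_to_row))
adf_results   # one row per series, rownames taken from the list names
```

Any other slot of the ur.df object (see its class documentation) can be added as another column in adf_to_row the same way.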