Hello!
The case is pretty simple. Say I define a simple SETAR model:
```r
set <- setar(training, m = 1, nthresh = 1, include = "const")
```

and then I try to compute a prediction from it:

```r
> predict(set, n.ahead = 1)
Time Series:
Start = 80
End = 80
Frequency = 1
[1] -1.083009
```
everything works just fine. However, the same code with `include = "none"`:

```r
set <- setar(training, m = 1, nthresh = 1, include = "none")
predict(set, n.ahead = 1)
```

```
Time Series:
Start = 80
End = 80
Frequency = 1
[1] NaN
```
I really don't think the data is at fault (this error has occurred for me multiple times, on different datasets), so unless I'm doing something wrong (am I?), I believe this is a bug that needs some attention.
Hope you have a nice day!
PS: the dataset I'm using is `datasets::lynx`.
I am afraid not, as I have very little bandwidth currently, and the issue is bigger than I thought: the current code uses the old `oneStep`, while it should use `setar.gen`.
If you need the code sooner, you can look at how `predict.TVAR` uses `TVAR.gen` (likewise `predict.VAR` and `VAR.gen`) and adapt it to predict a setar model. All you have to do is feed in the correct initial values and decide which innovations to use.
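Until that fix lands, a one-step-ahead forecast for this particular model (m = 1, nthresh = 1, `include = "none"`) can be computed by hand from the SETAR definition. The sketch below is not tsDyn code: the helper name is made up, and the coefficient names used in the commented usage (`phiL.1`, `phiH.1`, `th`) are assumptions about the fitted-object layout — check `names(coef(set))` on your version.

```r
# Minimal sketch: manual one-step-ahead forecast for a two-regime SETAR
# with m = 1 and no intercepts (include = "none").
# In that model: x_{t+1} = phiL * x_t  if x_t <= th,  else  phiH * x_t.
one_step_setar <- function(phi_low, phi_high, threshold, x_last) {
  if (x_last <= threshold) phi_low * x_last else phi_high * x_last
}

# Hypothetical usage with a fitted object `set` (coefficient names are
# assumptions, verify with names(coef(set))):
# cf <- coef(set)
# one_step_setar(cf["phiL.1"], cf["phiH.1"], cf["th"], tail(training, 1))
```

This only replaces the delta-free point forecast; for multi-step forecasts you would iterate the rule, deciding at each step which innovations (if any) to feed in, as the reply above describes.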