Ubuntu 16.x LTS, R latest, modeltime.ensemble latest
A submodels_tbl contains 15 correctly fitted models.
When I pass them to modeltime_fit_resamples(), only a fraction of them (4) show up in the result of that function.
Is there an explanation for this?
Side note: when I use fewer than the 15 models, the code breaks while fitting a glmnet metalearner. It prompts:
x Slice1: preprocessor 1/1, model 1/1: Error: For the glmnet engine, `penalty`
must be a single number (or a value of `tune()`).
... even though the model is correctly tagged with 'penalty = tune::tune()'.
I noticed the same effect with lasso (mixture=1).
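As a workaround sketch (not a fix for the underlying issue), one can pin `penalty` to a concrete value in the metalearner spec so the "must be a single number" check cannot fire. This assumes the metalearner is a plain parsnip glmnet spec; the value 0.01 is an arbitrary placeholder, not a recommendation:

```r
library(parsnip)

# Hypothetical workaround: give glmnet a fixed penalty instead of tune::tune(),
# so the engine-level check on `penalty` is satisfied.
glmnet_meta_spec <- linear_reg(
  penalty = 0.01,  # fixed value instead of tune::tune()
  mixture = 1      # lasso, as in the report above
) %>%
  set_engine("glmnet")
```

This sidesteps tuning entirely, so it only helps to confirm whether the tune() placeholder is what gets lost internally.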
My guess is that the penalty gets lost somewhere in modeltime.ensemble-internal code. I am currently testing different metalearners: an xgboost metalearner seems to work only without xgboost submodels, while the others work fine so far.
It would be nice to have a fallback/try-catch option in modeltime.resample. Otherwise the code breaks in huge projects whenever something fails at this point.
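A minimal sketch of such a fallback, done on the caller side rather than inside the package: fit each submodel's resamples through purrr::safely() and keep only the successes. This assumes a submodels_tbl modeltime table and an rsample `resamples` object are already available, that row-subsetting the table keeps it valid, and that bind_rows is an acceptable naive way to recombine the per-model results:

```r
library(modeltime)
library(modeltime.resample)
library(purrr)
library(dplyr)

# Wrap the fitting call so a single failing model returns an error object
# instead of aborting the whole run.
safe_fit <- safely(modeltime_fit_resamples)

fit_resamples_with_fallback <- function(submodels_tbl, resamples) {
  results <- map(seq_len(nrow(submodels_tbl)), function(i) {
    # Fit one model at a time; errors are captured, not thrown
    safe_fit(submodels_tbl[i, ], resamples = resamples)
  })
  # Keep only the models that fitted without error
  ok <- map_lgl(results, ~ is.null(.x$error))
  bind_rows(map(results[ok], "result"))
}
```

The dropped models could additionally be logged from the `$error` slots, which would also make it visible which of the 15 submodels silently disappear.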
... results in a clickable, ready-to-drill-down view in RStudio.
... compared with this, the result seems to be corrupted and is not clickable. The difference lies in the differing lengths of .resample_results.
What if we could truncate the predictions to the minimum length of .resample_results?
Update: there is more... .predictions = NULL in a glmnet model.
Every failing model is silently lost...
Since we have no influence on how models are treated inside modeltime_fit_resamples(), there is no way to fix it other than hardcoding a workaround.
Error in if (is.numeric(args$mixture) && (args$mixture < 0 | args$mixture > : missing value where TRUE/FALSE needed