Created TimeRegrTask and started on Arima Learner
NEWS

NEWS: are OK until HERE

xgboost: missing: go back to set it NA in mlr

xgboost: missing: simply use NULL as default

fix xgboost tests (mlr-org#1234)

* fix xgboost tests

* fix more tests

test for xgboost printer

Add support for visualizing tasks with 2 or more hyperparameters (mlr-org#1233)

* Add support for visualizing tasks with 2 or more hyperparameters

* Add tests for partial dependence

* Edit documentation

* Forgot to regenerate documentation

* Fixed checks for using partial dependence and minor style fixes

* Fix typos in argname

* Fix arg name in test

NEWS for mlr-org#1233

remove weight.fun in place of expanded fun in generatePartialDependence (mlr-org#1235)

* remove weight.fun in place of expanded fun in generatePartialDependence

 - internal wrapper for fun arg to allow passing of internal
   newdata (prediction grid) and data (training data from input arg)
   which allows computation of weights in fun instead of via an extra
   step using another arg, weight.fun (now removed)

* fix typo

NEWS for mlr-org#1235

update auto-generated documentation [ci skip]

Update description with mason (mlr-org#1237)

Travis does not work with R-devel; I will open an issue

Added ctb (mlr-org#1242)

* Added Bruno Vieira as ctb.

* Added Bruno Vieira as ctb.

fixes for issue mlr-org#63 in the tutorial (mlr-org#1243)

- incorrect jacobian function in doPartialDerivativeIteration
 - improper fun/fun.wrapper (for weights use)
 - test added based on tutorial fail
 - simplified code a bit

renamed file for consistency

update auto-generated documentation [ci skip]

added the colsample_bylevel parameter in the xgboost learners (mlr-org#1245)

* Update RLearner_classif_xgboost.R

* Update RLearner_regr_xgboost.R

NEWS for mlr-org#1245 and add xgboost version number requirements

forgot space...

ksvm mini tunable fix for hyper par settings (mlr-org#1249)

New measures: Cohen's Kappa and Mean Quadratic Weighted Kappa (mlr-org#1250)

* new measure 'mean quadratic weighted kappa'

* add note for mqwk

* rename objects in test

* rename to wk and fix typo in note

* yet another typo

* rename wk to wkappa

* new_measure_cohens_kappa

* correct measure ranges

NEWS for mlr-org#1250

fixed broken url

listLearners output as S3 class with print (mlr-org#1213)

Make hyperparseffect tests faster with fewer iterations (mlr-org#1260)

Created TimeRegrTask and started on Arima Learner

Added ARMA learner. For now, allowing cl on line 92 of predictLearner (checkPredictLearnerOutput) to be a ts object

Predict added for Arima.

Prediction now returns the response, but the 'truth' variable is NA, since forecasts do not know the true value at the time of forecast

Added new forecast function. Need to figure out why Arima and forecast are not going into the namespace.

Fixed forecast to use holdout set, made mase measure

Updated namespace to import forecast, then use the method for WrappedModel. Not sure if this messes with forecast() in the forecast package.

Created Windowing description functions and started adding Windowing instances

Created fixed and growing window instances, may not work for horizon > 1.

Added window() function, mostly copying resample(). Need to add functions for windowing with aggregation.

Added window level to zzz.R; created checkAggrBeforeWindow, WindowPrediction, and makeWindowPrediction. Fixed growing and fixed windows by using code from caret.

Windowing works for arima, should probably do something about n.ahead and horizon being the same thing.

Imported forecast to resample, no longer need forecast functions or windows

Removed window and forecast functions, removed window from zzz levels

Added skip parameter to growing CV and fixed CV so user does not have to run every iteration.

had to capitalize L in makeRLearner for Arima

Added docs for time components in resample and resampleDesc

Added imports from xts and zoo. Added xreg to Arima.

Added Lag and Difference preprocess wrapper.

Made createLagDiffFeatures a task preprocessor.

Changed names of timeReg to ForecastRegr and timeregr to fcregr.

Testing

Making sure rebase worked.

Updates now pass base tests

Updated prediction from timereg to forecastreg. Updated README with some examples of using forecasting.

Trying to upload caret picture for windowing.

Updated readme with examples.

Updated readme.

Fixed createLagDiffFeatures. But NA's are handled poorly.

added bats, ets, garch, nnetar, tbats, and thief. Not tested yet, but garch works.

garch now works for resampling.

bats, ets, garch, nnetar, tbats are now working. Updated Readme. thief is not working (frowny face)

Made preprocessing wrapper using the LambertW transform

Added LambertW to description suggests and updated the readme.

Updated lag and diff preproc func for seasonal lag and differences. Untested.

Updated lag and diff preproc to have seasonal lags and diffs.

Fixed lag diff preproc to include padding and lag lengths for differencing.

Updated docs for createLagDiffFeatures

Added forecast helper objects and started working on unit test for Arima.

Fixed training for fcregr tasks to only use subsets.

Moved test for bats to testthat.

Added tests for tbats and ets

Added test for createLagDiffFeatures

Moved thief to to-do and implemented arfima with a test.

Added se prediction type to arfima, bats, ets, nnetar, and tbats

Updated merge for Arima prediction.

Added garch unit test.

Added helper objects for forecast unit testing and Arima can now return standard errors at set levels

fixed typo in garch test

Added updateLearner function and updateModel function to update online models.

Added docs for updateModel and built basic test for forecast task. Need to test multiplexer.

Fixed Lambert W and created test for forecast
SteveBronder committed Oct 12, 2016
1 parent 852a14d commit 8996dfe
Showing 82 changed files with 2,274 additions and 201 deletions.
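Taken together, the squashed log above describes a complete forecasting workflow. A rough, hypothetical sketch of how the pieces registered in this commit fit together — `makeForecastRegrTask()`, the `fcregr.Arima` learner, and the `growingcv`/`fixedcv` resampling exports all appear in the diff, but the Arima hyperparameter names (`order`, `h`) are assumptions, not confirmed here:

```r
library(xts)
library(mlr)

# Simulated AR(1) series as an xts object -- makeForecastRegrTask()
# (added in this commit) asserts its data argument is an xts.
y = xts(as.numeric(arima.sim(model = list(ar = 0.5), n = 100)),
        order.by = Sys.Date() - 100:1)
colnames(y) = "target"

task = makeForecastRegrTask(id = "sim", data = y, target = "target",
                            frequency = 1L)

# "fcregr.Arima" is registered in the NAMESPACE diff; the parameter
# names below are assumed for illustration only.
lrn = makeLearner("fcregr.Arima", order = c(1L, 0L, 0L), h = 10L)
mod = train(lrn, task)

# Growing- and fixed-window resampling (growingcv, fixedcv) are also
# exported by this commit for evaluating forecasts out of sample.
```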
10 changes: 8 additions & 2 deletions DESCRIPTION
@@ -48,7 +48,10 @@ Imports:
     parallelMap (>= 1.3),
     shiny,
     survival,
-    utils
+    utils,
+    xts,
+    lubridate,
+    zoo
 Suggests:
     ada,
     adabag,
@@ -145,7 +148,10 @@ Suggests:
     tgp,
     TH.data,
     xgboost (>= 0.4-4),
-    XML
+    XML,
+    forecast,
+    rugarch,
+    LambertW
 LazyData: yes
 ByteCompile: yes
 Version: 2.10
44 changes: 44 additions & 0 deletions NAMESPACE
@@ -8,6 +8,8 @@ S3method(capLargeValues,Task)
S3method(capLargeValues,data.frame)
S3method(createDummyFeatures,Task)
S3method(createDummyFeatures,data.frame)
S3method(createLagDiffFeatures,TimeTask)
S3method(createLagDiffFeatures,xts)
S3method(downsample,ResampleInstance)
S3method(downsample,Task)
S3method(estimateRelativeOverfitting,ResampleDesc)
@@ -104,6 +106,7 @@ S3method(listMeasures,default)
S3method(makePrediction,TaskDescClassif)
S3method(makePrediction,TaskDescCluster)
S3method(makePrediction,TaskDescCostSens)
S3method(makePrediction,TaskDescForecastRegr)
S3method(makePrediction,TaskDescMultilabel)
S3method(makePrediction,TaskDescRegr)
S3method(makePrediction,TaskDescSurv)
@@ -198,6 +201,13 @@ S3method(makeRLearner,cluster.cmeans)
S3method(makeRLearner,cluster.dbscan)
S3method(makeRLearner,cluster.kkmeans)
S3method(makeRLearner,cluster.kmeans)
S3method(makeRLearner,fcregr.Arima)
S3method(makeRLearner,fcregr.arfima)
S3method(makeRLearner,fcregr.bats)
S3method(makeRLearner,fcregr.ets)
S3method(makeRLearner,fcregr.garch)
S3method(makeRLearner,fcregr.nnetar)
S3method(makeRLearner,fcregr.tbats)
S3method(makeRLearner,multilabel.cforest)
S3method(makeRLearner,multilabel.rFerns)
S3method(makeRLearner,multilabel.randomForestSRC)
@@ -398,6 +408,13 @@ S3method(predictLearner,cluster.cmeans)
S3method(predictLearner,cluster.dbscan)
S3method(predictLearner,cluster.kkmeans)
S3method(predictLearner,cluster.kmeans)
S3method(predictLearner,fcregr.Arima)
S3method(predictLearner,fcregr.arfima)
S3method(predictLearner,fcregr.bats)
S3method(predictLearner,fcregr.ets)
S3method(predictLearner,fcregr.garch)
S3method(predictLearner,fcregr.nnetar)
S3method(predictLearner,fcregr.tbats)
S3method(predictLearner,multilabel.cforest)
S3method(predictLearner,multilabel.rFerns)
S3method(predictLearner,multilabel.randomForestSRC)
@@ -491,7 +508,9 @@ S3method(print,FeatSelResult)
S3method(print,FeatureImportance)
S3method(print,Filter)
S3method(print,FilterValues)
S3method(print,FixedCVDesc)
S3method(print,FunctionalANOVAData)
S3method(print,GrowingCVDesc)
S3method(print,HoldoutDesc)
S3method(print,HyperParsEffectData)
S3method(print,ImputationDesc)
@@ -512,6 +531,7 @@ S3method(print,ResampleResult)
S3method(print,SubsampleDesc)
S3method(print,SupervisedTask)
S3method(print,Task)
S3method(print,TimeTask)
S3method(print,TuneControl)
S3method(print,TuneMultiCritControl)
S3method(print,TuneMultiCritResult)
@@ -651,6 +671,13 @@ S3method(trainLearner,cluster.cmeans)
S3method(trainLearner,cluster.dbscan)
S3method(trainLearner,cluster.kkmeans)
S3method(trainLearner,cluster.kmeans)
S3method(trainLearner,fcregr.Arima)
S3method(trainLearner,fcregr.arfima)
S3method(trainLearner,fcregr.bats)
S3method(trainLearner,fcregr.ets)
S3method(trainLearner,fcregr.garch)
S3method(trainLearner,fcregr.nnetar)
S3method(trainLearner,fcregr.tbats)
S3method(trainLearner,multilabel.cforest)
S3method(trainLearner,multilabel.rFerns)
S3method(trainLearner,multilabel.randomForestSRC)
@@ -754,6 +781,7 @@ export(configureMlr)
export(convertBMRToRankMatrix)
export(convertMLBenchObjToTask)
export(createDummyFeatures)
export(createLagDiffFeatures)
export(crossval)
export(cv10)
export(cv2)
@@ -770,6 +798,7 @@ export(f1)
export(fdr)
export(featperc)
export(filterFeatures)
export(fixedcv)
export(fn)
export(fnr)
export(fp)
@@ -844,6 +873,7 @@ export(getTaskType)
export(getTuneResult)
export(gmean)
export(gpr)
export(growingcv)
export(hasLearnerProperties)
export(hasProperties)
export(holdout)
@@ -889,6 +919,7 @@ export(makeFeatSelWrapper)
export(makeFilter)
export(makeFilterWrapper)
export(makeFixedHoldoutInstance)
export(makeForecastRegrTask)
export(makeImputeMethod)
export(makeImputeWrapper)
export(makeLearner)
@@ -908,6 +939,7 @@ export(makeOversampleWrapper)
export(makePrediction)
export(makePreprocWrapper)
export(makePreprocWrapperCaret)
export(makePreprocWrapperLambert)
export(makeRLearner)
export(makeRLearnerClassif)
export(makeRLearnerCluster)
@@ -936,6 +968,7 @@ export(makeUndersampleWrapper)
export(makeWeightedClassesWrapper)
export(makeWrappedModel)
export(mape)
export(mase)
export(mcc)
export(mcp)
export(meancosts)
@@ -1090,6 +1123,9 @@ export(tuneParams)
export(tuneParamsMultiCrit)
export(tuneThreshold)
export(undersample)
export(updateLearner)
export(updateLearner2)
export(updateModel)
export(wkappa)
import(BBmisc)
import(ParamHelpers)
@@ -1109,6 +1145,7 @@ importFrom(ggvis,layer_points)
importFrom(ggvis,layer_ribbons)
importFrom(ggvis,prop)
importFrom(graphics,hist)
importFrom(lubridate,is.POSIXt)
importFrom(shiny,headerPanel)
importFrom(shiny,mainPanel)
importFrom(shiny,pageWithSidebar)
@@ -1132,4 +1169,11 @@ importFrom(utils,head)
importFrom(utils,methods)
importFrom(utils,tail)
importFrom(utils,type.convert)
importFrom(xts,diff.xts)
importFrom(xts,lag.xts)
importFrom(xts,reclass)
importFrom(xts,try.xts)
importFrom(xts,xts)
importFrom(zoo,coredata)
importFrom(zoo,index)
useDynLib(mlr,c_smote)
38 changes: 38 additions & 0 deletions R/ForecastRegrTask.R
@@ -0,0 +1,38 @@
#' @export
#' @rdname Task
#' @importFrom zoo index coredata
makeForecastRegrTask = function(id = deparse(substitute(data)), data, target,
weights = NULL, blocking = NULL, fixup.data = "warn",
check.data = TRUE, frequency = 1L) {
assertString(id)
assertClass(data,"xts")
assertString(target)
assertInteger(frequency, lower = 0L, max.len = 1L)
assertChoice(fixup.data, choices = c("no", "quiet", "warn"))
assertFlag(check.data)

data <- data.frame(dates = index(data), coredata(data))

if (fixup.data != "no") {
if (is.integer(data[[target]]))
data[[target]] = as.double(data[[target]])
}

task = makeSupervisedTask("regr", data, target, weights, blocking, fixup.data = fixup.data, check.data = check.data)

if (check.data) {
assertNumeric(data[[target]], any.missing = FALSE, finite = TRUE, .var.name = target)
}

task$task.desc = makeTaskDesc.ForecastRegrTask(task, id, target, frequency)
addClasses(task, c("ForecastRegrTask","TimeTask"))
}

makeTaskDesc.ForecastRegrTask = function(task, id, target, frequency) {
td = makeTaskDescInternal(task, "regr", id, target, time = TRUE)
td$dates = c(task$env$data$dates[1],task$env$data$dates[nrow(task$env$data)])
if (missing(frequency))
frequency = task$task.desc$frequency
td$frequency = frequency
addClasses(td, c("TaskDescForecastRegr", "TaskDescSupervised"))
}
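A minimal sketch (with made-up data) of constructing the task defined above and inspecting what `makeTaskDesc.ForecastRegrTask()` stores on the task description:

```r
library(xts)
library(mlr)

# 50 days of a simulated price series; makeForecastRegrTask() requires xts.
prices = xts(cumsum(rnorm(50)), order.by = as.Date("2016-01-01") + 0:49)
colnames(prices) = "price"

# frequency must be an integer to pass assertInteger() in the code above.
tsk = makeForecastRegrTask(id = "prices", data = prices, target = "price",
                           frequency = 7L)

tsk$task.desc$dates      # first and last dates of the series
tsk$task.desc$frequency  # the frequency passed in (7)
```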
38 changes: 38 additions & 0 deletions R/ForecastRegrTask.R~HEAD
@@ -0,0 +1,38 @@
#' @export
#' @rdname Task
#' @importFrom zoo index coredata
makeForecastRegrTask = function(id = deparse(substitute(data)), data, target,
weights = NULL, blocking = NULL, fixup.data = "warn",
check.data = TRUE, frequency = 1L) {
assertString(id)
assertClass(data,"xts")
assertString(target)
assertInteger(frequency, lower = 0L, max.len = 1L)
assertChoice(fixup.data, choices = c("no", "quiet", "warn"))
assertFlag(check.data)

data <- data.frame(dates = index(data), coredata(data))

if (fixup.data != "no") {
if (is.integer(data[[target]]))
data[[target]] = as.double(data[[target]])
}

task = makeSupervisedTask("regr", data, target, weights, blocking, fixup.data = fixup.data, check.data = check.data)

if (check.data) {
assertNumeric(data[[target]], any.missing = FALSE, finite = TRUE, .var.name = target)
}

task$task.desc = makeTaskDesc.ForecastRegrTask(task, id, target, frequency)
addClasses(task, c("ForecastRegrTask","TimeTask"))
}

makeTaskDesc.ForecastRegrTask = function(task, id, target, frequency) {
td = makeTaskDescInternal(task, "regr", id, target, time = TRUE)
td$dates = c(task$env$data$dates[1],task$env$data$dates[nrow(task$env$data)])
if (missing(frequency))
frequency = task$task.desc$frequency
td$frequency = frequency
addClasses(td, c("TaskDescForecastRegr", "TaskDescSupervised"))
}
81 changes: 81 additions & 0 deletions R/LambertWFeaturesWrapper.R
@@ -0,0 +1,81 @@
#' @title Gaussianize Numerical Data with a Lambert W Transform
#'
#' @description
#' Takes numeric data and "Gaussianizes" it using an h, hh, or s Lambert W transform.
#'
#' @template arg_learner
#' @param type [\code{character(1)}] \cr
#' What type of non-normality: symmetric heavy-tails "h" (default),
#' skewed heavy-tails "hh", or just skewed "s".
#' @param methods [\code{character(1)}] \cr
#' What estimator should be used: "MLE" or "IGMM". "IGMM" gives exactly
#' Gaussian characteristics (kurtosis = 3 for "h" or skewness = 0 for "s"),
#' "MLE" comes close to this. Default: "IGMM" since it is much faster than "MLE".
#' @param verbose [\code{logical(1)}] \cr
#' If TRUE, it prints out progress information in the console. Default: FALSE.
#' @param target.proc [\code{logical(1)}]\cr
#' If TRUE, applies LambertW transform to target variable.
#' @export
#' @family wrapper
#' @template ret_learner
makePreprocWrapperLambert = function(learner, target.proc = FALSE, type = c("h", "hh", "s"),
methods = c("IGMM", "MLE"), verbose = FALSE) {
type = match.arg(type)
methods = match.arg(methods)
learner = checkLearner(learner)
args = list(target.proc = target.proc, type = type, methods = methods, verbose = verbose)
rm(list = names(args))

trainfun = function(data, target, args) {
cns = colnames(data)
if (args$target.proc){
nums = cns[sapply(data, is.numeric)]
} else {
nums = setdiff(cns[sapply(data, is.numeric)], target)
}
x = as.matrix(data[, nums, drop = FALSE])

x = LambertW::Gaussianize(x, type = args$type, method = args$methods, verbose = args$verbose,
return.tau.mat = TRUE)
control = args
control$tau.mat = x$tau.mat

x = as.data.frame(x$input)
colnames(x) = nums
data = data[, setdiff(cns, nums), drop = FALSE]
data = cbind(data, x)
return(list(data = data, control = control))
}
predictfun = function(data, target, args, control) {
cns = colnames(data)
if (args$target.proc){
nums = cns[sapply(data, is.numeric)]
} else {
nums = setdiff(cns[sapply(data, is.numeric)], target)
}
x = as.matrix(data[, nums, drop = FALSE])
x = LambertW::Gaussianize(x, type = control$type, method = control$methods,
verbose = control$verbose, tau.mat = control$tau.mat)
x = as.data.frame(x)
colnames(x) = nums
data = data[, setdiff(cns, nums), drop = FALSE]
data = cbind(data, x)
return(data)
}

lrn = makePreprocWrapper(
learner,
train = trainfun,
predict = predictfun,
par.set = makeParamSet(
makeDiscreteLearnerParam("type", values = c("h", "hh", "s"), default = "h"),
makeLogicalLearnerParam("target.proc", default = FALSE),
makeDiscreteLearnerParam("methods", values = c("IGMM", "MLE"), default = "IGMM"),
makeLogicalLearnerParam("verbose", default = FALSE, tunable = FALSE)
),
par.vals = args
)
addClasses(lrn, "PreprocWrapperLambert")
}
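A hedged usage sketch for the wrapper above, using a stock mlr learner and the Boston housing task (`bh.task`) shipped with mlr; the performance call is illustrative only:

```r
library(mlr)

lrn = makeLearner("regr.lm")
# Gaussianize numeric features with a symmetric heavy-tail ("h") transform
# before fitting; the target is left untouched (target.proc = FALSE).
lrn.lw = makePreprocWrapperLambert(lrn, type = "h", methods = "IGMM")

mod  = train(lrn.lw, bh.task)
pred = predict(mod, bh.task)
performance(pred)  # default regression measure
```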


5 changes: 3 additions & 2 deletions R/Learner_properties.R
@@ -65,10 +65,11 @@ getSupportedLearnerProperties = function(type = NA_character_) {
 p = list(
   classif = c("numerics", "factors", "ordered", "missings", "weights", "prob", "oneclass", "twoclass", "multiclass", "class.weights", "featimp"),
   multilabel = c("numerics", "factors", "ordered", "missings", "weights", "prob", "oneclass", "twoclass", "multiclass"),
-  regr = c("numerics", "factors", "ordered", "missings", "weights", "se", "featimp"),
+  regr = c("numerics", "factors", "ordered", "missings", "weights", "se", "featimp", "ts"),
   cluster = c("numerics", "factors", "ordered", "missings", "weights", "prob"),
   surv = c("numerics", "factors", "ordered", "missings", "weights", "prob", "lcens", "rcens", "icens", "featimp"),
-  costsens = c("numerics", "factors", "ordered", "missings", "weights", "prob", "twoclass", "multiclass")
+  costsens = c("numerics", "factors", "ordered", "missings", "weights", "prob", "twoclass", "multiclass"),
+  fcregr = c("numerics", "factors", "ordered", "missings", "weights", "se", "featimp", "ts")
 )
 if (is.na(type))
   unique(unlist(p))
26 changes: 26 additions & 0 deletions R/Prediction.R
@@ -36,6 +36,9 @@ makePrediction = function(task.desc, row.names, id, truth, predict.type, predict
UseMethod("makePrediction")
}




#' @export
makePrediction.TaskDescRegr = function(task.desc, row.names, id, truth, predict.type, predict.threshold = NULL, y, time, error = NA_character_) {
data = namedList(c("id", "truth", "response", "se"))
@@ -188,6 +191,29 @@ makePrediction.TaskDescCostSens = function(task.desc, row.names, id, truth, pred
)
}

#' @export
makePrediction.TaskDescForecastRegr = function(task.desc, row.names, id, truth, predict.type, predict.threshold = NULL, y, time, error = NA_character_) {
data = namedList(c("id", "truth", "response", "se"))
data$id = id
data$truth = truth
if (predict.type == "response") {
data$response = y
} else {
y = as.data.frame(y)
data$response = y[, 1L, drop = FALSE]
data$se = y[, 2L:I(ncol(y)), drop = FALSE]
}

makeS3Obj(c("PredictionRegr", "Prediction"),
predict.type = predict.type,
data = setRowNames(as.data.frame(filterNull(data), row.names = NULL), row.names),
threshold = NA_real_,
task.desc = task.desc,
time = time,
error = error
)
}

#' @export
print.Prediction = function(x, ...) {
catf("Prediction: %i observations", nrow(x$data))