parallel ensembles #615
Conversation
What is the purpose of setting
This is ready for review @berndbischl @larskotthoff. I think there are some wrappers I've missed, but most are done now. Training and prediction are parallelized, new tests added, etc.
You've got some commented code in there that probably shouldn't be pulled in.
Fixed.
Thanks. Looks good to me!
Are there still any problems, or why has this not been merged? I mention parallelization in our multilabel paper (@bernd)...
It was fine then, but I'll have to rebase it.
We will pull in the multilabel PR #977 first, then review here whether the multilabel ensembles also work in parallel.
This will be removed; please disregard.
OK, will rebase ASAP.
- BaggingWrapper
- CostSensRegrWrapper
- HomogeneousEnsemble
- MulticlassWrapper
- MultilabelBinaryRelevanceWrapper
- OverBaggingWrapper
- StackedLearner
This is rebased now. The multilabel tests are the only ones still failing; in particular, the only multilabel problem is the
This is not ready for merge but is here for review. The relevant issue is #603.
Left to do:
CostSensRegrWrapper
MultilabelBinaryRelevanceWrapper
CostSensWeightedPairsWrapper
StackedLearner
Any others? I am currently parallelizing training and prediction using the same parallelization level.
I have written, but haven't yet pushed, tests for all of this.
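To make the idea above concrete: a hedged, minimal sketch of what "parallelizing training and prediction" of an ensemble looks like. This is not mlr's actual implementation (which routes through its parallelization levels); it uses base R's `parallel::mclapply` as a stand-in, and the helper `trainMember` is a hypothetical name invented for this illustration.

```r
library(parallel)

# Hypothetical helper: fit one ensemble member on a bootstrap sample.
# "trainMember" is illustrative only, not part of the mlr API.
trainMember <- function(i, data) {
  idx <- sample(nrow(data), replace = TRUE)
  lm(y ~ x, data = data[idx, ])
}

set.seed(1)
d <- data.frame(x = runif(50), y = runif(50))

# mclapply forks one worker per core on Unix; fall back to serial on Windows.
cores <- if (.Platform$OS.type == "unix") 2L else 1L

# Training: members are independent, so they parallelize trivially.
models <- mclapply(seq_len(10), trainMember, data = d, mc.cores = cores)

# Prediction: each member predicts independently, then results are aggregated.
preds <- rowMeans(sapply(models, predict, newdata = d))
```

Because both the member fits and the member predictions are independent, the same parallelization level can drive both loops, which matches the approach described in the comment above.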