Add stacked ensembling to AutoML documentation #1425
Conversation
Codecov Report
@@           Coverage Diff           @@
##             main    #1425   +/-  ##
=======================================
  Coverage   100.0%   100.0%
=======================================
  Files         223      223
  Lines       14930    14930
=======================================
  Hits        14923    14923
  Misses          7        7
=======================================

Continue to review the full report at Codecov.
LGTM, just some comments!
@angela97lin This looks good to me! This is out of scope for this PR, but this sentence might be confusing to users, because I don't think we ever explain how the batching in AutoML works:
The stacking ensemble pipeline runs in its own batch after a whole cycle of training has occurred (each allowed pipeline trains for one batch). Note that this means a large number of iterations may need to run before the stacking ensemble runs.
One thing we can do is add something to the AutoML printout like this if the number of iterations selected by the user is too small:
Ensembling set to true but the number of iterations is too small
for ensembling to run. Set it to at least ...
and
Ensembling is set to True and max_iterations is set to ...
so ensembling will run in batches 4, 8, 12
That way users are not so confused about whether or not ensembling will run. Like I said, out of scope of this PR, but I think this would be a useful enhancement in the future!
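To make the suggestion concrete, here is a rough sketch of what such a check could look like. It assumes the batching model described above (each allowed pipeline trains for one batch per cycle, then the stacking ensemble runs as one extra iteration in its own batch); the function names and the exact arithmetic are illustrative, not EvalML's actual API.

```python
def first_ensemble_iteration(n_pipelines: int) -> int:
    """Iteration at which the first ensemble batch would run, assuming one
    full cycle = each allowed pipeline trains once, plus one ensemble run."""
    return n_pipelines + 1


def ensembling_message(n_pipelines: int, max_iterations: int) -> str:
    """Build a printout explaining whether and when ensembling will run."""
    needed = first_ensemble_iteration(n_pipelines)
    if max_iterations < needed:
        return (
            f"Ensembling is set to True but max_iterations={max_iterations} is too "
            f"small for ensembling to run. Set it to at least {needed}."
        )
    # The ensemble repeats after every full cycle of pipeline batches.
    ensemble_iters = list(range(needed, max_iterations + 1, needed))
    return (
        f"Ensembling is set to True and max_iterations={max_iterations}, "
        f"so ensembling will run at iterations {ensemble_iters}."
    )
```

For example, with 3 allowed pipelines and `max_iterations=12`, this would report ensembling at iterations 4, 8, and 12, matching the hypothetical printout above; with `max_iterations=3` it would emit the "too small" warning instead.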
LGTM! Just left one comment, but it's not blocking.
@angela97lin looks good to me!
I only had one change request, which was to use max_batches=5 (or 4? I forget) instead of max_iterations=20.
@freddyaboulton Great suggestion!! I filed #1461 to track this.
Closes #1329
Docs here