Add XGBoost Regressor and Regression Pipeline #666
Conversation
…28_xgboost_reg
Codecov Report

@@            Coverage Diff            @@
##           master     #666     +/-  ##
=========================================
+ Coverage   99.02%   99.04%   +0.02%
=========================================
  Files         135      139       +4
  Lines        4709     4810     +101
=========================================
+ Hits         4663     4764     +101
  Misses         46       46

Continue to review full report at Codecov.
SEED_MIN = -2**31
SEED_MAX = 2**31 - 1

def __init__(self, eta=0.1, max_depth=3, min_child_weight=1, n_estimators=100, random_state=0):
Looks good. Are there any regression-specific params we should add now? I'm excited to do some tuning of this later lol
From what I can see and understand, the parameters between classification and regression should be the same! It has been a while since I've looked at boosted trees though 😄
Yeah that's what I think too. This is great for now. Thanks!
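For reference, here is a minimal sketch of how a component with the signature shown above might forward its hyperparameters to xgboost.XGBRegressor. The class name, seed clamping, and fit/predict shape are assumptions based on the visible diff, not the PR's actual implementation:

```python
import xgboost as xgb

# 32-bit seed bounds, mirroring the constants shown in the diff.
SEED_MIN = -2**31
SEED_MAX = 2**31 - 1


class XGBoostRegressorSketch:
    """Hypothetical wrapper mirroring the __init__ signature in the diff."""

    def __init__(self, eta=0.1, max_depth=3, min_child_weight=1, n_estimators=100, random_state=0):
        # Clamp the seed into the 32-bit range xgboost accepts.
        seed = min(max(random_state, SEED_MIN), SEED_MAX)
        self._regressor = xgb.XGBRegressor(
            learning_rate=eta,  # xgboost exposes eta as learning_rate
            max_depth=max_depth,
            min_child_weight=min_child_weight,
            n_estimators=n_estimators,
            random_state=seed,
        )

    def fit(self, X, y):
        self._regressor.fit(X, y)
        return self

    def predict(self, X):
        return self._regressor.predict(X)
```

As noted in the thread, these hyperparameters are shared with the classifier: xgboost.XGBClassifier accepts the same eta/max_depth/min_child_weight/n_estimators arguments, so no regression-specific parameters are needed here.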
@@ -59,6 +60,7 @@ def all_components():
        import_or_raise("xgboost", error_msg="XGBoost not installed.")
    except ImportError:
        components.pop(XGBoostClassifier.name)
        components.pop(XGBoostRegressor.name)
Thank you!! I can't wait to delete all this code... :)
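For context, a rough sketch of the optional-dependency pattern this hunk extends: the component registry drops both XGBoost components when the library is missing. The component-name keys and the all_components signature below are illustrative assumptions, not the PR's exact code:

```python
import importlib


def import_or_raise(library, error_msg):
    # Import an optional dependency, raising ImportError with a helpful
    # message if it is not installed.
    try:
        return importlib.import_module(library)
    except ImportError:
        raise ImportError(error_msg)


def all_components(components):
    # If xgboost is unavailable, remove both XGBoost components from the
    # registry so the rest of the library still works without it.
    try:
        import_or_raise("xgboost", error_msg="XGBoost not installed.")
    except ImportError:
        components.pop("XGBoost Classifier", None)
        components.pop("XGBoost Regressor", None)
    return components
```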
def __init__(self, parameters, random_state=0):
    super().__init__(parameters=parameters,
                     random_state=random_state)
Is this __init__ necessary?
nope! Will remove.
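To spell out why it can be removed (PipelineBase here is a hypothetical stand-in for the actual base class): an __init__ that only forwards its arguments to super() unchanged adds nothing, because Python falls back to the base-class __init__ when a subclass defines none.

```python
class PipelineBase:
    # Hypothetical base class holding shared pipeline state.
    def __init__(self, parameters, random_state=0):
        self.parameters = parameters
        self.random_state = random_state


class VerbosePipeline(PipelineBase):
    # Redundant: only forwards its arguments unchanged.
    def __init__(self, parameters, random_state=0):
        super().__init__(parameters=parameters,
                         random_state=random_state)


class LeanPipeline(PipelineBase):
    # Equivalent behavior with no __init__ defined: PipelineBase.__init__
    # is used automatically.
    pass


assert LeanPipeline({"eta": 0.1}).parameters == VerbosePipeline({"eta": 0.1}).parameters
```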
I haven't read up on this but I think this is what Steve was talking about in the GitHub workshop last week: @jeremyliweishih you did a
@dsherry Not sure what you're referring to, but this was just renaming a file
@jeremyliweishih lol yeah that's what I was asking. The question was totally not related to this PR 😂
Good stuff!! ⛴️
@jeremyliweishih attach this PR to #528?
Fixes #528.