Renames feature_importances and related methods to singular #883
Conversation
Codecov Report
@@           Coverage Diff           @@
##           master     #883   +/-   ##
=======================================
  Coverage   99.75%   99.75%
=======================================
  Files         195      195
  Lines        8505     8505
=======================================
  Hits         8484     8484
  Misses         21       21
Continue to review full report at Codecov.
@@ -39,6 +41,7 @@ Changelog
 **Breaking Changes**
     * Pipelines' static ``component_graph`` field must contain either ``ComponentBase`` subclasses or ``str``, instead of ``ComponentBase`` subclass instances :pr:`850`
     * Rename ``handle_component`` to ``handle_component_class``. Now standardizes to ``ComponentBase`` subclasses instead of ``ComponentBase`` subclass instances :pr:`850`
+    * Pipelines' and classifiers' ``feature_importances`` is renamed ``feature_importance``, ``graph_feature_importances`` is renamed ``graph_feature_importance`` :pr:`883`
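For anyone updating calling code for the rename above, a minimal before/after sketch (the fitted `pipeline` object and its setup are assumed, not part of this diff):

```python
# Assumes a fitted evalml pipeline object named `pipeline` (hypothetical setup).

# Before this change (plural names):
#   importances = pipeline.feature_importances
#   pipeline.graph_feature_importances()

# After this change (singular names):
importance = pipeline.feature_importance
pipeline.graph_feature_importance()
```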
Thank you!
-    def feature_importances(self):
-        """Returns feature importances. Since baseline regressors do not use input features to calculate predictions, returns an array of zeroes.
+    def feature_importance(self):
+        """Returns importance associated with each feature. Since baseline regressors do not use input features to calculate predictions, returns an array of zeroes.
👍
@@ -29,5 +29,5 @@ def __init__(self, n_estimators=100, max_depth=6, n_jobs=-1, random_state=0, **k
                          random_state=random_state)

     @property
-    def feature_importances(self):
+    def feature_importance(self):
         return self._component_obj.feature_importances_
I see xgboost and many sklearn models use "feature_importances_" and catboost uses "get_feature_importance". Funny to see the differences. I like our rename.
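To illustrate the point about differing upstream conventions, here is a hypothetical wrapper (not evalml's actual implementation; `ComponentSketch` and its attribute lookup are assumptions) that exposes a single `feature_importance` property regardless of the underlying library:

```python
class ComponentSketch:
    """Illustrative wrapper around a fitted estimator from another library."""

    def __init__(self, component_obj):
        self._component_obj = component_obj

    @property
    def feature_importance(self):
        obj = self._component_obj
        if hasattr(obj, "feature_importances_"):    # sklearn / xgboost convention
            return obj.feature_importances_
        if hasattr(obj, "get_feature_importance"):  # catboost convention
            return obj.get_feature_importance()
        raise AttributeError("Wrapped estimator exposes no feature importance")
```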
Thanks for jumping on this! 🚢
Closes #868