

Option to pass orig features to meta regressor in stackingregressor #418

Merged
merged 1 commit into master from stackingclassifier-metafeatures on Jul 20, 2018

Conversation

@rasbt (Owner) commented on Jul 19, 2018

Description

Adds support for merging the meta-features with the original input features in StackingRegressor (via the use_features_in_secondary parameter), as is already supported in the other stacking classes.
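A minimal pure-Python sketch of what this option controls (this is an illustration, not the mlxtend implementation; the helper name build_meta_input is hypothetical): with use_features_in_secondary=False the meta-regressor is trained only on the base models' predictions, while with use_features_in_secondary=True those predictions are concatenated with the original feature columns.

```python
def build_meta_input(X, base_predictions, use_features_in_secondary=False):
    """Assemble the training rows for the meta-regressor.

    X                : list of original feature rows
    base_predictions : per-row list of predictions, one value per
                       base regressor
    """
    if use_features_in_secondary:
        # meta-regressor sees original features + base-model outputs
        return [list(row) + list(preds)
                for row, preds in zip(X, base_predictions)]
    # default: meta-regressor sees only the base-model outputs
    return [list(preds) for preds in base_predictions]


X = [[1.0, 2.0], [3.0, 4.0]]       # 2 samples, 2 original features
preds = [[0.5, 0.7], [1.5, 1.7]]   # outputs of 2 base regressors

meta_only = build_meta_input(X, preds)
combined = build_meta_input(X, preds, use_features_in_secondary=True)

print(len(meta_only[0]))  # 2 columns: base-model outputs only
print(len(combined[0]))   # 4 columns: 2 original features + 2 outputs
```

The extra columns let the meta-regressor correct the base models using the raw inputs, at the cost of a higher-dimensional (and potentially more overfitting-prone) second-level problem.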

Related issues or pull requests

Fixes #418

Pull Request Checklist

  • Added a note about the modification or contribution to the ./docs/sources/CHANGELOG.md file (if applicable)
  • Added appropriate unit test functions in the ./mlxtend/*/tests directories (if applicable)
  • Modified the documentation in the corresponding Jupyter Notebook under mlxtend/docs/sources/ (if applicable)
  • Ran nosetests ./mlxtend -sv and made sure that all unit tests pass (for small modifications, it might be sufficient to only run the specific test file, e.g., nosetests ./mlxtend/classifier/tests/test_stacking_cv_classifier.py -sv)
  • Checked for style issues by running flake8 ./mlxtend

@pep8speaks commented

Hello @rasbt! Thanks for submitting the PR.

Line 69:80: E501 line too long (80 > 79 characters)

@coveralls commented

Coverage Status

Coverage increased (+0.06%) to 91.24% when pulling 877c021 on stackingclassifier-metafeatures into c027a1c on master.

@rasbt rasbt merged commit 655f160 into master Jul 20, 2018
@rasbt rasbt deleted the stackingclassifier-metafeatures branch November 10, 2018 19:05

3 participants