
Fixes meta feature reshuffling bug in StackingCVClassifier #370

Merged
2 commits merged into master from order-meta-features on Apr 20, 2018

Conversation

@rasbt (Owner) commented Apr 19, 2018

Description

Meta features are now saved in order of the original input data regardless of whether shuffling is enabled.
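
For illustration, here is a minimal sketch of the idea behind the fix (not the actual mlxtend implementation): out-of-fold predictions are written back into a preallocated array at each fold's original test indices, so row i of the meta features always corresponds to row i of the input, even when the cross-validation splitter shuffles.

```python
# Minimal sketch (not the actual mlxtend code) of the idea behind the fix:
# out-of-fold predictions are assigned back at each fold's test indices,
# so meta_features[i] always corresponds to X[i], even with shuffle=True.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold

X, y = make_classification(n_samples=100, random_state=0)
base_clf = LogisticRegression()
skf = StratifiedKFold(n_splits=3, shuffle=True, random_state=0)

meta_features = np.zeros((X.shape[0], 2))  # one column per class probability
for train_idx, test_idx in skf.split(X, y):
    base_clf.fit(X[train_idx], y[train_idx])
    # Writing at test_idx (instead of appending fold outputs sequentially)
    # preserves the original row order of the input data.
    meta_features[test_idx] = base_clf.predict_proba(X[test_idx])
```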

Related issues or pull requests

Fixes #366

Pull Request Checklist

  • Added a note about the modification or contribution to the ./docs/sources/CHANGELOG.md file (if applicable)
  • Added appropriate unit test functions in the ./mlxtend/*/tests directories (if applicable)
  • Modified documentation in the corresponding Jupyter Notebook under mlxtend/docs/sources/ (if applicable)
  • Ran nosetests ./mlxtend -sv and made sure that all unit tests pass (for small modifications, it may be sufficient to run only the specific test file, e.g., nosetests ./mlxtend/classifier/tests/test_stacking_cv_classifier.py -sv)
  • Checked for style issues by running flake8 ./mlxtend

@pep8speaks commented Apr 19, 2018

Hello @rasbt! Thanks for updating the PR.

Cheers! There are no PEP8 issues in this Pull Request. 🍻

Comment last updated on April 19, 2018 at 16:02 UTC

@coveralls commented Apr 19, 2018

Coverage Status

Coverage increased (+0.02%) to 91.876% when pulling 943ff62 on order-meta-features into 85a48f5 on master.

@rasbt merged commit 41a5f99 into master on Apr 20, 2018
@rasbt deleted the order-meta-features branch on May 12, 2018

3 participants