
fix: default parameter value inconsistent from doc #711

merged 4 commits on Jun 8, 2020


@zerolfx zerolfx commented May 26, 2020

What does this implement/fix? Explain your changes.

The default value of `min_samples_leaf` in the `BalancedRandomForestClassifier` constructor is 2, but the documentation states that the default value is 1.

Any other comments?

The default value of this parameter in scikit-learn's `RandomForestClassifier` is also 1.
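The kind of mismatch this PR fixes can be caught programmatically by reading the default straight from the `__init__` signature with `inspect` and comparing it against the documented value. A minimal sketch, using a hypothetical toy estimator (not the real `BalancedRandomForestClassifier`) to illustrate the check:

```python
import inspect


class BalancedForest:
    """Toy estimator illustrating a docstring/constructor mismatch.

    Parameters
    ----------
    min_samples_leaf : int, default=1
        Minimum number of samples required at a leaf node.
    """

    def __init__(self, min_samples_leaf=1):  # was 2 before the fix
        self.min_samples_leaf = min_samples_leaf


def signature_default(cls, param):
    """Read a parameter's default value from the class's __init__ signature."""
    return inspect.signature(cls.__init__).parameters[param].default


# The signature default should agree with the documented default of 1.
print(signature_default(BalancedForest, "min_samples_leaf"))  # -> 1
```

Projects such as scikit-learn run checks along these lines over every public estimator so that documented and actual defaults cannot silently drift apart.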

@glemaitre glemaitre self-assigned this Jun 8, 2020

@glemaitre glemaitre left a comment

LGTM thanks for catching this.


codecov bot commented Jun 8, 2020

Codecov Report

Merging #711 into master will not change coverage.
The diff coverage is n/a.


@@           Coverage Diff           @@
##           master     #711   +/-   ##
  Coverage   96.37%   96.37%           
  Files          82       82           
  Lines        4989     4989           
  Hits         4808     4808           
  Misses        181      181           
Impacted Files Coverage Δ
imblearn/ensemble/ 97.26% <ø> (ø)
imblearn/ensemble/tests/ 100.00% <ø> (ø)
imblearn/over_sampling/ 97.22% <ø> (ø)
imblearn/ 93.93% <ø> (ø)
...mpling/_prototype_generation/ 100.00% <ø> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 37f27ee...1de6ac4.

@glemaitre glemaitre merged commit 703cee1 into scikit-learn-contrib:master Jun 8, 2020
3 of 5 checks passed

2 participants