
[Tabular] Removing scikit-learn upgrade cap and handling failures in DecisionTreeRegressor #3881

Merged · 4 commits · Jan 25, 2024

Conversation

prateekdesai04
Contributor

Issue #, if available: A follow-up PR on #3872

Description of changes: This PR handles the failures caused by upgrading to scikit-learn 1.4.0.
Currently, the fix is to re-calculate `y_train_` and `y_train_leaves_` in `DecisionTreeRegressor`, making the object equivalent to what it was under scikit-learn 1.3.2. This may be revisited if we find a better fix in the future.
Link to detailed discussion about the issue: #3872 (comment)
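A minimal sketch of the re-calculation described above, assuming `y_train_` holds the training targets and `y_train_leaves_` holds the leaf index each training sample falls into (these attribute names follow the PR description; they are not part of scikit-learn's public API, and the helper name `recompute_leaf_attributes` is hypothetical):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def recompute_leaf_attributes(tree: DecisionTreeRegressor, X_train, y_train):
    """Re-attach leaf-membership attributes to a fitted tree so the object
    mirrors its scikit-learn 1.3.2 layout (sketch, not the exact PR code)."""
    tree.y_train_ = np.asarray(y_train)
    # apply() returns the index of the leaf each training sample lands in
    tree.y_train_leaves_ = tree.apply(X_train)
    return tree

# Usage: fit a tree, then restore the attributes the newer sklearn no longer keeps
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.0, 1.0, 2.0, 3.0])
model = recompute_leaf_attributes(DecisionTreeRegressor(max_depth=2).fit(X, y), X, y)
```

The key point is that both attributes can be rebuilt after `fit()` from the training data alone via `tree.apply()`, so no behavior of the fitted tree itself changes.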

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

Job PR-3881-a399ce8 is done.
Docs are uploaded to http://autogluon-staging.s3-website-us-west-2.amazonaws.com/PR-3881/a399ce8/index.html

@Innixma Innixma merged commit 26ef76d into autogluon:master Jan 25, 2024
28 checks passed
@Innixma Innixma added this to the 1.0.1 Release milestone Feb 13, 2024
@Innixma Innixma modified the milestones: 1.0.1 Release, 1.1 Release Apr 5, 2024
@prateekdesai04 prateekdesai04 self-assigned this Apr 5, 2024
LennartPurucker pushed a commit to LennartPurucker/autogluon that referenced this pull request Jun 1, 2024
…DecisionTreeRegressor (autogluon#3881)

Co-authored-by: Ubuntu <ubuntu@ip-172-31-9-154.us-west-2.compute.internal>
2 participants