Add XGBoost integration #65
Conversation
…gration to optuna_integration
@y0z Could you review this PR?
Thank you for your contribution.
I left a review comment.
I think one change should be made.
Also, please add XGBoost to `__init__.py`, as is done for the other integrations.
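For reference, a minimal sketch of what that re-export could look like, assuming the package follows the common pattern of listing each integration in `optuna_integration/__init__.py` (the surrounding entries are illustrative, not the actual file contents):

```python
# optuna_integration/__init__.py (sketch, not the actual file)
# Re-export the new callback at package level so users can write
# `from optuna_integration import XGBoostPruningCallback`.
from optuna_integration.xgboost import XGBoostPruningCallback

__all__ = [
    # ... existing integrations ...
    "XGBoostPruningCallback",
]
```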
Thank you for your PR! Please address my comments :)
@buruzaemon Basically, the conflict is in pyproject.toml, but you can keep both torch and xgboost there. Once my comments are addressed and the conflict is resolved, we can merge this PR!
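Purely illustrative: keeping both extras side by side might look roughly like this (the table name and entries are assumptions, not the repository's real pyproject.toml):

```toml
# Hypothetical sketch of pyproject.toml after resolving the conflict:
# keep the optional dependencies from both branches.
[project.optional-dependencies]
torch = ["torch"]
xgboost = ["xgboost"]
```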
... and finally, resolved that conflict in pyproject.toml. PTAL
Codecov Report
Attention:
Additional details and impacted files:

    @@            Coverage Diff             @@
    ##             main      #65      +/-   ##
    ==========================================
    - Coverage   65.06%   64.92%   -0.14%
    ==========================================
      Files          25       26       +1
      Lines        1869     1933      +64
    ==========================================
    + Hits         1216     1255      +39
    - Misses        653      678      +25

☔ View full report in Codecov by Sentry.
Could you please apply isort?
Sure, I am on that now...
@buruzaemon
Here are the changes isort would make.
tests/test_xgboost.py (Outdated)

    import numpy as np
    import pytest

    import optuna
    from optuna_integration._imports import try_import
    from optuna_integration.xgboost import XGBoostPruningCallback
    from optuna.testing.pruners import DeterministicPruner
Suggested change: replace

    import numpy as np
    import pytest
    import optuna
    from optuna_integration._imports import try_import
    from optuna_integration.xgboost import XGBoostPruningCallback
    from optuna.testing.pruners import DeterministicPruner

with

    import numpy as np
    import optuna
    from optuna.testing.pruners import DeterministicPruner
    import pytest
    from optuna_integration._imports import try_import
    from optuna_integration.xgboost import XGBoostPruningCallback
... and this is done!
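As an aside, the ordering in the suggestion above is consistent with isort sorting every import in a section by module path regardless of `import`/`from` form (isort's `force_sort_within_sections` option) while treating `optuna_integration` as a first-party section that comes last. A toy illustration of that rule (the helper names here are made up for the sketch; this is not isort's real API):

```python
# Toy model of the ordering rule (not isort's actual implementation):
# sort by module path within each section, with optuna_integration
# treated as a first-party section placed after third-party imports.

def module_path(line: str) -> str:
    # "import x as y" and "from x import y" both sort by "x".
    return line.split()[1]

def toy_isort(lines, first_party=("optuna_integration",)):
    third, first = [], []
    for line in lines:
        top = module_path(line).split(".")[0]
        (first if top in first_party else third).append(line)
    return sorted(third, key=module_path) + sorted(first, key=module_path)

imports = [
    "import numpy as np",
    "import pytest",
    "import optuna",
    "from optuna_integration._imports import try_import",
    "from optuna_integration.xgboost import XGBoostPruningCallback",
    "from optuna.testing.pruners import DeterministicPruner",
]
print("\n".join(toy_isort(imports)))
```

Running this reproduces the exact order in the suggested change: `pytest` lands after `optuna.testing.pruners` because `"optuna.testing..."` sorts before `"pytest"`, and the `optuna_integration` imports come last as first-party.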
optuna_integration/xgboost.py (Outdated)

    from typing import Any

    import optuna
    import optuna_integration
Suggested change: replace

    from typing import Any

    import optuna
    import optuna_integration

with

    from typing import Any
    import optuna
    import optuna_integration
... and this change to the imports is done as well!
Thank you for the changes, LGTM!