🐛 Bug
GBDTModel's fit method passes an early_stopping_rounds argument, but the latest LightGBM (v4.0.0) no longer accepts it; early stopping is now configured through callbacks.
To Reproduce
Steps to reproduce the behavior:
import deepchem as dc
from lightgbm import LGBMRegressor
lgb = LGBMRegressor()
model = dc.models.GBDTModel(lgb)
model.fit(train_dataset)
File [.venv\lib\site-packages\deepchem\models\gbdt_models\gbdt_model.py:107, in GBDTModel.fit(self, dataset)
104 # Find optimal n_estimators based on original learning_rate and early_stopping_rounds
105 X_train, X_test, y_train, y_test = train_test_split(
106 X, y, test_size=0.2, random_state=seed, stratify=stratify)
--> 107 self.model.fit(
108 X_train,
109 y_train,
110 early_stopping_rounds=self.early_stopping_rounds,
111 eval_metric=self.eval_metric,
112 eval_set=[(X_test, y_test)])
114 # retrain model to whole data using best n_estimators * 1.25
115 if self.model.__class__.__name__.startswith('XGB'):
TypeError: LGBMRegressor.fit() got an unexpected keyword argument 'early_stopping_rounds'
Expected behavior
Works as it did with LightGBM v3.3.5.
Environment
OS: Windows
Python version: 3.10
DeepChem version: 2.7.1
Additional context
The requirements in the official documentation list the LightGBM version as "latest"; for the current DeepChem release (v2.7.1 and earlier) it should be pinned to v3.3.5, and a callbacks-based implementation should be added for future DeepChem versions.
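One way such a callbacks-based implementation could stay compatible with both LightGBM 3.x and 4.x is to branch on the installed version. The helper names below are assumptions for illustration, not DeepChem API:

```python
def uses_early_stopping_callback(lightgbm_version: str) -> bool:
    """Return True when this LightGBM release expects the
    lightgbm.early_stopping callback instead of the
    early_stopping_rounds= keyword (removed in 4.0.0)."""
    major = int(lightgbm_version.split(".")[0])
    return major >= 4


def early_stopping_fit_kwargs(lightgbm_version: str, rounds: int) -> dict:
    """Hypothetical helper sketching how GBDTModel.fit could build
    version-appropriate keyword arguments for LGBMRegressor.fit()."""
    if uses_early_stopping_callback(lightgbm_version):
        # Imported lazily so the version check itself has no hard dependency.
        import lightgbm as lgb
        return {"callbacks": [lgb.early_stopping(stopping_rounds=rounds)]}
    return {"early_stopping_rounds": rounds}


print(uses_early_stopping_callback("3.3.5"))  # False
print(uses_early_stopping_callback("4.0.0"))  # True
```

The resulting dict would be splatted into the existing `self.model.fit(...)` call alongside `eval_metric` and `eval_set`.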
Redix8 changed the title from "lightgbm early_stopping_rounds is deprecated." to "lightgbm early_stopping_rounds parameter on fit method is deprecated." on Sep 6, 2023.