🐛 Bug
When passing the variables for the optimizer to the fit_generator() function of the TorchModel class, an error occurs if the object passed is the return value of model.parameters().
The error is ValueError: optimizer got an empty parameter list.
This happens because model.parameters() returns a generator, which fit_generator() consumes when building a tuple key from it; the now-empty generator is then passed to the optimizer. The fit_generator() snippet that causes the error is pasted below.
```python
if variables is None:
    optimizer = self._pytorch_optimizer
    lr_schedule = self._lr_schedule
else:
    var_key = tuple(variables)  # <-- the generator gets used up here
    if var_key in self._optimizer_for_vars:
        optimizer, lr_schedule = self._optimizer_for_vars[var_key]
    else:
        optimizer = self.optimizer._create_pytorch_optimizer(variables)  # <-- empty generator object is passed here
        if isinstance(self.optimizer.learning_rate, LearningRateSchedule):
            lr_schedule = self.optimizer.learning_rate._create_pytorch_schedule(
                optimizer)
        else:
            lr_schedule = None
        self._optimizer_for_vars[var_key] = (optimizer, lr_schedule)
```
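The exhaustion can be demonstrated without DeepChem or PyTorch at all; any Python generator behaves this way. A minimal sketch, using a stand-in generator in place of model.parameters():

```python
def parameters():
    # Stand-in for model.parameters(), which likewise returns a generator.
    yield "weight"
    yield "bias"

params = parameters()
var_key = tuple(params)   # first pass consumes the generator...
leftover = list(params)   # ...so a second pass yields nothing

print(var_key)   # ('weight', 'bias')
print(leftover)  # []
```

This is exactly what happens inside fit_generator(): tuple(variables) drains the generator, and _create_pytorch_optimizer(variables) then receives an empty iterable.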
Stack Trace:
```
File "/home/gautham/Desktop/repos/deepchem-test/file1.py", line 35, in <module>
    model.fit(dataset)
File "/home/gautham/Desktop/repos/deepchem-test/file1.py", line 27, in fit
    return self.fit_generator(self.default_generator(dataset), variables=variables)
File "/home/gautham/Desktop/repos/deepchem/deepchem/models/torch_models/torch_model.py", line 405, in fit_generator
    optimizer = self.optimizer._create_pytorch_optimizer(variables)
File "/home/gautham/Desktop/repos/deepchem/deepchem/models/optimizers.py", line 233, in _create_pytorch_optimizer
    return torch.optim.Adam(params,
File "/home/gautham/anaconda3/envs/deepchem/lib/python3.9/site-packages/torch/optim/adam.py", line 45, in __init__
    super().__init__(params, defaults)
File "/home/gautham/anaconda3/envs/deepchem/lib/python3.9/site-packages/torch/optim/optimizer.py", line 261, in __init__
    raise ValueError("optimizer got an empty parameter list")
ValueError: optimizer got an empty parameter list
```
Expected behavior
The fit_generator() function should behave the same whether it is passed a generator object or a parameter list/tuple.
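One straightforward way to get that behavior (a sketch of a possible fix, not the actual DeepChem patch) is to materialize variables into a list once, before it is iterated twice:

```python
def _materialize(variables):
    # Hypothetical helper: accept a generator, list, or tuple of parameters
    # and return a list that can safely be iterated more than once.
    return None if variables is None else list(variables)

# With this, both the cache key and the optimizer see the full parameter set:
variables = _materialize(p for p in ["weight", "bias"])
var_key = tuple(variables)          # no longer exhausts anything
optimizer_input = list(variables)   # still sees every parameter

print(var_key)          # ('weight', 'bias')
print(optimizer_input)  # ['weight', 'bias']
```

Inserting such a conversion at the top of fit_generator() would make the subsequent tuple(variables) and _create_pytorch_optimizer(variables) calls both operate on the complete parameter list.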
Environment
OS: PopOS 22
Python version: 3.9
DeepChem version: 2.7.2.dev
PyTorch version: 2.1.2+cu121