In line 20 you store the parameters of the HyperNetwork to be trained in the variable self.hypernet_params:
self.hypernet_params = filter(lambda p: id(p) not in backbone_params, self.model_hyper.parameters())
In lines 77 to 84 you rebuild the optimizer from this variable.
Since self.model_hyper.parameters() returns a generator, self.hypernet_params is an iterator, and it is exhausted as soon as the optimizer is first initialized. Therefore, after the first epoch the optimizer no longer optimizes the HyperNetwork, because self.hypernet_params yields nothing.
Converting self.hypernet_params to a list should fix the problem:
self.hypernet_params = list(filter(lambda p: id(p) not in backbone_params, self.model_hyper.parameters()))
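The exhaustion behaviour can be demonstrated without PyTorch at all; the sketch below uses plain strings in place of model parameters (the names are illustrative, not the repo's actual code):

```python
# filter() returns a lazy iterator, just like Module.parameters()
# returns a generator in PyTorch.
params = ["backbone_w", "hyper_w1", "hyper_w2"]
hypernet_params = filter(lambda p: p.startswith("hyper"), params)

first_pass = list(hypernet_params)   # consumes the iterator
second_pass = list(hypernet_params)  # nothing left to yield

print(first_pass)   # ['hyper_w1', 'hyper_w2']
print(second_pass)  # []
```

Any first full traversal (here the first list() call, in the solver the first optimizer construction) drains the iterator; every later traversal sees it empty.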
After training the model, I found the performance is almost the same with or without using the list.
Do we really have to use list(...) for self.hypernet_params?
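Even if the final scores come out similar, the list(...) version does change what the per-epoch optimizer rebuild receives. A minimal torch-free sketch, where make_optimizer is a hypothetical stand-in for torch.optim.Adam (which likewise materializes its parameter argument into a list at construction):

```python
def make_optimizer(param_source):
    # Stand-in for torch.optim.Adam: materializes params at construction.
    return {"params": list(param_source)}

weights = ["hyper_w1", "hyper_w2", "backbone_w"]

# Without list(): the second (per-epoch) rebuild sees an exhausted iterator.
hypernet_iter = filter(lambda p: p.startswith("hyper"), weights)
epoch1_opt = make_optimizer(hypernet_iter)
epoch2_opt = make_optimizer(hypernet_iter)
print(epoch1_opt["params"])  # ['hyper_w1', 'hyper_w2']
print(epoch2_opt["params"])  # []

# With list(): every rebuild receives the full parameter set.
hypernet_list = list(filter(lambda p: p.startswith("hyper"), weights))
epoch1_opt = make_optimizer(hypernet_list)
epoch2_opt = make_optimizer(hypernet_list)
print(epoch2_opt["params"])  # ['hyper_w1', 'hyper_w2']
```

So with the iterator the HyperNetwork is still updated during the first epoch; similar end-to-end numbers on one dataset do not show that it keeps training after that.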
This is with respect to the file "HyerIQASolver.py".