
HyperNetWork is not being trained #32

Open

mleoelm opened this issue Apr 27, 2021 · 2 comments

mleoelm commented Apr 27, 2021

With respect to the file "HyerIQASolver.py":

In line 20 you store the parameters of the HyperNetwork to be trained in the variable `self.hypernet_params`:

```python
self.hypernet_params = filter(lambda p: id(p) not in backbone_params, self.model_hyper.parameters())
```

In lines 77 to 84 you update the optimizer with this variable `self.hypernet_params`.

Since `self.model_hyper.parameters()` returns an iterator, `self.hypernet_params` is also an iterator, and it is exhausted as soon as the optimizer is initialized. Therefore, after the first epoch the optimizer no longer optimizes the HyperNetwork, because `self.hypernet_params` is empty.

Converting `self.hypernet_params` to a list should fix the problem:

```python
self.hypernet_params = list(filter(lambda p: id(p) not in backbone_params, self.model_hyper.parameters()))
```
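The exhausted-iterator behaviour can be reproduced with a minimal sketch; a plain list of ints stands in here for `self.model_hyper.parameters()`, and the names are illustrative only:

```python
# Hypothetical stand-ins for the model parameters and the backbone id set.
params = [10, 20, 30, 40]
backbone_params = {id(params[0])}

# Same construction as in HyerIQASolver.py: filter() returns a lazy iterator.
hypernet_params = filter(lambda p: id(p) not in backbone_params, params)

first_pass = list(hypernet_params)   # consumes the iterator -> [20, 30, 40]
second_pass = list(hypernet_params)  # iterator is now exhausted -> []

print(len(first_pass))   # 3
print(len(second_pass))  # 0
```

Wrapping the `filter(...)` call in `list(...)` materializes the parameters once, so every later pass (e.g. re-building the optimizer each epoch) sees the same non-empty sequence.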

@SuperBruceJia

@SSL92

@SuperBruceJia

> With respect to the file "HyerIQASolver.py":
>
> In line 20 you store the parameters of the HyperNetwork to be trained in the variable `self.hypernet_params`:
>
> ```python
> self.hypernet_params = filter(lambda p: id(p) not in backbone_params, self.model_hyper.parameters())
> ```
>
> In lines 77 to 84 you update the optimizer with this variable `self.hypernet_params`.
>
> Since `self.model_hyper.parameters()` returns an iterator, `self.hypernet_params` is also an iterator, and it is exhausted as soon as the optimizer is initialized. Therefore, after the first epoch the optimizer no longer optimizes the HyperNetwork, because `self.hypernet_params` is empty.
>
> Converting `self.hypernet_params` to a list should fix the problem:
>
> ```python
> self.hypernet_params = list(filter(lambda p: id(p) not in backbone_params, self.model_hyper.parameters()))
> ```

After training the model, I found the performance is almost the same with or without using the list.
Do we have to use `list(...)` for `self.hypernet_params`?
