
Multitask learning with uncertainty? #19

Closed · kakusikun opened this issue Apr 14, 2020 · 8 comments
@kakusikun

Here, it looks like uncertainty weighting is used for multi-task learning.

However, I cannot find where these parameters are updated by any optimizer; I only found where the loss instance is created.

Could you point out where the uncertainty is learned during training?

@ifzhang
Owner

ifzhang commented Apr 14, 2020

We set the parameters here:

self.s_det = nn.Parameter(-1.85 * torch.ones(1))
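For context, the initialization determines the starting loss weight, since the loss multiplies each task term by exp(-s). A quick check (assuming only this one line from the repo):

```python
import torch

# the detection-loss log-variance as defined in the repo
s_det = torch.nn.Parameter(-1.85 * torch.ones(1))

# the effective loss weight is exp(-s_det); with s_det = -1.85 this is
# exp(1.85) ≈ 6.36, so the detection loss starts with a large weight
print(torch.exp(-s_det).item())
```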

@kakusikun
Author

Thanks for your quick reply.
I know the uncertainty is set as a parameter, but making it an nn.Parameter is not enough: it also needs to be passed to the optimizer to be updated.
So the question remains: where is it updated?

@ifzhang
Owner

ifzhang commented Apr 14, 2020

I use the parameters in the loss below:

loss = torch.exp(-self.s_det) * det_loss + torch.exp(-self.s_id) * id_loss + (self.s_det + self.s_id)
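Putting the two snippets together, here is a self-contained sketch of the loss module in the style of Kendall et al.'s uncertainty weighting. The class name and the s_id initialization are assumptions; only s_det and the loss line appear in this thread:

```python
import torch
import torch.nn as nn

class UncertaintyLoss(nn.Module):
    """Learnable uncertainty weighting for the detection and re-ID losses.

    Class name and the s_id initial value are assumptions for
    illustration; only s_det and the loss formula are shown above.
    """

    def __init__(self):
        super().__init__()
        # log-variance terms; both are nn.Parameters, so they appear in
        # self.parameters() and can be handed to an optimizer
        self.s_det = nn.Parameter(-1.85 * torch.ones(1))
        self.s_id = nn.Parameter(-1.05 * torch.ones(1))  # assumed init

    def forward(self, det_loss, id_loss):
        # exp(-s) acts as the per-task weight; the (s_det + s_id) term is
        # a regularizer that keeps the weights from growing without bound
        return (torch.exp(-self.s_det) * det_loss
                + torch.exp(-self.s_id) * id_loss
                + (self.s_det + self.s_id))
```

Gradients flow to s_det and s_id through this loss, but the parameters only move if they are also registered with the optimizer, which is exactly the point of the question below.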

@kakusikun
Author

kakusikun commented Apr 14, 2020

I mean something like

optimizer = torch.optim.Adam(model.parameters(), opt.lr)

Could you point out where the same is done for the uncertainty parameters?
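What is being asked for is something like the following sketch, where the uncertainty terms are passed to the optimizer alongside the model weights (all names here are stand-ins for illustration):

```python
import torch

# stand-in names; the real model and initial values may differ
model = torch.nn.Linear(8, 2)
s_det = torch.nn.Parameter(-1.85 * torch.ones(1))
s_id = torch.nn.Parameter(-1.05 * torch.ones(1))  # assumed init value

# the uncertainty terms must be registered with the optimizer too,
# otherwise optimizer.step() never updates them
optimizer = torch.optim.Adam(
    [{'params': model.parameters()},
     {'params': [s_det, s_id]}],
    lr=1e-4,
)
```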

@ifzhang
Owner

ifzhang commented Apr 15, 2020

Thank you very much for your question! I have added the parameters to the optimizer and fixed the bug.

@whut2962575697

I found that you added a new param_group to the optimizer, so the optimizer will fail when training is resumed from a checkpoint.

@whut2962575697

Maybe you should change the position of the resume code.
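The likely failure mode: `load_state_dict` raises a ValueError when the checkpoint and the live optimizer have different numbers of param_groups, so the extra group has to be added before the checkpoint is loaded. A minimal sketch (names are hypothetical):

```python
import torch

model = torch.nn.Linear(4, 2)
extra = torch.nn.Parameter(torch.zeros(1))  # stand-in for the uncertainty terms

# training run: the checkpoint is saved after the extra group was added,
# so it contains two param_groups
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
opt.add_param_group({'params': [extra]})
ckpt = opt.state_dict()

# resume: add the extra group BEFORE loading, so the group counts match;
# loading into a one-group optimizer would raise a ValueError
opt2 = torch.optim.Adam(model.parameters(), lr=1e-4)
opt2.add_param_group({'params': [extra]})
opt2.load_state_dict(ckpt)
```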

@ifzhang
Owner

ifzhang commented Apr 20, 2020

Thanks, I will fix the bug.
