
Bug: dividing speed_loss by batch size twice #37

Open

dHonerkamp opened this issue Dec 3, 2020 · 2 comments

@dHonerkamp
In the file coiltraine/network/loss.py we find the following lines from 56 onwards:

    loss_function = loss_branches_vec[0] + loss_branches_vec[1] + \
                    loss_branches_vec[2] + loss_branches_vec[3]

    speed_loss = loss_branches_vec[4] / (params['branches'][0].shape[0])

    return torch.sum(loss_function) / (params['branches'][0].shape[0]) \
           + torch.sum(speed_loss) / (params['branches'][0].shape[0]), \

It seems that speed_loss is divided by params['branches'][0].shape[0] (the batch size?) twice instead of once, while the rest of the loss (loss_function) is divided only once.

Is this indeed a bug that changes the relative scaling of the different losses, or am I missing something?
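
To illustrate the scaling, here is a minimal sketch (assuming params['branches'][0].shape[0] is the batch size B; the tensors are hypothetical stand-ins for the per-sample branch losses, not the repository code):

    import torch

    # Hypothetical stand-ins: B plays the role of params['branches'][0].shape[0]
    # (the batch size?), the ones-tensors the role of per-sample branch losses.
    B = 32
    loss_function = torch.ones(B)  # stand-in for loss_branches_vec[0] + ... + [3]
    speed_branch = torch.ones(B)   # stand-in for loss_branches_vec[4]

    speed_loss = speed_branch / B  # first division by B

    control_term = torch.sum(loss_function) / B  # scaled by 1/B   -> 1.0
    speed_term = torch.sum(speed_loss) / B       # scaled by 1/B^2 -> 1/B

    print(control_term.item())  # 1.0
    print(speed_term.item())    # 0.03125, i.e. 1/32

So the speed term ends up smaller than the control terms by an extra factor of B.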

@Kailiangdong

I am confused about this point as well and would also like to know.

@felipecode
Owner

Yes, I found this recently. It was dividing twice.
Thanks for pointing this out. I wouldn't change it in the repository, since it is necessary in order to reproduce the results.

This shows that the speed prediction should probably use a much smaller
weight than what I used originally.
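
For reference, the double division is algebraically equivalent to dividing once and folding the extra 1/B into an explicit, smaller speed weight. A sketch with hypothetical names (B stands for params['branches'][0].shape[0]; this is not the repository code):

    import torch

    def combined_loss(loss_function, speed_branch, B):
        # Divide once, but use an explicit speed weight of 1/B so the result
        # matches the original double-division behaviour exactly.
        # (Hypothetical refactor for illustration only.)
        speed_weight = 1.0 / B
        return (torch.sum(loss_function)
                + speed_weight * torch.sum(speed_branch)) / B

In other words, reproducing the published results with a single division amounts to multiplying the original speed weight by 1/B.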
