
Error in Utils.time_loss #5

Open
saurabhdash opened this issue Mar 22, 2021 · 2 comments

Comments

@saurabhdash

There is an error in the time_loss computation:

event_time[:, 1:] - event_time[:, :-1] causes an extra term to appear.

Example: suppose a sequence t has 3 events but is zero-padded to length 5, so t = [1, 3, 4, 0, 0].
The deltas should be [2, 1, 0, 0],

but event_time[:, 1:] - event_time[:, :-1] yields [2, 1, -4, 0].

The solution is to pass time_gap to this function instead of differencing event_time. Alternatively, the difference can be multiplied by non_pad_mask to remove the artifact term.
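A minimal sketch reproducing the issue and the mask-based workaround, assuming the zero-padding convention above (0 marks padded positions, so non_pad_mask can be derived from event_time > 0):

```python
import torch

# The padded example from the report: 3 events zero-padded to length 5.
event_time = torch.tensor([[1., 3., 4., 0., 0.]])
non_pad_mask = (event_time > 0).float()  # assumed padding convention

# Naive differencing introduces a spurious term at the pad boundary.
diff = event_time[:, 1:] - event_time[:, :-1]
# diff == [[2., 1., -4., 0.]] -- the -4 is the artifact term.

# Multiplying by the (shifted) pad mask removes it.
masked = diff * non_pad_mask[:, 1:]
# masked == [[2., 1., 0., 0.]], the expected deltas.
```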

@unnir

unnir commented May 31, 2021

Will you open a pull request?

@ritvik06

ritvik06 commented Jul 20, 2021

I agree with you. These calculations affect training as well, since the RMSE loss is one of the loss terms minimized in the overall objective. This raises serious doubts about the code and the reproducibility of the paper.

A simple fix to this could be

true = event_time[:, 1:] - event_time[:, :-1]
true = torch.where(true >= 0., true, torch.zeros_like(true))

(torch.zeros_like keeps the fix device-agnostic instead of hardcoding cuda.)
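A quick sanity check of this clamp-style fix on the padded example from the original report (a sketch; it assumes event times are non-decreasing within a sequence, so the only negative differences come from the padding boundary):

```python
import torch

# Padded example: 3 events, zero-padded to length 5.
event_time = torch.tensor([[1., 3., 4., 0., 0.]])

# Difference, then zero out the negative artifact term at the pad boundary.
true = event_time[:, 1:] - event_time[:, :-1]
true = torch.where(true >= 0., true, torch.zeros_like(true))
# true == [[2., 1., 0., 0.]], matching the expected deltas.
```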
