There is an error in the time_loss computation: `event_time[:, 1:] - event_time[:, :-1]` introduces a spurious term at the boundary between the last real event and the padding.

Example: suppose the sequence `t` has 3 events and is zero-padded to length 5, so `t = [1, 3, 4, 0, 0]`.
The deltas should be `[2, 1, 0, 0]`,
but `event_time[:, 1:] - event_time[:, :-1]` yields `[2, 1, -4, 0]`. The `-4` comes from subtracting the last real event time (4) from the first pad value (0).
Solution: pass `time_gap` to this function instead of differencing `event_time`. Alternatively, multiply the difference by `non_pad_mask` to zero out the artifact term.
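A minimal sketch of the masking fix, using NumPy in place of the repo's PyTorch tensors (the slicing is identical; the exact shape and alignment of `non_pad_mask` here are an assumption for illustration):

```python
import numpy as np

# Padded event times: 3 real events, zero-padded to length 5.
event_time = np.array([[1.0, 3.0, 4.0, 0.0, 0.0]])

# Naive differencing: the pad boundary produces a spurious -4.
diff = event_time[:, 1:] - event_time[:, :-1]
# diff is [[2., 1., -4., 0.]]

# non_pad_mask marks real events (1) vs padding (0).
# Aligning it with diff (dropping the first position) zeros out
# any delta whose right endpoint is a pad token.
non_pad_mask = np.array([[1.0, 1.0, 1.0, 0.0, 0.0]])
true_gap = diff * non_pad_mask[:, 1:]
# true_gap is [[2., 1., 0., 0.]]
```

The same one-line multiplication works on the PyTorch tensors in the original code, since broadcasting and slicing behave the same way.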
I agree with you. These calculations affect training as well, since the RMSE loss is one of the loss terms being minimized overall. This raises serious doubts about the code and the reproducibility of the paper.