Hello, I noticed that in your class "loss_coteaching", the parameters you pass in, y_1 and y_2, are already log_softmax outputs. But you then apply cross_entropy, which itself combines log_softmax and nll_loss, so log_softmax ends up being applied twice. I am not sure whether I am mistaken, or whether this problem does not occur in your PyTorch version?
By the way, even with log_softmax applied twice, your code still works correctly~