A question about your loss_coteaching function #4

Open
@nihaomiao

Description

Hello, I noticed that in your loss_coteaching function, the arguments you pass in, y_1 and y_2, are already log_softmax outputs. But you then feed them to cross_entropy, which internally combines log_softmax and nll_loss, so log_softmax ends up being applied twice. I am not sure whether I am mistaken, or whether this issue simply does not occur in your PyTorch version.
By the way, even with log_softmax applied twice, your code still works~
