
Why is optimizer.step() called twice? #8

Closed
@San-ctuary

Description


```python
loss1.backward(retain_graph=True)
self.optimizer.step()
loss2.backward()
self.optimizer.step()
```

When the first optimizer.step() executes, all the parameters with gradients from loss1 are updated, but some of the variables in loss2 are shared with loss1. So this may cause a problem: the shared parameters are changed between the two backward passes, so loss2.backward() runs against a graph built from the old parameter values.
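
For comparison, here is a minimal sketch (with a hypothetical two-head model, not this repo's code) of the single-step pattern I would have expected, where gradients from both losses are accumulated before one update:

```python
import torch

# Hypothetical setup: two losses that share parameters through `shared`.
shared = torch.nn.Linear(4, 4)
head1 = torch.nn.Linear(4, 1)
head2 = torch.nn.Linear(4, 1)
params = list(shared.parameters()) + list(head1.parameters()) + list(head2.parameters())
optimizer = torch.optim.SGD(params, lr=0.01)

x = torch.randn(8, 4)
target1 = torch.randn(8, 1)
target2 = torch.randn(8, 1)

optimizer.zero_grad()
loss1 = torch.nn.functional.mse_loss(head1(shared(x)), target1)
loss2 = torch.nn.functional.mse_loss(head2(shared(x)), target2)
(loss1 + loss2).backward()  # gradients on shared parameters sum automatically
optimizer.step()            # one update from the combined gradient
```

With this pattern the shared parameters receive one combined gradient, instead of being updated in between the two backward passes.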
