Hi, I'm a little bit confused about your code; please give me some help. #1

@tbwxmu

Description

I noticed that the PyTorch DDP tutorial (https://pytorch.org/tutorials/intermediate/ddp_tutorial.html) uses the "Save and Load Checkpoints" pattern to synchronize the model across the different processes.

So, I want to know whether there are similar implicit synchronization mechanisms in your distributed_tutorial code.
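For reference, the pattern from the DDP tutorial looks roughly like the sketch below: rank 0 writes the checkpoint, a `dist.barrier()` makes the other ranks wait for the file, and then every rank loads the same weights. This is a minimal CPU-only illustration using the `gloo` backend; the model, file path, and function name are made up for the example, not taken from this repo's code.

```python
import os
import tempfile
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

def run(rank: int, world_size: int) -> DDP:
    # CPU + gloo so the sketch runs without GPUs (illustrative setup)
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    model = nn.Linear(10, 10)  # toy model for illustration
    # DDP already broadcasts rank 0's parameters to all ranks at construction
    ddp_model = DDP(model)

    ckpt = os.path.join(tempfile.gettempdir(), "ddp_demo.ckpt")
    if rank == 0:
        # a single writer avoids every process racing on the same file
        torch.save(ddp_model.state_dict(), ckpt)

    # barrier: no rank opens the checkpoint before rank 0 finishes writing it
    dist.barrier()
    ddp_model.load_state_dict(torch.load(ckpt))

    # second barrier: don't delete the file while other ranks may still read it
    dist.barrier()
    if rank == 0:
        os.remove(ckpt)
    dist.destroy_process_group()
    return ddp_model
```

Note that the checkpoint itself is mainly for persistence; DDP's constructor broadcast and its gradient all-reduce during `backward()` are what actually keep the replicas in sync each step, which is presumably what the question about "implicit synchronization" is getting at.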
