This repository was archived by the owner on Aug 1, 2025. It is now read-only.

Support distributed training #43

@jansel

Description

This is a placeholder task to make distributed training work with TorchDynamo + AOT Autograd. The main work seems to be making sure the relevant ops can be traced with AOT Autograd and are properly added to the FX graph by TorchDynamo.
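
One way to check what actually lands in the FX graph is to capture a function containing a collective and print what TorchDynamo recorded. A minimal sketch, assuming a single-process gloo process group and the custom-compiler hook documented in the torchdynamo README (`inspect_backend` is an illustrative name, not part of this issue):

```python
import os
import torch
import torch.distributed as dist
import torchdynamo

# Single-process group so the collective can run locally (illustrative
# assumption; real testing would use multiple ranks).
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

def inspect_backend(gm: torch.fx.GraphModule, example_inputs):
    # Print what TorchDynamo captured, then run the graph unchanged.
    gm.graph.print_tabular()
    return gm.forward

def fn(x):
    y = x * 2
    dist.all_reduce(y)  # does this collective show up in the printed graph?
    return y + 1

with torchdynamo.optimize(inspect_backend):
    fn(torch.randn(4))
```

If the collective triggers a graph break, TorchDynamo falls back to eager for that call, so the printed graphs also reveal exactly where capture stopped.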

I expect most of the issues will be at the AOT Autograd level, because TorchDynamo treats most torch.* ops as black boxes. We should test and verify this, though.
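
The AOT Autograd side can be probed directly with functorch's `aot_function`, using a printing compiler to see what its tracing records. A minimal sketch under the same single-process assumptions (`print_compile` is an illustrative name); depending on how the collective interacts with tracing, it may be recorded, silently executed at trace time and dropped from the graph, or raise, and any of those outcomes pinpoints the gap this task describes:

```python
import os
import torch
import torch.distributed as dist
from functorch.compile import aot_function

# Same single-process setup as the previous sketch (assumption).
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
if not dist.is_initialized():
    dist.init_process_group("gloo", rank=0, world_size=1)

def print_compile(fx_graph, example_inputs):
    # Show the forward/backward graphs AOT Autograd produced; run unchanged.
    fx_graph.graph.print_tabular()
    return fx_graph

def fn(x):
    y = x * 2
    dist.all_reduce(y)  # in-place collective: captured, dropped, or error?
    return y.sum()

compiled = aot_function(fn, fw_compiler=print_compile, bw_compiler=print_compile)
compiled(torch.randn(4, requires_grad=True)).backward()
```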

@alanwaketan can fill in details.
