
[export] ExportedProgram #102259

Closed · wants to merge 7 commits
Conversation

@angelayi (Contributor) commented on May 25, 2023

class ExportedProgram:
    graph_module: torch.fx.GraphModule
    graph_signature: ExportGraphSignature
    call_spec: CallSpec
    state_dict: Dict[str, Any]    
    symbol_to_range: Dict[sympy.Symbol, Tuple[int, int]]

    @property
    def graph(self):
        return self.graph_module.graph

    def transform(self, *passes: PassType) -> "ExportedProgram":
        # Runs graph-based transformations on this ExportedProgram
        # and returns a new, transformed ExportedProgram
        ...

    def add_runtime_assertions(self) -> "ExportedProgram":
        # Adds runtime assertions based on the constraints
        ...

# Information to maintain user calling/returning specs
@dataclasses.dataclass
class CallSpec:
    in_spec: Optional[pytree.TreeSpec] = None
    out_spec: Optional[pytree.TreeSpec] = None


# Extra information for joint graphs
@dataclasses.dataclass
class ExportBackwardSignature:
    gradients_to_parameters: Dict[str, str]
    gradients_to_user_inputs: Dict[str, str]
    loss_output: str


@dataclasses.dataclass
class ExportGraphSignature:
    parameters: List[str]
    buffers: List[str]

    user_inputs: List[str]
    user_outputs: List[str]
    inputs_to_parameters: Dict[str, str]
    inputs_to_buffers: Dict[str, str]

    buffers_to_mutate: Dict[str, str]

    backward_signature: Optional[ExportBackwardSignature]
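As a rough illustration of how the signature pieces above fit together, here is a minimal, torch-free sketch. The module, parameter, and placeholder names are hypothetical; the point is only to show how the maps distinguish lifted parameter/buffer inputs from genuine user inputs:

```python
# Minimal, torch-free sketch of the signature dataclasses above.
# All concrete names (linear.weight, arg0_1, ...) are hypothetical.
import dataclasses
from typing import Dict, List, Optional


@dataclasses.dataclass
class ExportBackwardSignature:
    gradients_to_parameters: Dict[str, str]
    gradients_to_user_inputs: Dict[str, str]
    loss_output: str


@dataclasses.dataclass
class ExportGraphSignature:
    parameters: List[str]
    buffers: List[str]
    user_inputs: List[str]
    user_outputs: List[str]
    inputs_to_parameters: Dict[str, str]
    inputs_to_buffers: Dict[str, str]
    buffers_to_mutate: Dict[str, str]
    backward_signature: Optional[ExportBackwardSignature]


# An exported linear layer: graph placeholders arg0_1/arg1_1 are lifted
# copies of a parameter/buffer, arg2_1 is the real user input.
sig = ExportGraphSignature(
    parameters=["linear.weight"],
    buffers=["running_mean"],
    user_inputs=["arg2_1"],
    user_outputs=["add"],
    inputs_to_parameters={"arg0_1": "linear.weight"},
    inputs_to_buffers={"arg1_1": "running_mean"},
    buffers_to_mutate={"add_1": "running_mean"},
    backward_signature=None,  # inference-only graph, no joint-graph info
)

# Classify each graph placeholder.
for name in ["arg0_1", "arg1_1", "arg2_1"]:
    if name in sig.inputs_to_parameters:
        print(name, "-> parameter", sig.inputs_to_parameters[name])
    elif name in sig.inputs_to_buffers:
        print(name, "-> buffer", sig.inputs_to_buffers[name])
    else:
        print(name, "-> user input")
```

The same lookup pattern is what a consumer of an ExportedProgram would use to decide which placeholders need values from the state_dict and which come from the caller.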

Stack from ghstack (oldest at bottom):

cc @voznesenskym @penguinwu @anijain2305 @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @Xia-Weiwen @wenzhe-nrv @jiayisunx

[ghstack-poisoned]
pytorch-bot (bot) commented on May 25, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/102259

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit d57b238:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@ydwu4 (Contributor) left a comment:

Thanks for the rapid PR! This looks good to me.

angelayi added a commit that referenced this pull request May 25, 2023
ghstack-source-id: b78ea1d863bea74f7e9a3cb11bdd3824b057cc9e
Pull Request resolved: #102259
angelayi mentioned this pull request on May 25, 2023
input_name_to_example_inputs[node.name] = example_input
input_tracker += 1

exported_program._input_shape_constraints = input_shape_constraints_by_src_name

I'm not a fan of keeping these private fields. We should be able to eliminate ep._input_shape_constraints in favor of symbol_to_constraints (generalizing symbol_to_range to include equality constraints, which are basically other symbols).

Also, instead of _input_name_to_example_inputs we should be able to just remember a map from names to the shapes of the example inputs, or even a map from (name, dim) pairs to constant values. Because the only reason we need them is for specialization assertions.

If you still want to keep the example inputs for additional debugging, then sure, keep them...but there should not be any remaining client for it after this change. I believe the right thing to do is to just forget them.
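The reviewer's suggestion of remembering only (name, dim) pairs mapped to constant values can be sketched as follows. This is a hypothetical illustration, not the code in this PR; the input names and shapes are made up:

```python
# Hypothetical sketch: instead of keeping full example inputs, record
# only (input_name, dim) -> constant size, which is all that
# specialization assertions need at runtime.
from typing import Dict, Tuple

# Built once from the example inputs at export time (shapes only).
example_shapes = {"x": (4, 8), "mask": (4,)}  # made-up example inputs

specialized_dims: Dict[Tuple[str, int], int] = {
    (name, dim): size
    for name, shape in example_shapes.items()
    for dim, size in enumerate(shape)
}


def check_specializations(name: str, shape: Tuple[int, ...]) -> None:
    # One runtime assertion per specialized (name, dim) pair.
    for dim, size in enumerate(shape):
        expected = specialized_dims.get((name, dim))
        if expected is not None and expected != size:
            raise RuntimeError(
                f"Input {name!r} dim {dim}: expected size {expected}, got {size}"
            )


check_specializations("x", (4, 8))  # matches the recorded shape
```

Because only the integer sizes survive, the example tensors themselves can be forgotten after export, which is exactly the point being made above.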

@angelayi (Author) replied:

Chatted with Avik offline -- I'll clean this up in a separate diff.

self.current_gm: Optional[torch.fx.GraphModule] = None
self.constraints = self._process_shape_constraints(input_shape_constraints)
self.input_name_to_example_inputs = input_name_to_example_inputs
self.inline_constraints = inline_constraints

def _process_shape_constraints(self, constraints) -> Dict[str, List[ConstraintSpec]]:

We shouldn't need this function if you move it before export, right?

if expr in self.inline_constraints:
constraint = self.inline_constraints[expr]
lower = _convert_to_int(constraint.lower)
upper = _convert_to_int(constraint.upper)
lower = _convert_to_int(constraint[0])

We can move this before export by preprocessing input_name_to_example_inputs

angelayi added a commit that referenced this pull request May 26, 2023
ghstack-source-id: fee3ffcbf73a576e6a3bfbbc4210eb2202e3b399
Pull Request resolved: #102259
@avikchaudhuri (Contributor) left a comment:

Thanks for working on this! This is beginning to shape up to something awesome (no pun intended lol).

@facebook-github-bot facebook-github-bot deleted the gh/angelayi/47/head branch June 8, 2023 15:15