Update op replacement tutorial (#50377)
Summary: Pull Request resolved: #50377

Test Plan: Imported from OSS

Reviewed By: jamesr66a

Differential Revision: D25870409

Pulled By: ansley

fbshipit-source-id: b873b89c2e62b57cd5d816f81361c8ff31be2948
Ansley Ussery authored and facebook-github-bot committed Jan 11, 2021
1 parent ec51b67 commit 3d263d1
Showing 1 changed file with 3 additions and 6 deletions.
9 changes: 3 additions & 6 deletions torch/fx/examples/replace_op.py
@@ -2,7 +2,6 @@
 from torch.fx import symbolic_trace
 import operator
 
-
 """
 How to replace one op with another
 1. Iterate through all Nodes in your GraphModule's Graph.
@@ -39,12 +38,10 @@ def forward(self, x, y):
 
 # As demonstrated in the above example, there are several different ways
 # to denote addition. The possible cases are:
-#     1. `x + y` - A `call_function` Node with target
-#        `<built-in function add>`. This is `operator.add`, so we can
-#        match on equality with that function directly.
+#     1. `x + y` - A `call_function` Node with target `operator.add`.
+#        We can match for equality on that `operator.add` directly.
 #     2. `torch.add(x, y)` - A `call_function` Node with target
-#        `<built-in method add of type object at MEMORY-LOCATION-OF-TORCH>`.
-#        This is `torch.add`, which we can similarly match directly.
+#        `torch.add`. Similarly, we can match this function directly.
 #     3. `x.add(y)` - The Tensor method call, whose target we can match
 #        as a string.
 
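For reference, here is a minimal sketch of the match-and-replace loop that the updated comment describes, covering all three spellings of addition. The toy module M and the choice of torch.mul as the replacement op are illustrative assumptions, not necessarily what the tutorial file itself uses:

import operator

import torch
from torch.fx import symbolic_trace


class M(torch.nn.Module):
    def forward(self, x, y):
        return x + y, torch.add(x, y), x.add(y)


# Symbolically trace an instance of the module to get a GraphModule
traced = symbolic_trace(M())

# The three spellings of addition, as they appear in Node.target
patterns = {operator.add, torch.add, "add"}

for n in traced.graph.nodes:
    # Match the Node on its target
    if any(n.target == pattern for pattern in patterns):
        # Insert the replacement Node and rewire all users of the old one
        with traced.graph.inserting_after(n):
            new_node = traced.graph.call_function(torch.mul, n.args, n.kwargs)
            n.replace_all_uses_with(new_node)
        # Delete the now-unused addition Node
        traced.graph.erase_node(n)

# Regenerate the GraphModule's generated code from the modified Graph
traced.recompile()

Matching on n.target is what makes the three cases uniform: operator.add and torch.add are matched as function objects, while the Tensor method call x.add(y) shows up as the string "add".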
