[Fixbug] Prevent allreduce op from being fused #304

Merged: 1 commit, Jul 5, 2023
python/hidet/graph/transforms/fuse_operator.py (2 additions, 1 deletion)

@@ -15,6 +15,7 @@
 from hidet.graph.flow_graph import FlowGraph, Operator, Tensor
 from hidet.graph.ops.special import BarrierOp
 from hidet.graph.ops.transfer import TransferOp
+from hidet.graph.ops.distributed import AllReduceOp
 from hidet.graph.graph_utils.functors import analyze_usage
 from hidet.graph.transforms.base import GraphPass
 from hidet.utils.structure import DirectedGraph
@@ -24,7 +25,7 @@

 # the following operators are not fusible
-NOT_FUSIBLE = {BarrierOp, TransferOp}
+NOT_FUSIBLE = {BarrierOp, TransferOp, AllReduceOp}


 class FusibleGraph:
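The fix adds `AllReduceOp` to the `NOT_FUSIBLE` set, so the fusion pass treats allreduce as a fusion barrier, keeping the collective communication as a standalone node instead of merging it with surrounding compute. A minimal sketch of how such a deny-list gates fusion (the class and function names below are hypothetical stand-ins for illustration, not hidet's actual pass implementation):

```python
# Sketch: a fusion pass can consult a deny-list of operator types
# before grouping a node with its neighbors. All names here are
# illustrative; only the NOT_FUSIBLE membership check mirrors the PR.

class Op:
    """Base class for toy operators."""

class BarrierOp(Op): ...
class TransferOp(Op): ...
class AllReduceOp(Op): ...
class ReluOp(Op): ...

# Operators that must never be fused with other operators.
NOT_FUSIBLE = {BarrierOp, TransferOp, AllReduceOp}

def is_fusible(op: Op) -> bool:
    # Membership is checked on the operator's type: a collective such
    # as allreduce must not be reordered or merged with local compute.
    return type(op) not in NOT_FUSIBLE

def partition(ops: list[Op]) -> list[list[Op]]:
    """Greedily group consecutive fusible ops; non-fusible ops stand alone."""
    groups: list[list[Op]] = []
    for op in ops:
        if is_fusible(op) and groups and is_fusible(groups[-1][-1]):
            groups[-1].append(op)   # extend the current fusible group
        else:
            groups.append([op])     # start a new group (or a singleton)
    return groups
```

With this check in place, a sequence like `[ReluOp, AllReduceOp, ReluOp]` partitions into three groups rather than one fused region, because the allreduce breaks the fusible run on both sides.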