This repository was archived by the owner on Aug 1, 2025. It is now read-only.

[Bug]: Inductor IMAs when inputs are on a non-default device  #1717

@ngimel

Description

🐛 Describe the bug

Error logs

CUDA illegal memory access

Did Dynamo succeed?

  • Does dynamo.optimize("eager") succeed?

Did AOT succeed?

  • Does dynamo.optimize("aot_eager") succeed?

Did Inductor succeed?

  • Does dynamo.optimize("inductor") succeed?

Minified repro

import torch
import torch._dynamo as torchdynamo
import torch._inductor.config

# torch._dynamo.config.dynamic_shapes = True
torch._inductor.config.triton.cudagraphs = False
torch._inductor.config.debug = True


@torchdynamo.optimize("inductor")
def div(x, y):
    return torch.ops.aten.div(x, y)


# Inputs live on a non-default device (cuda:1), not the current device (cuda:0).
x = torch.randn(4, dtype=torch.float, device="cuda:1")
y = torch.randn(4, dtype=torch.float, device="cuda:1")
print(div(x, y))

In eager mode, a device guard temporarily switches the current device to the device of the inputs; Inductor doesn't do that, so its kernels can launch against the wrong current device and hit the illegal memory access.
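A minimal sketch of the device-guard behavior described above, assuming a machine with at least two GPUs; the Python-level torch.cuda.device context manager stands in here for the guard that eager kernels use internally:

import torch

# Sketch only: eager ops run under a guard that switches the current device
# to the inputs' device before launching kernels, so tensors on cuda:1 work
# even when the current device is cuda:0. Inductor's generated wrapper would
# need an equivalent guard around its kernel launches.
x = torch.randn(4, device="cuda:1")
y = torch.randn(4, device="cuda:1")

with torch.cuda.device(x.device):   # switch current device to cuda:1
    out = torch.ops.aten.div(x, y)  # kernel launch now targets cuda:1
print(out)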

Labels: bug (Something isn't working), inductor
