🐛 Describe the bug
```python
import torch
import torchdynamo
from torch.nn.quantized import FloatFunctional

class TorchAdd(torch.nn.Module):
    """Wrapper around torch.add so that all ops can be found at build"""

    def __init__(self):
        super().__init__()
        self.add_func = FloatFunctional()

    def forward(self, x, y):
        return self.add_func.add(x, y)

torchdynamo.config.dynamic_shapes = True
torchdynamo.export(TorchAdd(), torch.randn(3, 4), torch.randn(3, 4), aten_graph=True)
```
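For context, a minimal comparison sketch (not part of the original report): the same add expressed with torch.add directly. Assuming the skip is scoped to FloatFunctional's file (torch/ao/nn/quantized/modules/functional_modules.py), this variant is expected to export without hitting the skipfiles check; PlainAdd is a hypothetical module written only for this comparison.

```python
import torch
import torchdynamo

class PlainAdd(torch.nn.Module):
    """Same computation, but calling torch.add directly (no FloatFunctional)."""
    def forward(self, x, y):
        return torch.add(x, y)

torchdynamo.config.dynamic_shapes = True
# Assumption: torch.add is not defined in the skipped
# torch/ao/nn/quantized/modules/functional_modules.py file, so this export
# should not raise the "inline in skipfiles" error seen below.
torchdynamo.export(PlainAdd(), torch.randn(3, 4), torch.randn(3, 4), aten_graph=True)
```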
Error logs
File "/mnt/xarfuse/uid-23137/7756fc57-seed-nspid4026533012_cgpid17879168-ns-4026533009/torchdynamo/symbolic_convert.py", line 1515, in inline_call_
unimplemented(
File "/mnt/xarfuse/uid-23137/7756fc57-seed-nspid4026533012_cgpid17879168-ns-4026533009/torchdynamo/exc.py", line 71, in unimplemented
raise Unsupported(msg)
torchdynamo.exc.Unsupported: inline in skipfiles: add /mnt/xarfuse/uid-23137/7756fc57-seed-nspid4026533012_cgpid17879168-ns-4026533009/torch/ao/nn/quantized/modules/functional_modules.py
Did Dynamo succeed?
Did AOT succeed?
Did Inductor succeed?
Minified repro
Repro given above