
to_onehot can't be torchscripted #1584

@ydcjeff

Description


🐛 Bug description

I need to do a one-hot conversion and I use ignite.utils.to_onehot, but when I try to torchscript the model, a RuntimeError is raised. Stacktrace:

RuntimeError: 
cannot statically infer the expected size of a list in this context:
  File "/usr/local/lib/python3.6/dist-packages/ignite/utils.py", line 59
    input's device`.
    """
    onehot = torch.zeros(indices.shape[0], num_classes, *indices.shape[1:], dtype=torch.uint8, device=indices.device)
                                                        ~~~~~~~~~~~~~~~~~ <--- HERE
    return onehot.scatter_(1, indices.unsqueeze(1), 1)
'to_onehot' is being compiled since it was called from 'SLP.forward'
  File "<ipython-input-3-0f76d1651906>", line 12
    def forward(self, x):
        x = to_onehot(x, 10)
        ~~~~~~~~~~~~~~~~~~~ <--- HERE
        return self.l1(x)
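The error comes from unpacking `*indices.shape[1:]` directly into `torch.zeros`, whose list length TorchScript cannot statically infer. One possible workaround (a sketch, not necessarily how ignite fixes it; `to_onehot_scriptable` is a hypothetical name) is to build the output shape as an explicit `List[int]` first, which TorchScript handles fine:

```python
import torch

@torch.jit.script
def to_onehot_scriptable(indices: torch.Tensor, num_classes: int) -> torch.Tensor:
    # Build the target shape as a list of ints instead of star-unpacking
    # indices.shape[1:], so TorchScript can type it as List[int].
    new_shape = [indices.shape[0], num_classes] + indices.shape[1:]
    onehot = torch.zeros(new_shape, dtype=torch.uint8, device=indices.device)
    # Scatter 1s along the class dimension at the given indices.
    return onehot.scatter_(1, indices.unsqueeze(1), 1)
```

With this variant, a model calling it in `forward` should compile with `torch.jit.script` without the "cannot statically infer the expected size of a list" error.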

Environment

Colab

Reproducible Link

https://colab.research.google.com/drive/13mWdz9sGXvAHgKd3vtp5I6Y-gZDRlNAd?usp=sharing
