
[FEATURE / Issue for timm] Add ONNX export support for operator HardSwishJitAutoFn #400

@hwangdeyu

Description

When I try to export some models from the PyTorch Hub to ONNX for inference, such as the mealv2_mobilenetv3_small_075 model, the model is built with the timm package, and the export throws an error about the operator "HardSwishJitAutoFn".

To Reproduce

import torch

# Load the MEAL-V2 MobileNetV3 model from the PyTorch Hub (built on timm)
model = torch.hub.load('szq0214/MEAL-V2', 'meal_v2', pretrained=True,
                       **{'model_name': 'mealv2_mobilenetv3_small_075'})
model.eval()
model = model.module  # unwrap the model from its DataParallel wrapper

input_tensor = torch.randn(2, 3, 224, 224, requires_grad=True)
if isinstance(input_tensor, torch.Tensor):
    input_tensor = (input_tensor,)

# Move model and inputs to GPU if available
if torch.cuda.is_available():
    model.to('cuda')
    if isinstance(input_tensor, tuple):
        input_tensor = tuple(x.to('cuda') for x in input_tensor)
    else:
        input_tensor = input_tensor.to('cuda')

torch.onnx.export(model, input_tensor, 'mealv2_mobilenetv3_small_075.onnx',
                  verbose=False, opset_version=12, use_external_data_format=False)

The following error then occurs:

RuntimeError: ONNX export failed: Couldn't export Python operator HardSwishJitAutoFn

So timm needs to support exporting HardSwishJitAutoFn to ONNX.
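Until that is supported, one possible workaround (a sketch, not timm's official API) is to recursively swap out the JIT HardSwish modules for the ONNX-exportable `torch.nn.Hardswish` before calling `torch.onnx.export`. The `HardSwishJit` class below is a hypothetical stand-in for timm's internal activation, used only so the example runs without timm installed:

```python
import torch
import torch.nn as nn


def replace_hardswish(module: nn.Module) -> None:
    """Recursively replace any submodule whose class name contains
    'HardSwish' with the ONNX-exportable torch.nn.Hardswish."""
    for name, child in module.named_children():
        if 'HardSwish' in type(child).__name__ and not isinstance(child, nn.Hardswish):
            setattr(module, name, nn.Hardswish())
        else:
            replace_hardswish(child)


# Hypothetical stand-in for timm's JIT-compiled hard-swish activation
class HardSwishJit(nn.Module):
    def forward(self, x):
        return x * nn.functional.relu6(x + 3.0) / 6.0


model = nn.Sequential(nn.Conv2d(3, 8, 3), HardSwishJit(), nn.Flatten())
replace_hardswish(model)
print(type(model[1]).__name__)  # -> Hardswish
```

When creating a model directly through `timm.create_model`, timm also accepts an `exportable=True` argument that is intended to select export-friendly layer implementations, which may avoid the issue without any module surgery (I have not verified this for the MEAL-V2 hub wrapper).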

Finally, thanks for providing the timm tool. It's very useful!

Metadata

Assignees: no one assigned
Labels: enhancement (New feature or request)
Milestone: none