two inputs (data and weights) are allowed only in explicit-quantization mode. #1675

@lsdNorman

Description

root@90e0df0b7943:/home/TensorRT/parsers/onnx/build# ./onnx2trt /home/my.onnx -o my.trt
Input filename: /home/my.onnx
ONNX IR version: 0.0.6
Opset version: 12
Producer name: pytorch
Producer version: 1.8
Domain:
Model version: 0
Doc string:
Parsing model
[2021-12-20 08:58:04 WARNING] onnx2trt_utils.cpp:367: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[2021-12-20 08:58:04 ERROR] Conv_90: two inputs (data and weights) are allowed only in explicit-quantization mode.
While parsing node number 90 [Conv -> "883"]:
ERROR: ModelImporter.cpp:177 In function parseGraph:
[6] Invalid Node - Conv_90
Conv_90: two inputs (data and weights) are allowed only in explicit-quantization mode.

Metadata


    Labels

Module:ONNX (issues relating to ONNX usage and import), triaged (issue has been triaged by maintainers)
