This repository has been archived by the owner on Nov 16, 2023. It is now read-only.

onnx model error: int64 is not supported #168

Open
aohan237 opened this issue Jul 17, 2019 · 24 comments

Comments

@aohan237

I used a model from https://github.com/onnx/models/tree/master/vision/object_detection_segmentation/mask-rcnn (Mask R-CNN)

and loaded the model in onnx.js, just following the instructions:

const model = new onnx.InferenceSession();
await model.loadModel("./mask_rcnn_R_50_FPN_1x.onnx");

But an error occurs: int64 is not supported.

Why is that? How can I solve it?

@micah5

micah5 commented May 2, 2020

Did you find a solution?

@aohan237
Author

aohan237 commented May 6, 2020

Sorry, no.

@Manojbhat09

Same issue. Where might it be going wrong?

@28Smiles
Contributor

JavaScript has no int64 type. You can cast your model to int32.

@waittim

waittim commented Nov 11, 2020

You can cast your model to int32.

Does it mean quantization?

@28Smiles
Contributor

Where is your model using int64?
You can use a tool like netron to look at the graph and check the corresponding nodes.

@waittim

waittim commented Nov 14, 2020

The ONNX model uses int64 in every Conv layer.
[screenshot: Conv node attributes in netron]
But in the original cfg file, they are int32.
[screenshot: original cfg file]
The int64 comes from the Darknet-to-ONNX conversion.

@28Smiles
Contributor

This can't be the problem. The parameters you show are always i64, and they work for my model too, since they are just node configuration.
[screenshot: node attributes]

You should search for in-graph usage: some input or output of type i64.

@waittim

waittim commented Nov 14, 2020

Thank you!
I checked it again. The inputs of the Range, Reshape, Slice, Concat, and Mul nodes are INT64-based.
[screenshots: the int64 nodes in netron]

Is there any way to force them to be INT32, e.g. during export or in some other step?

@28Smiles
Contributor

I had similar issues with my graphs. The approach I went for was using winmltools to load the graph into memory, then editing the graph in Python. I know there is next to no documentation on this; I used a debugger to analyse the object. You can try it like this, I guess...

@waittim

waittim commented Nov 14, 2020

Would you mind sharing some of the details? For example, which functions check the nodes and change their data type? The introduction on the WinMLTools official website doesn't cover it.
The only functions I found are:

import winmltools
model = winmltools.load_model('weights/yolo-fastest.onnx')
model.graph.input
model.graph.output

I feel like I've lost my direction...
Thank you so much!

@28Smiles
Contributor

For example, my PyTorch export produced a graph with -1 as the dimension of Expand. I wrote a function to replace the -1 with the fixed graph dimensions:

import onnx
from struct import pack, unpack

model = onnx.load_model("input.onnx")

for node in model.graph.node:
    if node.op_type == 'Expand':
        # The Expand's first input is the data tensor (an initializer here);
        # its dims supply the concrete values for any -1 in the shape.
        for initializer in model.graph.initializer:
            if initializer.name == node.input[0]:
                # Find the Constant node that produces the Expand's shape input.
                for n in model.graph.node:
                    if node.input[1] in n.output:
                        if n.op_type == 'Constant':
                            # Unpack the four int64 values encoding the target shape.
                            b, x, y, z = unpack("qqqq", n.attribute[0].t.raw_data)
                            if b == -1:
                                b = initializer.dims[0]
                            if x == -1:
                                x = initializer.dims[1]
                            if y == -1:
                                y = initializer.dims[2]
                            if z == -1:
                                z = initializer.dims[3]

                            n.attribute[0].t.raw_data = pack("qqqq", b, x, y, z)
                        break
                break

onnx.save_model(model, "model_opt.onnx")

@28Smiles
Contributor

28Smiles commented Nov 14, 2020

Here is a script to compress the node names; it saves a few kB on large models. I'm still working on quantisation, but haven't had the time yet.

import onnx

_b85alphabet = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz!#$%&()*+-;<=>?@^_`{|}~"


def int_to_base85(n: int) -> str:
    if n == 0:
        return "0"
    o = ""
    while n:
        o += _b85alphabet[int(n % 85)]
        n //= 85
    return o


def compress_names(model):
    # Graph inputs/outputs keep their names so callers can still bind them.
    reserved_names = set()
    name_map = {}
    i = 1
    for input in model.graph.input:
        reserved_names.add(input.name)
    for output in model.graph.output:
        reserved_names.add(output.name)
    for initializer in model.graph.initializer:
        name_map[initializer.name] = int_to_base85(i)
        initializer.name = name_map[initializer.name]
        i += 1
    for node in model.graph.node:
        for no, input in enumerate(node.input):
            if input not in reserved_names:
                if input in name_map:
                    node.input[no] = name_map[input]
                else:
                    name_map[input] = int_to_base85(i)
                    node.input[no] = name_map[input]
                    i += 1
        for no, output in enumerate(node.output):
            if output not in reserved_names:
                if output in name_map:
                    node.output[no] = name_map[output]
                else:
                    name_map[output] = int_to_base85(i)
                    node.output[no] = name_map[output]
                    i += 1


model = onnx.load_model("input.onnx")
compress_names(model)
onnx.save_model(model, 'compressed.onnx')

@28Smiles
Contributor

We really need a (better) library for this in the ONNX ecosystem. Working with and fixing graphs is tedious this way.

@waittim

waittim commented Nov 16, 2020

@28Smiles I did find a way to remove the old node/initializer and insert a new one. However, the new problem is that the ONNX operator ConstantOfShape doesn't support INT32 as its input. So when I use onnxruntime to test the model, it tells me:

InvalidGraph: [ONNXRuntimeError] : 10 : INVALID_GRAPH : Load model from weights/yolo-fastest-transfer.onnx failed: This is an invalid model. Type Error: Type 'tensor(int32)' of input parameter (1540) of operator (ConstantOfShape) in node (ConstantOfShape_245) is invalid.

Maybe there is another operator with a similar function that supports INT32 input, but I don't know what it is.

@28Smiles
Contributor

Is this an onnx.js error? Why do you need a variable shape?

@waittim

waittim commented Nov 16, 2020

It's an error in Python, actually. I modified the node based on the original model and tested it in Python onnxruntime before using it in onnx.js. Could it be handled in onnx.js?
Actually, for use in onnx.js, there is another error that happens before reaching ConstantOfShape:

opset.ts:48 Uncaught (in promise) TypeError: cannot resolve operator 'Shape' with opsets: ai.onnx v11
    at Object.e.resolveOperator (opset.ts:48)
    at t.resolve (session-handler.ts:60)
    at e.initializeOps (session.ts:235)
    at session.ts:92
    at t.event (instrument.ts:294)
    at e.initialize (session.ts:81)
    at e.<anonymous> (session.ts:63)
    at onnx.min.js:14
    at Object.next (onnx.min.js:14)
    at a (onnx.min.js:14)

The onnx.js documentation says Shape is supported on the CPU backend.

@28Smiles
Contributor

Yes, but it's unreleased; you have to build master.

@waittim

waittim commented Nov 17, 2020

Just to confirm, will ConstantOfShape be supported in the new version? Thank you!

@28Smiles
Contributor

Only the Shape operator

@kleinicke

I just wasted 6 hours on this problem. Can someone please fix this?
I have a model that has some ConstantOfShape nodes in 64-bit format. It didn't change even though I set the inputs in the model to np.int32.
I tried to set the values in the model to 32-bit manually, but I was only able to change the ConstantOfShape attributes, not the inputs, and not the values in the next layers that expect 64-bit.

@waittim

waittim commented Dec 31, 2020

I'm sorry to tell you that the ConstantOfShape node requires int64. I set all of its values to int32 manually before, and the model became invalid.
In the end, I chose another approach instead of onnx.js.

@mickeygo

mickeygo commented Apr 7, 2021

@waittim which approach did you choose?

@jkassis

jkassis commented Nov 9, 2021

This works if you use the "wasm" backend and bring onnx.js into the browser using <script> tags. I did not have any luck with the esbuild bundler. And the docs clearly state (once I read them) that the 'webgl' backend supports only a subset of operators.
