Can NNI prune a model transformed from ONNX format? #5499
Comments
I think it can. Could you give us an example script to get this transformed model, so we can check it?
I chose the DeepLabv3 model as an example and used the onnx2torch tool (https://github.com/ENOT-AutoDL/onnx2torch) to convert it from ONNX to PyTorch format.
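For reference, a minimal conversion sketch (the wrapper function and file path are illustrative, not the poster's actual script; it assumes onnx2torch is installed and an ONNX file is on disk):

```python
def load_onnx_as_torch(onnx_path):
    """Convert an ONNX file into a torch.nn.Module via onnx2torch."""
    # Lazy import: the sketch only needs onnx2torch when actually called.
    from onnx2torch import convert
    return convert(onnx_path)

# Example (hypothetical path, requires an ONNX file on disk):
# model = load_onnx_as_torch("deeplabv3.onnx")
```

The returned object is a regular `torch.nn.Module`, so in principle it can be passed to NNI's pruners like any hand-written model.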
config list:
error:
Environment: NNI version 2.10
And the pruning code:
Maybe I should give you the complete error:
When I just converted it and didn't prune it, the onnx2torch warning was:
pruning code:
error:
Could you show how you built your model, so that we can find where it goes wrong? It seems that the forward function is not a class member function.
Sorry, I directly used an ONNX model from others, and I don't know how it was built.
error:
I have encountered the same error, "RuntimeError: Tracing expected 0 arguments but got 1 concrete arguments". In my case it was caused by the definition of the forward function. My definition was like this:
Then I changed the definition to this, and the problem was solved.
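The same mistake can be reproduced in plain Python: a `forward` defined without `self` cannot be called correctly as a bound method, which is why tracing reports an argument-count mismatch. A minimal sketch with made-up class names:

```python
class BadModule:
    def forward(x):        # missing `self`: when called on an instance,
        return x           # `x` receives the instance, and the real input
                           # becomes an unexpected extra argument

class GoodModule:
    def forward(self, x):  # correct: first parameter is `self`
        return x

# Calling the broken version as a bound method raises TypeError,
# analogous to the tracing argument-count error above.
bad_fails = False
try:
    BadModule().forward(1)
except TypeError:
    bad_fails = True

good_result = GoodModule().forward(1)
```

This mirrors the fix described: making `forward` a proper instance method with `self` as its first parameter.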
Describe the issue:
After I use some open-source tool to transform an ONNX model into a PyTorch model, can I still use NNI to prune it?
If I can, how should I set the config_list? The transformed PyTorch model has different named_modules.
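An NNI config_list is just a list of dicts, and one way to sidestep the renamed modules is to target `op_types` rather than `op_names`. A sketch with assumed sparsity values, not the poster's actual configuration:

```python
# Hypothetical config_list: prune all Conv2d layers to 50% sparsity,
# regardless of what names the converted model gave them.
config_list = [{
    "sparsity_per_layer": 0.5,
    "op_types": ["Conv2d"],
}]

def module_names(model):
    """List the module names of a torch model, to pick `op_names` by hand."""
    # Works for any nn.Module; the empty name is the root module itself.
    return [name for name, _ in model.named_modules() if name]
```

Printing `module_names(converted_model)` shows exactly which names the conversion produced, which can then be used in an `op_names` entry if layer-specific settings are needed.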