Refactor SONNX #706
Can you extract the values for the shape from the tensor and pass them to Reshape's init?
out_channel is required. Rename it to nb_kernels.
ok.
With SONNXModel, I think we do not need the metaclass anymore.
The code above is for SONNXModel's forward.
Hi all, while refactoring SONNX I found the following issues:
Input
ONNX prefers to pass values as input tensors rather than attributes, which causes some issues when we create SINGA operators (or layers). There are two cases:
The params of an operator come from the ONNX initializer (pre-stored weights). This case is handled now.
For some ONNX operators (OneHot, Tile, Gather, Reshape, Slice, Clip), certain attributes come from other operators' outputs. We cannot handle this case yet.
For example, in BERT, the shape input of a Reshape operator comes from the previous operator's output:
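For the easier case where the shape input is a stored constant, a minimal sketch of the extraction suggested in the comments might look like this. It assumes the converter already keeps a dict mapping initializer names to their values (the names `extract_reshape_shape` and `initializers` are illustrative, not SINGA's actual API):

```python
def extract_reshape_shape(node_inputs, initializers):
    """If a Reshape node's shape input is a pre-stored constant (initializer),
    return it as a Python tuple so it can be passed to the layer constructor.
    Otherwise return None: the shape is produced by another operator at
    runtime, which is the hard case described above."""
    # ONNX Reshape convention: input[0] is the data, input[1] is the shape.
    shape_name = node_inputs[1]
    if shape_name in initializers:
        return tuple(int(v) for v in initializers[shape_name])
    return None
```

A dynamic shape (coming from another node's output) still cannot be resolved at layer-construction time, so the function signals that case with `None`.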
Layers
for @dcslin
BatchNorm2d
Conv2d
Gemm
In some models, the developer prefers Gemm instead of Linear, so we need to add Gemm as a Layer.
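For reference, a NumPy sketch of the ONNX Gemm semantics such a layer would need to implement (a Gemm layer would hold `B` as its weight and `C` as its bias; this is not SINGA's actual layer API):

```python
import numpy as np

def gemm(A, B, C=None, alpha=1.0, beta=1.0, transA=0, transB=0):
    """ONNX Gemm: Y = alpha * A' @ B' + beta * C, where A' and B' are
    optionally transposed. Unlike a plain Linear layer, Gemm carries the
    alpha/beta scaling factors and the transpose flags."""
    if transA:
        A = A.T
    if transB:
        B = B.T
    Y = alpha * (A @ B)
    if C is not None:
        Y = Y + beta * C
    return Y
```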
Metaclass
I've checked metaclasses carefully, but it seems I cannot use a metaclass to modify the forward function in this case. The situation is: I have a graph described by ONNX, and I need to write a forward function using SINGA's operators. I can call SINGA's operators by traversing the graph, but I cannot generate a forward function automatically from the graph.
This is more like the exec function. For example, I have a graph like this:
So, the above forward is my current implementation.
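The interpreter-style forward described above can be sketched roughly as follows. All names here (`forward`, `op_table`, the tuple layout of a node) are hypothetical stand-ins for illustration, not SONNX's real data structures:

```python
def forward(nodes, inputs, weights, op_table):
    """Walk the graph's nodes in topological order, dispatching each op
    to a handler from op_table and storing intermediate outputs by name.
    This avoids generating a forward function's source code: the graph
    itself drives execution, exec-style."""
    tensors = dict(inputs)   # graph inputs, keyed by name
    tensors.update(weights)  # pre-stored weights from the initializer
    for op_type, in_names, out_name in nodes:
        args = [tensors[name] for name in in_names]
        tensors[out_name] = op_table[op_type](*args)
    # Return the output of the last node.
    return tensors[nodes[-1][2]]
```

A tiny usage example: with `op_table = {"Add": lambda a, b: a + b, "Mul": lambda a, b: a * b}` and nodes `[("Add", ["x", "w"], "t1"), ("Mul", ["t1", "x"], "y")]`, calling `forward(nodes, {"x": 3}, {"w": 4}, op_table)` computes `(3 + 4) * 3`.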