
Refactor SONNX #706

Closed · joddiy opened this issue May 19, 2020 · 2 comments

joddiy (Member) commented May 19, 2020

Hi, all, during refactoring SONNX, I found the following issues:

Input

ONNX prefers to use tensors as input instead of attributes, which may incur some issues when we create SINGA operators (or layers). There are two cases:

  1. SINGA params <- ONNX Initializer
    The params of an operator come from the ONNX Initializer (pre-stored weights). This part is OK now.
  2. SINGA params <- ONNX operator, dynamic graph
    For some ONNX operators (OneHot, Tile, Gather, Reshape, Slice, Clip), some attributes come from other operators' outputs, and we cannot handle this case yet (see the sketch after this list).
    For example, in BERT, the shape of this Reshape operator comes from the previous operator:
    ![image](https://user-images.githubusercontent.com/14108933/82379081-dcc63c00-9a58-11ea-8ba1-ab637dbe417d.png)
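
To make the failing pattern concrete, here is a minimal sketch of such a graph, built with the standard onnx helper API (the node and tensor names are illustrative): Reshape's target shape arrives as the output of another node, so it is unknown until runtime.

```python
# Minimal sketch of the problematic pattern: Reshape's target shape is the
# output of a Shape node, i.e. a runtime tensor rather than a static attribute.
import onnx
from onnx import helper, TensorProto

shape_node = helper.make_node("Shape", inputs=["x"], outputs=["dyn_shape"])
reshape_node = helper.make_node("Reshape", inputs=["y", "dyn_shape"], outputs=["z"])

graph = helper.make_graph(
    [shape_node, reshape_node],
    "dynamic_reshape",
    inputs=[
        helper.make_tensor_value_info("x", TensorProto.FLOAT, [2, 3]),
        helper.make_tensor_value_info("y", TensorProto.FLOAT, [3, 2]),
    ],
    outputs=[helper.make_tensor_value_info("z", TensorProto.FLOAT, [2, 3])],
)
model = helper.make_model(graph)
onnx.checker.check_model(model)  # valid ONNX, yet the shape exists only at runtime
```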

Layers

for @dcslin

BatchNorm2d

  • remove num_features
  • self.allow_params = ["scale", "bias", "running_mean", "running_var"]

Conv2d

  • remove in_channels, out_channels

Gemm

In some models, the developer prefers gemm to linear, so we need to add gemm to Layer.
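
For reference, ONNX's Gemm computes Y = alpha * A' * B' + beta * C with optional transposes, a superset of a plain linear layer. Below is a minimal numpy sketch of those semantics, written for clarity; it is not the SINGA implementation.

```python
import numpy as np

# Reference semantics of ONNX Gemm: Y = alpha * A' @ B' + beta * C.
def gemm(A, B, C=None, alpha=1.0, beta=1.0, transA=0, transB=0):
    A = A.T if transA else A
    B = B.T if transB else B
    Y = alpha * (A @ B)
    if C is not None:
        Y = Y + beta * C  # C is broadcast against Y, e.g. a bias row vector
    return Y

# linear(x) == gemm(x, W, b) with the default alpha/beta/transA/transB
x, W, b = np.ones((2, 3)), np.ones((3, 4)), np.zeros(4)
assert gemm(x, W, b).shape == (2, 4)
```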

Metaclass

I've checked the metaclass carefully, but it seems I cannot use a metaclass to modify the forward function in this case. The situation is: I have a graph described by ONNX, and I need to write a forward function that uses SINGA's operators. I can call SINGA's operators by walking the graph, but I cannot generate a forward function automatically from the graph.

This is more like the exec function.

For example, I have a graph like this:

```python
graph = {
    "op1": {"inputs": ["a1"], "outputs": ["a2"]},
    "op2": {"inputs": ["a2"], "outputs": ["a3"]},
}

# what I can do: interpret the graph, keeping intermediate tensors in a dict
def forward(x):
    tensors = {"a1": x}
    for op_name, op_info in graph.items():
        inputs = [tensors[inp] for inp in op_info["inputs"]]
        outputs = ops[op_name](*inputs)  # ops maps op names to SINGA operators
        for outp, val in zip(op_info["outputs"], outputs):
            tensors[outp] = val
    return tensors

# what I cannot do by metaclass but can with exec
program = parse_graph_to_str(graph)
# 'a2=op1(a1)\na3=op2(a2)'
exec(program)
```

So, the forward above is my current implementation.
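
For concreteness, here is a self-contained toy version of that interpreter-style forward; plain Python functions stand in for SINGA operators, and every name here is illustrative.

```python
# Toy, self-contained version of the interpreter-style forward above.
def op1(x):
    return (x + 1,)  # operators return tuples of outputs

def op2(x):
    return (x * 2,)

ops = {"op1": op1, "op2": op2}
graph = {
    "op1": {"inputs": ["a1"], "outputs": ["a2"]},
    "op2": {"inputs": ["a2"], "outputs": ["a3"]},
}

def forward(x):
    tensors = {"a1": x}                     # named intermediate results
    for op_name, op_info in graph.items():  # insertion order == topological order here
        inputs = [tensors[inp] for inp in op_info["inputs"]]
        outputs = ops[op_name](*inputs)
        for outp, val in zip(op_info["outputs"], outputs):
            tensors[outp] = val
    return tensors["a3"]

assert forward(3) == 8  # (3 + 1) * 2
```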

nudles (Member) commented May 20, 2020

> Hi, all, during refactoring SONNX, I found the following issues:
>
> Input
>
> ONNX prefers to use tensors as input instead of attributes, which may incur some issues when we create SINGA operators (or layers). There are two cases:
>
> 1. SINGA params <- ONNX Initializer
>    The params of an operator come from the ONNX Initializer (pre-stored weights). This part is OK now.
>
> 2. **SINGA params <- ONNX operator, dynamic graph**
>    For some ONNX operators (OneHot, Tile, Gather, Reshape, Slice, Clip), some attributes come from other operators' outputs, and we cannot handle this case yet.
>    For example, in BERT, the shape of this Reshape operator comes from the previous operator:
>    ![image](https://user-images.githubusercontent.com/14108933/82379081-dcc63c00-9a58-11ea-8ba1-ab637dbe417d.png)

can you extract the values for the shape from the tensor and pass them to init Reshape?
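
A sketch of one way to do that when the shape input is a graph initializer (onnx.numpy_helper is the real ONNX API; the function name is illustrative). Shapes produced by genuinely dynamic upstream ops would still not be foldable:

```python
from onnx import numpy_helper

def constant_reshape_shape(model, reshape_node):
    """Return the target shape as a Python list if it comes from an
    initializer; None means it is truly dynamic and cannot be folded."""
    initializers = {t.name: t for t in model.graph.initializer}
    shape_input = reshape_node.input[1]  # Reshape's second input is the shape
    if shape_input in initializers:
        return numpy_helper.to_array(initializers[shape_input]).tolist()
    return None
```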

> Layers
>
> for @dcslin
>
> BatchNorm2d
>
> * remove num_features
>
> * self.allow_params = ["scale", "bias", "running_mean", "running_var"]

running_mean and running_var are not params (not updated via sgd).
They are state variables.
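
To illustrate the distinction, a hypothetical sketch (not SINGA's actual API): params receive gradients and are updated by the optimizer, while states are buffers mutated inside forward.

```python
# Hypothetical sketch, not SINGA's actual API: trainable params get gradients;
# states such as running_mean/running_var are only mutated during forward().
class BatchNorm2dSketch:
    def __init__(self, momentum=0.9):
        self.momentum = momentum
        self.params = {"scale": None, "bias": None}
        self.states = {"running_mean": None, "running_var": None}

    def update_states(self, batch_mean, batch_var):
        # assumes the states were initialized or loaded beforehand
        m = self.momentum
        self.states["running_mean"] = m * self.states["running_mean"] + (1 - m) * batch_mean
        self.states["running_var"] = m * self.states["running_var"] + (1 - m) * batch_var
```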

> Conv2d
>
> * remove in_channels, out_channels

out_channels is required; rename it to nb_kernels.
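
Relatedly, both channel counts can be recovered from the ONNX Conv weight initializer, whose layout is [out_channels, in_channels/group, kH, kW] per the ONNX spec. A small sketch (the function name is illustrative):

```python
from onnx import numpy_helper

def conv_channels(weight_tensorproto, group=1):
    W = numpy_helper.to_array(weight_tensorproto)
    nb_kernels = W.shape[0]           # ONNX weight dim 0 == out_channels
    in_channels = W.shape[1] * group  # dim 1 == in_channels / group
    return nb_kernels, in_channels
```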

> Gemm
>
> In some models, the developer prefers gemm to linear, so we need to add gemm to Layer.

ok.

> Metaclass
>
> I've checked the metaclass carefully, but it seems I cannot use a metaclass to modify the forward function in this case. The situation is: I have a graph described by ONNX, and I need to write a forward function that uses SINGA's operators. I can call SINGA's operators by walking the graph, but I cannot generate a forward function automatically from the graph.

With SONNXModel, I think we do not need metaclass anymore.

> This is more like the exec function.
>
> For example, I have a graph like this:
>
> ```python
> graph = {
>     "op1": {"inputs": ["a1"], "outputs": ["a2"]},
>     "op2": {"inputs": ["a2"], "outputs": ["a3"]},
> }
>
> # what I can do: interpret the graph, keeping intermediate tensors in a dict
> def forward(x):
>     tensors = {"a1": x}
>     for op_name, op_info in graph.items():
>         inputs = [tensors[inp] for inp in op_info["inputs"]]
>         outputs = ops[op_name](*inputs)  # ops maps op names to SINGA operators
>         for outp, val in zip(op_info["outputs"], outputs):
>             tensors[outp] = val
>     return tensors
> ```

The code above is for SONNXModel's forward; you just need to consider the aux_output.

> ```python
> # what I cannot do by metaclass but can with exec
> program = parse_graph_to_str(graph)
> # 'a2=op1(a1)\na3=op2(a2)'
> exec(program)
> ```


> So, the forward above is my current implementation.

joddiy (Member, Author) commented May 20, 2020

  1. can you extract the values for the shape from the tensor and pass them to init Reshape?
    Yes, I'm going to do this.

  2. running_mean and running_var are not params (not updated via sgd). They are state variables.
    Got it, but I cannot see set_states on the Layer class yet (see the sketch after this list).

  3. out_channels is required. Rename it to nb_kernels.
    Got it, I think I can parse the out_channels from the ONNX weights.

  4. With SONNXModel, I think we do not need metaclass anymore. You just need to consider the aux_output.
    Yes, this part has been done.
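
On point 2, a hypothetical sketch of what the missing hooks could look like; the names mirror this discussion, and SINGA's actual API may differ:

```python
# Hypothetical sketch, not SINGA's actual API: state buffers (running_mean,
# running_var, ...) must be loadable from an ONNX model but are never
# touched by the optimizer.
class Layer:
    def __init__(self):
        self._states = {}

    def get_states(self):
        """Return {name: tensor} for the non-trainable buffers."""
        return dict(self._states)

    def set_states(self, states):
        """Load non-trainable buffers, e.g. from ONNX initializers."""
        self._states.update(states)
```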

joddiy closed this as completed May 26, 2020