
WeightedDirectedGraph doesn't support multiple edges from one node to another #120

bezineb5 opened this issue Sep 6, 2019 · 5 comments

Comments

@bezineb5
Contributor

bezineb5 commented Sep 6, 2019

Context
I have a graph which contains the following "Sub" operation:
[image: graph fragment with a "Sub" node whose two inputs are the same value]
=> It subtracts a value from itself.
(Why is it like that? It comes from the conversion of TensorFlow's Maximum/Minimum to ONNX opset 7, which is translated this way to support broadcasting. However, I would expect the behaviour to be the same for a square X*X operation.)

Problem
When running the model, I get the following error:
bad arity for operation (have 1, want 2)

Cause
Looking at the implementation of WeightedDirectedGraph.SetWeightedEdge, it seems that it is not possible to have two identical edges from one node to another.
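
For illustration, here is a minimal sketch (not the onnx-go code itself) showing how gonum's simple.WeightedDirectedGraph collapses two identical edges into one, which is what leaves the operation with a single input:

package main

import (
        "fmt"

        "gonum.org/v1/gonum/graph/simple"
)

func main() {
        g := simple.NewWeightedDirectedGraph(0, 0)
        x := g.NewNode()
        g.AddNode(x)
        mul := g.NewNode()
        g.AddNode(mul)

        // The ONNX node lists "x:0" twice as input, which translates into two
        // identical edges. simple.WeightedDirectedGraph keeps at most one edge
        // per (from, to) pair, so the second call overwrites the first...
        g.SetWeightedEdge(g.NewWeightedEdge(x, mul, 0))
        g.SetWeightedEdge(g.NewWeightedEdge(x, mul, 1))

        // ...and the operation ends up with a single incoming edge:
        // arity 1 instead of 2.
        fmt.Println(g.Edges().Len()) // 1
}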

Solution
It's not clear how to fix this without significant changes. However, the same pattern can occur in other cases and prevent the correct execution of many ONNX models.

Any idea on a fix?

@owulveryck
Owner

Hello,
Many thanks for submitting this issue and for your investigation.
I will have a closer look at the bug as soon as possible.
The first step is to write a simple test to reproduce this; it will ease the analysis and the debugging process.

@bezineb5
Contributor Author

bezineb5 commented Sep 7, 2019

Hi,
Thanks for this. I attached an example model and source code:
example_mul.zip
The graph just squares the input:
[image: graph with a single "Mul" node that multiplies the input "x:0" by itself]

Expected output (with input = 2.0): 4.0
Actual output:
bad arity for operation (have 1, want 2)
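
For reference, the attached source code is roughly equivalent to the sketch below, which follows the usage shown in the onnx-go README (the file name example_mul.onnx and the input value are assumptions):

package main

import (
        "fmt"
        "io/ioutil"
        "log"

        "github.com/owulveryck/onnx-go"
        "github.com/owulveryck/onnx-go/backend/x/gorgonnx"
        "gorgonia.org/tensor"
)

func main() {
        // Create the gorgonnx backend and decode the ONNX model into it.
        backend := gorgonnx.NewGraph()
        model := onnx.NewModel(backend)

        b, err := ioutil.ReadFile("example_mul.onnx") // file name is an assumption
        if err != nil {
                log.Fatal(err)
        }
        if err := model.UnmarshalBinary(b); err != nil {
                log.Fatal(err)
        }

        // The model has a single float32 input of shape [1].
        input := tensor.New(tensor.WithShape(1), tensor.WithBacking([]float32{2}))
        model.SetInput(0, input)

        // Expected output: 4. Actual: "bad arity for operation (have 1, want 2)".
        if err := backend.Run(); err != nil {
                log.Fatal(err)
        }
        output, err := model.GetOutputTensors()
        if err != nil {
                log.Fatal(err)
        }
        fmt.Println(output[0])
}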

@owulveryck
Owner

owulveryck commented Sep 8, 2019

Thanks a lot.
I've created a branch for tracking this issue.

I think the solution would be to play with the self edge (cf godoc).

I've dumped the content of your onnx file for info.
I will work on it tomorrow evening CEST, but from what I see, the solution would be to add a self edge when two inputs are listed and equal.

The main problem is that we lose the order of the inputs, as the self edge must have a fixed weight within the current implementation.

pb.ModelProto{
        IrVersion: 5,
        OpsetImport: []*pb.OperatorSetIdProto{
                &pb.OperatorSetIdProto{
                        Domain:  "",
                        Version: 7,
                },
        },
        ProducerName:    "tf2onnx",
        ProducerVersion: "1.5.3",
        Domain:          "",
        ModelVersion:    0,
        DocString:       "",
        Graph: &pb.GraphProto{
                Node: []*pb.NodeProto{
                        &pb.NodeProto{
                                Input: []string{
                                        "x:0",
                                        "x:0",
                                },
                                Output: []string{
                                        "mul:0",
                                },
                                Name:      "mul",
                                OpType:    "Mul",
                                Domain:    "",
                                Attribute: nil,
                                DocString: "",
                        },
                },
                Name:        "tf2onnx",
                Initializer: nil,
                DocString:   "converted from ./model_nowind_test/export/",
                Input: []*pb.ValueInfoProto{
                        &pb.ValueInfoProto{
                                Name: "x:0",
                                Type: &pb.TypeProto{
                                        Value: &pb.TypeProto_TensorType{
                                                TensorType: &pb.TypeProto_Tensor{
                                                        ElemType: 1,
                                                        Shape: &pb.TensorShapeProto{
                                                                Dim: []*pb.TensorShapeProto_Dimension{
                                                                        &pb.TensorShapeProto_Dimension{
                                                                                Value: &pb.TensorShapeProto_Dimension_DimValue{
                                                                                        DimValue: 1,
                                                                                },
                                                                                Denotation: "",
                                                                        },
                                                                },
                                                        },
                                                },
                                        },
                                        Denotation: "",
                                },
                                DocString: "",
                        },
                },
                Output: []*pb.ValueInfoProto{
                        &pb.ValueInfoProto{
                                Name: "mul:0",
                                Type: &pb.TypeProto{
                                        Value: &pb.TypeProto_TensorType{
                                                TensorType: &pb.TypeProto_Tensor{
                                                        ElemType: 1,
                                                        Shape: &pb.TensorShapeProto{
                                                                Dim: []*pb.TensorShapeProto_Dimension{
                                                                        &pb.TensorShapeProto_Dimension{
                                                                                Value: &pb.TensorShapeProto_Dimension_DimValue{
                                                                                        DimValue: 1,
                                                                                },
                                                                                Denotation: "",
                                                                        },
                                                                },
                                                        },
                                                },
                                        },
                                        Denotation: "",
                                },
                                DocString: "",
                        },
                },
                ValueInfo: nil,
        },
        MetadataProps: nil,
}
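
To make the "fixed weight" constraint concrete: assuming the edge weight is what encodes the input position (as suggested above), a self edge cannot distinguish input 0 from input 1, because simple.WeightedDirectedGraph only knows one self weight, fixed at construction time. A minimal sketch:

package main

import (
        "fmt"
        "math"

        "gonum.org/v1/gonum/graph/simple"
)

func main() {
        // The first argument is the weight reported for any node to itself;
        // it is a single value shared by the whole graph.
        g := simple.NewWeightedDirectedGraph(0, math.Inf(1))
        n := g.NewNode()
        g.AddNode(n)

        // There is no way to store both "input 0" and "input 1" on a self
        // edge: Weight always reports the graph-wide self value.
        w, ok := g.Weight(n.ID(), n.ID())
        fmt.Println(w, ok) // 0 true
}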

@owulveryck
Owner

owulveryck commented Sep 8, 2019

The plan is:

  • write a (red) test in onnx-go based on the structure in the comment above
  • make the test pass by modifying the onnx decoder
  • write a similar test in gorgonnx
  • make that test pass

@owulveryck
Owner

Some update about this issue.
The branch now has the tests, which is already something. At first, I thought it would be easy to implement the self link, at least in the main graph (in the Model structure).
The problem is that the simple graph implementation we use as a receiver for gorgonnx does not handle self links:

type Graph struct {
        g *simple.WeightedDirectedGraph
        // ...
}

From the godoc of gonum:

SetWeightedEdge adds a weighted edge from one node to another. If the nodes do not exist, they are added and are set to the nodes of the edge otherwise. It will panic if the IDs of the e.From and e.To are equal.

Therefore using a self-link is more difficult than expected.
(On top of that, walking the graph in gorgonnx would not be trivial.)
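
For reference, a minimal sketch of that panic (not onnx-go code, just the gonum behaviour quoted above):

package main

import (
        "fmt"

        "gonum.org/v1/gonum/graph/simple"
)

func main() {
        defer func() {
                // simple.WeightedDirectedGraph rejects self edges outright.
                fmt.Println("recovered:", recover())
        }()

        g := simple.NewWeightedDirectedGraph(0, 0)
        n := g.NewNode()
        g.AddNode(n)
        g.SetWeightedEdge(g.NewWeightedEdge(n, n, 0)) // panics
}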

It may therefore be safer to think about another implementation to solve this issue.
For now, I am out of ideas, but I am sure a solution exists; we just need to figure it out.
