
[Frontend][TENSORFLOW] batch_norm() got an unexpected keyword argument 'exponential_avg_factor' #5978

Closed · Leslie-Fang opened this issue Jul 2, 2020 · 3 comments

Leslie-Fang (Contributor) commented Jul 2, 2020

I am trying to convert a ResNet-style model to TVM, but I get the following error:

Traceback (most recent call last):
  File "tensorflow_tvm.py", line 47, in <module>
    mod, params = relay.frontend.from_tensorflow(graph_def,
  File "/home/lesliefang/tvm/tvm/python/tvm/relay/frontend/tensorflow.py", line 3574, in from_tensorflow
    mod, params = g.from_tensorflow(graph, layout, shape, outputs)
  File "/home/lesliefang/tvm/tvm/python/tvm/relay/frontend/tensorflow.py", line 2967, in from_tensorflow
    func = self._get_relay_func(graph, layout=layout, shape=shape, outputs=outputs)
  File "/home/lesliefang/tvm/tvm/python/tvm/relay/frontend/tensorflow.py", line 2926, in _get_relay_func
    self._backtrack_construct(node.name)
  File "/home/lesliefang/tvm/tvm/python/tvm/relay/frontend/tensorflow.py", line 3508, in _backtrack_construct
    op = self._convert_operator(node.op, inputs, attr, self._graph)
  File "/home/lesliefang/tvm/tvm/python/tvm/relay/frontend/tensorflow.py", line 3365, in _convert_operator
    sym = convert_map[op_name](inputs, attrs, self._params, self._mod)
  File "/home/lesliefang/tvm/tvm/python/tvm/relay/frontend/tensorflow.py", line 1265, in _impl
    out = AttrCvt(op_name='batch_norm',
  File "/home/lesliefang/tvm/tvm/python/tvm/relay/frontend/common.py", line 417, in __call__
    return get_relay_op(op_name)(*inputs, **new_attrs)
TypeError: batch_norm() got an unexpected keyword argument 'exponential_avg_factor'

Referring to issue #3428, I checked my model's batch-norm op; it does not set this attribute. However, since PR tensorflow/tensorflow#37176, TensorFlow sets a default value for this attribute.

I have two questions:

  1. It seems TVM also checks the op definition in TensorFlow's implementation. Why not just check the user-defined attributes in the PB model?
  2. Is it OK for me to just add

self._ignores.append('exponential_avg_factor')

here? https://github.com/apache/incubator-tvm/blob/512ed3930a61daf38e80e1f71e51f0d1f139fb8e/python/tvm/relay/frontend/common.py#L367-L374
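For context, TVM's AttrCvt drops any attribute listed in its ignores set before calling the target Relay op, which is why appending 'exponential_avg_factor' suppresses the TypeError. A minimal, self-contained sketch of that mechanism (the AttrCvt class and batch_norm function below are simplified stand-ins, not TVM's actual implementations):

```python
# Simplified stand-in for TVM's relay.frontend.common.AttrCvt.
# It illustrates how attributes listed in `ignores` are silently
# dropped before the target op is called.

class AttrCvt:
    def __init__(self, op, ignores=None):
        self._op = op
        self._ignores = list(ignores or [])

    def __call__(self, inputs, attrs):
        # Drop any attribute the converter was told to ignore.
        new_attrs = {k: v for k, v in attrs.items() if k not in self._ignores}
        return self._op(*inputs, **new_attrs)


def batch_norm(data, epsilon=1e-5):
    # Toy op that only accepts the attributes it knows about;
    # passing an unknown keyword would raise a TypeError, as in the issue.
    return ("batch_norm", data, epsilon)


# With 'exponential_avg_factor' in ignores, the unknown attribute is
# stripped and the call succeeds instead of raising TypeError.
convert = AttrCvt(batch_norm, ignores=["exponential_avg_factor"])
result = convert(["x"], {"epsilon": 1e-3, "exponential_avg_factor": 1.0})
```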

Leslie-Fang (Contributor, Author) commented:

@FrozenGene Could you help take a look at this issue? :) Thanks

tqchen (Member) commented Jul 3, 2020

Thanks for reporting the problem. The community uses https://discuss.tvm.ai/ for quick troubleshooting and discussions; please open a new thread there.

tqchen closed this as completed Jul 3, 2020
mikeseven (Contributor) commented Sep 27, 2020

@tqchen I don't see the issue solved with top of tree today, but adding @Leslie-Fang's solution to common.py does work. I think it's OK because these are training ops that should have been removed from frozen graphs.
Should the same be done for RandomUniform, FIFOQueue(V2), and QueueDequeueMany(V2)?
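One way to sanity-check a frozen graph before conversion is to scan its node op types for known training-only ops. A hedged sketch, using a plain list of op-type strings in place of a real TensorFlow GraphDef (the denylist below is illustrative, not exhaustive):

```python
# Op types that are typically training- or input-pipeline-only and
# should not survive graph freezing. Illustrative denylist, not exhaustive.
TRAINING_ONLY_OPS = {
    "RandomUniform",
    "FIFOQueue", "FIFOQueueV2",
    "QueueDequeueMany", "QueueDequeueManyV2",
}

def find_training_ops(node_op_types):
    """Return the training-only op types present in a graph.

    `node_op_types` stands in for `[n.op for n in graph_def.node]`
    from a real TensorFlow GraphDef.
    """
    return sorted(set(node_op_types) & TRAINING_ONLY_OPS)

# Example: a graph that still contains a RandomUniform node.
ops = ["Placeholder", "Conv2D", "FusedBatchNormV3", "RandomUniform", "Relu"]
leftover = find_training_ops(ops)
```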
