#4676 · Closed · hchings opened this issue on Nov 25, 2022 · 0 comments
I'm trying to convert Hugging Face's CANINE model to ONNX.
```python
import torch
from torch import tensor

example = {'input_ids': tensor([[57344, 48, 48, 48, 48, 48, 48, 57345],
                                [57344, 48, 48, 48, 48, 48, 48, 57345]]),
           'token_type_ids': tensor([[0, 0, 0, 0, 0, 0, 0, 0],
                                     [0, 0, 0, 0, 0, 0, 0, 0]]),
           'attention_mask': tensor([[1, 1, 1, 1, 1, 1, 1, 1],
                                     [1, 1, 1, 1, 1, 1, 1, 1]])}

torch.onnx.export(model,                 # model being run
                  example,               # model input (or a tuple for multiple inputs)
                  "./onnx/canine.onnx",  # where to save the model (can be a file or file-like object)
                  opset_version=13,      # the ONNX version to export the model to
                  do_constant_folding=True,
                  input_names=['input_ids', 'attention_mask', 'token_type_ids'],  # the model's input names
                  output_names=['last_hidden_state'],                             # the model's output names
                  dynamic_axes={'input_ids': {0: 'batch', 1: 'sequence'},         # variable-length axes
                                'attention_mask': {0: 'batch', 1: 'sequence'},
                                'token_type_ids': {0: 'batch', 1: 'sequence'},
                                'last_hidden_state': {0: 'batch_size'}})
```
After the model is exported, when I try to run inference with a different sequence length (e.g., 9), I get this error:

```
[E:onnxruntime:, sequential_executor.cc:369 Execute] Non-zero status code returned while running Concat node. Name:'Concat_1714' Status Message: concat.cc:159 PrepareForCompute Non concat axis dimensions must match: Axis 1 has mismatched dimensions of 1536 and 9
```
I suspect it's due to the few TracerWarnings during export [complete TracerWarnings and Exported graph logs]:

```
src/transformers/models/canine/modeling_canine.py:605: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
```

The problem code snippet in the model that caused those warnings:
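To illustrate what that warning means, here is a minimal sketch (a toy function, not CANINE's code) where `torch.jit.trace` bakes a Python-boolean branch into the graph as a constant, so the traced graph gives a different answer than eager mode on other inputs:

```python
import torch

def f(x):
    # `x.sum() > 0` is a tensor; the `if` converts it to a Python bool,
    # which is exactly what triggers the TracerWarning. The branch taken
    # during tracing is frozen into the graph.
    if x.sum() > 0:
        return x + 1
    return x - 1

traced = torch.jit.trace(f, torch.ones(2, 2))  # traces the `x + 1` branch
out = traced(-torch.ones(2, 2))                # still runs `x + 1`
print(out)  # zeros, whereas eager f(-ones) would return -2s
```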
My questions:
Further information
Relevant Area: onnx export
Versions:
Is this issue related to a specific model?
Model name: CANINE
Model opset: 13