
convert our model(s) from onnx to tf #56

Closed
cwmeijer opened this issue Nov 2, 2021 · 9 comments

Comments

cwmeijer (Member) commented Nov 2, 2021

No description provided.

cwmeijer (Member, Author) commented Nov 2, 2021

Use
https://github.com/onnx/onnx-tensorflow/graphs/contributors
First, try it on our models.
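For reference, a minimal sketch of the onnx-tensorflow conversion path. This assumes the `onnx` and `onnx-tf` packages are installed; the file paths in the usage example are hypothetical placeholders.

```python
def convert_onnx_to_tf(onnx_path, export_dir):
    """Convert an ONNX model file to a TensorFlow SavedModel directory."""
    import onnx
    from onnx_tf.backend import prepare

    onnx_model = onnx.load(onnx_path)  # parse the ONNX protobuf
    tf_rep = prepare(onnx_model)       # wrap the graph in a TF representation
    tf_rep.export_graph(export_dir)    # write out a TensorFlow SavedModel


# Hypothetical usage:
# convert_onnx_to_tf("leafsnap_model.onnx", "leafsnap_model_tf")
```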

@cwmeijer cwmeijer changed the title convert our model(s) from onnx to pt through tf convert our model(s) from onnx to tf Nov 2, 2021
@cwmeijer cwmeijer self-assigned this Nov 2, 2021
cwmeijer (Member, Author) commented Nov 4, 2021

Explored it with a single model (leafsnap) and the predictions were equivalent to those of the original ONNX model. The script can be found here:
https://github.com/dianna-ai/dianna-exploration/blob/56-onnx-to-tf-exploration/onnx_conversion_scripts/onnx_to_tensorflow.ipynb
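The equivalence claim amounts to comparing model outputs on the same input up to floating-point tolerance. A toy sketch of such a check, with hypothetical output arrays standing in for the real model outputs:

```python
import numpy as np


def predictions_equivalent(onnx_out, tf_out, atol=1e-5):
    """Return True if the two model outputs agree up to float round-off."""
    return np.allclose(onnx_out, tf_out, atol=atol)


# Hypothetical outputs from the original ONNX model and the converted TF model:
onnx_out = np.array([0.10, 0.70, 0.20])
tf_out = np.array([0.10, 0.70, 0.20])
print(predictions_equivalent(onnx_out, tf_out))  # True
```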

@cwmeijer cwmeijer assigned geek-yang and unassigned cwmeijer Nov 11, 2021
@cwmeijer cwmeijer added the standup label (temp label for discussion with the team at the next standup) Nov 18, 2021
elboyran (Contributor) commented:

Can somebody who has time please test the Tutorial notebook of onnx-tensorflow, executing exactly the same notebook and just appending the shap step, to see if it works?

loostrum (Member) commented:

That tutorial didn't work for me even without changing anything: the model they provide fails when running prepare (see also the discussion in the Teams channel). I couldn't find which versions of e.g. tensorflow they used.

I did do some tests together with Chris and Yang, see this notebook: https://github.com/dianna-ai/dianna-exploration/blob/test-shap-tf-onnx/onnx_conversion_scripts/test_shap_tf_onnx.ipynb
We're getting further, i.e. we can create an explainer object, but computing the shap values still fails.
shap also fails on a native Keras model (see the same notebook); it looks like the DeepExplainer is not (fully) compatible with TensorFlow 2, or at least not with recent versions.
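For context, the failing step corresponds to the usual shap DeepExplainer pattern. The function and argument names here are hypothetical; per this thread, the explainer object can be created, but the `shap_values` call is where it fails.

```python
def explain_with_deep_explainer(model, background, samples):
    """Compute SHAP values for `samples` with shap's DeepExplainer."""
    import shap

    # Creating the explainer object works in our tests ...
    explainer = shap.DeepExplainer(model, background)
    # ... but this call fails with recent TensorFlow 2 versions.
    return explainer.shap_values(samples)
```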

cwmeijer (Member, Author) commented:

2 remarks:

elboyran (Contributor) commented:

Note that the relevant branches are in dianna-exploration. If this results in a PR, we should not forget it in the stand-ups.

elboyran (Contributor) commented:

My interpretations of the errors we got in https://github.com/dianna-ai/dianna-exploration/blob/test-shap-tf-onnx/onnx_conversion_scripts/test_shap_tf_onnx.ipynb:

For our model: the warning after cell [21] about exporting a tensor dict might be related to the later error after cell [24]:

TypeError: Expected any non-tensor type, got a tensor instead.

Looks like versioning issues indeed. But maybe we can try not with mnist, but with another of our models? I recall @geek-yang saying the mnist model had a different structure after conversion than the others.

For the native Keras model: I can also report our error in their issue, hoping someone will pay attention. It is also worth looking at how the user skomatin changed his model and avoided the error (post from 11 Sep 2020). He thinks shap does not support TF 2.0 :-( @loostrum, which shap and TF versions did you use?

elboyran (Contributor) commented:

One more thought: if the DeepExplainer attempts ultimately fail, is there any point in trying shap's slower KernelExplainer?

https://github.com/slundberg/shap
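KernelExplainer is model-agnostic: it only needs a prediction function and a background dataset, so it sidesteps the TensorFlow internals that DeepExplainer hooks into, at the cost of speed. A hedged sketch (names are hypothetical; requires the `shap` package):

```python
def explain_with_kernel_explainer(predict_fn, background, samples, nsamples=100):
    """Model-agnostic SHAP values; slower than DeepExplainer but framework-independent."""
    import shap

    # KernelExplainer only ever calls predict_fn, so it works with any
    # backend (native TF, the converted model, or an ONNX runtime session).
    explainer = shap.KernelExplainer(predict_fn, background)
    return explainer.shap_values(samples, nsamples=nsamples)
```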

loostrum (Member) commented:

I tried the latest version of shap (0.40), a slightly older version (0.38), and the latest master branch, all with tensorflow 2.6.

@cwmeijer cwmeijer removed the standup label (temp label for discussion with the team at the next standup) Nov 30, 2021