Use WIT for model trained in tfx #37
Comments
Looking at the official documentation (https://www.tensorflow.org/guide/saved_model#savedmodels_from_estimators), it seems that when you load a saved model from disk, what you get back is not an Estimator. But you should still be able to call predict on that object by defining your own custom prediction function, as is done in that documentation, and then providing that custom predict function to the WitConfigBuilder. Let me know if an approach similar to the predict(x) function in that link works for you.
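The wiring described above might look like the following sketch. The signature name "predict", the input key "examples", and the output key "probabilities" are assumptions about how the model was exported, not facts from the thread:

```python
# Sketch: wrapping a reloaded SavedModel in a custom predict fn for WIT.
# Assumptions (not from the thread): the model was exported with a "predict"
# signature that takes serialized tf.Examples under the key "examples" and
# returns a tensor under the key "probabilities".
import tensorflow as tf

def make_custom_predict_fn(imported):
    """Build a WIT-compatible predict fn around a tf.saved_model.load() result."""
    infer = imported.signatures["predict"]  # signature name is an assumption

    def custom_predict_fn(examples):
        # WIT passes a list of tf.train.Example protos; the signature expects
        # a batch of serialized proto strings.
        serialized = tf.constant([ex.SerializeToString() for ex in examples])
        output = infer(examples=serialized)
        return output["probabilities"].numpy()  # output key is an assumption

    return custom_predict_fn

# Notebook usage (requires the witwidget package):
# from witwidget.notebook.visualization import WitConfigBuilder, WitWidget
# config = WitConfigBuilder(test_examples).set_custom_predict_fn(
#     make_custom_predict_fn(imported))
# WitWidget(config)
```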
@jameswex When using the predict function:
(with imported being the object returned by tf.saved_model.load) I get the error:
Since you have defined your own custom prediction function instead of using a tf.Estimator, you'll want to change your code to something like:
@jameswex OK, this is better now, but I have a problem: my features are of type list:
So I get the error:
I have more than 30 features in total, and they are all of these types.
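The list-typed features described above can be packed into tf.train.Example protos by choosing the value list that matches each feature's element type. A sketch with hypothetical feature names (the thread does not show the real ones):

```python
# Sketch: converting a dict of list-valued features into a tf.train.Example.
# Feature names ("ages", "scores", "tags") are hypothetical.
import tensorflow as tf

def dict_to_example(feature_dict):
    """Encode list-valued features into a tf.train.Example, picking
    int64_list / float_list / bytes_list based on the element type."""
    ex = tf.train.Example()
    for name, values in feature_dict.items():
        if all(isinstance(v, int) for v in values):
            ex.features.feature[name].int64_list.value.extend(values)
        elif all(isinstance(v, float) for v in values):
            ex.features.feature[name].float_list.value.extend(values)
        else:
            ex.features.feature[name].bytes_list.value.extend(
                v.encode("utf-8") if isinstance(v, str) else v for v in values)
    return ex

example = dict_to_example(
    {"ages": [42], "scores": [0.1, 0.9], "tags": ["a", "b"]})
```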
Are you able to share a Colab notebook with your code that loads your saved model, so I could see the issue? I imagine the reloaded saved model wants the examples in a very different format from tf.Example, so some conversion function will be necessary, but it's hard to know what that will need to be without playing with it myself.
@jameswex It's internal code, so it will be problematic to share. I'll try to play with it and make it work. Thanks!
Looking at the example in the link I sent above, it seems your custom predict fn might need to take the provided tf.Examples, serialize them, and wrap them in a tf.constant, like:
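The serialize-and-wrap step described in that comment can be sketched as follows; the rank-1 string tensor it produces is the shape an Estimator-exported serving signature typically parses. The feature name "x" and the feature spec are assumptions for illustration:

```python
# Sketch: serialize tf.Examples and wrap them in a tf.constant, then verify
# the result round-trips through tf.io.parse_example (the feature spec for
# "x" is an assumption, not from the thread).
import tensorflow as tf

examples = []
for value in [1.0, 2.0]:
    ex = tf.train.Example()
    ex.features.feature["x"].float_list.value.append(value)
    examples.append(ex)

# A rank-1 string tensor of serialized protos -- the input format an
# Estimator-exported serving signature usually expects.
serialized = tf.constant([ex.SerializeToString() for ex in examples])

# Round-trip check: parse the serialized batch back into dense tensors.
parsed = tf.io.parse_example(
    serialized, {"x": tf.io.FixedLenFeature([1], tf.float32)})
```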
(Original issue)
Hi,
I trained a model with TFX and it was exported as saved_model.pb.
Now I want to reload it and visualize it using WIT. How can I do this?
I couldn't find a way to do it, since when reloading the model:
imported = tf.saved_model.load(export_dir=trained_model_path)
I get an object of type <tensorflow.python.training.tracking.tracking.AutoTrackable at 0x7f3d71e456a0> instead of an Estimator.
Thanks
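The AutoTrackable object returned by tf.saved_model.load() is not an Estimator, but the exported serving signatures survive on it as callable concrete functions. A self-contained sketch using a tiny stand-in model (an assumption, not the TFX model from the thread) shows the round trip:

```python
# Sketch: tf.saved_model.load() returns an AutoTrackable-style object rather
# than an Estimator, but its .signatures dict holds the exported serving
# functions. The Tiny model below is a stand-in for the TFX-exported model.
import tempfile
import tensorflow as tf

class Tiny(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def double(self, x):
        return {"out": x * 2.0}

model = Tiny()
path = tempfile.mkdtemp()
tf.saved_model.save(model, path, signatures={"serving_default": model.double})

imported = tf.saved_model.load(export_dir=path)
# Not an Estimator -- but the exported signatures are available:
print(list(imported.signatures.keys()))  # ['serving_default']
result = imported.signatures["serving_default"](x=tf.constant([1.0, 2.0]))
```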