
[TEST/CI] Add tests covering TensorflowSavedModelArtifact #721

Closed
parano opened this issue May 27, 2020 · 8 comments · Fixed by #1285
@parano
Member

parano commented May 27, 2020

Currently, there isn't any integration test covering the TensorflowSavedModelArtifact:

  • Similar to PR [CI] Include onnx artifact test on travis #726: add a unit test that assumes the tensorflow PyPI package is installed, create a BentoService with TensorflowSavedModelArtifact, and test its major functionality, including save, load, and running inference

  • Build a docker container from the saved bundle and test the API server endpoint; see related PRs Add integration tests #855 and [CI] Refactor API Server integration tests #871, which contain utility code that can be reused here

  • Add a script ./travis/run-tensorflow-v2.2-tests.sh that runs pip install tensorflow==2.2 and pytest tests/test_tensorflow_v2.2_model_artifact.py ...

  • Add another script ./travis/run-tensorflow-v1.14-tests.sh that runs pip install tensorflow==1.14 and pytest tests/test_tensorflow_v1.14_model_artifact.py ...

  • Add two separate tasks in .travis.yml that launch the two test scripts above: https://github.com/bentoml/BentoML/blob/master/.travis.yml
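A rough sketch of what one of the per-version test scripts above might look like (the filename and test path follow the bullets above; the use of `set -e` is an assumption, not something prescribed by this issue):

```shell
#!/usr/bin/env bash
# Sketch of ./travis/run-tensorflow-v2.2-tests.sh; the v1.14 script would be
# identical apart from the pinned version and the test file name.
set -e

# Install the TensorFlow version under test, then run its artifact tests.
pip install "tensorflow==2.2"
pytest tests/test_tensorflow_v2.2_model_artifact.py
```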

@parano added the good-first-issue (Good for newcomers), help-wanted (An issue currently lacks a contributor), and test labels May 27, 2020
@joshuacwnewton
Contributor

Hi @parano. I'm a student in the MLH Fellowship program. I'd like to begin working on this issue. Thanks! :)

@joshuacwnewton
Contributor

It looks like the requirements have been updated, and #876 no longer fixes this. I will make the new changes to address the updated issue. :)

@joshuacwnewton
Contributor

joshuacwnewton commented Jul 22, 2020

Add another script ./travis/run-tensorflow-v1.14-tests.sh that runs pip install tensorflow==1.14 and pytest tests/test_tensorflow_v1.14_model_artifact.py ...

Two questions:

@joshuacwnewton
Contributor

joshuacwnewton commented Jul 23, 2020

I am having some difficulty developing a simple TF 1.14 example. I have only used TF2, so it is a bit challenging to search through old TF1 documentation! 😛

Because the test I am writing involves saving and loading a TF1 model artifact, I have run into this warning:

loaded = tf.compat.v2.saved_model.load(path)
if isinstance(loaded, AutoTrackable) and not hasattr(loaded, "__call__"):
    logger.warning(
        '''Importing SavedModels from TensorFlow 1.x.
`outputs = imported(inputs)` is not supported in bento service due to
tensorflow API.
Recommended usage:
```python
from tensorflow.python.saved_model import signature_constants
imported = tf.saved_model.load(path_to_v1_saved_model)
wrapped_function = imported.signatures[
    signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
wrapped_function(tf.ones([]))
```
See https://www.tensorflow.org/api_docs/python/tf/saved_model/load for
details.
'''
    )

But I am having trouble figuring out how to give a test model the .signatures attribute for the above fix.

@bojiang: I think you wrote the warning in #421. Do you think this will cause some problems writing a TF1 test? Or, do you know of a good workaround? Thanks much! :)

@joshuacwnewton
Contributor

I am a bit curious about the motivation behind supporting TF1 in BentoML, considering the strong push away from TF1 from Google. For clarity, what were your thoughts when the tf==1.14 test was initially proposed? :)

Part of me wonders if it is more trouble than it's worth to take on the maintenance costs of accommodating legacy TF1 API features (e.g. models which use tf.Session, input placeholders, feed_dict, etc.). It is a bit challenging to write/debug TF1 code given how scattered TF1 documentation can be these days -- some materials are outdated and some are not. But there could be benefits that I'm not aware of, so I'm happy to hear other opinions.

@bojiang
Member

bojiang commented Jul 29, 2020

Hi @joshuacwnewton ,

It took me a while to retrieve an example using non-keras TensorFlow v1 API:
https://github.com/bentoml/gallery/blob/3ce0644d04ccce7b262947b85d3b3bcd1bc57dd8/tensorflow/fashion-mnist/tensorflow_1_fashion_mnist.ipynb

In fact, we did not do any extra work to make BentoML support TensorFlow v1; tensorflow.saved_model did most of the dirty work. And I think a fair number of projects still use this API.

@joshuacwnewton
Contributor

joshuacwnewton commented Jul 29, 2020

Great work finding this example @bojiang! It looks like it has all the necessary steps, so thank you. I will see what I can do to adapt the steps to a simpler model for testing purposes. 😄

Interestingly, I notice that the example you linked does not use bentoml.save or bentoml.load. Instead, it uses tf.saved_model.simple_save, and then uses bentoml.pack directly from the saved folder.

I think not using bentoml.save might be a key difference, because bentoml.save uses tf.saved_model.save, which requires signatures to be indicated explicitly.

return tf.saved_model.save(
    self.obj,
    self.tf_artifact._saved_model_path(dst),
    signatures=self.signatures,
)

By using tf.saved_model.simple_save instead, manually setting signatures can be bypassed, because simple_save builds the SavedModel signatures automatically for you (using tf.saved_model.signature_def_utils).
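For reference, a minimal sketch of that export path (the graph and tensor names here are illustrative, not BentoML code): simple_save derives the "serving_default" SignatureDef from the inputs/outputs dicts, so the re-loaded model exposes `.signatures` without any explicit `signatures=` argument.

```python
import os
import tempfile

import tensorflow as tf

# Build a tiny TF1-style graph (illustrative names).
tf.compat.v1.disable_eager_execution()
x = tf.compat.v1.placeholder(tf.float32, shape=[None, 2], name="x")
w = tf.Variable([[1.0], [2.0]], name="w")
y = tf.matmul(x, w, name="y")

export_dir = os.path.join(tempfile.mkdtemp(), "tf1_model")

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    # simple_save constructs the SignatureDef from the inputs/outputs
    # dicts, so no explicit signatures argument is needed.
    tf.compat.v1.saved_model.simple_save(
        sess, export_dir, inputs={"x": x}, outputs={"y": y}
    )

print(os.path.isfile(os.path.join(export_dir, "saved_model.pb")))
```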

@parano
Member Author

parano commented Nov 24, 2020

@bojiang has already implemented tests for v2 and is planning to add TF v1 tests

@parano parano removed the help-wanted An issue currently lacks a contributor label Nov 24, 2020
@bojiang bojiang linked a pull request Dec 2, 2020 that will close this issue
@bojiang bojiang closed this as completed Dec 2, 2020