Feature: TF Serving compatibility with TF 2.0. #25363
Comments
Is this still current? I've seen some presentations that suggest TF Serving does support TF 2.0.
It's on the Done column in https://github.com/orgs/tensorflow/projects/4#card-17112802, so I would assume it's complete, right?
For models trained with TensorFlow 2.0, has full compatibility been achieved in TF Serving?
Any update on this? Thanks!
Would appreciate any update on this if possible!
EDIT: Never mind, I heard from my team that it is working. For anyone else running into the issue described below, your SavedModel needs to live under a subfolder named with a version number. Other than that, this issue should be marked as closed if it's done, so there's no confusion. OLD:
I verified that the SavedModel exists in that path. It was exported like so:
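For reference, the version-numbered layout mentioned in the edit above can be sketched as follows. The model name and paths here are hypothetical examples, not taken from this issue; TF Serving scans the model base path for numeric version subdirectories and serves the highest one.

```shell
# Hypothetical sketch of the directory layout TF Serving expects:
# the SavedModel must sit under a numeric version subdirectory of
# the model base path (paths here are illustrative only).
MODEL_BASE=/tmp/my_model              # would be passed as --model_base_path
mkdir -p "$MODEL_BASE/1/variables"    # "1" is the version folder
touch "$MODEL_BASE/1/saved_model.pb"  # serialized graph lives here
ls "$MODEL_BASE/1"
```

Placing `saved_model.pb` directly under the model base path, without the numeric version folder, is the usual cause of the "model not found" behavior described in this thread.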
Can someone confirm whether this issue was resolved or not?
This is already supported. Closing the request.
TensorFlow Serving is a flexible, high-performance serving system for machine learning models, designed for production environments. Serving makes it easy to deploy new algorithms and experiments, while keeping the same server architecture and APIs. TensorFlow Serving provides out-of-the-box integration with TensorFlow models, but can be easily extended to serve other types of models and data.
The purpose of this issue is to upgrade Serving to be TF 2.0 compliant. The system must be compatible with eager execution and distribution strategies, with tests and all associated engineering artifacts.
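As a point of reference for how models are served, a typical `tensorflow_model_server` invocation looks like the sketch below. The model name, port numbers, and path are hypothetical examples, not taken from this issue.

```shell
# Hypothetical deployment fragment: serve a SavedModel over gRPC and REST.
# Model name, ports, and base path are illustrative assumptions.
tensorflow_model_server \
  --port=8500 \
  --rest_api_port=8501 \
  --model_name=my_model \
  --model_base_path=/models/my_model
```

With this layout, the server watches `/models/my_model` for numeric version subdirectories and loads new versions as they appear, which is how the "same server architecture and APIs" goal above is achieved across model updates.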