Feature: TF Serving compatibility with TF 2.0. #25363

Closed
dynamicwebpaige opened this issue Jan 31, 2019 · 7 comments
Labels
TF 2.0 (Issues relating to TensorFlow 2.0) · type:feature (Feature requests)

Comments

@dynamicwebpaige
Contributor

TensorFlow Serving is a flexible, high-performance serving system for machine learning models, designed for production environments. Serving makes it easy to deploy new algorithms and experiments, while keeping the same server architecture and APIs. TensorFlow Serving provides out-of-the-box integration with TensorFlow models, but can be easily extended to serve other types of models and data.

The purpose of this issue is to upgrade Serving to be TF 2.0 compliant. The system must be compatible with eager execution and distribution strategies, with tests and all associated engineering artifacts.
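For context, a minimal sketch of the export path a TF 2.0 user would expect Serving to handle end to end (the toy model, paths, and version number "1" below are illustrative only, not part of this request):

import tensorflow as tf

# A toy tf.keras model built with TF 2.0 eager-mode APIs.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Export in the SavedModel format that TF Serving loads; "1" is the
# version subdirectory that Serving scans for under the model's base path.
tf.saved_model.save(model, "/tmp/my_model/1")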

dynamicwebpaige added the type:feature (Feature requests) and TF 2.0 (Issues relating to TensorFlow 2.0) labels on Jan 31, 2019
dynamicwebpaige changed the title from "Feature: TF Serving compatibility." to "Feature: TF Serving compatibility with TF 2.0." on Jan 31, 2019
dynamicwebpaige added this to "In progress" in TensorFlow 2.0 on Jan 31, 2019
dynamicwebpaige moved this from "In progress" to "Done" in TensorFlow 2.0 on Mar 4, 2019
@amygdala
Contributor

Is this still current? I've seen some presentations that suggest TF Serving does support TF 2.0.

@joaqo

joaqo commented Jun 9, 2019

It's in the Done column in https://github.com/orgs/tensorflow/projects/4#card-17112802, so I would assume it's complete, right?

@DearChuck

For a model trained with TensorFlow 2.0, has full compatibility been achieved in TF Serving?

@QiJune

QiJune commented Oct 18, 2019

Any update on this? Thanks!

@njerschow

Would appreciate any update on this if possible!

@neil-119

neil-119 commented Dec 27, 2019

EDIT: never mind, I heard from my team that it is working. For anyone else running into the issue described below, your SavedModel needs to live inside a subfolder named with a version number under the model's base path. Other than that, this issue should be marked as closed if the work is done, so there's no confusion.
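For anyone hitting the same error, here is a minimal sketch of that fix (the model name XX and the paths are placeholders, not the real ones):

import tensorflow as tf
import tensorflow_hub as hub

model = tf.keras.models.load_model("model.h5", custom_objects={'KerasLayer': hub.KerasLayer})

# TF Serving scans the base path (/models/XX) for numbered version
# subdirectories, so the SavedModel has to end up at /models/XX/<version>/:
#
#   /models/XX/
#     1/
#       saved_model.pb
#       variables/
#
# Saving straight into /models/XX with no version folder is what triggers
# the "No versions of servable XX found" error quoted below.
model.save("/models/XX/1", save_format="tf")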

OLD ------------
We trained a model with the tf.keras API and created a SavedModel with Keras. We then tried to deploy it to tf-serving, but we're getting:

No versions of servable XX found under base path /models/XX

I verified that the SavedModel exists in that path. It was exported like so:

import tensorflow as tf
import tensorflow_hub as hub
model = tf.keras.models.load_model("model.h5", custom_objects={'KerasLayer': hub.KerasLayer})
model.save(some_path_here, save_format="tf")

Can someone confirm whether this issue was resolved or not?

@rthadur
Contributor

rthadur commented Jun 24, 2021

This is already supported. Closing the request.

rthadur closed this as completed on Jun 24, 2021