How to add new model to the serving? #452
Comments
You can add multiple models by passing a ModelServerConfig at startup time. See this tutorial for documentation. tl;dr: when you run the model_server, point it at a config file on disk that lists each model you want served.
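The command line and config file referenced in this comment did not survive in this copy of the thread. A typical invocation and config look roughly like the following (the port, model names, and paths are placeholders):

```
tensorflow_model_server --port=9000 --model_config_file=/path/to/models.config
```

```
model_config_list: {
  config: {
    name: "mnist",
    base_path: "/serving/mnist_model",
    model_platform: "tensorflow"
  },
  config: {
    name: "inception",
    base_path: "/serving/inception_model",
    model_platform: "tensorflow"
  }
}
```

Each `config` entry maps a served model name to a base directory; the server looks for numbered version subdirectories (e.g. `/serving/mnist_model/1`) under each `base_path`.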
What you said only covers how to run the server, but I want to know how to build a new model.
Hi @tbchj, can you share a bit more about your use case and/or setup? Are you looking for examples of how to export new models? TensorFlow Serving uses the SavedModel format; its documentation is at: https://github.com/tensorflow/tensorflow/tree/master/tensorflow/python/saved_model
Sorry, I think you misunderstood me, perhaps because of my poor English. I can already run the inception and mnist servers that ship with the serving package, and both example clients work fine. What I want now is to add other models, such as inception_v4, resnet, or others, beyond the mnist and inception examples already in serving. TF Serving can serve many models, not just those two, right? I assume that to add new models I would need to modify or add some proto files in serving, and also build the new model before running it.
There are two parts to serving a new model: exporting it from TensorFlow in the SavedModel format, and pointing the model server at the exported directory.
I get the impression the issue you're describing is with exporting. For that, please take a look at the SavedModel links above. One helpful approach is to look at the inception example already in our repo (https://github.com/tensorflow/serving/blob/master/tensorflow_serving/example/inception_saved_model.py) and see how the SignatureDef is constructed there.
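The export step described above can be sketched as follows. This is a minimal example, not the inception exporter itself: the trivial `y = Wx + b` graph, the signature name `predict_images`, and the temp export path are all placeholders, and it uses the TF 1.x-style `SavedModelBuilder` API that TF Serving examples of this era relied on:

```python
import os
import tempfile

import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# A stand-in for a trained model: y = Wx + b.
x = tf.placeholder(tf.float32, shape=[None, 3], name="x")
w = tf.Variable(tf.ones([3, 1]), name="w")
b = tf.Variable(tf.zeros([1]), name="b")
y = tf.add(tf.matmul(x, w), b, name="y")

# The model server expects a numbered version subdirectory under the base path.
export_dir = os.path.join(tempfile.mkdtemp(), "1")
builder = tf.saved_model.builder.SavedModelBuilder(export_dir)

# The SignatureDef declares the named inputs and outputs that clients use.
signature = tf.saved_model.signature_def_utils.build_signature_def(
    inputs={"images": tf.saved_model.utils.build_tensor_info(x)},
    outputs={"scores": tf.saved_model.utils.build_tensor_info(y)},
    method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    builder.add_meta_graph_and_variables(
        sess, [tf.saved_model.tag_constants.SERVING],
        signature_def_map={"predict_images": signature})
builder.save()
```

After this, pointing the model server's `base_path` at the parent of `export_dir` should be enough for it to load version 1 of the model.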
Can I get more information about signature_def?
Hi @com9009, you can find SignatureDef documentation here:
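The documentation link in this comment did not survive in this copy of the thread. As a quick illustration of what a SignatureDef actually carries, one can be built and inspected directly; the tensor names and shapes here are placeholders, not from any particular model:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Placeholder graph: a batch of flattened 28x28 images passed straight through.
x = tf.placeholder(tf.float32, shape=[None, 784], name="images")
scores = tf.identity(x, name="scores")

# A predict-style SignatureDef records, for each named input and output,
# the tensor name, dtype, and shape that clients should use.
sig = tf.saved_model.signature_def_utils.build_signature_def(
    inputs={"images": tf.saved_model.utils.build_tensor_info(x)},
    outputs={"scores": tf.saved_model.utils.build_tensor_info(scores)},
    method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME)

print(sig)  # dumps the proto: inputs, outputs, and method_name
```

The `method_name` distinguishes predict, classify, and regress signatures, which determine which of the server's request types the signature answers.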
Thank you @sukritiramesh |
ok, I'll try. thank you @kirilg @sukritiramesh |
@tbchj not sure how far you got. I recently wrote up how I added inception-v4 and inception-resnet-v2 to tensorflow_serving: https://gyang274.github.io/docker-tensorflow-serving-slim/0x02.slim.html. It might help.
Now I have a request: I want to add other models to serving, such as inception_v4, resnet, or any other network. How can I do that? Should I change or add code in the serving source?
I need some help.