Serving a model #153
Hi @dimidd, thanks for the feature request. At the moment the development branch does not support exposing the model through tensorflow serving. However, we're in the middle of a large refactor (#148) that should migrate finetune onto the tensorflow estimator API. Can't make any promises, but as tensorflow serving has explicit support for the estimator framework, it seems likely that exposing some functionality to make finetune work with tensorflow serving will be straightforward. Will keep you posted via this ticket. --Madison
Hi Madison, Now that the TF-estimator refactor is done, I'd like to follow up on this. Are there tasks I can help with? Any low-hanging fruit? Thanks again!
Hi @dimidd, thanks for the follow-up! I haven't worked with tensorflow serving before so you'll have to bear with me but I'll try to use this ticket to lay out a rough plan of attack:
I think this is what we could add for MVP support of this feature. Hosting the saved model would probably be deferred to the end user for now. Is this roughly what you were thinking of?
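For context on what "hosting the saved model" would look like on the client side: if the exported SavedModel were served by TensorFlow Serving, a client would POST to its REST predict endpoint using the `{"instances": [...]}` JSON format. A minimal stdlib-only sketch of building such a request (the host, port, and model name `finetune_model` are placeholders, not anything finetune ships):

```python
import json
from urllib.request import Request

# TF Serving's REST predict endpoint follows this URL pattern;
# "finetune_model" is a hypothetical model name used for illustration.
PREDICT_URL = "http://localhost:8501/v1/models/finetune_model:predict"

def build_predict_request(texts):
    """Build a POST request carrying input texts in TF Serving's
    {"instances": [...]} JSON request format."""
    body = json.dumps({"instances": texts}).encode("utf-8")
    return Request(
        PREDICT_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_predict_request(["I loved this movie"])
```

Sending the request (and parsing the `{"predictions": [...]}` response) would then be a few more lines with `urllib.request.urlopen`, assuming a TF Serving instance is actually running at that address.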
Thanks! I'll dive into the code.
@dimidd Had a discussion with @madisonmay about a TF-Serving solution. I have worked with this in the past and have enough experience to know that this is not something we will be able to officially support in finetune. The problems with using the Serving API with finetune are that:
However, serving a model from finetune with something like flask would be pretty simple, and with the inference optimizations discussed in #188 it should be performant enough for most use cases.
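A minimal sketch of what that flask approach could look like. The `StubModel` here is a hypothetical stand-in for a trained finetune model; in real use you would load one instead (e.g. something like `model = Classifier.load("model_path")`), and the `/predict` route and `texts` payload key are illustrative choices, not an API finetune defines:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical stand-in for a trained finetune model; replace with a
# real loaded model, e.g.: model = Classifier.load("model_path")
class StubModel:
    def predict(self, texts):
        return ["positive" for _ in texts]

model = StubModel()

@app.route("/predict", methods=["POST"])
def predict():
    # Expects a JSON body like {"texts": ["some document", ...]}
    texts = request.get_json()["texts"]
    return jsonify({"predictions": model.predict(texts)})

if __name__ == "__main__":
    app.run(port=5000)
```

Loading the model once at startup (rather than per request) keeps inference latency down, which is where the batching/caching optimizations from #188 would matter most.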
Hi Ben, Agreed. I've actually used Madison's suggestions (e.g. something like dimidd@ecedc5c) with flask, and it works quite well.
Is your feature request related to a problem? Please describe.
Serving a trained model in production.
Describe the solution you'd like
I'd like to understand how to interface with tensorflow.
Describe alternatives you've considered
I'm able to `save` and `load` a model, but I'm not sure how to restore and serve it using TF.