use this great tool for other models #637
Comments
Serving has an extensible API that lets you define an arbitrary Servable conforming to an interface that Serving understands. In theory, then, you could run a scikit-learn model (or similar) in Serving if you write a Servable plugin.

Thanks.
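To make the plugin idea concrete, here is a minimal sketch of the pattern in Python. This is NOT TensorFlow Serving's actual API (which is C++); all class and method names below (`Servable`, `Loader`, `SklearnServable`, `ThresholdModel`) are hypothetical, and the stand-in model exists only so the example runs without scikit-learn installed. The point is the shape: a Loader knows how to load/unload one model type, and anything exposing the servable interface can answer requests.

```python
# Illustrative sketch only -- not TensorFlow Serving's real C++ API.

class Servable:
    """Anything the server can route requests to."""
    def predict(self, inputs):
        raise NotImplementedError


class ThresholdModel:
    """Stand-in for a fitted estimator so the sketch is self-contained."""
    def __init__(self, threshold):
        self.threshold = threshold

    def predict(self, xs):
        return [1 if x > self.threshold else 0 for x in xs]


class SklearnServable(Servable):
    """Wraps any estimator-like object that exposes .predict()."""
    def __init__(self, model):
        self.model = model

    def predict(self, inputs):
        return self.model.predict(inputs)


class Loader:
    """Knows how to bring one version of a servable in and out of memory."""
    def __init__(self, path):
        self.path = path
        self.servable = None

    def load(self):
        # A real plugin would deserialize the model from self.path
        # (e.g. unpickle an sklearn estimator); here we fake it.
        self.servable = SklearnServable(ThresholdModel(threshold=0.5))

    def unload(self):
        self.servable = None


if __name__ == "__main__":
    loader = Loader("/models/my_sklearn_model/1")
    loader.load()
    print(loader.servable.predict([0.2, 0.9]))  # -> [0, 1]
```

In the real system, the Loader equivalent is registered with Serving's core, which then handles version management and request routing for your model type just as it does for TensorFlow SavedModels.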
I have implemented XGBoost Serving, a fork of TensorFlow Serving. It supports serving XGBoost models as well as combined XGBoost and FM models. You can read the README for more details and try it in a few minutes. If you encounter any problems, please submit an issue or email me directly.
Hi,
reading https://research.googleblog.com/2017/11/latest-innovations-in-tensorflow-serving.html I learned that this great tool can be used to serve other models as well.
But so far I could not find much documentation about serving e.g. a scikit-learn / XGBoost / R / ... model via TensorFlow Serving.
It would be great if you could point me to some resources.