load model from file and predict #11
Comments
I do see why one would want to load a model at runtime instead of baking it into executable code. One way to achieve this is to have […] Let me get back to you when I have a time estimate for this feature.
Great. Let me know if I can help.
Perhaps the trees could be loaded and then serialized with something like cereal, which provides a lot of format flexibility: portable binary archives, XML, JSON, etc. Thoughts?
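For context, a rough sketch of the kind of cereal-based serialization being suggested here; the `Node`/`Tree` structs and their fields are illustrative assumptions, not treelite's internal representation:

```cpp
// Hypothetical sketch: serializing a simple decision-tree ensemble node with cereal.
// The Node/Tree structs are illustrative only; they are not treelite's internal format.
#include <fstream>
#include <vector>
#include <cereal/archives/json.hpp>
#include <cereal/types/vector.hpp>

struct Node {
  int split_feature = -1;   // feature index used for the split (-1 marks a leaf)
  double threshold = 0.0;   // split threshold
  double leaf_value = 0.0;  // prediction value if this node is a leaf
  int left_child = -1;      // index of left child in the node array
  int right_child = -1;     // index of right child in the node array

  template <class Archive>
  void serialize(Archive& ar) {
    ar(CEREAL_NVP(split_feature), CEREAL_NVP(threshold),
       CEREAL_NVP(leaf_value), CEREAL_NVP(left_child), CEREAL_NVP(right_child));
  }
};

struct Tree {
  std::vector<Node> nodes;  // nodes stored in a flat array, root at index 0

  template <class Archive>
  void serialize(Archive& ar) {
    ar(CEREAL_NVP(nodes));
  }
};

int main() {
  Tree tree;
  tree.nodes.push_back(Node{0, 0.5, 0.0, 1, 2});    // root: split on feature 0 at 0.5
  tree.nodes.push_back(Node{-1, 0.0, -1.0, -1, -1});  // left leaf
  tree.nodes.push_back(Node{-1, 0.0, +1.0, -1, -1});  // right leaf

  std::ofstream os("tree.json");
  cereal::JSONOutputArchive archive(os);  // swap in BinaryOutputArchive or XMLOutputArchive as needed
  archive(CEREAL_NVP(tree));
  return 0;
}
```

The same `serialize` function works unchanged with cereal's portable binary and XML archives, which is where the format flexibility would come from.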
Any updates on this one? Background: I have a few regression/detection tasks for mobile devices where this could significantly streamline the xgboost model deployment.
Seems to be fixed by #45?
@headupinclouds I don't think it goes all the way you want, since models will still need to be compiled in order to be fed into the runtime.
@headupinclouds Conceptually, to do what you want, we'd need to add a new runtime that directly reads from Protobuf. Any reason why the current approach (generating shared libs) doesn't work for your scenario?
I see, thanks for the clarification.
I'm interested in using treelite as a general-purpose alternative to direct static linking of xgboost in a couple of real-time, cross-platform computer vision applications, although I see my requirements may not be in line with the current project goals. I'd like to have a single universal prediction module that supports a variety of pretrained regression and classification tasks, where different model sizes may be required due to varying application-specific and platform constraints. A mobile model <= 1 MB may be required for some builds, while a more accurate 8 MB model may be appropriate for desktop platforms.

You mentioned SHARED libs, although I'm guessing this isn't a hard constraint. For iOS, one would have to generate a SHARED lib in the form of a dynamic framework, which takes some work. In my use case, static linking is generally preferable.
@headupinclouds You can use static linking for the prediction runtime. Currently, the model itself will be encoded as a shared library.
I can have one protobuf file that will work on Linux, Windows, macOS, iOS, Android and Raspberry Pi, but I would need 6 compiled libraries?
@headupinclouds Yes, you'd need to prepare multiple shared libraries, one for each target platform.
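For illustration, a minimal POSIX sketch of how an application could select and load one of these per-platform compiled model libraries at runtime; the exported `predict` symbol and its signature are hypothetical, used only to show the shape of the approach (in practice the treelite runtime handles loading the compiled library for you):

```cpp
// Minimal POSIX sketch of loading a per-platform compiled model library at runtime.
// The exported symbol name "predict" and its signature are assumptions for
// illustration only; they are not treelite's actual exported interface.
#include <dlfcn.h>
#include <cstdio>

int main() {
  // One compiled library per target platform, e.g. model_linux.so, model_android.so, ...
  void* handle = dlopen("./model_linux.so", RTLD_NOW);
  if (!handle) {
    std::fprintf(stderr, "dlopen failed: %s\n", dlerror());
    return 1;
  }

  // Hypothetical entry point taking a dense feature vector.
  using PredictFn = float (*)(const float* features, int num_features);
  auto predict = reinterpret_cast<PredictFn>(dlsym(handle, "predict"));
  if (!predict) {
    std::fprintf(stderr, "dlsym failed: %s\n", dlerror());
    dlclose(handle);
    return 1;
  }

  const float features[4] = {0.1f, 2.3f, 4.5f, 0.0f};
  std::printf("prediction = %f\n", predict(features, 4));

  dlclose(handle);
  return 0;
}
```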
Okay, thanks for the input. Treelite is a great idea.
@headupinclouds and @hcho3 Thanks for the interesting discussion. I followed the tutorial given at https://treelite.readthedocs.io/en/latest/ and generated a shared library. But now I have no idea how to convert it into a protobuf file so that I can use the same model on iOS and Android. Any reference link is very much appreciated.
That isn't how this library is designed. From the above discussion, this would have to be added as a new feature, but it is quite orthogonal to the current design, so it isn't clear it will necessarily be added.
I decided to remove Protobuf entirely from Treelite. For the use case you described, I suggest that you write a custom runtime that operates on the Treelite C++ object (https://github.com/dmlc/treelite/blob/mainline/include/treelite/tree.h). For example, the Forest Inference Library (FIL) in cuML uses this approach.
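As a rough sketch of what such a custom runtime boils down to (the node layout below is a simplified assumption; a real implementation would traverse the `treelite::Tree` objects from `tree.h` instead, as FIL does):

```cpp
// Rough sketch of a custom prediction routine over an in-memory tree ensemble.
// The TreeNode layout is hypothetical and only illustrates the traversal logic.
#include <vector>

struct TreeNode {
  bool is_leaf = false;
  int split_feature = -1;   // feature index tested at this node
  double threshold = 0.0;   // go left if the feature value is below the threshold
  double leaf_value = 0.0;  // output if this node is a leaf
  int left_child = -1;
  int right_child = -1;
};

// Walk a single tree for one dense feature vector.
double PredictTree(const std::vector<TreeNode>& nodes,
                   const std::vector<double>& features) {
  int nid = 0;  // start at the root
  while (!nodes[nid].is_leaf) {
    const TreeNode& node = nodes[nid];
    nid = (features[node.split_feature] < node.threshold) ? node.left_child
                                                          : node.right_child;
  }
  return nodes[nid].leaf_value;
}

// Sum the outputs of all trees in the ensemble (as in gradient boosting).
double PredictEnsemble(const std::vector<std::vector<TreeNode>>& trees,
                       const std::vector<double>& features) {
  double total = 0.0;
  for (const auto& tree : trees) {
    total += PredictTree(tree, features);
  }
  return total;
}
```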
I'm interested in using `treelite` as a fast and lightweight C/C++ prediction module. It seems the preferred model is to load a standard supported format, compile/bake an optimized C file/library, and then integrate the generated C file into an application for prediction. Is it possible to use `treelite` to load a model from file at runtime for prediction? (If this isn't possible through the current API, would you accept this feature?) This would provide some flexibility and code size reductions compared to using `xgboost` or `lightgbm` directly.