Exporting/Loading model and weights independently #47
No, we don't export the graph and weights separately. A graph trained on multiple "datasets" should be exported as multiple "versions" of the same model. We want model exports to be self-contained; besides, in most cases the footprint of the graph itself is small compared to its weights in serialized format.

Yes, multiple versions of a model can be loaded simultaneously and used by the same

Could you tell us a bit more about the use case?
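The "multiple versions of the same model" setup described above can be expressed in a TensorFlow Serving model config file. This is a sketch under assumptions: the model name `taxonomy_model` and base path are hypothetical, and `model_version_policy { all {} }` is what tells the server to load every version directory it finds rather than only the latest:

```
model_config_list {
  config {
    name: "taxonomy_model"            # hypothetical model name
    base_path: "/models/taxonomy_model"
    model_platform: "tensorflow"
    # Load all numbered version subdirectories simultaneously,
    # instead of the default (latest version only).
    model_version_policy { all {} }
  }
}
```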
Ah, thanks @fangweili - the multiple-versions approach makes sense. Basically, I have discrete datasets for predicting labels (via the same model) under a taxonomy, with sublabels, e.g.:
The models may (or may not) share the same graph. I'm building this system to be flexible in case I need different graphs per use case, or want to re-use the same one across all of them. I've created a new
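One way to organize the per-dataset exports described above is one numbered version directory per dataset under a single model base path, which is the on-disk layout TensorFlow Serving scans. A minimal sketch, assuming hypothetical dataset names and a temporary base path (training and the actual SavedModel write are elided):

```python
import tempfile
from pathlib import Path


def versioned_export_dir(base_path: str, version: int) -> Path:
    """Create and return the numbered subdirectory TensorFlow
    Serving expects for one version of a model, e.g. <base>/3/."""
    d = Path(base_path) / str(version)
    d.mkdir(parents=True, exist_ok=True)
    return d


# One "version" per taxonomy dataset (names are hypothetical),
# all under the same model name, so one server can load them
# side by side.
root = tempfile.mkdtemp()
for version, dataset in enumerate(
    ["labels", "sublabels_a", "sublabels_b"], start=1
):
    export_dir = versioned_export_dir(f"{root}/taxonomy_model", version)
    # ... train on `dataset`, then write the SavedModel into export_dir ...
    print(export_dir.name, dataset)
```

Whether the versions share a graph or not is then invisible to the server; each version directory is self-contained.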
For "manager to load different models into memory", the recommended solution is to extend
Is there an example/pointers to a case where `Classify()` to run?