Fix for loading pretrained models & storing models server side #456
Conversation
Currently loading the model looks as follows:

```typescript
export async function model (): Promise<tf.LayersModel> {
  const file = 'file://./../models/mobileNetV2_35_alpha_2_classes/model.json'
  return await tf.loadLayersModel(file)
}
```

The issue is that …
Fix for loading pretrained models by conditional import of tfjs-node

When importing `tfjs-node`, two things happen. It seems that if we import `tfjs-node` directly, only 1. happens, since when loading and saving files the API still thinks it's tfjs (not the Node version); if we import `tfjs-node` conditionally earlier, then 2. happens properly, allowing us to save and load models again. Note that due to some webpack dependency checks, we need to keep 'tfjs-node' inside a string to take advantage of lazy loading and avoid webpack issues.
TODO:
Fixes #446
Also Fixes #363