
Fix for loading pretrained models & storing models server side #456

Merged: 6 commits into develop from 446-fix-load-nacho on Aug 29, 2022

Conversation

@Nacho114 (Contributor) commented on Aug 24, 2022

Fix for loading pretrained models by conditional import of tfjs-node

When importing tfjs-node, two things happen:

  1. The appropriate tfjs engine is loaded
  2. The appropriate functions are loaded

It seems that if we import tfjs-node the usual way, only 1. happens: when loading and saving files, the API still behaves as plain tfjs (not the Node version). If we instead import 'tfjs-node' conditionally, earlier on, then 2. also happens properly, allowing us to save and load models again. A sketch of this follows below.

Note that due to webpack's static dependency checks, we need to reference 'tfjs-node' through a string expression to take advantage of lazy loading and avoid webpack bundling issues.
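
As a rough sketch, the conditional, string-based import can look like this (an illustration under assumptions, not the exact code of this PR; the Node-detection check in particular is assumed):

```ts
import * as tf from '@tensorflow/tfjs'

// Load the Node bindings only when actually running under Node. Keeping
// the module name in a variable stops webpack's static analysis from
// trying to resolve and bundle the Node-only package.
if (typeof process !== 'undefined' && process.release?.name === 'node') {
  const nodeBackend = '@tensorflow/tfjs-node'
  // eslint-disable-next-line @typescript-eslint/no-var-requires
  require(nodeBackend) // registers the Node backend and the file:// IO handlers
}
```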

TODO:

  • model.json was not added, check on this
  • check if it is possible to save models on the server
  • Added a timeout for the test; it was running in an infinite loop if the server crashed right after the build
  • find a better way to specify the model path (the current one is relative, which is not ideal since it assumes the CLI and the server are at the same distance from the model path)
  • update TASK.md

Fixes #446
Also Fixes #363

@Nacho114 Nacho114 requested a review from s314cy August 24, 2022 14:51
Review threads on server/src/get_server.ts and discojs/src/tasks/simple_face.ts (outdated, resolved)
@Nacho114 Nacho114 force-pushed the 446-fix-load-nacho branch 3 times, most recently from b8590c7 to 6bd91fb on August 24, 2022 15:45
@Nacho114 Nacho114 changed the title Fix for loading pretrained models by conditional import of tfjs-node Fix for loading pretrained models & storing models server side Aug 24, 2022
@Nacho114 (Contributor, Author) commented:

Currently loading the model looks as follows:

```ts
import * as tf from '@tensorflow/tfjs-node'

export async function model (): Promise<tf.LayersModel> {
  // note: this path is resolved relative to the process's cwd, not this file
  const file = 'file://./../models/mobileNetV2_35_alpha_2_classes/model.json'
  return await tf.loadLayersModel(file)
}
```

The issue is that file is relative to where the code is executed from. The code is run either via the server or via the benchmark; in both cases the model happens to sit one directory (../) above the cwd, so the path works, but it is not very robust. One solution is to store the path in process.env and define it once in the benchmark and once in the server; this way the stored models path is relative not to the cwd but to each environment.
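
A minimal sketch of the process.env idea (the variable name MODELS_DIR and the fallback are illustrative assumptions, not what the PR settled on):

```ts
import path from 'path'
import * as tf from '@tensorflow/tfjs-node'

export async function model (): Promise<tf.LayersModel> {
  // MODELS_DIR would be set once by the server and once by the benchmark,
  // so the resolved path depends on each environment rather than on the cwd
  const modelsDir = process.env.MODELS_DIR ?? path.join(__dirname, '..', 'models')
  const url = `file://${path.join(modelsDir, 'mobileNetV2_35_alpha_2_classes', 'model.json')}`
  return await tf.loadLayersModel(url)
}
```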

@Nacho114 (Contributor, Author) commented:

@tharvik @s314cy wdyt?

@Nacho114 Nacho114 force-pushed the 446-fix-load-nacho branch 2 times, most recently from 7532309 to 0279f6b on August 29, 2022 13:48
@Nacho114 Nacho114 marked this pull request as ready for review August 29, 2022 13:50
@Nacho114 (Contributor, Author) commented:

@tharvik @s314cy wdyt?

Found a clean solution by following the server config as suggested by @tharvik
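
Presumably this means resolving the models directory once through the server's config rather than with per-call relative paths; a hypothetical sketch (this Config shape is an assumption, not the repository's actual one):

```ts
import path from 'path'

export interface Config {
  // absolute path to the stored models, resolved once at startup
  modelsDir: string
}

export function getConfig (): Config {
  return {
    // resolved relative to this source file rather than the cwd, so the
    // server and the benchmark both see the same stable location
    modelsDir: path.resolve(__dirname, '..', 'models')
  }
}
```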

@s314cy (Contributor) left a review comment:

[image]

@Nacho114 Nacho114 merged commit 66962ab into develop Aug 29, 2022
@Nacho114 Nacho114 deleted the 446-fix-load-nacho branch August 29, 2022 14:32
Development

Successfully merging this pull request may close these issues:

• Loading pre-trained models
• Persistent models server-side

2 participants