
put frozen model in download package #7

Closed
jasonhe88 opened this issue May 15, 2023 · 7 comments
@jasonhe88

jasonhe88 commented May 15, 2023

I am an AI newbie struggling to freeze the model, but I haven't managed it so far.

It would be very nice to include a frozen model in the zipped model package as well.

issue 6

@TheBoatyMcBoatFace

I'm working on dockerizing all of this and making it into an API, so you can send a POST with any messages you want to test.
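
For illustration, testing against it would look something like the sketch below - the host, port, route, and payload shape are placeholders I haven't settled on yet, not a final API:

```python
# Hypothetical client sketch for the planned API - the URL, port, route, and JSON
# fields are placeholders for illustration only, not the actual service.
import requests

resp = requests.post(
    "http://localhost:8080/predict",                      # assumed host/port/route
    json={"messages": ["an example message to score"]},   # assumed payload shape
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```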

I've still got a few bugs to work out, but can you let me know any other difficulties or challenges you've faced?

Are you looking, at least initially, to train or just use the model?

And yes, the model is hit or miss when downloading.

@jasonhe88
Author

jasonhe88 commented May 24, 2023

Thank you, I am trying to deploy this model on a mobile platform via Alibaba's MNN (GitHub link).

MNN has its own model format and provides a converter to import models from other formats; for TensorFlow, the converter only accepts the frozen graph format. I wrote many scripts to freeze the model but couldn't get it to work.

So I am wondering if you could provide the model in frozen format as well.
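
For context, the conversion step I'm attempting looks roughly like this - just a sketch, where the flags follow the MNN converter docs (they may differ between MNN versions) and the file names are placeholders:

```python
# Rough sketch of the MNN conversion step (not my exact script).
# MNNConvert flags follow the MNN converter docs and may vary by version;
# the file names are placeholders.
import subprocess

subprocess.run(
    [
        "MNNConvert",
        "-f", "TF",                        # source framework: TensorFlow
        "--modelFile", "frozen_graph.pb",  # must be a frozen graph, not a SavedModel
        "--MNNModel", "model.mnn",         # output MNN model
        "--bizCode", "biz",
    ],
    check=True,
)
```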

@Steeeephen
Contributor

Hey all, sorry, I was OOO so I'm just getting to this now.

It would be very nice to include a frozen model in the zipped model package as well

So just to confirm, the .pb in saved_model/saved_model.pb does not work for the converter? I've never had to deal with MNN or freezing models in this way, so I'm not 100% sure how that all works.

The repo does have all the code for training the model, so you can initialise it, restore the checkpoint, and then freeze it as you wish. The code for initialising is here, and the model can be frozen from there; there's no need to mess around with the SavedModel format it comes in.
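
Roughly, the restore step would look something like the untested sketch below - the tiny Keras model is just a stand-in for however train.py actually builds the model, and the checkpoint directory is a placeholder:

```python
# Untested sketch of the "initialise + restore checkpoint" step, assuming a TF2-style checkpoint.
# The tiny Keras model is a stand-in for the real model definition in train.py.
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1)])   # stand-in for the real model
ckpt = tf.train.Checkpoint(model=model)
latest = tf.train.latest_checkpoint("saved_checkpoint")   # placeholder checkpoint directory
ckpt.restore(latest).expect_partial()
# From here the model can be wrapped in a tf.function and frozen.
```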

I've still got a few bugs to work out [...] And yes, the model is hit or miss when downloading.

Curious what issues you're running into? We've deployed the internal version of the model on our in-house infra (so it's not 1:1 the same), but it should be relatively straightforward to deploy the OSS version - happy to help with this.

@jasonhe88
Author

jasonhe88 commented May 24, 2023

Hi @Steeeephen,

Yeah, the converter only accepts a frozen model; here is the message:

[ERROR] MNNConvert just support tensorflow frozen graph model. Model file is not tf frozen graph model.

I will try to figure out how to freeze it as you said.

@Steeeephen
Contributor

I modified train.py to load the checkpoint and freeze the graph as per https://leimao.github.io/blog/Save-Load-Inference-From-TF2-Frozen-Graph/. Again, I've never really dealt with frozen graphs so I can't guarantee much haha. Try it out and let me know if it works for you - if it does, we can add it to the download zip.

Just put this in the base directory (where train.py is) and run python3 load_checkpoint_freeze_graph.py --checkpoint_dir saved_checkpoint. It should save it to frozen_models/

https://gist.github.com/Steeeephen/fff11d5270f0e34f287e8a6afe6979c9
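
For anyone curious, the script follows that blog post's recipe; roughly it does something like the sketch below (the stand-in model and input signature are placeholders for what the real script builds and expects):

```python
# Rough outline of the TF2 freeze recipe from the linked blog post - sketch only.
# The stand-in model and input signature are placeholders; the real script uses the
# restored checkpoint and the model's actual input shape/dtype.
import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2,
)

model = tf.keras.Sequential([tf.keras.layers.Dense(1)])   # stand-in for the restored model

# Wrap the model call in a tf.function and trace a concrete function.
full_model = tf.function(lambda x: model(x))
concrete_func = full_model.get_concrete_function(
    tf.TensorSpec([None, 4], tf.float32)                   # placeholder input signature
)

# Inline the variables as constants to produce a frozen graph.
frozen_func = convert_variables_to_constants_v2(concrete_func)

# Write the frozen GraphDef to frozen_models/frozen_graph.pb.
tf.io.write_graph(
    graph_or_graph_def=frozen_func.graph,
    logdir="frozen_models",
    name="frozen_graph.pb",
    as_text=False,
)
```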

@jasonhe88
Author

I modified train.py to load the checkpoint and freeze the graph [...] It should save it to frozen_models/

I just ran load_checkpoint_freeze_graph.py and got the frozen model, then converted it to MNN format!

The only problem is that the converted model is about 200 MB, too big to deploy on mobile; I will try to quantize it using the MNN compression tool.

Thank you so much!

@Steeeephen
Contributor

Niiice glad to hear it 🥳

I added a zip with the frozen model to the bucket and changed the download link on the repo, so the default download should now include the frozen model. Thanks for raising this!
