how to export frozen graph #17
I see this function in the utils.py file:
Is this code for freezing the model? If so, how do I freeze the model?
The freeze_all function is for transfer learning during training. For exporting, the official guide recommends the SavedModel format. A SavedModel stores the graph in .pb format, but it comes with extra files for the variables and parameters.
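For illustration, a minimal SavedModel export sketch (the `Demo` module here is a hypothetical stand-in; with this repo you would build the YOLO model and load its trained weights instead):

```python
import os
import tempfile

import tensorflow as tf


class Demo(tf.Module):
    """Tiny placeholder model with one variable, so the export has weights."""

    def __init__(self):
        self.w = tf.Variable(tf.ones([4, 2]))

    @tf.function(input_signature=[tf.TensorSpec([None, 4], tf.float32)])
    def __call__(self, x):
        return tf.matmul(x, self.w)


model = Demo()
export_dir = os.path.join(tempfile.mkdtemp(), "demo_savedmodel")

# Writes saved_model.pb (the graph) plus a variables/ directory (the weights),
# which is the "extra files" layout mentioned above.
tf.saved_model.save(model, export_dir)
```

The resulting directory is exactly what TensorFlow Serving expects to load.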
Hello @zzh8829, thanks for your help. I tried to deploy a serving model using TensorRT Inference Server, but I got this error:
Do you know why?
That might be related to an outdated tensorflow/serving. Try a newer version and see if that works:
Thanks @andydion, but I'm using TensorRT Inference Server instead of TF Serving.
Hello @andydion, I wrote client code, but it doesn't work.
Hi, I think this is a compatibility issue with TensorRT. I recommend using TensorFlow Serving instead.
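For reference, a SavedModel directory can be served with the official TensorFlow Serving Docker image; a sketch (the paths and model name below are placeholders, not from this repo):

```shell
# Serve a SavedModel over REST on port 8501 with a recent
# tensorflow/serving image. Replace the source path and MODEL_NAME
# with your own export directory and model name.
docker run -p 8501:8501 \
  --mount type=bind,source=/path/to/yolov3/saved_model,target=/models/yolov3 \
  -e MODEL_NAME=yolov3 -t tensorflow/serving:latest
```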
Hello @zzh8829, thanks for your code!
How do I convert your checkpoint to a frozen graph (.pb file)?
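For reference, in TF2 a single frozen .pb is produced by folding the variables of a concrete function into constants with `convert_variables_to_constants_v2`. A minimal sketch (the `Demo` module is a hypothetical stand-in; with this repo you would build the YOLO model, call `model.load_weights(checkpoint_path)`, and wrap its call instead):

```python
import os
import tempfile

import tensorflow as tf
from tensorflow.python.framework.convert_to_constants import (
    convert_variables_to_constants_v2,
)


class Demo(tf.Module):
    """Tiny placeholder model; swap in the real model loaded from a checkpoint."""

    def __init__(self):
        self.w = tf.Variable(tf.ones([4, 2]))

    @tf.function(input_signature=[tf.TensorSpec([None, 4], tf.float32)])
    def __call__(self, x):
        return tf.matmul(x, self.w)


model = Demo()

# Trace the model into a single concrete function.
concrete = model.__call__.get_concrete_function()

# Fold the variables into graph constants, yielding a frozen GraphDef.
frozen = convert_variables_to_constants_v2(concrete)

# Serialize the frozen graph to a standalone .pb file.
out_dir = tempfile.mkdtemp()
tf.io.write_graph(frozen.graph.as_graph_def(), out_dir,
                  "frozen_graph.pb", as_text=False)
```

The resulting frozen_graph.pb carries both the graph structure and the weights in one file, unlike a SavedModel, which keeps them separate.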