Freeze a model to serve within API #10
Comments
@PauloQuerido I too am trying to get a frozen graph to work. I got the .pb file from the link you posted, using his freeze_graph function with output_node_names=decoder/decoder/transpose_1.
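For anyone following along, here is a minimal sketch of that kind of freeze step in TF 1.x; the checkpoint path is an assumption, and only the output node name comes from this thread:

```python
# Hedged sketch: freeze a restored checkpoint into a single .pb (TF 1.x).
# The checkpoint path is an assumption; the output node name is the one
# discussed in this thread.
import tensorflow as tf
from tensorflow.python.framework import graph_util

with tf.Session() as sess:
    saver = tf.train.import_meta_graph("saved_model/model.ckpt.meta")
    saver.restore(sess, tf.train.latest_checkpoint("saved_model"))
    # Convert variables to constants so the graph is self-contained.
    frozen = graph_util.convert_variables_to_constants(
        sess, sess.graph.as_graph_def(), ["decoder/decoder/transpose_1"])
    with tf.gfile.GFile("frozen_model.pb", "wb") as f:
        f.write(frozen.SerializeToString())
```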
@dongjun-Lee Do you possibly have any insight on why this isn't working?
@gifflarn I'm sorry, but I'm not familiar with frozen graphs. I'll look into it soon.
@gifflarn I tried to use the following code to extract the output node names:
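A sketch of what such a snippet typically looks like: it lists every op name in the restored graph so candidate output nodes can be picked out by hand. The checkpoint path is an assumption:

```python
# Sketch: print all op names in a restored graph to find output
# candidates. The checkpoint path is an assumption.
import tensorflow as tf

with tf.Session() as sess:
    saver = tf.train.import_meta_graph("saved_model/model.ckpt.meta")
    saver.restore(sess, tf.train.latest_checkpoint("saved_model"))
    for op in tf.get_default_graph().get_operations():
        print(op.name)
```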
This is my code to freeze the graph: https://gist.github.com/gogasca/ac743e3664c3e9cb668e9666c9e7b025. I'm unable to restore the .pb and generate predictions. While test.py works for reading a file in a local environment, what I want to do is offer an API. Has anyone had any luck restoring the .pb?
@gogasca Isn't using every node in the graph as an output node counterproductive?
@gifflarn It's possible that I don't really need to list all the output nodes; I'm just testing. I will continue working on it today. How did you freeze the model to .pb? The only difference I see between train and test is the way run is executed and the parameters passed to it (train vs. test calls sketched below).
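For context, a hedged sketch of that difference; the attribute names on `model` and the batch variables are assumptions about the repo's train/test scripts, not a verified API:

```python
# Hedged sketch: how the train and test calls to sess.run differ.
# `model` and the batch_* arrays are assumed to come from the repo's
# train.py / test.py; the attribute names are guesses, not verified.

# Train: fetch the optimizer update, so the decoder targets must be fed.
_, step, loss = sess.run(
    [model.update, model.global_step, model.loss],
    feed_dict={
        model.batch_size: len(batch_x),
        model.X: batch_x,
        model.X_len: batch_x_len,
        model.decoder_input: batch_decoder_input,
        model.decoder_len: batch_decoder_len,
        model.decoder_target: batch_decoder_output,
    })

# Test: fetch only the prediction op, so no decoder targets are fed.
prediction = sess.run(
    model.prediction,
    feed_dict={
        model.batch_size: len(batch_x),
        model.X: batch_x,
        model.X_len: batch_x_len,
    })
```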
@gogasca From my understanding, you only specify the last layer(s) of the graph as output nodes, 'freezing' everything between the input and output nodes. I only specified decoder/decoder/transpose_1 as the output node, and I hoped I could get it to work like this, without success.
@gifflarn Export to .pb, then serve using the .pb file. I get this error when I run the second script in the gist.
Error
Any other suggestions? I modified utils.py to read text instead of a file, and replaced the maps and lambdas with list comprehensions to improve readability (a sketch of the change follows below).
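A sketch of the kind of utils.py change described: tokenizing a raw string instead of reading from a file, with a list comprehension in place of map/lambda. The function name and the `<unk>`/`<padding>` tokens are assumptions, not the repo's verified API:

```python
# Hedged sketch: accept raw text instead of a file path. The function
# name and the special tokens are assumptions.
from nltk.tokenize import word_tokenize

def text_to_word_ids(text, word_dict, max_len):
    """Tokenize a raw string and map words to ids, padding to max_len."""
    tokens = [w.lower() for w in word_tokenize(text)]
    ids = [word_dict.get(w, word_dict["<unk>"]) for w in tokens]
    return ids[:max_len] + [word_dict["<padding>"]] * max(0, max_len - len(ids))
```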
I did a slight modification of test.py and now I'm able to serve API requests. This is just a workaround, as I haven't solved the export-to-.pb issue.
This is the Flask server:
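A minimal sketch along those lines, assuming the model is restored the way test.py does it; the helpers `encode_text` and `decode_ids`, the checkpoint paths, and the route are hypothetical stand-ins:

```python
# Hedged sketch: wrap the restored test.py model in a Flask endpoint.
# encode_text/decode_ids are hypothetical helpers standing in for the
# utils.py functions that map raw text to ids and back.
import tensorflow as tf
from flask import Flask, request, jsonify

app = Flask(__name__)

# Restore the checkpoint once at startup, so each request only pays for
# a sess.run, not a full graph restore.
sess = tf.Session()
saver = tf.train.import_meta_graph("saved_model/model.ckpt.meta")
saver.restore(sess, tf.train.latest_checkpoint("saved_model"))
graph = tf.get_default_graph()

@app.route("/summarize", methods=["POST"])
def summarize():
    text = request.json["text"]
    feed_dict = encode_text(graph, text)  # hypothetical helper
    prediction = sess.run(
        graph.get_tensor_by_name("decoder/decoder/transpose_1:0"),
        feed_dict=feed_dict)
    return jsonify({"summary": decode_ids(prediction)})  # hypothetical helper

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```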
API Request:
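An example call against that sketch (the endpoint name and port are the assumed ones from above):

```python
# Example request against the hypothetical /summarize endpoint.
import requests

resp = requests.post("http://localhost:5000/summarize",
                     json={"text": "some article text to summarize ..."})
print(resp.json())
```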
I too got the script to answer API calls, but it was very slow (~2 s per sentence), which is why I am trying to freeze it. Could you time your solution? Maybe freezing the graph is not necessary. And to your earlier comment: I did feed dummy values to Tensor_2 and Tensor_3; this yielded decimal values which could not be looked up in the dictionary, and if I floored the decimals I got really weird results.
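A simple way to time the round-trip, for comparison against the ~2 s figure (hypothetical endpoint from the sketch above):

```python
# Time one request round-trip against the hypothetical endpoint.
import time
import requests

start = time.time()
requests.post("http://localhost:5000/summarize",
              json={"text": "some article text to summarize ..."})
print("latency: %.2fs" % (time.time() - start))
```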
It's now running on a Mac Pro with ~16 GB RAM and an Intel Core i7.
@gogasca Did you progress any further on this? I've put this project on the bench for now, but I really want it to work, so if you have any ideas, I'm willing to try.
@gifflarn I'm resuming this project today; I need to present some results within the next two weeks. I will post updates on my progress.
@gogasca I had the idea of rewriting model.py and putting the decoder placeholders under a …
Hi.
I successfully tested a Portuguese corpus I prepared and trained on (after changing a line in utils.py:
for word in word_tokenize(sentence, language='portuguese'):
). I'd like to have a frozen model in a single .pb file in order to serve it within an API. I tried several approaches, like this one: https://blog.metaflow.fr/tensorflow-how-to-freeze-a-model-and-serve-it-with-a-python-api-d4f3596b3adc, but without success.
Would you consider providing a method to export a saved model? Or point me in the right direction?
Thanks!
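For readers landing here: one export route such a request usually takes in TF 1.x is `tf.saved_model.simple_save`. A sketch, with the output tensor name taken from this thread and everything else (paths, the `X:0` input name) assumed:

```python
# Hedged sketch: export a restored checkpoint as a SavedModel (TF 1.x).
# Tensor names below are assumptions, not verified against model.py.
import tensorflow as tf

with tf.Session() as sess:
    saver = tf.train.import_meta_graph("saved_model/model.ckpt.meta")
    saver.restore(sess, tf.train.latest_checkpoint("saved_model"))
    g = tf.get_default_graph()
    tf.saved_model.simple_save(
        sess,
        "export/1",  # versioned export directory, as TF Serving expects
        inputs={"X": g.get_tensor_by_name("X:0")},
        outputs={"prediction": g.get_tensor_by_name(
            "decoder/decoder/transpose_1:0")})
```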