How to use TensorFlow's Universal Sentence Encoder #21
bump
Hi @drbh! First of all, sorry for the late reply, but I completely missed the issue! However, I've just tried to execute the Python code, using TensorFlow 1.14 in a Google Colab, and it doesn't work:

```python
import tensorflow as tf
import tensorflow_hub as hub

embed = hub.Module("https://tfhub.dev/google/universal-sentence-encoder-large/3")
init = tf.global_variables_initializer()
embeddings = embed([
    "The quick brown fox jumps over the lazy dog.",
    "I am a sentence for which I would like to get its embedding"])

with tf.Session() as sess:
    sess.run(init)
    print(sess.run(embeddings))
```

I got an error about a non-initialized table, so I fixed the Python code this way (the tables initializer was missing):

```python
import tensorflow as tf
import tensorflow_hub as hub

embed = hub.Module("https://tfhub.dev/google/universal-sentence-encoder-large/3")
inits = [tf.global_variables_initializer(), tf.tables_initializer()]
embeddings = embed([
    "The quick brown fox jumps over the lazy dog.",
    "I am a sentence for which I would like to get its embedding"])

with tf.Session() as sess:
    sess.run(inits)
    print(sess.run(embeddings))
```

Now that I have a working model, I can try to create a SavedModel from Python and then use it from Go. From what I understand, this should export the module:

```python
with tf.Session() as sess:
    sess.run(inits)
    embed.export("wat", sess)
```

It produces the exported content in the `wat` directory. Ok, it really looks like a SavedModel. I don't have a PC with a Go installation right now (yeah, I just formatted it and I still have to set up the development environment), but I guess that after extracting the needed information from the exported model, loading it from Go should work. Here's the info I can get from Python:

```python
print(embed.get_signature_names())
print(embed.get_input_info_dict())
print(embed.get_output_info_dict())
```

That gives the signature names plus the input/output info dicts. So (I haven't tested it, but I guess it should work, or at least it's a good starting point for further investigation), from Go you can load the model like this:

```go
package main

import (
	"fmt"

	tg "github.com/galeone/tfgo"
)

func main() {
	model := tg.LoadModel("wat/", []string{"default"}, nil)
	fmt.Println(model)
}
```

Then, maybe, you can use the input/output information extracted above to run the model. Let me know if it helps and if you are able to load the model (and how!). Sorry again for the huge delay.
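Once embeddings come back (from either the Python or the Go side), the usual next step is comparing two sentences with cosine similarity. A minimal, dependency-free sketch; the vectors below are made-up stand-ins, not real encoder output:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot product divided by the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-d vectors standing in for the encoder's 512-d output rows.
v1 = [0.1, 0.3, 0.5]
v2 = [0.1, 0.3, 0.5]
v3 = [0.5, -0.3, 0.1]

print(round(cosine_similarity(v1, v2), 4))  # identical vectors -> 1.0
print(cosine_similarity(v1, v3) < 1.0)      # different vectors -> True
```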
Wow, thanks so much for the detailed and fast response! Also, thank you for explaining how to save a model. Sadly, I've followed those instructions and re-saved the TensorFlow Hub model into a SavedModel, but this model still comes up tagless.

The Python model import and re-save:

```python
import tensorflow as tf
import tensorflow_hub as hub

embed = hub.Module("https://tfhub.dev/google/universal-sentence-encoder-large/3")
inits = [tf.global_variables_initializer(), tf.tables_initializer()]

print(embed.get_signature_names())
print(embed.get_input_info_dict())
print(embed.get_output_info_dict())

with tf.Session() as sess:
    sess.run(inits)
    embed.export("wat", sess)
```

```
$ python3 model.py
$ saved_model_cli scan --dir wat/
The given SavedModel contains the following tag-sets:
```

When I try to import it in Go:

```go
package main

import (
	"fmt"

	tg "github.com/galeone/tfgo"
)

func main() {
	model := tg.LoadModel("wat/", []string{"default"}, nil)
	fmt.Println(model)
}
```

```
$ go run main.go
2019-10-01 08:22:32.184989: I tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: wat/
2019-10-01 08:22:32.210609: I tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { default }
2019-10-01 08:22:32.218247: I tensorflow/cc/saved_model/loader.cc:311] SavedModel load for tags { default }; Status: fail. Took 33265 microseconds.
panic: Could not find meta graph def matching supplied tags: { default }. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: `saved_model_cli`
```

Any idea on how to add or update the tag-sets? Thanks again for pointing me in the right direction 👍
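For context on the panic above: a SavedModel can hold several meta graphs, each keyed by a set of tags, and the loader needs an exact match between the supplied tag-set and one of the stored tag-sets. A toy, non-TF sketch of that matching rule (`find_meta_graph` is a hypothetical helper, not part of any TensorFlow API):

```python
def find_meta_graph(tag_sets, supplied):
    # Return the index of the meta graph whose tag-set exactly equals
    # the supplied tags, or None if no meta graph matches.
    supplied = set(supplied)
    for i, tags in enumerate(tag_sets):
        if set(tags) == supplied:
            return i
    return None

# A hub-exported model whose scan shows an empty tag-set:
print(find_meta_graph([set()], {"default"}))    # None -> the panic above
# A model exported with the serving tag:
print(find_meta_graph([{"serve"}], {"serve"}))  # 0 -> load succeeds
```

This is why asking for `default` fails when `saved_model_cli scan` reports no tag-sets at all.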
You're welcome! However, perhaps the way to go is to inspect the SavedModel with the `saved_model_cli` tool, as the panic message suggests. It should give you all the information available for each tag (and so you will have the correct tag name). Let me know if it helps 👍
Thanks for the advice. Sadly, the above command did not yield any tags. However, I just needed to save the model differently (from Python) to get those tags in the right place:

```python
with tf.Session() as sess:
    sess.run(inits)
    builder = tf.saved_model.builder.SavedModelBuilder("wat")
    builder.add_meta_graph_and_variables(sess, [tf.saved_model.tag_constants.SERVING])
    builder.save()

# INFO:tensorflow:No assets to save.
# INFO:tensorflow:No assets to write.
# INFO:tensorflow:SavedModel written to: wat/saved_model.pb
```

Then from Go I can finally load the model (note: this took a couple of minutes to load into my GPU memory):

```go
func main() {
	model := tg.LoadModel("wat/", []string{"serve"}, nil)
	fmt.Println(model)
}
```

```
2019-10-09 08:53:42.904583: I tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
2019-10-09 08:56:59.418976: I tensorflow/cc/saved_model/loader.cc:311] SavedModel load for tags { serve }; Status: success. Took 196777906 microseconds.
&{0xc42009a220}
```

Profit 🎉 Any thoughts on why this way worked? Thanks again for your advice, it was extremely helpful.
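As a quick sanity check before pointing tfgo at an export directory, one can verify it has the usual SavedModel layout (a `saved_model.pb` protobuf plus a `variables/` subdirectory). This is a hedged sketch: the layout rule is an assumption based on typical exports, and the directory here is simulated rather than a real export:

```python
import os
import tempfile

def looks_like_saved_model(path):
    # A typical loadable SavedModel directory contains the saved_model.pb
    # graph protobuf and a variables/ subdirectory with checkpoint shards.
    return (os.path.isfile(os.path.join(path, "saved_model.pb"))
            and os.path.isdir(os.path.join(path, "variables")))

# Simulate the layout produced by SavedModelBuilder above.
with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, "saved_model.pb"), "w").close()
    os.mkdir(os.path.join(d, "variables"))
    print(looks_like_saved_model(d))  # True
```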
Great! Perhaps your solution worked because the hub export I suggested doesn't attach any tag-set to the saved meta graph, while your `SavedModelBuilder` approach explicitly writes the standard `serve` tag. I guess this makes sense. Let me know if you need any help, or I can close this issue.
A bit embarrassed to say, but I can't seem to run the loaded model. At a high level, I want to pass the model a list of strings and get the vector embeddings back, i.e. in Python:

```python
embeddings = embed([
    "The quick brown fox jumps over the lazy dog.",
    "I am a sentence for which I would like to get its embedding"])
...
sess.run(embeddings)
```

I've tried to pass a string input from Go, but with no luck so far.
You can use the input and output operations of the loaded graph to feed your strings and fetch the embeddings. Let me know if it helps.
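For reference, whatever the Go plumbing ends up looking like, the contract is the same as in Python: a batch of N strings in, an N×512 float matrix out (universal-sentence-encoder-large produces 512-dimensional vectors). A stub-backed sketch of that shape, where `fake_embed` is a hypothetical stand-in for the real model call:

```python
EMBED_DIM = 512  # output width of universal-sentence-encoder-large

def fake_embed(sentences):
    # Hypothetical stand-in: the real encoder maps each sentence to a
    # learned 512-dimensional vector; here we just zero-fill to show shape.
    return [[0.0] * EMBED_DIM for _ in sentences]

vectors = fake_embed([
    "The quick brown fox jumps over the lazy dog.",
    "I am a sentence for which I would like to get its embedding",
])
print(len(vectors), len(vectors[0]))  # 2 512
```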
Closing for inactivity.

Did you resolve the issue?
How would I load in the `universal-sentence-encoder-large` embedding model? In Python it works, but when I try it in Go the program panics when trying to load in the model 😕, and when I use the `saved_model_cli` I get empty results. How would I use the model?

The model data was downloaded and unzipped from https://tfhub.dev/google/universal-sentence-encoder-large/3?tf-hub-format=compressed