Dear Author:
Thank you very much for the sample code of RGTN. I am trying to use the Hugging Face Transformer-XL model to get semantic embeddings of the node text.
Here is the official example code from the Hugging Face Transformer-XL docs:
===
`from transformers import TransfoXLTokenizer, TransfoXLModel
import torch

tokenizer = TransfoXLTokenizer.from_pretrained("transfo-xl-wt103")
model = TransfoXLModel.from_pretrained("transfo-xl-wt103")

inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
outputs = model(**inputs)
# shape: (batch_size, sequence_length, hidden_size) — one vector per token
last_hidden_states = outputs.last_hidden_state`
===
But last_hidden_states contains per-token (word) embeddings, not an embedding of the whole text. Could you please explain how to obtain the text embedding, or release that part of the sample code?
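In case it helps other readers: one common way to collapse per-token hidden states into a single text embedding is mean pooling over the token dimension. This is only a sketch of that idea, not necessarily what the RGTN authors did; the random tensor below is a dummy stand-in for `outputs.last_hidden_state` so the pooling step can be shown on its own:

```python
import torch

# Dummy stand-in for outputs.last_hidden_state from the snippet above:
# shape (batch_size, sequence_length, hidden_size)
last_hidden_states = torch.randn(1, 6, 1024)

# Mean pooling: average over the token dimension (dim=1),
# yielding one fixed-size vector per input text.
text_embedding = last_hidden_states.mean(dim=1)

print(text_embedding.shape)  # (1, 1024)
```

Max pooling (`last_hidden_states.max(dim=1).values`) or taking the last token's state are alternatives; which works best for node text would need to be checked empirically.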