Embeddings from ELMo Transformer #2351
The transformer and LSTM architectures are sufficiently different that it is not possible to convert the torch weights to the HDF5 format that the `ElmoEmbedder` expects.
Thanks for the explanation :) Unfortunately, the Transformer ELMo documentation only shows how to do this via a configuration file.
Hey @stefan-it, here's a code snippet:
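The original snippet was not preserved in this copy of the thread; a sketch along these lines (using the AllenNLP 0.8.x API, with the archive path as a placeholder for your own `model.tar.gz`) shows how to embed a sentence with `BidirectionalLanguageModelTokenEmbedder` directly:

```python
import torch

from allennlp.data.token_indexers.elmo_indexer import ELMoTokenCharactersIndexer
from allennlp.data.tokenizers.token import Token
from allennlp.modules.token_embedders.bidirectional_language_model_token_embedder import (
    BidirectionalLanguageModelTokenEmbedder,
)

# Placeholder: path to the archive produced by your training run.
lm_model_file = "path/to/model.tar.gz"

sentence = "It is raining in Seattle ."
tokens = [Token(word) for word in sentence.split()]

lm_embedder = BidirectionalLanguageModelTokenEmbedder(
    archive_file=lm_model_file,
    dropout=0.2,
    bos_eos_tokens=["<S>", "</S>"],
    remove_bos_eos=True,
    requires_grad=False,
)

# Convert the tokens to character indices using the language model's vocab.
indexer = ELMoTokenCharactersIndexer()
vocab = lm_embedder._lm.vocab
character_indices = indexer.tokens_to_indices(tokens, vocab, "elmo")["elmo"]

# Batch of size 1.
indices_tensor = torch.LongTensor([character_indices])

# Embed, then take the single element out of the batch:
# shape (num_tokens, embedding_dim).
embeddings = lm_embedder(indices_tensor)[0]
```

Note that `<S>`/`</S>` markers are added and then stripped again (`remove_bos_eos=True`), so the output has one vector per input token.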
Two things to note:
Let me know if you have any other questions!
- Improves documentation for using transformer ELMo.
- A user requested an example of how to use transformer ELMo directly.
- #2351
Thanks so much! It is working perfectly :) Whenever I have questions about getting access to specific layers, I'll open a follow-up issue!
Hi,
I have the following question regarding the ELMo transformer model. I trained my own model and a `model.tar.gz` was created successfully :) With the "normal" ELMo model I would do the following to get the embeddings:
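The code the question refers to was not captured here; the usual `ElmoEmbedder` usage looks roughly like this (the file names are placeholders for the standard options/weights pair):

```python
from allennlp.commands.elmo import ElmoEmbedder

# Placeholders for the pretrained ELMo options and HDF5 weight files.
elmo = ElmoEmbedder(
    options_file="options.json",
    weight_file="weights.hdf5",
)

# Returns an array of shape (3, num_tokens, 1024): one row of
# activations for the character CNN layer and each of the two
# biLSTM layers.
vectors = elmo.embed_sentence(["I", "ate", "an", "apple", "for", "breakfast"])
```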
Could you provide a small snippet of how to get the embeddings from a transformer-based ELMo model 🤔
Some more questions:
The `model.tar.gz` contains `config.json` and `weights.th`. Could the `weights.th` file be converted to an HDF5 file so that the `ElmoEmbedder` can be used?

I guess `BidirectionalLanguageModelTokenEmbedder` should be used: it is derived from `LanguageModelTokenEmbedder`, but this base class does not provide any "embed" methods :(

Thanks ❤️