Hi, thank you for this great paper and implementation. I have a question about loading a trained model in a separate process and using it for inference.
I have been able to successfully train a BertForPromptFinetuning model using a BERT MLM, and it has been saved to a directory. I am attempting to load it for inference using the .from_pretrained function, like so:
```python
# load models, config and tokenizers
config = AutoConfig.from_pretrained("../LM-BFF/result/tmp/config.json")
model = BertForPromptFinetuning.from_pretrained('../LM-BFF/result/tmp/', config=config)
tokenizer = AutoTokenizer.from_pretrained('../LM-BFF/result/tmp/')
```
This required loading the custom class `BertForPromptFinetuning` from your repo, along with some other configuration, e.g. initialising `label_word_list` to the list of vocabulary IDs used during training, which in my case was:
However, when attempting inference, I get an error relating to `prediction_mask_scores`:

```
IndexError: index 413 is out of bounds for dimension 1 with size 1
```
Now, I am unsure if what I am attempting is even possible. I am pretty new to using transformers, so I could be doing something silly. But I wondered if you had any advice on using the model in this way.
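For what it's worth, the shape mismatch behind the error can be reproduced in isolation. This is just an illustrative sketch (NumPy stands in for torch; the vocabulary size and label-word IDs are made up): indexing with `label_word_list` only works when dimension 1 of `prediction_mask_scores` spans the full vocabulary, not when it has size 1.

```python
import numpy as np

# Hypothetical numbers: a BERT-base-sized vocabulary and two label words.
vocab_size = 30522
label_word_list = [413, 2053]  # vocabulary IDs of the label words

# Expected case: prediction_mask_scores holds scores over the full
# vocabulary at the [MASK] position, shape (batch, vocab_size).
prediction_mask_scores = np.zeros((1, vocab_size))
logits = prediction_mask_scores[:, label_word_list]  # shape (1, 2)

# Failing case: dimension 1 has size 1, so a vocabulary ID like 413
# is out of bounds -- whatever tensor is being indexed is not a set of
# per-vocabulary MLM scores.
bad_scores = np.zeros((1, 1))
index_error = False
try:
    bad_scores[:, label_word_list]
except IndexError:
    index_error = True
```

If that is what is happening here, it may mean the loaded model is producing size-1 logits where LM-BFF expects per-vocabulary MLM scores (e.g. a standard classification head being instantiated instead of the MLM head), though I am not certain.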