"Some weights of the model checkpoint were not used" - is that intended? #167

Open

Jakobhenningjensen opened this issue Mar 14, 2022 · 0 comments

When running

from danlp.models import load_bert_base_model

model = load_bert_base_model()
# df is a pandas DataFrame whose "TextColumns" column holds the sentences to embed
vectorized_text = [model.embed_text(sentence)[1] for sentence in df["TextColumns"]]

I get the following warning:

Some weights of the model checkpoint at C:\Users\Jakob\.danlp\bert.botxo.pytorch were not used when initializing BertModel: ['cls.seq_relationship.weight', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.bias', 'cls.predictions.transform.dense.bias', 'cls.predictions.decoder.weight', 'cls.predictions.transform.dense.weight', 'cls.seq_relationship.bias', 'cls.predictions.transform.LayerNorm.weight']
- This IS expected if you are initializing BertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing BertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).

Is this intended, i.e. should the warning simply be ignored, or is it a bug?
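
For reference, if the warning is expected, I assume it can be silenced through the transformers logging API (my assumption being that danlp loads the checkpoint via Hugging Face transformers); a minimal sketch:

from transformers import logging

# Only print errors; this hides initialization warnings like the one above
logging.set_verbosity_error()

from danlp.models import load_bert_base_model
model = load_bert_base_model()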
