Hi @sbmaruf
Currently the model is trained to predict NER tags only for sequences of up to 128 tokens. You can input a longer sentence, but the output won't be good. The reason is that BERT has positional embeddings, so after fine-tuning only the first 128 positions are adapted to the NER task, even though BERT can accept a maximum sequence length of 512.
In the train set only 1 sentence is longer than 128 tokens; the dev and test sets have 2 and 4 such sentences respectively.
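One common workaround (not something this repo implements) is to split a long sentence into overlapping windows of at most 128 tokens, run the model on each window, and merge the predictions. A minimal sketch of the windowing step, with illustrative `max_len` and `stride` values:

```python
def chunk_tokens(tokens, max_len=128, stride=64):
    """Split a token list into overlapping windows of at most max_len tokens.

    Each window can be fed to the model independently; for tokens that
    appear in more than one window, you can keep the prediction from the
    window where the token sits furthest from a boundary, since boundary
    tokens see less context. The values 128 and 64 are assumptions for
    illustration, not taken from the repo.
    """
    if len(tokens) <= max_len:
        return [tokens]
    windows = []
    start = 0
    while start < len(tokens):
        windows.append(tokens[start:start + max_len])
        if start + max_len >= len(tokens):
            break
        start += stride
    return windows


# Example: a 300-token sentence yields overlapping 128-token windows
# that together cover every position.
windows = chunk_tokens(list(range(300)))
```

This only sketches the splitting; merging per-window tag sequences back into one prediction (and handling WordPiece sub-tokens at window edges) still has to be done by the caller.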
Hi @kamalkraj !
Nice repo.
If a sentence is longer than 128 tokens, how do you predict NER tags for it, especially in the test data?