I found the problem. In this part, I did not trim the labels to the maximum sequence length. Besides, `>=` is necessary. I have updated the code and added recall and F-score evaluation. Thanks for your help.
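The fix described above can be sketched roughly as follows. This is a minimal illustration, not the repository's actual code: the function name `truncate_example` and the variable names are assumptions, and the two reserved positions stand in for the `[CLS]` and `[SEP]` markers that BERT adds around each sequence.

```python
def truncate_example(tokens, labels, max_seq_length):
    """Trim tokens AND labels so the final sequence fits max_seq_length.

    Illustrative sketch: two slots are reserved for [CLS] and [SEP].
    The comparison must be >= (not >) so that a sequence landing
    exactly on the boundary is also trimmed.
    """
    if len(tokens) >= max_seq_length - 1:
        tokens = tokens[: max_seq_length - 2]
        labels = labels[: max_seq_length - 2]
    return tokens, labels
```

Trimming the labels together with the tokens keeps the two lists aligned, which is what the original code missed.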
hi kyzhouhzau~
thank you for this project :)
there is a minor error which I'd like to report.
`tokenizer.convert_tokens_to_ids(ntokens)` would generate a list longer than `max_seq_length` when we are using `--max_seq_length=128`. So, I ran with `--max_seq_length=150`, and it was fine.
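For context, the reason the id list can exceed the original word count is that WordPiece splits rare words into several subword pieces. The toy splitter below is an assumption made up for illustration (real WordPiece uses a learned vocabulary, not fixed-width chunks), but it shows the effect: the number of pieces grows past the number of words, which is why a sentence that fits in 128 words can overflow `max_seq_length=128` after tokenization.

```python
def toy_wordpiece(word):
    """Toy stand-in for WordPiece: split words longer than 4 chars
    into 4-char pieces, marking continuations with '##'."""
    if len(word) <= 4:
        return [word]
    return [word[:4]] + ["##" + word[i:i + 4] for i in range(4, len(word), 4)]

words = ["tokenization", "is", "fun"]
pieces = [p for w in words for p in toy_wordpiece(w)]
# 3 input words expand to 5 subword pieces
```

Truncating after tokenization (as in the maintainer's fix) handles this correctly, whereas simply raising `--max_seq_length` only moves the boundary.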