Hi!
Thank you for sharing BROS!
I ran into a document where the entities extend beyond the 512-token limit, and I see BROS has a configuration parameter to raise it:
max_seq_length: 512
However, the pre-trained model available on Hugging Face only supports 512 tokens. Does that mean fine-tuning is also limited to 512 tokens?
Thank you!
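For context, a common workaround for encoder models with a fixed 512-token limit (not specific to BROS) is to split long inputs into overlapping sliding windows and run the model on each window. A minimal sketch of the chunking step; `chunk_token_ids` is a hypothetical helper, not part of the BROS codebase:

```python
def chunk_token_ids(ids, max_len=512, stride=64):
    """Split a long token-id sequence into overlapping windows of at most
    max_len tokens. Consecutive windows overlap by `stride` tokens so that
    entities near a window boundary appear whole in at least one window."""
    if len(ids) <= max_len:
        return [ids]
    chunks = []
    step = max_len - stride  # how far each window advances
    for start in range(0, len(ids), step):
        chunks.append(ids[start:start + max_len])
        if start + max_len >= len(ids):
            break  # last window already reaches the end of the sequence
    return chunks
```

For BROS specifically, each window's bounding-box inputs would need to be sliced the same way as the token ids, and predictions in the overlap region merged afterwards.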