Hi @luyang-huang96, thanks so much for posting the code. I noticed that the function `align_embed_position` keeps the positional embedding weights for the first 1026 tokens and concatenates copies of those same weights for the tokens after position 1026.
LongDocSum/Model/longbart/longbartmodel.py
Lines 278 to 285 in d2b9bd0
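For readers without the permalink handy, here is a minimal sketch of what such a weight-copying extension might look like. `extend_positional_embedding` is a hypothetical name chosen for illustration; the actual `align_embed_position` in `longbartmodel.py` may slice or offset the table differently. (The 1026 figure presumably comes from BART's 1024 max positions plus fairseq's 2-position padding offset.)

```python
import torch
import torch.nn as nn

def extend_positional_embedding(old_embed: nn.Embedding, n_copies: int = 2) -> nn.Embedding:
    """Hypothetical sketch: build a larger positional embedding table by
    tiling the pretrained one, reusing the original weights for positions
    past the pretrained limit (1026 rows for BART-large in fairseq)."""
    old_weight = old_embed.weight.data                      # (1026, d_model) for BART-large
    new_weight = torch.cat([old_weight] * n_copies, dim=0)  # repeat the whole table n_copies times
    new_embed = nn.Embedding(new_weight.size(0), new_weight.size(1))
    new_embed.weight.data.copy_(new_weight)
    return new_embed

# e.g. doubling BART's 1026-row table to cover ~2048 positions
old = nn.Embedding(1026, 1024)
new = extend_positional_embedding(old, n_copies=2)
assert new.weight.shape == (2052, 1024)
```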
I have two questions:
Does BART support more than 1024 tokens at inference time for the summarization task? (See facebookresearch/fairseq#1685 (comment).)