Rotary embeddings, key-query embeddings, or maybe an adapted ALiBi embedding all seem worth implementing to improve the performance of LEAP. Ideally this would be implemented as an option that can be passed in the config, like BERT's `position_embedding_type` (link below); maybe even the same code could be reused! A rough sketch of what that could look like follows the link.

This should also enable Infinite Context Length, though constant-memory gradient calculation (see #12) would be needed to really take advantage of it.
https://huggingface.co/docs/transformers/v4.21.1/en/model_doc/bert#transformers.BertConfig
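A minimal sketch of how the config flag might dispatch, assuming hypothetical `LeapAttention` / config attribute names (LEAP's actual module layout may differ). Only the rotary branch is fleshed out here, following the standard RoFormer formulation:

```python
import torch


def build_rotary_cache(seq_len, head_dim, base=10000.0, device=None):
    """Precompute cos/sin tables for rotary embeddings (Su et al., RoFormer)."""
    inv_freq = 1.0 / (
        base ** (torch.arange(0, head_dim, 2, device=device).float() / head_dim)
    )
    angles = torch.outer(torch.arange(seq_len, device=device).float(), inv_freq)
    return angles.cos(), angles.sin()  # each (seq_len, head_dim // 2)


def apply_rotary(x, cos, sin):
    """Rotate consecutive dimension pairs of x: (batch, heads, seq_len, head_dim)."""
    x1, x2 = x[..., 0::2], x[..., 1::2]
    out = torch.stack((x1 * cos - x2 * sin, x1 * sin + x2 * cos), dim=-1)
    return out.flatten(-2)


class LeapAttention(torch.nn.Module):  # hypothetical name, for illustration only
    def __init__(self, config):
        super().__init__()
        # Mirrors BertConfig.position_embedding_type: a plain string on the config.
        self.position_embedding_type = getattr(
            config, "position_embedding_type", "absolute"
        )

    def maybe_apply_position_embeddings(self, q, k):
        if self.position_embedding_type == "rotary":
            cos, sin = build_rotary_cache(q.size(-2), q.size(-1), device=q.device)
            return apply_rotary(q, cos, sin), apply_rotary(k, cos, sin)
        # An "alibi" branch would instead produce a bias added to attention
        # scores (which needs adapting for LEAP's linear attention), and
        # "absolute" leaves q/k untouched since those embeddings are added
        # to the hidden states earlier.
        return q, k
```

Since rotary and ALiBi are relative schemes, there is no learned position table capping the sequence length, which is what would open the door to the Infinite Context Length use case mentioned above.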