
Latest embeddings techniques (Infinite Context Length?) #14

Open
mtanghu opened this issue Sep 3, 2022 · 0 comments
Labels
enhancement: New feature or request

Comments

@mtanghu
Owner

mtanghu commented Sep 3, 2022

Rotary embeddings, relative key-query embeddings, or maybe an adapted ALiBi embedding all seem worth implementing to improve the performance of LEAP. Ideally this would be an option passed in the config, like BERT's position_embedding_type (link below, with a rough sketch after it); maybe even the same code could be reused!

This should also enable Infinite Context Length, though constant-memory gradient calculation (see #12) would additionally be needed to really achieve it (the second sketch below illustrates why ALiBi-style biases impose no length limit).

https://huggingface.co/docs/transformers/v4.21.1/en/model_doc/bert#transformers.BertConfig
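As a rough illustration of the config-driven option, here is a minimal sketch. LEAPConfig, LEAPAttention, and rotary_embedding are all hypothetical names for this issue, not existing LEAP code; the RoPE math follows the RoFormer formulation:

```python
import torch
import torch.nn as nn


class LEAPConfig:
    """Hypothetical config; only position_embedding_type is new here."""
    def __init__(self, hidden_size=256, position_embedding_type="rotary"):
        self.hidden_size = hidden_size
        # One of "absolute", "rotary", "alibi" -- chosen at model build time,
        # mirroring BERT's position_embedding_type.
        self.position_embedding_type = position_embedding_type


def rotary_embedding(x, base=10000.0):
    """Apply rotary position embeddings (RoPE) to x: (batch, seq_len, dim).
    dim must be even."""
    _, seq_len, dim = x.shape
    half = dim // 2
    # Rotation frequency per dimension pair, as in the RoFormer paper.
    inv_freq = 1.0 / (base ** (torch.arange(half, dtype=torch.float32) / half))
    angles = torch.arange(seq_len, dtype=torch.float32)[:, None] * inv_freq[None, :]
    cos, sin = angles.cos(), angles.sin()
    x1, x2 = x[..., :half], x[..., half:]
    # Rotate each (x1, x2) pair by its position-dependent angle.
    return torch.cat([x1 * cos - x2 * sin, x1 * sin + x2 * cos], dim=-1)


class LEAPAttention(nn.Module):
    """Hypothetical attention module that dispatches on the config flag."""
    def __init__(self, config):
        super().__init__()
        self.config = config
        self.query = nn.Linear(config.hidden_size, config.hidden_size)
        self.key = nn.Linear(config.hidden_size, config.hidden_size)

    def forward(self, hidden_states):
        q, k = self.query(hidden_states), self.key(hidden_states)
        if self.config.position_embedding_type == "rotary":
            # Positions are encoded by rotating q/k, not by an embedding table.
            q, k = rotary_embedding(q), rotary_embedding(k)
        return q, k  # downstream LEAP attention math would go here
```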
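On the infinite context point: ALiBi's attention bias is a pure function of relative distance with no learned position table, which is why it extrapolates to lengths never seen in training. A minimal sketch of the causal bias, following Press et al.'s formulation (names illustrative only):

```python
import torch


def alibi_bias(num_heads, seq_len):
    """Causal ALiBi bias of shape (num_heads, seq_len, seq_len),
    added to raw attention scores before softmax."""
    # Head-specific slopes: the geometric sequence 2^(-8/n), 2^(-16/n), ...
    slopes = torch.tensor([2.0 ** (-8.0 * (h + 1) / num_heads)
                           for h in range(num_heads)])
    pos = torch.arange(seq_len)
    # distance[i, j] = j - i, clamped to <= 0 so past keys get a penalty
    # growing linearly with distance (future keys are masked anyway).
    distance = (pos[None, :] - pos[:, None]).clamp(max=0)
    return slopes[:, None, None] * distance[None, :, :].float()
```

Since nothing here depends on a maximum trained length, seq_len can be anything at inference time; the remaining obstacle is the memory cost of backprop over long sequences, which is what the constant-memory gradient calculation in #12 would address.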

mtanghu added the enhancement, help wanted, and good first issue labels Sep 3, 2022
mtanghu changed the title from "Latest embeddings techniques" to "Latest embeddings techniques (Infinite Context Length?)" Sep 3, 2022
mtanghu removed the help wanted and good first issue labels Dec 31, 2022