BASED #29466

Open · 2 tasks done

axelmagn opened this issue Mar 5, 2024 · 1 comment

Comments

axelmagn commented Mar 5, 2024

Model description

BASED is an attention architecture that combines sliding-window attention with global linear attention, capturing the same kinds of dependencies as standard transformers while remaining subquadratic in sequence length.

The authors report that it outperforms comparable subquadratic models such as Mamba.
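For context, here is a minimal, self-contained sketch of the idea described above: a small sliding-window softmax attention combined with a global linear attention. This is purely illustrative and is not the authors' implementation; the feature map (`elu + 1`), the window size, the non-causal masking, and the summation of the two outputs are assumptions made for the sake of the example.

```python
import torch
import torch.nn.functional as F


def hybrid_attention(q, k, v, window: int = 64):
    """q, k, v: (batch, seq_len, dim) -> (batch, seq_len, dim)."""
    _, n, d = q.shape

    # Global linear attention: phi(q) (phi(k)^T v), cost O(n * d^2) instead of O(n^2 * d).
    phi_q, phi_k = F.elu(q) + 1, F.elu(k) + 1            # assumed feature map
    kv = torch.einsum("bnd,bne->bde", phi_k, v)          # (batch, d, d)
    z = torch.einsum("bnd,bd->bn", phi_q, phi_k.sum(1)).clamp(min=1e-6)
    linear_out = torch.einsum("bnd,bde->bne", phi_q, kv) / z.unsqueeze(-1)

    # Sliding-window softmax attention: each token attends only to nearby
    # positions, so cost stays O(n * window * d). Non-causal here for brevity.
    scores = torch.einsum("bqd,bkd->bqk", q, k) / d**0.5
    idx = torch.arange(n, device=q.device)
    keep = (idx[:, None] - idx[None, :]).abs() <= window // 2
    scores = scores.masked_fill(~keep, float("-inf"))
    local_out = torch.einsum("bqk,bkd->bqd", scores.softmax(dim=-1), v)

    # Hybrid: sum the precise local component and the coarse global one.
    return local_out + linear_out


if __name__ == "__main__":
    x = torch.randn(2, 256, 32)
    print(hybrid_attention(x, x, x).shape)  # torch.Size([2, 256, 32])
```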

Open source status

  • The model implementation is available
  • The model weights are available

Provide useful links for the implementation

@simran-arora

Hi, I'm curious whether it's possible to add these models. Is there anything I can do to help speed it along?
