
Support for Microsoft Phi-3 128k context length #2123

Open
niutech opened this issue Apr 24, 2024 · 1 comment

Comments

niutech (Contributor) commented Apr 24, 2024

Please add support for the microsoft/Phi-3-mini-128k-instruct model in the candle-phi example; it uses LongRoPE scaling. Thanks!
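
For context, the 128k variant's LongRoPE scaling divides each rotary inverse frequency by a per-dimension factor (the short_factor / long_factor arrays in the model's config.json, selected by sequence length) and scales the cos/sin tables by an attention factor. Below is a minimal candle sketch of building those tables, assuming the config field names from the Phi-3 repository; the struct and signature are illustrative, not candle's API:

```rust
use candle_core::{DType, Device, Result, Tensor};

/// Cached cos/sin tables for LongRoPE-style rotary embeddings (illustrative sketch).
struct LongRope {
    cos: Tensor, // (max_pos, head_dim / 2)
    sin: Tensor, // (max_pos, head_dim / 2)
}

impl LongRope {
    fn new(
        head_dim: usize,
        rope_theta: f64,     // `rope_theta` in config.json
        factors: &[f64],     // `short_factor` or `long_factor`, len = head_dim / 2
        max_pos: usize,      // `max_position_embeddings` (131072 for the 128k model)
        orig_max_pos: usize, // `original_max_position_embeddings` (4096)
        device: &Device,
    ) -> Result<Self> {
        // Each rotary inverse frequency is divided by its per-dimension factor.
        let inv_freq: Vec<f32> = (0..head_dim / 2)
            .map(|i| {
                let freq = 1f64 / rope_theta.powf(2.0 * i as f64 / head_dim as f64);
                (freq / factors[i]) as f32
            })
            .collect();
        // Attention scaling term from the Phi-3 reference code:
        // sqrt(1 + ln(s) / ln(orig_max_pos)) with s = max_pos / orig_max_pos.
        let s = max_pos as f64 / orig_max_pos as f64;
        let attn_factor = if s <= 1.0 {
            1.0
        } else {
            (1.0 + s.ln() / (orig_max_pos as f64).ln()).sqrt()
        };
        let n = inv_freq.len();
        let inv_freq = Tensor::new(inv_freq.as_slice(), device)?.reshape((1, n))?;
        let t = Tensor::arange(0u32, max_pos as u32, device)?
            .to_dtype(DType::F32)?
            .reshape((max_pos, 1))?;
        let freqs = t.matmul(&inv_freq)?; // (max_pos, head_dim / 2)
        Ok(Self {
            cos: (freqs.cos()? * attn_factor)?,
            sin: (freqs.sin()? * attn_factor)?,
        })
    }
}
```

At runtime the short or long factor array would be chosen depending on whether the sequence exceeds original_max_position_embeddings.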

EricLBuehler (Member) commented

@niutech, we support the 128k context length in mistral.rs, which you can run with `cargo run --release --features ... -- -i plain -m microsoft/Phi-3-mini-128k-instruct -a phi3`.

For reference, here is our implementation: https://github.com/EricLBuehler/mistral.rs/blob/6334b30fdf6447fa787dcbedb032fb825c22ae1f/mistralrs-core/src/models/layers.rs#L84
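
Applying the cached tables to the query/key projections then follows the usual rotate-half pattern; here is a rough sketch (the shapes, the `pos` offset, and the helper name are assumptions for illustration, not taken from that file):

```rust
use candle_core::{Result, Tensor};

/// Hypothetical helper: apply cached cos/sin to a (b, heads, seq, head_dim)
/// tensor starting at KV-cache offset `pos`, using rotate-half pairing.
fn apply_rotary(x: &Tensor, cos: &Tensor, sin: &Tensor, pos: usize) -> Result<Tensor> {
    let (_b, _h, seq_len, head_dim) = x.dims4()?;
    let cos = cos.narrow(0, pos, seq_len)?;            // (seq, head_dim / 2)
    let sin = sin.narrow(0, pos, seq_len)?;
    let cos = Tensor::cat(&[&cos, &cos], 1)?;          // (seq, head_dim)
    let sin = Tensor::cat(&[&sin, &sin], 1)?;
    let x1 = x.narrow(3, 0, head_dim / 2)?;
    let x2 = x.narrow(3, head_dim / 2, head_dim / 2)?;
    let rotated = Tensor::cat(&[&x2.neg()?, &x1], 3)?; // rotate-half
    x.broadcast_mul(&cos)? + rotated.broadcast_mul(&sin)?
}
```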

I would be happy to contribute it.
