Support for Phi-3 models #58
Hi. It works normally with this template and with the BOS option enabled.
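The template itself isn't quoted in the comment above; as a point of reference only, the standard Phi-3 instruct chat format (an assumption, not necessarily the exact text the commenter used) looks like this, with the BOS token prepended by the runtime:

```
<|user|>
{prompt}<|end|>
<|assistant|>
```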
Hi. How can I make it generate until EOS? If I enable that option, the app crashes.
BOS is enabled and I have set that prompt, but I get an error as the reply to every message:
@guinmoon when you say "works normal", are you referring to the development version or the version in the App Store? The stable version from the App Store isn't honoring the end token, and the app crashes if you try enabling EOS.
The development version.
Make sure Metal=on, BOS=on, and EOS=off, and try setting the context size to 1024. I got 8-9 tok/sec. Phi-3 is officially supported only starting with llama.cpp release b2717; the latest LLMFarm commit uses b2692, and the TestFlight version uses b2135, which officially supports only Phi-2.
See Hugging Face for the models.
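As a side note, the settings suggested above (context size 1024, GPU offload analogous to Metal=on, generation stopped at Phi-3's end token rather than relying on the EOS option) can be sanity-checked outside the app. This is a minimal sketch using llama-cpp-python, which is not what LLMFarm uses internally; the model path is a hypothetical placeholder.

```python
# Minimal sketch (assumption): verify a Phi-3 GGUF with the thread's settings
# using llama-cpp-python, independent of LLMFarm.
from llama_cpp import Llama

llm = Llama(
    model_path="Phi-3-mini-4k-instruct-q4.gguf",  # hypothetical local path
    n_ctx=1024,        # context size suggested in the thread
    n_gpu_layers=-1,   # offload all layers, roughly analogous to Metal=on
)

# Standard Phi-3 instruct format; the library prepends BOS by default,
# which matches BOS=on in the app.
prompt = "<|user|>\nWhat is the capital of France?<|end|>\n<|assistant|>\n"

out = llm(
    prompt,
    max_tokens=128,
    stop=["<|end|>"],  # stop at Phi-3's end token instead of relying on EOS
)
print(out["choices"][0]["text"])
```

If the output terminates cleanly at `<|end|>` here but not in the app, the difference is most likely the bundled llama.cpp version rather than the prompt template.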