Hi, I'd like to configure CatAI.
https://withcatai.github.io/node-llama-cpp/types/LlamaModelOptions.html
But this website doesn't work.
Can you provide me with a good configuration for your bot?
P.S.
I launched this CatAI GitHub program on an S8+. I just installed git and cmake, and CatAI runs normally on Termux :)
The website has changed a bit; you can look here instead:
https://withcatai.github.io/node-llama-cpp/api/type-aliases/LlamaContextOptions
https://withcatai.github.io/node-llama-cpp/api/type-aliases/LLamaChatPromptOptions
A good configuration really depends on your needs:
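As a rough starting point, here is a sketch of the kind of settings those pages cover. The option names here (`contextSize`, `batchSize`, `temperature`, `topK`, `topP`, `maxTokens`) are assumptions based on the linked node-llama-cpp docs, so check them against the current API before using them:

```javascript
// Hypothetical sketch of CatAI / node-llama-cpp tuning values.
// Option names are assumed from the linked docs -- verify against
// the current node-llama-cpp API before copying.

const contextOptions = {
  contextSize: 2048, // tokens of context; try a smaller value on low-RAM devices like an S8+
  batchSize: 512,    // prompt-evaluation batch size; smaller uses less memory
};

const chatPromptOptions = {
  temperature: 0.7,  // higher = more varied output, lower = more deterministic
  topK: 40,          // sample only from the 40 most likely tokens
  topP: 0.9,         // nucleus-sampling cutoff
  maxTokens: 512,    // cap on generated tokens per reply
};

console.log(JSON.stringify({ contextOptions, chatPromptOptions }, null, 2));
```

On a phone, the main knobs are usually context size and batch size (memory), while the sampling options mostly trade creativity against consistency.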
I am glad to hear it, I actually did that too :)
🎉 This issue has been resolved in version 3.0.1 🎉
The release is available on:
Your semantic-release bot 📦🚀