
configure website down #51

Closed
Gasiorrr opened this issue Oct 14, 2023 · 2 comments · Fixed by #53
Labels
bug Something isn't working released

Comments

@Gasiorrr

Hi, I like CatAI and wanted to configure it.

https://withcatai.github.io/node-llama-cpp/types/LlamaModelOptions.html

But this website doesn't work.

Can you suggest a good configuration for your bot?

PS.

I launched this CatAI GitHub program on an S8+; I just installed git and cmake, and CatAI runs normally on Termux :)

@Gasiorrr Gasiorrr added the bug Something isn't working label Oct 14, 2023
@ido-pluto
Collaborator

This website has changed a bit; you can look here instead:
https://withcatai.github.io/node-llama-cpp/api/type-aliases/LlamaContextOptions
https://withcatai.github.io/node-llama-cpp/api/type-aliases/LLamaChatPromptOptions

A good configuration really depends on your needs:

  • If you need the model to remember more, you may want to increase the context size
  • If it is slow, try increasing or decreasing the thread count
  • If you want it to be more random, increase the temperature
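Putting those knobs together, here is a rough sketch of what such settings could look like. The option names (`contextSize`, `threads`, `temperature`) and the values are illustrative assumptions based on the type pages linked above; check them against the version of node-llama-cpp you actually have installed.

```javascript
// Hypothetical tuning values — adjust for your device and model.
const contextOptions = {
  contextSize: 4096, // larger = the model "remembers" more of the conversation
  threads: 4,        // raise or lower this if generation feels slow
};

const promptOptions = {
  temperature: 0.8,  // higher = more random replies; near 0 = more deterministic
};
```

On a phone like the S8+, smaller values for `contextSize` and `threads` are usually a safer starting point than on a desktop.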

I am glad to hear it; I actually did that too :)

@ido-pluto ido-pluto mentioned this issue Oct 26, 2023
@github-actions

🎉 This issue has been resolved in version 3.0.1 🎉

The release is available on:

Your semantic-release bot 📦🚀
