
Role, prompt (Роль, промт) #686

Open
den47999 opened this issue Apr 1, 2024 · 3 comments
Labels
upgrade New feature or request

Comments


den47999 commented Apr 1, 2024

Good afternoon :)
Could you add the ability to set your own prompt and role? Also, please add a way to adjust the temperature; I would like to set it to 0.1-0.3.
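For background, the temperature parameter divides the model's logits before sampling, so low values such as 0.1-0.3 concentrate probability on the top-scoring token and make replies nearly deterministic. A small plain-Python illustration of the effect (not Khoj's actual sampling code):

```python
import math

def softmax_with_temperature(logits, temperature):
    # Divide logits by the temperature before normalizing; a low temperature
    # concentrates probability mass on the highest-scoring token.
    scaled = [x / temperature for x in logits]
    m = max(scaled)                        # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]
print(round(softmax_with_temperature(logits, 1.0)[0], 2))  # 0.63
print(round(softmax_with_temperature(logits, 0.2)[0], 2))  # 0.99
```

At temperature 1.0 the top token gets about 63% of the mass; at 0.2 it gets about 99%, which is why low temperatures are preferred for code generation.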
Also, the settings list the model's context as 128000 tokens, but when I send a long message (~5000 tokens) to the chat, this error appears:

414 Request-URI Too Large ("414 Request-URI слишком большой")

![Screenshot_1](https://github.com/khoj-ai/khoj/assets/118296790/109bf057-de01-4245-927e-f90a258d11cc)
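Aside: a 414 status means the server rejected the URL itself as too long, which suggests the chat text is being passed as a query parameter rather than in the request body. A minimal sketch of why a long message overflows typical URL limits (the endpoint path and port are illustrative, not necessarily Khoj's actual API):

```python
from urllib.parse import urlencode

# A long chat message, roughly 5000 tokens (~21000 characters).
long_message = "token " * 3500

# Appending the message as a query string produces a URL far beyond the
# ~8 KB cap many servers and reverse proxies enforce, which is what yields
# "414 Request-URI Too Large".
url = "http://localhost:42110/api/chat?" + urlencode({"q": long_message})
print(len(url) > 8192)  # True: a URL this long would trigger 414

# The usual fix is to carry the text in a POST body instead, e.g.
# requests.post("http://localhost:42110/api/chat", json={"q": long_message}),
# since request bodies are not subject to URL-length limits.
```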
den47999 added the upgrade (New feature or request) label Apr 1, 2024
den47999 (Author) commented Apr 1, 2024

(screenshot Screenshot_1 attached)
Also, note that in the settings the model's context is listed as 128000 tokens, but when sending a long message (~5000 tokens) to the chat, an error appears.

sabaimran (Collaborator) commented:

Hi @den47999 ! We've limited the token limit used for this model. Thanks for pointing this out, I'll update it.

You'd like a custom prompt in the cloud instance?

den47999 (Author) commented:

I would like the token limit to be 128000, since that helps me with writing code. Please also add temperature selection and prompt roles.
