
Feature Request: Support for Model settings #42

Open
twalderman opened this issue Jan 30, 2024 · 4 comments

Comments

@twalderman
Support Min-P, Repeat Penalty, Repeat Penalty Tokens

@longy2k
Owner

longy2k commented Feb 3, 2024

v1.8.4

I have not tested all the parameters, but let me know if this works for your use case.

Thanks

EDIT: I noticed the regen/edit is not as random as I want it to be. I set the seed parameter's default value to '0', which causes the model to generate the same response for the same prompt. I will clear the seed parameter tomorrow.

@longy2k
Owner

longy2k commented Feb 4, 2024

v1.8.5

Cleared the seed parameter default value.

You may still have to go to Ollama Local LLMS > Advanced Settings > seed and clear the input field.
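For reference, a minimal sketch (in Python) of why clearing the field matters: a fixed seed such as `0` makes Ollama's sampling reproducible for a given prompt, while omitting `seed` from the request's `options` restores the default random behavior. The option names follow Ollama's documented request parameters; the `build_payload` helper is hypothetical, just to show the shape of the request body.

```python
import json

def build_payload(prompt, model="llama2", seed=None,
                  repeat_penalty=1.1, repeat_last_n=64):
    """Assemble an /api/generate request body; only include 'seed'
    when it is explicitly set (a cleared field sends no seed at all)."""
    options = {
        "repeat_penalty": repeat_penalty,  # penalize recently repeated tokens
        "repeat_last_n": repeat_last_n,    # how far back the penalty looks
    }
    if seed is not None:
        options["seed"] = seed             # fixed seed -> same output every time
    return {"model": model, "prompt": prompt, "options": options}

deterministic = build_payload("Hello", seed=0)
random_default = build_payload("Hello")

print(json.dumps(deterministic["options"]))
print("seed" in random_default["options"])  # seed omitted -> random sampling
```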

@twalderman
Author

It would be good if there were a way to save presets of Ollama settings.

@twalderman
Author

twalderman commented Feb 5, 2024

Min_P is not yet supported by Ollama. I put in a feature request for it with the Ollama project.

More info:

https://www.reddit.com/r/LocalLLaMA/comments/17vonjo/your_settings_are_probably_hurting_your_model_why/
