ChatOllama is missing the parameters `seed` and `base_url` #24703
Comments
I'm looking into it. I'll open a PR if I come to a nice solution.

There are actually more options missing.

Sounds good. To be honest, me neither. I mentioned `seed` and `base_url` because I use those. Thanks a lot!

@noggynoggy I also use `base_url`; I would appreciate it if you could create a PR and get it merged.

The `base_url` functionality was added by #24719. Maybe I'll add the other parameters later with a PR.

Can you please add the `seed` parameter? It's also kind of an essential one.
## Description
Adds `seed` parameter to ChatOllama

## Resolves Issues
- #24703

## Dependency Changes
None

Co-authored-by: Lennart J. Kurzweg (Nx2) <git@nx2.site>
@chriss1245 you can close this issue. Both have been implemented. The latest release does not include the changes yet, so in the meantime:

```shell
wget https://raw.githubusercontent.com/langchain-ai/langchain/master/libs/partners/ollama/langchain_ollama/chat_models.py \
  -O .venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py
```

Make sure the path works for you (venv name, Python version).
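If you'd rather not hard-code the venv path in the wget command, a small stdlib-only snippet can locate the installed file for you. This is just a convenience sketch: `langchain_ollama` must be importable in the environment you run it in, otherwise it reports that it's missing.

```python
# Print the path of the installed langchain_ollama/chat_models.py, so the
# wget workaround above can target the right file without guessing the
# venv name or Python version.
import importlib.util

try:
    spec = importlib.util.find_spec("langchain_ollama.chat_models")
except ModuleNotFoundError:
    # Parent package langchain_ollama is not installed at all.
    spec = None

if spec is not None and spec.origin:
    print(spec.origin)  # overwrite this file with the patched chat_models.py
else:
    print("langchain_ollama is not installed in this environment")
```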
Sure, thanks a lot! |
Thanks for doing this, I've been so confused! |
Merged via langchain-ai#24782 (same change: adds `seed` parameter to ChatOllama, resolves langchain-ai#24703).
Checked other resources
Example Code
When running:
The parameters `base_url` and `seed` are ignored. Reviewing the code of this class, I see that its definition is missing these attributes.
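The issue's original example code block did not survive extraction. For context, a sketch of the equivalent raw Ollama chat request shows where the two parameters act: `base_url` selects which server to talk to, and `seed` goes into the request's `options` payload. The endpoint path (`/api/chat`) and the `options.seed` field are from Ollama's REST API; the host name and model are hypothetical.

```python
# Sketch: how base_url and seed map onto a raw Ollama chat request.
import json

base_url = "http://my-ollama-host:11434"  # hypothetical self-hosted server
payload = {
    "model": "llama3",  # hypothetical model name
    "messages": [{"role": "user", "content": "Hello"}],
    "options": {"seed": 42},  # fixed seed for reproducible sampling
}

endpoint = f"{base_url}/api/chat"
body = json.dumps(payload)
print(endpoint)
print(body)
```

A ChatOllama that drops these values would always hit the default local server and sample without a fixed seed, which is the behavior reported above.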
Error Message and Stack Trace (if applicable)
No response
Description
Regarding `seed`: in PR 249 in ollama, this feature was added to allow reproducibility of experiments.
Regarding `base_url`: since ollama allows us to host LLMs on our own servers, we need to be able to specify the URL of the server.
Moreover, OllamaFunctions from the langchain_experimental package does support this.
System Info
langchain==0.2.11
langchain-chroma==0.1.2
langchain-community==0.2.10
langchain-core==0.2.23
langchain-experimental==0.0.63
langchain-groq==0.1.6
langchain-ollama==0.1.0
langchain-text-splitters==0.2.2