Why is the Ollama provider base URL fixed to "localhost:11434"? #902
Comments
Thanks @ujongnoh for opening the issue. Looks like the base URL is hard-coded right now; maybe there could be a configurable option, similar to what the OpenAI provider does here: jupyter-ai/packages/jupyter-ai-magics/jupyter_ai_magics/partner_providers/openai.py, lines 55 to 57 at 12d069e
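For context, here is a minimal sketch of what such a configurable provider could look like, following the custom-provider pattern from the jupyter-ai-magics documentation. The class name, provider id, and model list below are illustrative assumptions, not the actual shipped provider:

```python
# A hedged sketch, not the shipped jupyter-ai code: a provider that leaves
# the base URL as an overridable field instead of a hard-coded constant.
from jupyter_ai_magics import BaseProvider
from langchain_community.llms import Ollama


class OllamaProvider(BaseProvider, Ollama):
    id = "ollama"  # illustrative provider id
    name = "Ollama"
    models = ["llama2", "mistral"]  # example model list
    model_id_key = "model"
    # langchain_community's Ollama already declares
    #   base_url: str = "http://localhost:11434"
    # so not pinning it here lets callers pass base_url=... at runtime.
```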
Wow!! Exactly what I wanted. Thanks!!
Hello, I apologize for reopening this issue. I think the cause of this problem is that the base URL of the Ollama object in the langchain_community library is fixed to "localhost:11434", and we don't have permission to modify that base URL in the code. Isn't that right?
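For what it's worth, that value in langchain_community appears to be only a default: Ollama exposes base_url as an ordinary constructor argument, so it can be overridden per instance without touching library code. A quick check (the model name and URL are just example values):

```python
from langchain_community.llms import Ollama

# base_url defaults to "http://localhost:11434" but is a regular field,
# so it can be overridden at construction time without modifying the library.
llm = Ollama(model="llama2", base_url="http://my-ollama-host:11434")
print(llm.base_url)  # -> http://my-ollama-host:11434
```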
@ujongnoh
Hello! Thank you to all the developers who work on the Jupyter AI service.
Our team wants to connect a JupyterLab pod, deployed by KubeSpawner in Kubernetes, to an Ollama pod in the same namespace,
but the Ollama provider base URL is fixed to "localhost:11434" in the source code.
Is there a way to make the base URL configurable without modifying the code?
If not, do you have any plans to support this request?
Thank you!!!
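For the Kubernetes setup described above, here is a sketch of what the wiring could look like once the base URL is configurable. It assumes an Ollama Service named "ollama" listening on port 11434 in the same namespace, reachable via in-cluster DNS; the environment variable name is an illustrative choice:

```python
import os

from langchain_community.llms import Ollama

# Assumption: a Service named "ollama" in the same namespace exposes port
# 11434; in-cluster DNS resolves the short name "ollama" to that Service.
base_url = os.environ.get("OLLAMA_BASE_URL", "http://ollama:11434")

llm = Ollama(model="llama2", base_url=base_url)
```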