
Why is the Ollama provider base URL fixed to "localhost:11434"? #902

Closed
ujongnoh opened this issue Jul 18, 2024 · 5 comments · Fixed by #904

Labels
enhancement New feature or request

Comments

@ujongnoh

Hello! Thank you to all the developers who work on the Jupyter AI service.
Our team wants to connect a JupyterLab pod, deployed by KubeSpawner on Kubernetes, to an Ollama pod in the same namespace, but the Ollama provider's base URL is fixed to "localhost:11434" in the source code.
Is there a way to make the base URL configurable without modifying the code? If there isn't one today, do you have plans to support this?

Thank you!!!
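For context, a pod in the same namespace is normally reached through a Kubernetes Service DNS name rather than localhost. A minimal sketch of the kind of target URL involved, assuming a Service named "ollama" in a namespace named "jupyter" (both names are placeholders):

# Inside the cluster, Kubernetes DNS resolves
# <service>.<namespace>.svc.cluster.local, so the provider would need to
# accept a base URL like this instead of the hard-coded localhost default:
OLLAMA_BASE_URL = "http://ollama.jupyter.svc.cluster.local:11434"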

ujongnoh added the enhancement (New feature or request) label on Jul 18, 2024
@jtpio (Member) commented Jul 18, 2024

Thanks @ujongnoh for opening the issue.

Looks like the base_url is set in the langchain package here: https://github.com/langchain-ai/langchain/blob/0dec72cab08cad712a6916368dc37c70faefb7a0/libs/community/langchain_community/llms/ollama.py#L32

Maybe there could be a TextField for the Ollama provider to allow providing another URL, similar to the OpenAI provider?

TextField(
    key="openai_api_base", label="Base API URL (optional)", format="text"
),
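A hypothetical analogue for the Ollama provider might look like the following; the key name "base_url" is an assumption here, chosen to mirror the langchain parameter, not necessarily the field that was eventually shipped:

TextField(
    key="base_url", label="Base URL (optional)", format="text"
),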

@ujongnoh (Author)

Wow, exactly what I wanted. Thanks!!

@ujongnoh (Author) commented Jul 30, 2024

Hello, I apologize for reopening this issue.
I'm trying an API test with jupyter_ai version 2.19.1, but this error occurred:

File "/usr/local/lib/python3.11/site-packages/aiohttp/connector.py", line 1257, in _create_direct_connection
    raise last_exc
File "/usr/local/lib/python3.11/site-packages/aiohttp/connector.py", line 1226, in _create_direct_connection
    transp, proto = await self._wrap_create_connection(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/aiohttp/connector.py", line 1033, in _wrap_create_connection
    raise client_error(req.connection_key, exc) from exc
aiohttp.client_exceptions.ClientConnectorError: Cannot connect to host localhost:11434 ssl:default [Connect call failed ('127.0.0.1', 11434)]

I think the cause of this problem is that the base_url of the Ollama object in the langchain_community library is fixed to "localhost:11434".
If the source code were changed, wouldn't it look like this?

class _OllamaCommon(BaseLanguageModel):
    # before
    base_url: str = "http://localhost:11434"
    # proposed
    base_url: Optional[str] = None
    """Base url the model is hosted under."""

But we don't have permission to modify this base URL in the library code, do we?
Thank you!!
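For what it's worth, the default in langchain_community is only a class-level default: base_url is a regular pydantic field, so it can be overridden per instance without editing the library. A minimal sketch, assuming a placeholder in-cluster host named "ollama":

from langchain_community.llms import Ollama

# Passing base_url at construction time replaces the hard-coded
# localhost default; "http://ollama:11434" is a placeholder address.
llm = Ollama(model="llama3", base_url="http://ollama:11434")

The catch, and what this issue is about, is that jupyter_ai constructs this object internally, so the override has to be exposed through the provider's settings.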

@taehee commented Jul 31, 2024

@ujongnoh
I also had the same issue and solved it by installing
langchain-ollama==0.1.0
following the link below:
langchain-ai/langchain#24703
Good luck!!
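For anyone landing here later, the newer langchain-ollama package exposes the same knob directly. A minimal sketch, assuming a placeholder in-cluster address for the Ollama pod:

from langchain_ollama import ChatOllama

# langchain-ollama accepts base_url as a constructor argument, so no
# localhost default needs to be patched; the address is a placeholder.
llm = ChatOllama(model="llama3", base_url="http://ollama:11434")
print(llm.invoke("Hello!"))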

@sqlreport

@dlqqq we are still seeing the same issue; see #1004
