
Unable to connect to proxied model from python api with self-signed SSL certificate #555


Description

@s-andrews

I'm running an ollama model on a Linux system and proxying it through an Apache server to a subdirectory, so the model running on

http://localhost:12345

...is proxied to https://webserver/ABCDEF/

This all works, and I can see the correct responses through the proxy; e.g. https://webserver/ABCDEF/api/tags returns a valid JSON document.
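For reference, the same check can be scripted from Python with httpx, the library the ollama package wraps. A minimal sketch using the placeholder URL from above:

import httpx

# Same check as the browser: GET the tags endpoint through the proxy.
# httpx verifies TLS certificates by default, unlike a browser where
# I may have accepted the self-signed certificate manually.
response = httpx.get("https://webserver/ABCDEF/api/tags")
print(response.status_code)
print(response.json())

If the proxy's certificate isn't trusted by Python's TLS stack, this should raise httpx.ConnectError rather than print the JSON.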

I can also connect with the command-line client and get results from it:

C:\Users\me>set OLLAMA_HOST=https://webserver/ollama/PPKDEZXGHKWEKNOZCWSS/

C:\Users\me>ollama list
NAME           ID              SIZE     MODIFIED
gpt-oss:20b    f2b8351c629c    13 GB    36 minutes ago

C:\Users\me>ollama run gpt-oss:20b
>>> say hi
Thinking...
User just says "say hi". Should respond politely.
...done thinking.

Hello! 👋 How can I help you today?

I can also use the same URL in Page Assist, and that works too.

When I try the same thing with the Python API, though, I can't get it to work:

#!python

import ollama

client = ollama.Client(host="https://webserver/ollama/PPKDEZXGHKWEKNOZCWSS/")

response = client.chat(
    model="gpt-oss:20b", 
    messages=[
        {
            "role":"user",
            "content":"Say hi"
        }
    ]
)

print(response.message.content)

...fails with...

Traceback (most recent call last):
  File "C:\Users\andrewss\Desktop\Ollama\ollama_test.py", line 7, in <module>
    response = client.chat(
        model="gpt-oss:20b",
    ...<5 lines>...
        ]
    )
  File "C:\Users\andrewss\Desktop\Ollama\venv\Lib\site-packages\ollama\_client.py", line 342, in chat
    return self._request(
           ~~~~~~~~~~~~~^
      ChatResponse,
      ^^^^^^^^^^^^^
    ...<12 lines>...
      stream=stream,
      ^^^^^^^^^^^^^^
    )
    ^
  File "C:\Users\andrewss\Desktop\Ollama\venv\Lib\site-packages\ollama\_client.py", line 180, in _request
    return cls(**self._request_raw(*args, **kwargs).json())
                 ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "C:\Users\andrewss\Desktop\Ollama\venv\Lib\site-packages\ollama\_client.py", line 126, in _request_raw
    raise ConnectionError(CONNECTION_ERROR_MESSAGE) from None
ConnectionError: Failed to connect to Ollama. Please check that Ollama is downloaded, running and accessible. https://ollama.com/download

The same script works on the Linux host machine if I point it at the http://localhost:12345 address, so it's something about going through the HTTPS proxy, presumably the self-signed certificate, that is causing the failure.
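Given the self-signed certificate in the title, one way to isolate that is to repeat the request with certificate verification switched off (for testing only) and see whether the error goes away. A sketch:

import httpx

url = "https://webserver/ollama/PPKDEZXGHKWEKNOZCWSS/api/tags"

# With default verification, a self-signed certificate should make this fail...
try:
    print("verified:", httpx.get(url).status_code)
except httpx.ConnectError as e:
    print("verified request failed:", e)

# ...while with verification disabled (testing only!) the same request
# should succeed if the certificate is the only problem.
print("unverified:", httpx.get(url, verify=False).status_code)

If the unverified request returns 200 while the verified one fails, the certificate rather than the URL rewriting is to blame.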

Any suggestions for how to fix or debug this some more would be appreciated. Thanks!
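One thing I plan to try, assuming ollama.Client forwards extra keyword arguments to the underlying httpx.Client (the traceback suggests it is a thin wrapper around it), is pointing certificate verification at the Apache server's certificate. The .pem path below is a placeholder:

import ollama

client = ollama.Client(
    host="https://webserver/ollama/PPKDEZXGHKWEKNOZCWSS/",
    # Assumed pass-through to httpx.Client: trust the proxy's self-signed
    # certificate explicitly (placeholder path); verify=False would also
    # work as a quick test, but shouldn't be left in place.
    verify="/path/to/webserver-cert.pem",
)

response = client.chat(
    model="gpt-oss:20b",
    messages=[{"role": "user", "content": "Say hi"}],
)
print(response.message.content)

If that works, it would confirm the certificate is the problem rather than anything about the proxied path itself.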
