
feat: CA truststore support #1398

Open

GillesBodart opened this issue Apr 2, 2024 · 2 comments
Labels
help wanted Extra attention is needed

Comments

@GillesBodart

Bug Report

Description

When you run the Docker image in an enterprise context, company policy may require SSL interception in order to analyse the traffic.

Bug Summary:
It is impossible to add a CA to the internal truststore used to make the REST API requests.

Steps to Reproduce:
Have SSL interception enabled on your machine, which breaks the SSL certificate chain.

Expected Behavior:
It should be possible to add a CA to the truststore that is used for outgoing requests.

Actual Behavior:

A CA can't be added, so certificate verification fails for endpoints behind the intercepting proxy.
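
As a sketch of what such truststore support could look like (an assumption on my side, not the project's actual implementation): the error format in the container logs below suggests the outgoing OpenAI calls go through aiohttp, so an extra CA bundle could be appended to the default SSL context before the request is made. The `EXTRA_CA_BUNDLE` variable name and the helper functions are hypothetical.

```python
import os
import ssl

import aiohttp


def build_ssl_context() -> ssl.SSLContext:
    """Default (system) trust store plus an optional corporate CA bundle."""
    ctx = ssl.create_default_context()  # loads the system CAs
    extra_bundle = os.environ.get("EXTRA_CA_BUNDLE")  # hypothetical variable name
    if extra_bundle and os.path.isfile(extra_bundle):
        ctx.load_verify_locations(cafile=extra_bundle)  # append the intercepting proxy's CA
    return ctx


async def fetch_openai_models(api_key: str) -> dict:
    """Example outbound call that verifies TLS against the extended trust store."""
    connector = aiohttp.TCPConnector(ssl=build_ssl_context())
    async with aiohttp.ClientSession(connector=connector) as session:
        async with session.get(
            "https://api.openai.com/v1/models",
            headers={"Authorization": f"Bearer {api_key}"},
        ) as resp:
            return await resp.json()
```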

Environment

  • **Operating System:** Windows 11, running the Docker image ghcr.io/open-webui/open-webui:main
  • **Browser:** Chrome

Reproduction Details

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [x] I have included the browser console logs.
  • [x] I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:
{
"detail": "Something went wrong :/\nHTTPSConnectionPool(host='api.openai.com', port=443): Max retries exceeded with url: /v1/images/generations (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1006)')))"
}

Docker Container Logs:
INFO: 172.17.0.1:50270 - "GET /ollama/api/tags HTTP/1.1" 200 OK

INFO:apps.openai.main:get_all_models()

ERROR:apps.openai.main:Connection error: Cannot connect to host api.openai.com:443 ssl:True [SSLCertVerificationError: (1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1006)')]

INFO:apps.openai.main:models: {'data': []}

INFO:apps.openai.main:get_all_models()

ERROR:apps.openai.main:Connection error: Cannot connect to host api.openai.com:443 ssl:True [SSLCertVerificationError: (1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1006)')]

INFO:apps.openai.main:models: {'data': []}

INFO: 172.17.0.1:50270 - "GET /openai/api/models HTTP/1.1" 200 OK

INFO: 172.17.0.1:50270 - "GET /litellm/api/v1/models HTTP/1.1" 200 OK

INFO: 172.17.0.1:50294 - "GET /_app/immutable/nodes/8.0396dff0.js HTTP/1.1" 200 OK

INFO: 172.17.0.1:50298 - "GET /ollama/api/version HTTP/1.1" 200 OK

INFO: 172.17.0.1:50304 - "GET /ollama/api/version HTTP/1.1" 200 OK

INFO: 172.17.0.1:50304 - "GET /ollama/urls HTTP/1.1" 200 OK

INFO: 172.17.0.1:50304 - "GET /ollama/api/version HTTP/1.1" 200 OK

INFO: 172.17.0.1:50304 - "GET /litellm/api/model/info HTTP/1.1" 200 OK

INFO: 172.17.0.1:50316 - "GET /api/config HTTP/1.1" 200 OK

INFO: 172.17.0.1:50316 - "GET /api/v1/auths/ HTTP/1.1" 200 OK

INFO:apps.ollama.main:get_all_models()

INFO: 172.17.0.1:50316 - "GET /ollama/api/tags HTTP/1.1" 200 OK

INFO:apps.openai.main:get_all_models()

ERROR:apps.openai.main:Connection error: Cannot connect to host api.openai.com:443 ssl:True [SSLCertVerificationError: (1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1006)')]

INFO:apps.openai.main:models: {'data': []}

INFO:apps.openai.main:get_all_models()

ERROR:apps.openai.main:Connection error: Cannot connect to host api.openai.com:443 ssl:True [SSLCertVerificationError: (1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self-signed certificate in certificate chain (_ssl.c:1006)')]

INFO:apps.openai.main:models: {'data': []}

INFO: 172.17.0.1:50316 - "GET /openai/api/models HTTP/1.1" 200 OK

INFO: 172.17.0.1:50316 - "GET /litellm/api/v1/models HTTP/1.1" 200 OK

INFO: 172.17.0.1:50316 - "GET /api/v1/modelfiles/ HTTP/1.1" 200 OK

INFO: 172.17.0.1:50316 - "GET /api/v1/prompts/ HTTP/1.1" 200 OK

INFO: 172.17.0.1:50316 - "GET /api/v1/documents/ HTTP/1.1" 200 OK

INFO: 172.17.0.1:50316 - "GET /api/v1/chats/tags/all HTTP/1.1" 200 OK

INFO:apps.ollama.main:get_all_models()

Screenshots (if applicable):


Installation Method

Docker vanilla install with Open API key

Additional Information

[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!

@tjbck changed the title from "SSL interception cause a Self signed certificate issue" to "feat: CA truststore support" on Apr 2, 2024
@strikeoncmputrz

This would be an excellent feature. I'd rather not have to run my OpenAI-compatible inference server in HTTP mode, but I'm using a private public key infrastructure (PKI).
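
One possible interim workaround (an assumption about the setup, not a documented feature) is to rely on the environment variables that Python's TLS stack already honours: `SSL_CERT_FILE` is picked up when `ssl.create_default_context()` loads the default verify paths, and `REQUESTS_CA_BUNDLE` is honoured by the `requests` library. If the container is started with those variables set and the corporate CA bundle mounted at the given path, verification may succeed without code changes, provided the backend does not override the default verify paths. A quick check, sketched in Python:

```python
import os
import ssl

# Illustrative path; the PEM bundle would be bind-mounted into the container.
os.environ["SSL_CERT_FILE"] = "/certs/corporate-ca.pem"

# SSL_CERT_FILE is read by OpenSSL when the default verify paths are loaded,
# which ssl.create_default_context() does under the hood.
ctx = ssl.create_default_context()

print(ssl.get_default_verify_paths())  # shows the cafile/capath the interpreter resolves
print(ctx.cert_store_stats())          # counts of certificates actually loaded into the store
```

Whether this helps for the aiohttp-based OpenAI calls depends on whether they use the default context; if not, a first-class option along the lines of the sketch earlier in the issue would still be needed.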

@tjbck
Contributor

tjbck commented Apr 14, 2024

Feel free to make a PR!
