Remote Ollama server: Model Selection Issue #2359
SamuelDevdas started this conversation in General (Replies: 3 comments · 5 replies)
- I have the same problem and don't know how to solve it.
- I may have found the reason.
-
Bug Report
Description
Bug Summary:
[When Ollama is installed on a remote server via Docker, how can models be selected in Open WebUI?]
Steps to Reproduce:
[Outline the steps to reproduce the bug. Be as detailed as possible.]
Expected Behavior:
[Available models should be listed, or an option for pointing at the remote Ollama server should be available when setting up the Docker container]
Actual Behavior:
[Models are not available to select, so a chat can't be started]
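An empty model list usually means Open WebUI cannot reach the remote Ollama API. Below is a minimal sketch of a setup that should work, not a fix confirmed by this report: `REMOTE_HOST` is a placeholder for the server's address, and it assumes the standard Open WebUI `OLLAMA_BASE_URL` environment variable and Ollama's default port 11434.

```shell
# On the remote server: run Ollama listening on all interfaces
# (by default it only accepts connections from localhost).
docker run -d --name ollama \
  -p 11434:11434 \
  -e OLLAMA_HOST=0.0.0.0 \
  -v ollama:/root/.ollama \
  ollama/ollama

# From the machine that will run Open WebUI: verify the server is
# reachable and reports installed models (an empty "models" array
# means nothing has been pulled yet).
curl http://REMOTE_HOST:11434/api/tags

# Run Open WebUI with OLLAMA_BASE_URL pointing at the remote server;
# the model dropdown is populated from that endpoint.
docker run -d --name open-webui \
  -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://REMOTE_HOST:11434 \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```

If the `curl` step fails, the problem is the network, a firewall, or Ollama's bind address rather than Open WebUI itself.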
Environment
Open WebUI Version: [e.g., 0.1.120]
Ollama (if applicable): [e.g., 0.1.30, 0.1.32-rc1]
Operating System: [Raspbian]
Browser (if applicable): [e.g., Chrome 100.0]
Reproduction Details
Confirmation:
Logs and Screenshots
Browser Console Logs:
[Include relevant browser console logs, if applicable]
Docker Container Logs:
[Include relevant Docker container logs, if applicable]
Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]
Installation Method
[Describe the method you used to install the project, e.g., manual installation, Docker, package manager, etc.]
Additional Information
[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!