[Feature]: Poll ollama for new endpoints #979
Comments
@krrishdholakia, I am confused about how ollama is expected to work.
A Python solution for building a proxy config from the list of Ollama models may serve as one piece of what's needed for this issue:

```python
import requests
import yaml
import copy

# Fetch the list of models
response = requests.get('http://ollama:11434/api/tags')
models = [model['name'] for model in response.json()['models']]

# Define the template
template = {
    "model_name": "MODEL",
    "litellm_params": {
        "model": "MODEL",
        "api_base": "http://ollama:11434",
        "stream": False
    }
}

# Build the model_list
model_list = []
for model in models:
    new_item = copy.deepcopy(template)
    new_item['model_name'] = model
    new_item['litellm_params']['model'] = f"ollama/{model}"
    model_list.append(new_item)

litellm_config = {
    "model_list": model_list
}

# Print the result
print(yaml.dump(litellm_config))
```
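For reference, if ollama reported two models named `llama2:latest` and `mistral:latest` (hypothetical names for illustration), the script above would print a config along these lines (PyYAML sorts keys alphabetically by default):

```yaml
model_list:
- litellm_params:
    api_base: http://ollama:11434
    model: ollama/llama2:latest
    stream: false
  model_name: llama2:latest
- litellm_params:
    api_base: http://ollama:11434
    model: ollama/mistral:latest
    stream: false
  model_name: mistral:latest
```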
So we have background health check functionality already - https://docs.litellm.ai/docs/proxy/health#background-health-checks. Perhaps for ollama, we could point it to call `/api/tags`.
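As a rough sketch of how that might look in the proxy config, assuming the `background_health_checks` and `health_check_interval` settings described in the linked docs:

```yaml
general_settings:
  background_health_checks: true   # run health checks in a background task
  health_check_interval: 300       # seconds between checks
```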
The Feature
ollama exposes a `/api/tags` endpoint that lists the locally available models. If ollama models are passed in, poll this endpoint to check whether new models are available. (A sketch of such a poller follows.)
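A minimal sketch of what such a poller could look like. The endpoint and response shape match the snippet above; the `on_new_models` callback, the `poll_for_new_models` helper, and the polling interval are hypothetical choices for illustration, not litellm APIs:

```python
import time
import requests

OLLAMA_BASE = "http://ollama:11434"  # assumed ollama address, as in the snippet above
POLL_INTERVAL = 60                   # hypothetical: seconds between polls

def list_ollama_models() -> set[str]:
    """Return the set of model names ollama currently reports via /api/tags."""
    response = requests.get(f"{OLLAMA_BASE}/api/tags")
    response.raise_for_status()
    return {model["name"] for model in response.json()["models"]}

def poll_for_new_models(on_new_models) -> None:
    """Poll /api/tags and invoke on_new_models with any names not seen before."""
    known = list_ollama_models()
    while True:
        time.sleep(POLL_INTERVAL)
        current = list_ollama_models()
        new = current - known
        if new:
            on_new_models(new)  # e.g. append entries to the proxy's model_list
            known = current

# Hypothetical usage: just print newly discovered models
if __name__ == "__main__":
    poll_for_new_models(lambda names: print("new ollama models:", sorted(names)))
```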
Motivation, pitch
Improve users' lives.
https://github.com/Luxadevi/Ollama-Companion