
fix: add missing seed parameter to ollama input #3923 (#3924)

Merged 1 commit into BerriAI:main on May 30, 2024

Conversation

devdev999 (Contributor) commented on May 30, 2024

The current Ollama integration does not support a seed parameter, even though Ollama accepts one: see https://github.com/ollama/ollama/blob/main/docs/api.md#parameters and https://github.com/ollama/ollama/blob/main/docs/modelfile.md#valid-parameters-and-values

This PR resolves that by adding handling of the seed parameter. Linked to #3923
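For context, Ollama accepts seed inside the options object of a generate request (per the API docs linked above). A minimal request payload might look like the following sketch; the model name and prompt are illustrative:

```python
import json

# Minimal Ollama /api/generate payload with a fixed seed.
# A fixed seed (with temperature 0) makes sampling reproducible.
payload = {
    "model": "llama2",
    "prompt": "Why is the sky blue?",
    "options": {
        "seed": 101,        # the parameter this PR plumbs through
        "temperature": 0,
    },
}
print(json.dumps(payload))
```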

Title

Add missing seed parameter to Ollama input

Relevant issues

Fixes #3923

Type

🐛 Bug Fix

Changes

  • Added seed handling to optional_params in ollama.py and ollama_chat.py
  • Moved the Ollama supported-parameter list from utils.py into ollama.py as get_supported_openai_params()
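A rough sketch of the two changes above, with illustrative names only (this is not the actual litellm source):

```python
# Hypothetical sketch of the change described in this PR.
# Function and parameter names are illustrative, not litellm's real code.

def get_supported_openai_params():
    """OpenAI-style params the Ollama integration maps (moved from utils.py)."""
    return [
        "max_tokens", "stream", "top_p", "temperature",
        "seed",  # newly supported: forwarded to Ollama's options
        "frequency_penalty", "stop", "response_format",
    ]

def map_openai_params(non_default_params, optional_params):
    """Copy recognized OpenAI-style params into the Ollama options dict."""
    for param, value in non_default_params.items():
        if param == "seed":
            optional_params["seed"] = value          # the new branch
        elif param == "temperature":
            optional_params["temperature"] = value
        elif param == "max_tokens":
            optional_params["num_predict"] = value   # Ollama's name for it
    return optional_params
```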

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes

pytest test_ollama.py test_completion.py::test_ollama_image -x -vv

[screenshot: tests passing locally]


vercel bot commented on May 30, 2024

litellm deployment: ✅ Ready (preview updated May 30, 2024 5:59pm UTC)

krrishdholakia (Contributor) commented
Nice. Can you add a screenshot of it passing the Ollama tests?

just do

cd litellm/litellm

pytest test_ollama.py test_completion.py::test_ollama_image -x -vv

devdev999 (Contributor, Author) commented
@krrishdholakia Ran the tests and all passed; I have updated the screenshot in the first post.

krrishdholakia merged commit 1bc4540 into BerriAI:main on May 30, 2024 (3 checks passed)
Successfully merging this pull request may close these issues.

[Bug]: Ollama does not handle seed as parameter