
feat: add support ollama api #250

Merged · 14 commits · Jan 19, 2024

Conversation

Member

@davidberenstein1957 davidberenstein1957 commented Jan 15, 2024

In this PR I've added basic support for Ollama.

Some design choices:

  • use the builtin urllib library; we might replace this with requests later to also allow passing a client
  • retry 5 times with n+1 backoff on 500 (server error) responses, as a fixed config, to get some robustness without spending a lot of time on it
  • default to http://localhost:11434, but overwrite it with the OLLAMA_HOST env var, as in the Ollama docs
  • generation kwargs don't align with other standards (maybe we can forcefully set them via the deployment config when running ollama run model?)
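The first three choices above can be sketched roughly as follows. This is not the PR's actual code, just a minimal stdlib-only illustration; the names `post_chat` and `backoff_seconds` are hypothetical:

```python
# Sketch of the design choices: urllib from the stdlib, OLLAMA_HOST env
# var with the documented default, and 5 retries with n+1 backoff on 500s.
import json
import os
import time
import urllib.error
import urllib.request

OLLAMA_HOST = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
MAX_RETRIES = 5


def backoff_seconds(attempt: int) -> int:
    # n+1 backoff: sleep 1s after the first failure, 2s after the second, ...
    return attempt + 1


def post_chat(payload: dict) -> dict:
    request = urllib.request.Request(
        f"{OLLAMA_HOST}/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    for attempt in range(MAX_RETRIES):
        try:
            with urllib.request.urlopen(request) as response:
                return json.loads(response.read())
        except urllib.error.HTTPError as error:
            # Only retry on server errors; 4xx client errors are re-raised,
            # as is the last failed attempt.
            if error.code != 500 or attempt == MAX_RETRIES - 1:
                raise
            time.sleep(backoff_seconds(attempt))
```

Swapping urllib for requests later would mainly change `post_chat`; the retry and host-resolution logic could stay the same.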

Things to note:

  • we use the api/chat endpoint
  • we can't get n_generations from a single request, so we make one request per generation
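The second note above can be sketched like this. The `send_chat` callable and `generate_n` helper are hypothetical stand-ins, not the PR's API; the response shape assumes the non-streaming `/api/chat` format, which returns a single assistant message per call:

```python
from typing import Callable, List

def generate_n(
    send_chat: Callable[[dict], dict],  # e.g. a POST to /api/chat
    payload: dict,
    num_generations: int,
) -> List[str]:
    # /api/chat yields one completion per call, so issue one request per
    # desired generation and collect the assistant message contents.
    return [
        send_chat(payload)["message"]["content"]
        for _ in range(num_generations)
    ]
```

The cost is n sequential round-trips per example, which is the trade-off the PR accepts in exchange for keeping the client simple.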

@davidberenstein1957 davidberenstein1957 linked an issue Jan 15, 2024 that may be closed by this pull request
Contributor

@plaguss plaguss left a comment


Nice addition!! For when you tackle the docs, it would be nice to include an example of how to set up Ollama so it is as contained as possible.

src/distilabel/llm/ollama.py (review thread, outdated, resolved)
Member Author

davidberenstein1957 commented Jan 16, 2024

The difficulty w.r.t. the __init__ args is covered further in #246.

Contributor

@plaguss plaguss left a comment


Nice! Could you add some tests like the ones here? After that we should have a new cool integration 😃

@davidberenstein1957 davidberenstein1957 merged commit 291c23a into main Jan 19, 2024
4 checks passed
@davidberenstein1957 davidberenstein1957 deleted the feat/245-feature-add-ollama-llm-integration branch January 19, 2024 15:06

Successfully merging this pull request may close these issues.

[FEATURE] add ollama LLM integration