Add LiteLLM Support #2

Open · MiscellaneousStuff opened this issue Apr 11, 2024 · 2 comments

MiscellaneousStuff (Owner) commented Apr 11, 2024

Add support for LiteLLM within SWE-agent so people can use any open-source model they want for the agent, rather than being limited to the OpenAI and Anthropic APIs.

Edit: Switched from Ollama to LiteLLM, as LiteLLM is a more abstract layer than Ollama and supports a wider range of closed- and open-source LLMs, giving the community more freedom over what they use. A sketch of what that unified interface looks like is below.
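For anyone unfamiliar with LiteLLM, the appeal is that one call signature covers many providers. A minimal sketch of that unified interface, assuming LiteLLM's `completion()` API and its `provider/model` naming convention; the model names here (`gpt-4`, `ollama/mistral`) are just placeholders:

```python
# Minimal sketch: the same completion() call covers a hosted
# closed-source model and a local open-source one. Model names
# are placeholders, not a recommendation.
from litellm import completion

messages = [{"role": "user", "content": "Summarise this repository's purpose."}]

# Hosted, closed-source model via the OpenAI API.
openai_response = completion(model="gpt-4", messages=messages)

# Local, open-source model served by Ollama, through the same call.
ollama_response = completion(model="ollama/mistral", messages=messages)

# Responses follow the OpenAI-style shape.
print(ollama_response.choices[0].message.content)
```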

MiscellaneousStuff added the enhancement (New feature or request) label Apr 11, 2024
MiscellaneousStuff self-assigned this Apr 11, 2024
MiscellaneousStuff changed the title from "Add Ollama Support" to "Add LiteLLM Support" Apr 18, 2024
BradKML commented Apr 23, 2024

Seconding this, since this project might come in handy in the near future. There are also ideas about LiteLLM being able to call multiple models from Ollama that are worth anticipating. (AutoGen and other tools have similar support as well.)

MiscellaneousStuff (Owner, Author) commented

Turns out that inside SWE-agent you can already call Ollama models using `ollama:<OLLAMA_MODEL_NAME>`.
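For anyone landing here, that suggests an invocation along the lines of `python run.py --model_name ollama:mistral` (the `run.py` entry point and `--model_name` flag are assumptions based on SWE-agent's documented CLI; `mistral` is a placeholder for whatever model you have pulled locally with Ollama).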

Labels: enhancement (New feature or request)
Projects: Status: Backlog
Development: No branches or pull requests
2 participants