
Support for Alternative LLM Providers (Beyond OpenAI) #1

@pravintargaryen

Description


I wanted to ask if there's any plan to support alternative LLM providers, such as:

  • Local models via vLLM, llama.cpp, or Ollama
  • Other cloud providers like Mistral, Together AI, or Cohere

This would provide more flexibility for users who prefer self-hosted or non-OpenAI options. Would you be open to adding support for this? I'd be happy to help test or contribute if needed.
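One practical angle worth noting: several of the providers listed above (Ollama, vLLM, Together AI) already expose OpenAI-compatible `/v1/chat/completions` endpoints, so much of the flexibility could come from a configurable base URL rather than per-provider adapters. As a hedged sketch (the helper name, the `llama3` model tag, and the local Ollama port `11434` are illustrative assumptions, not part of this project):

```python
import json

def build_chat_request(base_url: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Build the endpoint URL and JSON body for an OpenAI-style chat call.

    Hypothetical helper: swapping base_url between api.openai.com and a
    self-hosted OpenAI-compatible server is the only change needed.
    """
    url = f"{base_url.rstrip('/')}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, body

# Pointing at a local Ollama server (default port 11434) instead of OpenAI:
url, body = build_chat_request("http://localhost:11434/v1", "llama3", "Hello!")
print(url)  # http://localhost:11434/v1/chat/completions
```

The same request shape would work against vLLM's OpenAI-compatible server or Together AI by changing only `base_url` and the API key.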

Looking forward to your thoughts!
