Support multiple LLMs through Litellm #222

@dhirenmathur

Description

Replace LangChain Chat Clients with LiteLLM

Objective

Replace all LangChain chat client implementations with LiteLLM while maintaining existing functionality.
Add APIs to manage the user's LLM choice and keys, and ensure the right LLM is selected during processing and agent execution.

Requirements

  • Keep existing provider service pattern intact
  • Support all current LLMs through LiteLLM
  • Maintain inference and agent interaction functionality
  • Remove the CrewAI-specific client handling (this was added because CrewAI could not support Anthropic through the LangChain client due to a conflict with its internal LiteLLM)
  • Preserve streaming functionality
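For the streaming requirement, LiteLLM's `completion(..., stream=True)` yields OpenAI-style chunks, so preserving streaming mostly means forwarding text deltas as they arrive. A minimal sketch, with chunks modeled as plain dicts for illustration:

```python
def iter_deltas(stream):
    """Yield non-empty text deltas from an OpenAI/LiteLLM-style
    streaming response (chunks modeled as plain dicts here)."""
    for chunk in stream:
        delta = chunk["choices"][0]["delta"].get("content")
        if delta:
            yield delta

# Fake chunks standing in for a live stream:
fake_stream = [
    {"choices": [{"delta": {"content": "Hel"}}]},
    {"choices": [{"delta": {"content": "lo"}}]},
    {"choices": [{"delta": {}}]},  # final chunk carries no content
]
reply = "".join(iter_deltas(fake_stream))  # "Hello"
```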

Implementation

  1. Replace LangChain imports with LiteLLM
  2. Update provider services
  3. Remove CrewAI-client specific code
  4. Update configurations and environment variables

Testing

  • Verify provider service functionality
  • Test all LLM integrations
  • Validate inference and agent operations
  • Check streaming implementations
  • Run existing test suite

Success Criteria

  • All LangChain chat clients replaced
  • Potpie supports any model that LiteLLM supports
  • Users can set a preference and keys for their LLM of choice
  • No regressions in existing functionality
  • Streaming works as expected
