Replace LangChain Chat Clients with LiteLLM
Objective
Replace all LangChain chat client implementations with LiteLLM while maintaining existing functionality.
Provide APIs to manage the user's LLM choice and keys, and ensure the right LLM is selected during processing and agent execution.
Requirements
- Keep existing provider service pattern intact
- Support all current LLMs through LiteLLM
- Maintain inference and agent interaction functionality
- Remove CrewAI specialized handling (this was added because CrewAI could not support Anthropic through the LangChain client due to a conflict with its internal LiteLLM)
- Preserve streaming functionality
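With LiteLLM, provider selection collapses to a model-string prefix, so the requirements above reduce to a single call path. A minimal sketch (function names and parameters here are illustrative, not Potpie's actual service API):

```python
def to_litellm_model(provider: str, model: str) -> str:
    """Map a stored (provider, model) preference to LiteLLM's
    '<provider>/<model>' naming convention."""
    # OpenAI models are addressed without a prefix in LiteLLM.
    if provider == "openai":
        return model
    return f"{provider}/{model}"


def chat(provider: str, model: str, api_key: str, messages: list[dict]) -> str:
    """Hypothetical provider-service entry point: one code path for
    every provider, replacing per-provider LangChain chat clients."""
    # Imported lazily so the sketch stays importable without litellm installed.
    import litellm

    response = litellm.completion(
        model=to_litellm_model(provider, model),
        messages=messages,
        api_key=api_key,  # the user's stored key for this provider
    )
    return response.choices[0].message.content
```

Because the provider is encoded in the model string, the provider service no longer needs per-provider client classes; the user's stored key is just passed through per call.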
Implementation
- Replace LangChain imports with LiteLLM
- Update provider services
- Remove CrewAI-client specific code
- Update configurations and environment variables
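Since streaming must be preserved, the replacement also needs a streaming path. A hedged sketch (names are illustrative) using LiteLLM's `acompletion` with `stream=True`:

```python
from typing import AsyncIterator


async def stream_chat(model: str, api_key: str,
                      messages: list[dict]) -> AsyncIterator[str]:
    """Yield content deltas as they arrive, mirroring the streaming
    behaviour of the LangChain clients being replaced."""
    # Lazy import keeps the sketch importable without litellm installed.
    import litellm

    response = await litellm.acompletion(
        model=model,
        messages=messages,
        api_key=api_key,
        stream=True,
    )
    async for chunk in response:
        delta = chunk.choices[0].delta.content
        if delta:  # final chunks may carry no content
            yield delta
```

Existing SSE/websocket handlers can consume this generator in place of LangChain's `astream`.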
Testing
- Verify provider service functionality
- Test all LLM integrations
- Validate inference and agent operations
- Check streaming implementations
- Run existing test suite
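Provider-service behaviour can be verified without live keys by stubbing the LiteLLM call. A sketch under the assumption that the service method accepts an injectable completion function (`chat_completion` and `completion_fn` are hypothetical names, not Potpie's actual code):

```python
from types import SimpleNamespace


def chat_completion(messages: list[dict], model: str, *, completion_fn=None) -> str:
    """Hypothetical service method; completion_fn is injectable so tests
    can substitute a stub for litellm.completion."""
    if completion_fn is None:
        import litellm  # lazy import: only needed outside tests
        completion_fn = litellm.completion
    response = completion_fn(model=model, messages=messages)
    return response.choices[0].message.content


def fake_completion(model: str, messages: list[dict]):
    """Stub shaped like LiteLLM's ModelResponse, for offline tests."""
    msg = SimpleNamespace(content=f"echo from {model}")
    return SimpleNamespace(choices=[SimpleNamespace(message=msg)])


# Offline check: the service returns the stubbed content unchanged.
result = chat_completion(
    [{"role": "user", "content": "hi"}],
    "anthropic/claude-3-5-sonnet-20241022",
    completion_fn=fake_completion,
)
assert result == "echo from anthropic/claude-3-5-sonnet-20241022"
```

The same stub pattern covers the per-provider integration tests; only the model string varies.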
Success Criteria
- All LangChain chat clients replaced
- Potpie supports any model that LiteLLM supports
- Users can set a preference and keys for their LLM of choice
- No breaks in existing functionality
- Streaming works as expected