Feat/litellm #3

Merged
tqtensor merged 2 commits into main from feat/litellm
May 8, 2025
Conversation

@tqtensor (Owner)

@tqtensor tqtensor commented May 8, 2025

What does this PR do?

This PR integrates LiteLLM to provide a unified interface for multiple Vision LLM providers. It simplifies configuration and expands model support while maintaining compatibility with existing implementations.

Key changes:

  • Added support for LiteLLM proxy models, treating them as OpenAI models internally

  • Replaced provider-specific configs (openai_config, gemini_config, ollama_config) with a universal provider_config parameter

  • Fixed async/sync handling for better performance with concurrent processing

  • Simplified worker calculation to focus on CPU cores

  • Updated documentation with examples for the new unified API

  • Addresses issue (#XXX)
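The changes above can be sketched as follows. This is a minimal, hypothetical illustration of the unified `provider_config` idea and the CPU-core-based worker calculation described in the bullets; the helper names below (`resolve_provider_config`, `default_workers`) are illustrative, not the project's actual API.

```python
import os


def resolve_provider_config(provider: str, **overrides) -> dict:
    """Build a single provider_config dict in place of the old
    per-provider openai_config / gemini_config / ollama_config arguments.

    Hypothetical sketch: field names are assumptions, not the real API.
    """
    config = {"provider": provider}
    # Per the PR description, LiteLLM proxy models are treated as
    # OpenAI models internally, so they share the OpenAI config shape.
    if provider == "litellm":
        config["provider"] = "openai"
    config.update(overrides)
    return config


def default_workers() -> int:
    """Simplified worker calculation based only on CPU cores."""
    return os.cpu_count() or 1


# Example: point the unified config at a local LiteLLM proxy.
cfg = resolve_provider_config("litellm", api_base="http://localhost:4000")
```

A caller would pass `cfg` wherever a provider-specific config object was passed before, regardless of which backend actually serves the model.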

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Ran make lint and make format to handle lint / formatting issues.
  • Ran make test to run the relevant test scripts.
  • Read the contributor guidelines.
  • Wrote necessary unit or integration tests.

@tqtensor tqtensor merged commit 7882200 into main May 8, 2025
1 of 5 checks passed
