Labels: enhancement (New feature or request), good first issue (Good for newcomers)
I apologise if I have missed how to do this, but as far as I can tell the feature doesn't exist yet.
It would be great to add support for a custom base URL for embedding models (like OPENAI_EMBEDDING_BASE_URL or embedding_base_url config option).
Currently, the EmbeddingClient in openevolve/embedding.py only supports:
- OpenAI embedding models (text-embedding-3-small, text-embedding-3-large) using api.openai.com
- Azure embedding models
This makes it impossible to use other embedding providers, such as:
- OpenRouter (which offers qwen/qwen3-embedding-8b and other embedding models)
- Local embedding servers
- Other OpenAI-compatible APIs
The fix would involve:
- Adding a new config option (e.g., embedding_base_url) to DatabaseConfig in openevolve/config.py
- Modifying EmbeddingClient.__init__ to accept a base_url parameter
- Passing base_url to the OpenAI client initialization
This would allow users to set OPENAI_EMBEDDING_BASE_URL env var or embedding_base_url in their config to use any OpenAI-compatible endpoint for embeddings.
Example use case: Using OpenRouter for embeddings while using a different provider for LLMs.