diff --git a/docs/docs/ai/llm.mdx b/docs/docs/ai/llm.mdx
index 69337001..b356c998 100644
--- a/docs/docs/ai/llm.mdx
+++ b/docs/docs/ai/llm.mdx
@@ -26,7 +26,7 @@ We support the following types of LLM APIs:
| [Anthropic](#anthropic) | `LlmApiType.ANTHROPIC` | ✅ | ❌ |
| [Voyage](#voyage) | `LlmApiType.VOYAGE` | ❌ | ✅ |
| [LiteLLM](#litellm) | `LlmApiType.LITE_LLM` | ✅ | ❌ |
-| [OpenRouter](#openrouter) | `LlmApiType.OPEN_ROUTER` | ✅ | ❌ |
+| [OpenRouter](#openrouter) | `LlmApiType.OPEN_ROUTER` | ✅ | ✅ |
| [vLLM](#vllm) | `LlmApiType.VLLM` | ✅ | ❌ |
| [Bedrock](#bedrock) | `LlmApiType.BEDROCK` | ✅ | ❌ |
@@ -400,7 +400,7 @@ You can find the full list of models supported by LiteLLM [here](https://docs.li
To use the OpenRouter API, you need to set the environment variable `OPENROUTER_API_KEY`.
You can generate the API key from [here](https://openrouter.ai/settings/keys).
-A spec for OpenRouter looks like this:
+A text generation spec for OpenRouter looks like this:
@@ -415,6 +415,21 @@ cocoindex.LlmSpec(
+OpenRouter also supports some text embedding models. Note that for OpenRouter embedding
+models, you must explicitly provide the `output_dimension` parameter in the spec.
+Here's how to define a spec that uses an OpenRouter embedding model:
+
+```python
+cocoindex.functions.EmbedText(
+    api_type=cocoindex.LlmApiType.OPEN_ROUTER,
+    model="openai/text-embedding-3-small",
+    # Task type for the embedding model
+    task_type="SEMANTIC_SIMILARITY",
+    # Required: the number of output dimensions for the embedding model
+    output_dimension=1536,
+)
+```
+
You can find the full list of models supported by OpenRouter [here](https://openrouter.ai/models).
### vLLM
diff --git a/rust/cocoindex/src/llm/mod.rs b/rust/cocoindex/src/llm/mod.rs
index 172550f5..fb5e29c2 100644
--- a/rust/cocoindex/src/llm/mod.rs
+++ b/rust/cocoindex/src/llm/mod.rs
@@ -166,6 +166,10 @@ pub async fn new_llm_embedding_client(
         LlmApiType::Ollama => {
             Box::new(ollama::Client::new(address).await?) as Box<dyn LlmEmbeddingClient>
         }
+        LlmApiType::OpenRouter => {
+            Box::new(openrouter::Client::new_openrouter(address, api_key).await?)
+                as Box<dyn LlmEmbeddingClient>
+        }
         LlmApiType::Gemini => {
             Box::new(gemini::AiStudioClient::new(address, api_key)?) as Box<dyn LlmEmbeddingClient>
         }
@@ -178,11 +182,7 @@ pub async fn new_llm_embedding_client(
             Box::new(gemini::VertexAiClient::new(address, api_key, api_config).await?)
                 as Box<dyn LlmEmbeddingClient>
         }
-        LlmApiType::OpenRouter
-        | LlmApiType::LiteLlm
-        | LlmApiType::Vllm
-        | LlmApiType::Anthropic
-        | LlmApiType::Bedrock => {
+        LlmApiType::LiteLlm | LlmApiType::Vllm | LlmApiType::Anthropic | LlmApiType::Bedrock => {
             api_bail!("Embedding is not supported for API type {:?}", api_type)
         }
     };
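
The docs change above makes `output_dimension` mandatory because OpenRouter cannot be queried for a model's embedding size ahead of time. As a hedged illustration of where that value ends up, here is a sketch of the payload shape accepted by an OpenAI-compatible `/embeddings` endpoint, which OpenRouter exposes; the helper name and exact field set are assumptions for illustration, not cocoindex code:

```python
# Hypothetical helper showing the payload an OpenAI-compatible embeddings
# endpoint (such as OpenRouter's) accepts. Not part of the cocoindex API.
def build_embedding_request(
    model: str, texts: list[str], output_dimension: int
) -> dict:
    # "dimensions" is honored by models with configurable output size,
    # e.g. openai/text-embedding-3-small; this mirrors the spec's
    # output_dimension parameter.
    return {
        "model": model,
        "input": texts,
        "dimensions": output_dimension,
    }

payload = build_embedding_request(
    "openai/text-embedding-3-small", ["hello world"], 1536
)
print(payload["dimensions"])  # → 1536
```

Since the embedding dimension is not reported by the API up front, the client must carry it through from the spec, which is why the Rust arm added here receives it from configuration rather than discovering it.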