
fix(embeddings): add allowed_openai_params for OpenAI-compatible embedding dimensions #1320

Merged
nicoloboschi merged 1 commit into vectorize-io:main from zwcf5200:feature/embedding-dimension-fix
Apr 29, 2026

Conversation

@zwcf5200
Contributor

What

Fix the allowed_openai_params issue in LiteLLMSDKEmbeddings when using OpenAI-compatible custom embedding models.

Problem

When using litellm-sdk as the embeddings provider with an OpenAI-compatible custom model (e.g., openai/Qwen3-Embedding-4B-4bit-DWQ), the dimensions parameter is silently rejected by litellm unless it is explicitly allow-listed via allowed_openai_params.

As a result, HINDSIGHT_API_EMBEDDINGS_LITELLM_SDK_OUTPUT_DIMENSIONS has no effect: the returned embedding has the model's default dimension instead of the configured one.

Solution

Add allowed_openai_params = ["dimensions"] to the litellm kwargs when the model name starts with openai/ and output_dimensions is set.

This affects both the async (aembedding) and sync (embedding) code paths in LiteLLMSDKEmbeddings.

Changes

  • hindsight-api-slim/hindsight_api/engine/embeddings.py: Add allowed_openai_params in 2 places (async and sync paths)
  • hindsight-api-slim/tests/test_litellm_sdk_embeddings.py: Add test test_openai_output_dimensions_allows_litellm_dimensions_param and update existing tests to verify allowed_openai_params behavior
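A test of this behavior can capture the kwargs handed to the embedding call instead of hitting a real endpoint. The sketch below uses `unittest.mock`; the inline `embed_texts` helper mirrors the fix for illustration and is not the repository's actual test code:

```python
from unittest.mock import MagicMock

def embed_texts(embed_fn, model, texts, output_dimensions=None):
    """Call an embedding function, allow-listing `dimensions` for openai/ models."""
    kwargs = {"model": model, "input": texts}
    if output_dimensions is not None:
        kwargs["dimensions"] = output_dimensions
        if model.startswith("openai/"):
            kwargs["allowed_openai_params"] = ["dimensions"]
    return embed_fn(**kwargs)

def test_openai_output_dimensions_allows_litellm_dimensions_param():
    fake_embed = MagicMock()  # stands in for litellm.embedding
    embed_texts(fake_embed, "openai/Qwen3-Embedding-4B-4bit-DWQ", ["hello"], 2000)
    kwargs = fake_embed.call_args.kwargs
    assert kwargs["dimensions"] == 2000
    assert kwargs["allowed_openai_params"] == ["dimensions"]
```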

Testing

  • Unit tests pass (see test_litellm_sdk_embeddings.py)
  • Verified locally with litellm-sdk + OpenAI-compatible custom embedding endpoint + OUTPUT_DIMENSIONS=2000
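For reference, the local verification described above would use configuration along these lines. Only OUTPUT_DIMENSIONS and the model name appear in this PR; the other variable names follow the same HINDSIGHT_API_EMBEDDINGS_* pattern and are assumptions:

```shell
# Assumed provider/model variable names, following the OUTPUT_DIMENSIONS naming pattern.
export HINDSIGHT_API_EMBEDDINGS_PROVIDER=litellm-sdk
export HINDSIGHT_API_EMBEDDINGS_LITELLM_SDK_MODEL=openai/Qwen3-Embedding-4B-4bit-DWQ
# With this fix, the configured dimension now reaches the OpenAI-compatible endpoint.
export HINDSIGHT_API_EMBEDDINGS_LITELLM_SDK_OUTPUT_DIMENSIONS=2000
```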

fix(embeddings): add allowed_openai_params for OpenAI-compatible embedding dimensions

When using litellm-sdk with OpenAI-compatible custom models (model name
starts with "openai/"), the "dimensions" parameter is rejected by litellm
unless it is explicitly allow-listed via allowed_openai_params.

This fix adds the allow-listing so that HINDSIGHT_API_EMBEDDINGS_LITELLM_SDK_OUTPUT_DIMENSIONS
works correctly with OpenAI-compatible embedding endpoints.

Fixes: custom embedding models with OpenAI-compatible APIs reject the
dimensions parameter unless allowed_openai_params includes "dimensions".
Collaborator

@nicoloboschi nicoloboschi left a comment


LGTM

@nicoloboschi nicoloboschi merged commit 324b4b0 into vectorize-io:main Apr 29, 2026
54 checks passed
@zwcf5200 zwcf5200 deleted the feature/embedding-dimension-fix branch April 30, 2026 00:27