
fix: bug fixes and OpenAI-compatible base_url support #2

Merged

Grandvizir merged 4 commits into informatique-cdc:main from Astral0:fix/upstream-bugfixes on Mar 18, 2026
Conversation


@Astral0 commented Mar 17, 2026

Summary

  • fix: setuptools packages config — `pip install -e .` fails on the flat-layout Django project because setuptools discovers multiple top-level packages. Adds an explicit [tool.setuptools.packages.find] section in pyproject.toml.
  • feat: OPENAI_BASE_URL support — Allows configuring a custom base URL for the OpenAI client via OPENAI_BASE_URL env var, enabling use with LiteLLM, vLLM, or any OpenAI-compatible API proxy.
  • fix: dimensions param for non-OpenAI models — LiteLLM rejects the dimensions parameter for non-OpenAI embedding models (e.g. bge-m3). Only sends it for text-embedding-* models.
  • fix(chat): hardcoded Mistral branding — Replaces Mistral SVG icons and model selector with generic AI icon and actual configured model name from config.yaml.

Test plan

  • pip install -e . works without error on clean virtualenv
  • Setting OPENAI_BASE_URL=http://localhost:4000 routes requests to the proxy
  • Embedding with non-OpenAI models (bge-m3) works without dimensions error
  • Chat UI shows actual model name instead of "Medium" / Mistral branding

Aimery Assire added 4 commits March 17, 2026 17:38
The flat-layout Django project caused setuptools to fail on `pip install -e .`
because it discovered multiple top-level packages. Explicitly list them in
`[tool.setuptools.packages.find]`.
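For reference, a minimal sketch of such a section — the actual package names are not shown in this PR, so `core` and `chat` below are hypothetical placeholders:

```toml
# pyproject.toml (sketch): explicitly list top-level packages so
# setuptools does not fail on flat-layout auto-discovery.
[tool.setuptools.packages.find]
include = ["core*", "chat*"]  # hypothetical package names
```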
Allow configuring a custom base URL for the OpenAI client via the
OPENAI_BASE_URL environment variable. This enables use with LiteLLM,
vLLM, or any OpenAI-compatible API proxy.
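The pattern described above can be sketched as follows; the helper name `openai_client_kwargs` is an illustration, not the function used in this repo:

```python
import os

def openai_client_kwargs() -> dict:
    """Build keyword arguments for constructing the OpenAI client.

    If OPENAI_BASE_URL is set, requests are routed to that endpoint
    (e.g. a LiteLLM or vLLM proxy); otherwise the client's default
    base URL applies.
    """
    kwargs = {"api_key": os.environ.get("OPENAI_API_KEY", "")}
    base_url = os.environ.get("OPENAI_BASE_URL")
    if base_url:
        kwargs["base_url"] = base_url
    return kwargs

# Hypothetical usage:
# client = openai.OpenAI(**openai_client_kwargs())
```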
LiteLLM rejects the dimensions parameter for non-OpenAI embedding
models like bge-m3. Only send it for text-embedding-* models.
- Replace Mistral SVG icons with generic AI icon in model toolbar button
- Display actual configured model name from config.yaml instead of "Medium"
- Remove non-functional Mistral model selector dropdown (Medium/Large/Codestral/Magistral)
- Pass chat_model_name from LLM_CONFIG to template context
- Clean up orphaned JS references to model dropdown handlers
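The template-context change above might look roughly like this; the function and config-key names are assumptions for illustration, not the repo's actual code:

```python
def chat_context(llm_config: dict) -> dict:
    """Build the chat template context, exposing the configured model
    name (from config.yaml via LLM_CONFIG) instead of hardcoded
    Mistral branding."""
    # "chat_model" is a hypothetical key; fall back to a generic label.
    return {"chat_model_name": llm_config.get("chat_model", "AI")}
```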
@Grandvizir Grandvizir merged commit 3d69f4f into informatique-cdc:main Mar 18, 2026
2 checks passed
