Description
What happened?
Problem integrating Mistral OCR with LiteLLM (Azure AI Foundry)
We tried to integrate Mistral OCR through the LiteLLM Proxy, but we encountered issues.
We first attempted to call this endpoint through the proxy:
127.0.0.1:34666 - "POST /mistral/v1/ocr HTTP/1.1" 404 Not Found
and the request failed with the following error:
httpx.InvalidURL: /v1/ocr
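For reference, this is roughly the kind of request we were sending. The proxy address, API key, and document URL below are placeholders, and the request body follows Mistral's public OCR API shape as we understand it:

```python
# Sketch of the failing call (placeholder proxy URL, key, and document URL).
import httpx

LITELLM_PROXY_URL = "http://localhost:4000"  # adjust to your proxy deployment
LITELLM_API_KEY = "sk-..."                   # placeholder virtual key

resp = httpx.post(
    f"{LITELLM_PROXY_URL}/mistral/v1/ocr",
    headers={"Authorization": f"Bearer {LITELLM_API_KEY}"},
    json={
        # Body shape taken from Mistral's OCR API docs; we assumed the proxy
        # would forward it unchanged to the upstream /v1/ocr endpoint.
        "model": "mistral-ocr-latest",
        "document": {
            "type": "document_url",
            "document_url": "https://example.com/sample.pdf",
        },
    },
    timeout=60,
)
print(resp.status_code)  # 404 Not Found in our case
```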
Since this approach did not work, we tried the official pass-through endpoint feature in LiteLLM:
👉 https://docs.litellm.ai/docs/proxy/pass_through
Using this method, we managed to bypass the limitation, but this feels like a workaround, not a proper integration.
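For completeness, here is roughly what the workaround looks like from the client side: we registered a pass-through endpoint in the proxy config (mapping a proxy path to Mistral's upstream /v1/ocr URL, per the docs linked above) and then call that path directly. The path name, proxy URL, and key below are placeholders we chose, not anything LiteLLM defines:

```python
# Workaround: call the pass-through route we registered in the proxy config.
# "/mistral-ocr" is a path we picked ourselves when registering the endpoint;
# the proxy simply forwards the body to Mistral's /v1/ocr.
import httpx

resp = httpx.post(
    "http://localhost:4000/mistral-ocr",         # placeholder proxy URL + our chosen path
    headers={"Authorization": "Bearer sk-..."},  # placeholder virtual key
    json={
        "model": "mistral-ocr-latest",
        "document": {"type": "document_url", "document_url": "https://example.com/sample.pdf"},
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # raw OCR response from the upstream Mistral endpoint
```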
Why this matters
It looks like we are not the only ones affected; see the last comment on this issue:
#9051
Clearly, there is growing interest in supporting Mistral OCR natively in LiteLLM, without needing to rely on pass-through hacks.
Feature request
👉 Add official support for Mistral OCR in LiteLLM.
👉 Possibly make the /v1/ocr route supported out of the box (similar to /v1/chat/completions); an illustrative sketch of what we have in mind follows below.
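Purely as an illustration of the request flow we are asking for (nothing below exists in LiteLLM today; the route handling, model alias, and body shape are all hypothetical):

```python
# Hypothetical: native /v1/ocr support on the proxy, analogous to /v1/chat/completions.
# The "mistral-ocr" alias resolving via the proxy's model_list is an assumption
# on our part, not current LiteLLM behavior.
import httpx

resp = httpx.post(
    "http://localhost:4000/v1/ocr",              # hypothetical native proxy route
    headers={"Authorization": "Bearer sk-..."},  # placeholder virtual key
    json={
        "model": "mistral-ocr",                  # hypothetical alias defined in the proxy config
        "document": {"type": "document_url", "document_url": "https://example.com/sample.pdf"},
    },
)
print(resp.json())
```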
Relevant log output
Are you a ML Ops Team?
No
What LiteLLM version are you on?
v1.72.0
Twitter / LinkedIn details
No response