
Support Mistral AI #76

Open
rloutrel opened this issue Jan 15, 2024 · 6 comments
Labels
enhancement New feature or request

Comments

@rloutrel

rloutrel commented Jan 15, 2024

Describe the feature you'd like to request

I wish I could use Mistral AI instead of OpenAI.

Describe the solution you'd like

When configuring the OpenAI and LocalAI account integration (as an administrator and/or as a normal user), I would like to have Mistral AI as an option (https://docs.mistral.ai/#api-access).

From what I understand, Mistral technically exposes endpoints and an API structure equivalent to OpenAI's, allowing applications to be ported easily. Ref: https://mistral.ai/news/la-plateforme/
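To illustrate the compatibility claim: a chat-completions request built for the OpenAI wire format can be pointed at Mistral simply by swapping the base URL. A minimal sketch (the helper name is hypothetical; the endpoint path and field names follow the OpenAI-compatible chat-completions format):

```python
import json

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build an OpenAI-style chat-completions request.

    The same payload shape works for both providers; only the
    base URL (and the model name) changes.
    """
    url = f"{base_url.rstrip('/')}/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, json.dumps(body)

# Pointed at Mistral instead of OpenAI:
url, headers, payload = build_chat_request(
    "https://api.mistral.ai", "YOUR_API_KEY", "mistral-small-latest", "Hello!"
)
```

The request could then be sent with any HTTP client; no OpenAI-specific SDK is strictly required.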

Describe alternatives you've considered

An alternative would be a separate "integration_mistralai" project. But I think the user experience (for admins AND end users) would be better if all AI integration configurations were grouped together, as long as that remains 'light' to maintain.

@rloutrel rloutrel added the enhancement New feature or request label Jan 15, 2024
@rloutrel
Author

For context: Mistral AI is a French company, so using it avoids relying on a non-EU provider, for customers who expect better control over their data.
DISCLAIMER: Mistral AI's servers MAY be located outside the EU (I did not investigate further), but I do not think that is the case.

@MB-Finski
Contributor

Hmm.. If the API endpoint is fully OpenAI-compatible, it should work already...

@rloutrel
Author

Just by providing the API URL (like for LocalAI)?

I will give it a try tonight.

@traklo

traklo commented Mar 23, 2024

When using the Mistral API you need to provide the model name as part of your request:
https://docs.mistral.ai/#api-access-with-the-mistral-ai-platform

There's no way to add a model via settings/admin/connected-accounts, and the ChatGPT-like text generation window (accessed in the Text editor via the smart picker) does not allow manual input of the model to use (presumably it would not work anyway).

Anyone had luck tinkering with this?
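Since the Mistral API rejects requests that omit the model field, an integration would need to inject a configured model name before sending. A sketch of the kind of guard involved (names and default model are illustrative, not taken from the app):

```python
def ensure_model(payload: dict, default_model: str = "mistral-small-latest") -> dict:
    """Return a copy of the request payload with a model name guaranteed.

    The Mistral API returns an error if "model" is missing, so fall
    back to a configured default when the caller did not set one.
    """
    out = dict(payload)
    out.setdefault("model", default_model)
    return out

req = ensure_model({"messages": [{"role": "user", "content": "Hi"}]})
```

In the Nextcloud context, `default_model` would come from the admin or per-user integration settings rather than being hardcoded.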

@thiswillbeyourgithub

thiswillbeyourgithub commented Apr 8, 2024

May I suggest relying directly on litellm instead of reinventing part of the wheel? It supports hundreds of models and virtually all providers (including LocalAI!) with a unified syntax: `mistral/mistral-large`, `openai/gpt-4-0125-preview`, etc.
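With litellm the provider is encoded as a prefix of the model string, so the same call routes to any backend. A sketch (requires `pip install litellm` and a `MISTRAL_API_KEY` in the environment; the network call is guarded so the snippet is safe to import):

```python
import os

def split_model_string(model: str):
    # litellm routes on a "provider/model" prefix,
    # e.g. "mistral/mistral-large" -> ("mistral", "mistral-large")
    provider, _, name = model.partition("/")
    return provider, name

if __name__ == "__main__" and os.getenv("MISTRAL_API_KEY"):
    from litellm import completion  # pip install litellm
    resp = completion(
        model="mistral/mistral-large-latest",
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(resp.choices[0].message.content)
```

Swapping providers then only means changing the model string, not the calling code.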

@julien-nc
Copy link
Member

This app can use MistralAI after a small adjustment.
Just set https://api.mistral.ai as the service URL and select the "Chat completions" endpoint (the MistralAI API does not support /v1/completions).
The small issue was that integration_openai always added a 'user' parameter to chat completion requests, which the MistralAI API rejects. This was fixed in 545134f, which will be included in the next release (v2.0.1), coming soon.
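The fix described amounts to dropping the OpenAI-specific `user` field when the target API rejects unknown parameters. A sketch of the idea (illustrative only, not the actual integration_openai code):

```python
def adapt_for_mistral(payload: dict) -> dict:
    """Strip OpenAI-only parameters that the MistralAI API rejects.

    integration_openai always sent a "user" parameter with chat
    completion requests; MistralAI's endpoint refuses it, so remove
    it before sending.
    """
    return {k: v for k, v in payload.items() if k != "user"}

req = adapt_for_mistral({
    "model": "mistral-small-latest",
    "messages": [{"role": "user", "content": "Hi"}],
    "user": "nextcloud-user-123",
})
```

Note that only the top-level `user` parameter is removed; the `role: "user"` inside `messages` is part of the standard chat format and stays.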

