Request: Add openrouter.ai endpoint support #285
Comments
Hi, for this and all other requests to add a provider, I cannot personally devote time to it. The best solution is to submit a PR for it. Someone did this recently to add together.ai support to ChainForge. The error you are getting suggests that something in your call is incorrect; the custom provider shouldn't be the problem here. I would check your Python code and make sure the call works outside of ChainForge first.
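Following that advice, a minimal standalone sketch for testing the call outside ChainForge might look like the following. This is not the user's original code (which was not preserved in this thread); it assumes OpenRouter's documented OpenAI-compatible `/api/v1/chat/completions` endpoint and a placeholder `OPENROUTER_API_KEY` environment variable.

```python
import json
import os
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_payload(model: str, prompt: str) -> dict:
    # OpenAI-compatible chat payload; OpenRouter model ids are usually
    # "vendor/model", e.g. "google/gemini-1.5-pro".
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def call_openrouter(model: str, prompt: str) -> str:
    # Sends the request directly, with no ChainForge in between. If this
    # raises an HTTPError 400, the request body itself is malformed.
    req = urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

If `call_openrouter("google/gemini-1.5-pro", "Hello")` succeeds here but the same model id fails inside ChainForge, that would point at how ChainForge passes the model name to the custom provider rather than at the API call itself.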
Thank you for your quick feedback! Since I'm not a developer (I can read some simple code, but I'm not a professional), I don't think I have the ability to submit a PR for it... Actually, the reason I want to use openrouter.ai is basically for two models. The weird thing is that if I specify an OpenAI model like "gpt-3.5-turbo" or "gpt-4-turbo" in the openrouter.ai custom provider settings, it works: ChainForge successfully gets responses from openrouter.ai. However, if I specify a model name like "google/gemini-1.5-pro" or "cohere/command-r-plus" in the custom provider settings (.py), it doesn't work and returns a 400 error. Below is the Python code I wrote with GPT-4's assistance:
Hmm, this sounds like it has to do with the "/" in the path. It's certainly a workaround, but you can try changing all slashes to "|" or something else, then converting them back to slashes in your Python code. It's probably something on CF's end with how custom providers handle the slash notation, such that it's cutting off the prefix before the slash.
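The suggested workaround can be sketched in a few lines. This is a hypothetical illustration, not code from the thread: the model would be registered in ChainForge's settings with a "|" in place of "/" (e.g. "google|gemini-1.5-pro"), and the custom provider function would restore the slash before calling OpenRouter.

```python
def to_openrouter_model_id(cf_model_name: str) -> str:
    # ChainForge-side name uses "|" to avoid the slash being mangled;
    # restore "/" so OpenRouter receives its real model id.
    # "google|gemini-1.5-pro" -> "google/gemini-1.5-pro"
    return cf_model_name.replace("|", "/")
```

The converted id would then be placed in the `"model"` field of the request body, keeping the rest of the provider code unchanged.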
Let me clarify: I personally assumed that it might not be a "/" issue?
Hi,
I have tried adding openrouter.ai to ChainForge as a custom provider, but it always responds with "Error encountered while calling custom provider function: 400 Client Error: Bad Request for url: https://openrouter.ai/api/v1/chat/completions".
I'm not a developer; I have tried my best to debug this issue with GPT-4 Turbo for multiple hours, and it's still not working :(
Could you please consider adding this endpoint as a native endpoint, like together.ai, when you are free?
https://openrouter.ai/docs#quick-start
This endpoint is really popular in the market.
I would really appreciate it if you could consider this request!