Is your feature request related to a problem? Please describe.
No response
Describe the solution you'd like
Add an entry like the following one to the docs:

open-interpreter/docs/language-models/hosted-models/openrouter.mdx (Line 5 in 21babb1):

> To use Open Interpreter with a model from OpenRouter, set the `model` flag to begin with `openrouter/`:

We can do it ourselves, to be honest: if you can give us a green light, it'd be done within a couple of days.
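For context, the `openrouter/` prefix in that docs entry follows the LiteLLM-style model-string convention Open Interpreter uses: the segment before the first `/` selects the provider, and the remainder is passed through as the model name. A minimal sketch of that splitting rule (the bare-name fallback below is an illustrative assumption, not LiteLLM's actual resolution logic):

```python
def split_model_string(model: str) -> tuple[str, str]:
    """Split a LiteLLM-style model string into (provider, model name).

    "openrouter/openai/gpt-4o" -> ("openrouter", "openai/gpt-4o")
    """
    provider, sep, name = model.partition("/")
    if not sep:
        # Assumed default for bare model names; real routing is richer.
        return ("openai", model)
    return (provider, name)


print(split_model_string("openrouter/openai/gpt-4o"))
```

Under this convention, adding a new hosted provider is largely a matter of documenting its prefix, which is why the docs entry above is so short.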
Describe alternatives you've considered
No response
Additional context
Hi!
I'm from the Integrations team over at AI/ML API.
I've pinged you guys a couple of times because I think your product is dope, and we'd love to have a native integration with it.
It seems that doing so is now easier than ever, because the way you add providers is ingenious and very simple.
Say you're interested, and we'll test the compatibility, update your docs to include us, and add a tutorial on using Open Interpreter with the AI/ML API to our docs as well.
Best,
Sergey