support other models #8
I think a good way to solve this in bosquet is to allow passing custom complete functions in a template.

Doing this probably means that we no longer need "impl", as different functions could be created: one for OpenAI and another for Azure OpenAI.
I tried this out and it looks very clean to me: having two functions allows switching between them in the templates. Then we could start to add more models, or users could add their own.
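A minimal sketch of the two-functions idea, assuming a hypothetical API (none of these names come from bosquet itself): the template map carries its own complete fn, and swapping providers means swapping that fn.

```clojure
;; Hypothetical sketch, not bosquet's real API: two complete fns,
;; switchable from the template data itself.

(defn complete-openai
  "Stand-in for an OpenAI completion call."
  [prompt _params]
  (str "openai-completion: " prompt))

(defn complete-azure-openai
  "Stand-in for an Azure OpenAI completion call."
  [prompt _params]
  (str "azure-completion: " prompt))

(defn generate
  "Complete the prompt with whatever fn the template carries."
  [{:keys [prompt complete params]}]
  (complete prompt params))

;; Swapping providers is just swapping the fn in the template:
(generate {:prompt "1 + 1 =" :complete complete-openai})
(generate {:prompt "1 + 1 =" :complete complete-azure-openai})
```

With this shape, adding a new model is defining one more two-argument fn; nothing in `generate` has to change.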
So "impl" is just an internal flag now; the end user will not see or use it anymore.
Adding other models would be great. Maybe as a first step a new library is needed, just like https://github.com/wkok/openai-clojure but supporting other models. Once this is in place, Bosquet can use it as a dependency.
I am not sure if a "new library first" is the right step. LLMs are very different and have different APIs. I believe that bosquet should do something so that a user can plug in his own function: "text" + params go in, and text comes back.
Once PR #15 is merged, we can then easily allow either keywords or a user-given fn as the ":impl".
This would allow a user of bosquet to plug in his own function.
It is then a first step toward supporting other models and would close this concrete issue.
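A sketch of what the keyword-or-fn dispatch could look like, with hypothetical names (PR #15's actual code may differ): keywords look up a built-in completer, while a user-supplied fn is used as-is, following the "text + params in, text out" contract.

```clojure
;; Hypothetical sketch of :impl accepting either a keyword or a user fn.
;; Names are illustrative, not bosquet's actual internals.

(defn openai-complete
  "Stand-in for the built-in OpenAI completer."
  [prompt _params]
  (str "openai: " prompt))

(def built-in-impls
  {:openai openai-complete})

(defn resolve-impl
  "Keywords look up a built-in completer; fns are used directly."
  [impl]
  (cond
    (keyword? impl) (get built-in-impls impl)
    (fn? impl)      impl
    :else           (throw (ex-info "Unsupported :impl" {:impl impl}))))

(defn complete
  "Run the completer selected by :impl on the prompt."
  [prompt {:keys [impl] :as params}]
  ((resolve-impl impl) prompt params))

;; Built-in model selected by keyword:
(complete "Hello" {:impl :openai})
;; User-supplied fn: text + params go in, text comes back.
(complete "Hello" {:impl (fn [prompt _] (str "my-model: " prompt))})
```

The `cond` on `keyword?` / `fn?` keeps the existing keyword-based configuration working while opening the same slot to arbitrary user functions.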
That's good. We can then build on top of it, maybe adding extra tags wrapping some of the config and the specifics of model calls.
I think we should add something to easily use non-OpenAI models.
The main reason is that both OpenAI and Azure OpenAI are not really open, and are even paid.
Ideally we should be able to use the same code (using bosquet) and easily swap the model.