
support other models #8

Closed
behrica opened this issue Jun 3, 2023 · 8 comments · Fixed by #17

Comments

@behrica
Contributor

behrica commented Jun 3, 2023

I think we should add something to easily use non-OpenAI models.
The main reason is that both OpenAI and Azure OpenAI are not really open, or are even paid.

Ideally we should be able to use the same code (using bosquet) and easily "swap" the model.

@behrica
Contributor Author

behrica commented Jun 3, 2023

I think a good way to solve this in bosquet is to allow passing custom complete functions in a template:

{{text}}

What is the name of the licence ?
{% llm-generate  
   model=testtextdavanci003 
   impl=azure
   complete-fn=bosquet.openai/create-completion
   % }

@behrica
Contributor Author

behrica commented Jun 3, 2023

Doing this probably means that we no longer need "impl", as different functions could be created: one for OpenAI and another for Azure OpenAI.
This might help to fix #9

@behrica
Contributor Author

behrica commented Jun 3, 2023

I tried this out and it looks very clean to me:

Having 2 functions:

(defn azure-openai-create-completion
  "Create completion (not chat) for `prompt` based on model `params` and invocation `opts`"
  [prompt params opts]
  (-> (api/create-completion
       (assoc params :prompt prompt)
       (assoc opts :impl :azure))
      :choices first :text))

(defn openai-create-completion
  "Create completion (not chat) for `prompt` based on model `params` and invocation `opts`"
  [prompt params opts]
  (-> (api/create-completion
       (assoc params :prompt prompt)
       (dissoc opts :impl))
      :choices first :text))

allows us then to switch in the templates:

  "
What is the name of the licence ?
{% llm-generate  
   model=testtextdavanci003 
   complete-fn=bosquet.openai/openai-create-completion
   % }"

vs

  "
{% llm-generate  
   model=testtextdavanci003 
   complete-fn=bosquet.openai/azure-openai-create-completion
   % }"

Then we could start to add more models, or users could do it themselves,
just by implementing a fn with this signature:
[prompt params opts] returning String
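As a sketch of that contract (the "echo" backend here is made up for illustration and is not part of bosquet):

```clojure
;; Any fn with this shape can act as a completion backend.
;; This toy backend just returns the prompt back, ignoring
;; params and opts -- purely to illustrate the signature.
(defn echo-create-completion
  "Toy complete-fn: [prompt params opts] -> String"
  [prompt params opts]
  (str "ECHO: " prompt))
```

Such a fn could then be referenced from a template via complete-fn, just like the OpenAI ones above.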

@behrica
Contributor Author

behrica commented Jun 3, 2023

So "impl" is just an internal flag now, the end user will not see / use it anymore

@zmedelis
Owner

zmedelis commented Jun 9, 2023

Adding other models would be great. Maybe, as a first step, a new library is needed, just like https://github.com/wkok/openai-clojure but supporting other models. Once this is in place, Bosquet can use it as a dependency.

@behrica
Contributor Author

behrica commented Jun 15, 2023

I am not sure that a "new library first" is the right step.
That would assume we can make a single library
which abstracts over all LLMs.

But LLMs are very different and have different APIs.

I believe bosquet should do "something" so that a user can plug in his own function.
The reason is that bosquet will only support a very small subset of "all possible operations of an LLM",
starting with "completion", where

"text" + params go in, and text comes back.
So the user of bosquet would decide what and whether he uses another library, or just makes HTTP calls to a model completion API himself.
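For instance, such a user-written backend talking to a completion HTTP API directly might look like this (a sketch only: the endpoint URL and response shape are hypothetical, and clj-http is assumed as the HTTP client):

```clojure
(ns user.custom-complete
  (:require [clj-http.client :as client]))

(defn my-http-create-completion
  "Custom complete-fn hitting a self-hosted model's HTTP endpoint.
  Assumes the server accepts {:prompt ...} JSON and answers with
  {:text ...} -- adapt to the real API of your model."
  [prompt params opts]
  (-> (client/post "http://localhost:8000/v1/completions" ; hypothetical endpoint
                   {:content-type :json
                    :as           :json
                    :form-params  (assoc params :prompt prompt)})
      :body
      :text))
```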

@behrica
Contributor Author

behrica commented Jun 24, 2023

Once PR #15 is merged, we can then easily allow either "keywords" or a user-given fn as the ":impl":

(def azure-open-ai-config
  {:impl        (fn [prompt args] "the completion")
   :parameter-1 "my-model-key-if-any"})

This would allow a user of bosquet to plug in his own function.
(Maybe we could even stop supporting the keywords and instead provide functions for OpenAI,
so a user would use a config like this:)

(def azure-open-ai-config
  {:impl bosquet.complete/complete-azure-openai})

This would be a first step toward supporting other models, and would close this concrete issue.
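One way such an :impl key could be resolved (a sketch, not existing bosquet code; the registry and fn names are made up):

```clojure
;; Hypothetical resolution: a keyword selects a built-in backend,
;; while a fn value is used as the completion backend directly.
(def known-impls
  {:openai       (fn [prompt params opts] "…OpenAI call…")
   :azure-openai (fn [prompt params opts] "…Azure call…")})

(defn resolve-impl
  "Keyword -> built-in backend fn; fn -> itself."
  [impl]
  (if (keyword? impl)
    (get known-impls impl)
    impl))

(defn complete
  "Run the completion described by config over prompt."
  [prompt {:keys [impl] :as config} opts]
  ((resolve-impl impl) prompt (dissoc config :impl) opts))
```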

@zmedelis
Owner

That's good. We can then build on top of it, maybe adding extra tags wrapping some of the config and model-call specifics.
