The ability to use OpenAI embeddings with Groq #120
Comments
Thanks! We will do that
hi, we did it in this file example! link
Well, that wasn't the intended solution but nice workaround haha
ok, make a pull request and we will do it
Alright, I'll try to implement my solution and will submit a pull request as soon as I get the chance.
Remember to read the contribution rules :)
Hey there! It would be great if we could use OpenAI embeddings (or any other supported API-based embedding model) with Groq (or any other supported LLM). With the current way the code is organized, you can only use OpenAI embeddings with OpenAI models. If I want to use Groq as my main LLM, I would have to use Ollama for embeddings, which is fine if you want to run models locally. But I don't want to install models on my local machine; I would prefer to use OpenAI as my embedder service.
One way to add this is to change the way `self.embedder_model` is initialized in the `AbstractGraph` class. Currently, both `self.llm_model` and `self.embedder_model` are initialized by a single method, `self._create_llm()`, which limits our options. One possible solution is to add another method, e.g. `self._create_embedder()`, and completely separate the initialization logic for LLMs and embedder models.
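The proposal above can be sketched roughly as follows. This is a hypothetical illustration, not the project's actual code: the provider classes (`GroqLLM`, `OpenAILLM`, `OpenAIEmbedder`) are stand-ins for real client wrappers, and the config shape is assumed. Only the separation of `_create_llm()` and `_create_embedder()` mirrors the issue's suggestion.

```python
# Stand-in provider classes (hypothetical; real code would wrap the
# actual Groq / OpenAI clients).
class GroqLLM:
    def __init__(self, api_key):
        self.api_key = api_key

class OpenAILLM:
    def __init__(self, api_key):
        self.api_key = api_key

class OpenAIEmbedder:
    def __init__(self, api_key):
        self.api_key = api_key

# Separate registries for chat models and embedding models.
LLM_PROVIDERS = {"groq": GroqLLM, "openai": OpenAILLM}
EMBEDDER_PROVIDERS = {"openai": OpenAIEmbedder}

class AbstractGraph:
    def __init__(self, llm_config, embedder_config=None):
        # LLM and embedder are created by independent methods, so any
        # provider pairing is possible. If no embedder config is given,
        # fall back to the LLM config (the current single-provider behavior).
        self.llm_model = self._create_llm(llm_config)
        self.embedder_model = self._create_embedder(embedder_config or llm_config)

    def _create_llm(self, config):
        cls = LLM_PROVIDERS[config["provider"]]
        return cls(api_key=config["api_key"])

    def _create_embedder(self, config):
        cls = EMBEDDER_PROVIDERS[config["provider"]]
        return cls(api_key=config["api_key"])

# Groq as the main LLM, OpenAI as the embedder service:
graph = AbstractGraph(
    llm_config={"provider": "groq", "api_key": "gsk-..."},
    embedder_config={"provider": "openai", "api_key": "sk-..."},
)
```

With this split, adding a new embedding provider only means registering it in `EMBEDDER_PROVIDERS`, without touching the LLM creation path.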